April 12, 2024
This extended abstract introduces a class of graph learning methods applicable to cases where the underlying graph has polytopic uncertainty, i.e., the graph is not exactly known, but its parameters or properties vary within a known range. By incorporating the assumption that the graph lies in a polytopic set into two established graph learning frameworks, we find that our approach yields better results with less computation.
March 30, 2009
We describe a maximum entropy approach for computing volumes and counting integer points in polyhedra. To estimate the number of points from a particular set X in R^n lying in a polyhedron P in R^n, we solve a certain entropy maximization problem to construct a probability distribution on the set X such that a) the probability mass function is constant on the intersection of P and X and b) the expectation of the distribution lies in P. This allows us to apply Central Limit Theor...
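To make the construction concrete, consider the special case X = {0,1}^n and P = {x : Ax = b, 0 <= x <= 1} (an illustrative instance; the abstract allows more general sets X). The entropy maximization problem then reads

\[
\max_{p} \; H(p) = \sum_{i=1}^{n} \left( p_i \ln \frac{1}{p_i} + (1 - p_i) \ln \frac{1}{1 - p_i} \right)
\quad \text{subject to} \quad Ap = b, \; 0 \le p \le 1 .
\]

At the maximizer p^*, independent Bernoulli(p_i^*) coordinates assign the same mass e^{-H(p^*)} to every 0-1 point of P, and the expectation p^* lies in P, matching conditions a) and b) above; the count of such points is then e^{H(p^*)} times the probability that the Bernoulli vector satisfies Ax = b, which a local central limit argument estimates.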
June 13, 2022
The first three sections of this survey represent an updated and much expanded version of the abstract of my talk at FPSAC'2010: new results are incorporated and several concrete conjectures on the interactions between the three perspectives on normal polytopes in the title are proposed. The last section outlines new challenges in general convex polytopes, motivated by the study of normal polytopes.
July 15, 2022
We algorithmically determine the regions and facets of all dimensions of the canonical polyhedral complex, the universal object into which a ReLU network decomposes its input space. We show that the locations of the vertices of the canonical polyhedral complex along with their signs with respect to layer maps determine the full facet structure across all dimensions. We present an algorithm which calculates this full combinatorial structure, making use of our theorems that the...
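A minimal sketch of the underlying object, not the authors' exact algorithm: a ReLU layer partitions input space into polyhedral regions, one per activation pattern (sign vector). The network below, its weights, and the grid sampling are all illustrative assumptions; the paper computes the full combinatorial structure exactly rather than by sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny ReLU network: 2 inputs -> 4 hidden units (weights are arbitrary).
W1 = rng.normal(size=(4, 2))
b1 = rng.normal(size=4)

def sign_vector(x):
    """Activation pattern of the hidden layer at input x."""
    return tuple((W1 @ x + b1 > 0).astype(int))

# Sample a grid and collect distinct activation patterns; each pattern
# labels one polyhedral region of the canonical polyhedral complex.
grid = np.linspace(-3.0, 3.0, 200)
patterns = {sign_vector(np.array([x, y])) for x in grid for y in grid}
print(len(patterns))  # regions met by the grid; at most 11 for 4 lines in the plane
```

Four hyperplanes in general position cut the plane into at most 1 + 4 + C(4,2) = 11 regions, so the sampled count is a lower bound on the true number of regions and never exceeds 11.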
June 24, 2015
These are course notes I wrote for my Fall 2013 graduate topics course on geometric structures, taught at ICERM. The notes rework many of the proofs in William P. Thurston's beautiful but hard-to-understand paper, "Shapes of Polyhedra". A number of people, both in and out of the class, found these notes very useful, and so I decided to put them on the arXiv.
December 22, 2002
We give an elementary introduction to some recent polyhedral techniques for understanding and solving systems of multivariate polynomial equations. We provide numerous concrete examples and illustrations, and assume no background in algebraic geometry or convex geometry. Highlights include the following: (1) A completely self-contained proof of an extension of Bernstein's Theorem. Our extension relates volumes of polytopes with the number of connected components of the comp...
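As a small illustration of these polyhedral techniques (our own example, not one from the survey), the two-variable case of Bernstein's Theorem says that a generic system {f = 0, g = 0} has as many roots in the torus (C*)^2 as the mixed area MV(P, Q) = area(P + Q) - area(P) - area(Q) of its Newton polygons P and Q. The helper names below are ours.

```python
def cross(o, a, b):
    """Twice the signed area of the triangle o, a, b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def chain(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    lower, upper = chain(pts), chain(pts[::-1])
    return lower[:-1] + upper[:-1]

def area2(points):
    """Twice the area of the convex hull, via the shoelace formula."""
    h = convex_hull(points)
    return abs(sum(h[i][0] * h[(i + 1) % len(h)][1]
                   - h[(i + 1) % len(h)][0] * h[i][1]
                   for i in range(len(h))))

def minkowski_sum(P, Q):
    return [(p[0] + q[0], p[1] + q[1]) for p in P for q in Q]

# Newton polygons of f = a + b*x + c*x*y  and  g = d + e*y + k*x*y:
P = [(0, 0), (1, 0), (1, 1)]
Q = [(0, 0), (0, 1), (1, 1)]

# Mixed area via  MV(P, Q) = area(P + Q) - area(P) - area(Q).
mv = (area2(minkowski_sum(P, Q)) - area2(P) - area2(Q)) // 2
print(mv)  # -> 2: at most two toric roots for generic coefficients
```

Eliminating x*y between f and g indeed leaves a quadratic in x, confirming the bound of 2 for this system.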
February 2, 2019
Self-polar polytopes are convex polytopes that are equal to an orthogonal transformation of their polar sets. These polytopes were first studied by Lovász as a means of establishing the chromatic number of distance graphs on spheres, and they can also be used to construct triangle-free graphs with arbitrarily high chromatic number. We investigate the existence, construction, facial structure, and practical applications of self-polar polytopes, as well as the place of thes...
April 27, 2023
We study the structure of the set of all possible affine hyperplane sections of a convex polytope. We present two different cell decompositions of this set, induced by hyperplane arrangements. Using our decomposition, we bound the number of possible combinatorial types of sections and craft algorithms that compute optimal sections of the polytope according to various combinatorial and metric criteria, including sections that maximize the number of $k$-dimensional faces, maxim...
January 26, 2016
In this paper, a novel method, called polyhedron volume ratio classification (PVRC), is proposed for image recognition.
October 7, 2018
Deep learning has received much attention lately due to the impressive empirical performance achieved by training algorithms. Consequently, a need for a better theoretical understanding of these problems has become more evident in recent years. In this work, using a unified framework, we show that there exists a polyhedron which encodes simultaneously all possible deep neural network training problems that can arise from a given architecture, activation functions, loss functi...