Spring 2019

Graphical models, Exchangeable models and Graphons

About

A graphical model (GM) is a succinct way to represent a complex probability distribution over a collection of random variables. Such a distribution factorizes according to the graph underlying the GM (a sketch of this factorization appears below). The graph structure thus encodes the interdependencies among the variables, making GMs a powerful and general framework for reasoning about high-dimensional data. The combinatorial structure brings the computational aspect of statistical tasks to the fore and helps in designing simple-to-implement, scalable inference algorithms. As a result, GMs are used extensively across domains, including communication (e.g. LDPC codes), control (e.g. Kalman filters), social science (e.g. denoising surveys), natural language processing (e.g. topic models), signal processing (e.g. speech recognition), image processing (e.g. deep graphical models), machine learning (e.g. recommendation systems), and biology and the sciences at large (e.g. gene regulatory networks, causal inference). At the same time, GMs have driven an exciting intellectual quest to understand the boundary of computational-statistical tradeoffs and to develop efficient inference algorithms.

Exchangeable models (EMs) encode the symmetries present in the underlying distributions (or data). Such symmetries arise in many modern, seemingly complex settings. For example, in recommendation system design (a la Netflix, Amazon, YouTube, etc.), one would ideally like the model used for inference and decision making to capture the full complexity of human preferences across the available choices. This is hard; it is, in a nutshell, one of the holy grails of the social sciences. As it turns out, these systems, as designed, fundamentally possess symmetries (anonymity) that lead to exchangeable models. This seemingly innocent property of a distribution has deep implications for its structure. Specifically, classical results such as those of de Finetti and Aldous-Hoover provide a remarkable characterization of such distributions: effectively, they admit simple and succinct graphical model representations (both results are sketched below). In the recommendation system example, this leads to extremely simple inference algorithms. More generally, it has resulted in a wide range of applications of such models, e.g. topic modeling in natural language processing and clustering algorithms. To make these approaches practical, computational methods developed in the context of graphical models (e.g. variational methods) are commonly used for inference and model learning.

Graphons emerge, on the one hand, as a characterization of exchangeable distributions and, on the other hand, as limiting objects for sequences of graphs. The metric with respect to which convergence is established captures “local” structural properties of the graph. This has led to remarkable connections between the study of graphons, local property testing, and sublinear-time algorithms. The relationship to exchangeable models has connected graphon estimation to modern applications such as community detection and recommendation systems. It is not surprising that algorithms from graphical models (e.g. belief propagation) have played an instrumental role in specific instances of graphon estimation such as community detection.
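For concreteness, here is the standard factorization referred to above (stated for reference; the notation is ours). For an undirected GM on a graph G = (V, E) with clique set \mathcal{C}(G), and for a directed GM (Bayesian network) on a DAG,

    p(x_1, \dots, x_n) = \frac{1}{Z} \prod_{C \in \mathcal{C}(G)} \psi_C(x_C)   (undirected),
    p(x_1, \dots, x_n) = \prod_{i \in V} p\big(x_i \mid x_{\mathrm{pa}(i)}\big)   (directed),

where Z is the normalizing constant (partition function), the \psi_C are nonnegative clique potentials, and \mathrm{pa}(i) denotes the parents of node i in the DAG.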
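De Finetti's theorem, in its simplest (binary) form, makes the exchangeability characterization concrete: if X_1, X_2, \dots is an infinite exchangeable sequence of \{0,1\}-valued random variables, then there is a mixing measure \mu on [0,1] such that

    P(X_1 = x_1, \dots, X_n = x_n) = \int_0^1 \prod_{i=1}^{n} \theta^{x_i} (1-\theta)^{1-x_i} \, \mu(d\theta).

That is, conditionally on a latent \theta \sim \mu the X_i are i.i.d. Bernoulli(\theta), so the exchangeable distribution has a simple latent-variable graphical model representation.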
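The Aldous-Hoover theorem plays the analogous role for exchangeable random graphs, and the function appearing in it is precisely a graphon. In a commonly quoted form: an exchangeable random graph is a mixture of W-random graphs, where for a measurable W : [0,1]^2 \to [0,1] the W-random graph is generated as

    u_1, u_2, \dots \overset{\text{i.i.d.}}{\sim} \mathrm{Uniform}[0,1], \qquad P\big((i,j) \in E \mid u_i, u_j\big) = W(u_i, u_j) \text{ independently for } i < j.

Graphon estimation asks to recover W (or the matrix of values W(u_i, u_j)) from a single sampled graph; community detection corresponds to a blockwise-constant W (the stochastic block model), and recommendation systems can be cast as matrix estimation in this framework.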

WORKSHOP (TBD)