ID: 1601.04650

Statistical Mechanics of High-Dimensional Inference

January 18, 2016


Similar papers 2

Statistical mechanics of low-rank tensor decomposition

October 23, 2018

88% Match
Jonathan Kadmon, Surya Ganguli
Machine Learning
Machine Learning
Neurons and Cognition

Often, large, high dimensional datasets collected across multiple modalities can be organized as a higher order tensor. Low-rank tensor decomposition then arises as a powerful and widely used tool to discover simple low dimensional structures underlying such data. However, we currently lack a theoretical understanding of the algorithmic behavior of low-rank tensor decompositions. We derive Bayesian approximate message passing (AMP) algorithms for recovering arbitrarily shaped...
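For orientation, the simplest instance of the technique this abstract concerns is a rank-1 CP decomposition fit by alternating least squares (ALS) — a generic textbook baseline, not the Bayesian AMP algorithm the paper derives; all names below are illustrative.

```python
# Minimal sketch: rank-1 CP decomposition of a 3-way tensor by alternating
# least squares (ALS). Generic baseline, not the paper's AMP algorithm.

def rank1_als(T, iters=50):
    """Fit T[i][j][k] ~ a[i] * b[j] * c[k] by alternating least squares."""
    I, J, K = len(T), len(T[0]), len(T[0][0])
    a, b, c = [1.0] * I, [1.0] * J, [1.0] * K
    for _ in range(iters):
        # Each factor update is the exact least-squares solution with the
        # other two factors held fixed.
        s = sum((b[j] * c[k]) ** 2 for j in range(J) for k in range(K))
        a = [sum(T[i][j][k] * b[j] * c[k]
                 for j in range(J) for k in range(K)) / s for i in range(I)]
        s = sum((a[i] * c[k]) ** 2 for i in range(I) for k in range(K))
        b = [sum(T[i][j][k] * a[i] * c[k]
                 for i in range(I) for k in range(K)) / s for j in range(J)]
        s = sum((a[i] * b[j]) ** 2 for i in range(I) for j in range(J))
        c = [sum(T[i][j][k] * a[i] * b[j]
                 for i in range(I) for j in range(J)) / s for k in range(K)]
    return a, b, c
```

On a noiseless rank-1 tensor this recovers the factors up to rescaling in a single sweep; higher ranks and noise are where the statistical-mechanics analysis of the algorithmic behavior becomes relevant.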


Fundamental Limits of PhaseMax for Phase Retrieval: A Replica Analysis

August 10, 2017

88% Match
Oussama Dhifallah, Yue M. Lu
Information Theory

We consider a recently proposed convex formulation, known as the PhaseMax method, for solving the phase retrieval problem. Using the replica method from statistical mechanics, we analyze the performance of PhaseMax in the high-dimensional limit. Our analysis predicts the exact asymptotic performance of PhaseMax. In particular, we show that a sharp phase transition phenomenon takes place, with a simple analytical formula characterizing the phase transition boundary. Thi...


High Dimensional Expectation-Maximization Algorithm: Statistical Optimization and Asymptotic Normality

December 30, 2014

87% Match
Zhaoran Wang, Quanquan Gu, ... , Han Liu
Machine Learning

We provide a general theory of the expectation-maximization (EM) algorithm for inferring high dimensional latent variable models. In particular, we make two contributions: (i) For parameter estimation, we propose a novel high dimensional EM algorithm which naturally incorporates sparsity structure into parameter estimation. With an appropriate initialization, this algorithm converges at a geometric rate and attains an estimator with the (near-)optimal statistical rate of conv...
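The classical (low-dimensional) EM iteration that this paper extends alternates an expectation and a maximization step. A minimal sketch for a two-component, unit-variance 1-D Gaussian mixture — assuming nothing from the paper itself (no sparsity-incorporating step; illustrative names only):

```python
import math

def em_gmm2(x, iters=100):
    """EM for a two-component 1-D Gaussian mixture with unit variances."""
    mu = [min(x), max(x)]          # crude but deterministic initialization
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of component 0 for each point.
        r0 = []
        for xi in x:
            p0 = pi[0] * math.exp(-0.5 * (xi - mu[0]) ** 2)
            p1 = pi[1] * math.exp(-0.5 * (xi - mu[1]) ** 2)
            r0.append(p0 / (p0 + p1))
        # M-step: responsibility-weighted means and mixing weights.
        n0 = sum(r0)
        mu = [sum(r * xi for r, xi in zip(r0, x)) / n0,
              sum((1 - r) * xi for r, xi in zip(r0, x)) / (len(x) - n0)]
        pi = [n0 / len(x), 1.0 - n0 / len(x)]
    return mu, pi
```

The high-dimensional variant proposed in the paper additionally enforces sparsity on the parameter estimate after each M-step; the sketch above only shows the vanilla alternation.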


Statistical physics of linear and bilinear inference problems

July 3, 2016

87% Match
Christophe Schülke
Information Theory

The recent development of compressed sensing has led to spectacular advances in the understanding of sparse linear estimation problems as well as in algorithms to solve them. It has also triggered a new wave of developments in the related fields of generalized linear and bilinear inference problems that have very diverse applications in signal processing and are furthermore a building block of deep neural networks. These problems have in common that they combine a linear mix...
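For orientation, the simplest baseline in the sparse linear estimation family is the LASSO solved by iterative soft-thresholding (ISTA) — a generic textbook method, not the message-passing algorithms this thesis develops; names and dimensions below are illustrative.

```python
# Minimal sketch: ISTA for the LASSO, 0.5*||y - A x||^2 + lam*||x||_1.

def soft(z, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return max(z - t, 0.0) if z > 0 else min(z + t, 0.0)

def ista(A, y, lam, step, iters=500):
    """Proximal gradient descent on the LASSO objective."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # Residual r = A x - y, then gradient of the smooth part g = A^T r.
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # Gradient step followed by soft-thresholding.
        x = [soft(x[j] - step * g[j], step * lam) for j in range(n)]
    return x
```

For convergence, `step` should be at most 1/L, where L is the largest eigenvalue of AᵀA.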


Computational and Statistical Tradeoffs via Convex Relaxation

November 5, 2012

87% Match
Venkat Chandrasekaran, Michael I. Jordan
Statistics Theory
Information Theory
Optimization and Control

In modern data analysis, one is frequently faced with statistical inference problems involving massive datasets. Processing such large datasets is usually viewed as a substantial computational challenge. However, if data are a statistician's main resource then access to more data should be viewed as an asset rather than as a burden. In this paper we describe a computational framework based on convex relaxation to reduce the computational complexity of an inference procedure w...


High Dimensional Statistical Inference and Random Matrices

November 19, 2006

87% Match
Iain M. Johnstone
Statistics Theory
Probability

Multivariate statistical analysis is concerned with observations on several variables which are thought to possess some degree of inter-dependence. Driven by problems in genetics and the social sciences, it first flowered in the earlier half of the last century. Subsequently, random matrix theory (RMT) developed, initially within physics, and more recently widely in mathematics. While some of the central objects of study in RMT are identical to those of multivariate statistic...


An Impossibility Result for High Dimensional Supervised Learning

January 29, 2013

87% Match
Mohammad Hossein Rohban, Prakash Ishwar, Birant Orten, ... , Venkatesh Saligrama
Machine Learning

We study high-dimensional asymptotic performance limits of binary supervised classification problems where the class conditional densities are Gaussian with unknown means and covariances and the number of signal dimensions scales faster than the number of labeled training samples. We show that the Bayes error, namely the minimum attainable error probability with complete distributional knowledge and equally likely classes, can be arbitrarily close to zero and yet the limiting...


Information bottleneck theory of high-dimensional regression: relevancy, efficiency and optimality

August 8, 2022

87% Match
Vudtiwat Ngampruetikorn, David J. Schwab
Information Theory
Statistical Mechanics
Machine Learning
Data Analysis, Statistics and Probability

Avoiding overfitting is a central challenge in machine learning, yet many large neural networks readily achieve zero training loss. This puzzling contradiction necessitates new approaches to the study of overfitting. Here we quantify overfitting via residual information, defined as the bits in fitted models that encode noise in training data. Information efficient learning algorithms minimize residual information while maximizing the relevant bits, which are predictive of the...


Bayesian inference in high-dimensional models

January 12, 2021

87% Match
Sayantan Banerjee, Ismaël Castillo, Subhashis Ghosal
Statistics Theory

Models with dimension more than the available sample size are now commonly used in various applications. A sensible inference is possible using a lower-dimensional structure. In regression problems with a large number of predictors, the model is often assumed to be sparse, with only a few predictors active. Interdependence between a large number of variables is succinctly described by a graphical model, where variables are represented by nodes on a graph and an edge between t...


Sparse Representations, Inference and Learning

June 28, 2023

87% Match
Clarissa Lauditi, Emanuele Troiani, Marc Mézard
Statistical Mechanics
Information Theory
Machine Learning

In recent years statistical physics has proven to be a valuable tool to probe into large dimensional inference problems such as the ones occurring in machine learning. Statistical physics provides analytical tools to study fundamental limitations in their solutions and proposes algorithms to solve individual instances. In these notes, based on the lectures by Marc Mézard in 2022 at the summer school in Les Houches, we will present a general framework that can be used in a l...
