ID: 1905.07499

LR-GLM: High-Dimensional Bayesian Inference Using Low-Rank Data Approximations

May 17, 2019


Similar papers 2

New frontiers in Bayesian modeling using the INLA package in R

July 24, 2019

89% Match
Janet van Niekerk, Haakon Bakka, ..., Olaf Schenk
Methodology
Computation

The INLA package provides a tool for computationally efficient Bayesian modeling and inference for various widely used models, more formally the class of latent Gaussian models. It is a non-sampling based framework which provides approximate results for Bayesian inference, using sparse matrices. The swift uptake of this framework for Bayesian modeling is rooted in the computational efficiency of the approach and catalyzed by the demand presented by the big data era. In this p...
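
For orientation, the model class INLA targets can be written as a three-stage hierarchy; this is the standard latent-Gaussian-model template, not a specification taken from the paper:

```latex
% Standard latent Gaussian model hierarchy (generic template):
% observations given the latent field, a Gaussian latent field with sparse
% precision Q(theta), and a low-dimensional hyperparameter vector theta.
\begin{align*}
  y_i \mid \eta_i, \theta &\sim \pi(y_i \mid \eta_i, \theta),
    && \text{(likelihood, e.g.\ Poisson or binomial)} \\
  x \mid \theta &\sim \mathcal{N}\big(0,\, Q(\theta)^{-1}\big),
    \quad \eta = A x,
    && \text{(latent Gaussian field, sparse precision } Q\text{)} \\
  \theta &\sim \pi(\theta).
    && \text{(few hyperparameters)}
\end{align*}
```

INLA approximates the posterior marginals $\pi(x_i \mid y)$ and $\pi(\theta \mid y)$ with nested Laplace approximations, exploiting the sparsity of $Q(\theta)$ instead of sampling.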


Bayesian sparse multiple regression for simultaneous rank reduction and variable selection

December 2, 2016

89% Match
Antik Chakraborty, Anirban Bhattacharya, Bani K. Mallick
Methodology
Statistics Theory

We develop a Bayesian methodology aimed at simultaneously estimating low-rank and row-sparse matrices in a high-dimensional multiple-response linear regression model. We consider a carefully devised shrinkage prior on the matrix of regression coefficients which obviates the need to specify a prior on the rank, and shrinks the regression matrix towards low-rank and row-sparse structures. We provide theoretical support to the proposed methodology by proving minimax optimality o...
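
A minimal way to see how one coefficient matrix can carry both structures at once (illustrative notation; the paper's shrinkage prior is more elaborate than this bare factorization):

```latex
% Multiple-response regression with a factorized coefficient matrix:
% the rank of C is capped by r, and zeroing row j of B drops covariate j
% from all q responses simultaneously (row sparsity).
\[
  Y = X C + E, \qquad C = B A^{\top}, \quad
  B \in \mathbb{R}^{p \times r},\ A \in \mathbb{R}^{q \times r},\ r \ll \min(p, q).
\]
```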


Bayesian Variable Selection in a Million Dimensions

August 2, 2022

89% Match
Martin Jankowiak
Methodology
Machine Learning
Computation

Bayesian variable selection is a powerful tool for data analysis, as it offers a principled method for variable selection that accounts for prior information and uncertainty. However, wider adoption of Bayesian variable selection has been hampered by computational challenges, especially in difficult regimes with a large number of covariates P or non-conjugate likelihoods. To scale to the large P regime we introduce an efficient MCMC scheme whose cost per iteration is sublinea...
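
As a point of reference, the classical approach whose cost the abstract targets is single-site Gibbs over inclusion indicators, which touches all P covariates per sweep. A minimal sketch under a conjugate Gaussian prior (all names, priors, and hyperparameters are illustrative assumptions, not the paper's algorithm):

```python
# Hypothetical baseline: single-site Gibbs for spike-and-slab variable
# selection in a conjugate linear model. Naive O(P) sweeps; the paper's
# contribution is avoiding exactly this scaling.
import numpy as np

def log_marginal(y, X, gamma, sigma2=1.0, tau2=1.0):
    """log p(y | gamma) with beta ~ N(0, tau2 I) and noise N(0, sigma2 I)."""
    Xg = X[:, gamma.astype(bool)]
    K = sigma2 * np.eye(len(y)) + tau2 * (Xg @ Xg.T)
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (logdet + y @ np.linalg.solve(K, y))

def gibbs_sweep(y, X, gamma, h=0.05, rng=None):
    """One sweep of single-site Gibbs updates on the inclusion vector gamma."""
    rng = rng or np.random.default_rng()
    for j in range(X.shape[1]):
        logp = np.empty(2)
        for val in (0, 1):
            gamma[j] = val
            prior = np.log(h) if val == 1 else np.log1p(-h)
            logp[val] = prior + log_marginal(y, X, gamma)
        p_include = 1.0 / (1.0 + np.exp(logp[0] - logp[1]))
        gamma[j] = rng.random() < p_include
    return gamma
```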


Efficient Computation of High-Dimensional Penalized Generalized Linear Mixed Models by Latent Factor Modeling of the Random Effects

May 14, 2023

89% Match
Hillary M. Heiling, Naim U. Rashid, Quefeng Li, Xianlu L. Peng, ..., Joseph G. Ibrahim
Methodology
Computation

Modern biomedical datasets are increasingly high dimensional and exhibit complex correlation structures. Generalized Linear Mixed Models (GLMMs) have long been employed to account for such dependencies. However, proper specification of the fixed and random effects in GLMMs is increasingly difficult in high dimensions, and computational complexity grows with increasing dimension of the random effects. We present a novel reformulation of the GLMM using a factor model decomposit...
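
The reformulation the abstract describes can be sketched as follows (notation is illustrative, not the paper's):

```latex
% GLMM with factor-model random effects: the q-dimensional random effect
% gamma_i is replaced by B theta_i with a low-dimensional latent factor
% theta_i, so integration over random effects is r-dimensional, r << q.
\[
  g\big(\mathbb{E}[y_{ij} \mid \gamma_i]\big)
    = x_{ij}^{\top}\beta + z_{ij}^{\top}\gamma_i,
  \qquad \gamma_i = B\,\theta_i, \quad \theta_i \sim \mathcal{N}(0, I_r),\ r \ll q.
\]
```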


Parallel Gaussian Process Regression for Big Data: Low-Rank Representation Meets Markov Approximation

November 17, 2014

89% Match
Kian Hsiang Low, Jiangbo Yu, ..., Patrick Jaillet
Machine Learning
Distributed, Parallel, and Cluster Computing

The expressive power of a Gaussian process (GP) model comes at a cost of poor scalability in the data size. To improve its scalability, this paper presents a low-rank-cum-Markov approximation (LMA) of the GP model that is novel in leveraging the dual computational advantages stemming from complementing a low-rank approximate representation of the full-rank GP based on a support set of inputs with a Markov approximation of the resulting residual process; the latter approximati...
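
The "support set" ingredient here is in the spirit of classical subset-of-regressors / Nystrom approximations. A minimal sketch of that generic ingredient only, not the paper's LMA (function names and the RBF kernel are assumptions):

```python
# Illustrative low-rank GP posterior mean via a small support set S:
# solve an m x m system (m = |S|) instead of the full n x n one.
import numpy as np

def rbf(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between row-stacked inputs A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def nystrom_posterior_mean(X, y, Xstar, S, noise=0.1):
    """GP posterior mean under the subset-of-regressors approximation."""
    Kss = rbf(S, S) + 1e-8 * np.eye(len(S))  # support-set covariance (jitter)
    Ksn = rbf(S, X)                          # support set vs. training inputs
    A = noise * Kss + Ksn @ Ksn.T            # m x m system matrix
    w = np.linalg.solve(A, Ksn @ y)
    return rbf(Xstar, S) @ w
```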


Scalable expectation propagation for generalized linear models

July 2, 2024

89% Match
Niccolò Anceschi, Augusto Fasano, ..., Giovanni Rebaudo
Computation

Generalized linear models (GLMs) arguably represent the standard approach for statistical regression beyond the Gaussian likelihood scenario. When Bayesian formulations are employed, the general absence of a tractable posterior distribution has motivated the development of deterministic approximations, which are generally more scalable than sampling techniques. Among them, expectation propagation (EP) showed extreme accuracy, usually higher than many variational Bayes solutio...
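
For reference, the generic EP recursion these methods scale up (standard EP, not the paper's specific algorithm):

```latex
% EP approximates each likelihood factor by a Gaussian "site" and refines
% sites by moment matching against the cavity distribution:
\[
  p(\beta \mid y) \propto p(\beta) \prod_{i=1}^{n} p(y_i \mid \beta)
  \;\approx\; q(\beta) \propto p(\beta) \prod_{i=1}^{n} \tilde{t}_i(\beta).
\]
% Update for site i: form the cavity q_{-i} \propto q / \tilde{t}_i, match
% the moments of the tilted distribution q_{-i}(\beta)\, p(y_i \mid \beta),
% and set \tilde{t}_i to the ratio of the new Gaussian to q_{-i}.
```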


A Variational Approach for Modeling High-dimensional Spatial Generalized Linear Mixed Models

February 24, 2024

89% Match
Jin Hyung Lee, Ben Seiyon Lee
Methodology

Gaussian and discrete non-Gaussian spatial datasets are prevalent across many fields such as public health, ecology, geosciences, and social sciences. Bayesian spatial generalized linear mixed models (SGLMMs) are a flexible class of models designed for these data, but SGLMMs do not scale well, even to moderately large datasets. State-of-the-art scalable SGLMMs (i.e., basis representations or sparse covariance/precision matrices) require posterior sampling via Markov chain Mon...
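
The generic objective that variational alternatives to MCMC optimize is the standard evidence lower bound; the paper's specific approximating family is not shown here:

```latex
% Variational inference replaces posterior sampling with optimization of
% the ELBO over a tractable family q(w) approximating the posterior:
\[
  \log p(y) \;\geq\; \mathbb{E}_{q(w)}\big[\log p(y \mid w)\big]
  \;-\; \mathrm{KL}\big(q(w)\,\|\,p(w)\big).
\]
```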


Laplace Approximation in High-dimensional Bayesian Regression

March 28, 2015

89% Match
Rina Foygel Barber, Mathias Drton, Kean Ming Tan
Statistics Theory

We consider Bayesian variable selection in sparse high-dimensional regression, where the number of covariates $p$ may be large relative to the sample size $n$, but at most a moderate number $q$ of covariates are active. Specifically, we treat generalized linear models. For a single fixed sparse model with well-behaved prior distribution, classical theory proves that the Laplace approximation to the marginal likelihood of the model is accurate for sufficiently large sample si...
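
The approximation under study has the standard form: for a fixed sparse model $M$ with $q$ active covariates and posterior mode $\hat{\theta}$,

```latex
% Laplace approximation to the marginal likelihood of a fixed model M:
% expand the log-posterior to second order around its mode.
\[
  \log p(y \mid M)
  \;\approx\; \log p(y \mid \hat{\theta}, M) + \log p(\hat{\theta} \mid M)
  + \frac{q}{2}\log(2\pi) - \frac{1}{2}\log\det H,
\]
% where H is the Hessian of the negative log-posterior at \hat{\theta}.
```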


The Kernel Interaction Trick: Fast Bayesian Discovery of Pairwise Interactions in High Dimensions

May 16, 2019

88% Match
Raj Agrawal, Jonathan H. Huggins, ..., Tamara Broderick
Computation
Machine Learning
Methodology

Discovering interaction effects on a response of interest is a fundamental problem faced in biology, medicine, economics, and many other scientific disciplines. In theory, Bayesian methods for discovering pairwise interactions enjoy many benefits such as coherent uncertainty quantification, the ability to incorporate background knowledge, and desirable shrinkage properties. In practice, however, Bayesian methods are often computationally intractable for even moderate-dimensio...
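
The standard polynomial-kernel identity behind tricks of this kind (the paper's construction is more refined, but this shows why kernels make pairwise interactions cheap):

```latex
% A degree-2 polynomial kernel implicitly spans all pairwise interactions
% x_j x_k while each kernel evaluation costs only O(p):
\[
  \big(1 + x^{\top}x'\big)^{2}
  = 1 + 2\sum_{j=1}^{p} x_j x'_j
    + \sum_{j=1}^{p}\sum_{k=1}^{p} (x_j x_k)(x'_j x'_k).
\]
```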


Light and Widely Applicable MCMC: Approximate Bayesian Inference for Large Datasets

March 13, 2015

88% Match
Florian Maire, Nial Friel, Pierre Alquier
Methodology

Light and Widely Applicable (LWA-) MCMC is a novel approximation of the Metropolis-Hastings kernel targeting a posterior distribution defined on a large number of observations. Inspired by Approximate Bayesian Computation, we design a Markov chain whose transition makes use of an unknown but fixed fraction of the available data, where the random choice of sub-sample is guided by the fidelity of this sub-sample to the observed data, as measured by summary (or sufficient) stat...
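
A loose sketch of the general subsample-as-state idea, assuming a flat prior and a Gaussian fidelity kernel; every detail below (statistic, epsilon, proposals) is an illustrative assumption rather than the paper's construction:

```python
# Illustrative subsample-based MH step: the chain state is (theta, idx),
# where idx selects a fixed-size subsample, and the target weights each
# subsample by how closely its summary statistic matches the full data's.
import numpy as np

def lwa_step(theta, idx, data, loglik, stat, eps, prop_sd, rng):
    full_stat = stat(data)

    def log_target(th, i):
        # Flat prior assumed; fidelity penalizes subsample/full-data mismatch.
        sub = data[i]
        fidelity = -np.sum((stat(sub) - full_stat) ** 2) / (2 * eps**2)
        return loglik(th, sub) + fidelity

    # (1) Random-walk Metropolis update of theta on the current subsample.
    th_new = theta + prop_sd * rng.standard_normal(np.shape(theta))
    if np.log(rng.random()) < log_target(th_new, idx) - log_target(theta, idx):
        theta = th_new
    # (2) Independence proposal of a fresh subsample of the same size.
    idx_new = rng.choice(len(data), size=len(idx), replace=False)
    if np.log(rng.random()) < log_target(theta, idx_new) - log_target(theta, idx):
        idx = idx_new
    return theta, idx
```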
