ID: math/0410087

Convergence rates for posterior distributions and adaptive estimation

October 5, 2004

Tzee-Ming Huang
Mathematics
Statistics
Statistics Theory

The goal of this paper is to provide theorems on convergence rates of posterior distributions that can be applied to obtain good convergence rates in the context of density estimation as well as regression. We show how to choose priors so that the posterior distributions converge at the optimal rate without prior knowledge of the degree of smoothness of the density function or the regression function to be estimated.
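
For reference, the standard notion of rate used throughout this literature (stated here in generic notation, not necessarily the paper's own) is: the posterior contracts around the true function $f_0$ at rate $\epsilon_n$ if, for a sufficiently large constant $M$,
$$ \Pi\bigl(f : d(f, f_0) > M \epsilon_n \mid X_1, \dots, X_n\bigr) \to 0 \quad \text{in } P_{f_0}\text{-probability}, $$
where $d$ is typically the Hellinger or $L_1$ distance. Adaptivity then means that a single prior attains the minimax rate, e.g. $\epsilon_n \asymp n^{-\beta/(2\beta+1)}$ for $\beta$-smooth functions (possibly up to logarithmic factors), simultaneously over all smoothness levels $\beta$, without $\beta$ appearing in the prior.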

Similar papers

A note on Bayesian convergence rates under local prior support conditions

January 15, 2012

94% Match
Ryan Martin, Liang Hong, Stephen G. Walker
Statistics Theory

Bounds on Bayesian posterior convergence rates, assuming the prior satisfies both local and global support conditions, are now readily available. In this paper we explore, in the context of density estimation, Bayesian convergence rates assuming only local prior support conditions. Our results give optimal rates under minimal conditions using very simple arguments.

On convergence rates of Bayesian predictive densities and posterior distributions

September 29, 2012

92% Match
Ryan Martin, Liang Hong
Statistics Theory

Frequentist-style large-sample properties of Bayesian posterior distributions, such as consistency and convergence rates, are important considerations in nonparametric problems. In this paper we give an analysis of Bayesian asymptotics based primarily on predictive densities. Our analysis is unified in the sense that essentially the same approach can be taken to develop convergence rate results in iid, mis-specified iid, independent non-iid, and dependent data cases.
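
By the (posterior) predictive density the authors presumably mean the usual posterior mean density (generic notation, not necessarily theirs):
$$ \hat f_n(x) = \int f(x)\, \Pi\bigl(df \mid X_1, \dots, X_n\bigr), $$
so that convergence of $\hat f_n$ to the true density and contraction of the full posterior can be studied with the same machinery.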

Multivariate Density Estimation via Adaptive Partitioning (II): Posterior Concentration

August 19, 2015

90% Match
Linxi Liu, Wing Hung Wong
Statistics Theory

In this paper, we study a class of non-parametric density estimators under Bayesian settings. The estimators are piecewise constant functions on binary partitions. We analyze the concentration rate of the posterior distribution under a suitable prior, and demonstrate that the rate does not directly depend on the dimension of the problem. This paper can be viewed as an extension of a parallel work where the convergence rate of a related sieve MLE was established. Compared to t...
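
As a rough sketch of the estimator class described (assumed form, not taken from the paper): given a binary partition $A_1, \dots, A_m$ of the sample space obtained by recursive coordinate-wise splits, the candidate densities are histogram-type functions
$$ f(x) = \sum_{j=1}^{m} \frac{p_j}{|A_j|}\, \mathbf{1}\{x \in A_j\}, \qquad p_j \ge 0, \quad \sum_{j=1}^{m} p_j = 1, $$
with a prior placed on the partition and on the weights $(p_1, \dots, p_m)$.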

Convergence Rates of Nonparametric Posterior Distributions

April 17, 2008

90% Match
Yang Xing
Statistics Theory

We study the asymptotic behavior of posterior distributions. We present general posterior convergence rate theorems, which extend several results on posterior convergence rates provided by Ghosal and van der Vaart (2000), Shen and Wasserman (2001) and Walker, Lijoi and Prünster (2007). Our main tools are the Hausdorff $\alpha$-entropy introduced by Xing and Ranneby (2008) and a new notion of prior concentration, which is a slight improvement of the usual prior concentration p...
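
The "usual prior concentration" condition alluded to is, in its standard form (shown generically; the paper's refinement may differ), a lower bound on the prior mass of Kullback-Leibler-type neighborhoods of the true density $f_0$:
$$ \Pi\Bigl(f : \int f_0 \log\frac{f_0}{f} \le \epsilon_n^2, \ \int f_0 \Bigl(\log\frac{f_0}{f}\Bigr)^2 \le \epsilon_n^2\Bigr) \ \ge\ e^{-c\, n\epsilon_n^2} $$
for some constant $c > 0$, which, combined with an entropy bound on (most of) the prior's support, yields the contraction rate $\epsilon_n$.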

On the Convergence of Bayesian Regression Models

October 6, 2010

90% Match
Yuao Hu
Statistics Theory

We consider heteroscedastic nonparametric regression models, when both the mean function and variance function are unknown and to be estimated with nonparametric approaches. We derive convergence rates of posterior distributions for this model with different priors, including splines and Gaussian process priors. The results are based on the general ones on the rates of convergence of posterior distributions for independent, non-identically distributed observations, and are es...
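
The model in question is presumably of the form (generic notation, not necessarily the authors'):
$$ Y_i = m(x_i) + \sigma(x_i)\, \varepsilon_i, \qquad \varepsilon_i \overset{iid}{\sim} N(0,1), $$
with priors, such as spline or Gaussian process priors, placed on the unknown mean function $m$ and on (a positivity-preserving transform of) the unknown standard deviation function $\sigma$.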

Bayesian adaptation

July 8, 2014

90% Match
Catia Scricciolo
Methodology

Given the need for low-assumption inferential methods in infinite-dimensional settings, Bayesian adaptive estimation via a prior distribution that depends neither on the regularity of the function to be estimated nor on the sample size is valuable. We elucidate relationships among the main approaches followed to design priors for minimax-optimal rate-adaptive estimation, while shedding light on the underlying ideas.

Data-driven priors and their posterior concentration rates

April 19, 2016

90% Match
Ryan Martin, Stephen G. Walker
Statistics Theory

In high-dimensional problems, choosing a prior distribution such that the corresponding posterior has desirable practical and theoretical properties can be challenging. This raises the question: can the data be used to help choose a good prior? In this paper, we develop a general strategy for constructing a data-driven or empirical prior and sufficient conditions for the corresponding posterior distribution to achieve a certain concentration rate. The idea is that the prior sho...

Rates of convergence for the posterior distributions of mixtures of Betas and adaptive nonparametric estimation of the density

January 11, 2010

89% Match
Judith Rousseau
Statistics Theory

In this paper, we investigate the asymptotic properties of nonparametric Bayesian mixtures of Betas for estimating a smooth density on $[0,1]$. We consider a parametrization of Beta distributions in terms of mean and scale parameters and construct a mixture of these Betas in the mean parameter, while putting a prior on this scaling parameter. We prove that such Bayesian nonparametric models have good frequentist asymptotic properties. We determine the posterior rate of concen...
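
In the mean-scale parametrization referred to, a Beta density with mean $\mu \in (0,1)$ and scale $s > 0$ can be written as $\mathrm{Beta}(s\mu, s(1-\mu))$, and the resulting mixture model (sketched here in generic notation) is
$$ f_{P,s}(x) = \int_0^1 \mathrm{beta}\bigl(x;\, s\mu,\, s(1-\mu)\bigr)\, dP(\mu), \qquad x \in [0,1], $$
with a nonparametric prior on the mixing distribution $P$ over the mean parameter and a separate prior on the scale $s$.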

Convergence rates for Bayesian density estimation of infinite-dimensional exponential families

August 1, 2007

89% Match
Catia Scricciolo
Statistics Theory

We study the rate of convergence of posterior distributions in density estimation problems for log-densities in periodic Sobolev classes characterized by a smoothness parameter p. The posterior expected density provides a nonparametric estimation procedure attaining the optimal minimax rate of convergence under Hellinger loss if the posterior distribution achieves the optimal rate over certain uniformity classes. A prior on the density class of interest is induced by a prior ...
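
An infinite-dimensional exponential family of the kind described (written generically, with $\{\phi_j\}$ e.g. the trigonometric basis on $[0,1]$) takes the form
$$ f_\theta(x) = \exp\Bigl(\sum_{j} \theta_j \phi_j(x) - c(\theta)\Bigr), $$
where $c(\theta)$ normalizes $f_\theta$ to integrate to one; one common construction (not necessarily the paper's) induces a prior on the log-density by putting independent priors on the coefficients $\theta_j$, with the smoothness parameter $p$ governing how fast the $\theta_j$ must decay.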

Adaptive posterior convergence rates in non-linear latent variable models

January 26, 2017

89% Match
Shuang Zhou, Debdeep Pati, ..., David Dunson
Statistics Theory

Non-linear latent variable models have become increasingly popular in a variety of applications. However, there has been little study on theoretical properties of these models. In this article, we study rates of posterior contraction in univariate density estimation for a class of non-linear latent variable models where unobserved U(0,1) latent variables are related to the response variables via a random non-linear regression with an additive error. Our approach relies on cha...
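
The class of models described can be sketched as follows (assumed form with, e.g., Gaussian errors; details may differ in the paper):
$$ y_i = g(u_i) + \epsilon_i, \qquad u_i \overset{iid}{\sim} U(0,1), \quad \epsilon_i \overset{iid}{\sim} N(0, \sigma^2), $$
so that the induced density of $y$ is the mixture $f(y) = \int_0^1 \phi_\sigma\bigl(y - g(u)\bigr)\, du$, with a prior (for instance a Gaussian process) on the unknown transfer function $g$.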
