Similar papers 4
October 11, 2010
We consider the problem of estimating the predictive density of future observations from a non-parametric regression model. The density estimators are evaluated under Kullback--Leibler divergence and our focus is on establishing the exact asymptotics of minimax risk in the case of Gaussian errors. We derive the convergence rate and constant for minimax risk among Bayesian predictive densities under Gaussian priors and we show that this minimax risk is asymptotically equivalen...
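For orientation, a minimal sketch of the Kullback--Leibler predictive risk typically used in this setting, assuming observed data $Y$, a future observation $\tilde{Y}$ with true density $p(\cdot \mid \theta)$, and a predictive density estimator $\hat{p}(\cdot \mid Y)$ (the notation is illustrative, not necessarily the paper's):
$$ R(\theta, \hat{p}) = \mathrm{E}_{Y \mid \theta} \int p(\tilde{y} \mid \theta) \, \log \frac{p(\tilde{y} \mid \theta)}{\hat{p}(\tilde{y} \mid Y)} \, d\tilde{y}, \qquad \inf_{\hat{p}} \, \sup_{\theta} \, R(\theta, \hat{p}) \;\; \text{(minimax risk)}. $$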
June 23, 2014
We propose nonparametric estimation of divergence measures between continuous distributions. Our approach is based on plug-in kernel-type estimators of the density functions. We establish uniform-in-bandwidth consistency for the proposed estimators. As a consequence, their asymptotic 100% confidence intervals are also provided.
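A sketch of the plug-in idea, assuming i.i.d. samples $X_1,\dots,X_n \sim f$ and $Y_1,\dots,Y_m \sim g$ in $\mathbb{R}^d$, a kernel $K$, and a bandwidth $h$ (the specific divergence and regularity conditions are those of the paper and are not reproduced here):
$$ \hat{f}_h(x) = \frac{1}{n h^d} \sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right), \qquad \hat{D} = D(\hat{f}_h, \hat{g}_h), $$
where $D(\cdot,\cdot)$ denotes the divergence of interest evaluated at the kernel density estimates.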
April 8, 2011
The aim of this paper is to introduce new statistical criteria for estimation, suitable for inference in models with a common continuous support. This proposal is in line with a renewed interest in divergence-based inference tools embedding the most classical ones, such as maximum likelihood, Chi-square, or Kullback--Leibler. General pseudodistances with a decomposable structure are considered, which allow the definition of minimum pseudodistance estimators without using non...
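As an illustrative sketch only (not the paper's exact construction), a minimum pseudodistance estimator replaces the likelihood criterion by an estimated pseudodistance $\hat{d}$ between the model density $p_\theta$ and the data, represented by the empirical measure $P_n$:
$$ \hat{\theta}_n = \arg\min_{\theta \in \Theta} \, \hat{d}(p_\theta, P_n), $$
which reduces to the maximum likelihood estimator when $d$ is the Kullback--Leibler divergence.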
February 22, 2007
We consider an extension of $\epsilon$-entropy to a KL-divergence-based complexity measure for randomized density estimation methods. Based on this extension, we develop a general information-theoretic inequality that measures the statistical complexity of some deterministic and randomized density estimators. Consequences of the new inequality are presented. In particular, we show that this technique can lead to improvements of some classical results concerning the conv...
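For reference, the classical $\epsilon$-entropy extended here is the log covering number of a class $\mathcal{F}$ under a metric $\rho$, while the KL divergence between densities $p$ and $q$ is defined as usual (a background sketch, not the paper's exact complexity measure):
$$ H_\epsilon(\mathcal{F}, \rho) = \log N(\epsilon, \mathcal{F}, \rho), \qquad D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \, \log \frac{p(x)}{q(x)} \, dx, $$
where $N(\epsilon, \mathcal{F}, \rho)$ is the smallest number of $\rho$-balls of radius $\epsilon$ needed to cover $\mathcal{F}$.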
March 12, 2020
In this paper, we propose an estimator of the generalized maximum mean discrepancy between several distributions, constructed by modifying a naive estimator. Asymptotic normality is obtained for this estimator both under equality of these distributions and under the alternative hypothesis.
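For background, the standard two-sample maximum mean discrepancy with kernel $k$ is given below; the paper's generalization to several distributions and its modified estimator are not reproduced here:
$$ \mathrm{MMD}^2(P, Q) = \mathrm{E}\,k(X, X') - 2\,\mathrm{E}\,k(X, Y) + \mathrm{E}\,k(Y, Y'), \qquad X, X' \sim P, \;\; Y, Y' \sim Q \;\; \text{independent}. $$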
January 29, 2024
In this paper we provide an asymptotic theory for the symmetric version of the Kullback--Leibler (KL) divergence. We define an estimator for this divergence and study its asymptotic properties. In particular, we prove a Law of Large Numbers (LLN) and a Central Limit Theorem (CLT) for this estimator.
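The symmetric (Jeffreys) version of the KL divergence referred to above is, in its standard form for distributions $P$ and $Q$ with densities $p$ and $q$:
$$ D_{\mathrm{sym}}(P, Q) = D_{\mathrm{KL}}(P \,\|\, Q) + D_{\mathrm{KL}}(Q \,\|\, P) = \int \bigl(p(x) - q(x)\bigr) \log \frac{p(x)}{q(x)} \, dx. $$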
August 3, 2011
In this note we prove the dual representation formula of the divergence between two distributions in a parametric model. Resulting estimators for the divergence as well as for the parameter are derived. These estimators do not make use of any grouping or smoothing. It is proved that all differentiable divergences induce the same estimator of the parameter on any regular exponential family, which is precisely the MLE.
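As an illustration of such dual (variational) representations, the Donsker--Varadhan formula for the KL divergence reads as follows; the paper's formula covers general differentiable divergences in a parametric model, and this special case is shown only for orientation:
$$ D_{\mathrm{KL}}(P \,\|\, Q) = \sup_{f} \left\{ \mathrm{E}_P[f(X)] - \log \mathrm{E}_Q\!\left[e^{f(X)}\right] \right\}, $$
the supremum being taken over bounded measurable functions $f$.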
September 12, 2012
Entropy-type integral functionals of densities are widely used in mathematical statistics, information theory, and computer science. Examples include measures of closeness between distributions (e.g., density power divergence) and uncertainty characteristics for a random variable (e.g., R\'enyi entropy). In this paper, we study U-statistic estimators for a class of such functionals. The estimators are based on $\epsilon$-close vector observations in the corresponding independent...
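Two standard examples of such entropy-type integral functionals, written for a density $f$ (the exact class of functionals studied in the paper may differ), are the R\'enyi entropy and the generic integral functional
$$ H_q(f) = \frac{1}{1 - q} \log \int f^q(x)\, dx \quad (q \neq 1), \qquad T_\phi(f) = \int \phi\bigl(f(x)\bigr)\, dx, $$
where $\phi$ is a given smooth function; the power case $\phi(u) = u^q$ underlies the R\'enyi entropy up to the logarithmic transform.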
October 20, 2021
Under mild conditions, the strong consistency of the Bayes estimator of the density is shown. Moreover, the Bayes risk (for some common loss functions) of the Bayes estimator of the density (i.e., the posterior predictive density) tends to zero as the sample size goes to $\infty$. In passing, a similar result is obtained for the estimation of the sampling distribution.
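A minimal sketch of the posterior predictive density discussed above, assuming i.i.d. observations $X_1, \dots, X_n$ from a sampling density $f(\cdot \mid \theta)$ with prior $\pi$:
$$ \hat{f}_n(x) = \int f(x \mid \theta)\, \pi(\theta \mid X_1, \dots, X_n)\, d\theta, $$
i.e., the posterior mean of the sampling density, which is the Bayes estimator of the density under common loss functions.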
February 12, 2017
This paper considers the problem of inliers and empty cells, and the resulting relative inefficiency of estimation, under pure samples from a discrete population when the sample size is small. Many minimum divergence estimators in the $S$-divergence family, although possessing very strong outlier stability properties, often have very poor small-sample efficiency in the presence of inliers, and some are not even defined in the presence of a single empty cell; this limits...
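For orientation only: a well-known divergence connected with the $S$-divergence family is the density power divergence with tuning parameter $\alpha > 0$, between a data density $g$ and a model density $f$ (the full two-parameter $S$-divergence form is not reproduced here):
$$ d_\alpha(g, f) = \int \left\{ f^{1+\alpha}(x) - \left(1 + \tfrac{1}{\alpha}\right) g(x)\, f^{\alpha}(x) + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \right\} dx. $$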