ID: math/0611193

A note on the asymptotic distribution of the minimum density power divergence estimator

November 7, 2006


Similar papers 3

The Logarithmic Super Divergence and Statistical Inference : Asymptotic Properties

June 9, 2014

86% Match
Avijit Maji, Abhik Ghosh, Ayanendranath Basu
Statistics Theory

Statistical inference based on divergence measures has a long history. Recently, Maji, Ghosh and Basu (2014) have introduced a general family of divergences called the logarithmic super divergence (LSD) family. This family acts as a superfamily for both the logarithmic power divergence (LPD) family (e.g., Renyi, 1961) and the logarithmic density power divergence (LDPD) family introduced by Jones et al. (2001). In this paper we describe the asymptotic properties of the infere...


The B-Exponential Divergence and its Generalizations with Applications to Parametric Estimation

October 27, 2018

86% Match
Taranga Mukherjee, Abhijit Mandal, Ayanendranath Basu
Methodology

In this paper a new family of minimum divergence estimators based on the Bregman divergence is proposed, where the defining convex function has an exponential nature. These estimators avoid the necessity of using an intermediate kernel density and many of them also have strong robustness properties. It is further demonstrated that the proposed approach can be extended to construct a class of generalized estimating equations, where the pool of the resultant estimators encompas...
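As background for the abstract above (standard notation, not taken from the paper): a Bregman divergence between densities $g$ and $f$, generated by a convex function $\phi$, is commonly written as

```latex
B_\phi(g, f) = \int \left[ \phi\bigl(g(x)\bigr) - \phi\bigl(f(x)\bigr)
  - \phi'\bigl(f(x)\bigr)\bigl(g(x) - f(x)\bigr) \right] dx .
```

Taking $\phi(t) = t^{1+\alpha}$ recovers, up to the factor $1/\alpha$, the density power divergence; the paper above studies the case where the generating convex function has an exponential form.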


Density power divergence for general integer-valued time series with multivariate exogenous covariate

June 22, 2020

86% Match
Mamadou Lamine Diop, William Kengne
Statistics Theory

In this article, we study a robust estimation method for a general class of integer-valued time series models. The conditional distribution of the process belongs to a broad class of distributions and, unlike the classical autoregressive framework, the conditional mean of the process also depends on a multivariate exogenous covariate. We derive a robust inference procedure based on the minimum density power divergence. Under certain regularity conditions, we establish that...
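For reference, the density power divergence (Basu et al., 1998) underlying the minimum-DPD procedure described above is

```latex
d_\alpha(g, f_\theta) = \int \left\{ f_\theta^{1+\alpha}
  - \Bigl(1 + \tfrac{1}{\alpha}\Bigr) f_\theta^{\alpha}\, g
  + \tfrac{1}{\alpha}\, g^{1+\alpha} \right\} dx , \qquad \alpha > 0 ,
```

where $g$ is the true density and $f_\theta$ the model. Since the last term does not involve $\theta$, the estimator minimizes the empirical objective $H_n(\theta) = \int f_\theta^{1+\alpha}\,dx - (1 + 1/\alpha)\, n^{-1} \sum_{i=1}^n f_\theta^{\alpha}(X_i)$, with $\alpha \to 0$ recovering maximum likelihood.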


Divergence measures estimation and its asymptotic normality theory in the discrete case

December 12, 2018

86% Match
Ba Amadou Diadie, Gane Samb Lo
Statistics Theory

In this paper we provide the asymptotic theory of general $\phi$-divergence measures, which include the most common divergence measures: the Renyi and Tsallis families and the Kullback-Leibler measure. We are interested in divergence measures in the discrete case. One-sided and two-sided statistical tests are derived as well as symmetrized estimators. Almost-sure rates of convergence and an asymptotic normality theorem are obtained in the general case, and next particulariz...
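For the discrete case considered above, a $\phi$-divergence between probability mass functions $p = (p_j)$ and $q = (q_j)$ takes the standard form

```latex
D_\phi(p, q) = \sum_j q_j \,\phi\!\left(\frac{p_j}{q_j}\right),
\qquad \phi \text{ convex},\ \phi(1) = 0 .
```

The choice $\phi(t) = t\log t$ gives the Kullback-Leibler measure, while power choices of $\phi$ generate the Tsallis family; the Rényi divergence is a monotone transform of the latter.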


Minimizing robust density power-based divergences for general parametric density models

July 11, 2023

86% Match
Akifumi Okuno
Methodology
Machine Learning

Density power divergence (DPD) is designed to robustly estimate the underlying distribution of observations, in the presence of outliers. However, DPD involves an integral of the power of the parametric density models to be estimated; the explicit form of the integral term can be derived only for specific densities, such as normal and exponential densities. While we may perform a numerical integration for each iteration of the optimization algorithms, the computational comple...
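To make the integral issue concrete, here is a minimal sketch (not the paper's method) of minimum-DPD estimation for a normal location model, one of the cases where the integral term has a closed form. The grid search and all names (`dpd_objective`, `mdpd_mu`, the outlier values) are purely illustrative.

```python
import math
import random

def dpd_objective(mu, data, alpha, sigma=1.0):
    """Empirical DPD objective H_n(mu) for a N(mu, sigma^2) model with
    known sigma.  For the normal density the integral of f^(1+alpha)
    has the closed form (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha)."""
    integral = (2 * math.pi * sigma**2) ** (-alpha / 2) / math.sqrt(1 + alpha)

    def f(x):  # normal density at x
        return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / math.sqrt(
            2 * math.pi * sigma**2
        )

    avg = sum(f(x) ** alpha for x in data) / len(data)
    return integral - (1 + 1 / alpha) * avg

random.seed(0)
# 100 inliers from N(0, 1) contaminated by 10 gross outliers at 50.
data = [random.gauss(0.0, 1.0) for _ in range(100)] + [50.0] * 10

# Crude grid-search minimisation over mu (sigma treated as known).
grid = [i / 100 for i in range(-200, 601)]
mdpd_mu = min(grid, key=lambda m: dpd_objective(m, data, alpha=0.5))
mle_mu = sum(data) / len(data)  # the MLE (sample mean) is dragged by outliers
```

The DPD estimate stays near the inlier centre because the outliers contribute essentially nothing to the $f^\alpha$ average, while the sample mean is pulled far toward the contamination. For densities without a closed-form integral, this integral would have to be computed numerically at every candidate $\theta$, which is the computational issue the paper above addresses.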


Robust Empirical Bayes Small Area Estimation with Density Power Divergence

February 22, 2017

86% Match
Shonosuke Sugasawa
Methodology

A two-stage normal hierarchical model called the Fay--Herriot model and the empirical Bayes estimator are widely used to provide indirect and model-based estimates of means in small areas. However, the performance of the empirical Bayes estimator might be poor when the assumed normal distribution is misspecified. In this article, we propose a simple modification by using density power divergence and suggest a new robust empirical Bayes small area estimator. The mean squared e...


Characterizing Logarithmic Bregman Functions

May 12, 2021

86% Match
Souvik Ray, Subrata Pal, ... , Ayanendranath Basu
Statistics Theory

Minimum divergence procedures based on the density power divergence and the logarithmic density power divergence have been extremely popular and successful in generating inference procedures which combine a high degree of model efficiency with strong outlier stability. Such procedures are always preferable in practical situations over procedures which achieve their robustness at a major cost of efficiency or are highly efficient but have poor robustness properties. The densit...


Non-Parametric Maximum Likelihood Density Estimation and Simulation-Based Minimum Distance Estimators

December 17, 2010

85% Match
Florian Gach, Benedikt M. Pötscher
Statistics Theory
Probability
Methodology

Indirect inference estimators (i.e., simulation-based minimum distance estimators) in a parametric model that are based on auxiliary non-parametric maximum likelihood density estimators are shown to be asymptotically normal. If the parametric model is correctly specified, it is furthermore shown that the asymptotic variance-covariance matrix equals the inverse of the Fisher-information matrix. These results are based on uniform-in-parameters convergence rates and a uniform-in...


Towards a better understanding of the dual representation of phi divergences

June 6, 2015

85% Match
Diaa Al Mohamad
Methodology

The aim of this paper is to study different estimation procedures based on $\varphi$-divergences. The dual representation of $\varphi$-divergences based on the Fenchel-Legendre duality is the main interest of this study. It provides a way to estimate $\varphi$-divergences by a simple plug-in of the empirical distribution without any smoothing technique. The resulting estimators are thoroughly studied theoretically and with simulations showing that the so-called minimum $\varphi$-...
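For orientation (standard notation, not taken from the abstract), the Fenchel-Legendre dual representation referred to above is typically written, for $D_\varphi(Q, P) = \int \varphi(dQ/dP)\,dP$, as

```latex
D_\varphi(Q, P) = \sup_{f} \left\{ \int f \, dQ - \int \varphi^*(f) \, dP \right\},
\qquad \varphi^*(s) = \sup_{t} \{ st - \varphi(t) \} .
```

Replacing $Q$ by the empirical distribution turns the first integral into a sample average, which is the plug-in estimate without smoothing mentioned above.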


Asymptotic Properties of Minimum S-Divergence Estimator for Discrete Models

March 25, 2014

85% Match
Abhik Ghosh
Methodology
Statistics Theory

Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to the classical techniques based on maximum likelihood and related methods. Recently, Ghosh et al. (2013) proposed a general class of divergence measures, namely the S-divergence family, and discussed its usefulness in robust parametric estimation through some numerical illustrations. In the present paper, we develop the asymptotic properties of the proposed minimum S-D...
