ID: math/0611193

A note on the asymptotic distribution of the minimum density power divergence estimator

November 7, 2006

Similar papers 2

Estimating divergence functionals and the likelihood ratio by convex risk minimization

September 4, 2008

87% Match
XuanLong Nguyen, Martin J. Wainwright, Michael I. Jordan
Statistics Theory
Information Theory

We develop and analyze $M$-estimation methods for divergence functionals and the likelihood ratios of two probability distributions. Our method is based on a non-asymptotic variational characterization of $f$-divergences, which allows the problem of estimating divergences to be tackled via convex empirical risk optimization. The resulting estimators are simple to implement, requiring only the solution of standard convex programs. We present an analysis of consistency and conv...
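
For context, the variational characterization that drives this approach is the standard convex-duality identity for an $f$-divergence (with $f$ convex, $f(1)=0$, and convex conjugate $f^*$); the generic form is sketched below and is not taken verbatim from the paper:

$$ D_f(\mathbb{P}, \mathbb{Q}) = \int f\!\left(\frac{d\mathbb{P}}{d\mathbb{Q}}\right) d\mathbb{Q} = \sup_{g} \left\{ \mathbb{E}_{\mathbb{P}}[g(X)] - \mathbb{E}_{\mathbb{Q}}[f^*(g(X))] \right\}. $$

Replacing the two expectations by sample averages and restricting $g$ to a convex function class turns divergence (and likelihood-ratio) estimation into an empirical convex risk minimization problem.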

On Estimating $L_2^2$ Divergence

October 30, 2014

87% Match
Akshay Krishnamurthy, Kirthevasan Kandasamy, ... , Larry Wasserman
Machine Learning

We give a comprehensive theoretical characterization of a nonparametric estimator for the $L_2^2$ divergence between two continuous distributions. We first bound the rate of convergence of our estimator, showing that it is $\sqrt{n}$-consistent provided the densities are sufficiently smooth. In this smooth regime, we then show that our estimator is asymptotically normal, construct asymptotic confidence intervals, and establish a Berry-Ess\'{e}en style inequality characterizin...
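
For reference, the target functional is simply the squared $L_2$ distance between the densities; the standard expansion below (not specific to this paper) is what reduces the problem to estimating quadratic integral functionals:

$$ D_{L_2^2}(p, q) = \int \big(p(x) - q(x)\big)^2\, dx = \int p^2 - 2\int p\, q + \int q^2. $$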

The Logarithmic Super Divergence and its use in Statistical Inference

July 15, 2014

87% Match
Avijit Maji, Abhik Ghosh, Ayanendranath Basu
Methodology
Statistics Theory
Applications

This paper introduces a new superfamily of divergences that is similar in spirit to the S-divergence family introduced by Ghosh et al. (2013). This new family serves as an umbrella that contains the logarithmic power divergence family (Renyi, 1961; Maji, Chakraborty and Basu 2014) and the logarithmic density power divergence family (Jones et al., 2001) as special cases. Various properties of this new family and the corresponding minimum distance procedures are discussed with ...

Statistical Inference based on Bridge Divergences

June 19, 2017

87% Match
Arun Kumar Kuchibhotla, Somabha Mukherjee, Ayanendranath Basu
Methodology

M-estimators offer simple robust alternatives to the maximum likelihood estimator. Much of the robustness literature, however, has focused on the problems of location, location-scale and regression estimation rather than on estimation of general parameters. The density power divergence (DPD) and the logarithmic density power divergence (LDPD) measures provide two classes of competitive M-estimators (obtained from divergences) in general parametric models which contain the MLE...
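
As a point of reference, the DPD-based M-estimator mentioned here minimizes, for a tuning parameter $\alpha > 0$, an empirical criterion of the form below (stated in the notation of Basu et al. (1998) rather than of the bridge family itself):

$$ H_n(\theta) = \int f_\theta^{1+\alpha}(x)\, dx - \Big(1 + \tfrac{1}{\alpha}\Big) \frac{1}{n} \sum_{i=1}^{n} f_\theta^{\alpha}(X_i), $$

which recovers the maximum likelihood estimator in the limit $\alpha \to 0$ and trades efficiency for robustness as $\alpha$ grows; roughly speaking, the LDPD criterion involves the logarithms of the same two quantities.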

The Minimum S-Divergence Estimator under Continuous Models: The Basu-Lindsay Approach

August 6, 2014

87% Match
Abhik Ghosh, Ayanendranath Basu
Statistics Theory
Applications
Methodology

Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to the classical maximum likelihood based techniques. Recently Ghosh et al. (2013) proposed a general class of divergence measures for robust statistical inference, named the S-Divergence Family. Ghosh (2014) discussed its asymptotic properties for the discrete model of densities. In the present paper, we develop the asymptotic properties of the proposed minimum S-Diver...
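
For orientation, the Basu-Lindsay approach referred to here smooths the model with the same kernel that is applied to the data, so that no genuinely nonparametric density estimation step is needed; a hedged sketch of the construction (notation assumed, not copied from the paper):

$$ g_n^*(x) = \frac{1}{n}\sum_{i=1}^{n} K(x, X_i, h), \qquad f_\theta^*(x) = \int K(x, y, h)\, f_\theta(y)\, dy, $$

and the estimator minimizes the chosen S-divergence between $g_n^*$ and $f_\theta^*$, so much of the kernel-induced bias cancels between the two arguments.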

Robust Wald-type tests for non-homogeneous observations based on minimum density power divergence estimator

July 7, 2017

87% Match
Ayanendranath Basu, Abhik Ghosh, ... , Leandro Pardo
Methodology
Statistics Theory

This paper considers the problem of robust hypothesis testing under non-identically distributed data. We propose Wald-type tests for both simple and composite hypothesis for independent but non-homogeneous observations based on the robust minimum density power divergence estimator of the common underlying parameter. Asymptotic and theoretical robustness properties of the proposed tests have been discussed. Application to the problem of testing the general linear hypothesis in...
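
For context, a Wald-type test built on the minimum density power divergence estimator $\hat\theta_\beta$ follows the standard recipe sketched below (notation assumed here, not copied from the paper): for a simple null $H_0: \theta = \theta_0$,

$$ W_n = n\, \big(\hat\theta_\beta - \theta_0\big)^{\top} \Sigma_\beta(\theta_0)^{-1} \big(\hat\theta_\beta - \theta_0\big), $$

where $\Sigma_\beta(\theta_0)$ is the asymptotic covariance matrix of $\sqrt{n}\,\hat\theta_\beta$ (or a consistent estimate of it); under the null $W_n$ is asymptotically chi-squared, and the robustness of the test is inherited from the bounded influence function of $\hat\theta_\beta$.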

Asymptotic Breakdown Point Analysis for a General Class of Minimum Divergence Estimators

April 15, 2023

87% Match
Subhrajyoty Roy, Abir Sarkar, ... , Ayanendranath Basu
Statistics Theory

Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to classical techniques based on maximum likelihood and related methods. Basu et al. (1998) introduced the density power divergence (DPD) family as a measure of discrepancy between two probability density functions and used this family for robust estimation of the parameter for independent and identically distributed data. Ghosh et al. (2017) proposed a more general cla...
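
For reference, the DPD family of Basu et al. (1998) is, for a data density $g$, a model density $f$ and a tuning parameter $\alpha > 0$,

$$ d_\alpha(g, f) = \int \Big\{ f^{1+\alpha} - \Big(1 + \tfrac{1}{\alpha}\Big) g\, f^{\alpha} + \tfrac{1}{\alpha}\, g^{1+\alpha} \Big\}\, dx, $$

which tends to the Kullback-Leibler divergence as $\alpha \to 0$; larger $\alpha$ downweights observations with small model density, which is the source of the robustness (and of the breakdown behaviour studied here).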

Divergences and Duality for Estimation and Test under Moment Condition Models

February 3, 2010

87% Match
Michel Broniatowski (LSTA), Amor Keziou (LM-Reims, LSTA)
Statistics Theory

We introduce estimation and test procedures through divergence minimization for models satisfying linear constraints with unknown parameter. These procedures extend the empirical likelihood (EL) method and share common features with generalized empirical likelihood approach. We treat the problems of existence and characterization of the divergence projections of probability distributions on sets of signed finite measures. We give a precise characterization of duality, for t...
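
As a rough sketch of the setup (with assumed notation, not taken verbatim from the paper): given estimating functions $g(x, \theta)$ and a convex divergence generator $\varphi$, estimators of this type take the form

$$ \hat\theta = \arg\min_{\theta}\; \inf_{Q:\; \int g(x,\theta)\, dQ(x) = 0} D_\varphi(Q, P_n), $$

where $P_n$ is the empirical measure; the choice $\varphi(u) = -\log u + u - 1$ recovers empirical likelihood, and the dual representations studied in the paper are what reduce the inner infinite-dimensional minimization to a finite-dimensional convex program.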

On the Robustness of a Divergence based Test of Simple Statistical Hypotheses

April 21, 2014

86% Match
Abhik Ghosh, Ayanendranath Basu, Leandro Pardo
Statistics Theory
Methodology

The most popular hypothesis testing procedure, the likelihood ratio test, is known to be highly non-robust in many real situations. Basu et al. (2013a) provided an alternative robust procedure of hypothesis testing based on the density power divergence; however, although the robustness properties of the latter test were intuitively argued for by the authors together with extensive empirical substantiation of the same, no theoretical robustness properties were presented in thi...
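
For orientation, the test of Basu et al. (2013a) referred to here rejects $H_0: \theta = \theta_0$ for large values of a statistic of the form (sketched with assumed notation, and up to the exact choice of tuning parameters)

$$ T_\beta(\hat\theta_\beta, \theta_0) = 2n\, d_\beta\big(f_{\hat\theta_\beta}, f_{\theta_0}\big), $$

i.e. a scaled density power divergence between the model at the minimum DPD estimator and at the null value, with a null distribution approximated by a linear combination of chi-square variables; the present paper supplies the theoretical robustness analysis that was previously missing.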

Robust Density Power Divergence Estimates for Panel Data Models

August 5, 2021

86% Match
Abhijit Mandal, Beste Hamiye Beyaztas, Soutir Bandyopadhyay
Methodology
Statistics Theory

The panel data regression models have become one of the most widely applied statistical approaches in different fields of research, including social, behavioral, environmental sciences, and econometrics. However, traditional least-squares-based techniques frequently used for panel data models are vulnerable to the adverse effects of the data contamination or outlying observations that may result in biased and inefficient estimates and misleading statistical inference. In this...
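
To make the underlying mechanism concrete, here is a minimal sketch of the basic minimum density power divergence fit for an ordinary linear model with normal errors; this illustrates only the general DPD criterion, not the panel-data estimator developed in the paper, and the function names and the choice alpha = 0.5 are purely illustrative:

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, X, y, alpha):
    """Average DPD criterion for y_i ~ N(x_i' beta, sigma^2); alpha > 0 tunes robustness."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                      # keep sigma positive
    f_i = norm.pdf(y - X @ beta, scale=sigma)      # model density at each observation
    # closed form of the integral of f^(1+alpha) for a N(mu, sigma^2) density
    int_term = (2.0 * np.pi * sigma**2) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    return np.mean(int_term - (1.0 + 1.0 / alpha) * f_i**alpha)

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)
y[:10] += 15.0                                     # a few gross outliers

fit = minimize(dpd_objective, x0=np.zeros(X.shape[1] + 1),
               args=(X, y, 0.5), method="Nelder-Mead")
beta_hat, sigma_hat = fit.x[:-1], np.exp(fit.x[-1])
print("DPD estimate of beta:", beta_hat, " sigma:", sigma_hat)

Because each observation enters the criterion only through $f_\theta^\alpha$ evaluated at that observation, gross outliers contribute weights close to zero, which is the downweighting mechanism that the robust panel-data estimators build on.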
