November 7, 2006
We establish consistency and asymptotic normality of the minimum density power divergence estimator under regularity conditions different from those originally provided by Basu et al. (1998).
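As a concrete illustration of the estimator analysed above, here is a minimal sketch of the minimum density power divergence estimator (MDPDE) of Basu et al. (1998) for a normal model. The choice of model, the tuning parameter value alpha = 0.5, the contaminated sample, and the scipy-based optimisation are our own illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: MDPDE for N(mu, sigma^2) on data with 5% outliers.
import numpy as np
from scipy import stats, optimize

def dpd_objective(theta, x, alpha):
    """Empirical DPD objective for the normal model:
    H_n(theta) = int f_theta^{1+alpha} dz
                 - (1 + 1/alpha) * mean(f_theta(X_i)^alpha).
    Minimising H_n over theta gives the MDPDE; alpha -> 0 recovers
    maximum likelihood, larger alpha buys outlier robustness.
    """
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    # Closed form of int f^{1+alpha} for the normal density:
    # (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).
    integral = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    dens = stats.norm.pdf(x, loc=mu, scale=sigma)
    return integral - (1 + 1 / alpha) * np.mean(dens ** alpha)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 95), rng.normal(10, 1, 5)])  # 5% outliers
res = optimize.minimize(dpd_objective, x0=[np.median(x), 1.0],
                        args=(x, 0.5), method="Nelder-Mead")
print(res.x)  # (mu_hat, sigma_hat), close to (0, 1) despite the outliers
```

Unlike the sample mean, the fitted location stays near 0 because observations with small model density are downweighted by the factor f_theta^alpha.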
Similar papers
March 29, 2014
In hypothesis testing, the robustness of the tests is an important concern. Generally, maximum likelihood-based tests are the most efficient under standard regularity conditions, but they are highly non-robust even under small deviations from the assumed conditions. In this paper we propose generalized Wald-type tests based on minimum density power divergence estimators for parametric hypotheses. This method avoids the use of nonparametric density estimation and the b...
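For orientation, the generic form of such a Wald-type statistic (written here in our own notation, with the MDPDE $\hat{\theta}_\beta$ replacing the MLE) for testing $H_0: \theta = \theta_0$ is

$$ W_n = n\,(\hat{\theta}_\beta - \theta_0)^\top \Sigma_\beta^{-1}(\theta_0)\,(\hat{\theta}_\beta - \theta_0), $$

where $\Sigma_\beta(\theta)$ is the asymptotic covariance matrix of the MDPDE; under the null and standard conditions, $W_n$ is asymptotically $\chi^2_p$.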
October 27, 2019
Preserving the robustness of the procedure has by now become almost a default requirement for statistical data analysis. Since efficiency at the model and robustness under misspecification of the model are often in conflict, it is important to choose inference procedures that provide the best compromise between these two concepts. Some minimum Bregman divergence estimators and related tests of hypothesis seem to be able to do well in this respect, with th...
August 16, 2020
In this paper a new family of minimum divergence estimators based on the Bregman divergence is proposed. The popular density power divergence (DPD) class is a sub-class of the Bregman divergences. We propose and study a new sub-class of Bregman divergences, called the exponentially weighted divergence (EWD). Like the minimum DPD estimator, the minimum EWD estimator is recognised as an M-estimator. This characterisation is useful while discussing the asymptotic behavi...
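For reference, the Bregman divergence between densities $g$ and $f$ generated by a convex function $B$ can be written as

$$ D_B(g, f) = \int \left\{ B(g(x)) - B(f(x)) - \big(g(x) - f(x)\big)\, B'(f(x)) \right\} dx, $$

and the choice $B(t) = t^{1+\alpha}/\alpha$ recovers the DPD,

$$ d_\alpha(g, f) = \int \left\{ f^{1+\alpha} - \left(1 + \tfrac{1}{\alpha}\right) g f^{\alpha} + \tfrac{1}{\alpha}\, g^{1+\alpha} \right\} dx. $$

The EWD corresponds to a different generator $B$, whose exact form the truncated abstract does not specify.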
January 27, 2025
Density-power-based divergences are known to yield inference procedures that are robust against outliers, and their extensions have been widely studied. A characteristic of successful divergences is that the estimation problem can be reduced to M-estimation. In this paper, we define a norm-based Bregman density power divergence (NB-DPD) -- a density-power-based divergence with functional flexibility within the framework of Bregman divergences that can be reduced to M-estimation. We sh...
November 4, 2009
This paper deals with four types of point estimators based on minimization of information-theoretic divergences between hypothetical and empirical distributions. These were introduced (i) by Liese & Vajda (2006) and independently by Broniatowski & Keziou (2006), called here power superdivergence estimators, (ii) by Broniatowski & Keziou (2009), called here power subdivergence estimators, (iii) by Basu et al. (1998), called here power pseudodistance estimators, and (iv) by Vajda ...
August 26, 2019
This paper presents new families of Rao-type test statistics based on the minimum density power divergence estimators, which provide robust generalizations for testing simple and composite null hypotheses. The asymptotic null distributions of the proposed tests are obtained and their robustness properties are also theoretically studied. Numerical illustrations are provided to substantiate the theory developed. On the whole, the proposed tests are seen to be excellent alternati...
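As a reference point, the classical Rao score statistic for $H_0: \theta = \theta_0$ is

$$ R_n = n\, U_n(\theta_0)^\top I^{-1}(\theta_0)\, U_n(\theta_0), \qquad U_n(\theta) = \frac{1}{n}\sum_{i=1}^n \frac{\partial}{\partial\theta} \log f_\theta(X_i), $$

which is asymptotically $\chi^2_p$ under the null. The robust generalizations sketched in this abstract replace the likelihood score by the estimating function of the minimum DPD objective; this is our paraphrase of the construction, since the abstract is truncated.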
December 21, 2020
Density-based minimum divergence procedures represent popular techniques in parametric statistical inference. They combine strong robustness properties with high (sometimes full) asymptotic efficiency. Among density-based minimum distance procedures, the methods based on the Bregman divergence have the attractive property that the empirical formulation of the divergence does not require the use of any non-parametric smoothing technique such as kernel density estimation. The m...
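The reason no smoothing is needed can be made explicit for the DPD: the divergence depends on the data distribution $G$ (with density $g$) only through a term that is linear in $g$,

$$ d_\alpha(g, f_\theta) = \int f_\theta^{1+\alpha}\,dx \;-\; \left(1+\tfrac{1}{\alpha}\right) \int f_\theta^{\alpha}\, dG \;+\; \mathrm{const}(g), $$

so $\int f_\theta^{\alpha}\, dG$ can be estimated by the sample mean $n^{-1}\sum_i f_\theta^{\alpha}(X_i)$ without any kernel estimate of $g$. The same linear-in-$g$ structure holds across the Bregman family, whose data-dependent term is $-\int B'(f_\theta)\, dG$.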
March 25, 2014
The minimum divergence estimators have proved to be useful tools in the area of robust inference. The robustness of such estimators is measured using the classical influence functions. However, many complex situations, such as testing a composite hypothesis using divergences, require the estimators to be restricted to some subspace of the parameter space. The robustness of these restricted minimum divergence estimators is very important in order to have overall robust infere...
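For the unrestricted case, the influence function referred to above has the standard M-estimation form; writing $\psi$ for the estimating function of the minimum divergence estimator (our notation),

$$ \operatorname{IF}(x; T, F_\theta) = J^{-1}(\theta)\, \psi(x, \theta), \qquad J(\theta) = -\operatorname{E}_{F_\theta}\!\left[ \frac{\partial}{\partial\theta} \psi(X, \theta) \right], $$

so boundedness of $\psi$ in $x$ yields a bounded influence function. The restricted version studied here additionally accounts for the constraint on the parameter space; the truncated abstract does not give its exact form.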
March 3, 2014
In any parametric inference problem, the robustness of the procedure is a real concern. A procedure which retains a high degree of efficiency under the model and simultaneously provides stable inference under data contamination is preferable in any practical situation to one which achieves its efficiency at the cost of robustness, or vice versa. The density power divergence family of Basu et al. (1998) provides a flexible class of divergences where the adjustme...
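The adjustment in question operates through the tuning parameter $\alpha$: differentiating the DPD objective gives the estimating equation

$$ \frac{1}{n}\sum_{i=1}^{n} u_\theta(X_i)\, f_\theta^{\alpha}(X_i) \;-\; \int u_\theta(x)\, f_\theta^{1+\alpha}(x)\, dx = 0, \qquad u_\theta = \frac{\partial}{\partial \theta}\log f_\theta, $$

so each observation's score is downweighted by $f_\theta^{\alpha}(X_i)$. The value $\alpha = 0$ gives the fully efficient likelihood score, while larger $\alpha$ damps the contribution of outlying points at some cost in efficiency.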
May 13, 2021
Divergence measures have a long association with statistical inference, machine learning and information theory. The density power divergence and related measures have produced many useful (and popular) statistical procedures, which provide a good balance between model efficiency on one hand and outlier stability or robustness on the other. The logarithmic density power divergence, a particular logarithmic transform of the density power divergence, has also been very successf...
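One common way of writing this logarithmic transform (the form used by Jones et al. (2001) and Fujisawa & Eguchi (2008); we state it here for orientation, as the abstract is truncated) is

$$ d_\alpha^{\log}(g,f) = \frac{1}{\alpha(1+\alpha)} \log \int g^{1+\alpha}\,dx \;-\; \frac{1}{\alpha} \log \int g f^{\alpha}\,dx \;+\; \frac{1}{1+\alpha} \log \int f^{1+\alpha}\,dx, $$

which, like the DPD itself, reduces to the Kullback-Leibler divergence as $\alpha \to 0$.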