Similar papers
June 9, 2014
Statistical inference based on divergence measures has a long history. Recently, Maji, Ghosh and Basu (2014) introduced a general family of divergences called the logarithmic super divergence (LSD) family. This family acts as a superfamily for both the logarithmic power divergence (LPD) family (e.g., Renyi, 1961) and the logarithmic density power divergence (LDPD) family introduced by Jones et al. (2001). In this paper we describe the asymptotic properties of the infere...
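For orientation, the LPD family mentioned here is built on Renyi's (1961) divergence; a standard form is displayed below (the exact LPD/LSD parametrization in Maji, Ghosh and Basu (2014) may rescale it):
\[
  R_\alpha(g \,\|\, f) \;=\; \frac{1}{\alpha-1} \log \int g^{\alpha}(x)\, f^{1-\alpha}(x)\, dx,
  \qquad \alpha > 0,\ \alpha \neq 1,
\]
which tends to the Kullback-Leibler divergence as $\alpha \to 1$.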
October 27, 2018
In this paper a new family of minimum divergence estimators based on the Bregman divergence is proposed, where the defining convex function has an exponential nature. These estimators avoid the necessity of using an intermediate kernel density and many of them also have strong robustness properties. It is further demonstrated that the proposed approach can be extended to construct a class of generalized estimating equations, where the pool of the resultant estimators encompas...
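The general construction behind such estimators can be sketched as follows; $B$ is a generic convex function here, not the specific exponential choice of the paper:
\[
  D_B(g, f) \;=\; \int \Bigl\{ B\bigl(g(x)\bigr) - B\bigl(f(x)\bigr) - B'\bigl(f(x)\bigr)\,\bigl(g(x)-f(x)\bigr) \Bigr\}\, dx .
\]
For instance, $B(u) = u^{1+\alpha}$ recovers a multiple of the density power divergence; an exponential-type $B$ reweights the residual $g - f$ and hence alters the robustness profile.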
June 22, 2020
In this article, we study a robust estimation method for a general class of integer-valued time series models. The conditional distribution of the process belongs to a broad class of distributions and, unlike the classical autoregressive framework, the conditional mean of the process also depends on a multivariate exogenous covariate. We derive a robust inference procedure based on the minimum density power divergence. Under certain regularity conditions, we establish that...
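A minimal sketch of the minimum density power divergence objective in such a conditional, discrete-response setting (the notation is assumed here, not taken from the paper):
\[
  H_n(\theta) \;=\; \frac{1}{n} \sum_{t=1}^{n} \Bigl\{ \sum_{y \ge 0} f_\theta\bigl(y \mid \mathcal{F}_{t-1}\bigr)^{1+\alpha} \;-\; \Bigl(1+\tfrac{1}{\alpha}\Bigr)\, f_\theta\bigl(Y_t \mid \mathcal{F}_{t-1}\bigr)^{\alpha} \Bigr\}, \qquad \alpha > 0,
\]
where $\mathcal{F}_{t-1}$ collects the past observations and the exogenous covariates, and the estimator minimizes $H_n$ over $\theta$.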
December 12, 2018
In this paper we provide the asymptotic theory of the general class of $\phi$-divergence measures, which includes the most common divergence measures: the Renyi and Tsallis families and the Kullback-Leibler measure. We are interested in divergence measures in the discrete case. One-sided and two-sided statistical tests are derived, as well as symmetrized estimators. Almost sure rates of convergence and an asymptotic normality theorem are obtained in the general case, and next particulariz...
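In the discrete setting of this abstract, the $\phi$-divergence between probability vectors $p$ and $q$ takes the standard form
\[
  D_\phi(p, q) \;=\; \sum_i q_i\, \phi\!\left(\frac{p_i}{q_i}\right), \qquad \phi \text{ convex},\ \phi(1) = 0;
\]
$\phi(t) = t\log t$ gives the Kullback-Leibler measure, $\phi(t) = (t^{\alpha}-1)/(\alpha-1)$ gives the Tsallis family, and the Renyi divergence is a logarithmic transform of the latter.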
July 11, 2023
Density power divergence (DPD) is designed to robustly estimate the underlying distribution of observations in the presence of outliers. However, DPD involves an integral of a power of the parametric density model to be estimated; the explicit form of the integral term can be derived only for specific densities, such as the normal and exponential densities. While we may perform a numerical integration for each iteration of the optimization algorithms, the computational comple...
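The integral in question is the first term of the standard DPD objective (Basu et al., 1998); in the i.i.d. case with model density $f_\theta$,
\[
  H_n(\theta) \;=\; \int f_\theta^{1+\alpha}(x)\, dx \;-\; \Bigl(1+\tfrac{1}{\alpha}\Bigr) \frac{1}{n}\sum_{i=1}^{n} f_\theta^{\alpha}(X_i),
\]
and for the $N(\mu,\sigma^2)$ model, for example, the integral reduces to $(2\pi\sigma^2)^{-\alpha/2}(1+\alpha)^{-1/2}$; for most other families no such closed form exists.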
February 22, 2017
A two-stage normal hierarchical model called the Fay--Herriot model and the empirical Bayes estimator are widely used to provide indirect and model-based estimates of means in small areas. However, the performance of the empirical Bayes estimator might be poor when the assumed normal distribution is misspecified. In this article, we propose a simple modification by using density power divergence and suggest a new robust empirical Bayes small area estimator. The mean squared e...
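For reference, the Fay--Herriot model has the standard area-level form
\[
  y_i \;=\; \theta_i + e_i, \qquad \theta_i \;=\; x_i^{\top}\beta + v_i, \qquad i = 1,\dots,m,
\]
with sampling errors $e_i \sim N(0, D_i)$ ($D_i$ known) and area effects $v_i \sim N(0, A)$; as the abstract indicates, the proposed estimator robustifies the fit of this model using the density power divergence.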
May 12, 2021
Minimum divergence procedures based on the density power divergence and the logarithmic density power divergence have been extremely popular and successful in generating inference procedures which combine a high degree of model efficiency with strong outlier stability. Such procedures are always preferable in practical situations to procedures which achieve their robustness at a major cost in efficiency, or are highly efficient but have poor robustness properties. The densit...
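One common parametrization of the second family mentioned here, the LDPD of Jones et al. (2001), is (scaling conventions vary across papers):
\[
  \mathrm{LDPD}_\alpha(g, f) \;=\; \frac{1}{1+\alpha}\log \int f^{1+\alpha} \;-\; \frac{1}{\alpha}\log \int g\, f^{\alpha} \;+\; \frac{1}{\alpha(1+\alpha)}\log \int g^{1+\alpha},
\]
which is nonnegative by Holder's inequality and vanishes only when $g = f$.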
December 17, 2010
Indirect inference estimators (i.e., simulation-based minimum distance estimators) in a parametric model that are based on auxiliary non-parametric maximum likelihood density estimators are shown to be asymptotically normal. If the parametric model is correctly specified, it is furthermore shown that the asymptotic variance-covariance matrix equals the inverse of the Fisher-information matrix. These results are based on uniform-in-parameters convergence rates and a uniform-in...
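The estimators in question can be sketched generically (notation assumed): given an auxiliary statistic $\hat{\beta}_n$ computed from the data and its counterpart $\tilde{\beta}_n(\theta)$ computed from data simulated under the parameter $\theta$,
\[
  \hat{\theta}_n \;=\; \arg\min_{\theta}\; \bigl(\hat{\beta}_n - \tilde{\beta}_n(\theta)\bigr)^{\top} W_n \bigl(\hat{\beta}_n - \tilde{\beta}_n(\theta)\bigr),
\]
with $W_n$ a positive definite weight matrix; here the auxiliary statistic is a nonparametric maximum likelihood density estimator, and the result is that the procedure attains the Fisher-information bound under correct specification.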
June 6, 2015
The aim of this paper is to study different estimation procedures based on $\varphi$-divergences. The dual representation of $\varphi$-divergences based on the Fenchel-Legendre duality is the main interest of this study. It provides a way to estimate $\varphi$-divergences by a simple plug-in of the empirical distribution without any smoothing technique. The resulting estimators are thoroughly studied theoretically and with simulations, showing that the so-called minimum $\varphi$-...
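The dual representation referred to is, in its standard form,
\[
  D_\varphi(Q, P) \;=\; \sup_{f} \Bigl\{ \int f\, dQ \;-\; \int \varphi^{*}(f)\, dP \Bigr\}, \qquad \varphi^{*}(t) \;=\; \sup_{u}\,\bigl\{ tu - \varphi(u) \bigr\},
\]
where $\varphi^{*}$ is the Fenchel-Legendre conjugate of $\varphi$ and the supremum runs over a suitable class of functions; replacing $Q$ by the empirical measure yields the smoothing-free plug-in estimator described in the abstract.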
March 25, 2014
Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to the classical techniques based on maximum likelihood and related methods. Recently, Ghosh et al. (2013) proposed a general class of divergence measures, namely the S-divergence family, and discussed its usefulness in robust parametric estimation through some numerical illustrations. In the present paper, we develop the asymptotic properties of the proposed minimum S-D...
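For context, the S-divergence family of Ghosh et al. (2013) is commonly written as (a sketch; sign and scaling conventions vary):
\[
  S_{(\alpha,\lambda)}(g, f) \;=\; \frac{1}{A}\int f^{1+\alpha} \;-\; \frac{1+\alpha}{AB}\int f^{B} g^{A} \;+\; \frac{1}{B}\int g^{1+\alpha},
\]
with $A = 1 + \lambda(1-\alpha)$ and $B = \alpha - \lambda(1-\alpha)$, so that $A + B = 1 + \alpha$; the choice $\lambda = 0$ recovers the density power divergence.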