Similar papers
October 12, 2020
Many real-life data sets can be analyzed using Linear Mixed Models (LMMs). Since these models are ordinarily based on normality assumptions, inference can be highly unstable under small deviations from the model when the associated parameters are estimated by classical methods. On the other hand, the density power divergence (DPD) family, which measures the discrepancy between two probability density functions, has been successfully used to build robust estimators with high stability...
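For reference (not part of the abstract): the density power divergence (Basu et al., 1998) with tuning parameter $\alpha > 0$ is

$$ d_\alpha(g, f) = \int \left\{ f^{1+\alpha}(x) - \Big(1 + \tfrac{1}{\alpha}\Big)\, g(x)\, f^{\alpha}(x) + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \right\} dx, $$

which tends to the Kullback-Leibler divergence as $\alpha \to 0$ and trades efficiency for robustness as $\alpha$ grows. A minimal Python sketch of minimum-DPD estimation in a plain normal model (our illustration with invented names, not the paper's implementation, which concerns linear mixed models):

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha=0.5):
    # Empirical DPD objective for N(mu, sigma^2); the g^{1+alpha} term does
    # not depend on the parameters and is therefore dropped.
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # reparametrize to keep sigma positive
    f = norm.pdf(x, mu, sigma)
    # Closed form of the integral of f^{1+alpha} for a normal density.
    int_f = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return int_f - (1 + 1 / alpha) * np.mean(f ** alpha)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])  # 5% outliers
res = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x,))
print(res.x[0])  # location estimate stays near 0; the sample mean does not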
August 30, 2019
We propose and investigate a new estimation method for the parameters of models consisting of smooth density functions on the positive half axis. The procedure is based on a recently introduced characterization result for the respective probability distributions, and is to be classified as a minimum distance estimator that uses the $L^q$-norm as its distance function. Throughout, we deal rigorously with issues of existence and measurability of these implicitly defined estimators...
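Schematically (our notation, not necessarily the paper's): with $\Delta_n(t; \vartheta)$ an empirical discrepancy built from the characterization identity, whose population version vanishes for all $t > 0$ exactly when the data follow the model density $f_\vartheta$, such an estimator takes the minimum distance form

$$ \hat{\vartheta}_n \in \arg\min_{\vartheta \in \Theta} \left( \int_0^\infty \big| \Delta_n(t; \vartheta) \big|^q \, w(t) \, dt \right)^{1/q} $$

for some weight function $w$.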
December 26, 2011
In this Note we introduce a new methodology for Bayesian inference through the use of $\phi$-divergences and the duality technique. The asymptotic laws of the estimates are established.
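For reference: a $\phi$-divergence between probability measures $Q$ and $P$ is defined, for a convex function $\phi$ with $\phi(1) = 0$, as

$$ D_\phi(Q, P) = \int \phi\!\left( \frac{dQ}{dP} \right) dP, $$

which recovers the Kullback-Leibler divergence for $\phi(t) = t \log t$.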
November 3, 2020
This paper attempts to justify the use of certain discrepancy indexes, starting from the classical maximum likelihood definition and adapting the corresponding basic principle of inference to situations where minimizing those indexes between a model and some extension of the empirical measure of the data appears as its natural extension. This leads to the so-called generalized bootstrap setting, for which minimum divergence inference seems to replace ...
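In schematic form (our notation): the extension of the empirical measure referred to is a randomly weighted version $P_n^W = \frac{1}{n}\sum_{i=1}^n W_i \, \delta_{X_i}$, and the minimum divergence estimator is $\hat{\theta}_n \in \arg\min_{\theta} D(P_\theta, P_n^W)$ for a divergence $D$, recovering the usual empirical measure when all weights $W_i = 1$.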
November 22, 2008
We introduce estimation and test procedures through divergence optimization for discrete or continuous parametric models. This approach is based on a new dual representation for divergences. We treat point estimation and tests for simple and composite hypotheses, extending the maximum likelihood technique. Another view of the maximum likelihood approach, for estimation and testing, is given. We prove existence and consistency of the proposed estimates. The limit laws of the estimat...
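The generic convex-duality representation underlying such dual approaches (Fenchel dual; not necessarily the paper's exact form) is

$$ D_\phi(Q, P) = \sup_{f} \left\{ \int f \, dQ - \int \phi^*(f) \, dP \right\}, $$

where $\phi^*$ is the convex conjugate of $\phi$ and the supremum runs over a suitable class of functions; replacing $P$ by the empirical measure turns divergence minimization into a tractable sup-inf problem free of density estimation.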
March 2, 2022
Density-based directed distances -- particularly known as divergences -- between probability distributions are widely used in statistics as well as in the adjacent research fields of information theory, artificial intelligence and machine learning. Prominent examples are the Kullback-Leibler information distance (relative entropy), which e.g. is closely connected to the omnipresent maximum likelihood estimation method, and Pearson's chi-square distance, which e.g. is used for th...
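For reference, the two examples in standard notation, for distributions $P$, $Q$ with densities $p$, $q$:

$$ D_{KL}(P, Q) = \int p(x) \, \log\frac{p(x)}{q(x)} \, dx, \qquad \chi^2(P, Q) = \int \frac{\big(p(x) - q(x)\big)^2}{q(x)} \, dx. $$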
February 4, 2015
Experiments often yield non-identically distributed data for statistical analysis. Tests of hypotheses under such set-ups are generally performed using the likelihood ratio test, which is non-robust with respect to outliers and model misspecification. In this paper, we consider the set-up of non-identically but independently distributed observations and develop a general class of test statistics for testing parametric hypotheses based on the density power divergence. The proposed ...
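One natural construction of such tests, sketched in our notation (possibly up to standardizing constants): with $\hat{d}_\alpha(\theta)$ the empirical DPD objective, the log-likelihood in the likelihood ratio statistic is replaced by $\hat{d}_\alpha$,

$$ T_n = 2n \left\{ \min_{\theta \in \Theta_0} \hat{d}_\alpha(\theta) - \min_{\theta \in \Theta} \hat{d}_\alpha(\theta) \right\}, $$

recovering the likelihood ratio test as $\alpha \to 0$.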
March 6, 2014
Statistical techniques are used in all branches of science to determine the feasibility of quantitative hypotheses. One of the most basic applications of statistical techniques in comparative analysis is the test of equality of two population means, generally performed under the assumption of normality. In medical studies, for example, we often need to compare the effects of two different drugs, treatments or preconditions on the resulting outcome. The most commonly used test...
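For reference: under normality with a common variance, the standard procedure is the two-sample $t$-test, with statistic

$$ t = \frac{\bar{X} - \bar{Y}}{s_p \sqrt{\tfrac{1}{m} + \tfrac{1}{n}}}, \qquad s_p^2 = \frac{(m-1)s_X^2 + (n-1)s_Y^2}{m + n - 2}, $$

for samples of sizes $m$ and $n$.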
June 22, 2021
While robust divergences such as the density power divergence and the $\gamma$-divergence are helpful for robust statistical inference in the presence of outliers, the tuning parameter that controls the degree of robustness is usually chosen by a rule of thumb, which may lead to inefficient inference. We here propose a selection criterion based on an asymptotic approximation of the Hyvärinen score applied to an unnormalized model defined by the robust divergence. The proposed selection criterio...
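For reference (one common convention, not specific to the paper): the Hyvärinen score of a density $f$ at a point $x$ is

$$ S(x; f) = 2\,\Delta_x \log f(x) + \big\| \nabla_x \log f(x) \big\|^2, $$

where $\Delta_x$ is the Laplacian; it depends on $f$ only through $\nabla_x \log f$, so normalizing constants cancel, which is what makes it applicable to an unnormalized model defined by a robust divergence.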
May 31, 2014
The main purpose of this paper is to introduce and study the behavior of minimum $\phi$-divergence estimators as an alternative to the maximum likelihood estimator in latent class models for binary items. As will become clear below, minimum $\phi$-divergence estimators are a natural extension of the maximum likelihood estimator. The asymptotic properties of minimum $\phi$-divergence estimators for latent class models for binary data are developed. Finally, to compare the e...
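Schematically (our notation): for binary-item latent class models the data reduce to cell proportions, and the minimum $\phi$-divergence estimator takes the form

$$ \hat{\theta}_\phi \in \arg\min_{\theta} \sum_{c} p_c(\theta) \, \phi\!\left( \frac{\hat{p}_c}{p_c(\theta)} \right), $$

where $\hat{p}_c$ are the observed cell proportions and $p_c(\theta)$ the model cell probabilities; the choice $\phi(t) = t \log t$ recovers the maximum likelihood estimator.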