November 4, 2020
This paper reviews concentration inequalities that are widely used in non-asymptotic analyses in mathematical statistics, across a wide range of settings: from distribution-free to distribution-dependent, from sub-Gaussian to sub-exponential, sub-Gamma, and sub-Weibull random variables, and from concentration of the mean to concentration of the maximum. The review collects results in these settings and adds some new ones. Given the increasing popularity of high-dimensional data and inference, results for high-dimensional linear and Poisson regression are also provided. Throughout, we aim to state concentration inequalities with explicit constants and to improve existing bounds with sharper constants.
Similar papers
October 4, 2019
In this report, we aim to exemplify concentration inequalities and provide easy-to-understand proofs for them. Our focus is on inequalities that are helpful in the design and analysis of machine learning algorithms.
September 1, 2014
We explore applications of our previously established likelihood-ratio method for deriving concentration inequalities for a wide variety of univariate and multivariate distributions. New concentration inequalities for various distributions are developed without minimizing moment generating functions.
February 4, 2021
Constant-specified and exponential concentration inequalities play an essential role in the finite-sample theory of machine learning and high-dimensional statistics. We obtain sharper, constant-specified concentration inequalities for sums of independent sub-Weibull random variables, which lead to a mixture of two tails: sub-Gaussian for small deviations and sub-Weibull for large deviations from the mean. These bounds are new and improve existing bounds with shar...
February 11, 2019
In this note, we derive concentration inequalities for random vectors with sub-Gaussian norm (a generalization of both sub-Gaussian random vectors and norm-bounded random vectors), which are tight up to logarithmic factors.
December 17, 2017
The aim of this paper is to discuss various concentration inequalities for U-statistics, together with the most recent results. A special focus is on providing proofs of bounds on U-statistics using classical concentration inequalities; although these results are well known, their proofs are not easily found in the literature.
April 8, 2018
Concentration inequalities form an essential toolkit in the study of high dimensional (HD) statistical methods. Most of the relevant statistics literature in this regard is based on sub-Gaussian or sub-exponential tail assumptions. In this paper, we first bring together various probabilistic inequalities for sums of independent random variables under much more general exponential type (namely sub-Weibull) tail assumptions. These results extract a part sub-Gaussian tail behavi...
August 18, 2013
We propose a new approach for deriving probabilistic inequalities based on bounding likelihood ratios. We demonstrate that this approach is more general and powerful than the classical method frequently used for deriving concentration inequalities such as Chernoff bounds. We discover that the proposed approach is inherently related to statistical concepts such as monotone likelihood ratio, maximum likelihood, and the method of moments for parameter estimation. A connection be...
October 4, 2013
The purpose of this note is to present several aspects of concentration phenomena in high-dimensional geometry. At the heart of the study is a geometric-analysis point of view coming from the theory of high-dimensional convex bodies. The topic has a broad audience, ranging from algorithmic convex geometry to random matrices. We have tried to emphasize different problems relating these areas of research. Another connected area is the study of probability in Banach spaces where so...
January 7, 2015
In recent years, random matrices have come to play a major role in computational mathematics, but most of the classical areas of random matrix theory remain the province of experts. Over the last decade, with the advent of matrix concentration inequalities, research has advanced to the point where we can conquer many (formerly) challenging problems with a page or two of arithmetic. The aim of this monograph is to describe the most successful methods from this area along with ...
October 17, 2022
The Maximum Likelihood Estimator (MLE) plays an important role in statistics and machine learning. In this article, for i.i.d. variables, we obtain constant-specified and sharp concentration inequalities and oracle inequalities for the MLE under exponential moment conditions only. Furthermore, in a robust setting, sub-Gaussian-type oracle inequalities for the log-truncated maximum likelihood estimator are derived under a second-moment condition.