ID: math/9804068

A note on sums of independent random variables

April 14, 1998


Similar papers (page 4)

Moments of Sums of Independent and Identically Distributed Random Variables

May 31, 2011

83% Match
Daniel M. Packwood
Statistics Theory

We present an analytic method for computing the moments of a sum of independent and identically distributed random variables. The limiting behavior of these sums is very important to statistical theory, and the moment expressions that we derive allow it to be studied relatively easily. We show this by presenting a new proof of the central limit theorem and several other convergence results.
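
As a quick illustration (not the paper's method), the second moment of the sum S_n = X_1 + ... + X_n of i.i.d. variables is E[S_n^2] = n*Var(X) + (n*E[X])^2. The minimal Monte Carlo sketch below, assuming Exponential(1) summands purely as an example, checks this identity numerically.

```python
import numpy as np

# Minimal check (illustrative only): for i.i.d. X_i,
#   E[S_n^2] = n*Var(X) + (n*E[X])^2.
rng = np.random.default_rng(0)
n, trials = 10, 200_000

# Assumed example distribution: Exponential(1), so E[X] = 1, Var(X) = 1.
samples = rng.exponential(scale=1.0, size=(trials, n))
S_n = samples.sum(axis=1)

empirical = np.mean(S_n**2)
exact = n * 1.0 + (n * 1.0) ** 2   # n*Var(X) + (n*E[X])^2 = 110 here
print(f"empirical E[S_n^2] ~ {empirical:.2f}, exact = {exact:.2f}")
```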


Tail and moment estimates for chaoses generated by symmetric random variables with logarithmically concave tails

July 8, 2010

83% Match
Radosław Adamczak, Rafał Latała
Probability

We present two-sided estimates of moments and tails of polynomial chaoses of order at most three generated by independent symmetric random variables with log-concave tails as well as for chaoses of arbitrary order generated by independent symmetric exponential variables. The estimates involve only deterministic quantities and are optimal up to constants depending only on the order of the chaos variable.


Moment free deviation inequalities for linear combinations of independent random variables with power-type tails

July 5, 2022

83% Match
Daniel J. Fresen
Probability
Functional Analysis

We present order of magnitude estimates for the quantiles of non-negative linear combinations of non-negative random variables, as well as deviation inequalities for general linear combinations of independent random variables, under the assumption that all random variables satisfy the same power-type tail bound on $\mathbb{P}\{\left\vert X_i\right\vert>t\}$ of the form $t^{-q}$, $t^{-q/2}$ or $t^{-q/2}(\ln t)^{q/2}$, for $q>2$. The third type is applicable in the nonlinear se...
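
To make the setting concrete (a sketch under assumed parameters, not the paper's estimates): a Pareto-type variable with P(X > t) = t^(-q) for t >= 1 satisfies the first tail form exactly, and quantiles of a non-negative linear combination can be estimated by simulation. The coefficients below are an arbitrary illustrative choice.

```python
import numpy as np

# Illustrative only: X = U^(-1/q) with U uniform on (0, 1] satisfies
# P(X > t) = t^(-q) for t >= 1, the power-type tail bound with q > 2.
rng = np.random.default_rng(1)
q, n, trials = 3.0, 50, 100_000

U = 1.0 - rng.random((trials, n))      # uniform on (0, 1]
X = U ** (-1.0 / q)                    # Pareto-type, tail t^(-q)

# Assumed non-negative coefficients (decaying, for illustration).
a = 1.0 / np.arange(1, n + 1)
combo = X @ a                          # sum_i a_i X_i

# Empirical quantiles of the non-negative linear combination.
for p in (0.5, 0.9, 0.99):
    print(f"{p:.2f}-quantile ~ {np.quantile(combo, p):.3f}")
```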


Estimates for probabilities of independent events and infinite series

September 28, 2016

83% Match
Jürgen Grahl, Shahar Nevo
Probability

This paper deals with (finite or infinite) sequences of arbitrary independent events in some probability space. We find sharp lower bounds for the probability of a union of such events when the sum of their probabilities is given. The results have parallel meanings in terms of infinite series.
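
For independent events the probability of the union is exactly 1 - prod_i(1 - p_i). The snippet below (an illustrative check of the quantities involved, not the paper's sharp bound) shows how this behaves when the sum of the probabilities s is held fixed and split equally: the union probability equals 1 - (1 - s/n)^n, which tends to 1 - e^{-s} as n grows.

```python
import math

# For independent events A_1,...,A_n with P(A_i) = p_i,
# P(union A_i) = 1 - prod_i (1 - p_i) exactly.
def prob_union(ps):
    prod = 1.0
    for p in ps:
        prod *= (1.0 - p)
    return 1.0 - prod

# Fix the sum of probabilities s and split it equally: p_i = s/n.
# Then P(union) = 1 - (1 - s/n)^n -> 1 - e^{-s} as n -> infinity.
s = 1.0
for n in (2, 10, 100, 10_000):
    print(n, prob_union([s / n] * n))
print("limit 1 - exp(-s) =", 1.0 - math.exp(-s))
```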


Comparison of Sums of Independent Identically Distributed Random Variables

October 7, 1993

83% Match
Stephen J. Montgomery-Smith
Functional Analysis

Let S_k be the k-th partial sum of Banach space valued independent identically distributed random variables. In this paper, we compare the tail distribution of ||S_k|| with that of ||S_j||, and deduce some tail distribution maximal inequalities. Theorem: There is a universal constant c such that for j < k, Pr(||S_j|| > t) <= c Pr(||S_k|| > t/c).
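
The two tail probabilities being compared can be estimated directly by Monte Carlo. The sketch below is illustrative only: it uses real-valued standard normal summands instead of Banach space valued ones, and it does not attempt to estimate the universal constant c.

```python
import numpy as np

# Illustrative Monte Carlo of the quantities compared in the theorem,
# for real-valued (rather than Banach-space-valued) i.i.d. summands.
rng = np.random.default_rng(2)
j, k, t, trials = 5, 20, 4.0, 200_000

X = rng.standard_normal((trials, k))   # assumed example: standard normal steps
S = np.cumsum(X, axis=1)               # partial sums S_1, ..., S_k

p_j = np.mean(np.abs(S[:, j - 1]) > t)   # Pr(|S_j| > t)
p_k = np.mean(np.abs(S[:, k - 1]) > t)   # Pr(|S_k| > t)
print(f"Pr(|S_j| > t) ~ {p_j:.4f}, Pr(|S_k| > t) ~ {p_k:.4f}")
# The theorem asserts Pr(|S_j| > t) <= c * Pr(|S_k| > t/c) for a universal c.
```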


Quantitative bounds for large deviations of heavy tailed random variables

February 7, 2022

83% Match
Quirin Vogel
Probability

The probability that the sum of independent, centered, identically distributed, heavy-tailed random variables achieves a very large value is asymptotically equal to the probability that a single summand attains that value. We quantify the error in this approximation. We furthermore characterise the law of the individual summands, conditioned on the sum being large.
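
This is the "single big jump" principle: for subexponential (e.g., Pareto-type) summands, P(S_n > x) ~ n * P(X_1 > x) as x grows. The sketch below compares the two sides by Monte Carlo for an assumed centered Pareto-type example; it only illustrates the approximation the abstract refers to, not the paper's error bounds.

```python
import numpy as np

# Illustrative "single big jump" check for centered heavy-tailed summands:
# for subexponential tails, P(S_n > x) ~ n * P(X_1 - mu > x) as x -> infinity.
rng = np.random.default_rng(3)
q, n, x, trials = 2.5, 10, 30.0, 1_000_000

# Pareto-type variable: P(X > t) = t^(-q) for t >= 1, mean mu = q/(q-1).
U = 1.0 - rng.random((trials, n))
X = U ** (-1.0 / q)
mu = q / (q - 1.0)

S = (X - mu).sum(axis=1)
lhs = np.mean(S > x)                      # P(S_n > x), Monte Carlo
rhs = n * (x + mu) ** (-q)                # n * P(X_1 - mu > x)
print(f"P(S_n > x) ~ {lhs:.2e}, n*P(X_1 - mu > x) = {rhs:.2e}")
```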


Dependency-dependent Bounds for Sums of Dependent Random Variables

November 4, 2018

83% Match
Christoph H. Lampert, Liva Ralaivola, Alexander Zimin
Probability

We consider the problem of bounding large deviations for non-i.i.d. random variables that are allowed to have arbitrary dependencies. Previous works typically assumed a specific dependence structure, namely the existence of independent components. Bounds that depend on the degree of dependence between the observations have only been studied in the theory of mixing processes, where variables are time-ordered. Here, we introduce a new way of measuring dependences within an unor...


Estimates on the tail behavior of Gaussian polynomials. The discussion of a result of Latala

December 11, 2009

83% Match
Peter Major
Probability

This paper discusses a result of Latala about the tail behavior of Gaussian polynomials. Latala proved an interesting result on this problem in paper [2], but his proof applied an incorrect statement at a crucial point. Hence the question arises whether the main result of paper [2] is valid. The goal of this paper is to settle the question by presenting a proof that avoids the erroneous statement. I discuss the proofs in detail ev...


Exact Constants in the Rosenthal Moment Inequalities for Sums of Independent Centered Random Variables

November 27, 2004

83% Match
B. Naimark, E. Ostrovsky
Probability

We study the exact constants in the moment inequalities for sums of centered independent random variables: we improve their asymptotics and their lower and upper bounds, calculate more precise asymptotics, develop a numerical algorithm for their computation, study the class of smoothing, etc.
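
For context, the Rosenthal inequality in its standard form (not quoted from the paper) states that for independent centered X_i and p >= 2, E|S_n|^p <= C(p) * max( sum_i E|X_i|^p, (sum_i E X_i^2)^(p/2) ); the exact constants studied here are the best possible C(p). The sketch below, assuming centered exponential summands as an example, just estimates the ingredients of this inequality by Monte Carlo.

```python
import numpy as np

# Rosenthal inequality (standard form) for independent centered X_i, p >= 2:
#   E|S_n|^p <= C(p) * max( sum_i E|X_i|^p, (sum_i E X_i^2)^(p/2) ).
# This sketch estimates both sides' ingredients for one example distribution.
rng = np.random.default_rng(4)
p, n, trials = 4.0, 20, 500_000

# Assumed example: centered exponential summands, X_i = E_i - 1.
X = rng.exponential(1.0, size=(trials, n)) - 1.0
S = X.sum(axis=1)

lhs = np.mean(np.abs(S) ** p)                       # E|S_n|^p
term1 = n * np.mean(np.abs(X[:, 0]) ** p)           # sum_i E|X_i|^p
term2 = (n * np.mean(X[:, 0] ** 2)) ** (p / 2.0)    # (sum_i E X_i^2)^(p/2)
print(f"E|S_n|^p ~ {lhs:.1f}, max term = {max(term1, term2):.1f}, "
      f"ratio ~ {lhs / max(term1, term2):.2f}")     # lower bound on C(p)
```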


A general Hsu-Robbins-Erdos type estimate of tail probabilities of sums of independent identically distributed random variables

October 28, 1998

83% Match
Alexander R. Pruss
Probability

Let X_1,X_2,... be a sequence of independent and identically distributed random variables, and put S_n=X_1+...+X_n. Under some conditions on the positive sequence tau_n and the positive increasing sequence a_n, we give necessary and sufficient conditions for the convergence of sum_{n=1}^infty tau_n P(|S_n|>t a_n) for all t>0, generalizing Baum and Katz's (1965) generalization of the Hsu-Robbins-Erdos (1947, 1949) law of large numbers, also allowing us to characterize the conv...
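
In the classical Hsu-Robbins-Erdos case (tau_n = 1, a_n = n), the series sum_n P(|S_n| > t n) converges for all t > 0 precisely when E[X_1] = 0 and E[X_1^2] < infinity. The sketch below is purely illustrative: it estimates a few terms of that series by Monte Carlo for assumed standard normal summands.

```python
import numpy as np

# Illustrative look at the Hsu-Robbins-Erdos series with tau_n = 1, a_n = n:
# sum_n P(|S_n| > t*n) converges for all t > 0 iff E[X] = 0 and E[X^2] < inf.
rng = np.random.default_rng(5)
t, trials = 1.0, 200_000

# Assumed example: standard normal summands (mean 0, finite variance).
for n in (1, 2, 4, 8, 16):
    S_n = rng.standard_normal((trials, n)).sum(axis=1)
    term = np.mean(np.abs(S_n) > t * n)   # P(|S_n| > t*n), Monte Carlo
    print(f"n={n:2d}  P(|S_n| > t*n) ~ {term:.5f}")
# Terms decay roughly like exp(-n*t^2/2) here, so the series converges.
```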
