ID: math/0211192

Concentration of norms and eigenvalues of random matrices

November 12, 2002

Similar papers 2

The limit of the operator norm for random matrices with a variance profile

April 22, 2024

87% Match
Dimitris Cheliotis, Michail Louvaris
Probability

In this work we study symmetric random matrices whose variance profile satisfies certain conditions. We establish the convergence of the operator norm of these matrices to the largest element of the support of the limiting empirical spectral distribution. We prove that a finite $4$-th moment of the entries suffices for the convergence to hold in probability, and a finite $(4+\epsilon)$-th moment for it to hold almost surely. Our approach de...
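
A quick numerical illustration of this kind of edge convergence, in the simplest case of a constant variance profile with Gaussian entries (an assumption for the sketch, not the paper's general setting): the operator norm of a normalized Wigner matrix approaches $2$, the edge of the semicircle support.

```python
import numpy as np

rng = np.random.default_rng(0)

def wigner_norm(n, rng):
    """Spectral norm of an n x n symmetric matrix with independent N(0, 1)
    entries, normalized by sqrt(n) so the limiting empirical spectral
    distribution is the semicircle law supported on [-2, 2]."""
    A = rng.standard_normal((n, n))
    W = (A + A.T) / np.sqrt(2 * n)   # off-diagonal entries have variance 1/n
    return np.linalg.norm(W, 2)

norms = {n: wigner_norm(n, rng) for n in (200, 800, 3200)}
print(norms)  # values approach 2, the largest element of the support
```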

A concentration inequality and a local law for the sum of two random matrices

October 2, 2010

87% Match
Vladislav Kargin
Probability

Let $H = A + UBU^*$, where $A$ and $B$ are two $N \times N$ Hermitian matrices and $U$ is a Haar-distributed random unitary matrix, and let $\mu_H$, $\mu_A$, and $\mu_B$ be the empirical measures of eigenvalues of $H$, $A$, and $B$, respectively. Then it is known (see, for example, Pastur-Vasilchuk, CMP, 2000, v. 214, pp. 249-286) that for large $N$, the measure $\mu_H$ is close to the free convolution of the measures $\mu_A$ and $\mu_B$, where the free convolution is a non-linear operation on probability measures. T...
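
A small simulation of this setup, taking $A$ and $B$ diagonal with spectrum $\pm 1$ so that the free convolution of $\mu_A$ and $\mu_B$ is the arcsine law on $[-2, 2]$ (the choice of spectra, and sampling the Haar unitary via a QR factorization, are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

def haar_unitary(n, rng):
    """Haar-distributed unitary via QR of a complex Ginibre matrix,
    with the phases of the diagonal of R fixed to make the map well defined."""
    Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    Q, R = np.linalg.qr(Z)
    d = np.diagonal(R)
    return Q * (d / np.abs(d))

A = np.diag(np.repeat([1.0, -1.0], n // 2))
B = A.copy()
U = haar_unitary(n, rng)
H = A + U @ B @ U.conj().T          # Hermitian by construction
eigs = np.linalg.eigvalsh(H)
print(eigs.min(), eigs.max())       # the spectrum spreads over (-2, 2)
```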

The Expected Norm of a Sum of Independent Random Matrices: An Elementary Approach

June 15, 2015

87% Match
Joel A. Tropp
Probability
Statistics Theory

In contemporary applied and computational mathematics, a frequent challenge is to bound the expectation of the spectral norm of a sum of independent random matrices. This quantity is controlled by the norm of the expected square of the random matrix and the expectation of the maximum squared norm achieved by one of the summands; there is also a weak dependence on the dimension of the random matrix. The purpose of this paper is to give a complete, elementary proof of this impo...
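
The two controlling quantities can be probed numerically in the special case of a matrix Gaussian series $S = \sum_i g_i A_i$ with fixed symmetric coefficients $A_i$ (this model and the specific coefficient choice are assumptions for the sketch): the empirical mean of $\|S\|$ lands between $\sqrt{v}$, where $v$ is the norm of the expected square, and the same quantity inflated by the logarithmic dimensional factor.

```python
import numpy as np

rng = np.random.default_rng(2)
d, k, trials = 40, 100, 50

# Fixed symmetric coefficients A_i for the Gaussian series S = sum_i g_i A_i.
G = rng.standard_normal((k, d, d))
A = (G + np.transpose(G, (0, 2, 1))) / np.sqrt(2 * k)

# Norm of the expected square: v = ||E S^2|| = ||sum_i A_i^2||.
v = np.linalg.norm(np.einsum('kij,kjl->il', A, A), 2)

norms = [np.linalg.norm(np.einsum('k,kij->ij', rng.standard_normal(k), A), 2)
         for _ in range(trials)]
mean_norm = np.mean(norms)
print(mean_norm, np.sqrt(v), np.sqrt(2 * v * np.log(2 * d)))
# E||S|| sits between sqrt(v) and sqrt(2 v log 2d): a weak dimensional window
```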

Recent developments in non-asymptotic theory of random matrices

January 11, 2013

87% Match
Mark Rudelson
Probability
Functional Analysis

Non-asymptotic theory of random matrices strives to investigate the spectral properties of random matrices that hold with high probability for matrices of a large fixed size. Results obtained in this framework find applications in high-dimensional convexity, in the analysis of the convergence of algorithms, and in random matrix theory itself. In these notes we survey some recent results in this area and describe the techniques aimed at obtaining explicit probability...

Sharp Bounds for the Concentration of the Resolvent in Convex Concentration Settings

January 2, 2022

86% Match
Cosme Louart
Probability

Considering a random matrix $X \in \mathcal M_{p,n}$ with independent columns satisfying the convex concentration properties arising from a famous theorem of Talagrand, we establish the linear concentration of the resolvent $Q = (I_p - \frac{1}{n}XX^T)^{-1}$ around a classical deterministic equivalent, with a good observable diameter for the nuclear norm. The general proof relies on a decomposition of the resolvent as a series of powers of $X$.
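
A sketch of the concentration phenomenon for a linear statistic of the resolvent. Two assumptions are made purely for the illustration: the columns are standard Gaussian rather than general convexly concentrated vectors, and the resolvent is evaluated at a shift $z$ outside the bulk of the spectrum so that the inverse is well conditioned.

```python
import numpy as np

rng = np.random.default_rng(3)
p, n, z = 100, 300, 5.0   # z = 5 lies outside the bulk of XX^T / n here

def trace_resolvent(rng):
    X = rng.standard_normal((p, n))                 # independent columns
    Q = np.linalg.inv(z * np.eye(p) - X @ X.T / n)  # shifted resolvent
    return np.trace(Q) / p

samples = np.array([trace_resolvent(rng) for _ in range(30)])
print(samples.mean(), samples.std())
# the linear statistic (1/p) tr Q fluctuates on a far smaller scale than its mean
```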

Second-Order Matrix Concentration Inequalities

April 22, 2015

86% Match
Joel A. Tropp
Probability
Statistics Theory

Matrix concentration inequalities give bounds for the spectral-norm deviation of a random matrix from its expected value. These results have a weak dimensional dependence that is sometimes, but not always, necessary. This paper identifies one of the sources of the dimensional term and exploits this insight to develop sharper matrix concentration inequalities. In particular, this analysis delivers two refinements of the matrix Khintchine inequality that use information beyond ...

Estimates for the concentration functions of weighted sums of independent random variables

March 25, 2012

86% Match
Yu. S. Eliseeva, A. Yu. Zaitsev
Probability

Let $X, X_1, ..., X_n$ be independent identically distributed random variables. The paper studies the behavior of the concentration function of the random variable $\sum_{k=1}^{n} a_k X_k$ in terms of the arithmetic structure of the coefficients $a_k$. Recently, interest in this question has increased significantly due to the study of the distributions of eigenvalues of random matrices. In this paper we formulate and prove some refinements of the results of Frie...
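
The contrast between arithmetically structured and spread-out coefficients is easy to see in a Monte Carlo estimate of the concentration function $Q(S, \lambda) = \sup_x P(x \le S \le x + \lambda)$ (Rademacher steps $X_k$ and these particular coefficient vectors are assumptions for the sketch):

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials, lam = 30, 20000, 0.5

def conc_fn(a, lam, rng, trials):
    """Monte Carlo estimate of Q(S, lam) = sup_x P(x <= S <= x + lam)
    for S = sum_k a_k X_k with independent Rademacher X_k."""
    X = rng.choice([-1.0, 1.0], size=(trials, len(a)))
    S = np.sort(X @ a)
    # the supremum is attained by a window whose left edge is a sample point
    counts = np.searchsorted(S, S + lam, side='right') - np.arange(trials)
    return counts.max() / trials

q_arith = conc_fn(np.ones(n), lam, rng, trials)             # equal coefficients
q_generic = conc_fn(1.0 + rng.random(n), lam, rng, trials)  # spread-out coefficients
print(q_arith, q_generic)  # strong arithmetic structure concentrates far more
```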

Deriving Matrix Concentration Inequalities from Kernel Couplings

May 3, 2013

86% Match
Daniel Paulin, Lester Mackey, Joel A. Tropp
Probability
Functional Analysis

This paper derives exponential tail bounds and polynomial moment inequalities for the spectral norm deviation of a random matrix from its mean value. The argument depends on a matrix extension of Stein's method of exchangeable pairs for concentration of measure, as introduced by Chatterjee. Recent work of Mackey et al. uses these techniques to analyze random matrices with additive structure, while the enhancements in this paper cover a wider class of matrix-valued random elem...

On the expectation of operator norms of random matrices

March 8, 2016

86% Match
Olivier Guédon, Aicke Hinrichs, ..., Joscha Prochno
Probability

We prove estimates for the expected value of operator norms of Gaussian random matrices with independent and mean-zero entries, acting as operators from $\ell^m_{p^*}$ to $\ell_q^n$, $1\leq p^* \leq 2 \leq q \leq \infty$.

Sums of random Hermitian matrices and an inequality by Rudelson

April 22, 2010

86% Match
Roberto Imbuzeiro Oliveira
Probability

We give a new, elementary proof of a key inequality used by Rudelson in the derivation of his well-known bound for random sums of rank-one operators. Our approach is based on Ahlswede and Winter's technique for proving operator Chernoff bounds. We also prove a concentration inequality for sums of random matrices of rank one with explicit constants.
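
A minimal numerical sketch of the rank-one setting (isotropic Gaussian vectors are an illustrative assumption): the average of $m$ independent rank-one matrices $x x^T$ with $E\, x x^T = I$ concentrates around the identity in spectral norm as $m$ grows.

```python
import numpy as np

rng = np.random.default_rng(5)
d = 50

# Deviation of the empirical second-moment matrix from the identity.
devs = {}
for m in (200, 800, 3200):
    X = rng.standard_normal((m, d))   # rows are isotropic vectors x_i
    M = X.T @ X / m                   # (1/m) sum_i x_i x_i^T
    devs[m] = np.linalg.norm(M - np.eye(d), 2)
print(devs)  # the deviation shrinks roughly like sqrt(d/m)
```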
