Bernstein Bounds
I noticed that papers on matrix/tensor completion frequently invoke the Bernstein inequality, so I looked up the Bernstein bounds discussed in Wainwright (2019).
A random variable $X$ with mean $\mu=\bbE[X]$ is sub-exponential if there are non-negative parameters $(\nu,\alpha)$ such that
$$
\bbE[e^{\lambda(X-\mu)}]\le e^{\frac{\nu^2\lambda^2}{2}}\quad \text{for all }\vert \lambda\vert < \frac{1}{\alpha}\,.
$$
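A standard example is a chi-squared variable: if $Z\sim N(0,1)$ and $X=Z^2$, then $\mu=1$ and a direct computation of the moment generating function gives
$$
\bbE[e^{\lambda(X-1)}]=\frac{e^{-\lambda}}{\sqrt{1-2\lambda}}\le e^{2\lambda^2}\quad\text{for all }\vert\lambda\vert<\frac 14\,,
$$
so $X$ is sub-exponential with parameters $(\nu,\alpha)=(2,4)$.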
Given a random variable $X$ with mean $\mu=\bbE[X]$ and variance $\sigma^2=\bbE[X^2]-\mu^2$, we say that Bernstein's condition with parameter $b$ holds if
$$
\vert \bbE[(X-\mu)^k]\vert \le \frac 12k!\sigma^2b^{k-2}\quad\text{for }k=3,4,\ldots
$$
One sufficient condition for Bernstein's condition to hold with parameter $b$ is that $X$ be bounded, with $\vert X-\mu\vert\le b$ almost surely.
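Indeed, if $\vert X-\mu\vert\le b$ almost surely, then for every $k\ge 3$
$$
\vert \bbE[(X-\mu)^k]\vert \le \bbE[\vert X-\mu\vert^{k-2}(X-\mu)^2]\le b^{k-2}\,\bbE[(X-\mu)^2]=\sigma^2 b^{k-2}\le \frac 12 k!\,\sigma^2 b^{k-2}\,,
$$
where the last step simply uses $\frac 12 k!\ge 1$.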
For any random variable satisfying the Bernstein condition, we have
$$
\bbE[e^{\lambda(X-\mu)}]\le e^{\frac{\lambda^2\sigma^2/2}{1-b\vert\lambda\vert}}\quad\text{for all }\vert\lambda\vert <\frac 1b\,.
$$
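This follows by expanding the moment generating function and controlling the higher-order terms with Bernstein's condition (a sketch of the standard argument): for $\vert\lambda\vert<1/b$,
$$
\bbE[e^{\lambda(X-\mu)}]=1+\frac{\lambda^2\sigma^2}{2}+\sum_{k=3}^{\infty}\frac{\lambda^k\bbE[(X-\mu)^k]}{k!}\le 1+\frac{\lambda^2\sigma^2}{2}\sum_{k=2}^{\infty}(b\vert\lambda\vert)^{k-2}=1+\frac{\lambda^2\sigma^2/2}{1-b\vert\lambda\vert}\le e^{\frac{\lambda^2\sigma^2/2}{1-b\vert\lambda\vert}}\,.
$$
In particular, restricting to $\vert\lambda\vert\le \frac{1}{2b}$ gives $1-b\vert\lambda\vert\ge \frac 12$ and hence $\bbE[e^{\lambda(X-\mu)}]\le e^{\lambda^2\sigma^2}$, so $X$ is sub-exponential with parameters $(\nu,\alpha)=(\sqrt 2\sigma,2b)$.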
Moreover, the following concentration inequality holds:
$$
P[\vert X-\mu\vert \ge t]\le 2e^{-\frac{t^2}{2(\sigma^2+bt)}}\quad\text{for all }t\ge 0\,.
$$
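Depending on the size of $t$, one of the two terms in the denominator dominates: using $\sigma^2+bt\le 2\max\{\sigma^2,bt\}$ gives
$$
P[\vert X-\mu\vert\ge t]\le
\begin{cases}
2e^{-\frac{t^2}{4\sigma^2}} & \text{if }0\le t\le \sigma^2/b\,,\\
2e^{-\frac{t}{4b}} & \text{if }t>\sigma^2/b\,,
\end{cases}
$$
that is, sub-Gaussian tails for small deviations and exponential tails for large deviations.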
A zero-mean symmetric random matrix $Q$ satisfies a Bernstein condition with parameter $b > 0$ if
$$
\bbE[Q^j]\preceq \frac 12j!b^{j-2}\var(Q)\quad\text{for }j=3,4,\ldots
$$
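As in the scalar case, boundedness is sufficient: if $\Vert Q\Vert_2\le b$ almost surely, then for every $j\ge 3$
$$
\bbE[Q^j]\preceq \bbE[\Vert Q\Vert_2^{\,j-2}Q^2]\preceq b^{j-2}\,\bbE[Q^2]=b^{j-2}\var(Q)\preceq \frac 12 j!\,b^{j-2}\var(Q)\,,
$$
where the first step uses $Q^j\preceq \Vert Q\Vert_2^{\,j-2}Q^2$, valid for any symmetric matrix.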
Let $Q_1,\ldots,Q_n$ be a sequence of independent, zero-mean, symmetric random matrices that satisfy the Bernstein condition with parameter $b>0$. Then for all $\delta>0$, the operator norm satisfies the tail bound
$$
P\left[ \Vert \frac 1n\sum_{i=1}^nQ_i\Vert_2\ge \delta \right]\le 2\rank\left(\sum_{i=1}^n\var(Q_i)\right)\exp\left\{-\frac{n\delta^2}{2(\sigma^2+b\delta)}\right\}\,,
$$
where $\sigma^2=\Vert \frac 1n\sum_{i=1}^n\var(Q_i)\Vert_2$.
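To get a feel for the constants, here is a minimal Monte Carlo sanity check in numpy. The construction is my own toy example (diagonal Rademacher matrices), chosen because $b$, $\sigma^2$ and the rank factor can be computed exactly; the empirical tail of the operator norm should sit below the bound whenever the bound is nontrivial.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy construction (for illustration only): Q_i = diag(eps_{i1}, ..., eps_{id})
# with independent Rademacher signs. Each Q_i is zero-mean and symmetric,
# Q_i^2 = I, so var(Q_i) = I and the Bernstein condition holds with b = 1.
# Hence sigma^2 = ||(1/n) sum_i var(Q_i)||_2 = 1 and rank(sum_i var(Q_i)) = d.
d, n, trials = 10, 400, 20000
b, sigma2, r = 1.0, 1.0, d

# For diagonal Q_i, ||(1/n) sum_i Q_i||_2 = max_j |(1/n) sum_i eps_{ij}|,
# so the Monte Carlo only needs the column means of the sign array.
eps = rng.integers(0, 2, size=(trials, n, d), dtype=np.int8) * 2 - 1
op_norms = np.abs(eps.mean(axis=1)).max(axis=1)

for delta in (0.10, 0.15, 0.20):
    empirical = (op_norms >= delta).mean()
    bound = 2 * r * np.exp(-n * delta**2 / (2 * (sigma2 + b * delta)))
    print(f"delta={delta:.2f}  empirical={empirical:.5f}  bound={bound:.5f}")
```

With these parameters the bound is loose but valid; the $2\,\rank(\cdot)$ prefactor is what replaces the factor of $2$ from the scalar inequality.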