Multidimensional Chebyshev's inequality
In probability theory, the multidimensional Chebyshev's inequality is a generalization of Chebyshev's inequality, which bounds the probability that a random variable differs from its expected value by more than a specified amount.
Let $X$ be an $N$-dimensional random vector with expected value $\mu = \operatorname{E}[X]$ and covariance matrix
\[ V = \operatorname{E}\big[(X - \mu)(X - \mu)^{T}\big]. \]
If $V$ is a positive-definite matrix, then for any real number $t > 0$:
\[ \Pr\!\left( \sqrt{(X - \mu)^{T} V^{-1} (X - \mu)} > t \right) \le \frac{N}{t^{2}}. \]
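As a sanity check, the bound can be verified by Monte Carlo simulation. The following sketch (Python with NumPy; the dimension, mean, and covariance below are arbitrary illustrative choices, not part of the theorem) samples a multivariate normal distribution and compares the empirical tail probability of the Mahalanobis distance with the bound $N/t^{2}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters (any mean and positive-definite V work).
N = 3
mu = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((N, N))
V = A @ A.T + N * np.eye(N)        # positive-definite covariance matrix

# Sample X ~ N(mu, V) and compute the Mahalanobis distance
# sqrt((X - mu)^T V^{-1} (X - mu)) for each sample.
samples = rng.multivariate_normal(mu, V, size=100_000)
Vinv = np.linalg.inv(V)
d = np.sqrt(np.einsum("ij,jk,ik->i", samples - mu, Vinv, samples - mu))

for t in (2.0, 3.0, 5.0):
    print(f"t={t}: empirical Pr = {np.mean(d > t):.4f}, bound N/t^2 = {N / t**2:.4f}")
```

The bound is distribution-free: it holds for any law with the given mean and covariance, so the Gaussian here is only a convenient test case.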
Proof
Since $V$ is positive-definite, so is $V^{-1}$. Define the random variable
\[ y = (X - \mu)^{T} V^{-1} (X - \mu). \]
Since $y$ is non-negative, Markov's inequality gives:
\[ \Pr\!\left( \sqrt{y} > t \right) = \Pr\!\left( y > t^{2} \right) \le \frac{\operatorname{E}[y]}{t^{2}}. \]
Finally,
\[ \operatorname{E}[y] = \operatorname{E}\big[(X - \mu)^{T} V^{-1} (X - \mu)\big] = \operatorname{E}\big[\operatorname{trace}\big(V^{-1} (X - \mu)(X - \mu)^{T}\big)\big] = \operatorname{trace}\big(V^{-1} V\big) = N, \]
where the second equality uses the cyclic property of the trace (a scalar equals its own trace) and the third uses linearity of expectation together with the definition of $V$. Substituting $\operatorname{E}[y] = N$ into Markov's inequality completes the proof.
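The identity $\operatorname{E}[y] = N$ is itself easy to check numerically. A minimal sketch (again Python with NumPy, with arbitrary parameters; the identity holds for any distribution with the stated mean and covariance):

```python
import numpy as np

rng = np.random.default_rng(1)

# Check E[(X - mu)^T V^{-1} (X - mu)] = N empirically (N chosen arbitrarily).
N = 4
mu = np.zeros(N)
A = rng.standard_normal((N, N))
V = A @ A.T + np.eye(N)            # positive-definite covariance

samples = rng.multivariate_normal(mu, V, size=200_000)
y = np.einsum("ij,jk,ik->i", samples - mu, np.linalg.inv(V), samples - mu)

print(f"mean of y ≈ {y.mean():.3f}  (exact value: N = {N})")
```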
Infinite dimensions
There is a straightforward extension of the vector version of Chebyshev's inequality to infinite-dimensional settings. Let $X$ be a random variable which takes values in a Fréchet space $\mathcal{X}$ (equipped with seminorms $\|\cdot\|_\alpha$). This includes most common settings of vector-valued random variables, e.g., when $\mathcal{X}$ is a Banach space (equipped with a single norm), a Hilbert space, or the finite-dimensional setting as described above.
Suppose that $X$ is of "strong order two", meaning that
\[ \operatorname{E}\big[\|X\|_\alpha^{2}\big] < \infty \]
for every seminorm $\|\cdot\|_\alpha$. This is a generalization of the requirement that $X$ have finite variance, and is necessary for this strong form of Chebyshev's inequality in infinite dimensions. The terminology "strong order two" is due to Vakhania.[1]
Let $\mu \in \mathcal{X}$ be the Pettis integral of $X$ (i.e., the vector generalization of the mean), and let
\[ \sigma_\alpha := \sqrt{\operatorname{E}\big[\|X - \mu\|_\alpha^{2}\big]} \]
be the standard deviation with respect to the seminorm $\|\cdot\|_\alpha$. In this setting we can state the following:
- General version of Chebyshev's inequality. For every $k > 0$:
\[ \Pr\big( \|X - \mu\|_\alpha \ge k \sigma_\alpha \big) \le \frac{1}{k^{2}}. \]
Proof. The proof is straightforward, and essentially the same as the finitary version. If $\sigma_\alpha = 0$, then $X$ is constant (and equal to $\mu$) almost surely, so the inequality is trivial.
If $\|X - \mu\|_\alpha \ge k \sigma_\alpha$, then $\|X - \mu\|_\alpha > 0$, so we may safely divide by $\|X - \mu\|_\alpha$. The crucial trick in Chebyshev's inequality is to recognize that $1 = \frac{\|X - \mu\|_\alpha^{2}}{\|X - \mu\|_\alpha^{2}}$ on this event.
The following calculations complete the proof:
\[
\begin{aligned}
\Pr\big( \|X - \mu\|_\alpha \ge k \sigma_\alpha \big)
&= \int_\Omega \mathbf{1}_{\|X - \mu\|_\alpha \ge k \sigma_\alpha} \, \mathrm{d}\Pr \\
&= \int_\Omega \frac{\|X - \mu\|_\alpha^{2}}{\|X - \mu\|_\alpha^{2}} \cdot \mathbf{1}_{\|X - \mu\|_\alpha \ge k \sigma_\alpha} \, \mathrm{d}\Pr \\
&\le \int_\Omega \frac{\|X - \mu\|_\alpha^{2}}{(k \sigma_\alpha)^{2}} \cdot \mathbf{1}_{\|X - \mu\|_\alpha \ge k \sigma_\alpha} \, \mathrm{d}\Pr
&& \text{(on the event, } \|X - \mu\|_\alpha \ge k \sigma_\alpha \text{)} \\
&\le \frac{1}{(k \sigma_\alpha)^{2}} \int_\Omega \|X - \mu\|_\alpha^{2} \, \mathrm{d}\Pr
&& \text{(dropping the indicator)} \\
&= \frac{1}{(k \sigma_\alpha)^{2}} \operatorname{E}\big[\|X - \mu\|_\alpha^{2}\big]
= \frac{\sigma_\alpha^{2}}{(k \sigma_\alpha)^{2}}
= \frac{1}{k^{2}}. \qquad \blacksquare
\end{aligned}
\]
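The infinite-dimensional bound can also be illustrated numerically. The sketch below (Python with NumPy, assuming a toy random function in a discretized $L^{2}([0,1])$; the sine-series construction is an arbitrary choice made purely for illustration) estimates $\sigma_\alpha$ for the $L^{2}$ norm and compares the empirical tail probability with $1/k^{2}$:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy Hilbert-space example: random functions on [0, 1], discretized on a grid.
# X(t) = sum_j Z_j * sin(j*pi*t) / j with i.i.d. standard normal Z_j
# (an arbitrary construction whose mean function is mu = 0).
n_grid, n_terms, n_samples = 200, 10, 50_000
t = np.linspace(0.0, 1.0, n_grid)
basis = np.array([np.sin(j * np.pi * t) / j for j in range(1, n_terms + 1)])

Z = rng.standard_normal((n_samples, n_terms))
X = Z @ basis                      # each row is one sample path on the grid

# L^2 norm of each path, approximated by a Riemann sum; here mu = 0.
dt = t[1] - t[0]
norms = np.sqrt(np.sum(X**2, axis=1) * dt)      # ||X - mu||_{L^2} per sample
sigma = np.sqrt(np.mean(norms**2))              # estimate of sigma_alpha

for k in (1.5, 2.0, 3.0):
    print(f"k={k}: empirical Pr = {np.mean(norms >= k * sigma):.4f}, "
          f"bound 1/k^2 = {1 / k**2:.4f}")
```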
References
- [1] Vakhania, Nikolai Nikolaevich. Probability distributions on linear spaces. New York: North-Holland, 1981.