Stability (probability)
In probability theory, the stability of a random variable is the property that a linear combination of two independent copies of the variable has the same distribution, up to location and scale parameters.[1] The distributions of random variables having this property are said to be "stable distributions". It can be shown that all distributions having this property form a four-parameter family of distributions. The article on the stable distribution describes this family together with some of the properties of these distributions.
The importance in probability theory of "stability" and of the stable family of probability distributions is that they are the "attractors" for properly normed and centred sums of independent and identically distributed random variables.
Important special cases of stable distributions are the normal distribution, the Cauchy distribution and the Lévy distribution. For details see stable distribution.
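For instance (a standard calculation, not specific to the sources cited below), the stability of the normal distribution can be seen directly from the addition rule for independent normal variables: if X, X1 and X2 are independent copies of a N(μ, σ²) random variable, then

$$X_1 + X_2 \sim N(2\mu,\, 2\sigma^2) \;\stackrel{d}{=}\; \sqrt{2}\,X + \bigl(2-\sqrt{2}\bigr)\mu ,$$

so the sum of two independent copies is a rescaled and shifted copy of X; in the notation of the definition below, c2 = √2 = 2^{1/2} and d2 = (2 − √2)μ, corresponding to α = 2.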
Definition
There are several basic definitions for what is meant by stability. Some are based on summations of random variables and others on properties of characteristic functions.
Definition via distribution functions
Feller[2] makes the following basic definition. A random variable X is called stable (has a stable distribution) if, for n independent copies Xi of X, there exist constants cn > 0 and dn such that

$$X_1 + X_2 + \cdots + X_n \;\stackrel{d}{=}\; c_n X + d_n ,$$

where $\stackrel{d}{=}$ denotes equality of distributions. A conclusion drawn from this starting point is that the sequence of constants cn must be of the form

$$c_n = n^{1/\alpha} \quad\text{for some } \alpha \text{ with } 0 < \alpha \le 2 .$$
A further conclusion is that it is enough for the above distributional identity to hold for n=2 and n=3 only.[3]
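The defining identity can also be checked numerically. The sketch below is not taken from the cited sources; it assumes SciPy's levy_stable distribution, an arbitrary choice of index α = 1.5 with skewness β = 0, and uses the fact that for a symmetric stable law centred at zero the shift constant d2 is zero, so that (X1 + X2)/2^{1/α} should have the same distribution as X.

```python
# Monte Carlo sanity check of the stability identity for n = 2:
# (X1 + X2) / 2**(1/alpha) should be distributed like X itself.
# Assumes SciPy's levy_stable; alpha and the sample size are arbitrary choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha, beta = 1.5, 0.0          # symmetric stable law, centred at zero
n_samples = 20_000

# two independent copies of X
x1 = stats.levy_stable.rvs(alpha, beta, size=n_samples, random_state=rng)
x2 = stats.levy_stable.rvs(alpha, beta, size=n_samples, random_state=rng)

# rescale the sum by c_2 = 2**(1/alpha); d_2 = 0 in this symmetric, centred case
rescaled_sum = (x1 + x2) / 2 ** (1 / alpha)

# fresh draws from the same stable law for comparison
x_fresh = stats.levy_stable.rvs(alpha, beta, size=n_samples, random_state=rng)

# a large p-value is consistent with equality of distributions
print(stats.ks_2samp(rescaled_sum, x_fresh))
```

A large p-value in the Kolmogorov–Smirnov test is consistent with the two samples coming from the same distribution; this is only a sanity check of the identity for n = 2, not a proof.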
Stability in probability theory
A number of mathematical results can be derived for distributions which have the stability property; that is, consideration is given here to all possible families of distributions that are closed under convolution.[4] It is convenient to call these stable distributions, without meaning specifically the distribution described in the article named stable distribution, and to say that a distribution is stable if it has the stability property. The following results can be obtained for univariate distributions which are stable.
- Stable distributions are always infinitely divisible;[5] a short argument is sketched after this list.
- All stable distributions are absolutely continuous.[6]
- All stable distributions are unimodal.[7]
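The first of these properties follows directly from the defining identity (this short argument is not taken from the sources cited above): rearranging gives

$$X \;\stackrel{d}{=}\; \frac{(X_1 - d_n/n) + \cdots + (X_n - d_n/n)}{c_n} ,$$

so for every n a stable random variable can be written as the sum of n independent and identically distributed terms, which is precisely infinite divisibility.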
Other types of stability
The above concept of stability is based on the idea of a class of distributions being closed under a given set of operations on random variables, where the operation is "summation" or "averaging". Other operations that have been considered include:
- geometric stability: here the operation is to take the sum of a random number of random variables, where the number of terms has a geometric distribution independent of the summands.[8] The counterpart of the stable distribution in this case is the geometric stable distribution.
- max-stability: here the operation is to take the maximum of a number of random variables. The counterpart of the stable distribution in this case is the generalized extreme value distribution, and the theory for this case is dealt with as extreme value theory. See also the stability postulate. A version of this case in which the minimum is taken instead of the maximum is available by a simple extension. Brief illustrations of both cases are given after this list.
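As simple illustrations (standard examples, not drawn from the sources cited here): a geometric sum of independent exponential variables is again exponential with a rescaled rate, so the exponential distribution is geometric stable; and the Gumbel distribution, with distribution function F(x) = exp(−e^{−x}), is max-stable, since the maximum Mn of n independent copies satisfies

$$\Pr(M_n \le x) = F(x)^n = \exp\!\bigl(-n e^{-x}\bigr) = F(x - \ln n) ,$$

so Mn has the same distribution as a single copy shifted by ln n.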
Notes
- Lukacs, E. (1970) Section 5.7
- Feller (1971), Section VI.1
- Feller (1971), Problem VI.13.3
- Lukacs, E. (1970) Section 5.7
- Lukacs, E. (1970) Theorem 5.7.1
- Lukacs, E. (1970) Theorem 5.8.1
- Lukacs, E. (1970) Theorem 5.10.1
- Klebanov et al. (1984)
References
- Lukacs, E. (1970) Characteristic Functions. Griffin, London.
- Feller, W. (1971) An Introduction to Probability Theory and Its Applications, Volume 2. Wiley. ISBN 0-471-25709-5
- Klebanov, L.B., Maniya, G.M., Melamed, I.A. (1984) "A problem of V. M. Zolotarev and analogues of infinitely divisible and stable distributions in a scheme for summation of a random number of random variables". Theory Probab. Appl., 29, 791–794