Universal probability bound
A universal probability bound is a probabilistic threshold whose existence is asserted by William A. Dembski and is used by him in his works promoting intelligent design. It is defined as
A degree of improbability below which a specified event of that probability cannot reasonably be attributed to chance regardless of whatever probabilistic resources from the known universe are factored in.[1]
Dembski asserts that one can effectively estimate a positive value that serves as a universal probability bound. The existence of such a bound would imply that certain kinds of random events, namely those whose probability lies below this value, can be assumed not to have occurred in the observable universe, given the resources available in its entire history. Contrapositively, Dembski uses the threshold to argue that the occurrence of certain events cannot be attributed to chance alone. The universal probability bound is then used to argue against random evolution. However, evolution is not based on random events alone (such as genetic drift), but also on natural selection.
The idea that events with fantastically small but positive probabilities are effectively negligible[2] was discussed by the French mathematician Émile Borel, primarily in the context of cosmology and statistical mechanics.[3] However, there is no widely accepted scientific basis for claiming that certain positive values are universal cutoff points for effective negligibility of events. Borel, in particular, was careful to point out that negligibility was relative to a model of probability for a specific physical system.[4][5]
Dembski appeals to cryptographic practice in support of the concept of a universal probability bound, noting that cryptographers have sometimes compared the security of encryption algorithms against brute-force attacks by estimating the likelihood of success of an adversary whose computational resources are bounded by very large physical constraints. Such a constraint might be obtained, for example, by assuming that every atom in the observable universe is a computer of a certain type and that these computers run through and test every possible key. Universal measures of security are used much less frequently than asymptotic ones,[6] and the fact that a keyspace is very large may be less relevant if the cryptographic algorithm used has vulnerabilities that make it susceptible to other kinds of attacks.[7] However, asymptotic approaches and directed attacks are, by definition, unavailable under chance-based scenarios such as those relevant to Dembski's universal probability bound. As a result, Dembski's appeal to cryptography is best understood as referring to brute-force attacks rather than directed attacks.
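As a rough illustration of the brute-force comparison (this calculation is not from Dembski's writings), one can ask how long a key would have to be before its keyspace exceeds the roughly 10^150 trials that Dembski's original bound, derived in the next section, allows the universe to perform:

```python
import math

# Dembski's original bound allows roughly 10^150 elementary trials in
# the history of the observable universe (figure taken from the article;
# this snippet is illustrative arithmetic only).
MAX_TRIALS = 10 ** 150

def min_key_bits(trials: int) -> int:
    """Smallest key length in bits whose keyspace (2**bits) exceeds `trials`."""
    return math.ceil(math.log2(trials + 1))

# A 256-bit keyspace (~1.2e77 keys) is far smaller than 10^150, so keyspace
# size alone is meaningless without also bounding the attacker's physical
# resources -- which is exactly what a universal bound attempts to do.
print(2 ** 256 < MAX_TRIALS)    # True
print(min_key_bits(MAX_TRIALS)) # 499
```

On these assumptions, only keys of about 499 bits or more have keyspaces that could not even in principle be exhausted by a universe-scale brute-force search.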
Dembski's estimate
Dembski's original value for the universal probability bound is 1 in 10^150, derived as the inverse of the product of the following approximate quantities:[8][9]
- 10^80, the number of elementary particles in the observable universe.
- 10^45, the maximum rate per second at which transitions in physical states can occur (i.e., the inverse of the Planck time).
- 10^25, a billion times longer than the typical estimated age of the universe in seconds.
Thus, 10^150 = 10^80 × 10^45 × 10^25. Hence, this value corresponds to an upper limit on the number of physical events that could possibly have occurred in the observable part of the universe since the Big Bang.
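The product above can be checked directly; this sketch simply multiplies the three factors as exact integers:

```python
# The three factors from Dembski's derivation, as given above.
particles = 10 ** 80             # elementary particles in the observable universe
transitions_per_sec = 10 ** 45   # inverse of the Planck time
seconds = 10 ** 25               # ~a billion times the age of the universe in seconds

# Upper limit on the number of physical events since the Big Bang,
# whose inverse is the universal probability bound of 1 in 10^150.
max_events = particles * transitions_per_sec * seconds
print(max_events == 10 ** 150)   # True
```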
Dembski has more recently (as of 2005) refined his definition to be the inverse of the product of two different quantities:[10]
- An upper bound on the computational resources of the universe in its entire history. This is estimated by Seth Lloyd as 10^120 elementary logic operations on a register of 10^90 bits.[11][12]
- The (variable) rank complexity of the event under consideration.[13]
If the latter quantity equals 10^150, then the overall universal probability bound corresponds to the original value.
References
- ISCID Encyclopedia of Science and Philosophy (1999)
- Negligible means having probability zero. Effectively negligible means, roughly, that in some operational sense or in some computational sense, the event is indistinguishable from a negligible one.
- Émile Borel, Elements of the Theory of Probability (translated by John Freund), Prentice Hall, 1965, Chapter 6. See also Citations from Borel's articles.
- Though Dembski credits Borel for the idea, there is clear evidence that Borel, following accepted scientific practice in the foundations of statistics, was not referring to a universal bound, independent of the statistical model used.
- Cobb, L. (2005) Borel's Law and Creationism, Aetheling Consultants.
- For a precise definition of effective negligibility in cryptography, see Michael Luby, Pseudorandomness and Cryptographic Applications, Princeton Computer Science Series, 1996.
- Though Dembski repeatedly appeals to cryptography in support of the concept of the universal probability bound, in practice cryptographers hardly use measures which are in any way related to it. A more useful concept is that of work factor. See p. 44, A. J. Menezes, P. C. van Oorschot, S. A. Vanstone, Handbook of Applied Cryptography, CRC Press, 1996.
- William A. Dembski (1998). The Design Inference pg 213, section 6.5
- William A. Dembski (2004). The Design Revolution: Answering the Toughest Questions About Intelligent Design pg 85
- William A. Dembski (2005). "Specification: The Pattern That Signifies Intelligence" (382k PDF).
- Lloyd, Seth (2002). "Computational Capacity of the Universe". Physical Review Letters. 88 (23): 237901. arXiv:quant-ph/0110141. doi:10.1103/PhysRevLett.88.237901. PMID 12059399. S2CID 6341263.
- The number 10^90 seems to play no role in Dembski's analysis. On page 23 of Specification: The Pattern That Signifies Intelligence, Dembski says: "Lloyd has shown that 10^120 constitutes the maximal number of bit operations that the known, observable universe could have performed throughout its entire multi-billion year history."
- The rank complexity is Dembski's φ function which ranks patterns in order of their descriptive complexity. See specified complexity.