PCP theorem

In computational complexity theory, the PCP theorem (also known as the PCP characterization theorem) states that every decision problem in the NP complexity class has probabilistically checkable proofs (proofs that can be checked by a randomized algorithm) of constant query complexity (the verifier reads only a constant number of proof bits) and logarithmic randomness complexity (it uses a logarithmic number of random bits).

The PCP theorem says that for some universal constant K, for every n, any mathematical proof for a statement of length n can be rewritten as a different proof of length poly(n) that is formally verifiable with 99% accuracy by a randomized algorithm that inspects only K letters of that proof.

The PCP theorem is the cornerstone of the theory of computational hardness of approximation, which investigates the inherent difficulty in designing efficient approximation algorithms for various optimization problems. It has been described by Ingo Wegener as "the most important result in complexity theory since Cook's theorem"[1] and by Oded Goldreich as "a culmination of a sequence of impressive works […] rich in innovative ideas".[2]

Formal statement

The PCP theorem states that

NP = PCP[O(log n), O(1)],

where PCP[r(n), q(n)] is the class of problems for which a probabilistically checkable proof of a solution can be given, such that the proof can be checked in polynomial time using r(n) bits of randomness and by reading q(n) bits of the proof, correct proofs are always accepted, and incorrect proofs are rejected with probability at least 1/2. Here n is the length in bits of the description of a problem instance. Note further that the verification algorithm is non-adaptive: the choice of which bits of the proof to check depends only on the random bits and the description of the problem instance, not on the actual bits of the proof.
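The shape of such a verifier can be made concrete with a toy sketch. The following is purely illustrative (it is not the construction behind the theorem): the "language" consists of constant strings, and the verifier reads q = 2 positions derived from its random bits and accepts iff they agree, so a constant proof is always accepted while a half-and-half string is rejected with probability about 1/2. The helper names (`two_positions`, `agree`) are invented for this sketch.

```python
import random

def pcp_verify(proof, r_bits, q, choose_positions, predicate):
    """Toy sketch of a non-adaptive PCP-style check: the queried positions
    are a function of the randomness alone, fixed before any proof bit is read."""
    randomness = random.getrandbits(r_bits)
    positions = choose_positions(len(proof), randomness)[:q]  # depends only on randomness
    return predicate([proof[i] for i in positions])           # reads only q proof bits

# Toy example: "valid proofs" are the constant strings. The verifier reads
# two positions derived from the random bits and accepts iff they agree.
def two_positions(n, randomness):
    return [randomness % n, (randomness // n) % n]

agree = lambda bits: bits[0] == bits[1]

good = [1] * 16           # a constant string: accepted with probability 1
bad = [0] * 8 + [1] * 8   # half-and-half: rejected with probability about 1/2

assert pcp_verify(good, r_bits=8, q=2,
                  choose_positions=two_positions, predicate=agree)
```

Note that `two_positions` never sees the proof itself, which is exactly the non-adaptivity property described above.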

PCP and hardness of approximation

An alternative formulation of the PCP theorem states that the maximum fraction of satisfiable constraints of a constraint satisfaction problem is NP-hard to approximate within some constant factor.[3]

Formally, for some constants q and α < 1, the following promise problem (Lyes, Lno) is an NP-hard decision problem:

  • Lyes = {Φ: all constraints in Φ are simultaneously satisfiable}
  • Lno = {Φ: every assignment satisfies fewer than an α fraction of Φ's constraints},

where Φ is a constraint satisfaction problem (CSP) over a Boolean alphabet with at most q variables per constraint.

The connection to the class PCP mentioned above can be seen by noticing that checking a constant number q of bits in a proof can be viewed as evaluating a constraint in q Boolean variables on those bits of the proof. Since the verification algorithm uses O(log n) bits of randomness, there are only poly(n) possible random strings, so the verifier can be represented as a CSP as described above with poly(n) constraints (one per random string). The other characterization of the PCP theorem then guarantees the promise condition with α = 1/2: if the NP problem's answer is yes, then every constraint (each corresponding to a particular value of the random bits) has a common satisfying assignment (an accepting proof); otherwise, every proof is rejected with probability at least 1/2, which means every assignment satisfies fewer than 1/2 of the constraints (equivalently, is accepted with probability less than 1/2). Therefore, an algorithm for the promise problem would solve the underlying NP problem, and hence the promise problem is NP-hard.
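The quantity the promise problem asks about, the maximum fraction of simultaneously satisfiable constraints, can be computed by brute force on tiny instances. A minimal sketch (the representation of constraints as index-tuple/predicate pairs is an assumption of this illustration, not a standard encoding); its exponential running time in the number of variables is the point, since the gap version of the problem is NP-hard:

```python
from itertools import product

def max_sat_fraction(n_vars, constraints):
    """Brute-force the best fraction of satisfied constraints of a Boolean CSP.
    Each constraint is (tuple of variable indices, predicate on those values).
    Exponential in n_vars -- fine for illustration, infeasible in general."""
    best = 0
    for assignment in product([0, 1], repeat=n_vars):
        sat = sum(pred(*(assignment[i] for i in idx)) for idx, pred in constraints)
        best = max(best, sat)
    return best / len(constraints)

# A tiny CSP with q = 2 variables per constraint, whose two constraints
# are mutually contradictory:
xor = lambda a, b: a != b
eq = lambda a, b: a == b
phi = [((0, 1), xor), ((0, 1), eq)]
print(max_sat_fraction(2, phi))  # prints 0.5: every assignment satisfies half
```

In the language of the promise problem, this instance belongs to Lno for any α > 1/2, since no assignment satisfies more than half of the constraints.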

As a consequence of this theorem, it can be shown that the solutions to many natural optimization problems, including maximum Boolean formula satisfiability, maximum independent set in graphs, and the shortest vector problem for lattices, cannot be approximated efficiently unless P = NP. This can be done by reducing the problem of approximating a solution to such problems to a promise problem of the above form. These results are sometimes also called PCP theorems because they can be viewed as probabilistically checkable proofs for NP with some additional structure.

Proof

A proof of a weaker result, NP ⊆ PCP[n³, 1], is given in one of the lectures of Dexter Kozen.[4]
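Constructions of this kind typically rest on property tests that read only a few bits of an encoded proof; a classic ingredient is the Blum–Luby–Rubinfeld linearity test. A minimal sketch, assuming the proof encodes a Boolean function on m-bit inputs (queried here as a Python callable standing in for truth-table access): each trial reads the function at three correlated points, so a linear function over GF(2) always passes, while a function far from linear is rejected with constant probability per trial.

```python
import random

def blr_linearity_test(f, m, trials=100):
    """Blum-Luby-Rubinfeld test: query f at three correlated points per trial
    and check f(x) XOR f(y) == f(x XOR y). A linear function always passes;
    a function far from every linear function is rejected with constant
    probability in each trial."""
    for _ in range(trials):
        x = random.getrandbits(m)
        y = random.getrandbits(m)
        if f(x) ^ f(y) != f(x ^ y):
            return False  # reject: f cannot be linear
    return True           # probably close to linear

# A linear function over GF(2): parity of a fixed subset of the input bits.
mask = 0b1011
linear = lambda x: bin(x & mask).count("1") % 2
assert blr_linearity_test(linear, m=4)
```

Reading three bits per trial and repeating a constant number of times is one way a verifier can gain constant soundness while keeping the query complexity constant.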

History

The PCP theorem is the culmination of a long line of work on interactive proofs and probabilistically checkable proofs. The first theorem relating standard proofs and probabilistically checkable proofs is the statement that NEXP ⊆ PCP[poly(n), poly(n)], proved by Babai, Fortnow & Lund (1990).

Origin of the initials

The notation PCP_{c(n), s(n)}[r(n), q(n)] is explained at probabilistically checkable proof. The notation is that of a function that returns a certain complexity class, parameterized by completeness c(n), soundness s(n), randomness complexity r(n), and query complexity q(n).

The name of this theorem (the "PCP theorem") probably comes either from "PCP" meaning "probabilistically checkable proof", or from the notation mentioned above (or both).


Subsequently, the methods used in the work of Babai, Fortnow & Lund (1990) were extended by Babai, Lance Fortnow, Levin, and Szegedy in 1991 (Babai et al. 1991), by Feige, Goldwasser, Lund, Safra, and Szegedy (1991), and by Arora and Safra in 1992 (Arora & Safra 1992), culminating in a proof of the PCP theorem by Arora, Lund, Motwani, Sudan, and Szegedy (Arora et al. 1998).

The 2001 Gödel Prize was awarded to Sanjeev Arora, Uriel Feige, Shafi Goldwasser, Carsten Lund, László Lovász, Rajeev Motwani, Shmuel Safra, Madhu Sudan, and Mario Szegedy for work on the PCP theorem and its connection to hardness of approximation.

In 2005 Irit Dinur discovered a significantly simpler proof of the PCP theorem, using expander graphs.[5] She received the 2019 Gödel Prize for this.[6]

Quantum analog

In 2012, Thomas Vidick and Tsuyoshi Ito published a result[7] that showed a "strong limitation on the ability of entangled provers to collude in a multiplayer game". This could be a step toward proving the quantum analogue of the PCP theorem, since when the result[7] was reported in the media,[8][9] professor Dorit Aharonov called it "the quantum analogue of an earlier paper on multiprover interactive proofs" that "basically led to the PCP theorem".[9]

In 2018, Thomas Vidick and Anand Natarajan proved[10] a games variant of the quantum PCP theorem under randomized reduction. It states that QMA ⊆ MIP*[log(n), 1, 1/2], where MIP*[f(n), c, s] is the complexity class of multi-prover quantum interactive proof systems with f(n)-bit classical communication, completeness c, and soundness s. They also showed that the Hamiltonian version of the quantum PCP conjecture, namely that the local Hamiltonian problem with constant promise gap c − s is QMA-hard, implies the games quantum PCP theorem.

The NLTS (no low-energy trivial states) conjecture is a fundamental unresolved obstacle and precursor to a quantum analog of the PCP theorem.[11]

Notes

  1. Ingo Wegener (2005). Complexity Theory: Exploring the Limits of Efficient Algorithms. Springer. p. 161. ISBN 978-3-540-21045-0.
  2. Oded Goldreich (2008). Computational Complexity: A Conceptual Perspective. Cambridge University Press. p. 405. ISBN 978-0-521-88473-0.
  3. Arora, Sanjeev; Barak, Boaz (2009). Computational Complexity: a Modern Approach (PDF) (Draft). Cambridge University Press.
  4. Kozen, Dexter C. (2006). Theory of Computation. Texts in Computer Science. London: Springer-Verlag. pp. 119–127. ISBN 9781846282973.
  5. See the 2005 preprint, ECCC TR05-046. The authoritative version of the paper is Dinur (2007).
  6. EATSC 2019 Gödel Prize, retrieved 2019-09-11.
  7. Ito, Tsuyoshi; Vidick, Thomas (2012). "A multi-prover interactive proof for NEXP sound against entangled provers". 53rd Annual IEEE Symposium on Foundations of Computer Science, FOCS 2012, New Brunswick, NJ, USA, October 20–23, 2012. IEEE Computer Society. pp. 243–252. arXiv:1207.0550. doi:10.1109/FOCS.2012.11.
  8. Hardesty, Larry (2012-07-30). "MIT News Release: 10-year-old problem in theoretical computer science falls". MIT News Office. Archived from the original on 2014-02-02. Retrieved 2012-08-10. Interactive proofs are the basis of cryptographic systems now in wide use, but for computer scientists, they're just as important for the insight they provide into the complexity of computational problems.
  9. Hardesty, Larry (2012-07-31). "10-year-old problem in theoretical computer science falls". MIT News Office. Archived from the original on 2012-08-01. Retrieved 2012-08-10. Dorit Aharonov, a professor of computer science and engineering at Hebrew University in Jerusalem, says that Vidick and Ito's paper is the quantum analogue of an earlier paper on multiprover interactive proofs that "basically led to the PCP theorem, and the PCP theorem is no doubt the most important result of complexity in the past 20 years." Similarly, she says, the new paper "could be an important step toward proving the quantum analogue of the PCP theorem, which is a major open question in quantum complexity theory."
  10. Natarajan, A.; Vidick, T. (October 2018). "Low-Degree Testing for Quantum States, and a Quantum Entangled Games PCP for QMA". 2018 IEEE 59th Annual Symposium on Foundations of Computer Science (FOCS). pp. 731–742. arXiv:1801.03821. Bibcode:2018arXiv180103821N. doi:10.1109/FOCS.2018.00075. ISBN 978-1-5386-4230-6. S2CID 53062680.
  11. "On the NLTS Conjecture". Simons Institute for the Theory of Computing. 2021-06-30. Retrieved 2022-08-08.
