Understanding Probabilistic Complexity Classes in Complexity Theory

Explore probabilistic complexity classes in complexity theory, focusing on the min-Cut problem. Learn how a simple randomized algorithm finds a minimum set of edges whose removal separates a graph into more than one piece, how its success probability is analyzed, and how this leads to the probabilistic classes PP, BPP, and R.

  • Complexity Theory
  • Probabilistic Classes
  • Min-Cut Problem
  • Algorithm
  • Graph Separation




Presentation Transcript


  1. CSCE-637 Complexity Theory, Fall 2020. Instructor: Jianer Chen. Office: HRBB 338C. Phone: 845-4259. Email: chen@cse.tamu.edu. Notes 17: Probabilistic Complexity Classes.

  2. Probabilistic Complexity Classes

  3. Probabilistic Complexity Classes The min-Cut Problem. Given a graph G, remove the minimum number of edges in G to separate G into more than one piece.

  4. Probabilistic Complexity Classes
  The min-Cut Problem. Given a graph G, remove the minimum number of edges in G to separate G into more than one piece.
  An algorithm for min-Cut.
  Input: a graph G
    0. minC = E;
    1. Repeat h times
       1.1 G1 = G;
       1.2 While (G1 has > 2 vertices) randomly pick an edge in G1 and shrink it;
       1.3 let E' be the set of edges between the 2 vertices of G1;
       1.4 If (|minC| > |E'|) minC = E';
    2. return(minC).

  5.-19. [Slides 5-19 repeat the problem statement and the algorithm above while illustrating one run of the random contraction on an example graph with vertices A, B, C, D, E, F, shrinking one randomly chosen edge at a time until only two super-vertices remain. A sketch of the same process in code follows.]
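
The contraction that slides 5-19 walk through pictorially can be written out in a few lines. The following Python sketch follows the algorithm as described above, assuming a connected graph given as an edge list over vertices 0..n-1; the function names (contract_once, karger_min_cut) and the data representation are illustrative choices, not part of the original notes.

import random

def contract_once(n, edges):
    """One run of random contraction; returns E', the edges between the final two super-vertices."""
    label = list(range(n))                 # label[v] = super-vertex currently containing v
    work = list(edges)                     # edges whose endpoints lie in different super-vertices
    remaining = n
    while remaining > 2:                   # step 1.2: randomly pick an edge and shrink it
        u, v = random.choice(work)
        a, b = label[u], label[v]
        label = [a if x == b else x for x in label]                  # merge super-vertex b into a
        work = [(x, y) for (x, y) in work if label[x] != label[y]]   # drop self-loops
        remaining -= 1
    return work                            # step 1.3: E'

def karger_min_cut(n, edges, h):
    """Repeat the contraction h times and keep the smallest cut found (minC)."""
    min_cut = list(edges)                  # minC = E
    for _ in range(h):
        cut = contract_once(n, edges)
        if len(cut) < len(min_cut):        # step 1.4
            min_cut = cut
    return min_cut                         # step 2: return(minC)

# Example on a 6-vertex graph, with h = 5*n^2 as in the analysis below:
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4), (3, 5), (4, 5)]
print(karger_min_cut(6, edges, h=5 * 6 ** 2))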

  20.-26. Probabilistic Complexity Classes
  The min-Cut Problem and the algorithm are as above.
  Analysis
    1. Suppose that the min-Cut has size k;
    2. then every vertex has degree ≥ k;
    3. so #edges ≥ kn/2;
    4. Let a min-Cut be C: |C| = k;
    5. The probability that a randomly picked edge is not in C = (#edges − k)/#edges ≥ (kn/2 − k)/(kn/2) = 1 − 2/n;
    6. The probability that no edge picked in step 1.2 is in C ≥ ∏_{i=1}^{n−2} (1 − 2/(n−i+1)) = 2/(n(n−1)) ≥ 2/n². (Before the i-th contraction G1 has n−i+1 vertices, so the bound in 5 becomes 1 − 2/(n−i+1), and the product telescopes.)
    7. Thus, if we choose h = 5n², then the algorithm has a high probability (> 0.999) of returning a real min-cut in step 2.
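
To see where the 0.999 figure comes from: each run finds the min-cut with probability at least 2/n², so all h = 5n² independent runs miss it with probability at most (1 − 2/n²)^(5n²) ≤ e^(−10) ≈ 4.5 × 10⁻⁵, well below 0.001. A quick numerical check (a small illustration, not part of the original notes):

# Failure probability of h = 5*n^2 runs, each succeeding with probability >= 2/n^2.
for n in (10, 100, 1000):
    h = 5 * n ** 2
    failure = (1 - 2 / n ** 2) ** h        # all h runs miss the min-cut
    print(n, failure, (1 - failure) > 0.999)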

  27.-31. Probabilistic Complexity Classes
  Definition. A probabilistic (or randomized) algorithm (or Turing machine) is one that can toss a fair coin (with equal probability) to make a decision between two given options.
  Implementation.
  A probabilistic Turing machine may have two transitions for the same state and symbol, each taken with probability 1/2: δ(stoss, a) = (s1, b1, m1); δ(stoss, a) = (s2, b2, m2).
  A randomized algorithm may have statements like: toss a coin; If (head) statement 1, Else statement 2.
  Probabilistic computation can be implemented using (pseudo-)random number generators.
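
As a small illustration of the last remark, a coin toss can be simulated with a standard pseudo-random generator; the snippet below is an illustrative Python sketch, not tied to any particular machine model.

import random

def coin_toss():
    """Simulate a fair coin with one pseudo-random bit."""
    return random.getrandbits(1) == 1      # True = head, False = tail

def randomized_choice():
    # toss a coin: If (head) statement 1, Else statement 2.
    return "statement 1" if coin_toss() else "statement 2"

print([randomized_choice() for _ in range(5)])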

  32. Probabilistic Complexity Classes

  33.-44. Probabilistic Complexity Classes
  Definition. A decision problem L is in PP if there is a poly-time probabilistic Turing machine M such that for each x in L, M on x returns YES with probability > 1/2, and for each y not in L, M on y returns NO with probability ≥ 1/2 (i.e., YES with probability ≤ 1/2).
  Definition. A decision problem L is in BPP if there is a poly-time probabilistic Turing machine M such that for each x in L, M on x returns YES with probability ≥ 2/3, and for each y not in L, M on y returns NO with probability ≥ 2/3. (The constant 2/3 can be any c with 1/2 < c < 1.)
  Definition. A decision problem L is in VPP (or R) if there is a poly-time probabilistic Turing machine M such that for each x in L, M on x returns YES with probability ≥ 1/2, and for each y not in L, M on y returns NO with probability 1. (The constant 1/2 can be any ε with 0 < ε < 1.)
  To see that 1/2 can be replaced by any ε, let Mε be a poly-time probabilistic machine that accepts every x in L with probability ≥ ε and never accepts any y not in L, and build:
  M1/2-machine(x): 1. repeat h times: If (Mε(x) accepts) Then accept; 2. reject.
    1. If y is not in L, Mε(y) never accepts, so step 1 of M1/2 cannot accept y. So M1/2 rejects y with prob. 1.
    2. If x is in L, the prob. that Mε(x) rejects is ≤ 1 − ε. The prob. that M1/2 rejects x = the prob. that Mε(x) rejects in all h repetitions ≤ (1 − ε)^h < 1/2 for a suitable constant h. So the prob. that M1/2 accepts x is ≥ 1/2.
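
How large h must be is a one-line calculation (filling in the bound (1 − ε)^h < 1/2 used in step 2; this arithmetic is an addition, not on the slides): since 1 − ε ≤ e^(−ε),

    (1 − ε)^h ≤ e^(−εh) ≤ e^(−1) < 1/2   whenever h ≥ 1/ε,

so h = ⌈1/ε⌉ repetitions suffice. For instance, an R-machine that accepts yes-instances only with probability ε = 1/100 needs at most h = 100 repetitions, and M1/2 remains polynomial-time.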

  45.-50. Probabilistic Complexity Classes
  (Definitions of PP, BPP, and VPP (or R) as above.)
  Theorem. R ⊆ BPP ⊆ PP.
  Proof. Directly from the definitions (for R ⊆ BPP, use the amplification above to raise the acceptance probability from 1/2 to 2/3).
  Theorem. R ⊆ NP.
  Proof. Use the R-machine as an NP-machine: treat its coin tosses as nondeterministic choices. If x is in L, some sequence of tosses accepts; if x is not in L, no sequence accepts.
  Theorem. NP ⊆ PP.
  Proof. Let Q be in NP, accepted by an NP-machine MNP.
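
A standard way to complete this construction (a sketch of the usual argument, which may differ in details from the continuation on the original slides, and which assumes MNP uses at most p(|x|) nondeterministic bits): build a probabilistic machine M that, on input x, first tosses one coin; on head it accepts immediately, and on tail it tosses p(|x|) more coins, uses them to resolve the nondeterministic choices of MNP on x, and accepts iff that computation path accepts. If a is the fraction of accepting paths of MNP on x, then M accepts x with probability 1/2 + a/2. For x in Q we have a > 0, so M returns YES with probability > 1/2; for x not in Q we have a = 0, so M returns NO with probability exactly 1/2 ≥ 1/2. Hence Q is in PP, and M runs in polynomial time.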
