
Relativized Complexity Classes
Explore the study of oracles in complexity theory that separate classes such as P, NP, and the poly-time hierarchy. Delve into the nuances of relativized complexity through examples like the Furst-Saxe-Sipser theorem and the parity function. Consider the implications of oracle sets in providing insights into the P = NP problem.
CSCE-637 Complexity Theory, Fall 2020
Instructor: Jianer Chen
Office: HRBB 338C; Phone: 845-4259; Email: chen@cse.tamu.edu
Notes 24: Approximation Algorithms
Relativized Complexity

Therefore,
1. there is an oracle set A such that P^A = NP^A; and
2. there is an oracle set B such that P^B ≠ NP^B.

This hints that settling the P = NP problem probably cannot be trivial: any proof technique that relativizes (i.e., goes through unchanged for every oracle) cannot resolve it.

The study of oracles that separate complexity classes has been extensive.

Example: Is there an oracle set B that separates all levels of the poly-time hierarchy, i.e., (Σ_k^p)^B ⊊ (Σ_{k+1}^p)^B for all k?

Furst-Saxe-Sipser: The problem can be reduced to proving an exponential lower bound on the size of constant-depth Boolean circuits that compute the parity function.
1. Parity function: P(x_1 x_2 ... x_n) = (x_1 + x_2 + ... + x_n) mod 2.
2. Here the AND-gates and OR-gates of the circuit can have unbounded fan-in.

Remarks.
1. The parity function can easily be computed in linear time.
2. The parity function can easily be computed by NC^1 circuits of linear size: build a balanced binary tree of XOR gates over x_1, ..., x_n, where each XOR gate is x ⊕ y = (x ∧ ¬y) ∨ (¬x ∧ y):

  x  y  x ⊕ y
  0  0    0
  0  1    1
  1  0    1
  1  1    0

(Figure: a balanced tree of XOR gates over the inputs x1, x2, ..., x8.)

Andrew C. Yao, Separating the polynomial-time hierarchy by oracles, Proc. 26th Annual Symp. on Foundations of Computer Science (FOCS 1985), pp. 1-10. (A component of his Turing Award)

Johan Håstad, Computational limitations of small depth circuits, Ph.D. thesis, MIT, 1986. (The result for his Gödel Prize)
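The balanced XOR-tree construction in Remark 2 can be sketched in code (a minimal illustration; the function names are ours, not from the slides). Pairing inputs level by level mirrors a circuit of depth O(log n) with O(n) gates:

```python
def xor_gate(x: int, y: int) -> int:
    # One XOR gate: x XOR y = (x AND NOT y) OR (NOT x AND y)
    return (x & ~y & 1) | (~x & y & 1)

def parity_tree(bits):
    """Evaluate parity by combining inputs level by level,
    mimicking a balanced binary tree of XOR gates."""
    level = list(bits)
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level) - 1, 2):
            nxt.append(xor_gate(level[i], level[i + 1]))
        if len(level) % 2 == 1:   # an odd leftover input passes through
            nxt.append(level[-1])
        level = nxt
    return level[0]

# The tree agrees with the definition P(x1...xn) = (x1 + ... + xn) mod 2
bits = [1, 0, 1, 1, 0, 1, 0, 1]
assert parity_tree(bits) == sum(bits) % 2
```

The tree has n - 1 XOR gates (linear size) and logarithmic depth, which is exactly what NC^1 allows; the Furst-Saxe-Sipser point is that no such small circuit exists once the depth is forced to be constant.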
Approximation Algorithms

Consider optimization problems, instead of decision problems.

Definition. G: a graph. A vertex cover of G is a set C of vertices in G such that every edge in G has at least one end in C.

min-VC: Given a graph G, find a smallest vertex cover for G.

Optimization Problem:
1. A set of instances;
2. Each instance has a set of (candidate) solutions;
3. A solution y of an instance x has a value f(x, y);
4. Optimal solution: max or min.

Example (min-VC):
(1) instances: graphs;
(2) solutions to an instance G: vertex covers of G;
(3) the value of a solution C to an instance G: f(G, C) = |C|;
(4) opt = min.

min-VC is hard because the decision version of Vertex Cover is NP-hard.

Formally, an optimization problem Q is NP-hard if there is an NP-hard (decision) problem L and poly-time computable functions f1 and f2 such that:
- f1 maps each instance x of L to an instance f1(x) of Q;
- f2 maps each optimal solution y of f1(x) to a yes/no decision f2(y) on x.

For min-VC: an instance of Vertex Cover is a pair (G, k); f1(G, k) = G; and if C is a minimum vertex cover of G, then f2(C) = yes if and only if |C| ≤ k.
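The f1/f2 reduction for min-VC can be sketched as follows. This is a hedged illustration: the brute-force `min_vertex_cover` (exponential time) only stands in for a hypothetical exact min-VC solver, and all names are ours:

```python
from itertools import combinations

def min_vertex_cover(vertices, edges):
    """Brute-force stand-in for an exact min-VC solver
    (exponential time; only here to illustrate the reduction)."""
    for k in range(len(vertices) + 1):
        for cand in combinations(vertices, k):
            c = set(cand)
            if all(u in c or v in c for (u, v) in edges):
                return c          # first cover found is a smallest one
    return set(vertices)

def f1(instance):
    # f1 maps a Vertex Cover instance (G, k) to the min-VC instance G
    (vertices, edges), k = instance
    return vertices, edges

def f2(instance, opt_cover):
    # f2 maps an optimal solution of f1(x) to the yes/no answer on x
    (_vertices_edges, k) = instance
    return len(opt_cover) <= k

# Decision: does the triangle have a vertex cover of size <= 2?
G = (["a", "b", "c"], [("a", "b"), ("b", "c"), ("a", "c")])
x = (G, 2)
assert f2(x, min_vertex_cover(*f1(x))) is True   # min cover has size 2
```

Both maps are trivially poly-time here; all the hardness sits in producing the optimal solution of f1(x), which is the point of the definition.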
Approximation Algorithms

Thus, it is difficult to find optimal solutions to an NP-hard optimization problem. How about finding a semi-optimal solution, i.e., a solution that is good enough?

Example (min-VC):
1. Find a maximal matching M in G;
2. Output C = the set of all endpoints of edges in M.

C is poly-time constructible: greedily add edges whose endpoints are not yet matched until no such edge remains.
Is C a vertex cover? Yes: if some edge had neither endpoint in C, that edge could be added to M, contradicting the maximality of M.
How good is the vertex cover C? Any vertex cover must contain at least one endpoint of each of the |M| vertex-disjoint edges of M, so the minimum vertex cover has size at least |M|, while |C| = 2|M|. Hence C is at most twice the size of a minimum vertex cover.
Thus, the approximation ratio is 2.
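The two steps above fit in a few lines (a minimal sketch; the greedy loop is one standard way to find a maximal matching, and the function name is ours):

```python
def vc_approx(edges):
    """2-approximation for min-VC: greedily build a maximal
    matching M, then return all endpoints of the edges of M."""
    matched = set()   # vertices already used by the matching M
    cover = set()     # the vertex cover C being built
    for (u, v) in edges:
        if u not in matched and v not in matched:
            matched.update((u, v))   # edge (u, v) joins M
            cover.update((u, v))     # both endpoints enter C
    return cover

# Star graph: center 0 joined to 1..4; a minimum cover is {0}, size 1
edges = [(0, 1), (0, 2), (0, 3), (0, 4)]
C = vc_approx(edges)
assert all(u in C or v in C for (u, v) in edges)  # C is a vertex cover
assert len(C) <= 2 * 1                            # within twice the optimum
```

On the star graph the greedy loop matches only the first edge, so C = {0, 1}: twice the optimum, showing the ratio-2 bound is tight for this algorithm.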