
Local Lemma Theorem and Applications in Randomized Algorithms
Explore the Local Lemma Theorem and its applications in Randomized Algorithms, including the Lovász Local Lemma, the Union Bound, and k-wise independent events. Learn about applying the Local Lemma to k-CNF formulas, the Hat Game, and more.
Randomized Algorithms 2024–25, Lecture 10. Lecturer: Moni Naor
The Lovász Local Lemma: statement, applications, constructive and algorithmic versions
The Union Bound
A = A_1, ..., A_m is a collection of random events. For each A_i, the dependencies on A \ {A_i} may be arbitrary.
Theorem: If there are 0 ≤ x_i < 1 s.t. for all A_i: Pr[A_i] ≤ x_i, and Σ_{i=1}^m x_i < 1, then the probability that no event A_i happens is positive.
If all the x_i = p, then we need that m·p < 1.
Example: a CNF formula with m clauses where each clause has more than log m literals is necessarily satisfiable. For a uniformly random assignment, let A_i be the event that clause i is not satisfied; then Pr[A_i] < 1/m, so by the theorem some assignment satisfies all the clauses.
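The example can be checked exhaustively on a tiny instance (an illustrative formula of my own choosing): with m = 4 clauses of k = 3 > log_2 4 literals each, the union bound gives m·2^{-k} = 1/2 < 1, so at least half of all assignments are satisfying.

```python
from itertools import product

def satisfied(clause, assignment):
    # A literal (v, sign) is satisfied when assignment[v] == sign.
    return any(assignment[v] == sign for v, sign in clause)

# m = 4 clauses with k = 3 > log2(4) literals each, over 5 variables.
clauses = [((0, True), (1, True), (2, True)),
           ((0, False), (3, True), (4, True)),
           ((1, False), (2, False), (3, False)),
           ((2, True), (3, False), (4, False))]

# Count satisfying assignments exhaustively.
sat_count = sum(all(satisfied(cl, a) for cl in clauses)
                for a in product([False, True], repeat=5))
frac = sat_count / 2 ** 5
# The union bound guarantees frac >= 1 - m * 2**-k = 1/2 > 0.
```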
The (Lovász) Local Lemma
Question: what happens with k-wise independent events?
A = A_1, ..., A_m is a collection of random events. For each A_i, let Γ(A_i) be a minimal set of events that A_i depends on: A_i is independent of all events in A \ (Γ(A_i) ∪ {A_i}), i.e., information on events not in Γ(A_i) ∪ {A_i} does not affect the probability of event A_i happening. Note that this relation is not symmetric.
Theorem: If there are 0 ≤ p < 1 and d s.t. for all A_i: Pr[A_i] ≤ p and |Γ(A_i)| ≤ d, where e·p·d < 1, then the probability that no event A_i happens is positive. Simultaneously: Pr[∩_{i∈[m]} Ā_i] > 0.
Applying the Local Lemma for k-CNF
Theorem: If there are 0 ≤ p < 1 and d s.t. for all A_i: Pr[A_i] ≤ p and |Γ(A_i)| ≤ d, where e·p·d < 1, then the probability that no event A_i happens is positive. Note: no dependency on the number of variables n or the number of clauses m.
Theorem: Every k-CNF formula in which each variable appears in fewer than 2^k/(4k) clauses is satisfiable.
Proof: For a uniformly random assignment to the variables, let A_i be the event that clause C_i is not satisfied; then Pr[A_i] = 2^{-k}. If two clauses do not share a variable, their satisfiability is independent, and each clause depends on at most d = k · 2^k/(4k) = 2^k/4 other clauses. Taking p = 2^{-k} and d = 2^k/4 we get e·p·d = e/4 < 1, so the probability that the formula is satisfied is positive; in particular a satisfying assignment exists.
How to find the satisfying assignment?
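The symmetric condition can be checked mechanically on a concrete formula. The sketch below (function name and clause encoding are mine) computes d as the largest number of clauses sharing a variable with any single clause and tests e·p·d < 1 with p = 2^{-k}.

```python
import math

def lll_condition_holds(clauses, k):
    """Check the symmetric LLL condition e*p*d < 1 for a k-CNF.
    A clause is a frozenset of literals; a literal is (variable, sign)."""
    p = 2.0 ** -k  # Pr[a fixed k-literal clause is unsatisfied], uniform assignment
    def variables(clause):
        return {v for v, _ in clause}
    d = 0
    for i, c in enumerate(clauses):
        overlap = sum(1 for j, other in enumerate(clauses)
                      if j != i and variables(c) & variables(other))
        d = max(d, overlap)
    return math.e * p * d < 1

# Two 3-clauses sharing one variable: d = 1, e/8 < 1, condition holds.
sparse = [frozenset({(0, True), (1, True), (2, True)}),
          frozenset({(0, False), (3, True), (4, True)})]

# Two 1-clauses on the same variable: p = 1/2, d = 1, e/2 >= 1, condition fails.
dense = [frozenset({(0, True)}), frozenset({(0, False)})]
```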
Example: LLL and the Hat Game
The nodes of a graph G = (V, E) are each assigned one of q colors. At each node there is a person wearing a hat of that color, who observes the colors of the neighbors N(v) but not the color at their own node. The players should each guess the color of their own hat, and they win if at least one player guesses correctly. Do they have a deterministic strategy?
HG(G): the largest q for which there is a winning strategy for graph G. A strategy is, for every node v, a mapping f_v : [q]^{|N(v)|} → [q]; it wins if for every color assignment at least one node v gets its own color right. What happens in the complete graph?
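For the complete graph the answer is positive even with as many colors as players. A standard sum-trick strategy (a sketch; not necessarily the exact strategy intended in the lecture): player i guesses the unique color that would make the total sum of all hats equal i modulo q, so the player whose index matches the true total is always right.

```python
from itertools import product

def play_complete_graph(colors):
    """Hat game on the complete graph with q players and q colors.
    Player i sees every other hat and guesses the unique color that
    would make the total sum of all hats equal i modulo q.
    Returns the set of players who guess their own color correctly."""
    q = len(colors)
    correct = set()
    for i, own in enumerate(colors):
        seen = sum(colors) - own      # sum of the hats player i observes
        guess = (i - seen) % q        # forces total % q == i if correct
        if guess == own:
            correct.add(i)
    return correct

# For every assignment exactly one player (the one whose index equals
# the total modulo q) is right, so the players win with q colors.
assert all(len(play_complete_graph(list(c))) == 1
           for c in product(range(3), repeat=3))
```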
LLL and the Hat Game
Claim: In any graph G with maximum degree Δ: HG(G) ≤ e·Δ.
For a graph G and strategies f_v : [q]^{|N(v)|} → [q], choose the colors at random from [q]. Let A_v be the event that node v guesses correctly; Pr[A_v] = 1/q. Event A_v is independent of all {A_u | u ∉ N(v) ∪ {v}} (why not distance 2?), and there are at most Δ remaining events, so d ≤ Δ. With p = 1/q, if q > e·Δ then e·p·d < 1, and the probability that no A_v occurs is larger than 0: simultaneously all players are wrong on some color assignment, so there is no winning strategy.
Proof of the Local Lemma
Theorem: If there are 0 ≤ p < 1 and d s.t. for all A_i: Pr[A_i] ≤ p and |Γ(A_i)| ≤ d, where 4·p·d ≤ 1, then the probability that no event A_i happens is positive.
Induction Hypothesis: Let S ⊆ {1, ..., m}. By induction on s = 0, ..., m−1: if |S| ≤ s, then for all k ∉ S we have Pr[A_k | ∩_{j∈S} Ā_j] ≤ 2p.
For s = 0: Pr[A_k] ≤ p ≤ 2p.
For s > 0, Claim: Pr[∩_{j∈S} Ā_j] > 0. Order S arbitrarily; then
Pr[∩_{j∈S} Ā_j] = Π_{j∈S} Pr[Ā_j | ∩_{i∈S, i<j} Ā_i] = Π_{j∈S} (1 − Pr[A_j | ∩_{i∈S, i<j} Ā_i]) ≥ (1 − 2p)^{|S|} > 0,
by the induction hypothesis (each conditioning set has fewer than s events).
Proof of the Local Lemma (cont.)
Induction hypothesis as above: if |S| ≤ s, then for all k ∉ S, Pr[A_k | ∩_{j∈S} Ā_j] ≤ 2p.
For s > 0: split S = S_1 ∪ S_2, where S_2 = {j ∈ S | A_j is independent of A_k} and S_1 = S \ S_2. Notation: let B_S = ∩_{j∈S} Ā_j, the event that no A_j with j ∈ S happens.
If S_2 = S we are done: Pr[A_k | B_S] = Pr[A_k] ≤ p, from independence. Otherwise, we know that |S_1| ≤ d, since S_1 ⊆ Γ(A_k).
Proof of the Local Lemma (manipulation)
S = S_1 ∪ S_2, B_S = ∩_{j∈S} Ā_j. Want to prove: Pr[A_k | B_{S_1} ∩ B_{S_2}] ≤ 2p.
By the definition of conditional probability:
Pr[A_k | B_{S_1} ∩ B_{S_2}] = Pr[A_k ∩ B_{S_1} | B_{S_2}] / Pr[B_{S_1} | B_{S_2}].
Numerator: Pr[A_k ∩ B_{S_1} | B_{S_2}] ≤ Pr[A_k | B_{S_2}] = Pr[A_k] ≤ p, by independence of A_k from the events of S_2.
Denominator: since |S_2| < |S| = s we can apply the induction hypothesis to each j ∈ S_1: Pr[A_j | B_{S_2}] ≤ 2p. Hence
Pr[B_{S_1} | B_{S_2}] ≥ 1 − Σ_{j∈S_1} Pr[A_j | B_{S_2}] ≥ 1 − 2p·d ≥ 1/2,
using 4pd ≤ 1. Together: Pr[A_k | B_S] ≤ p / (1/2) = 2p.
Proof of the Local Lemma (conclusion)
We know that Pr[A_k | ∩_{j∈S} Ā_j] ≤ 2p for all S and k ∉ S. Want to show: Pr[∩_{i∈[m]} Ā_i] > 0. Indeed,
Pr[∩_{i∈[m]} Ā_i] = Π_{i=1}^m Pr[Ā_i | ∩_{j=1}^{i−1} Ā_j] = Π_{i=1}^m (1 − Pr[A_i | ∩_{j=1}^{i−1} Ā_j]) ≥ (1 − 2p)^m > 0.
On Making the Local Lemma Constructive
The proof does not give an algorithm to find the point in the probability space that avoids all the bad events; we only know that it exists. An approach for finding it: partition the instance into smaller subproblems, logarithmic in size, by a partial assignment (each variable is either true, false, or undefined; if a clause is not yet satisfied, make sure it has enough free variables). Then do an exhaustive search on each part.
Constructive vs. Algorithmic
This approach was suggested by József Beck. It has a suboptimal dependency on k, and for the algorithm to work we need a base case that relies on the (non-constructive) LLL. Moser and Tardos later suggested an algorithmic version of the LLL.
Structure of Proofs by Compression or Encoding
Relevant to both upper and lower bounds. Consider a random string used in the system; it could be the random function that should be inverted, or the randomness used by an algorithm. Show that the event you want to bound implies a compression of the randomness: the probability of compressing by w bits is at most 2^{-w}, hence the probability of the bad event is bounded by 2^{-w}. Sometimes we need extra randomness to help us.
Proof by Encoding: Inverting a Random Function is Hard
Given black-box access to a random function f : {0,1}^n → {0,1}^n and a challenge y ∈ {0,1}^n, the goal is to find x s.t. f(x) = y.
Claim: the number of queries needed is close to 2^n.
Proof: homework.
Proof by Encoding (cont.)
Given black-box access to a random function f : {0,1}^n → {0,1}^n and a challenge y ∈ {0,1}^n, the goal is to find x s.t. f(x) = y. Claim: the number of queries needed is close to 2^n.
Proof: Suppose we can invert y within T steps with probability p. We will use the inverter to compress f to fewer than the n·2^n bits needed to describe a random function. Fix the inverter's random bits as well as y. The inverter makes a sequence of queries and answers (x_1, a_1), (x_2, a_2), ..., (x_T, a_T); with probability at least p, one of the a_i's is y. Instead of recording that answer in full, record a bit marking the event plus which query it was, using only log T bits; only the remaining answers are recorded in full. Savings: n − log T bits.
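The savings translate into a quantitative bound as follows (a one-line sketch of the step left implicit above): a compression of n − log T bits succeeds with probability at most 2^{-(n − log T)}, so

```latex
p \;\le\; 2^{-(n-\log T)} \;=\; \frac{T}{2^{n}}
\qquad\Longrightarrow\qquad
T \;\ge\; p \cdot 2^{n}.
```

That is, inverting with constant probability requires a number of queries linear in 2^n.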
The (Lovász) Local Lemma: General Form
A = A_1, ..., A_m is a collection of random events. Let D = (V, E) be a dependency digraph for the events A_1, ..., A_m: the set of vertices is V = {1, 2, ..., m}, and for each 1 ≤ i ≤ m the event A_i is mutually independent of all the events {A_j | (i, j) ∉ E}. Suppose there are x_1, x_2, ..., x_m such that 0 ≤ x_i < 1 and
Pr[A_i] ≤ x_i · Π_{(i,j)∈E} (1 − x_j).
Then Pr[∩_{i∈[m]} Ā_i] ≥ Π_{i=1}^m (1 − x_i) > 0.
Recall: Proof by Encoding
A general method for analyzing probabilistic events: prove the success of an algorithm by showing that failure allows us to compress the random bits used. The probability of compressing, i.e. chopping off, w bits is at most 2^{-w}.
Example: the birthday paradox. Suppose that you expect a collision among fewer than k random elements x_1, x_2, ..., x_k from a domain of size n. If x_i = x_j for some i < j, then we can encode x_j by pointing back at i, LZ style: instead of log n bits we need log k bits for i and log k bits for j. This compression is possible only when 2 log k < log n, i.e. k < √n, so a collision is not likely to happen before k ≈ √n.
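A quick empirical check of the √n threshold (a sketch; the sample sizes 10√n and 10 are arbitrary choices of mine): well above √n draws a collision is near-certain, while far below √n it is very unlikely.

```python
import random

def has_collision(k, n, seed=0):
    """Draw k uniform samples x_1..x_k from a domain of size n and
    report whether some value repeats (x_i = x_j for i < j)."""
    rng = random.Random(seed)
    seen = set()
    for _ in range(k):
        x = rng.randrange(n)
        if x in seen:
            return True
        seen.add(x)
    return False

n = 10 ** 6
above = has_collision(10 * int(n ** 0.5), n)  # 10*sqrt(n) draws: collision near-certain
below = has_collision(10, n)                  # far below sqrt(n): near-certainly none
```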
Algorithm and History Generator
Let p = 2^{-k} be as before, and assume each clause shares variables with at most 2^{k−3} other clauses.
1. Pick a random initial assignment for the variables: each variable is set to 'true' independently with probability 1/2.
2. While some clause C_i is not satisfied:
   a. Choose the unsatisfied C_i with the smallest value of i.
   b. Enter i in binary, using log_2 m bits, into the history.
   c. Call LocalCorrect on clause C_i.
From the final assignment and the recorded history we can reconstruct the random tape.
LocalCorrect
LocalCorrect(C):
1. Resample new values for every variable in clause C.
2. While some C_j that shares a variable with C is not satisfied (including possibly C itself):
   a. Choose the unsatisfied C_j sharing a variable with C with the smallest value of j.
   b. Enter 0 followed by the name of j in binary, using k − 3 bits, into the history.
   c. Recursively call LocalCorrect on clause C_j.
3. Enter 1 into the history.
We need only k − 3 bits to specify j, since 2^{k−3} bounds the number of neighbors of C!
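The main loop and LocalCorrect can be sketched in Python (a minimal sketch: the history bookkeeping, needed only for the analysis, is omitted, and the helper names are mine):

```python
import random

# A clause is a tuple of literals; a literal (v, sign) is satisfied
# when assignment[v] == sign.

def satisfied(clause, assignment):
    return any(assignment[v] == sign for v, sign in clause)

def local_correct(c, clauses, assignment, rng):
    # Step 1: resample every variable of clause c.
    for v, _ in clauses[c]:
        assignment[v] = rng.random() < 0.5
    # Step 2: recursively fix unsatisfied clauses sharing a variable
    # with c (possibly c itself), smallest index first.
    vars_c = {v for v, _ in clauses[c]}
    while True:
        bad = [j for j, cl in enumerate(clauses)
               if not satisfied(cl, assignment)
               and vars_c & {v for v, _ in cl}]
        if not bad:
            return
        local_correct(min(bad), clauses, assignment, rng)

def solve(clauses, n, seed=0):
    rng = random.Random(seed)
    assignment = [rng.random() < 0.5 for _ in range(n)]  # random initial assignment
    while True:  # main loop: fix unsatisfied clauses, smallest index first
        bad = [i for i, cl in enumerate(clauses) if not satisfied(cl, assignment)]
        if not bad:
            return assignment
        local_correct(min(bad), clauses, assignment, rng)
```

On a small satisfiable 3-CNF, `solve` returns an assignment satisfying all clauses.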
Correctness
Claim: if the algorithm stops, it returns a correct assignment.
A clause can become satisfied and unsatisfied again multiple times during the recursive process, but when we return to the main routine after completing the call to LocalCorrect(C_i), we have satisfied the clause C_i, and any clause that was previously satisfied has stayed satisfied, because of the recursion. What remains is to show that the recursive process has to stop.
The Algorithm Stops
Claim: the algorithm stops in expected O(m log m) time, where the expectation is taken over the random coin tosses of the algorithm.
Proof by compressing the random coins. Imagine that the random decisions come from a stream of bits: the first n bits give the initial assignment, and then each call to LocalCorrect consumes k bits for its resampling. We show how to compress the stream if the number of calls J is too large.
History
The history includes a list of the clauses that LocalCorrect is called on by the main routine, as well as a list of the recursive calls to LocalCorrect, each named via the neighbors of the current clause. The algorithm uses a flag bit 0 to mark the start of a recursive call and a flag bit 1 to mark its end. Suppose there were J calls altogether to LocalCorrect.
Tracing Back the Randomness Used
Claim: given (a) the final assignment and (b) the history as recorded by the algorithm, it is possible to trace back the randomness used.
[Figure: the n-bit final assignment and the history of the J calls, from which the n-bit initial assignment and the k random bits of each call are reconstructed.]
Tracing Back the Randomness Used (cont.)
From the history we can construct the tree of the recursive calls. Given the final assignment, we can roll back the assignment: since we know that before any clause was resampled it was not satisfied, there is a unique configuration of its k variables!
How many bits does the encoding take? At most m log m bits for the calls from the main routine, (k − 3) + 2 bits for each call to LocalCorrect, and n bits for the final assignment: altogether m log m + n + J(k − 1). On the other hand, the algorithm uses n bits for the initial assignment and k bits for each recursive call: altogether n + Jk. So if J > m log m, then m log m + n + J(k − 1) < n + Jk. Compression!
Variant of the Algorithm
Let p = 2^{-k} and d be as before.
1. Pick a random assignment for the variables: each variable is set to 'true' independently with probability 1/2.
2. As long as there is some clause that is not satisfied, pick an arbitrary unsatisfied clause and assign fresh random values to all its variables.
Claim: the algorithm terminates with a satisfying assignment in expected polynomial time, where the expectation is taken over the random coin tosses of the algorithm.
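The variant is only a few lines. A minimal self-contained Python sketch (clause representation and names are mine), convenient for experimenting:

```python
import random

def satisfied(clause, assignment):
    # A literal (v, sign) is satisfied when assignment[v] == sign.
    return any(assignment[v] == sign for v, sign in clause)

def resample_until_satisfied(clauses, n, seed=0):
    """Pick a random assignment; while some clause is unsatisfied,
    pick an arbitrary unsatisfied clause and redraw all its variables."""
    rng = random.Random(seed)
    assignment = [rng.random() < 0.5 for _ in range(n)]
    while True:
        bad = [cl for cl in clauses if not satisfied(cl, assignment)]
        if not bad:
            return assignment
        for v, _ in rng.choice(bad):  # arbitrary unsatisfied clause
            assignment[v] = rng.random() < 0.5
```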
WebDiarios de Motocicleta — Mihai Pătrașcu (1982–2012)
[Comment:] This is a nice idea that looks like another application of "Moser's Method," but there's something I don't quite get. In your encoding proof, H is the entropy of the bitstring h(x_1), g(x_1), ..., h(x_n), g(x_n). Can you recover this bitstring from the encoding? As far as I can see, you can discover which edges are in the cuckoo graph, but not which keys generated those edges. Or did I miss something?
[Reply:] I haven't looked at Moser's paper, but, given the blog hype it received, I sincerely hope he does something smarter than just expressing probabilities in encoding terminology. This is a trivial idea that I presume many people had. As for the encoding, an edge is identical to a key. In other words, I may assume that the keys are {1, ..., n}. You notice how I always say that I am encoding an edge with lg n bits. To put it in your terminology, there are n edges (h(x_i), g(x_i)). I can encode which edge is involved in some subgraph using lg n bits. So I do indeed recover the h and g values for all edges.