Theoretical Computer Science Overview: Algorithms, Complexity, and Probability

Dive into the fundamentals of theoretical computer science with a focus on algorithms, complexity, and probability. Understand the importance of time in computational procedures and how algorithms provide step-by-step instructions for efficient problem-solving. Homework assignments, lecture notes, and practical examples enhance learning in this math-centric course. No exams, just 12 homework assignments where the top 10 scores count towards your final grade. Embrace the world of algorithms and computational complexity in this insightful educational journey led by Instructor Shengyu Zhang.

  • Theoretical Computer Science
  • Algorithms
  • Complexity
  • Probability
  • Homework


Presentation Transcript


  1. CMSC5706 Topics in Theoretical Computer Science Week 1: Review of Algorithms and Probability Instructor: Shengyu Zhang

  2. First week
     Part I: About the course.
     Part II: About algorithms and complexity: What are algorithms? Growth of functions. What is the complexity of an algorithm / of a problem?
     Part III: Review of probability: Tail bounds.

  3. Part I: About the course

  4. Info
     Webpage: http://www.cse.cuhk.edu.hk/~syzhang/course/MScAlg15
     It contains general information (time and venue, TA, textbook, etc.), lecture slides, homework, and announcements.
     Flavor: more math than programming.

  5. Homework
     Homework assignments (100%). No exam.
     There are 12 homework assignments, of which you only need to complete 10. If you do more than 10, the 10 with the highest scores count.

  6. Textbook
     No textbook. Lecture notes are available before classes. Some general references are listed on the course website as well.

  7. Part II: About algorithms and complexity

  8. A good example: driving directions
     Suppose we want to drive from CUHK to Central. How do we route? Let's ask Google.

  9. What's good here:
     Step by step: each step is either "turn left/right" or "go straight for some number of meters". An estimated time is also given.
     An algorithm is a computational procedure that has step-by-step instructions. It is good if an estimated running time is given as well.

  10. More on complexity
     Why does time matter? Time is money! Being late can mean zero value: think of a weather forecast, or of homework.
     Running time: the number of elementary steps, assuming that each step costs only a small (or fixed) amount of time.

  11. Complexity
     The worst-case time complexity of an algorithm $A$ is the running time of $A$ on the worst-case input instance:
     $\mathrm{Cost}(A) = \max_{\text{input } x} (\text{running time of } A \text{ on } x)$.
     The worst-case time complexity of a computational problem $P$ is the worst-case complexity of the best algorithm $A$ that solves it, i.e. the best algorithm that gives right answers on all inputs:
     $\mathrm{Cost}(P) = \min_{\text{algorithm } A} \max_{\text{input } x} (\text{running time of } A \text{ on } x)$.
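
To make the max over inputs concrete, here is a minimal Python sketch (the linear-search example and the step-counting convention are my own, not from the slides) that brute-forces $\mathrm{Cost}(A)$ for one fixed algorithm by taking the maximum over all inputs of a given size:

```python
from itertools import permutations

def linear_search_steps(arr, target):
    """Count the comparisons linear search makes on this particular input."""
    steps = 0
    for v in arr:
        steps += 1
        if v == target:
            break
    return steps

def worst_case_cost(n):
    """Cost(A) = max over inputs x of (running time of A on x),
    brute-forced over all orderings of 1..n, searching for 1."""
    return max(linear_search_steps(list(p), 1)
               for p in permutations(range(1, n + 1)))

print([worst_case_cost(n) for n in range(1, 6)])  # [1, 2, 3, 4, 5]: the worst case costs n steps
```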

  12. Hardness of problems can vary a lot
     Multiplication: 1234 * 5678 = ? Answer: 7006652.
     2749274929483758 * 4827593028759302 = ? Can you finish it in 10 minutes?
     Do you think you can handle multiplication easily?

  13. Complexity of integer multiplication
     In general, for $n$-digit integers $x = x_1 x_2 \cdots x_n$ and $y = y_1 y_2 \cdots y_n$, the grade-school algorithm writes down $n$ shifted rows of partial products and sums them up.
     [Q] How fast is our algorithm?
     For each $y_i$ ($i = n, n-1, \ldots, 1$) we calculate $y_i \cdot x_1 x_2 \cdots x_n$: $n$ single-digit multiplications and $n$ single-digit additions.
     We finally add the $n$ results (with proper shifts): $2n^2$ single-digit additions.
     Altogether: about $4n^2$ elementary operations (single-digit additions/multiplications).
     Multiplication is not very hard, even by hand, is it?
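
Here is a minimal runnable sketch of the grade-school algorithm on digit lists, counting single-digit operations to illustrate the roughly quadratic cost (the function name and the exact counting convention are my own choices; it accumulates partial products in place rather than summing the rows at the end):

```python
def schoolbook_multiply(x, y):
    """Multiply two equal-length digit lists (most significant digit first).
    Returns (product_digits, number_of_single_digit_operations)."""
    n = len(x)
    ops = 0
    result = [0] * (2 * n)
    for i in range(n - 1, -1, -1):        # for each digit of y...
        carry = 0
        for j in range(n - 1, -1, -1):    # ...multiply it by each digit of x
            ops += 3                       # one multiplication plus two additions
            total = result[i + j + 1] + x[j] * y[i] + carry
            result[i + j + 1] = total % 10
            carry = total // 10
        result[i] += carry
    return result, ops

digits, ops = schoolbook_multiply([1, 2, 3, 4], [5, 6, 7, 8])
print(digits, ops)  # digits of 1234 * 5678 = 7006652; ops grows like a constant times n^2
```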

  14. Inverse problem
     The problem inverse to Integer Multiplication is Factoring.
     35 = ? * ?   437 = ? * ?   8633 = ? * ?
     It gets harder and harder; much harder even with just one more digit added!
     The best known algorithm has running time $2^{\tilde{O}(n^{1/3})}$ for $n$-digit inputs.
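
For contrast, a minimal trial-division sketch (my own illustration, not the best known algorithm): it can take up to about $\sqrt{N} \approx 10^{n/2}$ divisions for an $n$-digit number $N$, i.e. time exponential in the number of digits:

```python
def trial_factor(N):
    """Return the smallest nontrivial factorization of N, or None if N is prime.
    Uses up to ~sqrt(N) = 10^(n/2) divisions for an n-digit N."""
    d = 2
    while d * d <= N:
        if N % d == 0:
            return d, N // d
        d += 1
    return None

print(trial_factor(35))    # (5, 7)
print(trial_factor(437))   # (19, 23)
print(trial_factor(8633))  # (89, 97)
```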

  15. The bright side
     Hard problems can be used for cryptography!
     RSA [Rivest, Shamir, Adleman]: widely used today, yet broken if one can factor quickly!
     One-line message: quantum computers can factor quickly!

  16. Messages
     Message 1: We care about the rate of growth, especially when the input size is very large. Many interesting instances in both theory and practice are of huge (and growing!) sizes.

  17. Message 2: We care about the big picture first. Is the problem as easy as multiplication, or as hard as factoring?

  18. In this regard, we consider the so-called asymptotic behavior:
     Eventually, i.e. for large $n$, is the function like $n$, or $n^2$, or $2^n$?
     Constant factors are ignored at first, i.e. we care about the difference between $n^2$ and $2^n$ much more than that between $n^2$ and $1000n^2$.
     Engineering reason: a speedup by a constant factor (say 10) is easily achieved in a couple of years.

  19. Some examples
     Which increases faster?
     $100n^2$ vs. $0.01 \cdot 2^n$
     $0.1 \log n$ vs. $10n$
     $10^{10} n$ vs. $10^{-10} n^2$
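
A quick numeric check (a throwaway sketch of my own) shows that the asymptotically faster function in each pair above eventually dominates, regardless of the constants:

```python
import math

pairs = [
    ("100*n^2",    lambda n: 100 * n**2,        "0.01*2^n",   lambda n: 0.01 * 2**n),
    ("0.1*log(n)", lambda n: 0.1 * math.log(n), "10*n",       lambda n: 10 * n),
    ("10^10*n",    lambda n: 1e10 * n,          "10^-10*n^2", lambda n: 1e-10 * n**2),
]

for slow_name, slow, fast_name, fast in pairs:
    n = 2
    while fast(n) <= slow(n):  # double n until the faster-growing function wins
        n *= 2
    print(f"{fast_name} overtakes {slow_name} by n = {n}")
```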

  20. Big-O and small-o
     In general:
     $f(n) = O(g(n))$: for some constant $c$, $f(n) \le c \cdot g(n)$ when $n$ is sufficiently large. I.e. $\exists c, N$ s.t. $\forall n > N$, we have $f(n) \le c \cdot g(n)$.
     $f(n) = o(g(n))$: for any constant $c$, $f(n) \le c \cdot g(n)$ when $n$ is sufficiently large. I.e. $\forall c, \exists N$ s.t. $\forall n > N$, we have $f(n) \le c \cdot g(n)$.

  21. The other direction
     $f(n) = \Omega(g(n))$: $f(n) \ge c \cdot g(n)$ for some constant $c$ and large $n$. I.e. $\exists c, N$ s.t. $\forall n > N$, we have $f(n) \ge c \cdot g(n)$.
     $f(n) = \omega(g(n))$: $f(n) \ge c \cdot g(n)$ for any constant $c$ and large $n$. I.e. $\forall c, \exists N$ s.t. $\forall n > N$, we have $f(n) \ge c \cdot g(n)$.
     $f(n) = \Theta(g(n))$: $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$. I.e. $c_1 \cdot g(n) \le f(n) \le c_2 \cdot g(n)$ for two constants $c_1, c_2$ and large $n$.
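
As a worked instance of these quantifier definitions (my own example, using the first pair from the examples on slide 23), one can verify $10n = o(0.1 n^2)$ directly:

```latex
% For every constant c > 0 we must find N such that 10n <= c * 0.1 n^2 for all n > N.
% Rearranging 10n \le 0.1c\,n^2 gives n \ge 100/c, so N = 100/c works:
\forall c > 0,\ \exists N = \tfrac{100}{c}\ \text{s.t.}\ \forall n > N:\quad
10n \;\le\; c \cdot 0.1\, n^2 .
```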

  22. Intuition
     $f = O(g)$: roughly "$f \le g$".   $f = o(g)$: roughly "$f < g$".
     $f = \Omega(g)$: roughly "$f \ge g$".   $f = \omega(g)$: roughly "$f > g$".
     $f = \Theta(g)$: roughly "$f = g$".
     In particular: $f = o(g) \Rightarrow f = O(g)$; $f = \omega(g) \Rightarrow f = \Omega(g)$; and $f = O(g)$ & $f = \Omega(g)$ $\iff$ $f = \Theta(g)$.

  23. Examples
     $10n = o(0.1 n^2)$
     $n^2 = o(2^{n/10})$
     $n^{1/3} = \omega(10 \log n)$
     $n^3 = (n^2)^{3/2} = \omega(n^2)$
     $\log_2 (n^2) = 2 \log_2 n = \Theta(\log_2 n)$
     $\log_2 (2n) = 1 + \log_2 n = \Theta(\log_2 n)$

  24. Part III: Probability and tail bounds

  25. Finite sample space
     Sample space $\Omega$: the set of all possible outcomes of a random process. Suppose that $\Omega$ is finite.
     Events: subsets of $\Omega$.
     Probability function $p: \Omega \to \mathbb{R}$, s.t. $p(\omega) \ge 0$ for all $\omega \in \Omega$, and $\sum_{\omega \in \Omega} p(\omega) = 1$.
     For an event $A \subseteq \Omega$, the probability of event $A$ happening is $\Pr[A] = \sum_{\omega \in A} p(\omega)$.
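
These definitions translate directly into code; a minimal sketch (the two-dice sample space is my own example) representing $\Omega$ as a dictionary and an event as a set of outcomes:

```python
from fractions import Fraction
from itertools import product

# Sample space: ordered outcomes of two fair dice, each with probability 1/36.
omega = {(a, b): Fraction(1, 36) for a, b in product(range(1, 7), repeat=2)}

def prob(event):
    """Pr[A] = sum of p(w) over the outcomes w in event A."""
    return sum(omega[w] for w in event)

assert sum(omega.values()) == 1                    # a valid probability function
sum_is_7 = {w for w in omega if w[0] + w[1] == 7}  # the event "the dice sum to 7"
print(prob(sum_is_7))                              # 1/6
```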

  26. Union of events
     Consider two events $A_1$ and $A_2$:
     $\Pr[A_1 \cup A_2] = \Pr[A_1] + \Pr[A_2] - \Pr[A_1 \cap A_2]$.
     In general, we have the following union bound: $\Pr[\bigcup_i A_i] \le \sum_i \Pr[A_i]$.

  27. Independence of events
     Two events $A$ and $B$ are independent if $\Pr[A \cap B] = \Pr[A] \cdot \Pr[B]$.
     Conditional probability: for two events $A$ and $B$ with $\Pr[B] > 0$, the probability of $A$ conditioned on $B$ is $\Pr[A \mid B] = \Pr[A \cap B] / \Pr[B]$.

  28. Random variable
     A random variable $X$ is a function $X: \Omega \to \mathbb{R}$.
     $\Pr[X = x] = \sum_{\omega: X(\omega) = x} p(\omega)$.
     Two random variables $X$ and $Y$ are independent if $\Pr[X = x, Y = y] = \Pr[X = x] \cdot \Pr[Y = y]$ for all $x, y$.

  29. Expectation
     Expectation: $\mathrm{E}[X] = \sum_{\omega \in \Omega} X(\omega)\, p(\omega) = \sum_{x \in \mathrm{Range}(X)} x \cdot \Pr[X = x]$.
     Linearity of expectation: $\mathrm{E}[\sum_i X_i] = \sum_i \mathrm{E}[X_i]$, no matter whether the $X_i$'s are independent or not.
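
Linearity holds even when the variables are completely dependent; a minimal sketch (my own example, reusing the two-dice space from above) where $Y$ is a deterministic function of $X$:

```python
from fractions import Fraction
from itertools import product

omega = {(a, b): Fraction(1, 36) for a, b in product(range(1, 7), repeat=2)}

def expect(X):
    """E[X] = sum over outcomes w of X(w) * p(w)."""
    return sum(X(w) * p for w, p in omega.items())

X = lambda w: w[0]        # the first die
Y = lambda w: w[0] ** 2   # fully dependent on X
Z = lambda w: X(w) + Y(w)

print(expect(Z), expect(X) + expect(Y))  # both 56/3: E[X+Y] = E[X] + E[Y] despite dependence
```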

  30. Variance
     The variance of $X$ is $\mathrm{Var}[X] = \mathrm{E}[(X - \mathrm{E}[X])^2] = \mathrm{E}[X^2] - \mathrm{E}[X]^2$.
     The standard deviation of $X$ is $\sigma(X) = \sqrt{\mathrm{Var}[X]}$.

  31. Concentration and tail bounds
     In many analyses of randomized algorithms, we need to study how concentrated a random variable $X$ is around its mean $\mathrm{E}[X]$.
     Many times $X = X_1 + \cdots + X_n$.
     Upper bounds on $\Pr[X \text{ deviates from } \mathrm{E}[X] \text{ a lot}]$ are called tail bounds.

  32. Markov's Inequality: when you only know the expectation
     [Thm] If $X \ge 0$, then $\Pr[X \ge a] \le \mathrm{E}[X]/a$.
     In other words, if $\mathrm{E}[X] = \mu$, then $\Pr[X \ge k\mu] \le 1/k$.
     Proof. $\mathrm{E}[X] = \sum_x x \Pr[X = x] \ge \sum_{x \ge a} x \Pr[X = x] \ge a \Pr[X \ge a]$. Dropping some nonnegative terms always makes the sum smaller.
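
A quick empirical sanity check of Markov's inequality (a throwaway simulation of my own, using an exponential variable as the nonnegative $X$):

```python
import random

random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]  # X >= 0, E[X] = 1
mean = sum(samples) / len(samples)

for a in [1, 2, 4, 8]:
    tail = sum(s >= a for s in samples) / len(samples)
    print(f"Pr[X >= {a}] = {tail:.4f} <= E[X]/a = {mean / a:.4f}")
```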

  33. Moments
     Def. The $k$th (central) moment of a random variable $X$ is $\mathrm{E}[(X - \mathrm{E}[X])^k]$.
     $k = 2$: variance.
     $\mathrm{Var}[X] = \mathrm{E}[(X - \mathrm{E}[X])^2] = \mathrm{E}[X^2 - 2X\,\mathrm{E}[X] + \mathrm{E}[X]^2] = \mathrm{E}[X^2] - 2\,\mathrm{E}[X]\,\mathrm{E}[X] + \mathrm{E}[X]^2 = \mathrm{E}[X^2] - \mathrm{E}[X]^2$.

  34. Chebyshev's Inequality: when you also know the variance
     [Thm] $\Pr[|X - \mathrm{E}[X]| \ge a] \le \mathrm{Var}[X]/a^2$.
     In other words, $\Pr[|X - \mathrm{E}[X]| \ge k\sigma] \le 1/k^2$.
     Proof.
     $\Pr[|X - \mathrm{E}[X]| \ge a] = \Pr[(X - \mathrm{E}[X])^2 \ge a^2]$
     $\le \mathrm{E}[(X - \mathrm{E}[X])^2]/a^2$   // Markov on $(X - \mathrm{E}[X])^2$
     $= \mathrm{Var}[X]/a^2$.   // recall: $\mathrm{Var}[X] = \mathrm{E}[(X - \mathrm{E}[X])^2]$

  35. Inequality by the $k$th moment ($k$ even)
     [Thm] $\Pr[|X - \mathrm{E}[X]| \ge a] \le \mathrm{E}[(X - \mathrm{E}[X])^k]/a^k$.
     Proof.
     $\Pr[|X - \mathrm{E}[X]| \ge a] = \Pr[(X - \mathrm{E}[X])^k \ge a^k]$   // $k$ is even
     $\le \mathrm{E}[(X - \mathrm{E}[X])^k]/a^k$.   // Markov on $(X - \mathrm{E}[X])^k$

  36. Chernoff's Bound
     [Thm] Suppose each $X_i = 1$ with probability $p$ and $X_i = 0$ with probability $1 - p$, independently, and let $X = X_1 + \cdots + X_n$. Then
     $\Pr[|X - \mu| \ge \delta\mu] \le e^{-\delta^2 \mu/3}$, where $\mu = pn = \mathrm{E}[X]$.
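
A small simulation (my own sketch) comparing the empirical tail of such a sum against the $e^{-\delta^2 \mu/3}$ bound:

```python
import math
import random

random.seed(1)
n, p, trials = 300, 0.5, 10_000
mu = n * p

for delta in [0.1, 0.2, 0.3]:
    hits = 0
    for _ in range(trials):
        X = sum(random.random() < p for _ in range(n))  # X = X_1 + ... + X_n
        hits += abs(X - mu) >= delta * mu
    bound = math.exp(-delta**2 * mu / 3)
    print(f"delta={delta}: empirical tail {hits / trials:.4f} <= bound {bound:.4f}")
```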

  37. Some basic applications
     One-sided error: suppose an algorithm for a decision problem has
     $f(x) = 0$: no error;
     $f(x) = 1$: it outputs "$f(x) = 0$" with probability 1/2.
     We want to decrease this error to $\epsilon$. How?
     Run the algorithm $\log_2(1/\epsilon)$ times. Output 0 iff all executions answer 0.
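
A minimal sketch of this one-sided amplification (the noisy algorithm `one_sided_alg` below is a hypothetical stand-in of my own):

```python
import math
import random

def one_sided_alg(x):
    """Hypothetical one-sided-error algorithm: never errs when f(x) = 0;
    when f(x) = 1 it wrongly outputs 0 with probability 1/2."""
    return 0 if x == 0 else random.choice([0, 1])

def amplified(x, eps):
    """Repeat ceil(log2(1/eps)) times and output 0 iff every run says 0.
    If f(x) = 1, all runs say 0 with probability (1/2)^k <= eps."""
    k = math.ceil(math.log2(1 / eps))
    return 0 if all(one_sided_alg(x) == 0 for _ in range(k)) else 1

errors = sum(amplified(1, 0.01) == 0 for _ in range(10_000))
print(errors / 10_000)  # about 0.01 or less
```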

  38. Two-sided error
     Suppose a randomized algorithm has two-sided error:
     $f(x) = 0$: it outputs "$f(x) = 0$" with probability > 2/3;
     $f(x) = 1$: it outputs "$f(x) = 1$" with probability > 2/3.
     How do we decrease the error to $\epsilon$? Run the algorithm $O(\log(1/\epsilon))$ times and take a majority vote.
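
A matching sketch for the two-sided case (again with a hypothetical noisy algorithm of my own), whose analysis via Chernoff's bound follows on the next two slides:

```python
import random

def two_sided_alg(x):
    """Hypothetical two-sided-error algorithm: correct with probability 2/3."""
    return x if random.random() < 2 / 3 else 1 - x

def majority_amplified(x, t):
    """Run t times and output the majority answer."""
    ones = sum(two_sided_alg(x) for _ in range(t))
    return 1 if ones > t / 2 else 0

for t in [1, 9, 27, 81]:  # error decays exponentially with t
    errors = sum(majority_amplified(1, t) != 1 for _ in range(10_000))
    print(f"t = {t}: error rate {errors / 10_000:.4f}")
```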

  39. Using Chernoff's bound
     Run the algorithm $t$ times, getting outputs $X_1, \ldots, X_t$. Let $X = X_1 + \cdots + X_t$.
     If $f(x) = 0$: $X_i = 1$ w.p. $p < 1/3$, thus $\mathrm{E}[X] = pt < t/3$.
     If $f(x) = 1$: $X_i = 1$ w.p. $p > 2/3$, so $\mathrm{E}[X] = pt > 2t/3$.

  40. Recall Chernoff: $\Pr[|X - \mu| \ge \delta\mu] \le e^{-\delta^2 \mu/3}$.
     If $f(x) = 0$: $\mu = \mathrm{E}[X] < t/3$. The majority vote errs only if $X \ge t/2$, a deviation of at least $t/2 - t/3 = t/6$; taking $\mu = t/3$, this corresponds to $\delta = (t/6)/(t/3) = 1/2$.
     So $\Pr[X \ge t/2] \le \Pr[|X - \mu| \ge \delta\mu] \le e^{-(1/2)^2 (t/3)/3} = 2^{-\Omega(t)}$.
     Similar for $f(x) = 1$. The error probability decays exponentially with the number of trials!
