Understanding Discrete Random Variables in Probability for Computing

Explore the concept of discrete random variables in probability theory for computing, including definitions, examples, and distributions. Learn how to associate random variables with events and understand probability mass functions and cumulative distribution functions.

  • Probability
  • Computing
  • Discrete Random Variables
  • Distributions
  • Events


Presentation Transcript


  1. Chapter 3: Discrete Random Variables ("Introduction to Probability for Computing", Harchol-Balter '24)

  2. Random Variables. Defn: A random variable (r.v.) is a real-valued function of the outcome of an experiment involving randomness. Example experiment: roll two dice. Q: Here are some r.v.s — what values can each take on? X = sum of the rolls; Y = difference of the rolls; Z = max of the rolls; W = value of the first roll. We can now ask: what is P{X = 11}?
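The "r.v. as a function of the outcome" view can be made concrete by enumerating the sample space. A minimal sketch (not from the slides; variable names are my own) that computes P{X = 11} for the two-dice experiment by brute force:

```python
from fractions import Fraction
from itertools import product

# Enumerate the 36 equally likely outcomes of rolling two dice.
outcomes = list(product(range(1, 7), repeat=2))

# Each r.v. is literally a function of the outcome.
X = lambda o: o[0] + o[1]   # sum of the rolls
Z = lambda o: max(o)        # max of the rolls

def prob(event):
    """P{event} under the uniform distribution on outcomes."""
    return Fraction(sum(event(o) for o in outcomes), len(outcomes))

p_X_11 = prob(lambda o: X(o) == 11)   # only (5,6) and (6,5) qualify
print(p_X_11)                         # 1/18
```

Using exact `Fraction` arithmetic avoids floating-point noise for small finite sample spaces like this one.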

  3. Random Variables. Defn: A random variable (r.v.) is a real-valued function of the outcome of an experiment involving randomness. Example: throw 2 darts uniformly at random at the unit interval [0, 1]. Here are some random variables: D = difference in location of the 2 darts; L = location of the leftmost dart. Q: Can you define some more r.v.s?

  4. Random Variables. Defn: A discrete random variable can take on at most a countably infinite number of possible values, whereas a continuous random variable can take on an uncountable set of possible values. Q: Which of these random variables are discrete and which are continuous? (a) The sum of the rolls of two dice; (b) the number of arrivals at a website by time t; (c) the time until the next arrival at a website; (d) the CPU time requirement of an HTTP request.

  5. From Random Variables to Events. We use CAPITAL letters to denote random variables. When we set a random variable (r.v.) equal to a value, we get an event, and all the theorems we learned about events and their probabilities now apply. Examples:
     R.V.: X = sum of 2 rolls of a die. Event: X = 7. Probability of event: P{X = 7} = 1/6.
     R.V.: N = number of arrivals to a website within the next hour. Event: N > 10.
     P{N > 10} = P{N > 10 | weekday} * (5/7) + P{N > 10 | weekend} * (2/7).
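The weekday/weekend split is an instance of conditioning on a partition of days. A small numeric sketch (the two conditional probabilities below are hypothetical values I chose for illustration; they are not from the slides):

```python
# Hypothetical conditional probabilities (assumed, for illustration only):
p_gt10_weekday = 0.14   # P{N > 10 | weekday}
p_gt10_weekend = 0.70   # P{N > 10 | weekend}

# Weekdays make up 5/7 of days, weekends 2/7, so the unconditional
# probability is the weighted average of the conditional ones.
p_gt10 = p_gt10_weekday * 5 / 7 + p_gt10_weekend * 2 / 7
print(round(p_gt10, 3))
```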

  6. Discrete Random Variables. Defn: A discrete r.v. takes on a countable number of values, each with some probability. A discrete r.v. is associated with a discrete distribution that represents the likelihood of each of these values occurring. We sometimes define a r.v. by its associated distribution. Defn: For a discrete r.v. X, the probability mass function of X is: p_X(x) = P{X = x}. Q: What is this? The cumulative distribution function of X is: F_X(x) = P{X <= x} = sum over y <= x of p_X(y). The tail of X is: P{X > x} = 1 - F_X(x).
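The pmf/cdf/tail relationships can be checked directly on a small example. A sketch (not from the slides) for X = value of one fair die roll:

```python
from fractions import Fraction

# p.m.f. of X = value of one fair die roll: p_X(x) = 1/6 for x = 1..6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def cdf(x):
    """F_X(x) = P{X <= x} = sum of p_X(y) over y <= x."""
    return sum(p for y, p in pmf.items() if y <= x)

def tail(x):
    """P{X > x} = 1 - F_X(x)."""
    return 1 - cdf(x)

print(cdf(4), tail(4))   # 2/3 1/3
```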

  7. Common Discrete R.V.s / Distributions

  8. Bernoulli(p). Experiment: flip a single coin, with probability p of Heads. Random variable X = value of the coin flip. Defn: X ~ Bernoulli(p): X = 1 w.p. p, and X = 0 w.p. 1 - p. Q: What distribution is shown above, with what parameter?

  9. Binomial(n, p). Experiment: flip a coin with probability p of Heads, n times. Random variable X = number of heads. Defn: X ~ Binomial(n, p): p_X(i) = C(n, i) p^i (1-p)^(n-i), where i = 0, 1, 2, ..., n. Q: What is sum_{i=0}^{n} C(n, i) p^i (1-p)^(n-i)? (Hint: binomial expansion.) [Figure: p.m.f. of Binomial(n = 20, p = 0.3).]
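A sketch (not from the slides) of the Binomial pmf for the plotted parameters n = 20, p = 0.3, verifying that the pmf sums to 1 (the binomial expansion of (p + (1-p))^n):

```python
from math import comb

def binomial_pmf(i, n, p):
    """p_X(i) = C(n, i) p^i (1-p)^(n-i)."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

n, p = 20, 0.3
pmf = [binomial_pmf(i, n, p) for i in range(n + 1)]
total = sum(pmf)   # binomial expansion of (p + (1-p))^n = 1^n = 1
mode = max(range(n + 1), key=lambda i: pmf[i])   # most likely number of heads
print(total, mode)
```

The mode lands near the mean np = 6, matching the peak of the plotted pmf.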

  10. Geometric(p). Experiment: flip a coin with probability p of Heads until we see the first head. Random variable X = number of flips until the first head. Defn: X ~ Geometric(p): p_X(i) = (1-p)^(i-1) p, where i = 1, 2, 3, ... Q: What is the tail, P{X > i}? Q: What is sum_{i=1}^{infinity} (1-p)^(i-1) p? [Figure: p.m.f. of Geometric(p = 0.3).]
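A sketch (not from the slides) of the Geometric pmf and its tail. P{X > i} = (1-p)^i, since X > i exactly when the first i flips are all tails; the partial sum of the pmf approaches 1 by the geometric series:

```python
def geometric_pmf(i, p):
    """p_X(i) = (1-p)^(i-1) p, for i = 1, 2, 3, ..."""
    return (1 - p)**(i - 1) * p

def geometric_tail(i, p):
    """P{X > i} = probability the first i flips are all tails."""
    return (1 - p)**i

p = 0.3
# The p.m.f. sums to 1 (geometric series); check a long partial sum.
partial = sum(geometric_pmf(i, p) for i in range(1, 200))

# Consistency: P{X <= 3} + P{X > 3} must equal 1.
check = sum(geometric_pmf(i, p) for i in (1, 2, 3)) + geometric_tail(3, p)
print(round(partial, 9), round(check, 9))
```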

  11. Pop Quiz. Q: You have a room of n disks. Each disk independently dies with probability p each year. How are the following quantities distributed? Options: Binomial(n, p), Geometric(p), Bernoulli(p). a) The number of disks that die in the first year: Binomial(n, p). b) The number of years until a particular disk dies: Geometric(p). c) The state of a particular disk after one year: Bernoulli(p).

  12. Poisson(lambda). The Poisson distribution occurs naturally when looking at a mixture of a large number of independent sources. Defn: X ~ Poisson(lambda): p_X(i) = e^(-lambda) lambda^i / i!, where i = 0, 1, 2, 3, ... Q: What is sum_{i=0}^{infinity} e^(-lambda) lambda^i / i!? (Hint: Taylor series of e^lambda.) [Figure: p.m.f. of Poisson(lambda = 6).] Q: Does the shape of the Poisson p.m.f. remind you of another distribution?
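A sketch (not from the slides) of the Poisson pmf for the plotted lambda = 6. It also illustrates the "many independent sources" idea numerically: Binomial(n, lambda/n) with large n is close to Poisson(lambda), which is one answer to the shape question:

```python
from math import exp, factorial, comb

def poisson_pmf(i, lam):
    """p_X(i) = e^(-lam) * lam^i / i!"""
    return exp(-lam) * lam**i / factorial(i)

lam = 6
# Sums to 1: this is the Taylor series of e^lam times e^(-lam).
total = sum(poisson_pmf(i, lam) for i in range(100))

# Binomial(n, lam/n) for large n approximates Poisson(lam).
n = 10_000
p = lam / n
binom_4 = comb(n, 4) * p**4 * (1 - p)**(n - 4)
print(round(total, 9), round(poisson_pmf(4, lam), 4), round(binom_4, 4))
```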

  13. Two Random Variables. Defn: The joint probability mass function between discrete r.v.s X and Y is: p_{X,Y}(x, y) = P{X = x & Y = y}, or equivalently P{X = x, Y = y}, where, by definition: sum_x sum_y p_{X,Y}(x, y) = 1.

  14. Marginal Probability Mass Function. How is p_X(x) related to p_{X,Y}(x, y)? The table shows p_{X,Y}(x, y):

              X = 0   X = 1   X = 2
      Y = 1    0.05    0.05    0.2
      Y = 2    0.05    0.1     0
      Y = 0    0.4     0.05    0.1

      p_X(x) = sum_y p_{X,Y}(x, y), e.g. p_X(1) = 0.05 + 0.1 + 0.05 = 0.2.
      p_Y(y) = sum_x p_{X,Y}(x, y), e.g. p_Y(0) = 0.4 + 0.05 + 0.1 = 0.55.
      These are called marginal probabilities because they are written in the margins of the table.
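A sketch (not from the slides) that encodes the table above as a dictionary keyed by (x, y) and computes the marginals by summing out the other variable:

```python
# Joint p.m.f. from the slide's table: keys are (x, y) pairs.
joint = {
    (0, 1): 0.05, (1, 1): 0.05, (2, 1): 0.2,
    (0, 2): 0.05, (1, 2): 0.1,  (2, 2): 0.0,
    (0, 0): 0.4,  (1, 0): 0.05, (2, 0): 0.1,
}

def marginal_x(x):
    """p_X(x) = sum over y of p_{X,Y}(x, y)."""
    return sum(p for (xx, yy), p in joint.items() if xx == x)

def marginal_y(y):
    """p_Y(y) = sum over x of p_{X,Y}(x, y)."""
    return sum(p for (xx, yy), p in joint.items() if yy == y)

print(round(marginal_x(1), 2), round(marginal_y(0), 2))   # 0.2 0.55
```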

  15. Independence. Defn: Discrete random variables X and Y are independent (written X ⊥ Y) if: p_{X,Y}(x, y) = p_X(x) * p_Y(y), for all x, y. Q: If X and Y are independent, what does this say about P{X = x | Y = y}?

  16. Independence. Defn: Discrete random variables X and Y are independent (written X ⊥ Y) if: p_{X,Y}(x, y) = p_X(x) * p_Y(y), for all x, y. Q: If X and Y are independent, what does this say about P{X = x | Y = y}? Answer: P{X = x | Y = y} = P{X = x & Y = y} / P{Y = y} = P{X = x} P{Y = y} / P{Y = y} = P{X = x}.
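The factorization test can be automated. A sketch (not from the slides; the helper name is my own) that checks whether a joint pmf factors into its marginals, applied to two independent fair coins and to the dependent table from the marginals slide:

```python
def is_independent(joint, tol=1e-12):
    """True iff p_{X,Y}(x, y) == p_X(x) * p_Y(y) for all (x, y)."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    px = {x: sum(joint.get((x, y), 0) for y in ys) for x in xs}
    py = {y: sum(joint.get((x, y), 0) for x in xs) for y in ys}
    return all(abs(joint.get((x, y), 0) - px[x] * py[y]) < tol
               for x in xs for y in ys)

# Two independent fair coins: the joint factors into the marginals.
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

# The earlier 3x3 table does NOT factor (e.g. its (2, 2) entry is 0
# while both marginals are positive).
dep = {(0, 1): 0.05, (1, 1): 0.05, (2, 1): 0.2,
       (0, 2): 0.05, (1, 2): 0.1,  (2, 2): 0.0,
       (0, 0): 0.4,  (1, 0): 0.05, (2, 0): 0.1}
print(is_independent(indep), is_independent(dep))   # True False
```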

  17. Who Fails First? You have a disk with probability p1 of failing each day, and a CPU which independently has probability p2 of failing each day. Q: What is the probability that the disk fails before the CPU? Intuitively, what answer makes sense?

  18. Who Fails First? You have a disk with probability p1 of failing each day, and a CPU which independently has probability p2 of failing each day. Q: What is the probability that the disk fails before the CPU? Let X1 = days until the disk fails ~ Geometric(p1), and X2 = days until the CPU fails ~ Geometric(p2). Then:
      P{X1 < X2} = sum_{i=1}^{infinity} sum_{j=i+1}^{infinity} p_{X1,X2}(i, j)
                 = sum_{i=1}^{infinity} sum_{j=i+1}^{infinity} p_{X1}(i) p_{X2}(j)
                 = sum_{i=1}^{infinity} sum_{j=i+1}^{infinity} (1-p1)^(i-1) p1 (1-p2)^(j-1) p2.

  19. Who Fails First? You have a disk with probability p1 of failing each day, and a CPU which independently has probability p2 of failing each day. Q: What is the probability that the disk fails before the CPU? With X1 = days until the disk fails ~ Geometric(p1) and X2 = days until the CPU fails ~ Geometric(p2):
      P{X1 < X2} = sum_{i=1}^{infinity} sum_{j=i+1}^{infinity} (1-p1)^(i-1) p1 (1-p2)^(j-1) p2
                 = p1 (1 - p2) / (1 - (1 - p1)(1 - p2)),   after some algebra.

  20. Who Fails First? You have a disk with probability p1 of failing each day, and a CPU which independently has probability p2 of failing each day. P{disk fails before CPU fails} = p1 (1 - p2) / (1 - (1 - p1)(1 - p2)). But WHY?

  21. Who Fails First? You have a disk with probability p1 of failing each day, and a CPU which independently has probability p2 of failing each day. P{disk fails before CPU fails} = p1 (1 - p2) / (1 - (1 - p1)(1 - p2)). Intuition: Think about flipping 2 coins each day, where Heads means "fails today": coin 1 (the disk) comes up H w.p. p1, coin 2 (the CPU) w.p. p2. There may be many days where both coins are Tails; we only care about the first day where the coins are not both Tails. Given that the coins are not both Tails, what's the probability that coin 1 is H and coin 2 is T?
      P{coin 1 is H & coin 2 is T | not both Tails} = P{coin 1 is H & coin 2 is T} / P{not both Tails} = p1 (1 - p2) / (1 - (1 - p1)(1 - p2)).
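The coin-flipping intuition is easy to simulate. A sketch (not from the slides) that flips both "coins" each day, records who fails first on the first decisive day, and compares the estimate against the closed form:

```python
import random

def simulate(p1, p2, trials=200_000, seed=1):
    """Estimate P{disk fails strictly before CPU} by daily coin flips."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        while True:
            disk = rng.random() < p1   # disk fails today (coin 1 is H)?
            cpu = rng.random() < p2    # CPU fails today (coin 2 is H)?
            if disk or cpu:            # first day that is not both Tails
                wins += disk and not cpu
                break
    return wins / trials

p1, p2 = 0.1, 0.2
exact = p1 * (1 - p2) / (1 - (1 - p1) * (1 - p2))
print(round(simulate(p1, p2), 3), round(exact, 3))
```

Days where both components survive are simply skipped, mirroring the "we only care about the first decisive day" argument.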

  22. Law of Total Probability. Theorem [Law of Total Probability for Discrete R.V.s]: Let A be an event and let Y be a discrete r.v. Then:
      P{A} = sum_y P{A & Y = y} = sum_y P{A | Y = y} P{Y = y}.
      For a discrete r.v. X:
      P{X = x} = sum_y P{X = x & Y = y} = sum_y P{X = x | Y = y} P{Y = y}.
      Proof: Follows immediately from the Law of Total Probability for Events, once we realize that "Y = y" represents an event and the set of events {Y = y} over all y forms a partition.

  23. Who Fails First? Disk with prob. p1 of failing each day, and a CPU with independent prob. p2 of failing each day. Q: What is the probability that the disk fails before the CPU? (Redo using conditioning!) Let X1 = days until the disk fails ~ Geometric(p1), and X2 = days until the CPU fails ~ Geometric(p2). Then:
      P{X1 < X2} = sum_{i=1}^{infinity} P{X1 < X2 | X1 = i} P{X1 = i}
                 = sum_{i=1}^{infinity} P{i < X2 | X1 = i} P{X1 = i}
                 = sum_{i=1}^{infinity} P{X2 > i} P{X1 = i}
                 = sum_{i=1}^{infinity} (1 - p2)^i (1 - p1)^(i-1) p1
                 = p1 (1 - p2) / (1 - (1 - p1)(1 - p2)).
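The conditioning argument can be checked numerically: evaluate a long truncation of the sum sum_i P{X2 > i} P{X1 = i} and compare it with the closed form. A sketch, not from the slides:

```python
def geometric_pmf(i, p):
    """P{X = i} = (1-p)^(i-1) p for X ~ Geometric(p)."""
    return (1 - p)**(i - 1) * p

def geometric_tail(i, p):
    """P{X > i} = (1-p)^i for X ~ Geometric(p)."""
    return (1 - p)**i

p1, p2 = 0.1, 0.2
# Condition on the day the disk fails: sum_i P{X2 > i} P{X1 = i}.
by_conditioning = sum(geometric_tail(i, p2) * geometric_pmf(i, p1)
                      for i in range(1, 500))
closed_form = p1 * (1 - p2) / (1 - (1 - p1) * (1 - p2))
print(round(by_conditioning, 10), round(closed_form, 10))
```

The truncation error after 500 terms is of order ((1-p1)(1-p2))^500, far below floating-point precision here.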
