Independence of Random Variables in CSE 312 Winter 25 Lecture 10 Variance


Explore the concept of independence of random variables in CSE 312 Winter 25 Lecture 10 (Variance), covering pairwise and mutual independence, their implications, and examples of independent and non-independent random variables.





Presentation Transcript


  1. More Independence; CSE 312 Winter 25 Lecture 10 Variance

  2. More Independence

  3. Independence of Random Variables That's for events; what about random variables? Independence (of random variables): X and Y are independent if for all k, ℓ: P(X = k, Y = ℓ) = P(X = k) · P(Y = ℓ). We'll often use commas instead of the ∩ symbol.

  4. Independence of Random Variables The "for all values" is important. We saw that the event "the sum is 7" is independent of "the red die is 5". What about X = the sum of two dice and Y = the value of the red die?

  5. Independence of Random Variables The "for all values" is important. We saw that the event "the sum is 7" is independent of "the red die is 5". What about X = the sum of two dice and Y = the value of the red die? NOT independent: P(X = 2, Y = 5) ≠ P(X = 2) · P(Y = 5) (for example).
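
The non-independence claimed above is easy to check by brute force; here is a small illustrative sketch (not from the lecture) that enumerates all 36 equally likely outcomes:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely (red, blue) outcomes of two dice.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)  # probability of each outcome

# X = sum of the two dice, Y = value of the red die.
p_X2_Y5 = sum(p for (r, b) in outcomes if r + b == 2 and r == 5)
p_X2 = sum(p for (r, b) in outcomes if r + b == 2)
p_Y5 = sum(p for (r, b) in outcomes if r == 5)

print(p_X2_Y5)      # 0: the sum can't be 2 when the red die shows 5
print(p_X2 * p_Y5)  # 1/216, so the product rule fails for this pair of values
```

One failing pair of values is enough to rule out independence, which is why the "for all values" quantifier matters.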

  6. Independence of Random Variables Flip a coin independently 2n times. Let X be the number of heads in the first n flips. Let Y be the number of heads in the last n flips. X and Y are independent.
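
A similar enumeration can confirm this independence claim; the sketch below (my own, not lecture code) uses a hypothetical small n = 2 so that all 2n flips can be listed:

```python
from fractions import Fraction
from itertools import product

n = 2  # hypothetical small n so we can enumerate all 2n flips
flips = list(product([0, 1], repeat=2 * n))
p = Fraction(1, len(flips))

def prob(pred):
    """Probability of the event described by predicate `pred`."""
    return sum(p for f in flips if pred(f))

# X = heads in the first n flips, Y = heads in the last n flips.
# Check P(X = x, Y = y) = P(X = x) * P(Y = y) for ALL value pairs.
for x in range(n + 1):
    for y in range(n + 1):
        joint = prob(lambda f: sum(f[:n]) == x and sum(f[n:]) == y)
        marg = prob(lambda f: sum(f[:n]) == x) * prob(lambda f: sum(f[n:]) == y)
        assert joint == marg
print("independent for all values")
```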

  7. Independence for 3 or more events For three or more events, we need two kinds of independence. Pairwise Independence: Events A_1, A_2, …, A_n are pairwise independent if P(A_i ∩ A_j) = P(A_i) · P(A_j) for all i ≠ j. Mutual Independence: Events A_1, A_2, …, A_n are mutually independent if P(A_{i_1} ∩ A_{i_2} ∩ ⋯ ∩ A_{i_k}) = P(A_{i_1}) · P(A_{i_2}) ⋯ P(A_{i_k}) for every subset {i_1, i_2, …, i_k} of {1, 2, …, n}.

  8. Mutual Independence for RVs A little simpler to write down than for events. Mutual Independence (of random variables): X_1, X_2, …, X_n are mutually independent if for all x_1, x_2, …, x_n: P(X_1 = x_1, X_2 = x_2, …, X_n = x_n) = P(X_1 = x_1) · P(X_2 = x_2) ⋯ P(X_n = x_n). DON'T need to check all subsets for random variables, but you DO still need to check all values (all possible x_i).

  9. What does Independence give you? If X and Y are independent: E[XY] = E[X] · E[Y] and Var(X + Y) = Var(X) + Var(Y). What does XY mean? I rolled two dice; let X be the red die, Y the blue die. XY is the random variable that tells you the product of the two dice. That's a function that takes in an outcome and gives you a number back, so it's a random variable! (Same for X + Y.)
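
As a sanity check on E[XY] = E[X] · E[Y], the two-dice example can be computed exhaustively (an illustrative sketch, not part of the lecture):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely (red, blue) outcomes.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)

E_X = sum(p * r for (r, b) in outcomes)       # red die
E_Y = sum(p * b for (r, b) in outcomes)       # blue die
E_XY = sum(p * r * b for (r, b) in outcomes)  # the product random variable

print(E_X, E_Y, E_XY)  # 7/2 7/2 49/4
assert E_XY == E_X * E_Y  # holds because the dice are independent
```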

  10. Functions of a random variable Let X, Y be random variables defined on the same sample space. Functions of X and/or Y, like X + Y, X², 2X + 3, etc., ARE random variables! (Say what the outcome is, and these functions give you a number back. They're functions from Ω to ℝ. That's the definition of a random variable!)

  11. Expectations of functions of random variables Let's say we have a random variable X and a function g. What is E[g(X)]? E[g(X)] = Σ_ω g(X(ω)) · P(ω). Equivalently: E[g(X)] = Σ_x g(x) · P(X = x). Notice that E[g(X)] might not be g(E[X]).
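
The warning that E[g(X)] need not equal g(E[X]) is easy to see numerically; here is a small sketch (my own choice of example) with a fair die and g(x) = x²:

```python
from fractions import Fraction

# Fair die: P(X = k) = 1/6 for k = 1..6; take g(x) = x^2.
p = Fraction(1, 6)
E_X = sum(p * k for k in range(1, 7))      # E[X] = 7/2
E_gX = sum(p * k**2 for k in range(1, 7))  # E[g(X)] = E[X^2] = 91/6
g_EX = E_X**2                              # g(E[X]) = (7/2)^2 = 49/4

print(E_gX, g_EX)  # 91/6 49/4 -- not equal!
```

The gap E[X²] − (E[X])² is exactly the variance defined later in the lecture.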

  12. Variance

  13. Where are we? A random variable is a way to summarize what outcome you saw. The expectation of a random variable is its average value: a one-number way to summarize a random variable.

  14. Variance Another one-number summary of a random variable. But wait, we already have expectation; what's this for?

  15. Consider these two games Would you be willing to play these games? Game 1: I will flip a fair coin; if it's heads, I pay you $1. If it's tails, you pay me $1. Let X_1 be your profit if you play game 1. Game 2: I will flip a fair coin; if it's heads, I pay you $10,000. If it's tails, you pay me $10,000. Let X_2 be your profit if you play game 2. Both games are fair (E[X_1] = E[X_2] = 0).


  17. What's the difference? Expectation tells you what the average will be, but it doesn't tell you how extreme your results could be, nor how likely those extreme results are. Game 2 has many (well, only) very extreme results. In expectation they cancel out, but if you can only play once it would be nice to measure that.

  18. Designing a Measure, Try 1 Well, let's measure how far all the events are away from the center, and how likely they are: Σ_x (x − E[X]) · P(X = x). What happens with Game 1? (1/2)(1 − 0) + (1/2)(−1 − 0) = 0. What happens with Game 2? (1/2)(10000 − 0) + (1/2)(−10000 − 0) = 5000 − 5000 = 0.

  19. Designing a Measure, Try 2 How do we prevent cancelling? Squaring makes everything positive: Σ_x (x − E[X])² · P(X = x). What happens with Game 1? (1/2)(1 − 0)² + (1/2)(−1 − 0)² = 1. What happens with Game 2? (1/2)(10000 − 0)² + (1/2)(−10000 − 0)² = 50,000,000 + 50,000,000 = 10⁸.
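
The two candidate measures can be compared directly; this sketch (my own, using the games' $1 and $10,000 stakes) shows why only the squared version tells the games apart:

```python
# Each game: win or lose the stake with probability 1/2, so E[profit] = 0.
def try1(stake):
    # Signed distance from the mean: the two branches cancel.
    return 0.5 * (stake - 0) + 0.5 * (-stake - 0)

def try2(stake):
    # Squared distance from the mean: squaring prevents the cancellation.
    return 0.5 * (stake - 0) ** 2 + 0.5 * (-stake - 0) ** 2

print(try1(1), try1(10_000))  # 0.0 0.0 -- try 1 can't tell the games apart
print(try2(1), try2(10_000))  # 1.0 100000000.0 -- try 2 can
```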

  20. Why Squaring? Why not absolute value? Or fourth power? Squaring is nicer algebraically. Our goal with variance was to talk about the spread of results. Squaring makes extreme results even more extreme. The fourth power over-emphasizes the extreme results (for our purposes).

  21. Variance The variance of a random variable X is Var(X) = E[(X − E[X])²] = Σ_x (x − E[X])² · P(X = x) = E[X²] − (E[X])². The first two forms are the definition; the last one is an algebra trick.

  22. Variance of a die Let X be the result of rolling a fair die. Var(X) = E[(X − E[X])²] = E[(X − 3.5)²] = (1/6)(1 − 3.5)² + (1/6)(2 − 3.5)² + (1/6)(3 − 3.5)² + (1/6)(4 − 3.5)² + (1/6)(5 − 3.5)² + (1/6)(6 − 3.5)² = 35/12 ≈ 2.92. Or: E[X²] − (E[X])² = (Σ_{k=1}^{6} k²/6) − 3.5² = 91/6 − 3.5² = 35/12 ≈ 2.92.
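
Both routes to the die's variance can be checked with exact rational arithmetic (an illustrative sketch, not lecture code):

```python
from fractions import Fraction

p = Fraction(1, 6)
E_X = sum(p * k for k in range(1, 7))                   # 7/2
var_def = sum(p * (k - E_X) ** 2 for k in range(1, 7))  # the definition
E_X2 = sum(p * k**2 for k in range(1, 7))               # E[X^2] = 91/6
var_trick = E_X2 - E_X**2                               # the algebra trick

print(var_def, var_trick)  # 35/12 35/12 -- both forms agree
```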

  23. Variance of n Coin Flips Flip a coin n times, where it comes up heads with probability p each time (independently). Let X be the total number of heads. We'll see next time that E[X] = np. Also define: X_i = 1 if flip i is heads, 0 otherwise.

  24. Variance of n Coin Flips Flip a coin n times, where it comes up heads with probability p each time (independently). Let X be the total number of heads. What about Var(X)? Var(X) = E[(X − E[X])²] = Σ_{k=0}^{n} (k − np)² · C(n, k) · p^k (1 − p)^(n−k). Algebra time?

  25. Variance If X and Y are independent, then Var(X + Y) = Var(X) + Var(Y). Are the X_i independent? Yes! In this problem X_i is independent of X_j for i ≠ j, where X_i = 1 if flip i was heads, 0 otherwise.

  26. Variance Var(X) = Var(Σ_{i=1}^{n} X_i) = Σ_{i=1}^{n} Var(X_i). What's Var(X_i)? E[(X_i − E[X_i])²] = E[(X_i − p)²] = p(1 − p)² + (1 − p)(0 − p)² = p(1 − p)[(1 − p) + p] = p(1 − p). OR: Var(X_i) = E[X_i²] − (E[X_i])² = p − p² = p(1 − p).

  27. Plugging In Var(X) = Var(Σ_{i=1}^{n} X_i) = Σ_{i=1}^{n} Var(X_i). What's Var(X_i)? p(1 − p). So Var(X) = Σ_{i=1}^{n} p(1 − p) = np(1 − p).
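
The closed form np(1 − p) can be checked against the direct pmf computation from a few slides back; n = 10 and p = 1/3 below are arbitrary illustrative choices:

```python
from fractions import Fraction
from math import comb

n, p = 10, Fraction(1, 3)  # hypothetical values for the check

# Direct computation from the pmf: sum_k (k - np)^2 * C(n,k) p^k (1-p)^(n-k)
mean = n * p
var_direct = sum((k - mean) ** 2 * comb(n, k) * p**k * (1 - p) ** (n - k)
                 for k in range(n + 1))

var_formula = n * p * (1 - p)
print(var_direct, var_formula)  # 20/9 20/9 -- the messy sum collapses to np(1-p)
```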

  28. Expectation and Variance aren't everything Alright, so expectation and variance are everything, right? No! Flip a fair coin 3 times independently and count heads (PMF 1). Flip a biased coin (prob heads = 2/3) until heads and count flips (PMF 2). [Bar charts: PMF 1 with E = 3/2, Var = 3/4, supported on {0, 1, 2, 3}; PMF 2 with E = 3/2, Var = 3/4, supported on {1, 2, 3, …}.] Same expectation and variance, but different random variables. A PMF or CDF *does* fully describe a random variable.
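
The claim that the two distributions share E = 3/2 and Var = 3/4 while having different PMFs can be verified numerically; in this sketch (my own) the infinite geometric sum is truncated at a hypothetical cutoff of 60, far past where its terms matter:

```python
from fractions import Fraction
from math import comb

# PMF 1: heads in 3 fair flips (binomial, n = 3, p = 1/2).
pmf1 = {k: Fraction(comb(3, k), 8) for k in range(4)}

# PMF 2: flips until the first head with a p = 2/3 coin (geometric),
# truncated where the remaining tail mass is negligible.
p = Fraction(2, 3)
pmf2 = {k: (1 - p) ** (k - 1) * p for k in range(1, 60)}

def mean(pmf):
    return sum(pr * k for k, pr in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum(pr * (k - m) ** 2 for k, pr in pmf.items())

print(float(mean(pmf1)), float(var(pmf1)))  # 1.5 0.75
print(float(mean(pmf2)), float(var(pmf2)))  # ~1.5 ~0.75, yet pmf2 != pmf1
```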

  29. Proof of Calculation Trick
     E[(X − E[X])²]
     = E[X² − 2X·E[X] + (E[X])²]           (expanding the square)
     = E[X²] − E[2X·E[X]] + E[(E[X])²]     (linearity of expectation)
     = E[X²] − 2E[X]·E[X] + (E[X])²        (expectation of a constant is the constant)
     = E[X²] − 2(E[X])² + (E[X])²
     = E[X²] − (E[X])²
     So Var(X) = E[X²] − (E[X])².

  30. Useful Facts

  31. Make a prediction How should Var(X + c) relate to Var(X) if c is a constant? How should Var(aX) relate to Var(X) if a is a constant?

  32. Make a prediction How should Var(X + c) relate to Var(X) if c is a constant? How should Var(aX) relate to Var(X) if a is a constant? Var(X + c) = Var(X). Var(aX) = a²·Var(X).
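
Both predicted facts can be spot-checked on a fair die with arbitrary constants (c = 10 and a = 4 below are illustrative choices, not from the lecture):

```python
from fractions import Fraction

p = Fraction(1, 6)  # each die face is equally likely

def var(values):
    """Variance of a uniform distribution over `values`."""
    m = sum(p * v for v in values)
    return sum(p * (v - m) ** 2 for v in values)

die = range(1, 7)
c, a = 10, 4  # hypothetical constants

assert var([x + c for x in die]) == var(die)         # shifting doesn't change spread
assert var([a * x for x in die]) == a**2 * var(die)  # scaling multiplies it by a^2
print(var(die), var([x + c for x in die]), var([a * x for x in die]))
```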

  33. Facts About Variance Var(X + c) = Var(X). Proof: Var(X + c) = E[(X + c)²] − (E[X + c])² = E[X² + 2cX + c²] − (E[X] + c)² = E[X²] + 2c·E[X] + c² − (E[X])² − 2c·E[X] − c² = E[X²] − (E[X])² = Var(X).

  34. Facts about Variance Var(aX) = a²·Var(X). Proof: Var(aX) = E[(aX)²] − (E[aX])² = a²·E[X²] − (a·E[X])² = a²·E[X²] − a²·(E[X])² = a²(E[X²] − (E[X])²) = a²·Var(X).

  35. Extra Practice

  36. More Practice Suppose you flip a fair coin until you see a heads for the first time. Let X be the number of trials (including the heads). What is the pmf of X? The cdf of X? E[X]?

  37. More Practice Suppose you flip a fair coin until you see a heads for the first time. Let X be the number of trials (including the heads). What is the pmf of X? p_X(k) = 1/2^k for k ∈ ℤ⁺, 0 otherwise. The cdf of X? F_X(k) = 1 − (1/2)^⌊k⌋ for k ≥ 0, 0 for k < 0. E[X]? Σ_{k=1}^{∞} k/2^k = 2.
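
The pmf, cdf, and expectation above can be sanity-checked numerically; in this sketch (my own) the infinite sum for E[X] is truncated at a hypothetical k = 200, where the remaining tail is negligible:

```python
from fractions import Fraction

# pmf of X: p_X(k) = 1/2^k.  Truncate the infinite sum for E[X]; the tail
# past k = 200 is astronomically small, so the partial sum pins down E[X].
E = sum(Fraction(k, 2**k) for k in range(1, 201))
print(float(E))  # 2.0 (up to truncation error on the order of 200 / 2^200)

# cdf check at an integer point: F_X(k) = 1 - (1/2)^k
assert sum(Fraction(1, 2**j) for j in range(1, 11)) == 1 - Fraction(1, 2**10)
```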

  38. More Random Variable Practice Roll a fair die n times. Let X be the number of rolls that are 5s or 6s. What is the pmf? Don't try to write the CDF; it's a mess. (Or try for a few minutes to realize it isn't nice.) What is the expectation?

  39. More Random Variable Practice Roll a fair die n times. Let X be the number of rolls that are 5s or 6s. What's the probability of getting exactly k 5s/6s? Well, we need to know which k of the n rolls are the 5s/6s, and then multiply by the probability of getting exactly that outcome: p_X(k) = C(n, k) · (1/3)^k · (2/3)^(n−k) for 0 ≤ k ≤ n, and 0 otherwise. The expectation formula is a mess, but if you plug it into a calculator you'll get a nice, clean simplification: n/3.
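
A quick check of this pmf and the n/3 expectation, using an arbitrary n = 7 for illustration (not a value from the lecture):

```python
from fractions import Fraction
from math import comb

n = 7               # hypothetical number of rolls
p = Fraction(1, 3)  # chance a single roll is a 5 or 6

# pmf: p_X(k) = C(n, k) (1/3)^k (2/3)^(n-k) for 0 <= k <= n
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

assert sum(pmf.values()) == 1  # a valid pmf: the terms sum to 1
E = sum(k * pr for k, pr in pmf.items())
print(E)  # 7/3, i.e. n/3 as claimed
```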
