Topics on Random Variables: Covariance, Correlation, and Variance Calculation

Explore the concepts of covariance, correlation, and variance in random variables through examples and insights. Learn how to calculate covariance, determine independence, and compute total variance in summations of random variables. Dive into conditional expectation and its significance in probability theory.

  • Random Variables
  • Covariance
  • Correlation
  • Variance
  • Conditional Expectation


Presentation Transcript


  1. Tutorial 9: Further Topics on Random Variables 2. Weiwen LIU, wwliu@cuhk.edu.hk, March 27, 2017

  2. Covariance and Correlation
     Covariance and correlation describe the degree to which two random variables or sets of random variables tend to deviate from their expected values in similar ways. Independent random variables are always uncorrelated, but uncorrelated random variables are not necessarily independent.

  3. Example 1
     Let X and Y be continuous random variables with joint pdf
     f_{X,Y}(x,y) = 3x for 0 ≤ y ≤ x ≤ 1, and f_{X,Y}(x,y) = 0 otherwise.
     Compute cov(X,Y). Are X and Y independent?

  4. Example 1
     The marginal pdfs and expectations of X and Y are
     f_X(x) = ∫[0,x] 3x dy = 3x² for 0 ≤ x ≤ 1, so E[X] = ∫[0,1] x·3x² dx = 3/4;
     f_Y(y) = ∫[y,1] 3x dx = (3/2)(1 − y²) for 0 ≤ y ≤ 1, so E[Y] = ∫[0,1] y·(3/2)(1 − y²) dy = 3/8.
     Also, E[XY] = ∫[0,1] ∫[0,x] xy·3x dy dx = ∫[0,1] (3/2)x⁴ dx = 3/10.

  5. Example 1
     Then the covariance is
     cov(X,Y) = E[XY] − E[X]E[Y] = 3/10 − (3/4)(3/8) = 3/160.
     X and Y are not independent, since it is not true that f_{X,Y}(x,y) = f_X(x)·f_Y(y).
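A quick simulation check of this covariance (my addition, not part of the original tutorial; a minimal sketch assuming NumPy). Since F_X(x) = x³, X can be sampled by inverse transform as U^(1/3); given X = x, the conditional density of Y is (3x)/(3x²) = 1/x on [0, x], i.e. uniform:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10**6

    # X has CDF F_X(x) = x^3 on [0, 1], so X = U^(1/3) by inverse transform;
    # given X = x, Y is Uniform(0, x).
    x = rng.random(n) ** (1 / 3)
    y = rng.random(n) * x

    cov_hat = np.mean(x * y) - np.mean(x) * np.mean(y)
    print(cov_hat, 3 / 160)   # both approximately 0.01875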

  6. Variance of Summations
     Let X₁, X₂, …, Xₙ be random variables with finite variance. Then
     var(X₁ + ⋯ + Xₙ) = ∑ᵢ var(Xᵢ) + ∑_{i≠j} cov(Xᵢ, Xⱼ).
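As a small illustration of this identity (my addition, assuming NumPy, with an arbitrary covariance matrix): for sampled data, the sample variance of the row sums equals the sum of all entries of the sample covariance matrix, the diagonal contributing the variances and the off-diagonal entries the covariances:

    import numpy as np

    rng = np.random.default_rng(1)
    data = rng.multivariate_normal(
        mean=[0.0, 0.0, 0.0],
        cov=[[1.0, 0.5, 0.2],
             [0.5, 2.0, 0.3],
             [0.2, 0.3, 1.5]],
        size=100_000,
    )

    C = np.cov(data, rowvar=False)          # sample covariance matrix
    lhs = np.var(data.sum(axis=1), ddof=1)  # variance of the sum
    rhs = C.sum()                           # sum of variances plus all covariances
    print(lhs, rhs)                         # equal up to floating-point error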

  7. Example 2
     If the variance of the verbal SAT score were 64, the variance of the quantitative SAT score were 81, and the correlation between these two tests were 0.50, then what would be the variance of the total SAT score (verbal + quantitative)?

  8. Example 2
     Denote by X₁ and X₂ the verbal and quantitative scores respectively. Recovering the covariance from the correlation as cov(X₁,X₂) = ρ·√var(X₁)·√var(X₂), we get
     var(X₁ + X₂) = var(X₁) + var(X₂) + 2·cov(X₁,X₂) = 64 + 81 + 2 × 0.5 × √64 × √81 = 217.
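The same arithmetic in code (my addition, a one-liner apart from the setup):

    import math

    var1, var2, rho = 64, 81, 0.50
    cov12 = rho * math.sqrt(var1) * math.sqrt(var2)   # 0.5 * 8 * 9 = 36.0
    print(var1 + var2 + 2 * cov12)                    # 217.0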

  9. Conditional Expectation Revisited
     The conditional expectation E[X|Y] of a random variable X given another random variable Y is itself a new random variable: its value is determined by Y, and its distribution is determined by the distribution of Y.

  10. Expectation of CE
      The conditional expectation E[X|Y] is a random variable, so it has an expectation of its own (taken over Y):
      E[E[X|Y]] = ∑_y E[X|Y = y]·p_Y(y), or, for continuous Y,
      E[E[X|Y]] = ∫ E[X|Y = y]·f_Y(y) dy.

  11. Expectation of CE: Properties
      If X has a finite expectation, the total expectation theorem immediately gives the law of iterated expectations:
      E[E[X|Y]] = E[X].
      For any function g, we have E[X·g(Y)|Y] = g(Y)·E[X|Y].
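A minimal numeric sketch of the law of iterated expectations (my own illustrative example, assuming NumPy): average X within each group defined by Y, then re-average the group means weighted by the group frequencies:

    import numpy as np

    rng = np.random.default_rng(2)
    y = rng.integers(0, 3, size=100_000)      # Y takes values 0, 1, 2
    x = y + rng.standard_normal(100_000)      # X depends on Y plus noise

    # E[E[X|Y]] = sum over y of E[X | Y = y] * P(Y = y)
    iterated = sum(x[y == v].mean() * np.mean(y == v) for v in (0, 1, 2))
    print(iterated, x.mean())                 # both approximately 1.0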

  12. Example 3
      A class has n students with scores x₁, …, xₙ; let m = (1/n)·∑ᵢ xᵢ be the class average. Divide the students into k disjoint sections A₁, …, A_k, let nₛ be the number of students in section s, and let mₛ = (1/nₛ)·∑_{i∈Aₛ} xᵢ be the average score of the s-th section. We have ∑ₛ (nₛ/n)·mₛ = m.

  13. Example 3
      The result ∑ₛ (nₛ/n)·mₛ = m can be viewed from a CE perspective. Let X be the score of a randomly chosen student, and let Y be the section of that student, so that E[X|Y = s] = mₛ and P(Y = s) = nₛ/n. Then
      ∑ₛ (nₛ/n)·mₛ = ∑ₛ E[X|Y = s]·P(Y = s) = E[E[X|Y]] = E[X] = m.
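The identity can be demonstrated directly on hypothetical data (my addition, assuming NumPy; the scores and section labels below are made up): the section-size-weighted average of the section means reproduces the class mean exactly:

    import numpy as np

    scores = np.array([58.0, 73.0, 91.0, 66.0, 84.0, 77.0, 62.0])
    section = np.array([0, 0, 1, 1, 1, 2, 2])   # hypothetical section labels

    m = scores.mean()
    weighted = sum(
        scores[section == s].mean() * np.mean(section == s)
        for s in np.unique(section)
    )
    print(m, weighted)   # both exactly 73.0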

  14. Example 4
      Let Y be the sales of the next year, and let X be the sales of the first quarter of next year. Suppose we have a forecast system giving the joint distribution of X and Y. We view Z = E[Y|X] − E[Y] as the forecast revision: intuitively, if Z is positive, the baseline forecast E[Y] underestimates the conditional expectation E[Y|X]. By the law of iterated expectations,
      E[Z] = E[E[Y|X]] − E[Y] = 0,
      so the expected forecast revision is zero.

  15. CE as Estimator
      If we can observe Y, which provides information about X, it is natural to estimate X using Y as X̂ = E[X|Y]. The estimation error X̃ = X̂ − X is a random variable satisfying
      E[X̃|Y] = E[X̂|Y] − E[X|Y] = E[X|Y] − E[X|Y] = 0,
      and hence E[X̃] = 0.

  16. CE as Estimator
      An important property is that the estimator X̂ is uncorrelated with the estimation error X̃. In fact,
      E[X̂·X̃] = E[E[X̂·X̃|Y]] = E[X̂·E[X̃|Y]] = 0.
      Hence cov(X̂, X̃) = E[X̂·X̃] − E[X̂]·E[X̃] = 0. As a result,
      var(X) = var(X̂) + var(X̃).
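A simulation sketch of this decomposition (my addition, assuming NumPy; the conditional expectation is estimated by the empirical group mean for each value of a discrete Y):

    import numpy as np

    rng = np.random.default_rng(3)
    y = rng.integers(0, 4, size=200_000)
    x = 2.0 * y + rng.standard_normal(200_000)

    # Empirical estimator: X_hat = E[X|Y], the group mean for each value of Y
    group_mean = {v: x[y == v].mean() for v in range(4)}
    x_hat = np.vectorize(group_mean.get)(y)
    x_err = x_hat - x

    print(np.cov(x_hat, x_err)[0, 1])           # ~0: estimator and error uncorrelated
    print(x.var(), x_hat.var() + x_err.var())   # both approximately 6.0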

  17. Conditional Variance
      The conditional variance is defined as
      var(X|Y) = E[X̃²|Y] = E[(X − E[X|Y])²|Y].
      As usual, var(X|Y = y) = E[X̃²|Y = y]. Also, since E[X̃] = 0,
      var(X̃) = E[X̃²] = E[E[X̃²|Y]] = E[var(X|Y)].

  18. Law of Total Variance
      var(X) = E[var(X|Y)] + var(E[X|Y]).
      This follows from var(X) = var(X̂) + var(X̃), using var(X̃) = E[var(X|Y)] and X̂ = E[X|Y]. The law is especially useful for calculating variances of random variables in stages; see the following examples.

  19. Example 5
      Suppose we toss a coin n times, where the probability of heads is itself a uniform random variable Y on [0, 1]. The number of heads X satisfies E[X|Y] = nY, and so
      var(E[X|Y]) = var(nY) = n²·var(Y) = n²/12.

  20. Example 5
      Meanwhile we have var(X|Y) = nY(1 − Y), and
      E[var(X|Y)] = n·E[Y − Y²] = n(1/2 − 1/3) = n/6.
      Then, using the law of total variance,
      var(X) = E[var(X|Y)] + var(E[X|Y]) = n/6 + n²/12.
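This value checks out by Monte Carlo (my addition, assuming NumPy), drawing Y uniformly on [0, 1] and then X as Binomial(n, Y):

    import numpy as np

    rng = np.random.default_rng(4)
    n, trials = 10, 500_000

    p = rng.random(trials)        # Y ~ Uniform(0, 1)
    x = rng.binomial(n, p)        # given Y = p, X is Binomial(n, p)

    print(x.var())                # approximately 10.0
    print(n / 6 + n**2 / 12)      # exact value from the law of total variance: 10.0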

  21. Transforms
      The transform of a random variable provides an alternative representation of its probability law (PMF or PDF). It is not particularly intuitive, but it is often convenient for certain types of mathematical manipulations. The transform (also referred to as the associated moment generating function) of a random variable X is defined as
      M_X(s) = E[e^{sX}],
      which is a function of a scalar parameter s.

  22. Inversion Property
      Suppose that M_X(s) is finite for all s in an interval of the form [−a, a], where a is a positive number. Then M_X determines uniquely the CDF of the random variable X. In particular, if M_X(s) = M_Y(s) < ∞ for all s ∈ [−a, a], where a is a positive number, then the random variables X and Y have the same CDF.

  23. Sums of Independent Variables
      If X₁, …, Xₙ is a collection of independent random variables and Z = X₁ + ⋯ + Xₙ, then
      M_Z(s) = M_{X₁}(s) ⋯ M_{Xₙ}(s).
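As an illustration (my addition, assuming NumPy, with two independent Exp(1) variables as an arbitrary choice), the product rule can be checked numerically against the known transform M(s) = 1/(1 − s) of the Exp(1) distribution:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 10**6
    x1 = rng.exponential(1.0, n)   # Exp(1): M(s) = 1 / (1 - s) for s < 1
    x2 = rng.exponential(1.0, n)
    z = x1 + x2

    for s in (0.1, 0.25, 0.4):
        mc = np.mean(np.exp(s * z))      # Monte Carlo estimate of E[e^{sZ}]
        print(s, mc, 1 / (1 - s) ** 2)   # matches the product M(s) * M(s)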

  24. Sums of a Random Number of Independent Random Variables
      Let Y = X₁ + ⋯ + X_N, where N is a random variable taking nonnegative integer values, and the Xᵢ are independent, identically distributed, and independent of N, with common mean E[X] and variance var(X). Then
      E[Y] = E[N]·E[X],
      var(Y) = E[N]·var(X) + (E[X])²·var(N),
      M_Y(s) = ∑_{n=0}^∞ (M_X(s))ⁿ·P(N = n).
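A simulation check of the mean and variance formulas (my addition, assuming NumPy; N ~ Poisson(4) and Xᵢ ~ Exp(1) are an arbitrary illustrative choice, so E[X] = var(X) = 1 and E[N] = var(N) = 4):

    import numpy as np

    rng = np.random.default_rng(6)
    lam, trials = 4.0, 100_000

    n = rng.poisson(lam, trials)                            # random number of terms
    y = np.array([rng.exponential(1.0, k).sum() for k in n])

    print(y.mean(), lam * 1.0)               # E[Y] = E[N] E[X] = 4
    print(y.var(), lam * 1.0 + 1.0 * lam)    # var(Y) = E[N]var(X) + (E[X])^2 var(N) = 8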

  25. Example 6
      Let N be geometrically distributed with parameter p, and let each random variable Xᵢ be geometrically distributed with parameter q. We assume that all of these random variables are independent. Let Y = X₁ + ⋯ + X_N. We have
      M_N(s) = p·e^s / (1 − (1 − p)·e^s),
      M_X(s) = q·e^s / (1 − (1 − q)·e^s).
      What is the distribution of Y?

  26. Example 6
      To determine M_Y(s), we start with the formula for M_N(s) and replace each occurrence of e^s with M_X(s). This yields
      M_Y(s) = p·M_X(s) / (1 − (1 − p)·M_X(s)),
      and, after some algebra,
      M_Y(s) = pq·e^s / (1 − (1 − pq)·e^s).
      We conclude that Y is geometrically distributed with parameter pq.
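The conclusion can also be confirmed by simulation (my addition, assuming NumPy): a geometric sum of geometric variables matches a direct Geometric(pq) sample in mean and variance:

    import numpy as np

    rng = np.random.default_rng(7)
    p, q, trials = 0.3, 0.5, 200_000

    n = rng.geometric(p, trials)                          # N ~ Geometric(p)
    y = np.array([rng.geometric(q, k).sum() for k in n])  # Y = X_1 + ... + X_N
    ref = rng.geometric(p * q, trials)                    # claimed: Geometric(pq)

    print(y.mean(), ref.mean(), 1 / (p * q))               # all approximately 6.67
    print(y.var(), ref.var(), (1 - p * q) / (p * q) ** 2)  # all approximately 37.8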
