Multiple Random Variables: CDFs, PDFs, Marginals, Independence & More

Learn about Cumulative Distribution Functions, Probability Density Functions, Marginals, Independence, Conditional PDFs, Expectation, and Characteristic Functions of Multiple Random Variables. Explore how to analyze, integrate, and apply these concepts in multidimensional scenarios.

  • Random Variables
  • PDFs
  • CDFs
  • Marginals
  • Independence


Presentation Transcript


  1. EE 5345 Multiple Random Variables: CDFs and pdfs, Marginals, Independence, Functions of Several RVs, Multidimensional Expectation: Correlation & Covariance, Multivariate Gaussian RVs.

  2. Multiple Random Variables
     Cumulative Distribution Function:
     $F_{\mathbf{X}}(\mathbf{x}) = F_{\mathbf{X}}(x_1, x_2, \ldots, x_n) = \Pr[\mathbf{X} \le \mathbf{x}] = \Pr[X_1 \le x_1, X_2 \le x_2, X_3 \le x_3, \ldots, X_n \le x_n]$,
     where $\mathbf{x} = [x_1 \; x_2 \; \cdots \; x_n]^T$.

  3. Multiple Random Variables (cont)
     Probability Density Function:
     $f_{\mathbf{X}}(\mathbf{x}) = \dfrac{\partial^n}{\partial x_1 \, \partial x_2 \cdots \partial x_n} F_{\mathbf{X}}(\mathbf{x})$
     CDF marginals:
     $F_{X_1 X_2}(x_1, x_2) = F_{\mathbf{X}}(x_1, x_2, \infty, \infty, \ldots, \infty)$

  4. Multiple Random Variables (cont)
     pdf marginals: integrate out what you don't want.
     $f_{X_1 X_2}(x_1, x_2) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_{\mathbf{X}}(x_1, x_2, x_3, x_4, \ldots, x_n) \, dx_3 \, dx_4 \, dx_5 \cdots dx_n$
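     "Integrating out what you don't want" is easy to see numerically. Below is a minimal sketch (not part of the original slides) that evaluates a joint bivariate Gaussian pdf on a grid and integrates out $x_2$ with the trapezoid rule; the grid and the correlation value 0.5 are illustrative choices.

```python
# Recover the marginal f_X1 by numerically integrating the joint pdf over x2.
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import multivariate_normal, norm

x1 = np.linspace(-5, 5, 401)
x2 = np.linspace(-5, 5, 401)
X1, X2 = np.meshgrid(x1, x2, indexing="ij")
joint = multivariate_normal(mean=[0, 0], cov=[[1, 0.5], [0.5, 1]])
f = joint.pdf(np.dstack((X1, X2)))      # joint pdf evaluated on the grid

f_x1 = trapezoid(f, x2, axis=1)         # integrate out x2
print(np.max(np.abs(f_x1 - norm.pdf(x1))))  # close to 0: the marginal is N(0, 1)
```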

  5. Multiple Random Variables (cont)
     Conditional pdfs:
     $f_{\mathbf{X}}(x_3, x_4, x_5, \ldots, x_n \mid x_1, x_2) = \dfrac{f_{\mathbf{X}}(x_1, x_2, x_3, x_4, x_5, \ldots, x_n)}{f_{X_1 X_2}(x_1, x_2)}$

  6. Multiple Random Variables (cont)
     Independence:
     $f_{\mathbf{X}}(\mathbf{x}) = \prod_{k=1}^{n} f_{X_k}(x_k)$
     The joint is the product of the marginals.
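     An empirical check of the factorization, added here as a sketch (not from the original slides): for two independent Bernoulli variables with illustrative parameters 0.3 and 0.7, the sample joint pmf should match the product of the sample marginals.

```python
# For independent RVs, the estimated joint pmf factors into marginal products.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.binomial(1, 0.3, n)   # X ~ Bernoulli(0.3)
y = rng.binomial(1, 0.7, n)   # Y ~ Bernoulli(0.7), drawn independently of X

for a in (0, 1):
    for b in (0, 1):
        joint = np.mean((x == a) & (y == b))          # estimate of p_XY(a, b)
        product = np.mean(x == a) * np.mean(y == b)   # p_X(a) * p_Y(b)
        print(f"p({a},{b}): joint={joint:.4f}  product={product:.4f}")
```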

  7. Multiple Random Variables (cont)
     Expectation:
     $E[g(\mathbf{X})] = \int_{\mathbf{x}} g(\mathbf{x}) \, f_{\mathbf{X}}(\mathbf{x}) \, d\mathbf{x}$
     Note: if the $X_k$'s are independent,
     $E\!\left[\prod_{k=1}^{n} g_k(X_k)\right] = \prod_{k=1}^{n} E[g_k(X_k)]$
     (proof)

  8. Multiple Random Variables (cont)
     If the $X_k$'s are independent, the joint characteristic function is the product of the marginal characteristic functions:
     $\Phi_{\mathbf{X}}(\boldsymbol{\omega}) = E[e^{j \boldsymbol{\omega}^T \mathbf{X}}] = E[e^{j(\omega_1 X_1 + \omega_2 X_2 + \omega_3 X_3 + \cdots + \omega_n X_n)}] = E\!\left[\prod_{k=1}^{n} e^{j \omega_k X_k}\right] = \prod_{k=1}^{n} E[e^{j \omega_k X_k}] = \prod_{k=1}^{n} \Phi_{X_k}(\omega_k)$

  9. Random Variable Sum (cont)
     If X and Y are independent and Z = X + Y:
     $\Phi_Z(\omega) = E[e^{j\omega Z}] = E[e^{j\omega(X+Y)}] = E[e^{j\omega X}] \, E[e^{j\omega Y}] = \Phi_X(\omega)\,\Phi_Y(\omega)$
     Thus, from the convolution theorem of Fourier analysis,
     $f_Z(z) = f_X(z) * f_Y(z) = \int_{-\infty}^{\infty} f_X(\xi)\, f_Y(z - \xi)\, d\xi$
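     The convolution result can be verified discretely. The sketch below (an illustration, not from the slides) convolves the Uniform(0,1) density with itself; the sum of two independent Uniform(0,1) variables has the triangular density on (0, 2).

```python
# Density of Z = X + Y for independent Uniform(0,1) X, Y via discrete convolution.
import numpy as np

dx = 0.001
x = np.arange(0, 1, dx)
fX = np.ones_like(x)                   # density of Uniform(0, 1)
fZ = np.convolve(fX, fX) * dx          # discrete approximation of f_X * f_Y
z = np.arange(len(fZ)) * dx
exact = np.where(z <= 1, z, 2 - z)     # exact triangular density of X + Y
print(np.max(np.abs(fZ - exact)))      # on the order of dx: discretization error
```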

  10. Random Variable Sum (cont)
     If $\{X_k \mid 1 \le k \le n\}$ are i.i.d. (independent and identically distributed) and $S = \sum_{k=1}^{n} X_k$, then
     $\Phi_S(\omega) = \left[\Phi_X(\omega)\right]^n$

  11. Random Variable Sum (cont)
     If $\{X_k \mid 1 \le k \le n\}$ are i.i.d. and $S = \sum_{k=1}^{n} X_k$, then
     S is Gaussian if the $X_k$'s are Gaussian.
     S is Poisson if the $X_k$'s are Poisson.
     S is Binomial if the $X_k$'s are Binomial.
     S is Gamma if the $X_k$'s are Gamma.
     S is Cauchy if the $X_k$'s are Cauchy.
     S is Negative Binomial if the $X_k$'s are Negative Binomial.

  12. Functions of Several Random Variables
     Types of transformations:
     A single function of n RVs: $Z = g(\mathbf{X})$
     n functions of n RVs: $Z_k = g_k(\mathbf{X}); \; 1 \le k \le n$

  13. Leibniz's Rule
     $\dfrac{d}{dz} \int_{\ell(z)}^{u(z)} h(z, x)\, dx = h(z, u(z))\, \dfrac{du(z)}{dz} - h(z, \ell(z))\, \dfrac{d\ell(z)}{dz} + \int_{\ell(z)}^{u(z)} \dfrac{\partial h(z, x)}{\partial z}\, dx$
     "It is unworthy of excellent men to lose hours like slaves in the labor of calculation, which could be safely relegated to anyone else if machines were used." Gottfried Wilhelm Leibniz (1646-1716)
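     A quick numerical sanity check of the rule, added as a sketch (not in the original slides). The integrand $h(z, x) = e^{-zx}$ and the limits $\ell(z) = 0$, $u(z) = z$ are illustrative choices; the centered finite difference of the integral should match the three Leibniz terms.

```python
# Verify Leibniz's rule for d/dz of an integral with z-dependent limits.
import numpy as np
from scipy.integrate import quad

def h(z, x):
    return np.exp(-z * x)

def I(z):                                    # I(z) = integral_0^z h(z, x) dx
    return quad(lambda x: h(z, x), 0.0, z)[0]

z, eps = 1.3, 1e-6
numeric = (I(z + eps) - I(z - eps)) / (2 * eps)        # centered difference
dIdz = quad(lambda x: -x * np.exp(-z * x), 0.0, z)[0]  # integral of dh/dz
leibniz = h(z, z) * 1.0 - h(z, 0.0) * 0.0 + dIdz       # u(z) = z, l(z) = 0
print(numeric, leibniz)                                # the two derivatives agree
```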

  14. One Function of Several Random Variables
     $Z = g(\mathbf{X})$
     CDF of Z: $F_Z(z) = \Pr[Z \le z] = \Pr[g(\mathbf{X}) \le z]$
     Challenge: find the region $R_z = \{\mathbf{x} \mid g(\mathbf{x}) \le z\}$ (pictured as a region in the $(x_1, x_2)$ plane).

  15. One Function of Several Random Variables (cont)
     $F_Z(z) = \Pr[Z \le z] = \Pr[g(\mathbf{X}) \le z] = \int_{R_z} f_{\mathbf{X}}(\mathbf{x}) \, d\mathbf{x}$, where $R_z = \{\mathbf{x} \mid g(\mathbf{x}) \le z\}$
     $f_Z(z) = \dfrac{d}{dz} F_Z(z)$

  16. Sum of Two Random Variables (cont)
     $Z = g(X, Y) = X + Y$
     $R_z = \{(x, y) \mid g(x, y) = x + y \le z\} = \{(x, y) \mid y \le z - x\}$
     ($R_z$ is the half-plane below the line $y = z - x$.)

  17. Sum of Two Random Variables (cont)
     $R_z = \{(x, y) \mid y \le z - x\}$, $Z = X + Y$
     $F_Z(z) = \Pr[Z \le z] = \Pr[g(X, Y) \le z] = \int_{-\infty}^{\infty} \int_{-\infty}^{z - x} f_{XY}(x, y)\, dy\, dx$

  18. Sum of Two Random Variables (cont)
     $Z = X + Y$, $F_Z(z) = \int_{-\infty}^{\infty} \int_{-\infty}^{z - x} f_{XY}(x, y)\, dy\, dx$
     $f_Z(z) = \dfrac{d}{dz} F_Z(z) = \dfrac{d}{dz} \int_{-\infty}^{\infty} \int_{-\infty}^{z - x} f_{XY}(x, y)\, dy\, dx = \int_{-\infty}^{\infty} f_{XY}(x, z - x)\, dx$

  19. Sum of Two Random Variables (cont)
     $Z = X + Y$. If X and Y are independent:
     $f_Z(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx = f_X(z) * f_Y(z)$
     This is the same result we saw with the characteristic function.

  20. Product of Two Random Variables
     $Z = XY$. Assume $X > 0$ and $Y > 0$. Then
     $F_Z(z) = \Pr[Z \le z] = \Pr[XY \le z] = \Pr\!\left[Y \le \dfrac{z}{X}\right]$
     ($R_z$ is the region under the hyperbola $y = z/x$.)

  21. Product of Two Random Variables
     $Z = XY$; $X, Y > 0$:
     $F_Z(z) = \Pr\!\left[Y \le \dfrac{z}{X}\right] = \int_{0}^{\infty} \int_{0}^{z/x} f_{XY}(x, y)\, dy\, dx$

  22. Product of Two Random Variables
     $Z = XY$; $X, Y > 0$:
     $f_Z(z) = \dfrac{d}{dz} F_Z(z)$
     Use Leibniz's rule:
     $f_Z(z) = \int_{0}^{\infty} \dfrac{d}{dz} \int_{0}^{z/x} f_{XY}(x, y)\, dy\, dx = \int_{0}^{\infty} \dfrac{1}{x}\, f_{XY}\!\left(x, \dfrac{z}{x}\right) dx$
     What happens when we do not restrict X and Y to be positive?

  23. Product of Two Random Variables: Example
     $Z = XY$, with X and Y i.i.d. and uniform on (0,1):
     $f_Z(z) = \int_{0}^{\infty} \dfrac{1}{x}\, f_{XY}\!\left(x, \dfrac{z}{x}\right) dx$
     $f_{XY}(x, y) = 1; \;\; 0 \le x \le 1, \; 0 \le y \le 1$
     $f_{XY}\!\left(x, \dfrac{z}{x}\right) = 1; \;\; 0 \le x \le 1, \; 0 \le \dfrac{z}{x} \le 1$
     $f_{XY}\!\left(x, \dfrac{z}{x}\right) = 1; \;\; z \le x \le 1$

  24. Product of Two Random Variables: Example
     $Z = XY$, with X and Y i.i.d. and uniform on (0,1):
     $f_Z(z) = \int_{z}^{1} \dfrac{1}{x}\, dx = \ln\!\left(\dfrac{1}{z}\right); \;\; 0 \le z \le 1$
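     A Monte Carlo check of this example, added as a sketch (not from the slides): a normalized histogram of simulated products should track $\ln(1/z)$ in the interior of (0, 1). The bin near 0 deviates somewhat because of the logarithmic singularity there.

```python
# Histogram of Z = X * Y, X, Y i.i.d. Uniform(0,1), versus f_Z(z) = ln(1/z).
import numpy as np

rng = np.random.default_rng(1)
z = rng.random(1_000_000) * rng.random(1_000_000)   # samples of Z = X * Y
hist, edges = np.histogram(z, bins=50, range=(0, 1), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
for m, h in zip(mid[10:14], hist[10:14]):           # a few interior bins
    print(f"z={m:.2f}: histogram={h:.3f}, ln(1/z)={np.log(1 / m):.3f}")
```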

  25. Quotient of Two Random Variables
     $Z = \dfrac{X}{Y}$
     Scaling background: if $Y = aX$,
     $f_Y(y) = \dfrac{1}{|a|}\, f_X\!\left(\dfrac{y}{a}\right)$
     (e.g., for $a = 2$, $f_Y$ is $f_X$ stretched horizontally by 2 and halved in height)

  26. Quotient of Two Random Variables (cont)
     $Z = \dfrac{X}{Y}$
     Given $Y = y$, this is a simple scaling problem with $a = \dfrac{1}{y}$. Thus
     $f_Z(z \mid y) = |y|\, f_X(yz \mid y)$

  27. Quotient of Two Random Variables (cont)
     $Z = \dfrac{X}{Y}$, with $f_Z(z \mid y) = |y|\, f_X(yz \mid y)$
     Joint pdf: $f_{ZY}(z, y) = f_Z(z \mid y)\, f_Y(y)$
     $f_Z(z) = \int_{-\infty}^{\infty} |y|\, f_X(yz \mid y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} |y|\, f_{XY}(yz, y)\, dy$

  28. Quotient of Two Random Variables (example)
     $Z = \dfrac{X}{Y}$, $f_Z(z) = \int_{-\infty}^{\infty} |y|\, f_{XY}(yz, y)\, dy$
     X and Y i.i.d. exponential RVs:
     $f_{XY}(x, y) = e^{-(x + y)}\, U(x)\, U(y)$
     $f_Z(z) = \int_{0}^{\infty} y\, e^{-(yz + y)}\, dy$

  29. Quotient of Two Random Variables (example)
     Integration:
     $f_Z(z) = \int_{0}^{\infty} y\, e^{-y(1 + z)}\, dy = \dfrac{1}{(1 + z)^2}\, U(z)$
     Equivalently, $F_Z(z) = \left(1 - \dfrac{1}{1 + z}\right) U(z)$, whose derivative gives the same density.
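     A simulation check of the quotient result, added as a sketch (not in the original slides): the histogram of ratios of independent unit exponentials should match $1/(1+z)^2$. Note the histogram is normalized against all samples so that truncating the plot range to (0, 4) does not bias the comparison.

```python
# Histogram of Z = X / Y, X, Y i.i.d. Exponential(1), versus 1 / (1 + z)^2.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
z = rng.exponential(size=n) / rng.exponential(size=n)   # samples of Z = X / Y
counts, edges = np.histogram(z, bins=40, range=(0, 4))
hist = counts / (n * np.diff(edges))     # pdf estimate using all n samples
mid = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - 1 / (1 + mid) ** 2)))   # small Monte Carlo error
```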

  30. Expectation
     There are two ways to find $E[Z]$ where $Z = g(\mathbf{X})$:
     1. Smart way: $E[Z] = E[g(\mathbf{X})] = \int_{\mathbf{x}} g(\mathbf{x})\, f_{\mathbf{X}}(\mathbf{x})\, d\mathbf{x}$
     2. Dumb way (unless you know the distribution of Z): set $Z = g(\mathbf{X})$, find $f_Z(z)$, then compute $E[Z] = \int_{-\infty}^{\infty} z\, f_Z(z)\, dz$

  31. Expectation (example)
     X and Y uniform on (0,1) and i.i.d. Find E[Z] when $Z = \cos(2\pi(X + Y))$:
     $E[Z] = \int_{0}^{1} \int_{0}^{1} \cos\!\left(2\pi(x + y)\right) dx\, dy = 0$
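     The "smart way" in action, as a sketch added to this transcript: average $g(x, y)$ directly over samples of (X, Y), with no need to first derive the pdf of Z.

```python
# Monte Carlo estimate of E[cos(2*pi*(X + Y))] for X, Y i.i.d. Uniform(0,1).
import numpy as np

rng = np.random.default_rng(3)
x, y = rng.random(1_000_000), rng.random(1_000_000)
print(np.mean(np.cos(2 * np.pi * (x + y))))   # near the exact answer, 0
```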

  32. Expectation (discrete)
     If X and Y are discrete RVs, we can use the probability mass function:
     $E[g(X)] = \sum_{k} g(x_k)\, p_X(x_k)$

  33. Expectation (joint moments)
     The joint moments of X and Y are
     $E[X^j Y^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^j\, y^k\, f_{XY}(x, y)\, dx\, dy$
     If discrete:
     $E[X^j Y^k] = \sum_{i} \sum_{n} x_i^j\, y_n^k\, p_{XY}(x_i, y_n)$

  34. Expectation (correlation)
     The correlation of X and Y is $E[XY]$. If the correlation is 0, X and Y are orthogonal.
     The covariance of X and Y is
     $\mathrm{cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$
     Correlation coefficient:
     $\rho(X, Y) = \dfrac{\mathrm{cov}(X, Y)}{\sigma_X \sigma_Y} = \dfrac{E[XY] - \mu_X \mu_Y}{\sigma_X \sigma_Y}$
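     The sample versions of these moment definitions, as a sketch (not from the slides). The data are constructed so that $\rho(X, Y) = 0.6$ exactly; with zero means, the correlation $E[XY]$ and the covariance coincide.

```python
# Sample correlation E[XY], covariance, and correlation coefficient.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=100_000)
y = 0.6 * x + 0.8 * rng.normal(size=100_000)    # built so that rho(X, Y) = 0.6

correlation = np.mean(x * y)                     # E[XY]
cov = np.mean((x - x.mean()) * (y - y.mean()))   # E[(X - mX)(Y - mY)]
rho = cov / (x.std() * y.std())                  # cov / (sigma_X * sigma_Y)
print(correlation, cov, rho)                     # all near 0.6
```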

  35. Joint Gaussian Random Variables
     $f_{XY}(x, y) = \dfrac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho_{XY}^2}} \exp\!\left\{ -\dfrac{1}{2(1 - \rho_{XY}^2)} \left[ \left(\dfrac{x - m_1}{\sigma_1}\right)^{\!2} - 2\rho_{XY}\, \dfrac{(x - m_1)(y - m_2)}{\sigma_1 \sigma_2} + \left(\dfrac{y - m_2}{\sigma_2}\right)^{\!2} \right] \right\}$
     What does this look like?
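     A sketch (added here, not in the original deck) that evaluates this formula directly and checks it against scipy's multivariate normal; the parameter values $(m_1, m_2, \sigma_1, \sigma_2, \rho)$ are illustrative.

```python
# Bivariate Gaussian density from the slide's formula, checked against scipy.
import numpy as np
from scipy.stats import multivariate_normal

m1, m2, s1, s2, rho = 1.0, -2.0, 2.0, 0.5, 0.7

def f_xy(x, y):
    q = ((x - m1) / s1) ** 2 \
        - 2 * rho * (x - m1) * (y - m2) / (s1 * s2) \
        + ((y - m2) / s2) ** 2
    norm_const = 2 * np.pi * s1 * s2 * np.sqrt(1 - rho ** 2)
    return np.exp(-q / (2 * (1 - rho ** 2))) / norm_const

cov = [[s1 ** 2, rho * s1 * s2], [rho * s1 * s2, s2 ** 2]]
mv = multivariate_normal(mean=[m1, m2], cov=cov)
print(f_xy(0.3, -1.0), mv.pdf([0.3, -1.0]))   # the two values agree
```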

  36. Contours
     If $g(x, y)$ has contours, then $f(g(x, y))$ has the same contours: along the curve $g(x, y) = a$, $f(g(x, y)) = f(a)$.

  37. Joint Gaussian Random Variables
     Thus $f_{XY}(x, y)$ above has the same contours as
     $g(x, y) = \left(\dfrac{x - m_1}{\sigma_1}\right)^{\!2} - 2\rho_{XY}\, \dfrac{(x - m_1)(y - m_2)}{\sigma_1 \sigma_2} + \left(\dfrac{y - m_2}{\sigma_2}\right)^{\!2}$
     This is the equation for an ellipse.

  38. Joint Gaussian Random Variables
     The means ($m_1$ and $m_2$), variances ($\sigma_1^2$ and $\sigma_2^2$), and correlation coefficient $\rho$ uniquely define the 2-D Gaussian. The principal axes of the contour ellipse are tilted by
     $\theta = \dfrac{1}{2} \arctan\!\left(\dfrac{2\rho\, \sigma_1 \sigma_2}{\sigma_1^2 - \sigma_2^2}\right)$
     The marginals are 1-D Gaussian RVs.
     Do Gaussian marginals imply a joint Gaussian RV? When is a joint Gaussian RV a line mass?

  39. Joint Gaussian Random Variables

  40. n Jointly Gaussian RVs
     $f_{\mathbf{X}}(\mathbf{x}) = \dfrac{1}{(2\pi)^{n/2}\, |K|^{1/2}} \exp\!\left\{ -\dfrac{1}{2} (\mathbf{x} - \mathbf{m})^T K^{-1} (\mathbf{x} - \mathbf{m}) \right\}$
     where $\mathbf{m} = E(\mathbf{X})$ and the covariance matrix is $K = E\!\left[ (\mathbf{X} - \mathbf{m})(\mathbf{X} - \mathbf{m})^T \right]$
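     The n-dimensional density written out with numpy.linalg and checked against scipy, as a sketch; the mean vector and the covariance $K$ (positive definite by construction) are illustrative, for $n = 3$.

```python
# n-dimensional Gaussian density from the slide's formula, checked against scipy.
import numpy as np
from scipy.stats import multivariate_normal

m = np.array([1.0, 0.0, -1.0])
A = np.array([[2.0, 0.0, 0.0], [0.5, 1.0, 0.0], [0.3, 0.2, 0.5]])
K = A @ A.T                                  # positive definite by construction

def f_x(x):
    d = x - m
    n = len(m)
    quad = d @ np.linalg.solve(K, d)         # (x - m)^T K^{-1} (x - m)
    return np.exp(-quad / 2) / np.sqrt((2 * np.pi) ** n * np.linalg.det(K))

x = np.array([0.5, 0.5, -0.5])
print(f_x(x), multivariate_normal(mean=m, cov=K).pdf(x))   # agree
```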

  41. n Jointly Gaussian RVs
     The characteristic function:
     $\Phi_{\mathbf{X}}(\boldsymbol{\omega}) = \exp\!\left\{ j\, \boldsymbol{\omega}^T \mathbf{m} - \dfrac{1}{2}\, \boldsymbol{\omega}^T K \boldsymbol{\omega} \right\} = \exp\!\left\{ j \sum_{k=1}^{n} \omega_k m_k - \dfrac{1}{2} \sum_{k=1}^{n} \sum_{\ell=1}^{n} \omega_k \omega_\ell \, \mathrm{COV}(X_k, X_\ell) \right\}$
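     A Monte Carlo check of this closed form, added as a sketch: the sample average of $e^{j\boldsymbol{\omega}^T \mathbf{X}}$ over Gaussian draws should approach $\exp\{j\boldsymbol{\omega}^T \mathbf{m} - \frac{1}{2}\boldsymbol{\omega}^T K \boldsymbol{\omega}\}$. The vectors $\boldsymbol{\omega}$, $\mathbf{m}$ and the matrix $K$ below are illustrative.

```python
# Empirical vs. closed-form characteristic function of a jointly Gaussian vector.
import numpy as np

rng = np.random.default_rng(5)
m = np.array([1.0, -0.5])
K = np.array([[1.0, 0.4], [0.4, 0.5]])
w = np.array([0.7, -0.3])

X = rng.multivariate_normal(m, K, size=1_000_000)
empirical = np.mean(np.exp(1j * X @ w))            # sample E[exp(j w^T X)]
closed_form = np.exp(1j * (w @ m) - 0.5 * (w @ K @ w))
print(empirical, closed_form)                      # close, up to Monte Carlo error
```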
