Joint Distribution Functions, Independence of Random Variables, Covariance, and Correlation
This lecture covers the concepts of joint distribution functions, independence of random variables, covariance, correlation, and standby redundancy analysis in the context of engineering applications. It explores the probability density functions and failure scenarios of components in a standby system, discussing system reliability and failure analysis.
Presentation Transcript
Joint Distribution Functions, Independence of Random Variables, Covariance, and Correlation
ECE 313: Probability with Engineering Applications, Lecture 19
Ravi K. Iyer, Dept. of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign
Iyer - Lecture 19, ECE 313, Spring 2017
Today's Topics and Announcements
Concepts: Joint distributions, Independence, Covariance and correlation, Hypothesis testing
Announcements: Group activity in class next week. The final project will be released today; project schedules are on Compass and Piazza.
Standby Redundancy: Joint/Conditional Probability Distributions
A standby system is one in which two components are connected in parallel, but only one component is required to be operative for the system to function. Initially, power is applied to only one component, and the other is kept in a powered-off (de-energized) state. When the energized component fails, it is de-energized and removed from operation, and the second component is energized and connected in its place. If the first component fails at some time τ, the second component's lifetime starts at τ; if the system then fails at time t, the second component's lifetime is t − τ (with t > τ).
Standby Redundancy (Cont'd)
If we assume that the times to failure of the components are exponentially distributed with parameters λ1 and λ2, then the probability density function for the failure of the first component is:

    f1(τ) = λ1 e^(−λ1 τ),  0 < τ < t

Given that the first component must fail before the lifetime of the second component starts, the density function of the lifetime of the second component is conditional on τ:

    f2(t | τ) = λ2 e^(−λ2 (t − τ))  for t > τ
              = 0                    for t ≤ τ

The joint density of the system failure time t and the switchover time τ then follows from the definition of conditional probability:

    f(t, τ) = f1(τ) f2(t | τ)
Standby Redundancy (Cont'd)
The associated marginal density function of the system failure time t is obtained by integrating out τ:

    f(t) = ∫₀ᵗ f(t, τ) dτ = ∫₀ᵗ (λ1 e^(−λ1 τ)) (λ2 e^(−λ2 (t − τ))) dτ

So the system failure density is:

    f(t) = (λ1 λ2 / (λ1 − λ2)) (e^(−λ2 t) − e^(−λ1 t))

And the reliability function is:

    R(t) = 1 − ∫₀ᵗ f(u) du = 1 − (λ1 λ2 / (λ1 − λ2)) ∫₀ᵗ (e^(−λ2 u) − e^(−λ1 u)) du
         = (λ1 e^(−λ2 t) − λ2 e^(−λ1 t)) / (λ1 − λ2)
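The closed-form density and reliability above can be cross-checked numerically. The sketch below (illustrative, not from the lecture; rates λ1 = 1, λ2 = 2 are arbitrary) integrates f(t) with the trapezoid rule and verifies that 1 minus the integral matches the closed-form R(t):

```python
import math

def f_sys(t, lam1, lam2):
    """Density of the standby-system failure time (requires lam1 != lam2)."""
    return lam1 * lam2 / (lam1 - lam2) * (math.exp(-lam2 * t) - math.exp(-lam1 * t))

def R_sys(t, lam1, lam2):
    """Closed-form reliability: P(system still working at time t)."""
    return (lam1 * math.exp(-lam2 * t) - lam2 * math.exp(-lam1 * t)) / (lam1 - lam2)

lam1, lam2 = 1.0, 2.0          # illustrative failure rates
t, n = 1.5, 20_000             # integrate f over [0, t] with n trapezoids
h = t / n
integral = sum(0.5 * h * (f_sys(i * h, lam1, lam2) + f_sys((i + 1) * h, lam1, lam2))
               for i in range(n))
# R(0) should be 1, and 1 - integral of f should reproduce R(t).
print(round(R_sys(t, lam1, lam2), 6), round(1 - integral, 6))
```

The agreement of the two printed numbers confirms that R(t) is consistent with the derived density f(t).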
Standby Redundancy (Cont'd) [figure slide]
Joint Distribution Functions
So far we have concerned ourselves with the probability distribution of a single random variable; often we are interested in probability statements concerning two or more random variables. For any two random variables X and Y, define the joint cumulative probability distribution function of X and Y by

    F(a, b) = P{X ≤ a, Y ≤ b},  −∞ < a, b < ∞

The distribution of X (the marginal distribution) can be obtained from the joint distribution of X and Y as follows:

    F_X(a) = P{X ≤ a} = P{X ≤ a, Y < ∞} = F(a, ∞)
Joint Distribution Functions (Cont'd)
Similarly,

    F_Y(b) = P{Y ≤ b} = F(∞, b)

When X and Y are both discrete random variables, it is convenient to define the joint probability mass function of X and Y by

    p(x, y) = P{X = x, Y = y}

Probability mass function of X:  p_X(x) = Σ_{y: p(x,y) > 0} p(x, y)
Probability mass function of Y:  p_Y(y) = Σ_{x: p(x,y) > 0} p(x, y)
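The marginal pmf formulas above amount to summing the joint table along one axis. A minimal sketch with a hypothetical joint pmf (the table values are invented for illustration):

```python
# Hypothetical joint pmf of discrete X and Y, stored as {(x, y): p(x, y)}.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.25, (1, 1): 0.15,
    (2, 0): 0.05, (2, 1): 0.25,
}
assert abs(sum(joint.values()) - 1.0) < 1e-12  # a valid pmf sums to 1

# Marginals: p_X(x) = sum over y of p(x, y), and symmetrically for p_Y.
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

print(p_x)  # X takes values 0, 1, 2 with probabilities 0.30, 0.40, 0.30
print(p_y)
```

Each marginal again sums to 1, since every cell of the joint table is counted exactly once.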
Joint Distribution Functions (Cont'd)
We say that X and Y are jointly continuous if there exists a function f(x, y), defined for all real x and y, such that for all sets A and B of real numbers

    P{X ∈ A, Y ∈ B} = ∫_B ∫_A f(x, y) dx dy

f(x, y) is called the joint probability density function of X and Y. The probability density of X is obtained from

    P{X ∈ A} = P{X ∈ A, Y ∈ (−∞, ∞)} = ∫_{−∞}^{∞} ∫_A f(x, y) dx dy = ∫_A f_X(x) dx

where

    f_X(x) = ∫_{−∞}^{∞} f(x, y) dy

is the probability density function of X. The probability density function of Y is

    f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx

Also,

    F(a, b) = P{X ≤ a, Y ≤ b} = ∫_{−∞}^{b} ∫_{−∞}^{a} f(x, y) dx dy
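The marginal-density formula f_X(x) = ∫ f(x, y) dy can be checked numerically. The sketch below uses the standard textbook density f(x, y) = x + y on the unit square (not an example from this lecture), whose marginal is f_X(x) = x + 1/2:

```python
def f(x, y):
    """Joint density f(x, y) = x + y on the unit square, 0 elsewhere."""
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_x(x, n=10_000):
    """Trapezoid-rule approximation of the integral of f(x, y) dy over [0, 1]."""
    h = 1.0 / n
    return sum(0.5 * h * (f(x, i * h) + f(x, (i + 1) * h)) for i in range(n))

# Analytically, f_X(x) = x + 1/2 on [0, 1]; at x = 0.3 this is 0.8.
print(round(marginal_x(0.3), 4))
```

Since the integrand is linear in y, the trapezoid rule is exact here up to floating-point error.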
Joint Distribution Functions (Cont'd)
Proposition: if X and Y are random variables and g is a function of two variables, then

    E[g(X, Y)] = Σ_y Σ_x g(x, y) p(x, y)                       (discrete case)
               = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy   (continuous case)

For example, if g(X, Y) = X + Y, then in the continuous case

    E[X + Y] = ∫∫ (x + y) f(x, y) dx dy
             = ∫∫ x f(x, y) dx dy + ∫∫ y f(x, y) dx dy
             = E[X] + E[Y]
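Note that this derivation never uses independence: E[X + Y] = E[X] + E[Y] holds even for dependent X and Y. A small Monte Carlo sketch (illustrative distributions, not from the lecture) with Y deliberately made dependent on X:

```python
import random

rng = random.Random(1)

# Dependent pair: X uniform on [0, 1], Y = X**2 plus independent uniform noise.
xs, ys = [], []
for _ in range(100_000):
    x = rng.uniform(0.0, 1.0)
    y = x * x + rng.uniform(0.0, 1.0)
    xs.append(x)
    ys.append(y)

e_x = sum(xs) / len(xs)                                  # ~ 1/2
e_y = sum(ys) / len(ys)                                  # ~ 1/3 + 1/2 = 5/6
e_sum = sum(x + y for x, y in zip(xs, ys)) / len(xs)     # ~ 4/3
print(round(e_sum, 3), round(e_x + e_y, 3))
```

On the same sample the two estimates agree exactly, and both are close to the analytic value E[X] + E[Y] = 1/2 + 5/6 = 4/3.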
Joint Distribution Functions (Cont'd)
where the first integral is evaluated by using the foregoing proposition with g(x, y) = x and the second with g(x, y) = y. The same holds in the discrete case, and more generally

    E[aX + bY] = aE[X] + bE[Y]

Joint probability distributions may also be defined for n random variables. If X_1, X_2, ..., X_n are n random variables, then for any n constants a_1, a_2, ..., a_n,

    E[a_1 X_1 + a_2 X_2 + ... + a_n X_n] = a_1 E[X_1] + a_2 E[X_2] + ... + a_n E[X_n]
Example 1
A batch of 1M RAM chips is purchased from two different semiconductor houses. Let X and Y denote the times to failure of the chips purchased from the two suppliers. The joint probability density of X and Y is estimated by:

    f(x, y) = λμ e^(−(λx + μy)),  x > 0, y > 0
            = 0,                  otherwise

Assume λ = 10⁻⁶ per hour and μ = 5 × 10⁻⁶ per hour. Determine the probability that the time to failure is greater for chips characterized by X than for chips characterized by Y.
Example 1 (Cont'd)
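The solution slide did not survive extraction, but the quantity asked for has a standard closed form: for independent X ~ Exp(λ) and Y ~ Exp(μ), P{X > Y} = μ/(λ + μ). The sketch below (the numerical rates are assumptions, reconstructed from the partially garbled slide) checks this against a Monte Carlo estimate:

```python
import random

def p_x_greater_y(lam, mu, n=200_000, seed=2):
    """Monte Carlo estimate of P(X > Y) for independent X ~ Exp(lam), Y ~ Exp(mu)."""
    rng = random.Random(seed)
    hits = sum(rng.expovariate(lam) > rng.expovariate(mu) for _ in range(n))
    return hits / n

lam, mu = 1e-6, 5e-6            # assumed failure rates per hour
analytic = mu / (lam + mu)      # standard result: P(X > Y) = mu / (lam + mu)
estimate = p_x_greater_y(lam, mu)
print(round(analytic, 4), round(estimate, 3))
```

Note that the answer depends only on the ratio of the two rates: with μ = 5λ, P{X > Y} = 5/6 regardless of the time units.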