
Theory of Probability and Statistics II Explained
Explore the Theory of Probability with detailed explanations of random experiments, sample spaces, Bayes' formula, addition law, and Boole's inequality. Understand concepts through clear examples and proofs in this comprehensive overview. Perfect for students and enthusiasts alike.
Presentation Transcript
STATISTICS-II by Dr. S. Seyadali Fathima, Assistant Professor, Department of Mathematics. III B.Sc Mathematics.
THEORY OF PROBABILITY
Random Experiment: A random experiment is an experiment in which the set of all possible outcomes is known, but the exact outcome is not known. Ex:- Tossing a coin, rolling a die.
Sample Space: The set of all possible outcomes of a random experiment is called the sample space. It is denoted by S. Ex:- When tossing a coin, S={H,T}; when rolling a die, S={1,2,3,4,5,6}.
Sample Point: Each element of a sample space is called a sample point.
BAYES' THEOREM
Let {Ai} be a sequence of mutually exclusive and exhaustive events in a sample space S such that P(Ai)>0 for all i. Let B be any event with P(B)>0. Then
P(Ai/B) = [P(Ai) P(B/Ai)] / [Σj P(Aj) P(B/Aj)].
Proof:- By the definition of conditional probability we have,
P(Ai/B) = [P(Ai) P(B/Ai)] / P(B) -------(1)
We claim that P(B) = Σj P(Aj) P(B/Aj).
Since the events Ai are mutually exclusive and exhaustive, we have ∪Ai = S and the Ai are disjoint.
Therefore, B = B∩S = B∩(∪Ai) = ∪(B∩Ai)
P(B) = P[∪(B∩Ai)] = Σ P(B∩Ai)
P(B) = Σ P(Ai) P(B/Ai) [since P(B/Ai) = P(B∩Ai)/P(Ai)]
Substituting P(B) in equation (1) we get,
P(Ai/B) = [P(Ai) P(B/Ai)] / [Σj P(Aj) P(B/Aj)].
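The theorem above can be checked numerically. Below is a minimal Python sketch; the three "machines" and their defect rates are made-up illustrative numbers, not part of the lecture.

```python
# Hedged sketch of Bayes' formula for a partition {A_i} of the sample space.
# The priors and likelihoods below are illustrative assumptions.

def bayes(prior, likelihood, i):
    """P(A_i / B) = P(A_i) P(B/A_i) / sum_j P(A_j) P(B/A_j)."""
    total = sum(p * l for p, l in zip(prior, likelihood))  # total probability P(B)
    return prior[i] * likelihood[i] / total

# Three machines produce 50%, 30%, 20% of items; defect rates 1%, 2%, 3%.
prior = [0.5, 0.3, 0.2]          # P(A_i)
likelihood = [0.01, 0.02, 0.03]  # P(B / A_i), where B = "item is defective"

posteriors = [bayes(prior, likelihood, i) for i in range(3)]
p0 = posteriors[0]               # P(A_1 / B) = 0.005 / 0.017
```

Note that the posteriors sum to 1, as they must for a partition.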
ADDITION LAW
If A and B are any two events of a sample space, then P(A∪B) = P(A) + P(B) − P(A∩B).
Proof:- A∪B = A ∪ (Ā∩B), and A and Ā∩B are disjoint.
P(A∪B) = P(A) + P(Ā∩B) ---------(1)
Now, B = (A∩B) ∪ (Ā∩B), and A∩B and Ā∩B are disjoint events.
P(B) = P(A∩B) + P(Ā∩B)
P(Ā∩B) = P(B) − P(A∩B) ------------(2)
Substituting (2) in (1) we get,
P(A∪B) = P(A) + P(B) − P(A∩B).
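The addition law can be verified by direct enumeration on a die roll; the events A and B below are illustrative choices.

```python
# Sketch verifying P(A ∪ B) = P(A) + P(B) − P(A ∩ B) on one die roll.
S = set(range(1, 7))   # sample space of a die
A = {2, 4, 6}          # event: even outcome
B = {1, 2, 3}          # event: outcome at most 3

def P(E):
    # classical probability on the finite sample space S
    return len(E & S) / len(S)

lhs = P(A | B)                       # P(A ∪ B) = 5/6
rhs = P(A) + P(B) - P(A & B)         # 1/2 + 1/2 − 1/6
```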
Let A1, A2, …, An be n events in a sample space S. Then
P(A1∩A2∩…∩An) = P(A1) P(A2/A1) P(A3/A1∩A2) … P(An/A1∩A2∩…∩An−1).
Proof:- P(A1) P(A2/A1) … P(An/A1∩A2∩…∩An−1)
= P(A1) · [P(A1∩A2)/P(A1)] · … · [P(A1∩A2∩…∩An)/P(A1∩A2∩…∩An−1)]
= P(A1∩A2∩…∩An).
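The multiplication law above can likewise be checked by enumeration; the three two-dice events here are illustrative.

```python
# Sketch verifying P(A1 ∩ A2 ∩ A3) = P(A1) P(A2/A1) P(A3/A1 ∩ A2).
import itertools

S = list(itertools.product(range(1, 7), repeat=2))  # two dice
A1 = {s for s in S if s[0] % 2 == 0}     # first die even
A2 = {s for s in S if sum(s) >= 7}       # total at least 7
A3 = {s for s in S if s[1] <= 3}         # second die at most 3

def P(E):
    return len(E) / len(S)

def cond(E, F):
    # conditional probability P(E / F) = P(E ∩ F) / P(F)
    return P(E & F) / P(F)

lhs = P(A1 & A2 & A3)                              # 4/36
rhs = P(A1) * cond(A2, A1) * cond(A3, A1 & A2)
```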
BOOLE'S INEQUALITY
i. If A and B are any two events in a sample space, then P(A∩B) ≥ 1 − P(Ā) − P(B̄).
ii. Generalized Boole's Inequality: For events A1, A2, …, An, … in a sample space, P(∩Ai) ≥ 1 − Σ P(Āi).
Proof:-
1) For any two events A and B we have
P(A∪B) = P(A) + P(B) − P(A∩B)
P(A∩B) = P(A) + P(B) − P(A∪B)
= (1 − P(Ā)) + (1 − P(B̄)) − P(A∪B)
= 1 − P(Ā) − P(B̄) + [1 − P(A∪B)]
≥ 1 − P(Ā) − P(B̄) [since 0 ≤ P(A∪B) ≤ 1]
2) Let B1 = ∩(i≥2) Ai, so that ∩Ai = A1 ∩ B1.
Now, P(∩Ai) = P(A1∩B1)
≥ 1 − P(Ā1) − P(B̄1) [By (1)]
= 1 − P(Ā1) − P(∪(i≥2) Āi)
≥ 1 − P(Ā1) − Σ(i≥2) P(Āi) [since P(∪Āi) ≤ Σ P(Āi)]
= 1 − Σ P(Āi).
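A quick numerical check of the generalized inequality, on three illustrative two-dice events:

```python
# Sketch checking P(A1 ∩ A2 ∩ A3) ≥ 1 − Σ P(Āi) by enumeration.
import itertools

S = list(itertools.product(range(1, 7), repeat=2))  # two dice
A1 = {s for s in S if s[0] <= 4}        # first die at most 4
A2 = {s for s in S if s[1] >= 2}        # second die at least 2
A3 = {s for s in S if sum(s) <= 10}     # total at most 10

def P(E):
    return len(E) / len(S)

lhs = P(A1 & A2 & A3)                                   # 20/36
rhs = 1 - (1 - P(A1)) - (1 - P(A2)) - (1 - P(A3))       # 5/12
```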
F(x) = P(X ≤ x) = Σ(xi ≤ x) f(xi), where f(xi) = P(X = xi).
If X takes only a finite number of values {x1, x2, …, xn} where x1 < x2 < … < xn, then
F(x) = 0, for −∞ < x < x1
= f(x1), for x1 ≤ x < x2
= f(x1) + f(x2), for x2 ≤ x < x3
…
= f(x1) + f(x2) + … + f(xn), for xn ≤ x < ∞.
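The step-function F(x) above is easy to sketch in code; the pmf values used here are illustrative.

```python
# Sketch of the distribution function F(x) = Σ_{xi ≤ x} f(xi)
# for a discrete random variable with an illustrative pmf.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}   # f(xi) = P(X = xi)

def F(x):
    # sum the probability of every sample point not exceeding x
    return sum(p for xi, p in pmf.items() if xi <= x)
```

F jumps by f(xi) at each xi and reaches 1 past the largest value.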
A random variable X is said to be a continuous random variable if it can take any value in an interval, which may be finite or infinite. Let X be a continuous random variable taking values in the interval (−∞, ∞). Let f(x) be a function such that
i. f(x) is integrable on (−∞, ∞),
ii. f(x) ≥ 0 for all x ∈ (−∞, ∞),
iii. ∫ f(x) dx = 1, the integral taken over (−∞, ∞).
Then f(x) is called the p.d.f of the continuous random variable X.
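The p.d.f conditions can be checked numerically for a concrete density. The density f(x) = 2x on (0, 1) below is an illustrative example, not one from the slides.

```python
# Sketch checking the p.d.f conditions for f(x) = 2x on (0, 1), 0 elsewhere.
def f(x):
    return 2 * x if 0 < x < 1 else 0.0

# crude midpoint-rule approximation of the integral of f over its support
n = 100000
h = 1.0 / n
integral = sum(f((i + 0.5) * h) * h for i in range(n))

nonneg = all(f(x) >= 0 for x in [-1.0, 0.25, 0.5, 0.75, 2.0])
```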
PROBLEM
A random variable X has the following probability distribution:

xi:    -2    -1    0    1    2    3
p(xi): 0.1   k    0.2   2k  0.3   k

Find 1) the value of k, 2) Mean, 3) Variance, 4) P(X ≥ 2), 5) P(X < 2), 6) P(−1 < X < 3).
SOLUTION:
1) Σpi = 1
0.1 + k + 0.2 + 2k + 0.3 + k = 1
0.6 + 4k = 1
4k = 1 − 0.6 = 0.4
k = 0.1
2) Mean: E(X) = Σ xi pi = (−2)(0.1) + (−1)(0.1) + (0)(0.2) + (1)(0.2) + (2)(0.3) + (3)(0.1)
= −0.2 − 0.1 + 0 + 0.2 + 0.6 + 0.3
E(X) = 0.8
3) Variance: E(X²) = (4)(0.1) + (1)(0.1) + (0)(0.2) + (1)(0.2) + (4)(0.3) + (9)(0.1)
= 0.4 + 0.1 + 0 + 0.2 + 1.2 + 0.9
= 2.8
σ² = E(X²) − (E(X))² = 2.8 − (0.8)² = 2.8 − 0.64 = 2.16
4) P(X ≥ 2) = P(X=2) + P(X=3) = 0.3 + 0.1 = 0.4
5) P(X < 2) = P(X=−2) + P(X=−1) + P(X=0) + P(X=1) = 0.1 + 0.1 + 0.2 + 0.2 = 0.6
6) P(−1 < X < 3) = P(X=0) + P(X=1) + P(X=2) = 0.2 + 0.2 + 0.3 = 0.7
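The worked problem above can be redone in a few lines of code, using the values from the table with k = 0.1:

```python
# The problem's pmf, with k = 0.1 substituted in.
xs = [-2, -1, 0, 1, 2, 3]
k = 0.1
ps = [0.1, k, 0.2, 2 * k, 0.3, k]

mean = sum(x * p for x, p in zip(xs, ps))              # E(X)
ex2 = sum(x * x * p for x, p in zip(xs, ps))           # E(X^2)
var = ex2 - mean ** 2                                  # Var(X) = E(X^2) − (E(X))^2
p_ge2 = sum(p for x, p in zip(xs, ps) if x >= 2)       # P(X ≥ 2)
p_lt2 = sum(p for x, p in zip(xs, ps) if x < 2)        # P(X < 2)
p_mid = sum(p for x, p in zip(xs, ps) if -1 < x < 3)   # P(−1 < X < 3)
```

The results match the hand calculation: mean 0.8, variance 2.16, and probabilities 0.4, 0.6, 0.7.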
Let X be a discrete random variable which can assume any of the values x1, x2, …, xn, … with corresponding probabilities Pi = P(X = xi), i = 1, 2, 3, … Then the mathematical expectation of X, denoted by E(X), is E(X) = Σ Pi xi, provided the series is absolutely convergent.
If X and Y are random variables, then:
1) E(C) = C, where C is a constant.
2) E(CX) = C E(X), where C is a constant.
3) E(aX + b) = a E(X) + b.
4) E(X + Y) = E(X) + E(Y).
5) E(XY) = E(X) E(Y) if X and Y are independent random variables.
6) Let X be a discrete random variable having p.d.f Pi, and let φ(x) be a function of X. Then E(φ(X)) = Σ Pi φ(xi).
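Properties 3) and 6) can be checked directly on a small pmf; the distribution and the constants a, b below are illustrative.

```python
# Sketch checking E(aX + b) = a E(X) + b and E(φ(X)) = Σ Pi φ(xi)
# on an illustrative discrete distribution.
xs = [0, 1, 2]
ps = [0.5, 0.3, 0.2]

def E(g):
    # expectation of g(X): Σ g(xi) Pi
    return sum(g(x) * p for x, p in zip(xs, ps))

a, b = 3.0, 1.0
lhs = E(lambda x: a * x + b)       # E(aX + b)
rhs = a * E(lambda x: x) + b       # a E(X) + b, with E(X) = 0.7
```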
CUMULANT GENERATING FUNCTION (C.G.F)
DEFINITION: The cumulant generating function Kx(t) of a random variable X is defined by Kx(t) = loge Mx(t), provided the right hand side can be expanded as a convergent series in powers of t. Hence
Kx(t) = loge Mx(t) = k1 t + k2 t^2/2! + … + kr t^r/r! + …
and kr, the coefficient of t^r/r! in Kx(t), is called the rth cumulant of X.
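Since kr is the coefficient of t^r/r!, the first two cumulants are K′(0) (the mean) and K″(0) (the variance). A rough numerical sketch, using the binomial m.g.f Mx(t) = (q + pe^t)^n as an example and finite differences to approximate the derivatives:

```python
# Sketch: recovering k1 = K'(0) and k2 = K''(0) from K(t) = log M(t)
# by finite differences; the binomial c.g.f is used for illustration.
from math import exp, log

n, p = 10, 0.4
q = 1 - p

def K(t):
    # c.g.f of B(n, p): n log(q + p e^t)
    return n * log(q + p * exp(t))

h = 1e-3
k1 = (K(h) - K(-h)) / (2 * h)              # ≈ K'(0) = mean np = 4.0
k2 = (K(h) - 2 * K(0) + K(-h)) / h ** 2    # ≈ K''(0) = variance npq = 2.4
```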
BINOMIAL DISTRIBUTION
Let n be any positive integer and let 0 < p < 1. Let q = 1 − p. Define
p(x) = nCx p^x q^(n−x), if x = 0, 1, 2, …, n
= 0, otherwise.
A discrete random variable with the above p.d.f is said to have a binomial distribution, and the p.d.f itself is called a binomial distribution.
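The p.d.f above is straightforward to compute; a minimal sketch with illustrative parameters n = 5, p = 0.4:

```python
# Sketch of the binomial p.d.f p(x) = nCx p^x q^(n−x).
from math import comb

def binom_pmf(x, n, p):
    q = 1 - p
    if 0 <= x <= n:
        return comb(n, x) * p ** x * q ** (n - x)
    return 0.0   # zero outside x = 0, 1, ..., n

n, p = 5, 0.4
total = sum(binom_pmf(x, n, p) for x in range(n + 1))   # should equal 1
```

The probabilities sum to 1 because Σ nCx p^x q^(n−x) = (p + q)^n = 1 by the binomial theorem.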
NOTE 1: The two independent constants n and p in the distribution are known as the parameters of the distribution. If X is a binomial variate with parameters n and p, we write X ~ B(n, p).
NOTE 2: The probability distribution of a binomial distribution is obtained by considering n independent trials of a random experiment whose outcome is success or failure.
ADDITION PROPERTY OF BINOMIAL DISTRIBUTION
If x1 ~ B(n1, p) and x2 ~ B(n2, p) are independent random variables, then x1 + x2 is B(n1 + n2, p).
Proof:- Given x1 and x2 are independent random variables with parameters (n1, p) and (n2, p) respectively. Consider the m.g.f of x1 and x2 about the origin:
Mx1(t) = (q + pe^t)^n1 and Mx2(t) = (q + pe^t)^n2.
Now, Mx1+x2(t) = Mx1(t) · Mx2(t) [since x1 and x2 are independent]
= (q + pe^t)^n1 (q + pe^t)^n2
= (q + pe^t)^(n1+n2)
= m.g.f of a binomial variate with parameters n1 + n2 and p.
Hence by the uniqueness theorem, x1 + x2 is a binomial variable with parameters n1 + n2 and p.
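The addition property can be verified numerically: convolving the pmfs of two independent binomials with the same p reproduces the B(n1 + n2, p) pmf. The parameters below are illustrative.

```python
# Sketch checking that B(n1, p) + B(n2, p) has the B(n1 + n2, p) pmf.
from math import comb

def pmf(n, p):
    return [comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(n + 1)]

def convolve(f, g):
    # pmf of X1 + X2 for independent X1, X2 with pmfs f, g
    h = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            h[i + j] += fi * gj
    return h

n1, n2, p = 3, 4, 0.3
lhs = convolve(pmf(n1, p), pmf(n2, p))   # distribution of x1 + x2
rhs = pmf(n1 + n2, p)                    # B(n1 + n2, p)
```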
MODE OF BINOMIAL DISTRIBUTION
CASE 1: If (n+1)p is not an integer, the mode is the integer part of (n+1)p, i.e. [(n+1)p], and the distribution is unimodal.
CASE 2: If (n+1)p is an integer, both (n+1)p and (n+1)p − 1 are modes, and the distribution is bimodal.
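Case 1 can be checked by finding the largest pmf value directly; the parameters n = 10, p = 0.3 below are illustrative (here (n+1)p = 3.3 is not an integer).

```python
# Sketch checking the mode rule: for (n+1)p not an integer,
# the mode is [(n+1)p], the integer part.
from math import comb, floor

def pmf(n, p):
    return [comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(n + 1)]

n, p = 10, 0.3                 # (n+1)p = 3.3, not an integer
probs = pmf(n, p)
mode = max(range(n + 1), key=probs.__getitem__)   # x with largest p(x)
```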
REFERENCE BOOK: STATISTICS II by Dr. S. Arumugam and Mr. A. Thangapandi Isaac.