Multivariate Linear Regression and Least Squares Estimation


Explore the concepts of multivariate linear regression and least squares estimation in this detailed analysis. Learn about model construction, data interpretation, and sums of squares decomposition. Gain insights into correcting errors and optimizing predictions for your regression models.

  • Regression Analysis
  • Multivariate Regression
  • Least Squares
  • Model Interpretation
  • Statistical Analysis


Presentation Transcript


  1. Multivariate Linear Regression

  2. Multiple Regression with a Single Response and r Predictors

Model:

$$Y = \beta_0 + \beta_1 Z_1 + \cdots + \beta_r Z_r + \varepsilon$$

Data:

$$Y_j = \beta_0 + \beta_1 z_{j1} + \cdots + \beta_r z_{jr} + \varepsilon_j \quad j = 1,\ldots,n \qquad E(\varepsilon_j) = 0 \quad V(\varepsilon_j) = \sigma^2 \quad \mathrm{COV}(\varepsilon_j, \varepsilon_{j'}) = 0 \;\; (j \neq j')$$

Matrix form: $Y = Z\beta + \varepsilon$, $E(\varepsilon) = 0$, $V(\varepsilon) = \sigma^2 I$, where

$$Y = \begin{bmatrix} Y_1 \\ \vdots \\ Y_n \end{bmatrix} \qquad Z = \begin{bmatrix} 1 & z_{11} & \cdots & z_{1r} \\ \vdots & \vdots & & \vdots \\ 1 & z_{n1} & \cdots & z_{nr} \end{bmatrix} \qquad \beta = \begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_r \end{bmatrix}$$

and $Z$ is assumed to be of full column rank. Observed data: $y_1,\ldots,y_n$; $z_{11},\ldots,z_{1r},\ldots,z_{n1},\ldots,z_{nr}$.

For arbitrary values $b = [b_0, b_1, \ldots, b_r]'$, the differences between the observed data and the predictions based on $b$ are $y_j - (b_0 + b_1 z_{j1} + \cdots + b_r z_{jr})$, and the sum of squared differences is

$$S(b) = \sum_{j=1}^n \left(y_j - b_0 - b_1 z_{j1} - \cdots - b_r z_{jr}\right)^2$$

  3. Least Squares Estimation (No Linear Dependencies among Predictors)

Sum of squared differences:

$$S(b) = (y - Zb)'(y - Zb) = y'y - 2y'Zb + b'Z'Zb$$

Goal: choose $b$ to minimize $S(b)$. Note: $\partial(a'x)/\partial x = a$ and $\partial(x'Ax)/\partial x = (A + A')x = 2Ax$ for symmetric $A$.

Setting the derivative to $0$ and solving for the minimizing value:

$$\frac{\partial S(b)}{\partial b} = -2Z'y + 2Z'Zb = 0 \;\Rightarrow\; Z'Z\hat{\beta} = Z'y \;\Rightarrow\; \hat{\beta} = (Z'Z)^{-1}Z'y$$

Fitted (predicted) values: $\hat{y} = Z\hat{\beta} = Z(Z'Z)^{-1}Z'y = Hy$, where $H = Z(Z'Z)^{-1}Z'$ is symmetric and idempotent: $H' = H$, $HH = H$.

Residuals (observed minus fitted values): $\hat{\varepsilon} = y - \hat{y} = (I - H)y$, with $(I - H)'(I - H) = I - H$. The residuals are orthogonal to the predictors and to the fitted values:

$$Z'\hat{\varepsilon} = Z'y - Z'Z(Z'Z)^{-1}Z'y = 0 \qquad \hat{y}'\hat{\varepsilon} = y'(H - HH)y = 0$$

Error sum of squares: $SSE = \hat{\varepsilon}'\hat{\varepsilon} = y'(I - H)y$.
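As a numerical sanity check of the formulas on this slide, here is a minimal NumPy sketch. The data are simulated, and every name (Z, y, beta_hat, the seed, the sample sizes) is illustrative rather than taken from the slides; later sketches in this transcript reuse these objects.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n = 50 units, r = 2 predictors (settings are illustrative)
n, r = 50, 2
Z = np.column_stack([np.ones(n), rng.normal(size=(n, r))])   # n x (r+1); first column is 1s
beta_true = np.array([1.0, 2.0, -0.5])
y = Z @ beta_true + rng.normal(scale=0.8, size=n)

# Normal equations Z'Z b = Z'y  =>  beta_hat = (Z'Z)^{-1} Z'y
beta_hat = np.linalg.solve(Z.T @ Z, Z.T @ y)

# Hat matrix H = Z (Z'Z)^{-1} Z'; fitted values and residuals
H = Z @ np.linalg.solve(Z.T @ Z, Z.T)
y_hat = H @ y
resid = y - y_hat

# Orthogonality: Z'e = 0 and yhat'e = 0 (up to floating-point rounding)
print(np.abs(Z.T @ resid).max(), abs(y_hat @ resid))
print("SSE =", resid @ resid)                                # y'(I - H)y
```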

  4. Sums of Squares Decomposition

Since $y = \hat{y} + \hat{\varepsilon}$ and $\hat{y}'\hat{\varepsilon} = 0$:

$$y'y = (\hat{y} + \hat{\varepsilon})'(\hat{y} + \hat{\varepsilon}) = \hat{y}'\hat{y} + 2\hat{y}'\hat{\varepsilon} + \hat{\varepsilon}'\hat{\varepsilon} = \hat{y}'\hat{y} + \hat{\varepsilon}'\hat{\varepsilon}$$

$$\sum_{j=1}^n y_j^2 = \sum_{j=1}^n \hat{y}_j^2 + \sum_{j=1}^n \hat{\varepsilon}_j^2 \qquad SS_{TOTAL\,(UNCORRECTED)} = SS_{MODEL} + SS_{ERROR}$$

Correction for the mean: $CM = n\bar{y}^2 = \frac{1}{n}y'Jy$, where $J$ is the $n \times n$ matrix of 1s.

$$SS_{TOTAL\,(CORRECTED)} = SS_{TOTAL\,(UNCORRECTED)} - CM = y'\left(I - \tfrac{1}{n}J\right)y = \sum_{j=1}^n (y_j - \bar{y})^2$$

$$SS_{REGRESSION} = SS_{MODEL} - CM = \hat{y}'\hat{y} - n\bar{y}^2 = y'\left(H - \tfrac{1}{n}J\right)y = \sum_{j=1}^n (\hat{y}_j - \bar{y})^2$$

$$SS_{ERROR} = \sum_{j=1}^n (y_j - \hat{y}_j)^2 = y'(I - H)y$$

$$SS_{TOTAL\,(CORRECTED)} = SS_{REGRESSION} + SS_{ERROR} \qquad df:\; n - 1 = r + (n - r - 1)$$
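Continuing the sketch above, the decomposition on this slide can be verified directly (same simulated objects; nothing here is specific to any real dataset):

```python
# Sums of squares from the fitted model above
SS_total_u = y @ y                       # uncorrected total SS
SS_model   = y_hat @ y_hat
SS_error   = resid @ resid
CM         = n * y.mean() ** 2           # correction for the mean

SS_total_c = SS_total_u - CM             # sum (y_j - ybar)^2
SS_reg     = SS_model - CM               # sum (yhat_j - ybar)^2

# SS_total_corrected = SS_regression + SS_error, with df n-1 = r + (n-r-1)
print(np.isclose(SS_total_c, SS_reg + SS_error))
```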

  5. Geometry of Least Squares

$y$ = response vector (a point in $n$-dimensional space); $Z\beta$ = a vector in the model plane (a straight line when $r + 1 = 2$); $\varepsilon$ = error vector.

$\hat{y} = Z\hat{\beta}$ is the projection of $y$ onto the plane made up of all linear combinations of the columns of $Z$; $\hat{\varepsilon} = y - Z\hat{\beta}$ is perpendicular to that plane.

Spectral decomposition of $Z'Z$ (full rank):

$$Z'Z = \sum_{i=1}^{r+1} \lambda_i e_i e_i' \quad \lambda_1 \geq \cdots \geq \lambda_{r+1} > 0 \qquad (Z'Z)^{-1} = \sum_{i=1}^{r+1} \lambda_i^{-1} e_i e_i'$$

$$Z(Z'Z)^{-1}Z' = \sum_{i=1}^{r+1} \lambda_i^{-1} Z e_i e_i' Z' = \sum_{i=1}^{r+1} q_i q_i' \qquad q_i = \lambda_i^{-1/2} Z e_i$$

Projection of $y$ on a linear combination of $q_1, \ldots, q_{r+1}$:

$$\hat{y} = \sum_{i=1}^{r+1} (q_i' y)\, q_i = Z(Z'Z)^{-1}Z'y = Z\hat{\beta}$$

$(I - Z(Z'Z)^{-1}Z')$ projects $y$ onto the plane perpendicular to the plane spanned by the columns of $Z$.

  6. Least Squares Estimators Sampling Properties - I

$Y = Z\beta + \varepsilon$, $E(\varepsilon) = 0$, $V(\varepsilon) = \sigma^2 I$ $\Rightarrow$ $E(Y) = Z\beta$, $V(Y) = \sigma^2 I$.

For a matrix of constants $A$: $E(AY) = A\,E(Y)$, and

$$V(AY) = E\left[(AY - E(AY))(AY - E(AY))'\right] = A\,E\left[(Y - E(Y))(Y - E(Y))'\right]A' = A\,V(Y)\,A'$$

Applying these to $\hat{\beta} = (Z'Z)^{-1}Z'Y$:

$$E(\hat{\beta}) = (Z'Z)^{-1}Z'E(Y) = (Z'Z)^{-1}Z'Z\beta = \beta$$

$$V(\hat{\beta}) = (Z'Z)^{-1}Z'\,V(Y)\,Z(Z'Z)^{-1} = \sigma^2 (Z'Z)^{-1}Z'IZ(Z'Z)^{-1} = \sigma^2 (Z'Z)^{-1}$$

Residuals: $\hat{\varepsilon} = [I - Z(Z'Z)^{-1}Z']Y = (I - H)Y$, with

$$E(\hat{\varepsilon}) = (I - H)Z\beta = Z\beta - Z(Z'Z)^{-1}Z'Z\beta = 0$$

$$V(\hat{\varepsilon}) = (I - H)\,\sigma^2 I\,(I - H)' = \sigma^2 (I - H) \quad \text{(since } I - H \text{ is symmetric and idempotent)}$$

  7. Least Squares Estimators Sampling Properties - II

For a quadratic form: $E(Y'AY) = \mathrm{trace}[A\,V(Y)] + E(Y)'A\,E(Y)$. Hence

$$E(\hat{\varepsilon}'\hat{\varepsilon}) = E\left[Y'(I - H)Y\right] = \mathrm{trace}\left[(I - H)\sigma^2 I\right] + \beta'Z'(I - H)Z\beta$$

$$\sigma^2\,\mathrm{trace}(I - H) = \sigma^2\left[n - \mathrm{trace}\!\left(Z(Z'Z)^{-1}Z'\right)\right] = \sigma^2\left[n - \mathrm{trace}\!\left((Z'Z)^{-1}Z'Z\right)\right] = \sigma^2\left[n - (r + 1)\right]$$

$$\beta'Z'(I - H)Z\beta = \beta'Z'Z\beta - \beta'Z'Z(Z'Z)^{-1}Z'Z\beta = 0$$

Therefore $E(\hat{\varepsilon}'\hat{\varepsilon}) = \sigma^2\left[n - (r+1)\right]$, and

$$s^2 = \frac{\hat{\varepsilon}'\hat{\varepsilon}}{n - (r+1)} = \frac{Y'(I - H)Y}{n - (r+1)} \qquad E(s^2) = \sigma^2$$

Covariance of two linear forms: $\mathrm{COV}(AY, BY) = A\,V(Y)\,B'$. Hence

$$\mathrm{COV}(\hat{\beta}, \hat{\varepsilon}) = (Z'Z)^{-1}Z'\,\sigma^2 I\,(I - H)' = \sigma^2\left[(Z'Z)^{-1}Z' - (Z'Z)^{-1}Z'Z(Z'Z)^{-1}Z'\right] = 0$$

  8. Gauss Theorem

Parameter (linear function of $\beta$): $c'\beta = c_0\beta_0 + c_1\beta_1 + \cdots + c_r\beta_r$.

Estimator: $c'\hat{\beta} = c'(Z'Z)^{-1}Z'Y = a'Y$ with $a' = c'(Z'Z)^{-1}Z'$, so that

$$E(c'\hat{\beta}) = c'\beta \qquad V(c'\hat{\beta}) = \sigma^2 c'(Z'Z)^{-1}Z'IZ(Z'Z)^{-1}c = \sigma^2 c'(Z'Z)^{-1}c = \sigma^2 a'a$$

Alternative (linear, unbiased) estimator: $d'Y$ with $E(d'Y) = d'Z\beta = c'\beta$ for all $\beta$, which forces $d'Z = c'$. Then

$$V(d'Y) = \sigma^2 d'd = \sigma^2 (a + d - a)'(a + d - a) = \sigma^2\left[a'a + 2a'(d - a) + (d - a)'(d - a)\right] \geq \sigma^2 a'a = V(c'\hat{\beta})$$

The cross term vanishes because $Z'd = c$ and $Z'a = Z'Z(Z'Z)^{-1}c = c$:

$$a'(d - a) = c'(Z'Z)^{-1}Z'(d - a) = c'(Z'Z)^{-1}(c - c) = 0$$

Thus $c'\hat{\beta}$ is the Best Linear Unbiased Estimator (B.L.U.E.) of $c'\beta$.

  9. Inferences Based on Normality of Errors - I

Adding a normality assumption for the distribution of $\varepsilon$:

$$\varepsilon \sim N_n(0, \sigma^2 I) \;\Rightarrow\; Y \sim N_n(Z\beta, \sigma^2 I) \;\Rightarrow\; \hat{\beta} = (Z'Z)^{-1}Z'Y \sim N_{r+1}\!\left(\beta, \sigma^2 (Z'Z)^{-1}\right)$$

$$\frac{(n - r - 1)s^2}{\sigma^2} = \frac{\hat{\varepsilon}'\hat{\varepsilon}}{\sigma^2} \sim \chi^2_{n - (r+1)}$$

Estimated variance-covariance matrix of $\hat{\beta}$: $\hat{V}(\hat{\beta}) = s^2 (Z'Z)^{-1}$. Based on sample data:

$$\hat{\beta} = (Z'Z)^{-1}Z'y \qquad s^2 = \frac{(y - Z\hat{\beta})'(y - Z\hat{\beta})}{n - (r+1)}$$

$100(1-\alpha)\%$ confidence ellipsoid for $\beta$: all $\beta$ such that

$$(\hat{\beta} - \beta)'Z'Z(\hat{\beta} - \beta) \leq (r+1)\,s^2\,F_{\alpha,\,r+1,\,n-r-1}$$

Individual $100(1-\alpha)\%$ confidence intervals: $\hat{\beta}_i \pm t_{\alpha/2,\,n-r-1}\sqrt{\hat{V}(\hat{\beta}_i)}$, where $\hat{V}(\hat{\beta}_i) = s^2 \times$ the $i$-th diagonal element of $(Z'Z)^{-1}$.

Simultaneous $100(1-\alpha)\%$ confidence intervals for all $\beta_i$: $\hat{\beta}_i \pm \sqrt{(r+1)F_{\alpha,\,r+1,\,n-r-1}}\sqrt{\hat{V}(\hat{\beta}_i)}$.
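A sketch of the interval formulas, continuing the simulated example above (the 5% level and the output format are illustrative):

```python
from scipy import stats

df_err = n - r - 1
s2 = (resid @ resid) / df_err                         # unbiased estimate of sigma^2
V_hat = s2 * np.linalg.inv(Z.T @ Z)                   # estimated var-cov matrix of beta_hat
se = np.sqrt(np.diag(V_hat))

alpha = 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df_err)                        # individual CIs
f_mult = np.sqrt((r + 1) * stats.f.ppf(1 - alpha, r + 1, df_err))  # simultaneous CIs

for b, s_i in zip(beta_hat, se):
    print(f"{b:7.3f}  individual +/- {t_crit * s_i:.3f}  simultaneous +/- {f_mult * s_i:.3f}")
```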

  10. Inferences Based on Normality of Errors - II

Likelihood ratio tests for regression coefficients. $H_0: \beta_{q+1} = \cdots = \beta_r = 0$ versus $H_A:$ not all are 0. Partition

$$\beta = \begin{bmatrix} \beta^{(1)} \\ \beta^{(2)} \end{bmatrix} \;\; \begin{matrix} (q+1)\times 1 \\ (r-q)\times 1 \end{matrix} \qquad Z = [\,Z_1 \;\; Z_2\,] \quad Z_1:\, n\times(q+1) \quad Z_2:\, n\times(r-q)$$

Under $H_0$: $Y = Z_1\beta^{(1)} + \varepsilon$, $\hat{\beta}^{(1)} = (Z_1'Z_1)^{-1}Z_1'y$, and $SS_{ERROR}(Z_1) = (y - Z_1\hat{\beta}^{(1)})'(y - Z_1\hat{\beta}^{(1)})$.

Under $H_A$: $Y = Z\beta + \varepsilon$, $\hat{\beta} = (Z'Z)^{-1}Z'y$, and $SS_{ERROR}(Z) = (y - Z\hat{\beta})'(y - Z\hat{\beta})$.

Extra sum of squares due to the variables in $Z_2$: $SS_{ERROR}(Z_1) - SS_{ERROR}(Z)$, with degrees of freedom $(n - q - 1) - (n - r - 1) = r - q$.

Likelihood ratio test statistic:

$$F_{obs} = \frac{\left[SS_{ERROR}(Z_1) - SS_{ERROR}(Z)\right]/(r - q)}{SS_{ERROR}(Z)/(n - r - 1)} \qquad RR: F_{obs} \geq F_{\alpha,\,r-q,\,n-r-1} \qquad P = P\!\left(F_{r-q,\,n-r-1} \geq F_{obs}\right)$$

General linear test: $H_0: C\beta = m$, with $C$ a $k \times (r+1)$ matrix of $k \leq r+1$ linearly independent rows:

$$TS: F_{obs} = \frac{(C\hat{\beta} - m)'\left[C(Z'Z)^{-1}C'\right]^{-1}(C\hat{\beta} - m)}{k\,s^2} \qquad RR: F_{obs} \geq F_{\alpha,\,k,\,n-r-1}$$
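A sketch of the extra-sum-of-squares F test, continuing the same session (the choice q = 1, i.e. testing only the last predictor, is illustrative):

```python
# H0: beta_{q+1} = ... = beta_r = 0; reduced model keeps intercept + first q predictors
q = 1
Z1 = Z[:, :q + 1]
b1 = np.linalg.solve(Z1.T @ Z1, Z1.T @ y)
SSE_reduced = np.sum((y - Z1 @ b1) ** 2)
SSE_full = resid @ resid

F_obs = ((SSE_reduced - SSE_full) / (r - q)) / (SSE_full / (n - r - 1))
p_val = stats.f.sf(F_obs, r - q, n - r - 1)
print(F_obs, p_val)
```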

  11. Estimating the Mean and Predicting New Observations

Levels of predictors of interest: $z_0' = [1, z_{01}, \ldots, z_{0r}]$. Parameter: $z_0'\beta$. Estimator: $z_0'\hat{\beta}$.

$$V(z_0'\hat{\beta}) = z_0'\,V(\hat{\beta})\,z_0 = \sigma^2 z_0'(Z'Z)^{-1}z_0 \qquad \hat{V}(z_0'\hat{\beta}) = s^2 z_0'(Z'Z)^{-1}z_0$$

$100(1-\alpha)\%$ confidence interval for $z_0'\beta$:

$$z_0'\hat{\beta} \pm t_{\alpha/2,\,n-r-1}\; s\sqrt{z_0'(Z'Z)^{-1}z_0}$$

Forecasting a new response when $Z = z_0$: $Y_0 = z_0'\beta + \varepsilon_0$. Forecast: $z_0'\hat{\beta}$; forecast error: $Y_0 - z_0'\hat{\beta}$.

$$V(Y_0 - z_0'\hat{\beta}) = V(Y_0) + V(z_0'\hat{\beta}) - 2\,\mathrm{COV}(Y_0, z_0'\hat{\beta}) = \sigma^2 + \sigma^2 z_0'(Z'Z)^{-1}z_0 = \sigma^2\left[1 + z_0'(Z'Z)^{-1}z_0\right]$$

$100(1-\alpha)\%$ prediction interval for $Y_0$:

$$z_0'\hat{\beta} \pm t_{\alpha/2,\,n-r-1}\; s\sqrt{1 + z_0'(Z'Z)^{-1}z_0}$$
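The two intervals on this slide differ only by the extra "1 +" under the square root; a sketch continuing the running example (z0 is an arbitrary illustrative level of the predictors):

```python
z0 = np.array([1.0, 0.5, -1.0])                  # [1, z01, z02]; values are illustrative
m_hat = z0 @ beta_hat
h00 = z0 @ np.linalg.solve(Z.T @ Z, z0)          # z0'(Z'Z)^{-1} z0

ci_half = t_crit * np.sqrt(s2 * h00)             # mean response at z0
pi_half = t_crit * np.sqrt(s2 * (1 + h00))       # new observation at z0
print(f"mean response: {m_hat:.3f} +/- {ci_half:.3f}")
print(f"new Y0:        {m_hat:.3f} +/- {pi_half:.3f}")
```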

  12. Multivariate Multiple Regression

$m$ response variables, a common set of $r$ predictor variables within units:

$$Y_i = \beta_{0i} + \beta_{1i}Z_1 + \cdots + \beta_{ri}Z_r + \varepsilon_i \quad i = 1, \ldots, m \qquad E(\varepsilon) = 0 \quad V(\varepsilon) = \Sigma$$

$j$-th unit: $z_j' = [1, Z_{j1}, \ldots, Z_{jr}]$, $Y_j' = [Y_{j1}, \ldots, Y_{jm}]$, $j = 1, \ldots, n$. Stacking units:

$$Y = \begin{bmatrix} Y_{11} & \cdots & Y_{1m} \\ \vdots & & \vdots \\ Y_{n1} & \cdots & Y_{nm} \end{bmatrix} = [\,Y_{(1)} \cdots Y_{(m)}\,] \qquad Z = \begin{bmatrix} 1 & Z_{11} & \cdots & Z_{1r} \\ \vdots & \vdots & & \vdots \\ 1 & Z_{n1} & \cdots & Z_{nr} \end{bmatrix}$$

$$\beta = \begin{bmatrix} \beta_{01} & \cdots & \beta_{0m} \\ \vdots & & \vdots \\ \beta_{r1} & \cdots & \beta_{rm} \end{bmatrix} = [\,\beta_{(1)} \cdots \beta_{(m)}\,] \qquad \varepsilon = \begin{bmatrix} \varepsilon_{11} & \cdots & \varepsilon_{1m} \\ \vdots & & \vdots \\ \varepsilon_{n1} & \cdots & \varepsilon_{nm} \end{bmatrix} = [\,\varepsilon_{(1)} \cdots \varepsilon_{(m)}\,]$$

Column-by-column the model is

$$Y_{(i)} = Z\beta_{(i)} + \varepsilon_{(i)} \qquad E(\varepsilon_{(i)}) = 0 \qquad V(\varepsilon_{(i)}) = \sigma_{ii} I \qquad \mathrm{COV}(\varepsilon_{(i)}, \varepsilon_{(k)}) = \sigma_{ik} I \quad i, k = 1, \ldots, m$$

  13. Least Squares Estimation and Sums of Squares and Cross-Products

Different responses on the same unit may be correlated:

$$V(\varepsilon_j) = \Sigma = \begin{bmatrix} \sigma_{11} & \cdots & \sigma_{1m} \\ \vdots & \ddots & \vdots \\ \sigma_{m1} & \cdots & \sigma_{mm} \end{bmatrix} \quad j = 1, \ldots, n$$

Arbitrary choice of parameters: $B = [\,b_{(1)} \cdots b_{(m)}\,]$, with matrix of errors $Y - ZB$. Error sums of squares and cross-products matrix:

$$(Y - ZB)'(Y - ZB) = \begin{bmatrix} (Y_{(1)} - Zb_{(1)})'(Y_{(1)} - Zb_{(1)}) & \cdots & (Y_{(1)} - Zb_{(1)})'(Y_{(m)} - Zb_{(m)}) \\ \vdots & \ddots & \vdots \\ (Y_{(m)} - Zb_{(m)})'(Y_{(1)} - Zb_{(1)}) & \cdots & (Y_{(m)} - Zb_{(m)})'(Y_{(m)} - Zb_{(m)}) \end{bmatrix}$$

$\hat{\beta}_{(i)} = (Z'Z)^{-1}Z'Y_{(i)}$ is the choice of $b_{(i)}$ that minimizes the $i$-th diagonal element of $(Y - ZB)'(Y - ZB)$.

Further: $\mathrm{trace}\left[(Y - ZB)'(Y - ZB)\right]$ is minimized at $B = \hat{\beta} = (Z'Z)^{-1}Z'Y$, and the generalized variance $\left|(Y - ZB)'(Y - ZB)\right|$ is also minimized at $B = \hat{\beta}$.

  14. Predicted Values, Residuals, and SSCP Matrices

Least squares estimator of $\beta$: $\hat{\beta} = (Z'Z)^{-1}Z'Y = (Z'Z)^{-1}Z'[\,Y_{(1)} \cdots Y_{(m)}\,]$.

Predicted values: $\hat{Y} = Z\hat{\beta} = Z(Z'Z)^{-1}Z'Y$. Residuals: $\hat{\varepsilon} = Y - \hat{Y} = \left[I - Z(Z'Z)^{-1}Z'\right]Y$.

As in the univariate case, $Z'\hat{\varepsilon} = 0$ and $\hat{Y}'\hat{\varepsilon} = 0$, so

$$Y'Y = (\hat{Y} + \hat{\varepsilon})'(\hat{Y} + \hat{\varepsilon}) = \hat{Y}'\hat{Y} + \hat{\varepsilon}'\hat{\varepsilon} \qquad SSCP_{TOTAL\,UNCORRECTED} = SSCP_{MODEL} + SSCP_{ERROR}$$

$$SSCP_{ERROR} = \hat{\varepsilon}'\hat{\varepsilon} = Y'\left[I - Z(Z'Z)^{-1}Z'\right]Y = Y'Y - \hat{Y}'\hat{Y}$$
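The multivariate fit is just the univariate fit applied to each response column at once; a sketch continuing the same session (m, B_true, and the error covariance are illustrative):

```python
# m = 3 responses sharing the design matrix Z from the earlier sketch
m = 3
B_true = rng.normal(size=(r + 1, m))
Y = Z @ B_true + rng.multivariate_normal(np.zeros(m), np.eye(m), size=n)

B_hat = np.linalg.solve(Z.T @ Z, Z.T @ Y)        # (Z'Z)^{-1}Z' applied column-by-column
Y_hat = Z @ B_hat
E_hat = Y - Y_hat                                # residual matrix

# SSCP decomposition Y'Y = Yhat'Yhat + Ehat'Ehat, and Z'Ehat = 0
print(np.allclose(Y.T @ Y, Y_hat.T @ Y_hat + E_hat.T @ E_hat))
print(np.abs(Z.T @ E_hat).max())
```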

  15. Mean and Variance/Covariance Structure of Least Squares Estimator

$$Y_{(i)} = Z\beta_{(i)} + \varepsilon_{(i)} \qquad E(Y_{(i)}) = Z\beta_{(i)} \qquad V(Y_{(i)}) = \sigma_{ii} I \qquad \mathrm{COV}(Y_{(i)}, Y_{(k)}) = \sigma_{ik} I \quad i, k = 1, \ldots, m$$

With $\hat{\beta}_{(i)} = (Z'Z)^{-1}Z'Y_{(i)}$:

$$E(\hat{\beta}_{(i)}) = (Z'Z)^{-1}Z'E(Y_{(i)}) = (Z'Z)^{-1}Z'Z\beta_{(i)} = \beta_{(i)}$$

$$V(\hat{\beta}_{(i)}) = (Z'Z)^{-1}Z'\,V(Y_{(i)})\,Z(Z'Z)^{-1} = \sigma_{ii}(Z'Z)^{-1} \quad i = 1, \ldots, m$$

$$\mathrm{COV}(\hat{\beta}_{(i)}, \hat{\beta}_{(k)}) = (Z'Z)^{-1}Z'\,\mathrm{COV}(Y_{(i)}, Y_{(k)})\,Z(Z'Z)^{-1} = \sigma_{ik}(Z'Z)^{-1} \quad i, k = 1, \ldots, m$$

  16. Properties of Residuals and the SSCP_ERROR Matrix

$$\hat{\varepsilon} = [\,\hat{\varepsilon}_{(1)} \cdots \hat{\varepsilon}_{(m)}\,] = \left[I - Z(Z'Z)^{-1}Z'\right]Y = (I - H)Y$$

$$E(\hat{\varepsilon}_{(i)}) = (I - H)E(Y_{(i)}) = (I - H)Z\beta_{(i)} = 0$$

Note: $E(W_1'AW_2) = \mathrm{trace}\left[A\,\mathrm{COV}(W_2, W_1)\right] + E(W_1)'A\,E(W_2)$. Hence

$$E(\hat{\varepsilon}_{(i)}'\hat{\varepsilon}_{(k)}) = E\left[Y_{(i)}'(I - H)Y_{(k)}\right] = \underbrace{\mathrm{trace}\left[(I - H)\sigma_{ik} I\right]}_{(1)} + \underbrace{\beta_{(i)}'Z'(I - H)Z\beta_{(k)}}_{(2)}$$

(1) $\mathrm{trace}\left[(I - H)\sigma_{ik} I\right] = \sigma_{ik}\left[n - (r+1)\right]$ (see slide 7); (2) $\beta_{(i)}'Z'(I - H)Z\beta_{(k)} = 0$.

$$E(\hat{\varepsilon}_{(i)}'\hat{\varepsilon}_{(k)}) = \left[n - (r+1)\right]\sigma_{ik} \qquad E\left[\frac{\hat{\varepsilon}'\hat{\varepsilon}}{n - (r+1)}\right] = \Sigma = \begin{bmatrix} \sigma_{11} & \cdots & \sigma_{1m} \\ \vdots & \ddots & \vdots \\ \sigma_{m1} & \cdots & \sigma_{mm} \end{bmatrix}$$

  17. Covariance of Regression Coefficients and Residuals

$$\mathrm{COV}(\hat{\beta}_{(i)}, \hat{\varepsilon}_{(k)}) = E\left[\left(\hat{\beta}_{(i)} - E(\hat{\beta}_{(i)})\right)\left(\hat{\varepsilon}_{(k)} - E(\hat{\varepsilon}_{(k)})\right)'\right]$$

Expressing both in terms of the errors:

$$\hat{\beta}_{(i)} - \beta_{(i)} = (Z'Z)^{-1}Z'\left(Z\beta_{(i)} + \varepsilon_{(i)}\right) - \beta_{(i)} = (Z'Z)^{-1}Z'\varepsilon_{(i)}$$

$$\hat{\varepsilon}_{(k)} = \left[I - Z(Z'Z)^{-1}Z'\right]\left(Z\beta_{(k)} + \varepsilon_{(k)}\right) = \left[I - Z(Z'Z)^{-1}Z'\right]\varepsilon_{(k)}$$

$$\mathrm{COV}(\hat{\beta}_{(i)}, \hat{\varepsilon}_{(k)}) = E\left[(Z'Z)^{-1}Z'\varepsilon_{(i)}\varepsilon_{(k)}'(I - H)'\right] = \sigma_{ik}(Z'Z)^{-1}Z'(I - H) = \sigma_{ik}\left[(Z'Z)^{-1}Z' - (Z'Z)^{-1}Z'\right] = 0$$

  18. Estimating Mean Responses and Forecasting New Responses at z = z0

Parameter: $E(Y_{0i}) = z_0'\beta_{(i)}$, $i = 1, \ldots, m$. Estimator: $z_0'\hat{\beta}_{(i)}$.

$$\mathrm{COV}(z_0'\hat{\beta}_{(i)},\, z_0'\hat{\beta}_{(k)}) = z_0'\,\mathrm{COV}(\hat{\beta}_{(i)}, \hat{\beta}_{(k)})\,z_0 = \sigma_{ik}\, z_0'(Z'Z)^{-1}z_0$$

Forecast error for a new observation $Y_{0i}$, $i = 1, \ldots, m$:

$$Y_{0i} - z_0'\hat{\beta}_{(i)} = z_0'\beta_{(i)} + \varepsilon_{0i} - z_0'\hat{\beta}_{(i)} \qquad E\!\left(Y_{0i} - z_0'\hat{\beta}_{(i)}\right) = 0$$

Since $\varepsilon_0$ is independent of $\hat{\beta}$, the cross-covariance terms vanish and

$$\mathrm{COV}\!\left(Y_{0i} - z_0'\hat{\beta}_{(i)},\; Y_{0k} - z_0'\hat{\beta}_{(k)}\right) = \mathrm{COV}(\varepsilon_{0i}, \varepsilon_{0k}) + z_0'\,\mathrm{COV}(\hat{\beta}_{(i)}, \hat{\beta}_{(k)})\,z_0 = \sigma_{ik}\left[1 + z_0'(Z'Z)^{-1}z_0\right]$$

  19. Model with Normal Distribution for Errors

$Z$ of full rank (no linear dependencies among predictors), $n \times (r+1)$; errors $\varepsilon_j \sim NID_m(0, \Sigma)$.

MLE for $\beta$: $\hat{\beta} = (Z'Z)^{-1}Z'Y$, with

$$\hat{\beta}_{(i)} \sim N\!\left(\beta_{(i)},\, \sigma_{ii}(Z'Z)^{-1}\right) \qquad \mathrm{COV}(\hat{\beta}_{(i)}, \hat{\beta}_{(k)}) = \sigma_{ik}(Z'Z)^{-1}$$

MLE for $\Sigma$:

$$\hat{\Sigma} = \frac{1}{n}(Y - Z\hat{\beta})'(Y - Z\hat{\beta}) = \frac{1}{n}\hat{\varepsilon}'\hat{\varepsilon}$$

$n\hat{\Sigma}$ has a Wishart distribution with $n - (r+1)$ degrees of freedom and scale matrix $\Sigma$, independent of $\hat{\beta}$.

Maximized likelihood evaluated at $(\hat{\beta}, \hat{\Sigma})$:

$$L(\hat{\beta}, \hat{\Sigma}) = (2\pi)^{-mn/2}\,|\hat{\Sigma}|^{-n/2}\,e^{-mn/2}$$

  20. Likelihood Ratio Tests for Regression Coefficients

$$H_0: \beta^{(2)} = 0 \qquad \beta = \begin{bmatrix} \beta^{(1)} \\ \beta^{(2)} \end{bmatrix} \;\; \begin{matrix} (q+1)\times m \\ (r-q)\times m \end{matrix} \qquad Z = [\,Z_1 \;\; Z_2\,] \quad Z_1:\, n\times(q+1) \quad Z_2:\, n\times(r-q)$$

$$E(Y) = Z\beta = Z_1\beta^{(1)} + Z_2\beta^{(2)}$$

Extra SSCP due to $Z_2$ after $Z_1$: $n(\hat{\Sigma}_1 - \hat{\Sigma})$, where

$$\hat{\Sigma}_1 = \frac{1}{n}\left(Y - Z_1\hat{\beta}^{(1)}\right)'\left(Y - Z_1\hat{\beta}^{(1)}\right) \quad \hat{\beta}^{(1)} = (Z_1'Z_1)^{-1}Z_1'Y \quad \hat{\Sigma} = \frac{1}{n}(Y - Z\hat{\beta})'(Y - Z\hat{\beta})$$

Likelihood ratio and Wilks' lambda statistic:

$$\Lambda = \frac{\max_{\beta^{(1)},\,\Sigma} L(\beta^{(1)}, \Sigma)}{\max_{\beta,\,\Sigma} L(\beta, \Sigma)} = \left(\frac{|\hat{\Sigma}|}{|\hat{\Sigma}_1|}\right)^{n/2} \qquad \Lambda^{2/n} = \frac{|\hat{\Sigma}|}{|\hat{\Sigma}_1|}$$

Test statistic: $-2\ln\Lambda = -n\ln\!\left(|\hat{\Sigma}|/|\hat{\Sigma}_1|\right)$. For large $n$, under $H_0$:

$$-\left[n - r - 1 - \tfrac{1}{2}(m - r + q + 1)\right]\ln\frac{|\hat{\Sigma}|}{|\hat{\Sigma}_1|} \sim \chi^2_{m(r-q)}$$

Information criteria for a model with $d$ predictors:

$$AIC = n\ln\left|\tfrac{1}{n}SSCP_{ERROR}(d)\right| + 2md$$
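A sketch of the large-sample Wilks test, continuing the multivariate example above (again q = 1 is an illustrative choice; the Bartlett-type multiplier follows the slide):

```python
# H0: the last r - q rows of beta are 0
q = 1
Z1m = Z[:, :q + 1]
B1 = np.linalg.solve(Z1m.T @ Z1m, Z1m.T @ Y)
Sigma1 = (Y - Z1m @ B1).T @ (Y - Z1m @ B1) / n   # MLE of Sigma under H0
Sigma  = E_hat.T @ E_hat / n                     # MLE of Sigma under the full model

wilks = np.linalg.det(Sigma) / np.linalg.det(Sigma1)
stat = -(n - r - 1 - 0.5 * (m - r + q + 1)) * np.log(wilks)
p_val = stats.chi2.sf(stat, m * (r - q))
print(wilks, stat, p_val)
```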

  21. Alternative Test Statistics

$H_0: \beta^{(2)} = 0$, with $m$ = number of response variables. Define

$$E = n\hat{\Sigma} = (Y - Z\hat{\beta})'(Y - Z\hat{\beta}) \qquad H = n(\hat{\Sigma}_1 - \hat{\Sigma})$$

Eigenvalues of $HE^{-1}$: $\eta_1 \geq \eta_2 \geq \cdots \geq \eta_s$, where $s = \min(m, r - q)$.

$$\text{Wilks' lambda} = \frac{|E|}{|E + H|} = \prod_{i=1}^{s} \frac{1}{1 + \eta_i} \qquad \text{Pillai's trace} = \mathrm{trace}\!\left[H(H + E)^{-1}\right] = \sum_{i=1}^{s} \frac{\eta_i}{1 + \eta_i}$$

$$\text{Hotelling-Lawley trace} = \mathrm{trace}(HE^{-1}) = \sum_{i=1}^{s} \eta_i \qquad \text{Roy's largest root} = \frac{\eta_1}{1 + \eta_1}$$

Statistical software packages will print these statistics along with P-values.
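All four statistics come from the same eigenvalues; a sketch continuing the session above (it reuses the E and H matrices for the H0 of the previous sketch):

```python
E = n * Sigma                                    # full-model error SSCP
Hm = n * (Sigma1 - Sigma)                        # extra SSCP due to the tested predictors
eta = np.linalg.eigvals(np.linalg.solve(E, Hm)).real   # eigs of E^{-1}H = eigs of H E^{-1}

wilks_lambda = np.prod(1.0 / (1.0 + eta))
pillai       = np.sum(eta / (1.0 + eta))
hotelling    = np.sum(eta)
roy          = eta.max() / (1.0 + eta.max())
print(wilks_lambda, pillai, hotelling, roy)
```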

  22. Multivariate Multiple Regression Tests - I

$H_0: L\beta M = d$, with $d = cj$ (so $d = 0$ if $c = 0$), where $L$ is a matrix of constants acting on the predictors (rows of $\beta$), $M$ is a matrix acting on the responses (columns of $\beta$), $c$ is a column vector of constants, and $j$ is a row vector of 1s.

Step 1: Set up the hypothesis ($H$) matrix:

$$H = M'\left(L\hat{\beta} - cj\right)'\left[L(Z'Z)^{-1}L'\right]^{-1}\left(L\hat{\beta} - cj\right)M$$

Step 2: Set up the error SSCP ($E$) matrix:

$$E = M'\left(Y'Y - \hat{\beta}'Z'Z\hat{\beta}\right)M$$

Step 3: Set up (at least one of) the 4 test statistics.

Step 4: Convert the test statistic(s) to approximate F-statistics.

Step 5: Compare the F-statistics with the appropriate critical values.

  23. Multivariate Multiple Regression Tests - II

All four statistics are functions of $H$ and $E$. Common elements:

$$q_1 = \mathrm{rank}\!\left[L(Z'Z)^{-1}L'\right] \qquad q_2 = \mathrm{rank}(H + E) \qquad v = n - (r+1) \qquad s = \min(q_1, q_2)$$

$$m_1 = 0.5\left(|q_1 - q_2| - 1\right) \qquad m_2 = 0.5\left(v - q_2 - 1\right)$$

Wilks' lambda: $\Lambda_W = |E|/|H + E|$, converted to an approximate F-statistic via

$$F_W = \frac{1 - \Lambda_W^{1/t}}{\Lambda_W^{1/t}} \cdot \frac{wt - 2u}{q_1 q_2} \;\sim\; F_{q_1 q_2,\; wt - 2u}$$

where $w = v - (q_2 - q_1 + 1)/2$, $u = (q_1 q_2 - 2)/4$, and

$$t = \sqrt{\frac{q_1^2 q_2^2 - 4}{q_1^2 + q_2^2 - 5}} \;\text{ if } q_1^2 + q_2^2 - 5 > 0 \qquad t = 1 \text{ otherwise}$$

Pillai's trace: $V = \mathrm{trace}\!\left[H(H + E)^{-1}\right]$, converted via

$$F_P = \frac{2m_2 + s + 1}{2m_1 + s + 1} \cdot \frac{V}{s - V} \;\sim\; F_{s(2m_1 + s + 1),\; s(2m_2 + s + 1)}$$

  24. Inference Regarding Predictions

The $m \times 1$ vector of estimated mean responses at $z_0$ satisfies

$$\hat{\beta}'z_0 \sim N_m\!\left(\beta'z_0,\; z_0'(Z'Z)^{-1}z_0\,\Sigma\right)$$

with $n\hat{\Sigma}$ Wishart with $n - (r+1)$ degrees of freedom, independent of $\hat{\beta}'z_0$.

$100(1-\alpha)\%$ confidence ellipsoid for $\beta'z_0$: all $\beta'z_0$ such that

$$\left(\hat{\beta}'z_0 - \beta'z_0\right)'\left[z_0'(Z'Z)^{-1}z_0 \cdot \frac{n\hat{\Sigma}}{n - r - 1}\right]^{-1}\left(\hat{\beta}'z_0 - \beta'z_0\right) \leq \frac{m(n - r - 1)}{n - r - m}\,F_{\alpha,\,m,\,n-r-m}$$

Simultaneous $100(1-\alpha)\%$ confidence intervals for $z_0'\beta_{(i)}$, $i = 1, \ldots, m$:

$$z_0'\hat{\beta}_{(i)} \pm \sqrt{\frac{m(n - r - 1)}{n - r - m}\,F_{\alpha,\,m,\,n-r-m}}\;\sqrt{z_0'(Z'Z)^{-1}z_0 \cdot \frac{n\,\hat{\sigma}_{ii}}{n - r - 1}}$$

$100(1-\alpha)\%$ prediction ellipsoid for a new $Y_0$: all $Y_0$ such that

$$\left(Y_0 - \hat{\beta}'z_0\right)'\left[\left(1 + z_0'(Z'Z)^{-1}z_0\right)\frac{n\hat{\Sigma}}{n - r - 1}\right]^{-1}\left(Y_0 - \hat{\beta}'z_0\right) \leq \frac{m(n - r - 1)}{n - r - m}\,F_{\alpha,\,m,\,n-r-m}$$

Simultaneous $100(1-\alpha)\%$ prediction intervals for the $Y_{0i}$:

$$z_0'\hat{\beta}_{(i)} \pm \sqrt{\frac{m(n - r - 1)}{n - r - m}\,F_{\alpha,\,m,\,n-r-m}}\;\sqrt{\left(1 + z_0'(Z'Z)^{-1}z_0\right)\frac{n\,\hat{\sigma}_{ii}}{n - r - 1}}$$

  25. Classical Regression Model - Univariate Response

$Y$ = response variable; $Z_1, \ldots, Z_r$ = random predictor variables. Data vector (within units):

$$\begin{bmatrix} Y \\ Z \end{bmatrix} \qquad E\begin{bmatrix} Y \\ Z \end{bmatrix} = \begin{bmatrix} \mu_Y \\ \mu_Z \end{bmatrix} \qquad V\begin{bmatrix} Y \\ Z \end{bmatrix} = \begin{bmatrix} \sigma_{YY} & \sigma_{ZY}' \\ \sigma_{ZY} & \Sigma_{ZZ} \end{bmatrix}$$

Linear predictor of $Y$: $b_0 + b_1 Z_1 + \cdots + b_r Z_r = b_0 + b'Z$. Prediction error: $Y - b_0 - b'Z$. Mean square error:

$$MSE(b_0, b) = E\left(Y - b_0 - b'Z\right)^2$$

The linear predictor of $Y$ that minimizes $MSE$ is $\beta_0 + \beta'Z$, where

$$\beta = \Sigma_{ZZ}^{-1}\sigma_{ZY} \qquad \beta_0 = \mu_Y - \beta'\mu_Z \qquad MSE(\beta_0, \beta) = \sigma_{YY} - \sigma_{ZY}'\Sigma_{ZZ}^{-1}\sigma_{ZY}$$

Moreover, this predictor maximizes the correlation with $Y$:

$$\max_{b_0, b}\,\mathrm{CORR}(Y, b_0 + b'Z) = \mathrm{CORR}(Y, \beta_0 + \beta'Z) = \sqrt{\frac{\sigma_{ZY}'\Sigma_{ZZ}^{-1}\sigma_{ZY}}{\sigma_{YY}}} = \rho_{Y(Z)}$$

the multiple correlation coefficient.

  26. Classical Regression Model - Univariate Response - Estimators

$$\begin{bmatrix} Y \\ Z \end{bmatrix} \sim N_{r+1}\!\left(\begin{bmatrix} \mu_Y \\ \mu_Z \end{bmatrix},\; \begin{bmatrix} \sigma_{YY} & \sigma_{ZY}' \\ \sigma_{ZY} & \Sigma_{ZZ} \end{bmatrix}\right)$$

Sample quantities: $\bar{y}$, $\bar{z}$, and

$$S = \begin{bmatrix} s_{YY} & s_{ZY}' \\ s_{ZY} & S_{ZZ} \end{bmatrix} \qquad s_{YY} = \frac{1}{n-1}\sum_{j=1}^n (y_j - \bar{y})^2 \qquad s_{ZY} = \frac{1}{n-1}\sum_{j=1}^n (z_j - \bar{z})(y_j - \bar{y})$$

$$\hat{\beta} = S_{ZZ}^{-1}s_{ZY} \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}'\bar{z}$$

MLE (and OLSE) of the regression function: $\hat{\beta}_0 + \hat{\beta}'z = \bar{y} + s_{ZY}'S_{ZZ}^{-1}(z - \bar{z})$.

MLE of $MSE = \sigma_{YY\cdot Z}$: $\dfrac{n-1}{n}\left(s_{YY} - s_{ZY}'S_{ZZ}^{-1}s_{ZY}\right)$.

Unbiased estimator:

$$\frac{n-1}{n - r - 1}\left(s_{YY} - s_{ZY}'S_{ZZ}^{-1}s_{ZY}\right) = \frac{1}{n - r - 1}\sum_{j=1}^n \left(y_j - \hat{\beta}_0 - \hat{\beta}'z_j\right)^2$$
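For data like these, the moment-based estimators on this slide reproduce the least squares fit from the first sketch; a sketch continuing the running example (the column ordering, response first, is a convention of this sketch):

```python
# Sample moments with Y in the first column, Z1,...,Zr after it
data = np.column_stack([y, Z[:, 1:]])
mean = data.mean(axis=0)
S = np.cov(data, rowvar=False)                   # divisor n - 1
s_YY, s_ZY, S_ZZ = S[0, 0], S[1:, 0], S[1:, 1:]

beta_tilde  = np.linalg.solve(S_ZZ, s_ZY)        # S_ZZ^{-1} s_ZY: matches beta_hat[1:]
beta0_tilde = mean[0] - beta_tilde @ mean[1:]    # ybar - beta' zbar: matches beta_hat[0]
mse_unbiased = (n - 1) / (n - r - 1) * (s_YY - s_ZY @ beta_tilde)
print(beta0_tilde, beta_tilde, mse_unbiased)
```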

  27. Classical Regression Model - Multivariate Response

$Y_1, \ldots, Y_m$ = response variables; $Z_1, \ldots, Z_r$ = random predictor variables. Data vector (within units):

$$\begin{bmatrix} Y \\ Z \end{bmatrix} \qquad E\begin{bmatrix} Y \\ Z \end{bmatrix} = \begin{bmatrix} \mu_Y \\ \mu_Z \end{bmatrix} \qquad V\begin{bmatrix} Y \\ Z \end{bmatrix} = \begin{bmatrix} \Sigma_{YY} & \Sigma_{YZ} \\ \Sigma_{ZY} & \Sigma_{ZZ} \end{bmatrix}$$

Conditional mean (regression function):

$$E(Y \mid Z = z) = \mu_Y + \Sigma_{YZ}\Sigma_{ZZ}^{-1}(z - \mu_Z) = \beta_0 + \beta z \qquad \beta_0 = \mu_Y - \Sigma_{YZ}\Sigma_{ZZ}^{-1}\mu_Z \quad \beta = \Sigma_{YZ}\Sigma_{ZZ}^{-1}$$

Error covariance matrix: $\Sigma_{YY\cdot Z} = \Sigma_{YY} - \Sigma_{YZ}\Sigma_{ZZ}^{-1}\Sigma_{ZY}$.

Based on a random sample of size $n$, the ML estimators of the regression function are

$$\hat{\beta}_0 + \hat{\beta}z = \bar{y} + S_{YZ}S_{ZZ}^{-1}(z - \bar{z})$$

Unbiased estimator of $\Sigma_{YY\cdot Z}$:

$$\frac{n-1}{n - r - 1}\left(S_{YY} - S_{YZ}S_{ZZ}^{-1}S_{ZY}\right) = \frac{1}{n - r - 1}\sum_{j=1}^n \left(y_j - \hat{\beta}_0 - \hat{\beta}z_j\right)\left(y_j - \hat{\beta}_0 - \hat{\beta}z_j\right)'$$

  28. Predictions and Partial Correlation Coefficients

Under joint normality of $(Y, Z)$, the fitted values are

$$\hat{Y}_i = \hat{\beta}_{0i} + \hat{\beta}_{1i}z_1 + \cdots + \hat{\beta}_{ri}z_r \quad i = 1, \ldots, m \qquad \hat{\beta} = S_{YZ}S_{ZZ}^{-1} \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}\bar{z}$$

Prediction errors: $Y_i - \hat{Y}_i$, $i = 1, \ldots, m$. Error covariance matrix:

$$\Sigma_{YY\cdot Z} = \Sigma_{YY} - \Sigma_{YZ}\Sigma_{ZZ}^{-1}\Sigma_{ZY} = \begin{bmatrix} \sigma_{Y_1Y_1\cdot Z} & \cdots & \sigma_{Y_1Y_m\cdot Z} \\ \vdots & \ddots & \vdots \\ \sigma_{Y_mY_1\cdot Z} & \cdots & \sigma_{Y_mY_m\cdot Z} \end{bmatrix}$$

Partial correlation coefficient between $Y_i$ and $Y_k$ given $Z_1, \ldots, Z_r$:

$$\rho_{Y_iY_k\cdot Z} = \frac{\sigma_{Y_iY_k\cdot Z}}{\sqrt{\sigma_{Y_iY_i\cdot Z}\;\sigma_{Y_kY_k\cdot Z}}}$$

Sample partial correlation coefficient:

$$r_{Y_iY_k\cdot Z} = \frac{s_{Y_iY_k\cdot Z}}{\sqrt{s_{Y_iY_i\cdot Z}\;s_{Y_kY_k\cdot Z}}}$$
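The sample partial correlations are just the conditional covariance matrix rescaled to unit diagonal; a sketch using the multivariate responses simulated earlier:

```python
# Conditional (error) covariance of Y given Z, from sample covariances
S_full = np.cov(np.column_stack([Y, Z[:, 1:]]), rowvar=False)
S_YY, S_YZ, S_ZZm = S_full[:m, :m], S_full[:m, m:], S_full[m:, m:]

S_YY_Z = S_YY - S_YZ @ np.linalg.solve(S_ZZm, S_YZ.T)   # S_YY - S_YZ S_ZZ^{-1} S_ZY
d = np.sqrt(np.diag(S_YY_Z))
partial_corr = S_YY_Z / np.outer(d, d)                  # r_{Yi Yk . Z}
print(partial_corr.round(3))
```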
