Econometric Analysis of Panel Data: Regression Extensions Overview


Dive into Part 7 of William Greene's Econometric Analysis of Panel Data focusing on regression extensions. Explore topics like time-varying fixed effects, measurement error, spatial autoregression, and more in this comprehensive guide.

  • Econometrics
  • Panel Data
  • Regression Analysis
  • William Greene
  • University


Presentation Transcript


  1. Part 7: Regression Extensions [ 1/41] Econometric Analysis of Panel Data William Greene Department of Economics University of South Florida

  2. Part 7: Regression Extensions [ 2/41]

  3. Part 7: Regression Extensions [ 3/41] Regression Extensions: Time Varying Fixed Effects; Measurement Error; Spatial Autoregression and Autocorrelation (Baltagi 10.5).

  4. Part 7: Regression Extensions [ 4/41] Time Varying Effects Models. Time Varying Fixed Effects, Additive:
     y_it = β'x_it + a_i(t) + ε_it
     y_it = β'x_it + a_i + c_t + ε_it
     a_i(t) = a_i + c_t, t = 1,…,T
     The two way fixed effects model. Now standard in fixed effects modeling.
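The two way model above can be estimated by least squares with individual and time dummies (LSDV). A minimal numpy sketch on simulated data; the data generating process and all names here are hypothetical, not Greene's example:

```python
import numpy as np

# Two way fixed effects: y_it = b*x_it + a_i + c_t + e_it (simulated)
rng = np.random.default_rng(0)
N, T, b = 50, 6, 0.5
a = rng.normal(size=N)                          # individual effects a_i
c = np.cumsum(rng.normal(scale=0.1, size=T))    # time effects c_t
x = rng.normal(size=(N, T)) + a[:, None]        # regressor correlated with a_i
y = b * x + a[:, None] + c[None, :] + 0.05 * rng.normal(size=(N, T))

# LSDV: regress y on x plus individual dummies and time dummies (base period dropped)
X = np.column_stack([
    x.ravel(),
    np.kron(np.eye(N), np.ones((T, 1))),          # individual dummies
    np.kron(np.ones((N, 1)), np.eye(T))[:, 1:],   # time dummies, first period as base
])
coef, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
b_hat = coef[0]
print(round(b_hat, 3))  # close to 0.5 despite x being correlated with a_i
```

Pooled OLS on this data would be biased because x is correlated with the individual effect; the dummies sweep both effects out.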

  5. Part 7: Regression Extensions [ 5/41]
     -----------------------------------------------------------------------------
     LSDV least squares with fixed effects
     LHS = YIT   Mean = 11.57749   Standard deviation = .64344
     No. of observations = 1482
                                               DegFreedom   Mean square
     Regression Sum of Squares =  605.772         255          2.37558
     Residual Sum of Squares   =    7.37954      1226           .00602
     Total Sum of Squares      =  613.152        1481           .41401
     Standard error of e = .07758   Root MSE = .07057
     Fit R-squared = .98796   R-bar squared = .98546
     Estd. Autocorrelation of e(i,t) = .007815
     Panel: Groups - Empty 0, Valid data 247; Smallest 6, Largest 6; Average group size in panel 6.00
     Variances: Effects a(i) = .021204, Residuals e(i,t) = .006019
     Std.Devs.: .145615, .077583
     Rho squared (residual variation due to ai) = .778892
     Within groups variation in YIT = .49745D+02   R squared based on within group variation = .851653
     Between group variation in YIT = .56341D+03
     --------+--------------------------------------------------------------------
             |              Standard           Prob.     95% Confidence
        YIT  | Coefficient    Error       z    |z|>Z*       Interval
     --------+--------------------------------------------------------------------
        X1   |  .63797***     .02380    26.81   .0000    .59132   .68461
        X2   |  .04128***     .01544     2.67   .0075    .01100   .07155
        X3   |  .02819        .02217     1.27   .2036   -.01527   .07165
        X4   |  .30816***     .01323    23.30   .0000    .28224   .33408
     --------+--------------------------------------------------------------------
     T | Base = 1993
       1994  |  .03292***     .00713     4.62   .0000    .01894   .04690
       1995  |  .06137***     .00749     8.20   .0000    .04669   .07604
       1996  |  .07195***     .00801     8.98   .0000    .05625   .08765
       1997  |  .07530***     .00843     8.93   .0000    .05878   .09183
       1998  |  .09401***     .00892    10.53   .0000    .07651   .11150
     --------+--------------------------------------------------------------------
     Evidence of technical change.

  6. Part 7: Regression Extensions [ 6/41] Time Varying Fixed Effects 911 Rescue

  7. Part 7: Regression Extensions [ 7/41] Need for Clarification

  8. Part 7: Regression Extensions [ 8/41] Time Varying Fixed Effects

  9. Part 7: Regression Extensions [ 9/41] A Partial Fixed Effects Model. Some individual specific parameters.
     E.g., individual specific time trends: y_it = β'x_it + θ_i0 + θ_i1 t + ε_it; detrend the individual data, then OLS.
     E.g., individual specific constant terms: y_it = β'x_it + θ_i0 + ε_it; individual group mean deviations, then OLS.
     Strategy: use OLS and Frisch-Waugh:
     β̂ = [Σ_{i=1}^N X_i' M_D^i X_i]^{-1} [Σ_{i=1}^N X_i' M_D^i y_i],  M_D^i = I - D_i(D_i'D_i)^{-1}D_i',
     where D_i contains the individual specific regressors (e.g., [1, t]), with T observations per group.

  10. Part 7: Regression Extensions [ 10/41] Time Varying Effects Models. Time Varying Fixed Effects, Additive Polynomial:
      y_it = β'x_it + a_i(t) + ε_it
      y_it = β'x_it + a_i0 + a_i1 t + a_i2 t² + ε_it
      Let W_i = [1, t, t²] (T×3), and let A_i = the stack of the W_i with 0s inserted. Use OLS and Frisch-Waugh; extend the within estimator. Note A_i'A_j = 0 for all i ≠ j, so
      β̂ = [Σ_{i=1}^N X_i' M_W X_i]^{-1} [Σ_{i=1}^N X_i' M_W y_i],  M_W = I - W_i(W_i'W_i)^{-1}W_i'.
      See Cornwell, Schmidt, Sickles (1990) (frontiers literature).
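The Frisch-Waugh step above can be sketched directly: sweep each group's quadratic trend out of both y and x with M_W, then pool. A minimal sketch on simulated data (the DGP and all names are hypothetical):

```python
import numpy as np

# Partial fixed effects via Frisch-Waugh: individual quadratic trends W_i = [1, t, t^2]
rng = np.random.default_rng(1)
N, T, b = 40, 8, 1.2
t = np.arange(1.0, T + 1)
W = np.column_stack([np.ones(T), t, t**2])            # same W_i for every group
M_W = np.eye(T) - W @ np.linalg.solve(W.T @ W, W.T)   # residual maker for W

theta = rng.normal(size=(N, 3))                       # individual trend coefficients
x = rng.normal(size=(N, T))
y = b * x + theta @ W.T + 0.1 * rng.normal(size=(N, T))

# b_hat = [sum_i x_i' M_W x_i]^{-1} [sum_i x_i' M_W y_i]  (scalar regressor here)
num = sum(x[i] @ M_W @ y[i] for i in range(N))
den = sum(x[i] @ M_W @ x[i] for i in range(N))
b_hat = num / den
print(round(b_hat, 3))  # close to 1.2
```

With a single time-invariant column W_i = [1] this reduces to the ordinary within (group mean deviation) estimator.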

  11. Part 7: Regression Extensions [ 11/41]

  12. Part 7: Regression Extensions [ 12/41]

  13. Part 7: Regression Extensions [ 13/41]
      calc;list;r2=1-(1482-col(x)-3*247)*sst/((n-1)*var(yit))$
      [CALC] R2 = .9975014
      F[2*247, 1482-4-3*247] = ((.99750 - .98669)/(2*247)) / ((1 - .99750)/(1482-4-3*247)) = 6.45
      Wald = 6.45*494 = 3186. Critical chi squared for 494 DF = 546.81
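The arithmetic of this F test for the 2×247 added trend coefficients can be verified directly (the two R² values and degrees of freedom are taken from the slide):

```python
# F test of the restricted (.98669) vs unrestricted (.99750) fit,
# J = 2*247 restrictions, residual df = 1482 - 4 - 3*247 = 737.
r2_u, r2_r = 0.99750, 0.98669
J = 2 * 247
df = 1482 - 4 - 3 * 247

F = ((r2_u - r2_r) / J) / ((1 - r2_u) / df)
wald = F * J          # limiting chi squared form
print(round(F, 2))    # about 6.45, far beyond any conventional critical value
```

Since the Wald statistic (about 3187) dwarfs the 494-df chi squared critical value 546.81, the individual trend terms are overwhelmingly significant.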

  14. Part 7: Regression Extensions [ 14/41] Time Varying Effects Models. Random Effects:
      y_it = β'x_it + ε_it + a_i(t), or y_it = β'x_it + ε_it + u_i g(t,θ)
      A heteroscedastic random effects model. Stochastic frontiers literature: Battese-Coelli (1992).
      Var[u_i g + ε_i] = σ_ε² I + σ_u² g g',  g = [g(1,θ), g(2,θ), …, g(T,θ)]'

  15. Part 7: Regression Extensions [ 15/41] A Stochastic Frontier Model:
      y_it = α + β'x_it + v_it - |u_i|·exp(-η(t-T))
      v_it ~ Normal(0, σ_v²), u_i ~ Normal(0, σ_u²)
      Var[|u_i|·exp(-η(t-T))] = exp(-2η(t-T)) Var[|u_i|]
      λ = σ_u/σ_v

  16. Part 7: Regression Extensions [ 16/41] Munnell State Production Model

  17. Part 7: Regression Extensions [ 17/41] No Effects

  18. Part 7: Regression Extensions [ 18/41] Quadratic Fixed Effects. Correct DF: 816 - 6 - 3(48) = 666. Multiply standard errors by sqrt(810/666) = 1.103.

  19. Part 7: Regression Extensions [ 19/41] Time Varying Effects Models. Time Varying Fixed Effects, Multiplicative:
      y_it = β'x_it + a_i(t) + ε_it
      y_it = β'x_it + δ_i θ_t + ε_it
      Not estimable as written; needs a normalization: θ_1 = 1. An EM iteration: Chen (2015).

  20. Part 7: Regression Extensions [ 20/41] EM Algorithm (Chen (2015))
      1. Choose starting values for β, δ, θ (e.g., the linear FEM for β^(1) and δ^(1), and 1, 0, … for θ^(1)).
      2.1 β^(k+1) = [Σ_i Σ_t x_it x_it']^{-1} Σ_i Σ_t x_it (y_it - δ_i^(k) θ_t^(k))
      2.2 δ_i^(k+1) = Σ_t θ_t^(k) (y_it - β^(k+1)'x_it) / Σ_t (θ_t^(k))²,  i = 1,…,N
      2.3 θ_t^(k+1) = Σ_i δ_i^(k+1) (y_it - β^(k+1)'x_it) / Σ_i (δ_i^(k+1))²,  t = 2,…,T
      3. Iterate to convergence.
      (a) What does this converge to? The MLE under normality. (b) How to compute standard errors? The Hessian. No IP problem for the linear model.
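The three updates are each simple least squares steps, so the iteration is a few lines of numpy. A sketch on simulated data with a scalar regressor (the DGP, starting values, and iteration count are illustrative assumptions, not Chen's implementation):

```python
import numpy as np

# EM-style iteration for y_it = b*x_it + d_i*th_t + e_it, normalization th_1 = 1
rng = np.random.default_rng(2)
N, T, b = 60, 10, 0.8
d = rng.normal(1.0, 0.5, size=N)
th = np.r_[1.0, rng.normal(1.0, 0.3, size=T - 1)]
x = rng.normal(size=(N, T))
y = b * x + np.outer(d, th) + 0.05 * rng.normal(size=(N, T))

b_hat, d_hat = 0.0, np.zeros(N)
th_hat = np.r_[1.0, np.zeros(T - 1)]           # th_1 fixed at 1 throughout
for _ in range(200):
    r = y - np.outer(d_hat, th_hat)
    b_hat = (x * r).sum() / (x**2).sum()                                  # step 2.1
    e = y - b_hat * x
    d_hat = (e * th_hat).sum(axis=1) / (th_hat**2).sum()                  # step 2.2
    th_hat[1:] = (e[:, 1:] * d_hat[:, None]).sum(axis=0) / (d_hat**2).sum()  # step 2.3
print(round(b_hat, 3))  # close to 0.8
```

Each step minimizes the sum of squares in one block of parameters holding the others fixed, so the criterion falls monotonically to a stationary point.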

  21. Part 7: Regression Extensions [ 21/41] Measurement Error. Standard regression results, general effects model:
      y_it = β x*_it + c_i + ε_it
      x_it = x*_it + h_it = measured variable, including measurement error
      b = [Σ_{i=1}^N x_i'x_i / T]^{-1} [Σ_{i=1}^N x_i'y_i / T]
      plim b = β·Var[x*_it]/(Var[x*_it] + Var[h_it]) + Cov[x_it, c_i]/(Var[x*_it] + Var[h_it])
      Biased twice, possibly in opposite directions. (Griliches and Hausman (1986).)

  22. Part 7: Regression Extensions [ 22/41] General Conclusions About Measurement Error. In the presence of individual effects, the inconsistency is in unknown directions. With panel data, different transformations of the data (first differences, group mean deviations) estimate different functions of the parameters: possible method of moments estimators, and the model may be estimable by minimum distance or GMM. With panel data, lagged values may provide suitable instruments for IV estimation. Various applications are listed in Baltagi (pp. 205-208).
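The point that different transformations attenuate differently can be seen in a small simulation: with a persistent true regressor, first differencing amplifies the noise share more than group mean deviations do, so the two estimators have different probability limits, which is what the Griliches-Hausman moment conditions exploit. A sketch (the DGP is a hypothetical illustration):

```python
import numpy as np

# Attenuation bias under two panel transformations of mismeasured x
rng = np.random.default_rng(3)
N, T, b = 2000, 5, 1.0
c = rng.normal(size=(N, 1))                           # individual effect
xs = c + np.cumsum(rng.normal(size=(N, T)), axis=1)   # persistent true regressor x*
y = b * xs + c + rng.normal(size=(N, T))
x = xs + rng.normal(size=(N, T))                      # measured with error h_it

def slope(u, v):
    return (u * v).sum() / (u * u).sum()

xw = x - x.mean(axis=1, keepdims=True)                # group mean deviations
yw = y - y.mean(axis=1, keepdims=True)
b_within = slope(xw, yw)
b_fd = slope(np.diff(x), np.diff(y))                  # first differences
print(round(b_within, 2), round(b_fd, 2))  # both below 1.0; FD attenuated more
```

Both estimators remove c_i, but Δh has variance 2σ_h² while the within transform leaves less noise relative to the persistent signal, so b_fd is attenuated more than b_within here.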

  23. Part 7: Regression Extensions [ 23/41] Application: A Twins Study. "Estimates of the Economic Returns to Schooling from a New Sample of Twins," Ashenfelter, O. and A. Krueger, Amer. Econ. Review, 12/1994.
      (1) Annual twins study in Twinsburg, Ohio.
      (2) (Log) wage equations: y_ij = log wage of twin j = 1,2 in family i.
      (3) Measured data: (a) self reported education, sibling reported education, whether the twins report the same education, other twin related variables; (b) age, race, sex, self employed, union member, married, age of mother at birth.
      (4) S_j^k = schooling of twin j as reported by twin k. Measurement error:
      S_j^k = S_j + v_j^k.  Reliability ratio = Var[S_j]/(Var[S_j] + Var[v_j^k]).

  24. Part 7: Regression Extensions [ 24/41] Wage Equation Structure
      y_i1 = α + β'x_i + γ'z_i1 + f_i + ε_i1
      y_i2 = α + β'x_i + γ'z_i2 + f_i + ε_i2
      Reduced form = two equation SUR model.
      First differences give the "fixed effects" approach:
      y_i1 - y_i2 = γ'(z_i1 - z_i2) + (ε_i1 - ε_i2) = γ(S_i1 - S_i2) + (ε_i1 - ε_i2)
      The first difference gets rid of the family effect, but the regressor is measured with error, and differencing worsens the measurement error problem. But S_1^2 - S_2^1 (each twin's report of the other's schooling) may be used as an instrumental variable.

  25. Part 7: Regression Extensions [ 25/41]

  26. Part 7: Regression Extensions [ 26/41] Spatial Autocorrelation Thanks to Luc Anselin, Ag. U. of Ill.

  27. Part 7: Regression Extensions [ 27/41] Spatially Autocorrelated Data Per Capita Income in Monroe County, NY Thanks Arthur J. Lembo Jr., Geography, Cornell.

  28. Part 7: Regression Extensions [ 28/41] Hypothesis of Spatial Autocorrelation Thanks to Luc Anselin, Ag. U. of Ill.

  29. Part 7: Regression Extensions [ 29/41] Testing for Spatial Autocorrelation W = Spatial Weight Matrix. Think Spatial Distance Matrix. Wii = 0.

  30. Part 7: Regression Extensions [ 30/41] Modeling Spatial Autocorrelation
      y = ρWy + ε  (N observations on a spatially arranged variable)
      W = the 'contiguity matrix,' W_ii = 0. W must be specified in advance; it is not estimated.
      ρ = spatial autocorrelation parameter, -1 < ρ < 1.
      Identification problem: ρW = (ρ/k)(kW) for any k ≠ 0. Normalization: rows of W sum to 1.
      E[ε] = 0, Var[ε] = σ²I
      y = (I - ρW)^{-1}ε
      E[y] = 0, Var[y] = σ² [(I - ρW)'(I - ρW)]^{-1}

  31. Part 7: Regression Extensions [ 31/41] Spatial Autoregression
      y = ρWy + Xβ + ε,  E[ε|X] = 0, Var[ε|X] = σ²I, W_ii = 0.
      y = (I - ρW)^{-1}Xβ + (I - ρW)^{-1}ε
      E[y|X] = (I - ρW)^{-1}Xβ
      Var[y|X] = σ² [(I - ρW)'(I - ρW)]^{-1}
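One standard way to estimate this model is maximum likelihood, concentrating β and σ² out and searching over ρ, where the log determinant of (I - ρW) is the term that plain least squares ignores. A sketch on simulated data with a deterministic circular band weight matrix (the weight structure, grid search, and all names are illustrative assumptions):

```python
import numpy as np

# Simulate y = rho*W*y + X*beta + eps and estimate rho by concentrated ML
rng = np.random.default_rng(4)
N, rho = 100, 0.4
idx = np.arange(N)
gap = np.abs(idx[:, None] - idx[None, :])
D = np.minimum(gap, N - gap)                       # circular distance
W = ((D > 0) & (D <= 3)).astype(float)             # 6 neighbors per unit
W /= W.sum(axis=1, keepdims=True)                  # row standardized

X = np.column_stack([np.ones(N), rng.normal(size=N)])
y = np.linalg.solve(np.eye(N) - rho * W, X @ [1.0, 2.0] + rng.normal(size=N))

def conc_loglik(r):
    Ar = np.eye(N) - r * W
    e = Ar @ y - X @ np.linalg.lstsq(X, Ar @ y, rcond=None)[0]  # OLS residuals
    return np.linalg.slogdet(Ar)[1] - 0.5 * N * np.log((e @ e) / N)

grid = np.linspace(-0.9, 0.9, 181)
rho_hat = grid[np.argmax([conc_loglik(r) for r in grid])]
print(round(rho_hat, 2))  # an estimate in the neighborhood of the true 0.4
```

Dropping the slogdet term gives the (inconsistent) least squares objective, which is why there is no natural residual based estimator of ρ.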

  32. Part 7: Regression Extensions [ 32/41] Generalized Regression. Potentially very large N: GPS data on agriculture plots. Estimation of ρ: there is no natural residual based estimator. Complicated covariance structure: no simple transformations.

  33. Part 7: Regression Extensions [ 33/41] Spatial Autocorrelation in Regression. A Generalized Regression Model:
      y = Xβ + (I - ρW)^{-1}ε,  E[ε|X] = 0, Var[ε|X] = σ²I, W_ii = 0.
      E[y|X] = Xβ
      Var[y|X] = σ² [(I - ρW)'(I - ρW)]^{-1}
      β̂ = [X'(I - ρW)'(I - ρW)X]^{-1} X'(I - ρW)'(I - ρW)y
      σ̂² = (1/N) [y - Xβ̂]'(I - ρW)'(I - ρW)[y - Xβ̂]
      The estimation of ρ is the subject of much research.
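Given ρ, the GLS formulas above amount to premultiplying y and X by (I - ρW) and running OLS on the transformed data. A sketch treating ρ as known, on simulated data with a hypothetical band weight matrix:

```python
import numpy as np

# GLS for the spatial error model y = X*beta + (I - rho*W)^{-1}*eps, rho known
rng = np.random.default_rng(5)
N, rho = 400, 0.5
idx = np.arange(N)
gap = np.abs(idx[:, None] - idx[None, :])
W = ((gap > 0) & (gap <= 2)).astype(float)
W /= W.sum(axis=1, keepdims=True)                 # row standardized

beta = np.array([1.0, -0.5])
X = np.column_stack([np.ones(N), rng.normal(size=N)])
A = np.eye(N) - rho * W
y = X @ beta + np.linalg.solve(A, rng.normal(size=N))

# Omega^{-1} is proportional to A'A, so transform by A and apply OLS
Xs, ys = A @ X, A @ y
beta_gls = np.linalg.solve(Xs.T @ Xs, Xs.T @ ys)
sigma2 = ((ys - Xs @ beta_gls) ** 2).mean()       # (1/N) e'A'Ae
print(np.round(beta_gls, 2), round(sigma2, 2))
```

In practice ρ is unknown, so this step is nested inside an ML or GMM search over ρ as on the previous slide.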

  34. Part 7: Regression Extensions [ 34/41] Panel Data Application: Spatial Autocorrelation. E.g., N countries, T periods (e.g., the gasoline data).
      y_it = c_i + β'x_it + v_it,  with v_t = ρW v_t + ε_t = N observations at time t.
      Similar assumptions. Candidate for a SUR or spatial autocorrelation model.

  35. Part 7: Regression Extensions [ 35/41]

  36. Part 7: Regression Extensions [ 36/41] Spatial Autocorrelation in a Panel

  37. Part 7: Regression Extensions [ 37/41] Spatial Autocorrelation in a Sample Selection Model. Flores-Lagunes, A. and Schnier, K., "Sample Selection and Spatial Dependence," Journal of Applied Econometrics, 27, 2, 2012, pp. 173-204. Alaska Department of Fish and Game data: Pacific cod fishing in the eastern Bering Sea, on a grid of locations. Observation = catch per unit effort in a grid square. Data are reported only if 4+ similar vessels fish in the region. 1997 sample = 320 observations, with 207 reporting full data.

  38. Part 7: Regression Extensions [ 38/41] Spatial Autocorrelation in a Sample Selection Model LHS is catch per unit effort = CPUE Site characteristics: MaxDepth, MinDepth, Biomass Fleet characteristics: Catcher vessel (CV = 0/1) Hook and line (HAL = 0/1) Nonpelagic trawl gear (NPT = 0/1) Large (at least 125 feet) (Large = 0/1)

  39. Part 7: Regression Extensions [ 39/41] Spatial Autocorrelation in a Sample Selection Model
      y*_1i = x_1i'β_1 + u_1i,  u_1i = δ Σ_{j≠i} c_ij u_1j + ε_1i
      y*_2i = x_2i'β_2 + u_2i,  u_2i = γ Σ_{j≠i} c_ij u_2j + ε_2i
      (ε_1i, ε_2i) ~ N[(0,0), (σ_1², σ_12; σ_12, σ_2²)]  (σ_1² = 1??)
      Observation mechanism:
      y_1i = 1[y*_1i > 0]  (probit model)
      y_2i = y*_2i if y_1i = 1, unobserved otherwise.

  40. Part 7: Regression Extensions [ 40/41] Spatial Autocorrelation in a Sample Selection Model
      u_1 = δCu_1 + ε_1, so u_1 = [I - δC]^{-1}ε_1 = C^(1)ε_1; likewise u_2 = [I - γC]^{-1}ε_2 = C^(2)ε_2.
      C = spatial weight matrix, C_ii = 0.
      y*_1i = x_1i'β_1 + Σ_{j=1}^N c_ij^(1) ε_1j,  Var[u_1i] = σ_1² Σ_{j=1}^N (c_ij^(1))²
      y*_2i = x_2i'β_2 + Σ_{j=1}^N c_ij^(2) ε_2j,  Var[u_2i] = σ_2² Σ_{j=1}^N (c_ij^(2))²
      Cov[u_1i, u_2i] = σ_12 Σ_{j=1}^N c_ij^(1) c_ij^(2)

  41. Part 7: Regression Extensions [ 41/41] Spatial Weights
      c_ij = 1/d_ij²,  d_ij = Euclidean distance.
      A band of 7 neighbors is used. Row standardized.
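The weight construction on this slide (inverse squared distance, a 7-neighbor band, row standardization) can be sketched for hypothetical 2-D coordinates:

```python
import numpy as np

# Row-standardized inverse squared distance weights with a 7-neighbor band
rng = np.random.default_rng(6)
N = 30
pts = rng.random((N, 2))                              # hypothetical locations
d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))  # Euclidean d_ij

C = np.zeros((N, N))
for i in range(N):
    nbrs = np.argsort(d[i])[1:8]          # 7 nearest neighbors, skipping self
    C[i, nbrs] = 1.0 / d[i, nbrs] ** 2    # c_ij = 1/d_ij^2
C /= C.sum(axis=1, keepdims=True)         # row standardization

print(C.sum(axis=1)[:3])  # each row sums to 1
```

Row standardization makes ρ interpretable as the weight on the neighborhood average, at the cost of an asymmetric C even when distances are symmetric.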

  42. Part 7: Regression Extensions [ 42/41] Appendix: Miscellaneous

  43. Part 7: Regression Extensions [ 43/41] Ordinary Least Squares. Standard results for OLS in a GR model: Consistent. Unbiased. Inefficient. The variance does (we expect) converge to zero:
      Var[b|X] = (1/Σ_i T_i) [Σ_i f_i (X_i'X_i/T_i)]^{-1} [Σ_i f_i (X_i'Ω_iX_i/T_i)] [Σ_i f_i (X_i'X_i/T_i)]^{-1},
      f_i = T_i/Σ_i T_i,  0 < f_i < 1.

  44. Part 7: Regression Extensions [ 44/41] Estimating the Variance for OLS. White correction?
      Est.Var[b|X] = (X'X)^{-1} [Σ_{i=1}^N Σ_{t=1}^{T_i} e_it² x_it x_it'] (X'X)^{-1}
      Does this work? No. Observations are correlated. Cluster estimator:
      Est.Var[b|X] = (X'X)^{-1} [Σ_{i=1}^N (Σ_{t=1}^{T_i} x_it e_it)(Σ_{t=1}^{T_i} x_it e_it)'] (X'X)^{-1}
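The contrast between the two sandwich estimators is easy to see numerically: when the error contains an individual effect, the White form badly understates the variance of coefficients on slowly varying regressors while the cluster form does not. A sketch on simulated data (the DGP is a hypothetical illustration):

```python
import numpy as np

# White vs cluster-robust covariance for pooled OLS with error u_i + e_it
rng = np.random.default_rng(7)
N, T = 200, 5
x = rng.normal(size=(N, T))
y = 0.5 * x + rng.normal(size=(N, 1)) + rng.normal(size=(N, T))

X = np.column_stack([np.ones(N * T), x.ravel()])      # rows grouped by individual
b = np.linalg.lstsq(X, y.ravel(), rcond=None)[0]
e = y.ravel() - X @ b
XtX_inv = np.linalg.inv(X.T @ X)

# White: sum over all it of e_it^2 x_it x_it'
white = XtX_inv @ (X.T * e**2) @ X @ XtX_inv
# Cluster: sum over i of the outer product of the group score sums
S = np.zeros((2, 2))
for i in range(N):
    g = X[i * T:(i + 1) * T].T @ e[i * T:(i + 1) * T]
    S += np.outer(g, g)
cluster = XtX_inv @ S @ XtX_inv
print(np.sqrt(np.diag(white)), np.sqrt(np.diag(cluster)))
```

The cluster variance for the intercept is several times the White variance here because u_i is common within each group; for the slope, whose regressor is uncorrelated over t, the two are close.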

  45. Part 7: Regression Extensions [ 45/41] White Estimator for OLS
      Var[b|X] = [Σ_i X_i'X_i]^{-1} [Σ_i X_i'Ω_iX_i] [Σ_i X_i'X_i]^{-1},  Ω_i = E[w_i w_i'|X]
      In the spirit of the White estimator, use
      Est.Var[b|X] = [Σ_i X_i'X_i]^{-1} [Σ_i X_i'ŵ_iŵ_i'X_i] [Σ_i X_i'X_i]^{-1},  ŵ_i = y_i - X_i b.
      Hypothesis tests are then based on Wald statistics.

  46. Part 7: Regression Extensions [ 46/41] Generalized Least Squares
      β̂ = [Σ_{i=1}^N X_i'Ω_i^{-1}X_i]^{-1} [Σ_{i=1}^N X_i'Ω_i^{-1}y_i]
      Ω_i^{-1} = (1/σ_εi²) [I_Ti - (σ_ui²/(σ_εi² + T_i σ_ui²)) ii']
      (Depends on i through σ_εi², σ_ui², and T_i.)
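The Ω_i^{-1} above is equivalent to the familiar partial demeaning (theta) transformation, which makes GLS a one-line operation once the variance components are in hand. A sketch with known variances and a common T (a hypothetical DGP; feasible GLS would replace the true variances with estimates):

```python
import numpy as np

# Random effects GLS via partial demeaning with theta = 1 - sqrt(s_e^2/(s_e^2 + T*s_u^2))
rng = np.random.default_rng(8)
N, T, b = 300, 4, 1.0
s_u, s_e = 1.0, 0.5
x = rng.normal(size=(N, T))
y = b * x + s_u * rng.normal(size=(N, 1)) + s_e * rng.normal(size=(N, T))

theta = 1.0 - np.sqrt(s_e**2 / (s_e**2 + T * s_u**2))
xs = x - theta * x.mean(axis=1, keepdims=True)   # partial group mean deviations
ys = y - theta * y.mean(axis=1, keepdims=True)
b_gls = (xs * ys).sum() / (xs * xs).sum()        # OLS on transformed data = GLS
print(round(b_gls, 2))  # close to 1.0
```

With θ = 0 this is pooled OLS and with θ = 1 it is the within estimator; GLS sits between the two according to the variance ratio.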

  47. Part 7: Regression Extensions [ 47/41] Maximum Likelihood
      Let σ_εi² = σ_ε² exp(θ'z_i), σ_ui² = σ_u² exp(δ'h_i), R_i = 1 + T_i σ_ui²/σ_εi², Q_i = (σ_ui²/σ_εi²)/R_i.
      logL = -(1/2) Σ_i [ (1/σ_εi²)(ε_i'ε_i - Q_i (Σ_t ε_it)²) + logR_i + T_i logσ_εi² + T_i log2π ]
      This can be maximized using ordinary optimization methods: treat it as a standard nonlinear optimization problem and solve with iterative, gradient methods. Is there much benefit in doing this? Why would one do this?

  48. Part 7: Regression Extensions [ 48/41] Conclusion: Heteroscedasticity in the Effects. Choose robust OLS or simple FGLS with moments based variances. Note the advantage of panel data: individual specific variances. As usual, the payoff is a function of (a) the variance of the variances and (b) the extent to which the variances are correlated with the regressors. MLE and specific models for the variances probably don't pay off much unless the model(s) for the variances is (are) of specific interest.

  49. Part 7: Regression Extensions [ 49/41] Generalized Regression: Accommodating Autocorrelation (and Heteroscedasticity)
      Fixed Effects: y_it = β'x_it + d_i'α + ε_it, or y_i = X_iβ + D_iα + ε_i
      Var[ε_i | X_i, D_i] = Ω_i (dimension T×T), positive definite.
      Random Effects: y_it = β'x_it + u_i + ε_it
      Var[u_i i + ε_i | X_i] = Ω_i + σ_u² ii' = Σ_i (dimension T×T).

  50. Part 7: Regression Extensions [ 50/41] OLS Estimation
      Fixed Effects: y_i = [X_i, D_i](β',α')' + ε_i = Z_i^F γ^F + w_i^F
      Random Effects: y_i = X_iβ + (u_i i + ε_i) = Z_i^R γ^R + w_i^R
      Least squares coefficient estimator, M = FE or RE:
      γ̂_M = [Σ_{i=1}^N Z_i^M'Z_i^M]^{-1} [Σ_{i=1}^N Z_i^M'y_i]
      Cluster robust covariance matrix based on the White estimator:
      Est.Asy.Var[γ̂_M] = [Σ_i Z_i^M'Z_i^M]^{-1} [Σ_i Z_i^M'ŵ_i^M ŵ_i^M'Z_i^M] [Σ_i Z_i^M'Z_i^M]^{-1}
      ŵ_i^M = vector of T least squares residuals for group i.
