NASCAR Lead Changes 1975-1979: Negative Binomial Regression Insights

This deck applies negative binomial regression to the number of lead changes in NASCAR races from 1975 to 1979, using three predictors: laps, drivers, and track length. All three predictors are highly significant. Goodness-of-fit testing shows that the data are overdispersed relative to a Poisson model, while the negative binomial model is consistent with the observed variance within race groups.

  • NASCAR
  • Regression Analysis
  • Lead Changes
  • Statistical Modeling
  • Goodness-of-Fit


Presentation Transcript


  1. Negative Binomial Regression: NASCAR Lead Changes, 1975-1979

  2. Data Description
     Units: 151 NASCAR races during the 1975-1979 seasons.
     Response: number of lead changes in a race.
     Predictors:
       • Number of laps in the race
       • Number of drivers in the race
       • Track length (circumference, in miles)
     Models:
       • Poisson (assumes E(Y) = V(Y))
       • Negative binomial (allows V(Y) > E(Y))

  3. Poisson Regression
     Random component: Poisson distribution for the number of lead changes.
     Systematic component: linear function of the predictors Laps, Drivers, Trklength.
     Link function: log, g(μ) = ln(μ).

     Mass function:

        P(Y = y | X1, X2, X3) = e^(-μ) μ^y / y!,   y = 0, 1, 2, ...

     Link and linear predictor:

        g(μ) = ln(μ) = α + β1 X1 + β2 X2 + β3 X3 = x'β,   so   μ = e^(x'β)
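As a quick numerical check of the model's defining property, the sketch below (plain Python; the coefficient and predictor values are made up for illustration, not the fitted estimates) evaluates the Poisson pmf under a log link and confirms that the mean and variance agree, i.e. E(Y) = V(Y):

```python
import math

def poisson_pmf(y, mu):
    """P(Y = y) = e^(-mu) * mu^y / y! for the Poisson model."""
    return math.exp(-mu + y * math.log(mu) - math.lgamma(y + 1))

# Log link: mu = exp(x'beta).  These coefficient and predictor values are
# illustrative placeholders; the fitted estimates appear on the next slide.
beta = [-0.5, 0.002, 0.05, 0.6]   # intercept, laps, drivers, track length
x = [1.0, 200.0, 30.0, 2.5]
mu = math.exp(sum(b * xi for b, xi in zip(beta, x)))

# Truncated sums: the pmf sums to 1, and mean and variance both equal mu.
total = sum(poisson_pmf(y, mu) for y in range(500))
mean = sum(y * poisson_pmf(y, mu) for y in range(500))
var = sum((y - mean) ** 2 * poisson_pmf(y, mu) for y in range(500))
```

The equidispersion E(Y) = V(Y) checked here is exactly the assumption the negative binomial model later relaxes.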

  4. Regression Coefficients (Z-tests)

     Parameter   Estimate   Std Error   Z       P-value
     Intercept   -0.4903    0.2178     -2.25    .0244
     Laps         0.0021    0.0004      5.15   <.0001
     Drivers      0.0516    0.0057      9.09   <.0001
     Trklength    0.6104    0.0829      7.36   <.0001

     Note: all predictors are highly significant. Holding the other factors constant, lead changes increase with the number of laps, the number of drivers, and the track length.

     Fitted equation:

        μ̂ = e^(-0.4903 + 0.0021 L + 0.0516 D + 0.6104 T)
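The fitted equation can be evaluated directly. The race below is hypothetical (200 laps, 35 drivers, 2.5-mile track); only the coefficients come from the slide:

```python
import math

def poisson_mean(laps, drivers, trklength):
    """Fitted Poisson mean from the slide's coefficient estimates."""
    return math.exp(-0.4903 + 0.0021 * laps + 0.0516 * drivers
                    + 0.6104 * trklength)

# Hypothetical race: 200 laps, 35 drivers, 2.5-mile track.
mu_hat = poisson_mean(200, 35, 2.5)

# Each coefficient is positive, so increasing any factor raises the
# expected number of lead changes, as noted on the slide.
more_laps = poisson_mean(300, 35, 2.5)
more_drivers = poisson_mean(200, 40, 2.5)
longer_track = poisson_mean(200, 35, 2.66)
```

For this hypothetical race the model predicts roughly 26 lead changes.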

  5. Testing Goodness-of-Fit
     Break the races into 10 groups of approximately equal size based on their fitted values. The Pearson residuals are computed as

        e_i = (observed - fitted) / sqrt(fitted) = (Y_i - Ŷ_i) / sqrt(Ŷ_i),   X² = Σ_i e_i²

     Under the hypothesis that the model is adequate, X² is approximately chi-square with 10 - 4 = 6 degrees of freedom (10 cells, 4 estimated parameters). The critical value for an α = 0.05 level test is 12.59. The data (next slide) are clearly not consistent with the model. Note that the variances within each group are several times larger than the means.

  6. Testing Goodness-of-Fit

     Range     0-9.4  9.4-10.5  10.5-11.6  11.6-20  20-21  21-23  23-26  26-32  32-36  36+      Total
     #Races    15     14        14         17       19     15     16     16     11     13       151
     obs       113    138       178        321      485    191    353    491    349    574
     fit       131.3  150.6     157.1      274.4    390.3  328.7   397.1  452.9  374.2  536.4
     Pearson   -1.60  -1.03     1.67       2.81     4.79   -7.60  -2.21  1.79   -1.30  1.62     X² = 107.4
     Mean      7.53   9.20      12.71      18.88    25.53  12.73   22.06  30.69  31.73  44.15
     Variance  23.41  34.46     41.30      56.36    89.93  48.21   74.33  183.70 201.82 229.47

     Since 107.4 >> 12.59, the data are not consistent with the Poisson model.
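The X² statistic on this slide can be reproduced from the obs and fit rows alone:

```python
# obs and fit per cell, copied from the slide's table.
obs = [113, 138, 178, 321, 485, 191, 353, 491, 349, 574]
fit = [131.3, 150.6, 157.1, 274.4, 390.3, 328.7, 397.1, 452.9, 374.2, 536.4]

# Pearson residuals e = (obs - fit) / sqrt(fit), summed in squares.
resid = [(o - f) / f ** 0.5 for o, f in zip(obs, fit)]
X2 = sum(e * e for e in resid)

# Compare with the chi-square(6) critical value at alpha = 0.05.
reject_poisson = X2 > 12.59
```

The sum recovers X² ≈ 107.4, far beyond the 12.59 cutoff, so the Poisson model is rejected.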

  7. Negative Binomial Regression
     Random component: negative binomial distribution for the number of lead changes.
     Systematic component: linear function of the predictors Laps, Drivers, Trklength.
     Link function: log, g(μ) = ln(μ).

     Mass function (mean μ, dispersion k):

        P(Y = y | X1, X2, X3) = [Γ(y + k) / (Γ(k) y!)] (μ/(μ + k))^y (k/(μ + k))^k,   y = 0, 1, 2, ...

     Moments:

        E(Y) = μ,   V(Y) = μ(1 + μ/k) = μ + μ²/k

     Link and linear predictor:

        g(μ) = ln(μ) = α + β1 X1 + β2 X2 + β3 X3 = x'β,   so   μ = e^(x'β)
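A small sketch of this parameterization (mean μ, dispersion k, so V(Y) = μ + μ²/k), checking the moment formulas numerically for arbitrary illustration values μ = 20 and k = 5, not estimates from the data:

```python
import math

def nb_pmf(y, mu, k):
    """Negative binomial pmf with mean mu and dispersion k: V(Y) = mu + mu**2/k."""
    return math.exp(math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
                    + k * math.log(k / (k + mu)) + y * math.log(mu / (k + mu)))

# Arbitrary illustration values.
mu, k = 20.0, 5.0
ys = range(2000)                 # truncated support; the tail is negligible here
mean = sum(y * nb_pmf(y, mu, k) for y in ys)
var = sum((y - mean) ** 2 * nb_pmf(y, mu, k) for y in ys)
# mean should be ~mu, var ~mu + mu**2/k: overdispersed relative to Poisson.
```

With μ = 20 and k = 5 the variance is μ + μ²/k = 100, five times the mean, which is the kind of overdispersion the Poisson model cannot accommodate.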

  8. Regression Coefficients (Z-tests)
     Note that SAS and Stata estimate 1/k in this model.

     Parameter   Estimate   Std Error   Z       P-value
     Intercept   -0.5038    0.4616     -1.09    .2752
     Laps         0.0017    0.0009      2.01    .0447
     Drivers      0.0597    0.0143      4.17   <.0001
     Trklength    0.5153    0.1636      2.87    .0041
     1/k          0.1905    0.0294

     Fitted equation and variance function:

        μ̂ = e^(-0.5038 + 0.0017 L + 0.0597 D + 0.5153 T)

        V̂(Y) = μ̂ + 0.1905 μ̂²
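Evaluating the fitted mean and variance for a hypothetical race (200 laps, 35 drivers, 2.5-mile track; only the coefficients and dispersion estimate come from the slide) shows the overdispersion the model now allows:

```python
import math

ALPHA = 0.1905   # the estimated 1/k from the slide

def nb_mean(laps, drivers, trklength):
    """Fitted negative binomial mean from the slide's coefficient estimates."""
    return math.exp(-0.5038 + 0.0017 * laps + 0.0597 * drivers
                    + 0.5153 * trklength)

def nb_variance(mu):
    """V(Y) = mu + 0.1905 * mu**2: the variance grows faster than the mean."""
    return mu + ALPHA * mu ** 2

# Hypothetical race: 200 laps, 35 drivers, 2.5-mile track.
mu_hat = nb_mean(200, 35, 2.5)            # close to the Poisson fitted mean
ratio = nb_variance(mu_hat) / mu_hat      # variance-to-mean ratio, well above 1
```

The predicted mean is close to the Poisson fit (about 25 lead changes), but the modeled variance is several times the mean rather than equal to it.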

  9. Goodness-of-Fit Test
     Pearson residuals:

        e_i = (Y_i - Ŷ_i) / sqrt(V̂(Y_i)),   where V̂(Y_i) = μ̂_i + μ̂_i²/k̂,   X² = Σ_i e_i²

     Range     0-9.4  9.4-10.5  10.5-11.6  11.6-20  20-21  21-23  23-26  26-32  32-36  36+      Total
     #Races    13     17        20         11       21     12     18     16     14     9        151
     obs       96     155       248        251      523    141    442    470    445    422
     fit       111.4  170.2     223.3      202.4    431.5  261.8  452.0  464.3  485.3  397.5
     Pearson   -0.310 -0.202    0.251      0.543    0.483  -1.047 -0.050 0.028  -0.189 0.140    X² = 1.88
     Mean      7.38   9.12      12.40      22.82    24.90  11.75  24.56  29.38  31.79  46.89
     S.D.      4.22   6.10      5.87       5.53     9.55   6.44   10.98  14.83  14.12  13.82

     Clearly this model fits better than the Poisson regression model. For the negative binomial model, the SD/mean ratio is estimated as sqrt(1/k) = sqrt(0.1905) ≈ 0.44. For these 10 cells the observed ratios range from 0.24 to 0.67, consistent with that value.
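The SD/mean comparison can be reproduced from the Mean and S.D. rows of the table:

```python
# Mean and S.D. rows copied from the slide's table.
means = [7.38, 9.12, 12.40, 22.82, 24.90, 11.75, 24.56, 29.38, 31.79, 46.89]
sds = [4.22, 6.10, 5.87, 5.53, 9.55, 6.44, 10.98, 14.83, 14.12, 13.82]

# Observed SD/mean per cell versus the model-implied large-mu value sqrt(1/k).
ratios = [s / m for s, m in zip(sds, means)]
implied = 0.1905 ** 0.5          # ~0.44 from the fitted dispersion
lo, hi = min(ratios), max(ratios)
# The ten cell ratios bracket the model-implied value.
```

The minimum and maximum ratios come out near 0.24 and 0.67, bracketing the fitted sqrt(1/k), as the slide notes.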

  10. Computational Aspects - I
     k is restricted to be positive, so we estimate k* = log(k), which can take on any real value. Note that software packages reporting 1/k are estimating the same dispersion parameter on a different scale.

     Likelihood contribution of observation i (with k = e^(k*) and μ_i = e^(x_i'β)):

        L_i = [Γ(y_i + k) / (Γ(k) y_i!)] μ_i^(y_i) k^k / (μ_i + k)^(y_i + k)

     Using Γ(y + k)/Γ(k) = k(k + 1)···(k + y - 1) = Π_{j=0}^{y-1} (k + j), the log-likelihood contribution is

        l_i = Σ_{j=0}^{y_i - 1} ln(e^(k*) + j) - ln(y_i!) + k* e^(k*) + y_i x_i'β - (y_i + e^(k*)) ln(e^(x_i'β) + e^(k*))
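The identity Γ(y + k)/Γ(k) = Π_{j=0}^{y-1}(k + j) behind the expanded log-likelihood can be verified numerically. The sketch below computes l_i both ways (via log-gamma functions and via the finite sum) for arbitrary test values:

```python
import math

def loglik_gamma(y, mu, kstar):
    """l_i written with log-gamma functions."""
    k = math.exp(kstar)
    return (math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
            + k * math.log(k) + y * math.log(mu) - (y + k) * math.log(mu + k))

def loglik_expanded(y, mu, kstar):
    """l_i written with the finite sum over j, as on the slide."""
    k = math.exp(kstar)
    return (sum(math.log(k + j) for j in range(y)) - math.lgamma(y + 1)
            + kstar * k + y * math.log(mu) - (y + k) * math.log(mu + k))

# Arbitrary (y, mu, k*) test values; the two forms agree, including at y = 0,
# where the finite sum is empty.
checks = [(0, 5.0, -0.5), (7, 12.3, 0.4), (30, 22.0, 1.2)]
diffs = [abs(loglik_gamma(*c) - loglik_expanded(*c)) for c in checks]
```

In practice the lgamma form is preferable for large y, but the finite-sum form is what makes the derivatives on the next slide tractable by hand.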

  11. Computational Aspects - II
     Derivatives with respect to k* and β (writing μ_i = e^(x_i'β)):

        ∂l_i/∂k* = e^(k*) [ Σ_{j=0}^{y_i - 1} 1/(e^(k*) + j) + 1 + k* - ln(e^(k*) + μ_i) - (y_i + e^(k*))/(e^(k*) + μ_i) ]

        ∂²l_i/∂k*² = ∂l_i/∂k* + e^(k*) [ 1 - e^(k*) Σ_{j=0}^{y_i - 1} 1/(e^(k*) + j)² - e^(k*)/(e^(k*) + μ_i) + e^(k*)(y_i - μ_i)/(e^(k*) + μ_i)² ]

        ∂l_i/∂β = [ e^(k*)(y_i - μ_i)/(e^(k*) + μ_i) ] x_i

        ∂²l_i/∂k*∂β = [ e^(k*) μ_i (y_i - μ_i)/(e^(k*) + μ_i)² ] x_i

        ∂²l_i/∂β∂β' = - [ e^(k*) μ_i (e^(k*) + y_i)/(e^(k*) + μ_i)² ] x_i x_i'
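These analytic derivatives can be checked against central finite differences. The sketch below does so for one observation with arbitrary test values (a two-predictor x and made-up β and k*, not the NASCAR estimates):

```python
import math

def loglik(y, x, beta, kstar):
    k = math.exp(kstar)
    mu = math.exp(sum(b * xi for b, xi in zip(beta, x)))
    return (math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
            + k * math.log(k) + y * math.log(mu) - (y + k) * math.log(mu + k))

def score_beta(y, x, beta, kstar):
    """dl/dbeta = e^{k*} (y - mu) x / (e^{k*} + mu)."""
    k = math.exp(kstar)
    mu = math.exp(sum(b * xi for b, xi in zip(beta, x)))
    return [k * (y - mu) * xi / (k + mu) for xi in x]

def score_kstar(y, x, beta, kstar):
    """dl/dk*, the first formula on the slide."""
    k = math.exp(kstar)
    mu = math.exp(sum(b * xi for b, xi in zip(beta, x)))
    return k * (sum(1.0 / (k + j) for j in range(y)) + 1.0 + kstar
                - math.log(k + mu) - (y + k) / (k + mu))

# Central finite differences on arbitrary test values.
y, x, beta, kstar, h = 9, [1.0, 0.5], [1.2, 0.8], 0.3, 1e-6
num_b0 = (loglik(y, x, [beta[0] + h, beta[1]], kstar)
          - loglik(y, x, [beta[0] - h, beta[1]], kstar)) / (2 * h)
num_k = (loglik(y, x, beta, kstar + h)
         - loglik(y, x, beta, kstar - h)) / (2 * h)
```

Agreement between the numerical and analytic derivatives is a standard sanity check before plugging gradients into Newton-Raphson.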

  12. Computational Aspects - III
     Newton-Raphson algorithm. Let g be the gradient vector (elements ∂l/∂β and ∂l/∂k*) and G the matrix of second derivatives, each summed over observations.

     Step 1: Set k* = 0 (k = 1) and iterate β^(i) = β^(i-1) - G_β^(-1) g_β to obtain an estimate of β.
     Step 2: Holding β at the Step 1 estimate, iterate k*^(i) = k*^(i-1) - g_k*/G_k* to obtain an estimate of k*.
     Step 3: Use the results of Steps 1 and 2 as starting values and iterate jointly over θ = (β, k*): θ^(i) = θ^(i-1) - G^(-1) g.
     Step 4: Back-transform k* to get the estimate of k: k̂ = exp(k̂*).
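A minimal sketch of the Step 2 update: Newton-Raphson in k* alone, with μ treated as known, on a small made-up data set. Two departures from the slide are assumptions for stability: it starts from a method-of-moments value rather than k* = 0, and it adds simple step-halving to damp overshoots.

```python
import math

def loglik(ys, mu, kstar):
    k = math.exp(kstar)
    return sum(math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
               + k * math.log(k) + y * math.log(mu) - (y + k) * math.log(mu + k)
               for y in ys)

def score(ys, mu, kstar):
    """Sum over observations of dl_i/dk*."""
    k = math.exp(kstar)
    return sum(k * (sum(1.0 / (k + j) for j in range(y)) + 1.0 + kstar
                    - math.log(k + mu) - (y + k) / (k + mu)) for y in ys)

def hess(ys, mu, kstar):
    """Sum over observations of d2l_i/dk*2."""
    k = math.exp(kstar)
    total = 0.0
    for y in ys:
        g_i = k * (sum(1.0 / (k + j) for j in range(y)) + 1.0 + kstar
                   - math.log(k + mu) - (y + k) / (k + mu))
        total += g_i + k * (1.0 - k * sum(1.0 / (k + j) ** 2 for j in range(y))
                            - k / (k + mu) + k * (y - mu) / (k + mu) ** 2)
    return total

ys = [12, 25, 19, 31, 8, 22, 40, 15, 27, 18]   # made-up lead-change counts
mu = sum(ys) / len(ys)                          # mu treated as known here
var = sum((y - mu) ** 2 for y in ys) / len(ys)
kstar = math.log(mu ** 2 / (var - mu))          # method-of-moments start

for _ in range(100):
    g, G = score(ys, mu, kstar), hess(ys, mu, kstar)
    if abs(g) < 1e-10:
        break
    step = -g / G if G < 0 else 0.1 * g         # Newton step, uphill fallback
    while (loglik(ys, mu, kstar + step) < loglik(ys, mu, kstar)
           and abs(step) > 1e-12):
        step *= 0.5                             # damp any overshoot
    kstar += step

k_hat = math.exp(kstar)                         # Step 4: back-transform
```

The full Step 3 update works the same way, but with g and G stacked over (β, k*) and a matrix solve in place of the scalar division.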
