Impact Evaluation Methods in Education Interventions

Explore diverse methods for evaluating the impact of education interventions through case studies, challenges with Randomized Controlled Trials (RCTs), and opportunities for research in government settings. Understand the importance of identifying a valid comparison group as a pseudo-counterfactual in evaluation studies. Delve into quantitative and qualitative analysis techniques to gauge the effectiveness of educational programs.

  • Education
  • Impact Evaluation
  • Research Methods
  • Case Study
  • Government

Presentation Transcript


  1. Measuring the impact of education interventions
     Stephen Taylor, Stellenbosch, August 2015

  2. Plan
     • Locating impact evaluation
     • Menu of methods
     • Case study: a randomised experiment
     • Challenges with RCTs in education
     • Opportunities for research in government

  3. Locating impact evaluation
     • Qualitative work: Hoadley (2003, 2007); Ensor et al. (2009)
     • Systemic analysis with mixed methods: Taylor, Vinjevold & Muller (2003); Fleisch (2008)
     • Descriptive quantitative work: Reddy (2006); Taylor & Yu (2008); Spaull (2011)
     • Correlational analysis: Crouch & Mabogoane (1998); Van der Berg (2008); Gustafsson (2007); Spaull (2012); Shepherd (2011)
     • Moving toward causal quantitative analysis

  4. The evaluation problem: knowing the counterfactual
     • We can never observe the counterfactual: the two alternative scenarios for the same person or group.
     • So we must identify or construct a comparison group as a pseudo-counterfactual, that is, an estimate of the counterfactual.
     • The big question: when is a comparison group a valid estimate of the counterfactual? This is internal validity.
     • Selection bias (endogeneity), e.g. years of schooling and IQ, or libraries and learning outcomes. (A simulation of this bias follows below.)
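Not from the slides, but a minimal simulation may make the selection-bias point concrete: when schools that acquire libraries are systematically better functioning, a naive treated-vs-untreated comparison overstates the library effect, while random assignment recovers it. All numbers here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Latent "school functionality": schools that self-select into having a
# library tend to be better functioning to begin with.
functionality = rng.normal(0, 1, n)
has_library = (functionality + rng.normal(0, 1, n)) > 0.5

# True causal effect of a library is +2 score points; functionality
# independently adds 5 points.
scores = 50 + 2 * has_library + 5 * functionality + rng.normal(0, 10, n)

# Naive comparison confounds the library with functionality.
naive = scores[has_library].mean() - scores[~has_library].mean()

# Random assignment breaks the link between treatment and functionality.
randomised = rng.random(n) < 0.5
scores_rct = 50 + 2 * randomised + 5 * functionality + rng.normal(0, 10, n)
rct = scores_rct[randomised].mean() - scores_rct[~randomised].mean()

print(f"Naive estimate: {naive:.2f}")  # well above the true effect of 2
print(f"RCT estimate:   {rct:.2f}")    # close to 2
```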

  5. A menu of methods
     • Non-experimental (observed data): pre & post; simple difference; difference-in-differences; regression & matching; fixed effects
     • Experimental: RCT
     • Quasi-experimental: regression discontinuity design (RDD); instrumental variables (IV)
     (A toy difference-in-differences calculation follows below.)
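As an illustration of one item on the menu, here is a toy difference-in-differences calculation (not from the slides; all numbers invented). The treated group may start from a different level, but under the parallel-trends assumption the double difference isolates the treatment effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

treated = rng.random(n) < 0.5
# Treated schools start from a lower baseline; DiD tolerates selection on
# levels as long as both groups would have trended in parallel.
pre = 40 - 3 * treated + rng.normal(0, 5, n)
# A common time trend of +4 points, plus a true treatment effect of +2.
post = pre + 4 + 2 * treated + rng.normal(0, 5, n)

did = (post[treated].mean() - pre[treated].mean()) \
    - (post[~treated].mean() - pre[~treated].mean())
print(f"Difference-in-differences estimate: {did:.2f}")  # approximately 2
```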

  6. Case Study: The impact of study guides on matric performance: Evidence from a randomised experiment

  7. Background to the Mind the Gap study
     • The Mind the Gap study guides were developed during 2012.
     • They aim to help learners acquire the basic knowledge and skills necessary to pass the matric exam.
     • Distributed to schools in some parts of the country: mainly underperforming districts in the Eastern Cape and Northern Cape, a little in Gauteng and elsewhere, but not in Mpumalanga.
     • The impact evaluation uses 4 subjects in Mpumalanga: Accounting (ACCN), Economics (ECON), Geography (GEOG), Life Sciences (LFSC).

  8. The sampling frame
     • Started from the national list of schools enrolled for the matric 2012 examination.
     • Restricted to schools in Mpumalanga, then further restricted to schools registered to write the matric 2012 exam in English.
     • The final sampling frame consists of 318 schools.
     • Guides were randomly allocated to 79 schools (books were couriered, so delivery was reliable), leaving 239 control schools.
     • Books were delivered late in the year: September. (A sketch of the random allocation follows below.)
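The slides do not show the allocation code, but the described design (79 of 318 schools drawn at random) amounts to something like the following sketch; the file names and the 'emis' school-ID column are assumptions.

```python
import pandas as pd

# Hypothetical sampling frame: one row per eligible Mpumalanga school,
# identified by an assumed 'emis' column.
frame = pd.read_csv("sampling_frame.csv")
assert len(frame) == 318

# Draw 79 treatment schools without replacement; fixing the seed keeps the
# allocation reproducible and auditable.
treated_schools = frame.sample(n=79, random_state=2012)
frame["treated"] = frame["emis"].isin(treated_schools["emis"]).astype(int)

frame.to_csv("allocation.csv", index=False)
print(frame["treated"].value_counts())  # 79 treated, 239 control
```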

  9. Main results: OLS regressions with baseline
     To summarise: no significant impact in Accounting and Economics; impacts of roughly 2 percentage points in Geography and Life Sciences. (A sketch of the estimating regression follows below.)
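The slides report only the summary, but an "OLS regression with baseline" of this kind would typically look like the sketch below. Variable and file names are assumptions, as is the learner-level data structure; standard errors are clustered by school because treatment was assigned at the school level.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("matric_results.csv")  # hypothetical learner-level data

# Regress the 2012 exam score on treatment status, controlling for the
# school's 2011 mean score in the same subject (the baseline).
model = smf.ols("score_2012 ~ treated + school_mean_2011", data=df)

# Treatment was assigned at school level, so cluster by school.
res = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(res.summary())  # the 'treated' coefficient is the estimated impact
```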

  10. Heterogeneous effects

  11. Did impact vary by school functionality?
      [Figure: predicted score in 2012 plotted against school mean score in 2011, control vs. treatment, for Geography and Life Sciences.]
      (A sketch of the corresponding interaction regression follows below.)
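A common way to formalise what the figure shows is to interact treatment with baseline school performance: a positive interaction coefficient means better-functioning schools gained more from the guides. This is a sketch with assumed variable names, not the author's code.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("matric_results.csv")  # hypothetical learner-level data

# 'treated * school_mean_2011' expands to both main effects plus their
# interaction; the interaction term captures heterogeneity by functionality.
model = smf.ols("score_2012 ~ treated * school_mean_2011", data=df)
res = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(res.params[["treated", "treated:school_mean_2011"]])
```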

  12. Matric 2010 simulation
      • 5609: the number of children who did not pass matric in 2010 but would have passed had Mind the Gap been nationally available in Geography and Life Sciences.
      • This is roughly a 1 percentage point increase in the matric pass rate (5609 additional passes against a cohort of roughly 540,000 candidates).

  13. Interpreting the size of the impact
      • Very rough rule of thumb: 1 year of learning = 0.4 to 0.5 standard deviations of test scores.
      • Geography: 13.5% of a SD; Life Sciences: 14.4% of a SD. Roughly a third of a year of learning.
      • The unit cost per study guide (reflecting material development, printing and distribution) is estimated to be R41.82.

  14. MTG cost-effectiveness: 3.04 SD gained per $100, in the comparison framework of Kremer, Brannen & Glennerster (2013). (A back-of-the-envelope version of these conversions follows below.)
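The conversions behind slides 13 and 14 can be reproduced as a back-of-the-envelope calculation. The reported figures (13.5% and 14.4% of a SD, R41.82 per guide, 3.04 SD per $100) come from the slides; the rand/dollar rate is my assumption, which is why this sketch lands near, rather than exactly on, 3.04.

```python
# Figures as reported on the slides.
effect_sd = {"Geography": 0.135, "Life Sciences": 0.144}
sd_per_year = 0.45        # midpoint of the 0.4-0.5 rule of thumb
unit_cost_zar = 41.82     # per guide: development, printing, distribution
zar_per_usd = 8.5         # assumed approximate 2012 exchange rate

for subject, sd in effect_sd.items():
    print(f"{subject}: {sd:.3f} SD is about {sd / sd_per_year:.2f} years of learning")

# Standard deviations gained per $100 spent per learner reached.
cost_usd = unit_cost_zar / zar_per_usd
mean_sd = sum(effect_sd.values()) / len(effect_sd)
print(f"About {100 * mean_sd / cost_usd:.2f} SD per $100")
```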

  15. Interpretation of results
      • Two guides had no impact: interventions do not always affect the desired outcomes, and they are not uniform in effectiveness.
      • Possible explanations: the quality of the ACCN & ECON materials (or of the GEOG & LFSC materials)? Contextual factors predisposing GEOG & LFSC, but not ACCN & ECON, to show an impact?
      • A certain level of school functionality / managerial capacity may be needed for resources to be effective.
      • The timing of the delivery of the guides.
      • External validity: we are more certain about delivery in Mpumalanga than we would be if the programme were taken to scale; awareness campaigns could increase the impact at scale.

  16. Critiques of RCTs
      • External validity. Internal and external validity are jointly the necessary and sufficient conditions for a useful impact evaluation: internal validity = causal inference; external validity = transferability to the population of interest.
      • Context: geography, time, etc. (e.g. private schools, class size).
      • Special experimental conditions: Hawthorne effects, the implementation agent, system support.

  17. External validity: recommendations
      • Choose a representative and relevant study population.
      • Investigate heterogeneous impacts and intermediate outcomes.
      • Use a realistic (scalable) model of implementation and cost structure.
      • Work with government... but be careful.
      • No pre-test...? Or use administrative data (ANA & NSC provide an opportunity here for DBE collaboration).

  18. RCTs in education: practical challenges
      • Fundraising
      • Stakeholder engagement
      • Test development
      • Fieldwork quality
      • Project management

  19. Evaluations with government: advantages
      • Accountability: shifts the focus from inputs (e.g. number of teachers trained) to outcomes, and from form to function (away from mimicry).
      • Cooperation between government and other actors (researchers, NGOs, etc.).
      • Encourages policy-makers to interact with research and evidence.
      • Thinking about theories of change: shifts the focus from "did government programme X succeed or fail?" to "why?", including the agency of programme recipients to change behaviour.
      • Benefits for research: reduces publication bias.

  20. Evaluations in government: opportunities
      • Low-hanging RCTs: 1000 libraries; EGRA
      • RDD
      • Encouragement designs: online tools; winter schools
      • Good analysis of existing data: Grade R evaluation; LOLT paper

  21. Concluding thought
      • Broader benefits of an evaluation culture.
      • Not all programmes/policies can be subjected to a quantitative impact evaluation.
      • Theories of change; accountability; binding constraints.
      • Interaction between government and researchers.
