Impact Evaluations at IFAD-IOE: Fabrizio Felloni, Deputy Director


Presentation by Fabrizio Felloni, Deputy Director of the Independent Office of Evaluation of IFAD (IOE), on impact evaluations at IFAD-IOE, shared with the Independent Evaluation Division of UNIDO on 19 May 2017. It covers the evaluation processes and methodologies used to assess the impact of projects supported by IFAD.

  • IFAD
  • IOE
  • Impact Evaluations
  • Fabrizio Felloni
  • UNIDO




Presentation Transcript


  1. IMPACT EVALUATIONS AT IFAD-IOE. Fabrizio Felloni, Deputy Director, Independent Office of Evaluation of IFAD (IOE). Presentation to the Independent Evaluation Division, UNIDO, 19 May 2017.

  2. Background. In 2013, IFAD Management launched an impact assessment initiative (over 30 impact assessments in 2013-15) using quantitative techniques to attribute impact to project activities. The Executive Board requested IOE to review this initiative. Since 2013, IOE has conducted one impact evaluation per year to: (i) upgrade its technical skills; (ii) engage better in IFAD and external fora; and (iii) provide a hands-on assessment of the impact assessment initiative.

  3. Impact evaluations conducted so far:
  • Sri Lanka (2013): quantitative survey (~2,500 households) + focus group discussions and follow-up technical mission
  • India (2014): quantitative survey (~8,800 households) + focus group discussions
  • Mozambique: quantitative survey (~1,500 households) + focus group discussions
  • Ongoing, Georgia: quantitative survey (~4,000 households) + focus group discussions

  4. Characteristics. The evaluations cover all evaluation criteria: relevance, effectiveness, efficiency, impact, sustainability, gender equality, innovation, scaling up, natural resource management, climate change adaptation, and performance of partners (IFAD, Government). They include both quantitative and qualitative data collection and analysis, based on primary data collection.

  5. General approach. The quantitative part compares households with and without project support (treatment vs. non-treatment) and focuses on whether and what changed. The qualitative part focuses on understanding why (the mechanisms). A technical validation mission covers the other evaluation criteria and further validates the findings.

  6. Main constraints. Baseline surveys were non-existent or poor, or databases were missing, so difference-in-differences methods could not be used. Project beneficiaries were not selected at random; propensity score matching (non-parametric) and the Heckman selection procedure (parametric) were used to correct for sample selection bias, since neither strictly requires baseline data. Some recall questions were also included in the questionnaire. A matching sketch follows below.
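To make the matching step concrete, below is a minimal propensity score matching sketch. It is not IOE's actual code; the DataFrame columns (`treated`, `income`, and the covariates) are hypothetical placeholders for the household survey variables.

```python
# Minimal propensity score matching sketch (illustrative, not IOE's code).
# Assumes a pandas DataFrame `df` with a binary `treated` flag, household
# covariates, and an outcome column such as `income` (hypothetical names).
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def att_psm(df, covariates, treatment="treated", outcome="income"):
    # 1. Estimate the propensity score: P(treated | covariates).
    ps_model = LogisticRegression(max_iter=1000)
    ps_model.fit(df[covariates], df[treatment])
    df = df.assign(pscore=ps_model.predict_proba(df[covariates])[:, 1])

    treated = df[df[treatment] == 1]
    control = df[df[treatment] == 0]

    # 2. Match each treated household to its nearest control on the score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_controls = control.iloc[idx.ravel()]

    # 3. Average treatment effect on the treated (ATT).
    return treated[outcome].mean() - matched_controls[outcome].mean()

# Example usage with made-up covariate names:
# att = att_psm(df, covariates=["land_size", "hh_size", "education"])
```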

  7. Ideal situation (so far not found). Compare a sample of units of observation (persons, households) with and without the project, and observe the differences before and after (see the worked example below). [Chart: income plotted over time, from a 2000 baseline ("before") to 2006 ("after"), for households with and without the project; the gap between the two lines at the end point is the project's contribution, within the total change observed for participants.]
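The following toy calculation illustrates the double difference behind that chart. The income figures are made up for illustration and are not taken from any IFAD survey.

```python
# Toy difference-in-differences example. With a true baseline, the project's
# contribution is the change for "with project" households minus the change
# for "without project" households over the same period.
income = {
    ("with_project", "before"): 100.0,     # 2000
    ("with_project", "after"): 180.0,      # 2006
    ("without_project", "before"): 100.0,
    ("without_project", "after"): 140.0,
}

change_with = income[("with_project", "after")] - income[("with_project", "before")]          # 80
change_without = income[("without_project", "after")] - income[("without_project", "before")] # 40

did_estimate = change_with - change_without  # 40 attributed to the project
print(f"Total change for participants: {change_with}; project's contribution: {did_estimate}")
```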

  8. Other methodological issues:
  • Reconstruct a theory of change to highlight the causal chain and key assumptions.
  • Level of analysis: it is useful to foresee two questionnaires for the quantitative part, (i) at household level and (ii) at community-characteristics level; this helps capture village fixed effects on final outcomes/impacts (see the sketch below).
  • Beware of possible (i) spill-over effects (benefits spreading to non-treated groups) and (ii) contamination effects (an external programme affecting project results).
  • Model specification is time consuming, and the final statistical output may contain signs and levels of significance that you cannot explain.
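One way the village fixed effects point could be operationalized is sketched below. This is an assumption about how such a model might be specified, not IOE's actual specification, and the column names (`income`, `treated`, `hh_size`, `land_size`, `village`) are hypothetical.

```python
# Household-level outcome regression with village fixed effects (sketch).
import statsmodels.formula.api as smf

def fit_with_village_fe(df):
    # C(village) adds a dummy for each village, absorbing community-level
    # characteristics so that the `treated` coefficient reflects within-village
    # differences between supported and non-supported households.
    model = smf.ols("income ~ treated + hh_size + land_size + C(village)", data=df)
    # Cluster standard errors at the village level.
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["village"]})

# result = fit_with_village_fe(df)
# print(result.params["treated"], result.pvalues["treated"])
```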

  9. Practical organization. Data collection and analysis were conducted by consulting companies (national and international), but IOE retained full leadership of the evaluation (design of methodology, questionnaires, oversight of field-instrument testing, oversight of analysis, report drafting); it is better not to let an external company dictate the main methodological choices. IOE conducted two to three missions: (i) reconnaissance; (ii) oversight of instrument testing in the field; (iii) follow-up technical mission. Budget: US$200,000 all inclusive.

  10. Practical organization (continued). It is important to retain former project managers / senior staff as key informants (e.g. to check the sampling strategy), and for consulting companies to invest in training enumerators and in the quality of data collection (rather than in incentives to fill in many questionnaires). ICT tools can reduce coding time and the risk of faked interviews (via interview timing and GPS coordinates); a simple check is sketched below.
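As an illustration of the timing/GPS point, the sketch below flags interviews that were completed implausibly fast or recorded far from the sampled village. It is an assumption about how such a check could be written, not an IFAD tool, and the thresholds and column names are hypothetical.

```python
# Illustrative enumerator quality-control sketch using timestamps and GPS
# captured by digital data-collection software (hypothetical column names).
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometres.
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

def flag_suspect_interviews(df, min_minutes=20, max_km=5):
    duration = (df["end_time"] - df["start_time"]).dt.total_seconds() / 60
    distance = haversine_km(df["gps_lat"], df["gps_lon"],
                            df["village_lat"], df["village_lon"])
    return df.assign(
        too_fast=duration < min_minutes,  # possibly rushed or fabricated interview
        too_far=distance > max_km,        # recorded away from the sampled village
    )
```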

  11. Documents:
  • IFAD Manual: https://www.ifad.org/documents/10180/bfec198c-62fd-46ff-abae-285d0e0709d6
  • Sri Lanka: https://www.ifad.org/documents/10180/2b8f1c99-16be-4b30-969b-d93f03ccca41
  • India: https://www.ifad.org/evaluation/reports/impact_evaluation/tags/india/1063/7854528
  • Mozambique: https://www.ifad.org/evaluation/reports/impact_evaluation/tags/mozambique/1517/36805916
