Effective Planning for Evaluation Workshops

Learn how to plan and conduct your own evaluation workshop successfully, including developing outcome measures, choosing research methods, and identifying research questions. Explore the importance of preparation in research and the impact of COVID-19 on educational inequalities.

  • Evaluation
  • Workshop
  • Outcome Measures
  • Research Methods
  • COVID-19




Presentation Transcript


  1. Evaluation Workshop Step 2: Plan. A template for running your own evaluation workshop.

  2. Outline of session
     1. The Evaluation Cycle: Measure Stage
     2. Developing your outcome measures (+ collation)
     3. Choosing the right research method(s)
     4. Creating a research protocol
     5. Questions

  3. The evaluation cycle

  4. How this can help you: the case of Aimhigher. 35% of HEPs attributed increased applications to their institutions to Aimhigher, and the programme was associated with rising GCSE results. BUT there was no evidence that it actually impacted progression rates, particularly to selective universities, and it was discontinued in 2011. When running research, preparation is key: a clear analytical strategy, stating how the outcome variables will be defined and exactly how the analyses will be run, helps to focus on a small number of well-chosen and justified outcome measures. Developments like COVID-19 pose real threats to progress in addressing education inequalities. Now is the time to put methodological differences aside and start sharing insights to get our evaluative findings out there.

  5. Identifying research questions. These are the overarching questions that your evaluation will seek to answer; they determine the scope and approach of your evaluation.
     • Primary Research Question: the causal impact of your scheme. Template: Did [scheme] increase [main outcome] among [group]? Example: Did Summer School attendance improve enrolment rates among participants?
     • Secondary Research Question: focus on specific groups or intermediate outcomes. Template: Did [scheme] increase [main outcome/secondary outcome] among [group/subgroup]? Example: Did Summer School attendance improve enrolment rates among estranged students?
     • Process Evaluation Question: focus on the implementation and efficiency of your set-up. Examples: Was the initiative delivered the way we expected? Are we targeting the right students? What was the cost-effectiveness of the initiative?

  6. Outcome measures. "I'll know [outcome reached] when I see [indicator]." Observable indicators are those we can build into the evaluation and control for; e.g. demography, observed behaviour, measured attitudes. Unobservable indicators are those we can't observe and therefore can't build into the evaluative model; e.g. motivation, unobserved behaviour, unmeasured attitudes; anything that influences the outcome that we don't know about or can't measure.

  7. Common outcome measures

     KS4 (Years 10-11)
     • Objectives: raising attainment; engaging parents; increasing awareness of subject options at HE.
     • Indicators for Process Evaluation: Was the programme delivered as intended? Were students targeted correctly? Did students attend, and which students? Was the content delivered as intended by deliverers (academics, ambassadors etc.)? Participant experience as measured by evaluation survey; experience and perceptions of stakeholders, e.g. key influencers such as teachers, ambassadors, project leads, academics.
     • Indicators for Impact Evaluation (short-medium term): improved GCSE attainment, measured via GCSE attainment (through NPD).
     • Indicators for Impact Evaluation (long-term): A level attainment (through NPD/HEAT); progression to HE (HEAT); progression to top-third HEPs (HEAT).

     KS5 (Years 12-13)
     • Objectives: improving knowledge and confidence (academic skills, culture); increasing knowledge of how to apply to HE; improving attainment; increasing preparedness for study in HE.
     • Indicators for Process Evaluation: as for KS4.
     • Indicators for Impact Evaluation (short-medium term): pre-and-post common survey questions; pre-and-post attainment tests; offers received (from home HEP); successful offers (from home HEP).
     • Indicators for Impact Evaluation (long-term): A level attainment (through NPD); progression to HE (HEAT); progression to top-third HEPs (HEAT); progression to own HEP (HEAT & internal); graduate outcomes (GOS).

  8. Selecting the appropriate research method. [Methods map.] One axis runs from Behaviour (what people do, or how they act unconsciously) to Attitudes (self-reported preferences, recognition and memories); the other from Quantitative (How much? How many? Objective and comparable: patterns, structures, tendencies, measurement) to Qualitative (How? Why? Subjective: difference, meanings, context, experiences, perceptions). Methods plotted on the map include: eye tracking, observations, simulations, correlation analysis, documentary analysis, causal data analysis, journey mapping, lab experiments, participatory observation, closed surveys, structured, semi-structured and unstructured interviews, prompts and visuals, open surveys, and focus groups.

  9. Selecting the appropriate research method. Level 1: Monitor. Level 2: Compare. Level 3: Identify.

  10. Level 1: Theory of Change

  11. Level 2: Matched Comparison

  12. Level 3: Identifying causal evidence. "Hold everything constant" is a nice idea, but... Consider two applicants to the same scheme: both are 18, female, BAME, at the same school, with moderate GCSEs. Yet one likes her teachers, the other doesn't; one wants to be a midwife, the other isn't sure what she'd like to do; one has family support, the other doesn't; and one's study habits improved while the other's stayed the same.

  13. RCTs: how they help. Randomisation does not aim to hold everything constant. Instead, we aim to balance the treated and control groups in expectation (i.e. on average). In other words, although there may be no perfectly matched treated/control cases, on average the groups are the same, and therefore we expect the same outcomes from the groups in the absence of treatment.
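The "balance in expectation" idea can be sketched in a few lines of Python. The cohort, the covariate (a made-up "family support" score), and every number below are illustrative assumptions, not data from this deck:

```python
import random
import statistics

random.seed(42)

# Hypothetical cohort: each student carries a covariate we cannot
# "hold constant", e.g. a family-support score (mean 50, sd 15).
students = [random.gauss(50, 15) for _ in range(10_000)]

# Randomise into equally sized treatment and control groups.
random.shuffle(students)
treated, control = students[:5_000], students[5_000:]

# No individual treated/control pair is perfectly matched, but the
# group means land very close together: balance "in expectation".
gap = abs(statistics.mean(treated) - statistics.mean(control))
print(f"difference in mean support score: {gap:.2f}")  # tiny relative to the sd of 15
```

The same logic holds for covariates nobody measured: randomisation balances those too, on average, which is exactly what matching cannot guarantee.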

  14. RCTs: how they can help (cont.). [Diagram: the cohort is randomised into two balanced groups, one assigned to treatment and one serving as the control.]

  15. What this looks like in practice. Effect of Study Supporter on attendance rate: naïve, 8.4; naïve with covariates, 7.0; post-consent randomisation, 4.7.
     Other studies:
     • Wellness programmes: Jones, D., Molitor, D., & Reif, J. (2018). What Do Workplace Wellness Programs Do? Evidence from the Illinois Workplace Wellness Study (No. w24229). National Bureau of Economic Research.
     • Online advertising: Gordon, B. R., Zettelmeyer, F., Bhargava, N., & Chapsky, D. (2018). A comparison of approaches to advertising measurement: Evidence from big field experiments at Facebook. Retrieved from https://ssrn.com/abstract=3033144
     • Education Endowment Foundation (2020). Texting Students and Study Supporters. Available at: https://educationendowmentfoundation.org.uk/projects-and-evaluation/projects/texting-students-and-study-supporters/
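The gap between the naïve and randomised estimates above comes from selection: students who opt in differ from those who don't. A small simulation (every number invented; this does not reproduce the EEF study) shows how self-selection inflates a naïve comparison while randomisation recovers the true effect:

```python
import random
from statistics import mean

random.seed(7)

TRUE_EFFECT = 5.0  # assumed true gain in attendance rate, for illustration

# Hypothetical students: 'motivation' drives both signing up AND attendance,
# so opt-in participants are not comparable to non-participants.
population = []
for _ in range(20_000):
    motivation = random.gauss(0, 1)
    attendance = 70 + 8 * motivation + random.gauss(0, 5)
    population.append((motivation, attendance))

# Naive estimate: compare opt-in participants (the more motivated half)
# with everyone else. Motivation's effect gets bundled into the estimate.
participants = [a + TRUE_EFFECT for m, a in population if m > 0]
others = [a for m, a in population if m <= 0]
naive = mean(participants) - mean(others)

# RCT estimate: randomise across the whole pool, ignoring motivation.
assign = [random.random() < 0.5 for _ in population]
treated = [a + TRUE_EFFECT for (m, a), t in zip(population, assign) if t]
control = [a for (m, a), t in zip(population, assign) if not t]
rct = mean(treated) - mean(control)

print(f"naive: {naive:.1f}  rct: {rct:.1f}")  # naive far exceeds the true effect of 5
```

Adding covariates to the naïve comparison helps only for the confounders you measured; randomisation handles the ones you didn't.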

  16. RCTs aren't always right.
     • Research question: an RCT can only tell you whether something works, not how or why (mixed methods can help here).
     • Fidelity and validity: try to maintain protocol consistency within your intervention; this can sometimes be difficult and can sometimes reduce external validity.
     • Sample size: you need a reasonable number of participants to ensure that your groups are sufficiently balanced.
     • Ethics: if there is substantial, consistent, high-quality evidence that something is effective, it shouldn't be withheld from anyone who could benefit.
     • Data: collect it consistently and universally (administrative datasets are best).
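To put a number on the sample-size caveat above, a standard two-arm calculation can be sketched with the normal approximation. The 5-point effect and standard deviation of 20 are invented for illustration, not taken from the deck:

```python
from statistics import NormalDist


def sample_size_per_arm(effect, sd, alpha=0.05, power=0.8):
    """Participants needed per arm of a two-arm RCT (equal arms,
    normal approximation, two-sided test).

    effect: minimum detectable difference in the outcome
    sd:     standard deviation of the outcome
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for the test
    z_beta = z.inv_cdf(power)           # critical value for the power
    n = 2 * ((z_alpha + z_beta) * sd / effect) ** 2
    return int(n) + 1  # round up to whole participants


# E.g. detecting a 5-point gain in attendance rate (sd 20)
# needs roughly 250 students per arm:
print(sample_size_per_arm(effect=5, sd=20))
```

Halving the detectable effect quadruples the required sample, which is why small pilot cohorts rarely yield conclusive RCT results.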

  17. Why this matters to student success & outreach. Very little existing research offers causal evidence, and evaluation support is provided for free by TASO. Where programmes are oversubscribed, all that is required is a tweak to the recruitment process and the use of online tracking for HE progression data. Once this system is in place, it can be implemented over a number of years and the results aggregated (see next session).

  18. Evaluation Methods Framework

  19. Research Protocol. A Research Protocol is a written document that describes the overall approach that will be used throughout your initiative, including its evaluation. The protocol should be written as if it's going to end up in the hands of someone who knows very little about your organisation, the reason for the research, or the intervention.

  20. Lessons from the field
     1. Know the data journey
     2. Get to know the system and the people
     3. Look for SMART interventions
     4. Communicate and monitor regularly
     5. Use behavioural insights
     6. There is only so much you can do
     7. TASO is here to help

  21. Questions?
