Mixed Methods in Program Evaluation - Overview and Guidelines

An overview of mixed methods in program evaluation from Tom Chapel, Chief Evaluation Officer at the Centers for Disease Control and Prevention. The presentation covers the rationale, options, challenges, and criteria involved in using mixed methods, and applies these points to simple examples. It also reviews CDC's Evaluation Framework and Standards: engaging stakeholders, describing the program, and gathering credible evidence, with an emphasis on how the framework focuses design and data collection choices.

  • Program evaluation
  • Mixed methods
  • CDC
  • Evaluation framework
  • Stakeholders




Presentation Transcript


  1. Mixed Methods in Program Evaluation. Presented by Thomas J. Chapel, MA, MBA, Chief Evaluation Officer, Centers for Disease Control and Prevention. Tchapel@cdc.gov | 404-639-2116

  2. Agenda: (1) the why and how of mixed methods: rationale, options, challenges, and criteria for making choices; (2) apply these points to some simple examples.

  3. CDC's Evaluation Framework. Steps: engage stakeholders; describe the program; focus the evaluation design; gather credible evidence; justify conclusions; ensure use and share lessons learned. Standards: utility, feasibility, propriety, accuracy. The Standards apply especially when we're trying to make data collection choices.

  4. CDC's Evaluation Standards: utility, feasibility, propriety, accuracy. The Standards provide a quick and easy way to identify the two or three best data collection choices for this evaluation.

  5. CDC's Evaluation Framework (same steps and standards as slide 3). Note the wording of the evidence step: not "Collect data," not "Analyze data," but rather "Gather credible evidence."

  6. Steps 1-3 Help You Focus Design and Data Collection Choices. After the first three steps of the Evaluation Framework, we know which evidence will work for these stakeholders in this situation. Randomized controlled trials? Qualitative data? Quantitative data? Performance measures?

  7. CDC's Evaluation Standards (utility, feasibility, propriety, accuracy) help us narrow our data collection choices to the handful of methods that will work for this evaluation at this time.

  8. Mixed Methods. "Data collection methods that will work for this evaluation at this time" sometimes means surveys or focus groups alone. But sometimes there is no single best way; the best choice is a combination of methods, or "mixed methods."

  9. Six (Most) Common Ways to Collect Data: surveys, interviews, focus groups, document review, observation, and secondary data.

  10. How Standards Inform the Choice of Methods. Consider the context: How soon do I need the results? What resources can I use? Are there any ethical issues to consider? (Standards: utility, feasibility, propriety, accuracy.)

  11. How Standards Inform the Choice of Methods. Also consider the content: the sensitivity of the issue.

  12. How Standards Inform the Choice of Methods. Also consider the content: the Hawthorne effect. Will the act of being observed cause someone to distort their response?

  13. How Standards Inform the Choice of Methods. Also consider the content: validity and reliability.

  14. Mixed Methods Address Concerns. Key concept: regardless of the method, when there are validity and reliability concerns, using more than one method (i.e., mixed methods) will often help.

  15. Mixed Methods: Definition. "The combination of at least one qualitative and at least one quantitative component in a single research project or program" (Bergman 2008).

  16. Use Complementary Methods. Mixed methods means a combination of methods that has complementary strengths and non-overlapping weaknesses. The purpose is to supplement or complement the validity and reliability of the information.

  17. Strengths of Quantitative Methods: require less time than qualitative methods; cost less; permit researcher control; the data are considered scientific; validity and reliability are easier to explain; easily amenable to statistical analysis.

  18.-20. Strengths of Qualitative Methods. Choose qualitative methods when you are trying to: explore or describe a phenomenon; look for induction (i.e., surprise); identify patterns.

  21. Strengths of Qualitative Methods. Qualitative data can help you understand not just what but WHY.

  22. When to Use Mixed Methods. 1. Corroboration: better understanding and more credibility through triangulation, i.e., measuring the same thing from several different viewpoints. 2. Clarification: trying to understand why we got this result.

  23. When to Use Mixed Methods. Mixed methods are most commonly used for: 3. Explanation: similar to clarification; you want to know the why or what behind the situation. 4. Exploration: similar to explanation; charting new territory, trying to observe patterns, and examining different situations and varying results to induce patterns.

  24. Number of Project Facets Reported via Each Data Collection Method (chart). Source: Gregory Guest, PhD

  25. Number of Project Facets Reported via Each Data Collection Method This is an example of using a qualitative method (site visits) to corroborate a quantitative method (surveys). The result was increased validity of the data. Source: Gregory Guest, PhD

  26. Which to Choose? How do you choose which methods to use? Which method comes first, the quantitative or the qualitative? You have a lot of flexibility in these decisions.

  27. Parallel or Concurrent Mixed Methods (QUANTITATIVE + QUALITATIVE). For parallel or concurrent mixed methods, quantitative and qualitative data collection happen at the same time.

  28. Sequential Mixed Methods (QUANTITATIVE then QUALITATIVE, or QUALITATIVE then QUANTITATIVE). For sequential mixed methods, either quantitative or qualitative data collection can happen first.

  29. Example of Sequential Mixed Methods to Corroborate Data. In this case, the qualitative method (site visits) was used to corroborate the quantitative method (survey), and the results were different.

  30. Mixed Methods Is Your Choice. You are never required to use mixed methods. However, you may choose to use them when you have some indication that a single method may give you incorrect data or an incorrect perception of reality.

  31. Mixing Methods During Data Analysis. Qualitative data (focus groups, observations, secondary data, etc.) can be converted to numbers via quantitative techniques like content analysis. This is also a mixed methods design approach.

  32. Mixing Methods During Data Analysis Qualitative data can be very complex. Examining qualitative data with quantitative techniques helps to identify or validate patterns or themes.
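A minimal sketch of this "quantitizing" step, assuming invented focus-group excerpts and a hypothetical keyword codebook; a real content analysis would rely on a validated codebook and trained coders rather than keyword matching.

```python
from collections import Counter

# Hypothetical codebook: themes and the keywords that signal them.
# In practice these come from a validated codebook, not a guess.
CODEBOOK = {
    "access": ["hours", "distance", "transportation", "appointment"],
    "cost": ["afford", "copay", "insurance", "price"],
    "trust": ["trust", "listened", "respect", "rushed"],
}

def quantitize(excerpts):
    """Convert qualitative excerpts into theme counts (simple content analysis)."""
    counts = Counter({theme: 0 for theme in CODEBOOK})
    for text in excerpts:
        lowered = text.lower()
        for theme, keywords in CODEBOOK.items():
            if any(word in lowered for word in keywords):
                counts[theme] += 1
    return counts

# Invented focus-group excerpts, for illustration only.
excerpts = [
    "I could not afford the copay last month.",
    "The clinic hours never match my work schedule.",
    "The nurse really listened, so I trust the advice.",
]

print(quantitize(excerpts))  # theme frequencies, now usable in quantitative analysis
```

Once the qualitative material is reduced to counts like these, ordinary quantitative techniques can compare themes across sites, groups, or time periods.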

  33. Deciding When to Use Mixed Methods and How. Key concept: using mixed methods is a deliberate design decision. You use it when you don't trust the data from any single method. The reason for your uncertainty determines the methods you choose to mix and the order in which you use them.

  34. Example 1: Concurrent Design. Problem or purpose: validity. Example: Do people give similar responses in surveys and in focus groups? A survey (quantitative) and focus groups (qualitative) are conducted concurrently with similar participants.
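For illustration only, a sketch of the corroboration check this concurrent design implies, using invented survey scores and focus-group codes; the cutoffs for "overall positive" are assumptions, not part of the original example.

```python
from statistics import mean

# Invented data: Likert satisfaction scores from the survey (quantitative)
# and coded comments from the concurrent focus groups (qualitative).
survey_scores = [4, 5, 3, 4, 4, 5, 2, 4]
focus_group_codes = ["positive", "positive", "negative", "positive", "mixed"]

# Assumed cutoffs for calling each method's result "positive overall".
survey_positive = mean(survey_scores) >= 3.5
fg_positive = sum(c == "positive" for c in focus_group_codes) > len(focus_group_codes) / 2

if survey_positive == fg_positive:
    print("The two methods corroborate each other; the finding is more credible.")
else:
    print("The two methods diverge; dig deeper before drawing conclusions.")
```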

  35. Example 2: Explanatory Sequential Design. Problem or purpose: explain unexpected results. Example: use a qualitative method to explain unexpected ("blindside") results from a quantitative method. A survey (quantitative) is followed by focus groups (qualitative) to explain or better understand what's going on.
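A small sketch of one way the quantitative result might feed the qualitative follow-up in this design: flag the respondents behind the unexpected result so they can be invited to focus groups. The records, site labels, and cutoff are invented.

```python
# Invented survey records; site "B" and the satisfaction cutoff are assumptions.
survey = [
    {"id": 1, "site": "A", "satisfaction": 5},
    {"id": 2, "site": "A", "satisfaction": 4},
    {"id": 3, "site": "B", "satisfaction": 2},  # site B scores unexpectedly low
    {"id": 4, "site": "B", "satisfaction": 1},
    {"id": 5, "site": "B", "satisfaction": 2},
]

# Use the surprising quantitative result to pick focus-group invitees
# who can help explain the "why" behind it.
follow_up_ids = [r["id"] for r in survey if r["site"] == "B" and r["satisfaction"] <= 2]
print("Invite to follow-up focus groups:", follow_up_ids)  # -> [3, 4, 5]
```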

  36. Example 3: Exploratory Sequential Design. Problem or purpose: verify suspected patterns. Example: explore potential patterns with a qualitative method and then verify them with a quantitative follow-up. Focus groups (qualitative) come first to identify potential patterns; a survey (quantitative) then validates any patterns.
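A minimal sketch of the quantitative follow-up in this design, assuming a hypothetical "main barrier" survey question and invented responses; a real analysis would apply an appropriate statistical test rather than a simple count.

```python
from collections import Counter

# Focus groups (qualitative, done first) suggested that cost is the main barrier.
suspected_pattern = "cost"

# Invented responses to a follow-up survey question: "What is your main barrier?"
survey_responses = ["cost", "access", "cost", "cost", "trust",
                    "cost", "access", "cost", "cost", "access"]

counts = Counter(survey_responses)
top_barrier, top_n = counts.most_common(1)[0]
share = top_n / len(survey_responses)

if top_barrier == suspected_pattern:
    print(f"Survey supports the pattern: '{top_barrier}' named by {share:.0%} of respondents.")
else:
    print(f"Survey does not support it; the most common barrier was '{top_barrier}'.")
```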

  37.-39. Design Options Summary. You mix quantitative and qualitative methods in a different order depending on the presenting problem: to validate results, explain the unexpected, or explore new themes.

  40. Selected Resources (Page 1 of 2). Caracelli, V., and J. Greene (eds.). 1997. Advances in Mixed-Method Evaluation: The Challenges and Benefits of Integrating Diverse Paradigms. San Francisco, CA: Jossey-Bass. Creswell, J., and V. Plano Clark. 2010. Designing and Conducting Mixed Methods Research, 2nd edition. Thousand Oaks, CA: Sage Publications. Morse, J., and L. Niehaus. 2009. Mixed Method Design: Principles and Procedures. Walnut Creek, CA: Left Coast Press.

  41. Selected Resources (Page 2 of 2). Johnson, R. Burke, and L. Christensen. 2008. Evaluation Methods. www.southalabama.edu/coe/bset/johnson/. Plano Clark, V., and J. Creswell. 2008. The Mixed Methods Reader. Thousand Oaks, CA: Sage Publications. Teddlie, C., and A. Tashakkori. 2009. Foundations of Mixed Methods Research: Integrating Quantitative and Qualitative Approaches in the Social and Behavioral Sciences. Thousand Oaks, CA: Sage Publications.

  42. Recommended Resource. Creswell, J., and V. Plano Clark. 2010. Designing and Conducting Mixed Methods Research, 2nd edition. Thousand Oaks, CA: Sage Publications.

  43. The Community Tool Box. http://ctb.ku.edu, Chapter 37, Section 5: Collecting and Analyzing Data.

  44. End of Mixed Methods. Return to Webinar 4: Gathering Data, Developing Conclusions, and Putting Your Findings to Use, or return to the Evaluation Webinars home page.
