
Promoting MSP Effectiveness Through Evaluation - Strengthening STEM Partnerships
Enhance the quality of Mathematics and Science Partnership (MSP) projects by promoting effective evaluation practices. Learn how rigorous evaluation supports MSP projects' role in advancing STEM disciplines and careers. The TEAMS project focuses on building evaluation capacity and improving project effectiveness.
Presentation Transcript
John Sutton, Principal Investigators Meeting, MSP FY12, Washington, DC, December 16, 2013
Any opinions, suggestions, and conclusions or recommendations expressed in this presentation are those of the presenter and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content. This project is funded through the NSF Research, Evaluation and Technical Assistance (RETA) program (DRL 1238120).
Professional Learning Network for Mathematics and Science Partnership Projects: learn and share challenges and successes; improve skills; engage in reflective evaluation.
The goal of the TEAMS project is to strengthen the quality of MSP project evaluations and build the capacity of evaluators by strengthening their skills in evaluation design, methodology, analysis, and reporting.
Promoting MSP Effectiveness Through Evaluation MSP projects represent a major federal effort to support advancements in science, technology, engineering, and mathematics (STEM) disciplines and careers. Recognizing the vital role of evaluation in this national effort to promote STEM disciplines and careers, NSF MSP projects have an obligation to ensure their project evaluation is designed and conducted in a rigorous manner.
Promoting MSP Effectiveness Through Evaluation Regardless of funding sources, project evaluation plays a vital role in every Mathematics and Science Partnership (MSP) project by: Assessing the degree to which projects attain their goals and objectives; Advancing the field by sharing lessons learned and evaluation findings; and Improving the overall effectiveness of the project through formative evaluation.
Promoting MSP Effectiveness Through Evaluation Technical Evaluation Assistance in Mathematics and Science (TEAMS): Fosters increased understanding of evaluation design and implementation, in particular new and innovative methodologies. Promotes the use of longitudinal data systems in MSP evaluations. Strengthens the role of evaluation as a means of improving project effectiveness and contributing to the knowledge of the field.
Meeting the Needs of MSP Evaluation Technical Evaluation Assistance in Mathematics and Science (TEAMS): Works closely with the NSF staff to develop and implement strategies to encourage innovation and increased rigor in MSP evaluations. Conducts ongoing needs assessment to identify issues that pose challenges for the work of evaluators of MSP projects. Offers no-cost technical assistance to address these issues and challenges. Provides venues for MSP evaluators and project staff to share strategies and findings from MSP evaluations.
Meeting the Needs of MSP Evaluation Evaluation Approaches. Often, external evaluations provide: formative feedback to improve projects and suggest mid-course corrections; summative reporting of project outcomes and impacts; and project monitoring for accountability.
Resources to Inform Evaluation Institute of Education Sciences, U.S. Department of Education, and National Science Foundation. (2013). Common Guidelines for Education Research and Development. Washington, DC: IES and NSF. Frechtling, J. (2010). 2010 User-Friendly Handbook for Project Evaluation (REC 99-12175). Arlington, VA: National Science Foundation.
Resources to Inform Evaluation Heck, D.J. & Minner, D.D. (2010). Technical report: Standards of evidence for empirical research, math and science partnership knowledge management and dissemination. Chapel Hill, NC: Horizon Research, Inc. Guthrie, Wamae, Diepeveen, Wooding, & Grant. (2013). Measuring research: A guide to research evaluation frameworks and tools. RAND Europe.
Meeting the Needs of MSP Evaluation Research Types: Foundational Research; Early-Stage or Exploratory Research; Design and Development Research; Efficacy Research; Effectiveness Research; and Scale-Up Research. Each of these types of research has a different evaluation purpose and requires a different type of evaluation approach.
Meeting the Needs of MSP Evaluation Measuring Research: Key Rationales
Advocacy: Demonstrate the benefits of supporting research, enhance understanding of research and its processes among policymakers and the public, and make the case for policy and practice change.
Accountability: Show that money and other resources have been used efficiently and effectively, and hold researchers accountable.
Analysis: Understand how and why research is effective and how it can be better supported, feeding into research strategy and decision-making by providing a stronger evidence base.
Allocation: Determine where best to allocate funds in the future, making the best possible use of limited funding.
Meeting the Needs of MSP Evaluation Standards of Evidence Specify indicators for empirical evidence in six domains: 1. Adequate documentation 2. Internal validity 3. Analytic precision 4. Generalizability/external validity 5. Overall fit 6. Warrants for claims
Meeting the Needs of MSP Evaluation Results of Needs Assessment Survey 11/2013: pie chart of respondents' prior experience with MSP evaluations (no prior experience, a little experience, somewhat experienced, very experienced; reported shares of 15%, 17%, 22%, and 46%).
Meeting the Needs of MSP Evaluation Results of Needs Assessment Survey 11/2013: Challenge posed for each aspect of evaluation: Instrumentation (38%); Theory of Action and Logic Model (27%); Establishing Comparison Groups (24%); Evaluation Design (24%); Sampling (19%); Measurable Outcomes and Evaluation Questions (19%); Data Analysis Methodology (16%); Data Collection (16%); Reporting (14%).
Meeting the Needs of MSP Evaluation Results of Needs Assessment Survey 11/2013: Other evaluation challenges: instruments (for science and engineering; aligned to state standards; aligned to the content of the MSP); valid and reliable performance tasks; classroom observation protocols.
Meeting the Needs of MSP Evaluation Results of Needs Assessment Survey 11/2013: Where additional assistance is needed: comparison groups in rural settings; random groups/comparison groups; large enough sample size/strategies for random selection; evaluation design and measurable outcomes for new projects; data collection/statewide task; excessive evaluation of students and teachers.
Meeting the Needs of MSP Evaluation Strategic Plan Tasks
Task 1: Intranet (project internal storage and retrieval structure)
Task 2: Website (teams.mspnet.org)
Task 3: Outreach (ongoing communications)
Task 4: National Advisory Board (guidance and review)
Task 5: Help Desk (quick response to queries)
Task 6: Document Review (identify commonalities, develop resources)
Task 7: Webinars (topics to inform)
Meeting the Needs of MSP Evaluation Strategic Plan Tasks
Task 8: Communities of Practice (guided discussions around evaluation topics)
Task 9: Direct Technical Assistance (strategies and activities at the project level)
Task 10: National Conferences (presentations to inform others' work)
Task 11: Annual Meeting (focus on evaluation)
Task 12: Data Sources (information about data sets and their utility)
Task 13: Instrument Review (share information about what is being used, by whom, and for what)
Meeting the Needs of MSP Evaluation Principal Investigator Needs and Assistance
Task 3: Outreach. Principal Investigators receive TEAMS communications to know what resources and technical assistance are available; identify additional resources, templates, processes, and measures being used by the project for sharing with other MSP project PIs and evaluators; and communicate with TEAMS regarding specific project needs for information and technical assistance.
Task 5: Help Desk. Encourage project staff and evaluators to pose queries for TEAMS to respond to.
Task 6: Document Review. Based on PI review of reports, especially challenges identified by the evaluator, contact TEAMS staff for follow-up resources or technical assistance.
Meeting the Needs of MSP Evaluation Principal Investigator Needs and Assistance
Task 7: Webinars. Invitations are sent to PIs and evaluators to participate in webinars; identify topics for which webinars can be prepared and provided and communicate them to TEAMS; encourage your evaluator and project staff to present in or participate in offered webinars.
Task 8: Communities of Practice. Based on PI review of reports, especially challenges and needs identified by the individual project, recommend possible topics to TEAMS staff; consider participating and encourage project staff and the evaluator to participate in discussions.
Meeting the Needs of MSP Evaluation Principal Investigator Needs and Assistance
Task 9: Direct Technical Assistance. Based on insights and familiarity with the individual project, including review of reports, contact TEAMS staff for follow-up with specific technical assistance and resources; identify evaluation topics for which technical assistance could be provided to project staff and evaluators.
Task 10: National Conferences. Share information with TEAMS about upcoming presentations from your project, especially if related to evaluation; TEAMS staff could help post presentations to share interesting findings from the project.
Tier Definitions
Tier 1: Evaluators and researchers of projects other than NSF- and ED-funded MSP projects. Services: access to the website, which provides links to available evaluation research and resources, research briefs, and other TEAMS publications.
Tier 2: Evaluators of NSF- and ED-funded MSP projects and external evaluators of other projects. Services: Help Desk services (Task 5), webinars (Task 7), and communities of practice (Task 8).
Tier 3: Evaluators of NSF-funded MSP projects. Services: Annual Conference (Task 11).
Tier 4: Evaluators of NSF-funded MSP projects that are confronting specific challenges. Services: communities of practice specifically for Tier 4 projects with common needs (Tasks 8 and 9) and direct technical assistance (Task 9).
Meeting the Needs of MSP Evaluation Principal Investigator Needs and Assistance
Task 11: TEAMS Annual Meeting. Help identify changes in project staff; help identify specific projects to highlight and participate; help promote participation in meetings (allow resources to be used for this purpose).
Task 12: Data Sources. Identify projects that are using public databases in their reporting; share information about projects asking about the use of public databases.
Meeting the Needs of MSP Evaluation Principal Investigator Needs and Assistance
Task 13: Instrument Review. Contact TEAMS with queries regarding specific instruments for specific uses; share information with TEAMS regarding challenges encountered with instruments; identify and share unique instruments being used in the project; consider using instruments from other projects as appropriate.
Meeting the Needs of MSP Evaluation Principal Investigator Needs and Assistance In Summary, Principal Investigators can: Identify needs; Share information between projects and TEAMS; Encourage involvement; Facilitate communication; and Promote high quality evaluation approaches.
Meeting the Needs of MSP Evaluation Website (http://teams.mspnet.org) and Help Desk
Meeting the Needs of MSP Evaluation Instruments: Considerations. Using measures of established quality vs. alignment to the specific goals/approaches of the project:
o Internally developed and piloted instruments
o Externally developed and validated instruments
o Collection and analysis of teacher work from the PD
Meeting the Needs of MSP Evaluation Instruments: Benefits
o Internally developed instruments can help demonstrate that results were what was intended and promised.
o Externally validated instruments can help demonstrate that findings are credible and more broadly important.
o Use of multiple instruments provides triangulation of data for findings.
o Use of internally developed instruments and teacher work samples can help in refining the program and informing providers about participants' learning.
Meeting the Needs of MSP Evaluation Instruments: Lessons Learned
o As evaluation informs the project and the project evolves, instruments sometimes need to change.
o Modify instruments (adding and/or removing items over time) and align data sets after modifications to keep up with evolving project needs.
o Add new instruments or remove instruments when the initial instrumentation isn't providing appropriate data (e.g., on teacher knowledge).
o Verify instrument validity and reliability after modifications and include that information in reports.
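The last point above can be backed by a quick analysis step. The sketch below is not from the presentation; it is a minimal, hypothetical example (pandas, made-up file and item names) of recomputing Cronbach's alpha for a modified instrument so the reliability figure can be reported.

```python
# Minimal sketch: re-checking internal-consistency reliability (Cronbach's alpha)
# after an instrument has been modified. File and item names are hypothetical.
import pandas as pd


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are item scores."""
    items = items.dropna()
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


if __name__ == "__main__":
    # Hypothetical post-modification survey export with items q1..q10.
    responses = pd.read_csv("teacher_survey_v2.csv")
    alpha = cronbach_alpha(responses[[f"q{i}" for i in range(1, 11)]])
    print(f"Cronbach's alpha after modification: {alpha:.2f}")
```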
Meeting the Needs of MSP Evaluation Develop a Conceptual Model of the Project and Identify Key Evaluation Points. Theory of Action:
o Why this/hypothesis: based on an interpretation of current research
o Describes the experience of the intended audience: cognitively or behaviorally
o Expected outcome: if this, then this
Meeting the Needs of MSP Evaluation Develop a Conceptual Model of the Project and Identify Key Evaluation Points. Model components: inputs; activities; outputs; short-term outcomes; long-term outcomes; contextual factors.
Meeting the Needs of MSP Evaluation Example of Logic Model
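The logic-model graphic from the original slide is not reproduced in this transcript. As a purely illustrative sketch, using the component names from the previous slide and hypothetical example entries, a project team might capture the same structure in code for its evaluation workplan:

```python
# Illustrative only: the logic-model components from the previous slide as a
# simple data structure. The example entries are hypothetical, not from a real MSP.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    short_term_outcomes: list[str] = field(default_factory=list)
    long_term_outcomes: list[str] = field(default_factory=list)
    contextual_factors: list[str] = field(default_factory=list)


example = LogicModel(
    inputs=["MSP grant funds", "partner university faculty"],
    activities=["summer content institutes for teachers"],
    outputs=["80 teachers complete 60 hours of PD"],
    short_term_outcomes=["gains in teacher content knowledge"],
    long_term_outcomes=["improved student achievement in mathematics"],
    contextual_factors=["district curriculum adoption cycle"],
)
print(example)
```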
Meeting the Needs of MSP Evaluation Develop an Evaluation Plan: Steps
o Determine what type of design is required to answer the questions posed
o Select a methodological approach and data collection instruments
o Select a comparison group
o Determine the timing, sequencing, and frequency of data collection
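For the comparison-group step above, one common approach is simple random assignment. The sketch below is hypothetical (made-up school names, arbitrary seed) and only illustrates the idea; it is not a design recommended by TEAMS or the presenter.

```python
# Hypothetical sketch for the comparison-group step: randomly assigning
# participating schools to treatment or comparison. School names are made up.
import random

schools = ["Adams MS", "Baker MS", "Carver MS", "Dunbar MS",
           "Euclid MS", "Franklin MS", "Garfield MS", "Hughes MS"]

rng = random.Random(20131216)   # fixed seed so the assignment is reproducible
shuffled = schools[:]
rng.shuffle(shuffled)

half = len(shuffled) // 2
assignment = {"treatment": shuffled[:half], "comparison": shuffled[half:]}
print(assignment)
```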
Meeting the Needs of MSP Evaluation Develop Evaluation Questions and Define Measurable Outcomes: Steps
o Identify key stakeholders and audiences
o Formulate potential evaluation questions of interest to the stakeholders and audiences
o Define outcomes in measurable terms
o Prioritize and eliminate questions
Meeting the Needs of MSP Evaluation Conducting the Data Collection Considerations Obtain necessary clearances and permission. Consider the needs and sensitivities of the respondents. Make sure your data collectors are adequately trained and will operate in an objective, unbiased manner. Obtain data from as many members of your sample as possible. Cause as little disruption as possible to the ongoing effort.
Meeting the Needs of MSP Evaluation Analyzing the Data Considerations Check the raw data and prepare them for analysis. Conduct initial analysis based on the evaluation plan. Conduct additional analyses based on the initial results. Integrate and synthesize findings.
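As a hypothetical illustration of the sequence above (prepare the data, run the planned analysis, follow up, synthesize), the sketch below assumes a pre/post design with a comparison group; the file, column names, and tests are assumptions, not the presenter's method.

```python
# Hypothetical sketch of the analysis steps above. File/column names are assumed.
import pandas as pd
from scipy import stats

# 1. Check the raw data and prepare them for analysis.
df = pd.read_csv("teacher_assessment_scores.csv")
df = df.dropna(subset=["pre_score", "post_score", "group"])

# 2. Initial analysis based on the evaluation plan: pre/post gains for treatment.
treat = df[df["group"] == "treatment"]
t_stat, p_val = stats.ttest_rel(treat["post_score"], treat["pre_score"])
print(f"Treatment pre/post: t = {t_stat:.2f}, p = {p_val:.3f}")

# 3. Additional analysis based on the initial results: compare gains across groups.
df["gain"] = df["post_score"] - df["pre_score"]
t_stat2, p_val2 = stats.ttest_ind(df.loc[df["group"] == "treatment", "gain"],
                                  df.loc[df["group"] == "comparison", "gain"])
print(f"Treatment vs. comparison gains: t = {t_stat2:.2f}, p = {p_val2:.3f}")

# 4. Integrate and synthesize findings (e.g., descriptives for the report).
print(df.groupby("group")["gain"].describe())
```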
Meeting the Needs of MSP Evaluation Standards of Evidence and Brief Descriptions. Analytic Precision. Indicators: measurement validity/logic of the research process; reliable measures/trustworthy techniques; appropriate and systematic analysis. Description: the extent to which the findings of a study were generated from systematic, transparent, accurate, and thorough analyses.
Meeting the Needs of MSP Evaluation Standards of Evidence and Brief Descriptions. Analytic Precision. Indicators: unit of analysis issues; power; effect size; multiple instruments; multiple respondents; all results. Description: the extent to which the findings of a study were generated from systematic, transparent, accurate, and thorough analyses.
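Two of the indicators above, effect size and power, are quantities an evaluator can compute directly. The sketch below uses made-up summary statistics and statsmodels for the power calculation; it is illustrative only and not part of the original slide.

```python
# Illustrative sketch for two indicators above: effect size (Cohen's d) and
# statistical power. The means, SDs, and sample sizes are made-up numbers.
from math import sqrt
from statsmodels.stats.power import TTestIndPower

# Hypothetical summary statistics for treatment vs. comparison teachers.
mean_t, sd_t, n_t = 78.0, 10.0, 40
mean_c, sd_c, n_c = 72.0, 11.0, 38

# Cohen's d with a pooled standard deviation.
pooled_sd = sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
cohens_d = (mean_t - mean_c) / pooled_sd
print(f"Cohen's d = {cohens_d:.2f}")

# Power of a two-sample t-test to detect this effect at alpha = .05.
power = TTestIndPower().power(effect_size=cohens_d, nobs1=n_t,
                              ratio=n_c / n_t, alpha=0.05)
print(f"Estimated power = {power:.2f}")
```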
Meeting the Needs of MSP Evaluation Reporting the Findings: Considerations
o Background (context, sites, intervention, etc.)
o Evaluation study questions
o Evaluation procedures (description of measures used and their purposes)
o Study sites and sample demographics
o Data collection (administration, participant counts, timelines for acquiring data, etc.)
o Data analyses (what methods for what measures, limitations, missing data, etc.)
o Findings
o Conclusions (and recommendations)
Meeting the Needs of MSP Evaluation Standards of Evidence and Brief Descriptions. Generalizability/External Validity. Indicators: findings for whom; generalizable to a population or theory; generalizable to different contexts. Description: the extent to which you can come to conclusions about one thing (e.g., a population) based on information about another (e.g., a sample).
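As a small, hypothetical illustration of drawing a conclusion about a population from a sample, the sketch below computes a confidence interval for a proportion observed in a survey sample; the numbers are invented and the method (Wilson interval via statsmodels) is only one reasonable choice.

```python
# Small illustration of sample-to-population inference: a 95% confidence
# interval for a proportion observed in a sample. Numbers are made up.
from statsmodels.stats.proportion import proportion_confint

n_sampled_teachers = 120   # teachers surveyed (sample)
n_reporting_change = 84    # reported changing classroom practice

low, high = proportion_confint(n_reporting_change, n_sampled_teachers,
                               alpha=0.05, method="wilson")
print(f"Estimated proportion: {n_reporting_change / n_sampled_teachers:.2f} "
      f"(95% CI: {low:.2f} to {high:.2f})")
```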
Meeting the Needs of MSP Evaluation Disseminate the Information: Considerations
o The funding source(s)
o Potential funding sources
o Others involved with similar projects or areas of research
o Community members, especially those who are directly involved with the project or might be involved
o Members of the business or political community, etc.
Meeting the Needs of MSP Evaluation Standards of Evidence and Brief Descriptions. Warrants for Claims. Indicators: limitations; decay and delay of the effect; efficacy; conclusions/implications logically drawn from findings. Description: the extent to which the data interpretation, conclusions, and recommendations are justifiable based on the evidence presented.
Meeting the Needs of MSP Evaluation Evaluation Topics and Components to Consider. Evaluation design components: development of a conceptual model (logic model) of the program; development of evaluation questions and measurable outcomes. Evaluation topics: develop a logic model; identify contextual conditions; articulate goals clearly; define multiple achievement outcomes.
Meeting the Needs of MSP Evaluation Evaluation Topics and Components to Consider. Evaluation design components: development of the evaluation design; data management; collection of data. Evaluation topics: address shifting project and evaluation priorities; format measures (hard copy, electronic, etc.) and schedule administration; display data effectively.
Meeting the Needs of MSP Evaluation Evaluation Topics and Components to Consider. Evaluation design components: analysis of data; provision of information to interested audiences. Evaluation topics: conduct appropriate data analyses to respond to evaluation questions; report intended impact on various populations; report findings to different audiences.
Meeting the Needs of MSP Evaluation Ongoing Needs Assessment At your tables, please write down one or two anticipated evaluation challenges and/or areas where your project may need assistance with project/program evaluation.
Meeting the Needs of MSP Evaluation What Questions Do You Have Regarding TEAMS? TEAMS contact information: teams.mspnet.org