
Assessing Program Effectiveness through Focus Group Data
Explore how to assess program effectiveness by tracking completers into their early careers, using data sources for improvement, and conducting focus group discussions. Learn about the importance of program improvement data, a multiple measures approach, and the impact on student learning. Discover how pre/post assessments on units of instruction play a vital role in evaluating program impact.
Presentation Transcript
Using Focus Group Data to Assess Program Effectiveness
Hillary Merk, Randy Hetherington, Bruce Weitzel, James Carroll, Jacqueline Waggoner
CAEP Definitions of Completer
CAEP uses two definitions of completer, one for Standard 1 and one for Standard 4:
- A term to embrace candidates exiting from preparation programs (Standard 1 glossary).
- Program completers who have been out of the program at least 6 months (Standard 4).
Objectives
- Share how we tell whether our initial licensure programs have adequately prepared our completers for today's classrooms through a process of tracking them into their early careers.
- Explore the sources of data that can be used for continuous program improvement.
- Discuss how data from focus group discussions with completers can be used as part of the continuous program improvement cycle.
Program Improvement Data
- EPPs must demonstrate that measures are reliable and valid.
- A minimum of three cycles of informative data must be gathered prior to an accreditation review.
- It takes time to implement curricular changes in response to identified areas for improvement.
- Curricular changes require continuous assessment of completer impact in P-12 schools.
Multiple Measures Approach
Develop an array of measures:
- Single measures may not portray a complete picture of the impact of our EPP.
- Cumulatively, they provide convincing evidence.
- Collectively, they reflect the impact of our initial licensure programs.
- They demonstrate program efficacy as it links to program knowledge and skills (curriculum).
Standard 4
- Impact on Student Learning
- Indicators of Teaching Effectiveness
- Satisfaction of Completers
- Satisfaction of Employers
Standard 4: Impact on Student Learning (Pre/Post Assessment on a Unit of Instruction)
- Pre/post assessments are used to measure specific, teacher-selected instructional units.
- Unit assessments were designed around matched pre/post assessments of students.
- The activity mimicked work the teachers had done twice while they were candidates in their preparation program.
Standard 4: Impact on Student Learning (Pre/Post Assessment on a Unit of Instruction)
- Completers report on an Excel spreadsheet the pre/post scores of each P-12 student, along with unit goals and demographic categories such as gender, ethnicity, and identified learning needs.
- Learning gains are computed for each P-12 student as the percent-correct difference between the pre- and post-assessments.
- Completers disaggregate and analyze the data.
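As a small illustration of that learning-gain computation, the sketch below assumes the spreadsheet has been exported to CSV with hypothetical column names (pre_correct, post_correct, items, learning_needs); the EPP's actual template and demographic categories may differ.

```python
# Minimal sketch: compute per-student learning gains from an exported pre/post file.
# Column names (pre_correct, post_correct, items, learning_needs) are hypothetical
# placeholders, not the EPP's actual spreadsheet layout.
import pandas as pd

scores = pd.read_csv("unit_scores.csv")

# Percent correct on each assessment.
scores["pre_pct"] = 100 * scores["pre_correct"] / scores["items"]
scores["post_pct"] = 100 * scores["post_correct"] / scores["items"]

# Learning gain = difference in percent correct between post- and pre-assessment.
scores["gain"] = scores["post_pct"] - scores["pre_pct"]

# Disaggregate: mean gain and student count by one demographic category.
print(scores.groupby("learning_needs")["gain"].agg(["mean", "count"]))
```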
Standard 4: Indicators of Teaching Effectiveness (Classroom Observations)
- A sample of volunteer teachers who had completed the analysis of a unit of study were observed teaching in their classrooms by a clinical practice professor.
- A rubric was developed to align with the mid-term and final evaluation used in their student teaching.
- Observers recorded open-ended notes for each rubric element so teachers would not feel the observation was evaluative.
- Principals of those who volunteered to be observed were also interviewed.
Standard 4: Satisfaction of Employers (Employer Interviews)
- Interviews of principals of newly hired EPP graduates (from both the four-year undergraduate and the 10-month MAT programs).
- The Associate Dean conducts face-to-face interviews with principals and vice principals.
- Principals or vice principals complete a short survey.
- The principals interviewed had completers in their building who submitted the pre/post assessment and agreed to a classroom observation.
Standard 4: Satisfaction of Completers
Data analyzed from:
- State survey of new teachers
- New Teacher Support Group for recently hired candidates
- Educational Leadership Network
- Teacher Leadership Network
- Focus groups of completers
Standard 4: Satisfaction of Completers (Focus Group Interviews)
Focus group assessment (FGA) is:
- One of multiple measures that inform program effectiveness.
- A form of providing ongoing mentorship, coaching, and support to graduates in the field.
- A means of triangulating observed practice and employer viewpoints to inform program decisions.
Standard 4: Satisfaction of Completers (Focus Group Interviews)
FGA protocol development and process:
- Follows Patton's (2015) question typology.
- Linked to edTPA and InTASC standards.
- Ensured one-to-one alignment of questions with state and national standards for teacher evaluation.
- Utilized a purposive, convenience sampling process (Creswell & Poth, 2018).
Standard 4: Satisfaction of Completers (Focus Group Interviews)
FGA scoring and analysis:
- Discussions are recorded and transcribed.
- Observations are noted by moderators.
- Two-cycle coding method (Saldaña, 2016): descriptive, then in vivo.
- Results are triangulated with other measures for consistency across data sources.
- Outlier responses are identified, assessed, and retained for future iterations.
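To make the step from coding to triangulation concrete, here is a minimal sketch of tallying first-cycle descriptive codes across transcribed segments; the participant labels, code names, and excerpts are hypothetical stand-ins for what moderators would actually record.

```python
# Minimal sketch: tally first-cycle descriptive codes from transcribed focus-group
# segments. Participants, code labels, and excerpts are hypothetical examples.
from collections import Counter

coded_segments = [
    ("P5", "cultural_relevance", "knowing your students"),
    ("P2", "differentiation", "super hard and overwhelming"),
    ("P4", "formative_assessment", "in-class work time is so valuable"),
    ("P5", "formative_assessment", "all the different methods"),
]

# Code frequencies flag candidate themes to triangulate with observations,
# pre/post data, and employer interviews.
code_counts = Counter(code for _, code, _ in coded_segments)
for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```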
Standard 4: Satisfaction of Completers (Focus Group Interviews)
The questions:
- Derived from the student clinical assessment protocol.
- Linked to edTPA and InTASC standards.
- Chosen to supplement areas where other data sources (principal interviews, pre/post assessments, observations) suggested either strength or growth among graduates.
- Limited in number to honor participants' time.
Standard 4: Satisfaction of Completers (Focus Group Questions)
- How do you plan for culturally relevant lessons that align with the appropriate standards and draw from your students' experiences? (CPAST C Assessment; edTPA Task 1 Rubric 5)
- In your planning of lessons and units, how do you embed differentiation of instruction to meet the needs of all your students? (CPAST D Differentiation; edTPA Task 1 Rubrics 3, 4; Task 2 Rubrics 7, 11)
- When you are teaching, how do you use formative assessments to gauge the learning of your students and make changes to your pedagogy or assessment as a result? (CPAST G Formative Assessment; edTPA Task 2 Rubrics 8, 10)
Standard 4: Satisfaction of Completers (Focus Group Questions)
- Can you give examples of how you have used technology to engage students in learning and demonstrate their understanding of content or skills? (CPAST H Digital Tools; edTPA Task 1 Rubric 5; Task 2 Rubric 9)
- In what ways do you use data collected from your students to set short- or long-term goals for future instruction and/or assessment? (CPAST J Data-Guided Instruction; edTPA Task 2 Rubric 10; Task 3 Rubrics 11, 14, 15)
- Describe the varied types of assessment you utilize in determining student progress toward state or national standards. (CPAST L Assessment Techniques; edTPA Task 1 Rubric 5)
- Can you describe the way you determine the academic, physical, social, emotional, and cultural needs of students? (CPAST T Advocacy to Meet Needs of Learners or Teaching Profession; edTPA Task 1 Rubric 3; Task 3 Rubric 15)
Standard 4: Satisfaction of Completers (Focus Group Interviews)
Preliminary Findings: Cultural Relevance
- Importance of relationships emphasized (both with P-12 students and colleagues).
- Cohort model beneficial.
- Experience varied by program level.
P5: "The majority of classes at [the University] prepared me for that [cultural relevance]... we always connected back to the importance of knowing your students and making it a safe and comfortable environment for the kids."
P1: "I've done most of my learning, in this regard, on the job."
Standard 4: Satisfaction of Completers (Focus Group Interviews)
Preliminary Findings: Differentiation
- Content covered; specifics on implementation need emphasis.
- De-mystify the difficulty level.
P6: "I'm teaching multiple subjects, so if they can't read, it's going to be hard for them to take math tests... and making sure that there are accommodations so that they can still show what they are learning... I really learned that from lots of different classes at [the University]."
P2: "I do think that's an area [the University] could improve on, since I walked out of it thinking that differentiation was super hard and overwhelming."
Standard 4: Satisfaction of Completers (Focus Group Interviews)
Preliminary Findings: Formative Assessment
- Value of class time emphasized; multiple methods presented.
- Would benefit from learning more ways to adapt on the fly.
P4: "...[the University] did a great job of emphasizing that in-class work time is so valuable, because a kid will go home and do a math problem a million times wrong. So how I set my class apart from the other teachers at my school is that I try to teach for the first 20 minutes, and then I say, alright, try it out for 5 minutes. And if they do it right we move on to the next thing, and they can collaborate and work with their neighbors."
P5: "So I think it was great that [the University] prepared us with all the different methods that we can do, rather than just one or two."
Standard 4: Satisfaction of Completers (Focus Group Interviews)
Preliminary Findings: Technology
- Addressed in many courses; emphasis on use as a teaching method.
- Mismatch between instruction and the reality of technology in in-service classrooms.
P2: "I'm of the opinion that in the education field, sometimes the use of technology is over-emphasized. I think it's important, but you don't always need it. For example, I teach in a low-income school, so some of my students don't even have Wi-Fi at home."
P4: "I realize that the use of technology is going to be unique depending on which school you get, so if the program had been totally focused on a specific type of technology, you could end up at a school where they use something else and be totally lost. So, I think [the University] prepared me to the point that I did fine."
Standard 4: Satisfaction of Completers (Focus Group Interviews)
Preliminary Findings: Data-Based Instruction
- Why data matters is clear; focus on reflection to improve pedagogy.
- More needed on using standards-based assessment.
P3: "I'm constantly thinking, okay, that's not how I'm going to teach that next time, or I think I'm going to redo that this time; I think it went so poorly that I want to reteach that lesson. And there are times where I will literally rerun the same test, like, alright, we are going to come at this another way and try that test again to see if that [understanding] improves."
Standard 4: Satisfaction of Completers (Focus Group Interviews)
Preliminary Findings: Data-Based Instruction (continued)
P3: "The pre/post lesson plan thing that we did in the fall was really useful in developing ways to look at data and figure out what I need to teach and how I need to teach it."
P1: "I feel that something the program could have done better was, when we have gathered the data, how exactly we should be implementing these changes... because I feel that we were given a bunch of the pieces. I feel like I am really good at collecting the data and reflecting on where changes need to be made, but the strategies and implementation for the methods toolbox, I think that needed to be emphasized more."
Standard 4: Satisfaction of Completers (Focus Group Interviews)
Preliminary Findings: Learner Needs
- Cultural needs well addressed; more needed on specific special needs.
- Situation in some schools differs from what was presented.
P1: "That was something that I felt was lacking in the MAT program. I felt like it was mostly catered toward schools that were at least middle-class and generally had supports going on, parent-wise or in classroom technology, etc. I felt like there was a huge gap between the University and what was happening in the low-income schools."
P6: "I don't really know how [the University] could help with that [severe depression]... but there are so many problems that kids can have and come to you with, and you still have to help and love them... so maybe if you could have a class about that it would be great."
Next Steps
- Focus group data collection ongoing
- Triangulation with other Standard 4 data sources
- Preliminary report to the Program Assessment Committee
- Implications for curriculum/program review
- Recommendations to faculty and/or committees
Conclusions (Focus Group Data)
- Central topics addressed; strengths in relationships and reflection on practice
- Focus on implementation of technology and assessment methods desired
- Inconsistencies across instructors, program levels, and years
- Full-year placements valuable
- Special needs a major area of concern for graduates
Questions?
Hillary Merk: merk@up.edu
Randy Hetherington: hetherin@up.edu
Bruce Weitzel: weitzel@up.edu
James Carroll: carroll@up.edu
Jacqueline Waggoner: waggoner@up.edu
References
Creswell, J. W., & Poth, C. N. (2018). Qualitative inquiry & research design: Choosing among five approaches (4th ed.). Los Angeles, CA: Sage.
Patton, M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and practice (4th ed.). Thousand Oaks, CA: Sage.
Saldaña, J. (2016). The coding manual for qualitative researchers. Thousand Oaks, CA: Sage.