Big Data, Education, and Society
Explore the concept of implementation fidelity in education: delivering interventions as intended, and the pitfalls and challenges that arise during implementation.
Big Data, Education, and Society November 12, 2021
Assignment 3 Any questions about assignment 3? Due Monday
Implementation Fidelity NIH definition: implementation fidelity is the degree to which an intervention is delivered as intended
Questions to ask during implementation (Feng et al., 2014, p. 564) Is this the quality of implementation we expected as creators of the intervention? What actions can we take that might bring implementation up to our desired levels?
The peril of implementation fidelity You design a brilliant innovation. You refine it in carefully-controlled settings: well-designed teacher professional development, buy-in from school administrators, monitoring of ongoing implementation. You then throw it out into the cold, cruel world, where it fails miserably (Khachatryan et al., 2014)
Failure to take medicine: 4 days of antibiotics instead of 7
Evidence of Poor Implementation Fidelity in Large-Scale Trials Carnegie Learning RAND study (Karam et al., 2017) ALEKS RAND Study (Phillips et al., 2020) Details on these studies a bit later
Does anyone have any personal examples of implementation fidelity failures to share?
Categories of implementation fidelity (Dane & Schneider, 1998) Adherence: are program components delivered as prescribed? Exposure/dosage: are program components delivered as much as intended (and in the right proportions)? Quality: are program components delivered in the theoretically ideal, intended fashion?
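One way to make these three categories concrete is to score each classroom on all three and combine them. A minimal Python sketch; the 0-1 scale, field names, and unweighted average are illustrative assumptions, not part of Dane & Schneider's framework:

```python
from dataclasses import dataclass

@dataclass
class FidelityScore:
    """One classroom's implementation fidelity, scored 0-1 per category
    (hypothetical scale; Dane & Schneider define the categories, not the metric)."""
    adherence: float   # were program components delivered as prescribed?
    exposure: float    # was the intended dosage delivered?
    quality: float     # were components delivered in the intended fashion?

    def overall(self) -> float:
        # Simple unweighted mean; a real evaluation would justify its weighting.
        return (self.adherence + self.exposure + self.quality) / 3

print(FidelityScore(adherence=0.9, exposure=0.5, quality=0.7).overall())
```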
What attributes influence implementation fidelity? (Carroll et al., 2007) Intervention complexity and prescriptiveness. Support by developer, school, and district. Participant responsiveness: how interested/willing are teachers, and how difficult is the intervention for them to adopt?
How can we improve odds of good implementation fidelity? Your thoughts?
Reasoning Mind approach (Khachatryan et al., 2014) Regional implementation coordinators (1 per 37.7 teachers). Look at data on student engagement and performance to identify problem spots. Visit classrooms periodically and conduct observations according to a detailed rubric (average = 6 visits/year). Meet with teachers after observations to provide professional development on classroom issues; this was rated as teachers' favorite part of adopting the curriculum.
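The coordinators' data-review step amounts to scanning per-class metrics and flagging problem spots for a visit. A hedged sketch of that kind of flagging rule; the column names, numbers, and thresholds are invented for illustration and are not Reasoning Mind's actual schema or criteria:

```python
import pandas as pd

# Hypothetical per-class aggregates from system logs (not Reasoning Mind's real schema).
classes = pd.DataFrame({
    "class_id":       ["A", "B", "C", "D"],
    "weekly_minutes": [95, 20, 60, 110],    # time on system per student per week
    "pct_correct":    [0.72, 0.65, 0.41, 0.80],
    "pct_on_task":    [0.85, 0.55, 0.70, 0.90],
})

# Assumed thresholds a coordinator might use to prioritize classroom visits.
flags = classes[
    (classes["weekly_minutes"] < 45)
    | (classes["pct_correct"] < 0.5)
    | (classes["pct_on_task"] < 0.6)
]
print(flags[["class_id"]])  # classes B and C would get an early visit
```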
Reasoning Mind approach (Khachatryan et al., 2014) Rubric: Does teacher use analytics reports to make instructional decisions? Does teacher plan lesson activities and student interventions before class? Does teacher conduct varied interventions with students in need? Proportion of class time students spend using system. Do students use all system features? Does teacher engage with students during class?
Reasoning Mind approach (Khachatryan et al., 2014) Rubric (continued): Does teacher use effective classroom management procedures? Does teacher establish clear goals and rewards for individual students and entire class? Do students have well-organized notebooks that show student work? Do students use recommended learning strategies? Are students on-task?
Reasoning Mind approach (Khachatryan et al., 2014) Curriculum modification: Use data from regional coordinator visits to identify areas where implementation fidelity is generally low, e.g. encouraging students to take notes and show written work, and checking student notes and written work. Modify automated curriculum to better scaffold these areas for both teachers and students. Modify teacher professional development to emphasize these areas (2 days before school year, six half-day workshops during school year).
Thoughts What do you like about this approach? What do you dislike about this approach?
Feng et al. (2014) ASSISTments approach 12-hour (2-3 day) best-practices workshop with teachers at beginning of year (counts as state professional development credit). Beginning-of-year interviews with principals. Beginning-of-year teacher survey. Analyze system log data during year. Classroom observations of teacher practices. End-of-year teacher interviews.
Feng et al. (2014) ASSISTments log data analysis How often do teachers assign homework in ASSISTments? What are homework completion rates? How long do students spend on homework? Which teachers are not opening homework reports?
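Each of these questions is a simple aggregation over assignment logs. A minimal sketch of what that analysis could look like; the table layout and column names are hypothetical, not the actual ASSISTments log schema:

```python
import pandas as pd

# Hypothetical assignment-level log: one row per (teacher, assignment).
logs = pd.DataFrame({
    "teacher_id":     [1, 1, 2, 2, 3],
    "assignment_id":  [10, 11, 12, 13, 14],
    "pct_completed":  [0.9, 0.8, 0.5, 0.6, 0.95],
    "median_minutes": [22, 18, 35, 30, 15],
    "report_opened":  [True, True, False, False, True],
})

per_teacher = logs.groupby("teacher_id").agg(
    assignments=("assignment_id", "count"),       # how often homework is assigned
    completion_rate=("pct_completed", "mean"),    # homework completion rates
    median_minutes=("median_minutes", "median"),  # time students spend on homework
    opened_any_report=("report_opened", "any"),   # who never opens homework reports
)
print(per_teacher[~per_teacher["opened_any_report"]])  # teachers to follow up with
```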
Feng et al. (2014) actions taken Visit teachers who did not use system as intended, with targeted plans for which behaviors to coach Change agenda of best practices workshop to match general issues Modify design of reports given to teachers
Thoughts What do you like about this approach? What do you dislike about this approach?
Carnegie Learning Implementation Fidelity Approach (Pane et al., 2014, p. 129) 3 days of professional development/training during summer. 1 visit from PD staff to a school during the year: PD staff observe classrooms, offer recommendations, and help teachers address any problems they are having with implementation. Teachers also receive a set of training materials, an implementation guide, and a book of resources and assessments.
Thoughts What do you like about this approach? What do you dislike about this approach?
Karam et al., 2017 Studied implementation fidelity of Cognitive Tutor in real-world use (self-report surveys). Only 45% of HS teachers reported using the software for the prescribed amount of time. Only 14% of HS teachers reported using the recommended non-software practices for the prescribed amount of time. Only 30% of HS teachers reported spending as much time during software use working with students as prescribed.
Bingham et al., 2018 Studied implementation fidelity of Cognitive Tutor in real-world use (teacher interviews). Technical problems play a large role: 25% of teachers had hardware problems; 35% of teachers had internet connectivity problems.
ALEKS Implementation Fidelity Approach (Phillips et al., 2020) Initial training on software features, including reports, and on how long and how often students should use the system. Monthly visits to schools from staff. Study right here in Philadelphia!
ALEKS Implementation Fidelity Success (Phillips et al., 2020) Only 60% of classes used both the system and traditional instruction in class as intended. Median class used the software for 12 hours over the entire year (expected usage = 60 hours). Study involved Algebra I, but the majority of teachers reported students were mostly not ready for Algebra I. Extensive student absenteeism. Many students were observed typing problems into web algebra solvers to get answers. Students were off-task more than 50% of the time in 93.5% of classes. Only 13% of teachers used data in the recommended fashion (to adapt their own instruction).
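The dosage shortfall is worth working through: 12 observed hours against 60 expected hours is roughly 20% of the intended dose. A tiny sketch of that arithmetic:

```python
expected_hours = 60   # intended usage over the year (Phillips et al., 2020)
observed_hours = 12   # median observed usage

dosage_fidelity = observed_hours / expected_hours
print(f"Dosage fidelity: {dosage_fidelity:.0%}")              # -> 20%
print(f"Shortfall: {expected_hours - observed_hours} hours")  # -> 48 hours
```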
Effectiveness of PD Correlation between more PD and prescribed pedagogical practices in Reasoning Mind (Miller et al., 2015). No correlation between more PD and prescribed pedagogical practices in Cognitive Tutor (Karam et al., 2017). May be due to different quality of PD.
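The claim here is about whether the amount of PD correlates with use of the prescribed pedagogical practices. A hedged sketch of that check with made-up numbers, purely to show the computation (not data from either study):

```python
import numpy as np

# Hypothetical data: hours of PD received and a 0-1 "prescribed practices" score per teacher.
pd_hours = np.array([4, 8, 12, 16, 20, 24])
practice_score = np.array([0.40, 0.45, 0.60, 0.55, 0.75, 0.80])

r = np.corrcoef(pd_hours, practice_score)[0, 1]
print(f"Pearson r = {r:.2f}")  # a positive r would mirror the Reasoning Mind finding
```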
Effectiveness of PD "I constantly change the structure of my class. I've probably changed things five times this year because what I was doing wasn't working. So, I changed it. And then, that wasn't working, so I changed it again. So, I just feel like I'm constantly adjusting because [Cognitive Tutor] has never been clearly explained, in terms of what is expected or how a classroom should run in regards to that. So, I've kind of just made it up as I've gone along." (Bingham et al., 2018; same system as Karam) "The ideas and the implementation is what's lacking, I think. I don't feel like I know what I'm doing. I need to see things modeled and I need to know what it is. I need to be able to touch it. Show me a model, model for me." (Bingham et al.)
Effectiveness of PD "[The professional development] is slow, it's boring. It doesn't give me any new ideas. It doesn't challenge me to think in a different way. And then they want us to use [a professional learning resource] with videos from . . . the early 80s. It's so outdated and really poorly put together." (Bingham et al.) "The PD is not really geared towards what people want. [PD] is more what [administrators] want to give out to everybody else [teachers in more traditional school models]. It's not about learning specific issues with technology. I would prefer if it were in-house [at the school and school-specific]." (Bingham et al.)
How Does Adoption Occur? Makes a lot of difference to how good implementation fidelity will be. State-level curriculum approval processes. Top-down decisions based on sales. Curriculum reviews by groups of teachers. Individual teachers' decision-making. Individual learners' decision-making.
Activity You are the implementation director for Bob's Discount Math Curriculum (BDMC). 1000 teachers use your system. Each teacher's classes produce $3000 of profit after all expenses except implementation/PD/support.
Costs "trouble-spotting data analysis" 1 day 350 specific teacher data analysis 0.5 days 150 specific teacher classroom observation 1 day 500 single-teacher PD 0.5 days 250 full-day multi-teacher PD 2 days 1000 half-day multi-teacher PD 0.5 days 250 multi-teacher PD prep 3 days 900 books and materials 100
Create a plan for implementation In groups of 3-4 What forms of support will you use? How much profit is left over after support?
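As one worked example for the activity, here is a sketch of the profit arithmetic under an example support plan. The slide does not say whether costs are per teacher or per session, so this sketch assumes single-teacher items are priced per affected teacher and multi-teacher items per session; the plan itself (100 flagged teachers, 20 group sessions) is also just an assumption:

```python
N_TEACHERS = 1000
PROFIT_PER_TEACHER = 3000  # before implementation/PD/support costs

# Costs from the slide (assumed: per teacher for single-teacher items, per session otherwise).
COSTS = {
    "trouble_spotting_analysis": 350,   # run once over all data
    "teacher_data_analysis": 150,       # per flagged teacher
    "classroom_observation": 500,       # per flagged teacher
    "single_teacher_pd": 250,           # per flagged teacher
    "full_day_multi_teacher_pd": 1000,  # per session (unused in this example plan)
    "half_day_multi_teacher_pd": 250,   # per session
    "multi_teacher_pd_prep": 900,       # one-time prep
    "books_and_materials": 100,         # per teacher
}

# Example plan: materials for everyone, one round of trouble-spotting,
# targeted follow-up for 100 flagged teachers, and 20 half-day group PD sessions.
flagged = 100
sessions = 20
support_cost = (
    COSTS["books_and_materials"] * N_TEACHERS
    + COSTS["trouble_spotting_analysis"]
    + flagged * (COSTS["teacher_data_analysis"]
                 + COSTS["classroom_observation"]
                 + COSTS["single_teacher_pd"])
    + COSTS["multi_teacher_pd_prep"]
    + sessions * COSTS["half_day_multi_teacher_pd"]
)

profit = N_TEACHERS * PROFIT_PER_TEACHER - support_cost
print(f"Support cost: ${support_cost:,}")  # $196,250 under these assumptions
print(f"Profit left:  ${profit:,}")        # $2,803,750
```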
Each group Type your final profit into the chat window
Highest-profit group and lowest-profit group Please read out what implementation support you offered, and why you chose this combination
Should this calculation change between a non-profit like Reasoning Mind and a for-profit like Carnegie Learning? What are the risks of too low a profit for each type of organization?