Integrated Care Evaluation: WPIC Model Components and Context-Informed Approach

Explore the evaluation of Whole Person Integrated Care (WPIC) through a comprehensive model focusing on key components and context-informed strategies. Learn about the three-tier model, evaluation design decisions, and how the model's theoretical content relates to the evaluation, as presented at the Partners Health Summit.

  • Integrated Care
  • WPIC Model
  • Evaluation
  • Context-Informed Approach
  • Healthcare Summit


Presentation Transcript


  1. PARTNERS HEALTH SUMMIT: WHOLE PERSON INTEGRATED CARE. The WPIC Evaluation. Dr. Gary Walby, Complex Systems Innovations

  2. WPIC Evaluation Targeted Components

  3. From the investigation of context came the core integrating concept of WPIC: wellness is a unifying concept within and between the HUBs and the three Tiers of WPIC. Wellness is a critical Tier 2 concept that bridges Tier 1 (focus on the individual/family via the Adapted Collaborative Care Model (ACCM)) and Tier 3, where wellness has already been shown to be significantly and positively associated with SDOH (social determinants of health).

  4. Context-Informed Evaluation. The evaluation cycle links three theory-informed phases (Context, Planning & Design, Implementation); each interface contributes distinct information:
  • Context/Planning & Design (C/P&D): laws, policies, relationships, organizational and community assets
  • Context/Implementation (C/I): emerging factors, organizational changes, unexpected events
  • Planning & Design/Implementation (P&D/I): fidelity, timing, feedback, evidence-based changes
  • All combined: improved evaluation clarity to support evaluand effectiveness

  5. What is Being Evaluated? What is the Context? WPIC Three-Tier Model Overview:
  • Tier 3: the larger community plus a Time Bank to address SDOH
  • Tier 2: a community forum/learning collaborative that links to and supports Tier 1 partners
  • Tier 1: health providers moving to best practices with customized support (HUBs, PCPs, etc.)

  6. Drilling Down: Evaluation Design Decisions
  • Evaluation questions
  • Evaluation models or schools
  • Evaluation type
  • Specific outcomes and outputs
  • Analytic plan
  • Data collection methods, measures, and tools

  7. Model Theoretical Content in Relation to Evaluation

  8. WPIC Evaluation Model Components

  9. Hierarchical and Embedded Evaluation Design. Multiple evaluation models and processes were selected to mirror, assess the value of, and assist in implementing the complex WPIC intervention model; this includes outcome and measurement selection.
  • MACRO CONTEXT (community and HUB level): evaluating system variables and relationships for their impact on implementation, fidelity, and outcomes. Approach: Developmental & Collective Impact Evaluation.
  • MESO CONTEXT (HUB and organization level): evaluating the embedding of WPIC in HUB and organization processes and linking it to outcomes. Approach: Formative & Implementation Evaluation; Outcome Evaluation.
  • MICRO CONTEXT (team, staff/peer, and person-served level): evaluating WPIC impact on multiple team, staff, and client outcomes. Approach: Impact Evaluation & Case Studies.

  10. Points Relevant to Context, Planning/Design, and Implementation
  • Develop an evaluation plan that accomplishes evaluation objectives efficiently
  • Gain agreement on the evaluation plan
  • Identify performance needs both for the object of the evaluation (program, organization) and for the evaluation itself
  • Cannot assume access, flawless data collection, buy-in by all parties, or that the context will not change
  • Finalize initial timelines and deliverables, but build in and communicate flexibility

  11. Evaluation Plan Generation
  • Defining evaluation component research questions
  • Defining outcomes by component
  • Selecting activities to track (processes)
  • Defining activities in relation to WPIC Tier
  • Selecting data collection tools
  • Planning component-consistent data collection methodologies
  • Finalizing an analytic plan
  • Reporting and disseminating

  12. Setting the Stage for Evaluation and WPIC Implementation Success. Forethought and setting the stage: assessing and improving the context for evaluation through two methods that improve the context for evaluation activities and link evaluation implementation to program outcomes:
  1. Evaluation component activities checklists (micro and meso context and process)
  2. Identifying performance needs and testable solutions (all Tiers and context levels)

  13. Examples: Evaluation Component Activities Checklists
  Outcome Evaluation:
  • Develop a shared measurement system across sites
  • Collaboratively define patient-level outputs and outcomes (clinical, wellness, quality of life, satisfaction)
  • Create the framework to include partner-specific (non-shared) outcomes to support sites, as well as the framework to add additional, emergent outcomes
  Developmental Evaluation:
  • Discuss training, expected outputs, and outcome links to peer navigators
  • Assist and assess training of staff in the combined model (Quadruple Aim, AIMS Collaborative Care Model, Collective Impact)
  • Poll collaborative partners for the best ways to embed, implement, and sustain short and frequent communication efforts
  Implementation Evaluation:
  • Establish a data collection monitoring protocol to assess timeliness and accuracy of data entry across sites (see the sketch after this list)
  • Finalize standard processes for data collection and persons responsible
  • Implement data collection and check-in processes to maintain evaluation focus and detect emerging issues within the Collective Impact Emergence Framework
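One hypothetical way the data-entry monitoring item above could be operationalized as code; the column names, required fields, and seven-day timeliness threshold are illustrative assumptions, not WPIC specifications:

```python
# Sketch of a per-site data-entry monitoring check (pandas assumed available).
# Column names ("site", "service_date", "entry_date", "client_id",
# "outcome_score") and the 7-day timeliness standard are assumptions.
import pandas as pd

TIMELINESS_DAYS = 7  # assumed standard: entries due within a week of service

def monitor_data_entry(records: pd.DataFrame) -> pd.DataFrame:
    """Summarize timeliness and completeness of data entry by site."""
    records = records.copy()
    # Days between service delivery and data entry.
    records["lag_days"] = (records["entry_date"] - records["service_date"]).dt.days
    # A record is "complete" if all required fields are populated.
    required = ["client_id", "service_date", "outcome_score"]
    records["complete"] = records[required].notna().all(axis=1)
    return records.groupby("site").agg(
        n_records=("client_id", "size"),
        pct_on_time=("lag_days", lambda s: (s <= TIMELINESS_DAYS).mean() * 100),
        pct_complete=("complete", lambda s: s.mean() * 100),
    )
```

Run quarterly, a summary like this would surface sites whose entry lags or missing fields threaten the shared measurement system before outcome analyses begin.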

  14. Identifying Performance Needs and Factors Relevant to Program & Evaluation Implementation and Outcomes. The goal is to identify appropriate outputs, outcomes, and performance needs that accurately encompass the evaluation goals, questions, and context of the project. Performance need = a state of existence or level of performance required for satisfactory functioning. For success to occur, there is a:
  • Need to do something
  • Need to be something
  • Need to be able to do something

  15. Performance Needs. Another way to think about a performance need is as an actual or potential problem:
  • Skill deficit in the program or the evaluation
  • Lack of knowledge or understanding
  • Lack of extrinsic (incentives) or intrinsic (interest) motivation
  • Lack of resources
  • Organizational culture counter to goals/outcomes
  Use of a temporary logic model can be helpful. Identifying MET and UNMET needs (barriers):
  • Met needs: maintain and sustain
  • Unmet needs: plan activities to increase the chance of success
  • Both: coordinate vigilance to detect unintended consequences

  16. Sample WPIC Evaluation Questions & Outcomes: Collective Impact
  1. Is the process of collective impact resulting in shared decision making?
  2. Are the connections between organizations in the Health Forum, the HUBs, and Partners sufficiently dense and adaptable to sustainably address social determinants?
  3. Are implementation strategies/activities of WPIC successful in promoting a collective effort in meeting implementation goals for Tiers 2 and 3?
  Sample outcomes and indicators by CI condition (a sketch of the endorsement-rate check follows):
  • Backbone Structure. Outcome: 80% of participants endorse that the Backbone Organization (Partners) has sufficiently guided the vision and strategy of WPIC. Indicator: Backbone (BB) staff effectively manage complex relationships. Measurement: online CI-specific survey; interviews.
  • Shared Measurement. Outcome: 80% of participants agree that shared measurement was collaboratively designed. Indicator: partners feel a collective accountability for results. Measurement: online CI-specific survey; interviews.
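A minimal sketch of how the 80% endorsement outcomes could be scored from the online CI survey; the 5-point Likert coding, the agree cutoff of 4, and the sample ratings are assumptions for illustration:

```python
# Sketch of the 80% endorsement check; Likert coding (agree = 4 or 5 on a
# 5-point scale) and the sample ratings below are illustrative assumptions.
from statistics import mean

ENDORSE_TARGET = 0.80  # the 80% threshold named in the outcome statements

def endorsement_rate(ratings: list[int], agree_cutoff: int = 4) -> float:
    """Share of respondents rating the item at or above the agree cutoff."""
    return mean(1 if r >= agree_cutoff else 0 for r in ratings)

backbone_item = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]  # made-up survey ratings
rate = endorsement_rate(backbone_item)
print(f"Backbone guidance endorsed by {rate:.0%} "
      f"({'meets' if rate >= ENDORSE_TARGET else 'below'} the 80% target)")
```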

  17. Sample of Collective Impact Activities
  • Backbone Function: The backbone group will meet regularly to discuss and reflect on progress and obstacles, planning interventions and support. Training in Collective Impact will be provided by the evaluation team to the core members above and others.
  • Common Agenda: The evaluation team supports dialogue for a common agenda using participatory evaluation activities. The evaluation team attends or reviews recordings of at least 80% of meetings, extracting information that supports or obstructs a common agenda, and reports to all participants transparently.
  • Shared Measurement System: The evaluation team will complete quarterly reports related to common measurement indicators and outcomes. The evaluation team will assist the Health Forums and Time Banks in selecting data elements that support rigorous analysis.
  • Continuous Communication: The evaluation team will extract information on indicators and report regularly to identified recipients. Data analysis will be used to determine the degree to which partners coordinate efforts, the frequency and clarity of communication, and how feedback is integrated.
  • Mutually Reinforcing Activities: Organizational evaluation techniques and tools (e.g., Collaborative Development Surveys) will be used to assess the development process and provide feedback.

  18. Sample WPIC Evaluation Questions: Developmental Evaluation
  1. What are the initial and evolving conditions in each HUB, Health Forum, Time Bank, the cross-HUB implementation team, and others as they emerge, that affect full and accurate implementation of WPIC?
  2. What are the benefits, issues, unintended results, and opinions of the Care Manager and PEER Mentor/Navigator in relation to implementation and perceived impact of the WPIC model?
  3. In what ways are HUBs altered by integrating WPIC (HUBs), and how are community partnerships developed that are based on WPIC (Health Forums)?

  19. Developmental Evaluation Interpretive Lens and Activity Generator. Interpretive frameworks: Emergence, Adaptation, Uncertainty.

  20. Sample Developmental Evaluation Methods/Activities
  • Rapid assessment via consistently used short online surveys and extraction of key information from all attended or recorded meetings
  • Ongoing environmental scanning and outcomes monitoring
  • Systems change mapping (looking at changes in resource dedication, organizational structure, staffing patterns, and similar)
  • Interviewing teams in each HUB via focus groups to assess their understanding of WPIC and the evaluation, their capacity to objectively interpret data and findings, and their openness to change based on evidence

  21. Outcome Evaluation. Four evaluation questions are linked to multi-level outcomes:
  1. What are the key facilitators, barriers, and solutions that influenced HUBs' adoption of WPIC?
  2. What are the WPIC practices, specifically of the WPIC implementation of the AIMS model, that influence or facilitate fidelity to the model, provider satisfaction, and clinical outcomes?
  3. What are the key factors for persons served that influence the adoption of self-management solutions into their lifestyles?
  4. Are resource gaps being addressed and, if so, is there a sufficient increase in resources to implement and maintain WPIC over time?

  22. Sample WPIC Evaluation Outcomes: Outcome Evaluation. Twenty-five outcomes are segmented into HUB, Consumer, Provider, and Community levels. A sample (a scoring sketch follows the list):
  • HUB: Rates of ER visits will decline by 15% compared to a similar time frame from a previous year. Data source/method: Partners service data.
  • Consumer: Person-served completion of services will improve by 20% compared to a similar time frame from a previous year. Data source/method: HUB service data.
  • Consumer: 80% of persons served will improve in clinical outcomes related to behavioral health indicators. Data source/method: Duke 8 Survey.
  • Provider: At least 80% of practitioners will indicate satisfaction with the WPIC service delivery model. Data source/method: evaluation survey; interviews.
  • Community: Time Bank data will endorse improvement in community participation from baseline for 70% of Time Bank WPIC members. Data source/method: project-specific online survey; Time Bank data.
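A minimal sketch of how the percent-change outcomes above might be scored against a prior-year comparison period; the visit counts are invented for illustration and the helper is not part of any WPIC toolset:

```python
# Illustrative scoring of a percent-change outcome against a prior-year
# baseline; the visit counts are made-up numbers, not WPIC data.

def percent_change(baseline: float, current: float) -> float:
    """Relative change from the comparison period, as a percentage."""
    return (current - baseline) / baseline * 100

# HUB-level target: ER visits decline by at least 15% versus the same
# period in a previous year, so the target is a change of -15% or lower.
er_baseline, er_current = 400, 330  # hypothetical visit counts
change = percent_change(er_baseline, er_current)
print(f"ER visits changed {change:+.1f}% (target met: {change <= -15.0})")
```

The same helper applies to the 20% completion-of-services target, with the comparison flipped to require a positive change of at least +20%.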

  23. Sample of Activities and Methods for Outcome Evaluation
  • A retrospective interview process to gather historical recollections and experiences of staff and practitioners in the HUBs before and after WPIC implementation.
  • The HUBs' current data will be collected to develop a database of shared items that help describe persons served, staff, and clinical outcomes, along with unique fields for assessing HUB-specific outcomes.
  • An online survey and follow-up interviews to describe care coordination practices and how practices are adapting to and engaging in WPIC. This will include measuring satisfaction with WPIC implementation, practice pathways, and the perceived link to outcomes. Surveys and interviews will be repeated.

  24. Case Studies. Case studies start by defining the case. For this project, a case is an individual family in which one or more members (preferably more) are being served via the WPIC system. A stratified system of deliberate sampling will be completed (e.g., by age, gender, presenting problem), with final variables TBD; a sampling sketch follows.
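A small sketch of what the stratified deliberate sampling could look like once the final variables are set; the strata keys mirror the slide's examples, while the record fields, per-stratum quota, and seed are assumptions:

```python
# Illustrative stratified selection of case-study families. The strata
# (age band, gender, presenting problem) mirror the slide's examples;
# the record fields and per-stratum quota are assumptions.
import random
from collections import defaultdict

def stratified_sample(families, strata_keys, per_stratum=2, seed=42):
    """Draw up to `per_stratum` families from each stratum."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    strata = defaultdict(list)
    for fam in families:
        strata[tuple(fam[k] for k in strata_keys)].append(fam)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

# Hypothetical usage once family records exist:
# families = [{"family_id": 1, "age_band": "18-34", "gender": "F",
#              "presenting_problem": "depression"}, ...]
# cases = stratified_sample(families, ["age_band", "gender", "presenting_problem"])
```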

  25. Sample Case Study Questions
  • What are the experiences of each case in the WPIC model?
  • What are the perceived benefits or changes for cases in response to exposure to WPIC?
  • What were the expected, planned outcomes due to being involved with WPIC? How well were they met?
  • What were unexpected outcomes or effects from involvement with WPIC?

  26. Implementation Evaluation. The design is WHAT you are going to do in the evaluation; implementation is HOW you are going to do it. Note: implementation evaluation focuses on how a program is or is not achieving its goals by looking at contextual and fidelity factors. It is useful for multi-site projects to determine differential effects.

  27. Implementing the Evaluation. Implementation monitoring of an evaluation is similar to strategic planning:
  • Defining the evaluation and the evaluation vision
  • GAP analysis to detect evaluation barriers
  • Crafting the evaluation implementation plan
  • Implementing the evaluation plan
  • Evaluating evaluation performance, then reviewing, adjusting, and correcting

  28. Tracking Program Implementation AND Evaluation Implementation: a Test, Re-Test, Modify cycle
  • Test: selection of methods; qualitative measures to track process
  • Re-Test: review methods if repeating in cycles to ensure fidelity and reduce drift
  • Modify: examine and modify methods after testing them, both for fidelity and for changes in needed information

  29. Next Steps: Implementation and Evaluation Teams' Combined Efforts to Map HUB Resources
  • Collaborations and relationships, with emphasis on similar initiatives, to avoid duplication and to establish mutual support
  • Data collected for HUB and HUB members/partners
  • Current measures being used (frequency, data storage, use)
  • Data systems (EHR, etc.)

  30. Next Steps
  • Detailed work flow of each HUB via document review and shadowing
  • HUB understanding of current draft outcomes and collaborating to finalize them
  • Define HUB-specific outcomes
  • Discussion of data-driven decision-making practices and needs: How will failures and successes be determined and measured? What is needed to attribute outcomes to services provided?

  31. Next Steps
  • Finalize the Adapted Collaborative Care Model (ACCM) process points for tracking
  • Complete work flow observations to determine process points that fit the ACCM: How are problems or issues identified and addressed in the work flow? How are problems or issues that relate to patient goals or outcomes identified and addressed?
  • Finalize the white paper and evaluation model based on recent meetings and observations

  32. Q & A. Questions, comments, sarcastic remarks? A question that comes to mind later? Email: gwalby@comsysinn.com Call: 727-858-3335
