Monitoring and Evaluation of Facility-Based HIV Self-Testing


Explore the essentials of monitoring and evaluation in facility-based HIV self-testing, including key indicators, data management processes, and the importance of M&E for program improvement and accountability.

  • HIV self-testing
  • Monitoring
  • Evaluation
  • Data management
  • Program quality


Presentation Transcript


1. MODULE 7: MONITORING AND EVALUATION OF FACILITY-BASED HIV SELF-TESTING

2. Learning Objectives

By the end of this module, participants will be able to:
• Understand the basics of monitoring and evaluation (M&E) in facility-based HIV self-testing (HIVST)
• Identify and apply key indicators for facility-based HIVST
• Use data to enhance program quality and identify areas for improvement

3. Introduction to Monitoring & Evaluation

Why does M&E matter?
• Monitoring indicators provides insight into program reach, impact, and progress toward objectives.
• Identifying program gaps: detects gaps or challenges in service delivery.
• Data-driven adjustments: uses data to enhance outcomes, improving overall program quality and effectiveness.
• M&E is part of quality improvement interventions.
• M&E demonstrates accountability and good governance.

Overall, the goal of M&E is to improve current and future management of the program's inputs, outputs, outcomes, and impact.

4. Understanding the Existing M&E System

Facilities use different M&E systems depending on facility size, available resources, and operational needs.
• Paper-based system: registers for HIV testing, care, and prevention (e.g., PrEP, PEP), recording client data, test outcomes, and linkage.
  Pros: simple setup, minimal reliance on technology.
  Cons: labor-intensive data entry, limited adaptability to changing indicators, and challenges in data completeness.
• Electronic system: supports efficient data collection, cleaning, analysis, and real-time reporting.
  Pros: quicker data access, easier adaptability to new indicators.
  Cons: requires stable electricity, IT support, dedicated staff time, and upfront investment.
• Hybrid approach: combines electronic and paper systems for flexible data management where full digital integration isn't feasible.
  Pros: balances digital efficiency with the simplicity of paper.
  Cons: complexity in synchronizing data and managing dual systems.

Comprehensive monitoring of facility-based HIVST may be limited by data availability. Be realistic and prioritize tracking key indicators as a minimum requirement.

5. Data Management Process

The systematic approach of collecting, collating, analysing, reporting, and ensuring the quality of data to support program decisions and improvements.
• Data sources: the origin of data, categorized as primary (e.g., HTS/HIVST register), secondary (e.g., site spreadsheets), or tertiary (e.g., national census data).
• Data collection and collation:
  Collection: gathering data from primary sources, both paper and electronic.
  Collation: aggregating data into standardized formats, done manually or electronically.
• Data analysis: reviewing data to derive insights; enables identifying trends, predicting relationships, and confirming data accuracy.
• Data reporting and quality:
  Reporting: presenting data as knowledge for stakeholders to track progress, address challenges, and share successes.
  Data quality: ensuring reliable data to support decision-making at all levels of the health program.
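The collation step above can be sketched in code: individual-level register rows are aggregated into a standardized monthly summary. This is a minimal illustration only; the register field names and values here are hypothetical, not a reporting standard.

```python
from collections import Counter

# Hypothetical register rows: each dict is one client entry from an
# HTS/HIVST register (field names are illustrative assumptions).
register = [
    {"month": "2024-01", "kit_issued": True, "kit_used": True,  "result": "non-reactive"},
    {"month": "2024-01", "kit_issued": True, "kit_used": True,  "result": "reactive"},
    {"month": "2024-01", "kit_issued": True, "kit_used": False, "result": None},
]

def collate_monthly(rows, month):
    """Aggregate individual-level register rows into a monthly summary."""
    month_rows = [r for r in rows if r["month"] == month]
    results = Counter(r["result"] for r in month_rows if r["kit_used"])
    return {
        "kits_issued": sum(r["kit_issued"] for r in month_rows),
        "kits_used": sum(r["kit_used"] for r in month_rows),
        "reactive": results["reactive"],
        "non_reactive": results["non-reactive"],
    }

summary = collate_monthly(register, "2024-01")
```

The same aggregation could equally be done in a spreadsheet or an electronic system; the point is that collation turns raw register entries into the standardized counts the summary forms require.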

6. M&E Indicators Related to HIVST

Distribution:
• Number of individual HIVST kits distributed (programme data) (required)
• Number of sites distributing HIVST kits (programme data)
• Percentage of the population aware of HIVST (survey)
• Percentage of the population willing to self-test if available (survey)

Use and results:
• Number of HIVST tests used and the percentage of HIVST-positive results observed and self-reported (programme data)
• Percentage of first-time testers among people who received HIVST (programme data)
• % of the population who has ever self-tested (survey)
• % of the population who has ever self-tested and reported a positive self-test result (survey)
• % of those tested in the last 12 months reporting a self-test as their last test (survey)

Linkage:
• Number and percentage of people diagnosed with HIV following HIVST (programme data)
• Percentage of new ART initiations among people diagnosed with HIV who report prior self-testing in the past 12 months (programme data)
• Proportion of people who test positive for HIV using an HIVST who are enrolled in ART services (survey)
• Percentage of PrEP initiations among people who report prior self-testing in the past 12 months (survey)

7. HIV Self-Testing Cascade M&E Challenges

M&E challenges along the cascade:
• Privacy of the test and autonomy of users
• Reliance on self-report to measure the outcome of the self-test and uptake of linkage to treatment/prevention services
• Cost and feasibility of self-test user follow-up
• Measuring secondary distribution

[Figure: HIV self-testing cascade bar chart with stages HIVST distribution → HIVST utilisation → HIVST result (reactive) → confirmative test (positive) → HIV treatment uptake]

CQUIN dHTS Meeting | July 9-12, 2024

8. HIVST M&E General Principles

• Use multiple data sources and information (including triangulation).
• Data collection should not be intrusive or burdensome; protect confidentiality and privacy.
• Consider the human and financial cost of active monitoring.

9. Key Data Sources for HIVST M&E

• Routine HIVST monitoring: HIV testing service register, HIVST order forms, sale registers. Measures HIVST kits distributed, people receiving HIVST, and coverage of the HIVST programme.
• Self-reported data on HIVST: self-administered forms, client feedback, hotline follow-up calls. Measures the group using HIVST, the positivity rate of HIVST, and people reporting use of services after HIVST.
• Data on use of HIVST from other service data: ART/PrEP service registers, HTS register, health statistics. Measures people reporting HIVST use prior to a confirmatory HIV test, ART, PrEP, etc., and the number and percentage accessing a confirmatory test, ART, PrEP, etc.
• Special surveys, population size data, client/patient-based surveys: measure target groups for HIVST, group size, coverage of HIVST, people reporting use of HIVST results, and reported result positivity.

10. Routine Data Sources for HIVST M&E

Distribution models:
• Community- and facility-based models
• Internet-based distribution models: online, mail order, and pick-up
• Retail store, pharmacy-based, and vending machine distribution
• On-site self-testing at the facility
• Primary distribution of HIVST kits for later use
• Secondary distribution: index testing, network-based

Routine data sources by cascade step:
• HIVST distribution: individual HIV testing register; self-administered forms; self-assessment
• HIVST use/results: individual HIV testing register; individual self-administered tools (cards, online forms, etc.); site-level follow-up of self-test use by a partner (secondary/network-based)
• HIVST linkage: site-level tracking of previous HIVST use; indirect data on ART/PrEP initiation

11. Routine HIVST Monitoring Tools: Measuring Distribution

Individual-level data (information on people receiving HIVST, the HIVST approach, and use by self or secondary distribution):
• Provider- or self-administered assessment form
• Individual HIV testing register (provider administered)
• Add-on to existing service registers
• Individual HIVST form (client self-administered)
• Client referral card
• HIVST online order and assessment form

Site-level data (information on the HIVST approach and use by self or secondary distribution):
• Event register or site-level commodity register
• Online or retail sale register

12. Routine HIVST Monitoring Tools: Measuring Linkage

• Notifications and referrals: referral cards to link to services; automated SMS and messages; interactive voice response systems
• Self-administered reporting: mobile apps, messengers, chatbots, web apps, and online feedback collection forms
• Individual-level follow-up: provider-administered individual follow-up forms; peer referral and navigation
• Clinic registers: HTS registers, ART registers, PrEP registers, etc.

These tools measure the number of people whose self-reported positive results were confirmed after HIVST, the proportion of people using prevention, testing, and care services prompted by HIVST, and the number of people self-reporting linkage to prevention and treatment services after HIVST.

13. Measuring HIVST Linkage in Care and Prevention

Clinic registers:
• Include questions on prior use of self-testing in existing clinic registers.
• All referral health service points (ART, PrEP, and VMMC services) can adapt data collection to capture prior HIVST use (e.g., HIVST referral cards, clinic registers).
• Data can provide useful information on the proportion of all ART/PrEP initiations prompted by HIVST.

Caveats:
• Data may be subject to recall bias, and some people may not disclose prior HIVST use and/or results.
• Data do not provide a denominator to measure linkage following HIVST.

Cascade stages captured: prior HIVST use → HIVST result → confirmative test result → post-test services.

14. Some Considerations for Data Triangulation and Impact Assessment

• Testing and outcome indicators can be leveraged to routinely derive key programmatic indicators; dashboards or visualizations can be used by countries.
• Reporting HIVST volumes is key to assessing changes in testing positivity: ignoring HIVST volumes may lead to misinterpretation of changes in testing positivity, especially in countries using HIVST as a screening tool. HIVST volumes also need to be included in epidemiological models such as EPP/Spectrum, Shiny90, and NAOMI.
• Estimating the impact of HIVST in terms of new HIV diagnoses and ART initiations requires incorporating other data sources and modelling tools. Population-based HIV surveys and UNAIDS estimates can be used to analyze testing and outcome indicators jointly with demographic and burden estimates. Bayesian hierarchical models can be used to incorporate uncertainty at the various levels of the modelling.
• WHO is working on improving methods, working with countries on routine impact analysis, and developing tools to use routine programme data, together with efforts to align epidemiological modelling, academic modelling, and other project- and study-specific data analysis.

Courtesy Cheryl Johnson, WHO

15. Indicators for Facility-Based HIVST (1)

Distribution and HIV self-testing at the facility:
• No. of clients eligible for HIV testing (e.g., in ANC, TB, OPD) (can serve as the overall denominator)
• No./% of clients issued facility-based HIVST: reach of self-testing in the facility
• No./% of clients using HIVST kits: actual uptake and coverage among clients
• No./% of clients with a reactive HIVST result

Linkage rates (for HIVST positive):
• No./% of clients with reactive results who proceed to confirmatory testing
• Confirmatory test results: outcomes of confirmatory testing (reactive/non-reactive/invalid)
• No./% initiated on ART

Linkage rates (for HIVST negative):
• Percentage of clients with negative results who proceed to key HIV prevention services (PrEP, VMMC)
• Frequency of repeat testing: patterns of retesting

All indicators can be further disaggregated by sociodemographic factors (e.g., age, sex), entry points, and population characteristics (e.g., MSM, sex workers, PWID). Data may be compiled and reported on a monthly and/or quarterly basis. Complete data may not always be available, so the cascade might begin at a later denominator rather than the first possible one. Additional indicators further along the care cascade are typically monitored by other program components, such as those for PrEP, PEP, family planning, and ART care.
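The percentage indicators above are simple numerator/denominator calculations, where each cascade step's denominator is the preceding step. A minimal sketch, with illustrative counts only (the figures are hypothetical, not program data):

```python
def pct(numerator, denominator):
    """Percentage helper; returns None when the denominator is zero."""
    return round(100 * numerator / denominator, 1) if denominator else None

# Hypothetical monthly counts for one facility (illustrative values only).
counts = {
    "eligible": 1000,     # clients eligible for HIV testing (overall denominator)
    "issued": 800,        # clients issued facility-based HIVST kits
    "used": 700,          # clients who used their kit
    "reactive": 120,      # reactive HIVST results
    "confirmatory": 100,  # reactive clients who took a confirmatory test
    "on_art": 88,         # confirmed-positive clients who initiated ART
}

# Each indicator uses the preceding cascade step as its denominator.
indicators = {
    "issued_of_eligible": pct(counts["issued"], counts["eligible"]),
    "used_of_issued": pct(counts["used"], counts["issued"]),
    "reactive_of_used": pct(counts["reactive"], counts["used"]),
    "confirmatory_of_reactive": pct(counts["confirmatory"], counts["reactive"]),
    "art_of_confirmatory": pct(counts["on_art"], counts["confirmatory"]),
}
```

Guarding against a zero denominator matters in practice: small facilities or short reporting periods can legitimately have zero clients at a given step, and the indicator should be reported as missing rather than failing or showing 0%.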

16. Indicators for Facility-Based HIVST (2)

Common to both partner services/family/household testing and social network testing:
• No. of clients offered kits for secondary distribution
• No./% of clients agreeing to secondary distribution
• No. of kits distributed

Partner services, family/household testing:
• No. of contacts listed and eligible for testing
• % of eligible contacts successfully contacted
• No./% of eligible contacts tested
• % of eligible contacts tested HIV positive
• % of eligible contacts linked to HIV care (confirmatory testing, ART initiation)
• No./% of eligible HIV-negative partners linked to HIV prevention (PrEP, VMMC)

Social network testing:
• No./% of contacts performing HIVST
• No./% of contacts with a test result
• No. of positive contacts linked for confirmatory testing and treatment
• No. of negative contacts linked to HIV preventive services (PrEP/PEP, VMMC)

17. Additional Operational Indicators

• HIVST kit stock levels: monitors kit availability to prevent shortages
• Number of days with kit stock-outs
• Facility coverage: tracks the number and proportion of service points offering HIVST
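Stock-out days can be derived directly from a daily stock log. A minimal sketch, assuming a hypothetical daily log and an illustrative reorder buffer (neither is a program standard):

```python
# Hypothetical kits-on-hand at the close of each day for one facility.
daily_stock = [40, 25, 12, 0, 0, 30, 18]

# Days with no kits available at all (the stock-out indicator above).
stockout_days = sum(1 for s in daily_stock if s == 0)

# An early-warning variant: days below a hypothetical reorder buffer,
# which lets supervisors act before a full stock-out occurs.
reorder_threshold = 15
days_below_buffer = sum(1 for s in daily_stock if s < reorder_threshold)
```

Tracking the buffer indicator alongside outright stock-outs is a design choice: by the time the stock-out count rises, clients have already been turned away, whereas the buffer count can trigger a reorder in time.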

18. Methods of Data Collection

Avoid duplicate or parallel data collection: streamline processes by using existing systems where possible.

Integrate into existing tools and reporting:
• HTS/HIVST testing register: used for comprehensive tracking and monthly summaries
• Monthly summary forms: capture key indicators and outcomes
• Testing cascade reporting: helps identify bottlenecks and areas for quality improvement
• Feedback forms: simple surveys to gather client feedback on HIVST
• Periodic check-ins: schedule follow-up contacts with clients who received or distributed kits to assess their experience and linkage to care
• Qualitative interview data: collect and analyse simple interview data for an in-depth understanding of perceived and experienced service quality, the testing process, linkages, and recommendations for improvement

19. What is Cascade Analysis in Program Monitoring?

What is cascade analysis?
• Tracks the client journey across key stages, from testing to treatment and retention
• Identifies drop-off points where clients may disengage from the care process

Purpose:
• Helps visualize challenges along the cascade and ask the right questions
• Highlights areas needing improvement to ensure clients receive timely and continuous care
• Supports targeted interventions to enhance service retention and outcomes

20. How to Identify Bottlenecks and Areas for Improvement? Cascade Analysis (1)

[Figure: bar chart of three cascades, reconstructed from the slide values]
• HIVST testing cascade: 1000 eligible clients → 800 issued HIVST kits (80%) → 700 used their kits (88%); of kits used, 120 were reactive (17%*) and 12 invalid (2%*)
• HIVST prevention cascade: 568 non-reactive results (81%*) → 220 linked to prevention services (39%)
• HIVST positive care cascade: 120 reactive results (17%*) → 100 confirmatory tests (83%) → 95 confirmed positive (95%) → 88 initiated on ART (93%)

* The denominator is the number of clients who used HIVST kits.

21. How to Identify Bottlenecks and Areas for Improvement? Cascade Analysis (2)

Questions prompted by the cascade:
• Why does not everybody undergo HIVST (already on ART, know their status, decline HIVST, ...)?
• 17% of clients have a reactive HIVST result: why so high?
• Why is not everybody linked to confirmatory testing?
• Why are there invalid results?
• Why are some clients not starting ART?
• Why is there a huge drop between testing negative and linkage to prevention?
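Finding the weakest step in a cascade is a mechanical calculation once the counts are in hand: compute the retention from each stage to the next and flag the lowest. A minimal sketch using the positive care cascade figures from the example above (the stage names are illustrative labels):

```python
# Positive care cascade from the example: each tuple is (stage, clients).
cascade = [
    ("reactive HIVST", 120),
    ("confirmatory test", 100),
    ("confirmed positive", 95),
    ("initiated ART", 88),
]

def conversion_rates(stages):
    """Return (from_stage, to_stage, percent retained) for each step."""
    return [
        (a, b, round(100 * nb / na, 1))
        for (a, na), (b, nb) in zip(stages, stages[1:])
    ]

rates = conversion_rates(cascade)
# The step with the lowest retention is the first candidate bottleneck.
weakest = min(rates, key=lambda r: r[2])
```

Here the largest drop is between a reactive self-test and confirmatory testing, which is exactly the kind of finding that should prompt the "why is not everybody linked to confirmatory testing?" question above.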

22. Data Feedback Loops for Program Monitoring

Use data and data feedback loops for monitoring and quality improvement at the facility level and higher levels (e.g., program, national).

The cycle: COLLECT → PROCESS → REPORT → USE
• COLLECT: monthly statistics form based on a minimum indicator set, standard definitions, and defined data sources and tools
• PROCESS: data quality checks; data analysis (indicators)
• REPORT: narrative, tables, graphs
• USE: interpret information; comparisons and trends; decisions based on information

Feedback loop process:
• Data collection: gather data on key indicators (e.g., linkage rates, client satisfaction)
• Analysis and review: regularly review data to identify patterns and trends (e.g., monthly, quarterly, annually)
• Feedback to staff: share findings with staff to highlight successes and areas needing improvement
• Action and adjustments: implement improvement plans based on feedback to address service gaps or challenges

23. Using Data for Program Improvements

Examples of improvements informed by data:
• Identifying service gaps: highlight issues in linkage rates, client satisfaction, or test availability for targeted improvements
• Adjusting service models: adapt HIVST approaches based on identified needs, such as adding peer support or integrating services
• Enhancing client experience: leverage client feedback to improve delivery, reduce stigma, and create a more client-centered program

Other important uses of data:
• Data-driven advocacy (e.g., addressing service gaps)
• Policy development

24. Addressing Challenges in M&E and Using Data for Program Improvements

Key challenges:
• Data issues: gaps, inconsistencies, implausible values, and duplication
• Delays in reporting
• Staff capacity: limited resources for effective supervision and feedback
• Client privacy concerns: ensuring confidentiality while collecting feedback

Solutions:
• Regular training: build staff skills in data collection and quality assurance
• Simplified tools: adopt user-friendly tools to streamline data collection
• Secure reporting: protect client privacy with secure data-handling methods
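The data issues listed above (duplicates, inconsistencies, implausible values) lend themselves to simple automated checks during data quality review. A minimal sketch over hypothetical monthly summary rows; the field names and rules are illustrative assumptions, not a reporting standard:

```python
# Hypothetical monthly summary rows, one per site per month.
rows = [
    {"site": "A", "month": "2024-01", "issued": 50, "used": 42},
    {"site": "A", "month": "2024-01", "issued": 50, "used": 42},  # duplicate
    {"site": "B", "month": "2024-01", "issued": 30, "used": 35},  # used > issued
    {"site": "C", "month": "2024-01", "issued": -5, "used": 0},   # implausible
]

def quality_flags(rows):
    """Flag duplicates, internal inconsistencies, and implausible values."""
    flags = []
    seen = set()
    for i, r in enumerate(rows):
        key = (r["site"], r["month"])
        if key in seen:
            flags.append((i, "duplicate site/month entry"))
        seen.add(key)
        if r["used"] > r["issued"]:
            flags.append((i, "kits used exceeds kits issued"))
        if r["issued"] < 0 or r["used"] < 0:
            flags.append((i, "negative count"))
    return flags

flags = quality_flags(rows)
```

Checks like these catch entry errors before aggregation, which is far cheaper than tracing an implausible indicator back through reports months later.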

25. Questions

• Why is monitoring and evaluation (M&E) essential in facility-based HIVST programs?
• What are two key benefits of M&E for HIVST programs?
• How does M&E contribute to quality improvement in HIVST programs?
• What are the main proposed indicators for monitoring facility-based HIVST?
• Name two indicators for direct distribution monitoring.
• How is linkage to care measured in facility-based HIVST?
• How does cascade analysis help in monitoring HIVST programs?
• What does cascade analysis track in an HIVST program?
• Why is identifying drop-off points important in cascade analysis?
• What are the key steps in data management for facility-based HIVST programs?
• What are the main stages of the data management process?
• Why is data quality important in the data management process?
• How can feedback loops improve facility-based HIVST programs?
• Describe one key component of a data feedback loop.
• How can feedback loops drive program adjustments?

26. Exercise
