CPI Committee Meeting and RADs Team Agenda
The CPI Committee meeting scheduled for Thursday, October 10 aims to introduce the new committee and RADs team, review the CPI Day presentation, assess the survey results, and plan next steps for continuous program improvement. The RADs team comprises the key members who oversee research, assessment, and decision support. The meeting agenda includes discussing the CPI reports, gathering feedback, and outlining new initiatives. Objectives focus on aligning with MSCHE expectations for educational effectiveness and enhancing institutional improvement practices.
Presentation Transcript
CPI Committee Meeting. Thursday, October 10, 12:45-2:10 p.m., DL-2/706
CPI Committee Meeting Agenda
1. Introduction of the new CPI committee and RADs team.
2. Charge of the CPI Committee.
3. Brief review of the CPI Day presentation (if necessary), and survey of the CPI Day results.
4. Review of the CPI reports and previous CPI members' feedback.
5. Next steps for CPI: a) follow up with the CPI groups that have already submitted reports; b) identify new CPI initiatives by RADs and the corresponding programs/units/deans.
RADs TEAM: Office of Research, Assessment and Decision Support
Senior Director, RADS: Mike Lane
Director, Institutional Research: Michael Urmeneta
Director, Institutional Effectiveness and Improvement: Shifang Li
Director, Program Intelligence: Mohammed Moizuddin
CPI Committee Charge:
- Ensure that the Continuous Program Improvement (CPI) practice is consistent with MSCHE's expectations for continuous improvement of educational effectiveness.
- Periodically evaluate the Continuous Program Improvement (CPI) process and make recommendations for improvements.
3. The Survey Results. CPI Day Objectives:
1) Report on the Middle States (MSCHE) visiting team's recommendations related to CPI.
2) Clarify the CPI purpose: advance NYIT's mission and priorities, and meet MSCHE's expectation of continuous improvement of institutional and academic effectiveness.
3) Clarify the CPI process and identify examples of how CPI has been used to advance mission-based metrics.
4) Differentiate the CPI process from more traditional assessment activities.
3. The survey results: responses by location

Where                                   %        Count
Old Westbury - Rockefeller Auditorium   51.28%   20
Old Westbury - Harry J. Schure          20.51%   8
New York City                           20.51%   8
Vancouver                               2.56%    1
Jonesboro                               5.13%    2
Total                                   100%     39
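For reference, each percentage above is simply the location's count divided by the 39 total responses, for example:

20 / 39 = 0.5128, i.e., 51.28% (Old Westbury - Rockefeller Auditorium)
1 / 39 = 0.0256, i.e., 2.56% (Vancouver)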
3. Survey results: Q1. Was the objective of the morning session met? Comments?

Counts   Response & Comments
29/33    Yes. "It was probably the most comprehensive presentation regarding the CPI objectives that I have seen." "The agenda was tight and focused."
2/33     Not sure. Tech issue: "disjointed"; did not see any presentation.
1/33     No. Not helping the people in the CPI process.
1/33     No. "The keynote subject matter was trite; maybe focus on successful initiatives from other universities."
3. The Survey Results: Q2. Of those 4 topics, which areas would you like to hear more about? 25 responses, among those 25:
1. Middle States (MSCHE) visiting team's recommendations related to CPI (2/25)
2. Clarify the CPI purpose (10/25)
3. Clarify the CPI process and identify examples (10/25)
4. Differentiate the CPI process from more traditional assessment activities (9/25)
3. The Survey Results: Q3. In the future, what topics would you like to hear and share on CPI Days?
- More examples.
- Benchmarking of New York Tech against peer institutions.
- Any department improvement (effort) that has an impact.
- Hearing ideas of how others applied CPI to effective results, especially if the application could be duplicated in other areas. It's easier to adapt the successful processes of others who have tested and refined them than to start from scratch.
- Hear more about activities that departments do to meet department goals and objectives.
3. The Survey Results: Q4. Any other general suggestions about the event? 23 responses, of those 23:
- (9/23) Fix the video conferencing technology! Get it right and tested.
- (1/23) Global campus: need support from administration to implement substantial changes to improve the quality of the program.
- (1/23) Successful initiatives from other universities.
- (1/23) Help with the current CPI. More registrar and enrollment service center improvement.
- (7/23) None. Informative and presented very well.
4. Review of CPI Report Phase II: Academic Programs
- MS Energy Management (VA)
- MS Energy Management (NY)
- BS Information Technology
- MS Envir Tech Sust (NYC, LI, online)
- Architectural Technology
- Architecture
- Urban and Regional Design
- Interior Design
- Health Sciences & Health Wellness
- Clinical Nutrition
4. Review of CPI Report Phase II: Reviewer Feedback
What's in common: all academic programs concluded with a set of improvement initiatives.
What's missing:
- An action plan with targeted outcomes for each of the initiatives, a timeline, and assigned responsibilities.
- Connecting resource requests (human, financial) with the initiatives, and validating them with benchmark data analysis.
An example of questions to clarify in an action plan
SOM: Strategic Enrollment Management Joint Committee:
- What is the two-year plan? What is the timetable? Who will be doing what?
- What media are used to send which market signals, and to whom? How does this progress over time (the two years)?
- What activities are going to be run, and what are their purpose and content? Who will participate in what?
- How do we build traction over the plan with all prospective students?
- What are the expected outcomes in the coming two years?
(From Dean Jess)
An example of benchmark: student/faculty ratio. [Chart: student/faculty ratio scores for NYIT (45.2) and 21 peer institutions - New Jersey Institute of Technology, Rutgers University-New Brunswick, Stevens Institute of Technology, Adelphi University, CUNY Bernard M. Baruch College, CUNY Hunter College, Fordham University, Hofstra University, LIU Brooklyn, LIU Post, Manhattan College, New York University, Pace University-New York, Rensselaer Polytechnic Institute, Rochester Institute of Technology, St John's University-New York, SUNY Farmingdale State College, SUNY University at Albany, Stony Brook University, SUNY College at Old Westbury, and Drexel University - with scores ranging from 19.6 to 84.1.]
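As background for reading the chart, the conventional student-to-faculty ratio behind benchmarks like this is typically computed as full-time-equivalent (FTE) students divided by FTE instructional faculty (assuming the common IPEDS-style definition; the charted "score" may be a scaled version of that ratio, which the slide does not define). A worked example with hypothetical numbers:

student/faculty ratio = FTE students / FTE instructional faculty
e.g., 6,000 FTE students / 400 FTE faculty = 15, i.e., a 15:1 ratio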
4. Reviewer Feedback
Reviewers' feedback: "The new initiative stated 'engage more students in research'; this is not specific enough. Perhaps it could be helpful if this goal had a specific metric assigned. Meaning, what could be a target number of students/research projects for 2019? (e.g., 5, 10, or an increase by 10% as compared to 2018; this will let you assess whether you succeeded or failed.) An outline of target dates within the two years to benchmark progress. Ideas could be given to identify space for a lab, required staffing levels, and data as to current enrollment, so that there is a measurable number to compare to and a method to reevaluate progress each term."
The recommendation was to improve strategic recruitment, but the report included no baseline, no assigned responsibilities, and no timeline for completion. The request for resources for strategic priorities does not align with the initiative.
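To make the reviewers' point concrete with hypothetical numbers: if, say, 40 students participated in research in 2018, a "10% increase" target sets a measurable 2019 goal of 40 x 1.10 = 44 students, which can then be scored as met or missed at year end.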
4. The Provost's feedback: While retention rate, year to year, and graduation rates are important higher-level outcomes, other parameters may be helpful to you for CPI initiatives (a computation sketch follows the list):
- Seat time or semesters enrolled to degree
- Stop-outs
- Change of majors
- Problem or bottleneck courses (e.g., difficulty)
- Degrees awarded and degree efficiency
- Progress to degree (e.g., proportion with 30 credits at semester 3, 60 at semester 5, etc.)
- Credits attempted/credits completed
- Total SCH required for degree
- Transfer student retention and time to degree
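A minimal sketch of how two of these parameters could be computed from student records, assuming a hypothetical table with one row per student per term and columns student_id, term_index, credits_attempted, and credits_completed (these names are illustrative, not from the slides):

import pandas as pd

# Hypothetical student-term records; column names are illustrative only.
records = pd.DataFrame({
    "student_id":        [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "term_index":        [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "credits_attempted": [15, 15, 15, 12, 12, 12, 15, 12, 9],
    "credits_completed": [15, 15, 15, 12, 9, 12, 12, 9, 6],
})

# Credits attempted vs. credits completed, per student (completion ratio).
per_student = records.groupby("student_id")[
    ["credits_attempted", "credits_completed"]
].sum()
per_student["completion_ratio"] = (
    per_student["credits_completed"] / per_student["credits_attempted"]
)

# Progress to degree: proportion of students with at least 30 cumulative
# completed credits by semester 3 (per the Provost's example).
cumulative = records.sort_values(["student_id", "term_index"]).copy()
cumulative["cum_completed"] = (
    cumulative.groupby("student_id")["credits_completed"].cumsum()
)
by_sem3 = cumulative[cumulative["term_index"] == 3]
on_track = (by_sem3["cum_completed"] >= 30).mean()

print(per_student)
print(f"Proportion with >= 30 credits at semester 3: {on_track:.0%}")

The same groupby/cumsum pattern extends to the other cumulative thresholds in the list (60 credits at semester 5, and so on).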
4. Review of CPI Report: Administrative Units
- Academic Healthcare Center
- Career Services and Student Employment
- Development and Alumni Relations
- English Language Institute
- Registrar
4. Review of CPI Report Phase II: Reviewer Feedback
What's in common: all administrative programs concluded with a set of improvement initiatives. Action plans were included.
What's missing:
- Some are too general and need more specific, measurable metrics with timeframes.
- Some need more alignment of operational goals to institutional goals.
4. Review of CPI Report Phase II: Reviewer Feedback
Reviewer feedback quotes:
- "I think that these are good initiatives, but they are too general. I think that all of these goals should have more specific metrics that will help you determine at the end of 2019 whether you succeeded or failed."
- "I do not think the report substantiates that the initiative will result in the program contributing to institutional effectiveness. In my opinion, there is not enough data to support the plan at this time."
- "In contrast, in order to better evaluate the initiative, there need to be measurable metrics so that you have a valid comparison over a certain timeframe that matches when the changes were implemented."
- "It would be good to have some quantitative metrics associated with tracking efficiency, such as response time for customer service in critical areas, to show continuous progress objectively."
- "Their plan for a new improvement initiative seems fine, and their key performance indicators and allocation of existing resources seemed well reasoned."
5. Identify the new CPI initiatives by RADs and corresponding programs/divisions/deans
Other data analyses to identify areas to intervene:
- Planning and Resource Optimization Program (PROP)
- In-house student matriculation data, as suggested by the Provost
- NSSE (2009-2018), NL_SSI (2009-2018), GSS (2014-2017)
- EMSI (supply and demand market analysis, external)
- Hanover Research
- WSJ/THE ranking metrics
- CLA+ 2019 NYIT results
Agenda 5. Selecting Key Metrics
Differentiate the CPI process from more traditional assessment activities.
"The Annual (learning) Assessment Program Report was ineffective, as it required an assessment of how effective courses were at addressing the current xxxx Program Outcomes. This serves little purpose if the Program Outcomes are vague and no longer relevant... Tweaking the program through the Annual Assessment Program Report would not lead to the level of change required to develop a leading program."
(From one of the 2018-2019 CPI Reports)