Insights into the End-of-Term Course Evaluation (SETE) Trial


Gain insights into the Fall 2013 campus-wide trial of the End-of-Term Course Evaluation (SETE), including response rates, benefits, faculty reports, challenges faced, decisions made on the fly, and lessons learned in implementing the evaluation system efficiently and effectively.

  • Evaluation
  • Faculty Reports
  • Challenges
  • Lessons Learned
  • Campus Trial




Presentation Transcript


  1. End-of-Term Course Evaluation (SETE): Fall 2013 Campus-Wide Trial

  2. Response Rate: 128,747 surveys scheduled; 46,093 surveys completed

  3. Response Rates by College: CFENS 46%, FRANKE 42%, CAL 40%, Extended Campus 39%, CHHS 37%, SBS 36%, University College 34%, COE 14%
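
  As a quick arithmetic check (not part of the original slides), the overall response rate implied by slide 2 is 46,093 / 128,747, roughly 36%. The minimal Python sketch below tabulates that figure alongside the per-college rates from slide 3; the pairing of colleges to percentages simply follows the order in which they appear on the slide.

  ```python
  # Minimal sketch: overall and per-college SETE response rates.
  # Figures come from slides 2 and 3; the college-to-percentage pairing
  # assumes the values appear in matching order on slide 3.

  scheduled = 128_747   # surveys scheduled (slide 2)
  completed = 46_093    # surveys completed (slide 2)

  overall_rate = completed / scheduled
  print(f"Overall response rate: {overall_rate:.1%}")  # ~35.8%

  # Per-college response rates as listed on slide 3
  by_college = {
      "CFENS": 0.46,
      "FRANKE": 0.42,
      "CAL": 0.40,
      "Extended Campus": 0.39,
      "CHHS": 0.37,
      "SBS": 0.36,
      "University College": 0.34,
      "COE": 0.14,
  }

  for college, rate in sorted(by_college.items(), key=lambda kv: -kv[1]):
      print(f"{college:>20}: {rate:.0%}")
  ```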

  4. Benefits: Green system; single system to facilitate Annual Review and P&T; consistent questionnaire; uses up-to-date information from PeopleSoft SoC; integration with FAAR; centralized reporting; aggregate reports for units; individual reports for faculty; SETE score

  5.–9. Faculty Reports (sample report slides)

  10. Campus-Wide Trial Challenges: Planning and implementing simultaneously; decentralized institution; dynamically dated courses; sequentially taught courses; inconsistency in SoC Primary Instructor

  11. Decisions Made on the Fly: Overlay the new system on the old; open and close dates; roster view available after 5 responses; responses with fewer than 5 available only after the close
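
  The roster-view rule on slide 11 reduces to simple threshold logic. The sketch below is purely illustrative and assumes nothing beyond what the slide states (a 5-response threshold and a close date); the function name, parameters, and example dates are hypothetical and not part of the actual SETE implementation.

  ```python
  # Illustrative sketch of the visibility rule described on slide 11:
  # results are visible once a course has 5 or more responses, and results
  # with fewer than 5 responses become visible only after the window closes.
  from datetime import date

  MIN_RESPONSES_FOR_ROSTER_VIEW = 5  # threshold stated on slide 11

  def results_visible(response_count: int, close_date: date, today: date) -> bool:
      """Return True if evaluation results may be shown to the instructor."""
      if response_count >= MIN_RESPONSES_FOR_ROSTER_VIEW:
          return True               # roster view opens once 5+ responses exist
      return today > close_date     # below threshold: wait until the window closes

  # Example (hypothetical dates): 3 responses, window open vs. closed
  print(results_visible(3, date(2013, 12, 13), date(2013, 12, 10)))  # False
  print(results_visible(3, date(2013, 12, 13), date(2013, 12, 16)))  # True
  ```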

  12. Lessons Learned: Send more than three personalized notifications; make the end date clear; support faculty in encouraging student responses

  13. Lessons Learned: Centralize mapping of evaluations; maintain continuous communication with students; encourage Smarter Services to develop normed national data and expand ways to analyze the results

  14. Resources: Tutorials at http://www.smartersurveys.com/default/index.cfm/client-resources/support-documents/

  15. Gathering Feedback: Please send additional comments to Denise Helm at Denise.helm@nau.edu

  16. Special Thank You: The Implementation Team (ITS, ELC, Student Support Desk, Office of the Registrar)
