
Technology Tools for Assessment Challenges in Academic Innovation
Explore the use of technology tools to solve assessment challenges in academic settings, including analyzing tools, developing evaluation criteria, and understanding the importance of technology in assessment processes.
Presentation Transcript
MJ Bishop, Director, USM Kirwan Center for Academic Innovation
Sherri Braxton-Lieber, Director, Instructional Technology, UMBC
Jennifer M. Harrison, Associate Director for Assessment, Faculty Development Center, UMBC
When we complete this session, we will be able to:
Analyze how technology tools can solve assessment challenges
Classify technology tools and their assessment uses
Develop criteria to evaluate technologies for specific uses
What assessment problem are you trying to solve?
Who needs the results?
External accreditor (national/disciplinary, regional, state)
USM internal reporting (system, regents)
Institutional senior leadership
Other administrators/staff
College/departmental leadership
Faculty
Students
Task: What do you need to know about student learning? (Collecting, Connecting, Organizing, Archiving, Analyzing, Communicating)
Technology: What tools could help you find out?
Timing: When will you measure? (Diagnostic? Formative? Summative?)
Task | Technology | Timing
Collecting, Connecting, Organizing, Archiving, Analyzing, Communicating
Diagnostic? Formative? Summative?
Quantitative? Qualitative? A mix of the two?
Learner (macro): Institution, College, Department, Program
Learning (micro): Course, Assignment, Concept/competency, Continuous/real-time
Learning Analytics Data
Learning Outcomes Data
Demonstrate Mission
Focus on Common Ground
Use Backward Design
Take Inventory
The UMBC mission defines student learning goals broadly (UMBC's Mission). The institutional-level and general education learning outcomes, or Functional Competencies, express these goals in five cognitive skill sets (Institutional-Level & General Education Outcomes). These general, transferable skills become more focused and particular when expressed in program-level learning outcomes (General Education Committee; Program-Level Outcomes). Outcomes are even more specific at the course and assignment level (Course-Level Outcomes).
UMBC's Mission
FC 1: Oral and Written Communication → OUE/FYS SLO 1: Communication → FYS 102 SLO 1: Communicate → Service Learning Project
FC 3: Critical Analysis & Reasoning → OUE/FYS SLO 3: Analysis → FYS 102 SLO 2: Analyze systems → Service Learning Project
FC 5: Information Literacy → OUE/FYS SLO 5-7: Literacy, Resilience, Integrative Learning → FYS 102 SLO 3-5: Synthesize Social Justice resources → Service Learning Project
Aggregated Rubric Data: Five UMBC Writing Intensive Courses, Fall 2015, 103 Students
What direct evidence do you need to understand student learning?
Percentage of students at each rubric level:

Rubric dimension                           Proficient (4)   Competent (3)   Minimal Competence (2)   Not Competent (1)
Critical Thinking/Creativity (FC3,4,+5)    50%              40%             9%                       1%
Organization (FC3)                         39%              41%             19%                      1%
Concepts/Principles (FC4+5)                40%              46%             14%                      1%
Writing skills (FC1)                       36%              36%             28%                      1%
Terminology (FC3)                          36%              31%             31%                      2%
APA format (FC4+5)                         37%              48%             15%                      1%
Clarity (FC1)                              31%              41%             25%                      3%
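As an illustration only (not UMBC's actual workflow), here is a minimal sketch of how a percentage-by-level table like the one above could be aggregated from raw rubric scores with pandas. The file name rubric_scores.csv and its columns (student_id, dimension, score) are hypothetical.

```python
# Illustrative sketch only: aggregate raw rubric scores into the kind of
# percentage-by-level table shown above. The input file and its columns
# (student_id, dimension, score on a 1-4 scale) are hypothetical.
import pandas as pd

scores = pd.read_csv("rubric_scores.csv")

levels = {4: "Proficient", 3: "Competent", 2: "Minimal Competence", 1: "Not Competent"}
scores["level"] = scores["score"].map(levels)

# Percentage of students at each level, per rubric dimension.
distribution = (
    pd.crosstab(scores["dimension"], scores["level"], normalize="index")
      .mul(100)
      .round(0)
      .reindex(columns=list(levels.values()))
)
print(distribution)
```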
What indirect evidence do you need to understand student learning and success?
At UMBC, we already had tools for
Analytics: REX, A4L
Outcomes: Blackboard and Blackboard Outcomes, Qualtrics, Clickers, Scantron, Excel, PowerPoint, Word
At UMBC, we needed tools to link learning analytics data to student learning outcomes data, so we formed an exploratory working group.
At UMBC, we also need tools to align direct measures to outcomes and to aggregate rubric and test data, so we are piloting EAC Visual Data.
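As a rough illustration of the linking problem (not a sketch of EAC Visual Data, REX, or A4L), one approach is to join an analytics export to an outcomes export on a shared student identifier and then aggregate by outcome. All file names and column names below are hypothetical assumptions.

```python
# Illustrative sketch only: link a learning-analytics export to a student
# learning outcomes export and aggregate by outcome. File names and column
# names are hypothetical; this is not how any specific UMBC tool works.
import pandas as pd

# One row per student: engagement measures from the LMS.
analytics = pd.read_csv("lms_analytics.csv")      # student_id, logins, hours_active
# One row per student per outcome: a rubric score on a 1-4 scale.
outcomes = pd.read_csv("rubric_by_outcome.csv")   # student_id, outcome, score

# Align the two data sets on a shared student identifier.
linked = outcomes.merge(analytics, on="student_id", how="left")

# Aggregate rubric and engagement data for each learning outcome.
summary = linked.groupby("outcome").agg(
    mean_score=("score", "mean"),
    pct_proficient=("score", lambda s: (s >= 3).mean() * 100),
    mean_hours_active=("hours_active", "mean"),
)
print(summary.round(1))
```

The same join-then-aggregate pattern applies whether the outcomes side comes from rubrics, test items, or both.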
Criteria: Usability
Questions to Consider:
Does the software attempt to encode current best practices in assessment?
Can it manage multiple levels of assessment?
Is the software intuitive and easy to use? (Will users need extensive training?)
Does it work well with software you already have (and are using)?
Does it require additional steps/staff to enter and extract data?
Does it make assessment feel like an add-on, or does it make assessment work part of the teaching and learning process?
Is the software flexible enough to grow to meet your needs?
Does data export in a useable format?
Criteria: Cost
Questions to Consider:
How much is the annual software license? Are updates included?
How much will it cost for Instructional Technology support and training?
What hardware/cloud costs are needed to support it?
What hidden costs might emerge?

Criteria: Audiences
Questions to Consider:
Can the software serve the multiple levels of audiences that authentic assessment requires?
Who will need to use this software? How will they learn to use it?
Can users interpret the results easily?
Can the software provide custom and ready-made reporting tools so you can ask a range of action research questions?
What Criteria Would You Add? What Questions Would You Add?