
Jisc Learning Analytics Update and Data Sources
Explore the latest updates and data sources in Jisc Learning Analytics from April 2016 to May 2017, including uXAPI, LRW, Data Explorer, benchmarking, and more for enhanced educational insights.
Presentation Transcript
Jisc Learning Analytics Update: April 2016 to May 2017
Update agenda:
- Data Sources: What Next?
- uXAPI
- LRW, Data Explorer and Study Goal processes
- Benchmarking
- Aggregator
Data Sources: What Next?
Next focus on:
- Attendance
- Library
- Interventions
Getting the data right!
Sample data + xAPI statement templates + example dashboards = good xAPI statements
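As a rough illustration of what a "good xAPI statement" for an attendance event might look like, here is a minimal Python sketch. The actor account, activity IRIs, and `example.ac.uk` domain are invented for illustration and are not the actual Jisc statement templates; the `attended` verb IRI is the standard ADL vocabulary entry.

```python
# A minimal illustrative xAPI statement for an attendance event.
# All identifiers below are hypothetical examples, not the real
# Jisc statement templates.
attendance_statement = {
    "actor": {
        "objectType": "Agent",
        "account": {"homePage": "https://example.ac.uk", "name": "12345"},
    },
    "verb": {
        # Standard ADL verb for attendance
        "id": "http://adlnet.gov/expapi/verbs/attended",
        "display": {"en": "attended"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.ac.uk/modules/CS101/lecture/2017-05-01",
        "definition": {"name": {"en": "CS101 Lecture"}},
    },
    "timestamp": "2017-05-01T09:00:00Z",
}
```

A shared template like this is what lets sample data and example dashboards line up: every producer emits the same actor/verb/object shape, so any consumer can query it uniformly.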
Attendance, Library, Interventions: uXAPI
1. Data is provided in a simple tab-separated format.
2. uXAPI converts it to xAPI.
3. The data can then be used in Data Explorer, Study Goal, or any other analytics application.
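The conversion step above can be sketched in a few lines of Python. This is a hedged illustration of the idea only: the column names (`student_id`, `event_id`, `timestamp`) and the IRIs are assumptions, not the real uXAPI input schema or mapping.

```python
import csv
import io

# Hypothetical tab-separated input of the kind uXAPI might accept.
TSV = (
    "student_id\tevent_id\ttimestamp\n"
    "12345\tCS101-lecture-01\t2017-05-01T09:00:00Z\n"
)

def rows_to_statements(tsv_text):
    """Map each tab-separated row onto a minimal xAPI statement."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    statements = []
    for row in reader:
        statements.append({
            "actor": {"account": {"homePage": "https://example.ac.uk",
                                  "name": row["student_id"]}},
            "verb": {"id": "http://adlnet.gov/expapi/verbs/attended",
                     "display": {"en": "attended"}},
            "object": {"id": "https://example.ac.uk/events/" + row["event_id"]},
            "timestamp": row["timestamp"],
        })
    return statements

statements = rows_to_statements(TSV)
```

Once the rows are in xAPI form, any downstream tool (Data Explorer, Study Goal, or another analytics application) can consume them without knowing about the original tab-separated file.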
[Diagram: product and process overview linking Study Goal and Data Explorer to the LRW]
Data Explorer, Study Goal and LRW Processes
- Overall steering group; user groups being established
- Each product has a Jisc internal product owner
- Monthly release schedule
- Users submit bugs and feature requests, scheduled into monthly releases
- Transparent tracking via Product Plan
Feature request flow: Steering Group and user groups (Study Goal groups, DataX User Group) → feature request prioritising by the product owner → prioritised list → product roadmap → monthly sprint planning/review → monthly release.
Benchmarking
What anonymous benchmarking is useful to you?
- Comparison to aggregated peer data? Who does x compared to the average?
- VLE use? Attendance?
- Does your data correlate in the same way?
- How inclusive are you compared to the average or to your peers?
Benchmarking
Anonymous benchmarking is hard! We think it's doable! We'll be looking for people who want to work with us on this.
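The "comparison to aggregated peer data" idea can be sketched very simply: an institution sees only an anonymised aggregate of peer values, never individual institutions' figures. The metric and all numbers below are invented for illustration.

```python
# Hedged sketch of anonymous benchmarking: compare one institution's
# metric (e.g. weekly VLE logins per student) against the mean of
# anonymised peer values. All figures are invented examples.
peer_values = [4.2, 5.1, 3.8, 6.0, 4.9]  # anonymised peer aggregates
our_value = 5.6

peer_mean = sum(peer_values) / len(peer_values)  # 4.8 for these numbers
difference = our_value - peer_mean

print(f"Peer mean: {peer_mean:.2f}, our value: {our_value:.2f}, "
      f"difference: {difference:+.2f}")
```

The hard part the slide alludes to is not the arithmetic but the anonymisation: the aggregate must be built so that no participant can reverse-engineer another institution's raw figure from it.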