Enhancing Learning Analytics Through Pedagogical Maturity

This presentation explores the role of pedagogical maturity in advancing learning analytics within higher education. It argues that, for sustainable impact, pedagogical considerations must be integrated with the technical aspects of learning analytics, and it draws on maturity models in general, analytical maturity and e-learning maturity models to propose a pedagogical maturity framework spanning pedagogy, assessment and the student experience.

  • Learning Analytics
  • Pedagogy
  • Maturity Models
  • Higher Education
  • Educational Technology


Presentation Transcript


  1. THE MISSING LINK IN LEARNING ANALYTICS: PEDAGOGICAL MATURITY. Dr Angelo Fynn (UNISA), Dr Liz Archer (UWC)

  2. INTRODUCTION
     Shifts in HE: policy environment, funding environment, technological environment, student participation.
     New educational research, academic analytics, learning and teaching, instructional technology, learning analytics.

  3. INTRODUCTION

  4. PROBLEM STATEMENT Learning analytics is still very much in its infancy compared with established disciplines (Viberg et al., 2018). The focus thus far has mainly been on producing effective models; these are, however, of little value without implementation.

  5. PROBLEM STATEMENT "Learning analytics tools are generally not developed from theoretically established instructional strategies, especially those related to provision of student feedback" (Gasevic, Dawson & Siemens, 2015, p. 66). An example is repurposed academic analytics software such as Course Signals.

  6. PROBLEM STATEMENT For Learning Analytics to have a sustainable and positive impact on the higher education landscape, the pedagogical component cannot be neglected while technical matters receive all the attention.

  7. MATURITY MODELS Maturity models have a long history in the information technology (IT) sector, where they are used to strategically implement, manage, plan and optimise systems within an organisation (Crowston, 1993), capturing best practices with evidence of effectiveness. More recently, maturity models have emerged within the higher education sector, applied to teaching computer science and to implementing business intelligence, e-learning and m-learning (Duarte & Martins, 2011; Gu, Chen, & Pu, 2011; Marshall & Mitchell, 2001; Neuhauser, 2004).

  8. ANALYTICAL MATURITY Stages, from most to least mature: Explorative; Empowered; Analytically astute; Analytically aware; Analytically unaware.

  9. E-LEARNING MATURITY MODELS Levels, from most to least mature: Optimising; Managed; Defined; Repeatable; Initial.
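
These level names describe an ordered scale, so they can be represented directly as an ordered enumeration. Below is a minimal Python sketch, assuming the five CMM-style levels listed above; the class, function and example institution are illustrative assumptions, not part of any cited model.

```python
from enum import IntEnum

class ELearningMaturity(IntEnum):
    """CMM-style e-learning maturity levels, ordered from least to most mature."""
    INITIAL = 1
    REPEATABLE = 2
    DEFINED = 3
    MANAGED = 4
    OPTIMISING = 5

def meets_threshold(current: ELearningMaturity, required: ELearningMaturity) -> bool:
    """True if an institution's assessed level is at or above the required level."""
    return current >= required

# Hypothetical example: an institution assessed at 'Repeatable' is asked
# whether its e-learning processes are at least 'Defined'.
assessed = ELearningMaturity.REPEATABLE
print(meets_threshold(assessed, ELearningMaturity.DEFINED))  # False
```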

  10. PEDAGOGICAL MATURITY Maturation of teaching and learning processes within the institution to the point where innovative and strategic approaches to teaching and learning are enabled

  11. Pedagogy:
      Level 5: Institutional synthesis of analytics into the teaching and learning process.
      Level 4: Projections of student learning based on engagement, performance and non-cognitive factors.
      Level 3: Consensus and synthesis between courses on learning activities and epistemology.
      Level 2: Between-course consensus and explicitness about outcomes and about what constitutes a learning activity.
      Level 1: Module-specific teaching approach; little integration between modules; lack of clarity about what constitutes a learning activity.

  12. Maturity matrix (Level by Pedagogy, Assessment and Student experience) with only the Pedagogy column populated, repeating the level descriptors from slide 11; the Assessment and Student experience columns are filled in on the following slides.

  13. Assessment:
      Level 5: Adaptive testing automated by predetermined thresholds.
      Level 4: Assessments developed based on current data on student learning.
      Level 3: Assessment development based on historical trends of student performance.
      Level 2: Multiple assessment modalities; clearly established assessment framework.
      Level 1: Assessment designed for individual instances of assessment; no historical track record of student outcomes; no clear structure for the assessment process.
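
To make the Level 5 descriptor concrete, the following is a minimal, hypothetical Python sketch of "adaptive testing automated by predetermined thresholds"; the threshold values, tier names and function are illustrative assumptions rather than anything prescribed by the framework.

```python
# Hypothetical illustration: the next assessment tier is chosen by comparing a
# student's current score against preset cut-offs. All values are assumptions.

THRESHOLDS = [          # (minimum score, next assessment tier), highest first
    (0.80, "advanced"),
    (0.50, "intermediate"),
    (0.00, "foundational"),
]

def next_assessment_tier(current_score: float) -> str:
    """Return the difficulty tier of the next assessment for this student."""
    for minimum, tier in THRESHOLDS:
        if current_score >= minimum:
            return tier
    return "foundational"  # fallback for scores below every threshold

print(next_assessment_tier(0.65))  # -> intermediate
```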

  14. Maturity matrix with the Pedagogy and Assessment columns populated as in slides 11 and 13; the Student experience column is introduced on the next slide.

  15. Student experience:
      Level 5: Real-time, automated and personalised feedback throughout; alignment of epistemology.
      Level 1: Disjointed learning experience between courses; independent, diverse epistemologies.

  16. Full maturity matrix:
      Level 5 | Pedagogy: institutional synthesis of analytics into the teaching and learning process. | Assessment: adaptive testing automated by predetermined thresholds. | Student experience: real-time, automated and personalised feedback throughout; alignment of epistemology.
      Level 4 | Pedagogy: projections of student learning based on engagement, performance and non-cognitive factors. | Assessment: assessments developed based on current data on student learning.
      Level 3 | Pedagogy: consensus and synthesis between courses on learning activities and epistemology. | Assessment: assessment development based on historical trends of student performance.
      Level 2 | Pedagogy: between-course consensus and explicitness about outcomes and what constitutes a learning activity. | Assessment: multiple assessment modalities; clearly established assessment framework.
      Level 1 | Pedagogy: module-specific teaching approach; little integration between modules; lack of clarity about what constitutes a learning activity. | Assessment: assessment designed for individual instances of assessment; no historical track record of student outcomes; no clear structure for the assessment process. | Student experience: disjointed learning experience between courses; independent, diverse epistemologies.
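
As a rough illustration of how such a matrix could be operationalised as a self-assessment rubric, here is a hedged Python sketch limited to the Pedagogy column; the data structure, function and example profile are assumptions for illustration, while the descriptors are taken from the matrix above.

```python
# Illustrative sketch only: the Pedagogy column of the matrix as a lookup table,
# plus a naive self-assessment that reports the highest level whose descriptor
# the institution claims to satisfy. Structure and example are assumptions.

PEDAGOGY_LEVELS = {
    5: "Institutional synthesis of analytics into the teaching and learning process",
    4: "Projections of student learning based on engagement, performance and non-cognitive factors",
    3: "Consensus and synthesis between courses on learning activities and epistemology",
    2: "Between-course consensus and explicitness about outcomes and learning activities",
    1: "Module-specific teaching approach with little integration between modules",
}

def assess_pedagogy(satisfied_levels):
    """Return the highest maturity level among the descriptors reported as satisfied."""
    valid = [level for level in satisfied_levels if level in PEDAGOGY_LEVELS]
    return max(valid, default=1)

# Hypothetical institution reporting that it satisfies the Level 1-3 descriptors.
level = assess_pedagogy({1, 2, 3})
print(level, "-", PEDAGOGY_LEVELS[level])  # 3 - Consensus and synthesis ...
```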

  17. CONCLUSIONS There has been work on developing theoretically derived learning analytics, for example the LOCO-Analyst tool, which links academic activity to specific learning domains. Developing pedagogical maturity requires multidisciplinary teams. Pedagogical maturity ranges along a continuum.

  18. CONCLUSIONS To what extent should IRO/MIS drive the development of pedagogical maturity? Can we still develop learning analytics in the absence of pedagogical maturity? How do students shape the pedagogical maturity of an institution? Can a single framework encompass an institutional pedagogy?
