Deeper Learning Strategies: Putting Students in Charge of Assessments

Explore the innovative approach of collaborative learning through assessment in the CLASS framework developed by Michael Bieber. This method empowers students to take control of their learning, leading to deeper engagement and understanding of subjects. Discover the motivation, theoretical background, experimental results, and opportunities for collaboration outlined in this comprehensive system.

  • Deeper Learning
  • Collaborative Learning
  • Assessment
  • Student Engagement
  • Educational Innovation

Presentation Transcript


  1. Turning Homework and Exams on their Head: Deeper Learning by Putting Students in Charge. CLASS - Collaborative Learning through Assessment. Michael Bieber, with help from S. Roxanne Hiltz, Erick Sanchez, Yi Xiong, and many others. Information Systems Department, College of Computing Sciences, New Jersey Institute of Technology. web.njit.edu/~bieber. November 2015.

  2. Outline: Motivation; About CLASS - Collaborative Learning through Assessment; Theoretical Background; Experimental Results; Interesting Issues; Invitation to Collaborate.

  3. How Now? How do assignments and exams typically work? Where do students learn in that process?

  4. Motivation: deeper learning and interest in subjects. How? Learn through active engagement (involve students as active participants), give students ownership of the entire problem life cycle, and use an online system to streamline management.

  5. Learning from doing the CLASS activities (activity cycle): make up problems → solve problems → grade solutions → dispute grades.

  6. Learning from doing the CLASS activities, mapped to learning theory: making up problems (problem-based learning), solving problems (domain learning), grading solutions (peer assessment), and disputing grades (self-assessment).

  7. Learning from doing the CLASS activities: students also learn from reading everything their peers write, including other students' problems, solutions, grade justifications, and disputes.

  8. Who does what: students make up problems, solve problems, grade solutions, and dispute grades; instructors usually edit problems, resolve grade disagreements, and resolve disputes. Students can read everything.

  9. Each step is scaffolded: a problem rubric guides making up problems, solution guidelines guide solving them, and a grading rubric guides grading. Students make up problems, solve problems, grade solutions, and dispute grades; instructors usually edit problems, resolve grade disagreements, and resolve disputes. Students can read everything.
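The cycle just described is essentially a small workflow: each step has a performer (students, or usually the instructor) and, for the student steps, a scaffold such as a rubric or guidelines. The Python sketch below is one minimal way to represent that structure; the class names and fields are illustrative assumptions, not the schema of the actual CLASS prototype.

```python
from dataclasses import dataclass
from enum import Enum


class Performer(Enum):
    STUDENTS = "students"
    INSTRUCTOR = "instructor (usually)"


@dataclass
class Step:
    name: str
    performer: Performer
    scaffold: str | None = None  # rubric or guidelines attached to the step, if any


# The CLASS activity cycle as described on the slides, in order.
CLASS_CYCLE = [
    Step("Make up problems", Performer.STUDENTS, scaffold="Problem rubric"),
    Step("Edit problems", Performer.INSTRUCTOR),
    Step("Solve problems", Performer.STUDENTS, scaffold="Solution guidelines"),
    Step("Grade solutions", Performer.STUDENTS, scaffold="Grading rubric"),
    Step("Resolve grade disagreement", Performer.INSTRUCTOR),
    Step("Dispute grade", Performer.STUDENTS),
    Step("Resolve dispute", Performer.INSTRUCTOR),
]

# Everything students produce is visible to the whole class.
for step in CLASS_CYCLE:
    print(f"{step.name:28} | {step.performer.value:20} | {step.scaffold or '-'}")
```

Laying the steps out this way makes it easy to see which parts of the cycle an online system must route to the instructor and which scaffold belongs to each student activity.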

  10. Constructivist Learning Theory. Learners are active creators of their own knowledge, learning by constructing their own understanding and knowledge of the world through experience and reflecting upon that experience (Harasim, 2012). Learners are encouraged to share their experiences, perspectives, and questions about each other's understanding (Tam, 2000).

  11. Expertise and Higher-Order Learning. Bloom's original taxonomy: knowledge, comprehension, application, analysis, synthesis, and evaluation (Bloom, 1956). Bloom's revised taxonomy: remembering, understanding, applying, analyzing, evaluating, and creating (Munzenmaier & Rubin, 2013). Knowledge creation, added in the revised version, can be aligned with the concept of deep learning (Wang, 2012).

  12. Deeper Learning. The process of taking what was learned in one situation and applying it to new situations, in other words, learning for transfer (National Research Council, 2012). An intrinsically motivated process of personalized meaning construction (Clare, 2007). Learners seek to understand the issues and interact critically with the contents of particular teaching materials, relate ideas to previous knowledge and experience, examine the logic of the arguments, and relate the evidence presented to the conclusions (Beattie et al., 1997).

  13. Problem-Based Learning. Driven by challenging, open-ended questions, collaborative learning, and constructivist pedagogies (Savery & Duffy, 1995; Swan et al., 2013). An instructional method in which students learn through facilitated problem solving (Hmelo-Silver, 2004). A learning process that enables students to generate new knowledge from real-world problems and develop analytical thinking and problem-solving skills (Phumeechanya & Wannapiroon, 2014).

  14. Self and Peer Assessment. Assessment, teaching, and learning are inextricably linked; assessment should be integral to education in that it serves to guide the teaching and learning process (Hargreaves, 1997). An effective approach to encourage deeper learning, such as creating new ideas and critically judging other students' work (Bhalerao & Ward, 2001).

  15. Experiment with Essay Exams. NJIT CIS677: Information System Principles, a graduate-level introductory core course (Master's/Ph.D.). Goal: study how IS/IT can be used effectively. Both on-campus and distance-learning sections. Software: WebBoard LMS (before the CLASS prototype). Traditional exam: three hours, in class, 3-4 essay questions, 6 pages of notes. Control groups were compared with treatment groups.

  16. Enjoyability (Cronbach's Alpha = 0.68). Scale: SA = strongly agree (5 points), A = agree (4), N = neutral (3), D = disagree (2), SD = strongly disagree (1); the mean is out of 5 points; S.D. = standard deviation; # = number of responses.
  • I enjoyed the flexibility in organizing my resources: SA 26.2%, A 48.9%, N 16.7%, D 3.6%, SD 4.6%; Mean 3.88, S.D. 1.00, # 221
  • I was motivated to do my best work: SA 23.5%, A 42.9%, N 28.2%, D 3.4%, SD 2.1%; Mean 3.82, S.D. 0.92, # 238
  • I enjoyed the examination process: SA 17.2%, A 42.3%, N 22.6%, D 10.5%, SD 7.4%; Mean 3.51, S.D. 1.13, # 239
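The reported means follow from the Likert distributions themselves (SA = 5 points down to SD = 1), so they can be sanity-checked. Here is a small Python sketch using the three rows above; small discrepancies are expected because the published percentages are themselves rounded.

```python
# Likert weights from the slide: SA = 5, A = 4, N = 3, D = 2, SD = 1.
WEIGHTS = (5, 4, 3, 2, 1)

# (question, [SA, A, N, D, SD] as fractions, reported mean), copied from the table above.
rows = [
    ("Flexibility in organizing my resources", [0.262, 0.489, 0.167, 0.036, 0.046], 3.88),
    ("Motivated to do my best work",           [0.235, 0.429, 0.282, 0.034, 0.021], 3.82),
    ("Enjoyed the examination process",        [0.172, 0.423, 0.226, 0.105, 0.074], 3.51),
]

for question, dist, reported in rows:
    computed = sum(w * p for w, p in zip(WEIGHTS, dist))
    # Differences of about +/-0.01 reflect rounding in the published percentages.
    print(f"{question}: computed {computed:.2f}, reported {reported:.2f}")
```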

  17. Perceived Learning (Cronbach's Alpha = 0.88). Same scale as above.
  • I learned from making up questions: SA 17.9%, A 42.5%, N 21.3%, D 13.8%, SD 4.5%; Mean 3.55, S.D. 1.08, # 240
  • I learned from grading other students' answers: SA 17.7%, A 48.1%, N 19.4%, D 9.3%, SD 5.5%; Mean 3.63, S.D. 1.06, # 237
  • I learned from reading other people's answers: SA 15.8%, A 45.0%, N 22.1%, D 11.3%, SD 5.8%; Mean 3.54, S.D. 1.07, # 240
  • I demonstrated what I learned in class: SA 13.6%, A 50.2%, N 22.6%, D 10.9%, SD 2.7%; Mean 3.61, S.D. 0.95, # 221
  • My ability to integrate facts and develop generalizations improved: SA 21.8%, A 49.2%, N 25.6%, D 2.1%, SD 1.3%; Mean 3.88, S.D. 0.83, # 238
  • I learned to value other points of view: SA 17.6%, A 51.9%, N 27.6%, D 1.3%, SD 1.6%; Mean 3.82, S.D. 0.81, # 239
  • I mastered the course materials: SA 7.4%, A 51.6%, N 31.4%, D 6.9%, SD 2.7%; Mean 3.54, S.D. 0.84, # 188
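Cronbach's alpha (0.68 for the enjoyability scale, 0.88 for perceived learning) estimates how consistently the items in each scale move together. The per-respondent item data are not in the slides, so the sketch below only illustrates the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), on made-up responses.

```python
def cronbach_alpha(item_scores: list[list[float]]) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(item_scores)      # number of items in the scale
    n = len(item_scores[0])   # number of respondents

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_variance_sum = sum(variance(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - item_variance_sum / variance(totals))


# Made-up 1-5 responses for a three-item scale from four respondents (not the study's data).
items = [
    [4, 5, 3, 4],
    [4, 4, 3, 5],
    [5, 4, 2, 4],
]
print(f"alpha = {cronbach_alpha(items):.2f}")
```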

  18. Recommendation: Do Again!
  • Would you recommend that this exam process be used in the future? SA 20.7%, A 40.1%, N 24.5%, D 8.9%, SD 5.8%; Mean 3.60, S.D. 1.10, # 237

  19. Experiment with Essay Exams. Experimental results: students felt they learned more, students enjoyed the exam more, and students recommended it for future classes. What students liked best: active involvement in the exam process, flexibility to use any resources, and a reduction in tension.

  20. So, would I use CLASS again?

  21. CLASS Prototype

  22. Fall 2014 and Fall 2015 with the new CLASS prototype: Engineering Ethics (essay questions about ethics scenarios; quizzes with true/false, matching, and short-answer items), Computer Ethics (essay questions about ethics scenarios), a PhD seminar on social media (essay questions), and a computer science MATLAB course (MATLAB homework assignments). Student surveys gave similar results.

  23. Issues: creating rubrics; calibrating student activities; encouraging self-review; groups for each activity; a more flexible structure; the few who don't participate; MOOCs; measuring actual learning. (The slide repeats the CLASS activity-cycle diagram from slide 9.)

  24. More Issues. For students: the timing is drawn out (2.5 weeks); there is a learning curve for creating problems, grading, and disputing; calibration and learning to use rubrics; anonymity within the online system; trusting peers. Trade-offs for instructors: fewer solutions to evaluate, but each is different; fitting the process into the semester schedule.

  25. Extending Scope. Which problem types? So far: short and long essay questions. What about multiple-choice questions, short answers, computer programs, and semester projects? Which course activities? So far: exams and short essays in online discussions. What about quizzes, homework, larger projects, in-class projects, and other types of exams? Which course levels: junior high, high school, community college? Group involvement in each CLASS stage? Grading the quality of problems, grades, and the other steps?

  26. Selected Research Questions: aiming for higher levels of learning (and measuring them); group vs. individual activities; learning interpersonal skills; motivating interest in uninteresting topics; motivating articulation to further education; determining the best scaffolds for all CLASS steps.

  27. What would it take for you to use CLASS? Invitation to collaborate! bieber@njit.edu. (The slide repeats the CLASS activity-cycle diagram from slide 9.)

  28. References
  • Alonso, F., Manrique, D., Martinez, L., & Vines, J. M. (2011). How blended learning reduces underachievement in higher education: An experience in teaching computer sciences. IEEE Transactions on Education, 54(3), 471-478. doi: 10.1109/TE.2010.2083665
  • Beattie, V., Collins, B., & McInnes, B. (1997). Deep and surface learning: A simple or simplistic dichotomy? Accounting Education, 6(1), 1-12. doi: 10.1080/096392897331587
  • Bhalerao, A., & Ward, A. (2001). Towards electronically assisted peer assessment: A case study. ALT-J, 9(1), 26-37.
  • Bloom, B. S. (1956). Taxonomy of educational objectives: The classification of educational goals. New York: D. McKay Co.
  • National Research Council (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. Washington, DC: The National Academies Press.
  • Harasim, L. (2012). Learning theory and online technologies. New York: Taylor & Francis Group.
  • Hargreaves, D. J. (1997). Student learning and assessment are inextricably linked. European Journal of Engineering Education, 22(4), 401.
  • Heinze, A., Procter, C., & Scott, B. (2007). Use of conversation theory to underpin blended learning. International Journal of Teaching and Case Studies, 1(1-2), 108.
  • Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn? Educational Psychology Review, 16(3), 235-266.
  • Munzenmaier, C., & Rubin, N. (2013). Bloom's taxonomy: What's old is new again. The eLearning Guild.
  • Phumeechanya, N., & Wannapiroon, P. (2014). Ubiquitous scaffold learning environment using problem-based learning to enhance problem-solving skills and context awareness.
  • Savery, J. R., & Duffy, T. M. (1995). Problem based learning: An instructional model and its constructivist framework. Educational Technology, 35, 31-38.
  • Swan, K., Vahey, P., van 't Hooft, M., Kratcoski, A., & Rafanan, K. (2013). Problem-based learning across the curriculum: Exploring the efficacy of a cross-curricular application of preparation for future learning. Interdisciplinary Journal of Problem-based Learning, 7(1), 89-110. doi: 10.7771/1541-5015.1307
  • Tam, M. (2000). Constructivism, instructional design, and technology: Implications for transforming distance learning. Educational Technology and Society, 3(2), 50-60.
  • Wang, V. C. X. (2012). Understanding and promoting learning theories. International Forum of Teaching & Studies, 8(2), 5-11.
