Sociology Outcomes Assessment and Process Review


Explore the detailed process of assessing Sociology student artifacts for ELO mastery, including the review procedures and analysis results. Discover key recommendations for the future of Sociology education.

  • Sociology
  • Assessment
  • Student Artifacts
  • ELOs
  • Review


Presentation Transcript


  1. Sociology Outcomes Assessment. Wade Cole & Claudia Geist, former and current Directors of Graduate Studies, Department of Sociology.

  2. Outline
     • What did we do? The review process & procedures
     • What did we find? A summary of our analysis and results
     • What's next? Recommendations for the future

  3. Sociology ELOs
     • ELO 1: Understand what sociology is, as a social science discipline.
     • ELO 2: Utilize sociological theories to guide research and improve understanding of social phenomena and human behavior.
     • ELO 3: Learn to use a variety of research methods as a means of understanding the social world and human interaction.
     • ELO 4: Apply sociological and social-science perspectives to the understanding of real-world problems or topics (e.g., issues of diversity, health, globalization, crime & law, sustainability).
     • ELO 5: Communicate effectively about sociological issues, making well-organized arguments supported by relevant evidence.

  4. Process
     First, we identified courses from which to sample student artifacts:
     • ELO 2: upper-division required course, upper-division elective
     • ELO 4: lower-division required course, upper-division elective
     • ELO 5: upper-division required course, upper-division elective

  5. Process
     The DUGS contacted instructors for these courses, asking them to submit six student artifacts from assignments they used to evaluate the stated ELO. Instructors submitted two artifacts they deemed high quality, two intermediate quality, and two low quality. Artifacts consisted of written assignments, final papers, and essays. After the artifacts were anonymized, they were distributed to Committee members for evaluation.
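
The slides do not describe the anonymization mechanics. As a minimal sketch under assumed file names and tooling (not the department's actual procedure), the step could be scripted like this:

```python
# Illustrative sketch only: directory layout, file names, and the use of Python
# are assumptions, not part of the committee's documented process.
import csv
import shutil
import uuid
from pathlib import Path

def anonymize_artifacts(submitted_dir: str, review_dir: str, key_file: str) -> None:
    """Copy instructor-submitted artifacts under random IDs; keep the mapping in a private key file."""
    out = Path(review_dir)
    out.mkdir(parents=True, exist_ok=True)
    with open(key_file, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["anonymous_id", "original_file"])  # retained privately, not sent to raters
        for artifact in sorted(Path(submitted_dir).glob("*.*")):
            anon_id = uuid.uuid4().hex[:8]
            shutil.copy(artifact, out / f"{anon_id}{artifact.suffix}")
            writer.writerow([anon_id, artifact.name])

# Hypothetical usage:
# anonymize_artifacts("submitted/ELO2", "for_review/ELO2", "keys/ELO2_key.csv")
```

In this sketch the key file is kept separate from the review folder so that ratings can later be matched back to the original submissions.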

  6. Process
     Evaluators were provided with a scoring sheet for rating the artifacts. Scoring sheets asked evaluators to consider how well the assignment assesses the specified ELO. For each of the six artifacts, evaluators rated ELO mastery using the following four-point scale:
     • 0 = Poor (there is no evidence that the ELO was addressed)
     • 1 = Emerging/Low (initial but substandard effort to address the ELO)
     • 2 = Competent/Mid (ELO achieved with reasonable proficiency)
     • 3 = Exemplary/High (artifact demonstrates mastery of the ELO)
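
For concreteness, the four-point scale (and the 0-to-1 recode used later when rater scores are compared with the instructors' high/intermediate/low classification) can be written as a small mapping. This is an illustrative sketch, not part of the committee's materials; only the labels come from the scale above.

```python
# Illustrative only. Labels mirror the scoring sheet's four-point scale; the recode
# reflects the note on the kappa table (0 = poor collapsed into 1 = emerging/low).
SCALE = {
    0: "Poor (no evidence that the ELO was addressed)",
    1: "Emerging/Low (initial but substandard effort to address the ELO)",
    2: "Competent/Mid (ELO achieved with reasonable proficiency)",
    3: "Exemplary/High (artifact demonstrates mastery of the ELO)",
}

def recode_for_instructor_comparison(rating: int) -> int:
    """Collapse 0 (poor) into 1 (emerging/low) so rater scores align with the
    instructors' three-level (low/intermediate/high) classification."""
    return max(rating, 1)
```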

  7. Analysis
     Interrater agreement analyses: kappa interrater agreement scores.

                               ELO 2     ELO 4     ELO 5
     Raters only               .120*     .497**    .130
     Raters plus instructor    .246***   .498***   .243***
     Raters plus instructor    .249**    n/a       .327***

     * p<.05, ** p<.01, *** p<.001 (two-tailed). For this analysis, ratings of 0 (poor) were recoded as 1 (emerging/low) to render them comparable with instructors' assessments of high, intermediate, and low artifacts. No coefficient is estimated for ELO 4 because none of the artifacts received a rating of 0.
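
The report does not say which software or which kappa variant produced these coefficients; with several raters, a multi-rater statistic such as Fleiss' kappa may have been used. As a rough sketch of the kind of calculation involved, pairwise Cohen's kappa on made-up ratings could be computed like this:

```python
# Sketch only: the data values are hypothetical and the choice of pairwise
# Cohen's kappa (scikit-learn) is an assumption, not the report's method.
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings of the same six artifacts on the 0-3 scale.
rater_a = [3, 2, 2, 1, 1, 0]
rater_b = [3, 3, 2, 2, 1, 1]
instructor = [3, 3, 2, 2, 1, 1]  # instructor's high/intermediate/low groups coded 3/2/1

# "Raters only": agreement between two committee raters on the full 0-3 scale.
print("raters only:", cohen_kappa_score(rater_a, rater_b))

# "Raters plus instructor": 0 (poor) is first recoded as 1 (emerging/low) so the
# rater scale matches the instructor's three categories, per the table note.
recoded_a = [max(r, 1) for r in rater_a]
print("rater vs. instructor:", cohen_kappa_score(recoded_a, instructor))
```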

  8. Analysis
     Disaggregated kappa interrater agreement scores, by rating category.

                                       Poor      Low       Mid      High
     ELO 2  Raters only                .768***   .518**    -.178    .000
            Raters plus instructor     n/a       .484***   .063     .188
     ELO 4  Raters only                n/a       .822**    .250     .395
            Raters plus instructor     n/a       .880***   .250     .345*
     ELO 5  Raters only                -.091     .308*     -.037    .182
            Raters plus instructor     n/a       .484***   .121     .348**
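
The slides do not spell out how the kappa scores were disaggregated by rating category. One common approach, assumed here purely for illustration, is to dichotomize each rating as "this category vs. any other" and compute a kappa per category:

```python
# Illustrative sketch of category-specific kappa; not necessarily the report's method.
from sklearn.metrics import cohen_kappa_score

def category_kappa(ratings_a, ratings_b, category):
    """Kappa for agreement on one rating category, treating it as 'category vs. any other'."""
    a = [int(r == category) for r in ratings_a]
    b = [int(r == category) for r in ratings_b]
    return cohen_kappa_score(a, b)

# Hypothetical ratings on the 0-3 scale.
rater_a = [3, 2, 2, 1, 1, 0]
rater_b = [3, 3, 2, 2, 1, 1]

for value, label in [(0, "poor"), (1, "low"), (2, "mid"), (3, "high")]:
    print(label, round(category_kappa(rater_a, rater_b, value), 3))
```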

  9. Summary of Findings
     • Interrater agreement for ELO 4 was acceptable, but raters tended to agree that students were not very successful in achieving this outcome. (ELO 4: Apply sociological and social-science perspectives to the understanding of real-world problems or topics.)
     • Agreement was much weaker for ELOs 2 and 5. (ELO 2: Utilize sociological theories to guide research and improve understanding. ELO 5: Communicate effectively about sociological issues.)

  10. Recommendations Faculty should discuss how ELOs are interpreted and assessed, with the goal of generating more agreement across instructors, courses, and raters. There seems to be tacit agreement regarding what constitutes low-quality work, but it is more difficult to identify work that is proficient or exemplary. We may want to develop rubrics for evaluating student mastery of ELOs, to improve interrater reliability and to guide instructors as they design student assessments.

  11. Recommendations We should revisit the ELOs with an eye toward distinguishing them more clearly from one another. What does it mean to utilize sociological theories (ELO 2) and apply sociological perspectives (ELO 4)? How are theories and perspectives different? In qualitative feedback, reviewers noted that assignments designed to assess ELO 2 were not always well-suited to the task, precisely because they asked students to consider perspectives and concepts rather than theories per se.

  12. Recommendations We plan to standardize the department's assessment procedures and methodologies, which will permit longitudinal analyses in the future. It is essential to compile and analyze comparable data over time to establish trends and track progress (or the lack thereof). The Committee will reanalyze ELOs 1 and 3 this spring using the framework established in this report.
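
One way to keep future results comparable over time is to agree on a fixed record layout for ratings. The fields below are assumptions offered only as a starting point, not an adopted departmental schema.

```python
# Hypothetical record layout for longitudinal assessment data (illustrative only).
from dataclasses import dataclass

@dataclass
class ArtifactRating:
    cycle: str         # assessment cycle, e.g. "2024-spring"
    elo: int           # ELO number, 1-5
    course_level: str  # "lower-division required", "upper-division elective", etc.
    artifact_id: str   # anonymized artifact identifier
    rater_id: str      # anonymized committee member
    rating: int        # 0-3 scale defined earlier
```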

  13. Recommendations The Committee might also consider abandoning the practice of soliciting stratified samples of student artifacts from instructors. This strategy primes reviewers to expect patterned variation in the artifacts they evaluate. Raters know that they will be given two examples each of high-, intermediate-, and low-quality work for review, and this knowledge may bias their own independent assessments of the artifacts. If anything, this approach may overstate levels of interrater agreement.
