IU Learning Analytics Summit: Disaggregation & Inclusive Learning Analytics Presentation

To Cite and Share this Presentation:

Ambrose, G. A., Goodrich, V., Craker, A., & McWilliams, L. (2021). “Using Disaggregation & Inclusive Curriculum Analytics to Identify Barriers, Measure Outcome Disparities, and Close Achievement Gaps.” Indiana University’s 3rd Annual Learning Analytics Summit: Data-Informed Stories, Transformational Journeys, Indiana.

To access and comment on the slides, click here, or click here to watch the 18-minute recorded presentation.

Abstract
We present an inclusive learning analytics framework for identifying at-risk, or rather “non-thriving,” students in a large-enrollment introductory general chemistry course. With the overall goals of closing opportunity gaps, maximizing all students’ potential for course success, and increasing STEM retention rates, our study used a hybrid approach that combines predictive modeling with domain experts’ decision-making to identify underperforming students early in the course. We recognize that different institutions will have different definitions of thriving and different course structures, but the methods we used provide scholar-practitioners with a set of tools that can be replicated and customized for STEM courses on their own campuses.

Description
Although identifying “at-risk” students has been a popular field of research for introductory science courses, our study expanded the current research in two areas: 1) broadening the search criteria to students who are likely non-thriving, not necessarily “at risk” of failing the course; and 2) utilizing early and current course performance data instead of before-course characteristics. These two focus points allowed us to capture a more refined group of students, with the goal of helping all students not just survive, but thrive, in STEM programs.

Our study is grounded in two research questions: (1) What are the best and earliest predictors that a learner is non-thriving in the course? (2) What data-driven methods can we provide administrators and instructors to identify these students and help them improve their course performance?

To answer those research questions, we coupled an exploratory data analysis approach with decision-making by domain experts (course professors and coordinators, advisors, data scientists, and learning experts from the university’s teaching and learning center). This hybrid approach ensured campus context was taken into consideration when identifying non-thriving students. We used it to determine potential grade cutoffs for non-thriving triggers and to identify the best and earliest predictors of non-thriving performance. Our predictors caught 6 out of the 6 students who dropped the course, and 19 out of the 33 non-thriving students. We plan to improve the accuracy of our predictive model and the effectiveness of our boosting strategies in future iterations of the study.
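As a rough illustration of the trigger-and-recall idea described above, a minimal sketch follows. This is not the authors’ actual model: the column name (`exam1_pct`), the cutoff value, and the toy data are all invented assumptions for the sake of the example.

```python
# Hypothetical sketch: flag students below a domain-expert-chosen
# early-course cutoff (the "non-thriving trigger"), then measure how
# many truly non-thriving students the trigger caught (recall).
# Column name, cutoff, and data are illustrative assumptions.

def flag_non_thriving(students, score_key="exam1_pct", cutoff=75.0):
    """Return the ids of students whose early score falls below the cutoff."""
    return {s["id"] for s in students if s[score_key] < cutoff}

def recall(flagged, actual_non_thriving):
    """Fraction of truly non-thriving students that were flagged."""
    if not actual_non_thriving:
        return 0.0
    return len(flagged & actual_non_thriving) / len(actual_non_thriving)

# Toy example with four students
students = [
    {"id": "a", "exam1_pct": 62.0},
    {"id": "b", "exam1_pct": 88.0},
    {"id": "c", "exam1_pct": 71.5},
    {"id": "d", "exam1_pct": 79.0},
]
flagged = flag_non_thriving(students)        # flags "a" and "c"
print(recall(flagged, {"a", "c", "d"}))      # caught 2 of 3 truly non-thriving
```

In practice, the cutoff would come from the exploratory analysis and the domain experts’ judgment rather than a fixed constant, and the flagged set would feed the boosting strategies the study mentions.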

