Description: We present an inclusive learning analytics framework for identifying at-risk, or rather “non-thriving,” students in a large-enrollment introductory general chemistry course. With the overall goals of closing opportunity gaps, maximizing all students’ potential for course success, and increasing STEM retention rates, our study used a hybrid approach combining predictive modeling and domain experts’ decision-making to identify underperforming students during the early part of the course. We recognize that different institutions will have different definitions of thriving and different course structures, but the methods we used in our study provide scholar-practitioners with a set of tools that can be replicated and customized for STEM courses on their campuses.
Description Although identifying “at-risk” students has been a popular field of research for introductory science courses, our study expanded the current research in two areas: 1) broadening the search criteria to students who are likely non-thriving, not necessarily “at-risk” of failing the course; and 2) utilizing early and current course performance data instead of before-course characteristics. These two focus points allowed us to capture a more refined group of students, with the goal of helping all students not just survive, but thrive in STEM programs.
Our study is grounded in two research questions: (1) What are the best and earliest predictors of non-thriving performance in the course? (2) What data-driven methods can we provide administrators and instructors to identify these students and help them improve their course performance?
To answer those research questions, we coupled an exploratory data analysis approach with decision-making by domain experts (course professors and coordinators, advisors, data scientists, and learning experts from the university’s teaching and learning center). This hybrid approach ensured that campus context was taken into consideration when identifying non-thriving students. We used it to determine the potential grade cutoffs for non-thriving triggers and to identify the best and earliest predictors of non-thriving performance. Our predictors caught all 6 students who dropped the course and 19 of the 33 non-thriving students. We plan to improve the accuracy of our predictive model and the effectiveness of our boosting strategies in future iterations of our study.
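The cutoff-based trigger described above can be sketched in a few lines. This is a minimal illustration, not the study's actual model: the assessment names, weights, and the 80-point cutoff are all hypothetical placeholders for whatever an institution's domain experts select.

```python
# Minimal sketch of a cutoff-based "non-thriving" trigger.
# Assessment names, weights, and the cutoff are hypothetical
# illustrations, not the study's actual values.

def weighted_score(scores, weights):
    """Weighted average of early-course assessment scores (0-100 scale)."""
    total_w = sum(weights[k] for k in scores)
    return sum(scores[k] * weights[k] for k in scores) / total_w

def flag_non_thriving(students, weights, cutoff=80.0):
    """Return IDs of students whose early performance falls below the cutoff."""
    return [sid for sid, scores in students.items()
            if weighted_score(scores, weights) < cutoff]

weights = {"quiz1": 0.25, "quiz2": 0.25, "exam1": 0.50}
students = {
    "s01": {"quiz1": 92, "quiz2": 88, "exam1": 85},
    "s02": {"quiz1": 70, "quiz2": 65, "exam1": 72},
    "s03": {"quiz1": 85, "quiz2": 90, "exam1": 78},
}
print(flag_non_thriving(students, weights))  # -> ['s02']
```

In practice, the cutoff and weights would come from the hybrid expert/data process the study describes, not from fixed constants.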
Inclusive Curriculum Analytics for Undergraduate STEM Degrees: Using data to identify barriers, measure outcome disparities, and close achievement gaps
As formal credit-earning opportunities grow, such as through credit by examination, it is imperative that institutions understand how their advanced placement credit acceptance policies shape their students’ experiences on campus. While most schools have focused on how students with advanced credit perform in follow-on classes, fewer have examined how these policies affect students without the same opportunities. This case study asks: how do credit acceptance policies shape the student academic experience within one College of Engineering? The poster will focus on how one College of Engineering identified inequities through a data-driven study of students’ college performance as it relates to their credits earned prior to matriculation. It will provide a roadmap for other institutions to investigate their own student data as it pertains to current policies.
Background, Problem, & Evidence:
More and more students begin college having already earned multiple college credits. As formal credit-earning opportunities grow, such as through credit by examination, it is imperative that colleges and universities understand how their credit acceptance policies shape their students’ experiences on campus. While many studies have focused on program benefits such as additional schedule flexibility, less time to degree, and exposure to advanced topics, few have quantified the collateral impact of credit-earning opportunities on students who do not have credit when entering college. Without specifically quantifying and understanding this, it is easy to perpetuate or expand an achievement gap that started well before students entered college.
In this session, we will show how one College of Engineering used student performance data to identify and understand potential inequities in existing policy decisions. By accepting credit in required courses, in this case calculus, chemistry, and physics, two groups were formed: (1) students who were ahead of the stated curriculum and (2) students who were following the curriculum as published and expected. Looking at shared courses between these two tracks, such as physics or sophomore-level engineering courses, exposed real and concerning grade-performance disparities resulting from this policy. This session will present data from this study and describe a methodology for conducting similar data analyses at other schools and within a wide range of programs.
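The core of the methodology above is a simple group comparison: split students by whether they entered with advanced credit, then compare their grades in a course both tracks share. The sketch below illustrates that comparison on hypothetical records; the student IDs and grade points are invented, and a real analysis would add statistical tests and controls for incoming preparation.

```python
# Sketch of the two-group comparison described above.
# Student records and grade points are hypothetical illustrations.
from statistics import mean

# Each record: (student_id, entered_with_advanced_credit, grade points in a shared course)
records = [
    ("a1", True, 3.7), ("a2", True, 3.3), ("a3", True, 4.0),
    ("b1", False, 3.0), ("b2", False, 2.7), ("b3", False, 3.3),
]

def group_means(records):
    """Mean shared-course grade for advanced-credit vs. standard-track students."""
    ahead = [g for _, adv, g in records if adv]
    standard = [g for _, adv, g in records if not adv]
    return mean(ahead), mean(standard)

ahead_mean, standard_mean = group_means(records)
print(f"ahead: {ahead_mean:.2f}, standard: {standard_mean:.2f}, "
      f"gap: {ahead_mean - standard_mean:.2f}")
```

Repeating this comparison across each shared course in the curriculum is what surfaces the disparities the study reports.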
Expanded Figures from the poster:
CoursePathVis is a visual analytical tool for exploring and analyzing students’ progress through a college curriculum using a Sankey diagram. We group students in multiple ways (by their AP courses, term courses, and a user-specified funnel course) to offer different perspectives on the underlying data. With these flexible grouping techniques and the funnel-augmented Sankey diagram, CoursePathVis helps us identify patterns or outliers that affect student success.
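A Sankey diagram of curriculum flow is built from transition counts: for each student path, count how many students flow from one course to the next. The sketch below shows this counting step on hypothetical course paths; it is an illustration of the general technique, not CoursePathVis's actual implementation, and the resulting (source, target, count) links would then be fed to a Sankey plotting library.

```python
# Sketch of deriving Sankey links from student course paths:
# count transitions between consecutive courses in each path.
# Course names and paths are hypothetical.
from collections import Counter

paths = [
    ["Calc I", "Calc II", "Physics I"],
    ["Calc I", "Calc II", "Chem I"],
    ["Calc II", "Physics I", "Physics II"],
]

def sankey_links(paths):
    """Return a Counter mapping (source, target) course pairs to student counts."""
    links = Counter()
    for path in paths:
        for src, dst in zip(path, path[1:]):
            links[(src, dst)] += 1
    return links

for (src, dst), n in sorted(sankey_links(paths).items()):
    print(f"{src} -> {dst}: {n}")
```

Grouping students before counting (e.g., by AP credit or by a funnel course, as the tool does) simply means running this tally once per group and comparing the resulting diagrams.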
Victoria Goodrich, Associate Teaching Professor, Chemical and Biomolecular Engineering
Leo McWilliams, Assistant Dean for Undergraduate Programs, College of Engineering
G. Alex Ambrose, Director of Learning Research, ND Learning | Kaneb Center for Teaching Excellence
Andrew Craker, Pat Miller, Kevin Abbott, Kevin Barry, Alex Oxner, Augie Freda, Shelin Mathews, Ryan Snodgrass, Keith Mcindoo, Roberto Casarez, Joel Dosmann, Chaoli Wang, Brendan O’Handley, Michael Niemier, Morgan Ludwig and Samantha Allison