IU Learning Analytics Summit: Inclusive Learning Analytics Framework for Student Success in an Introductory STEM Course

Duan, Xiaojing, Ambrose, G. Alex (2021) “Inclusive Learning Analytics Framework for Student Success in an Introductory STEM Course” Indiana University’s 3rd Annual Learning Analytics Summit: Data-informed Stories, Transformational Journeys.

To access and comment on the slides, click here.

Description:
We present an inclusive learning analytics framework for identifying at-risk, or rather “non-thriving,” students in a large-enrollment introductory general chemistry course. With the overall goals of closing opportunity gaps, maximizing all students’ potential for course success, and increasing STEM retention rates, our study used a hybrid approach combining predictive modeling with domain experts’ decision-making to identify underperforming students during the early part of the course. We recognize that different institutions will have different definitions of thriving and different course structures, but the methods we used in our study provide scholar-practitioners with a set of tools that can be replicated and customized for STEM courses on their own campuses.

IU Learning Analytics Summit: Disaggregation & Inclusive Learning Analytics Presentation

To Cite and Share this Presentation:

Ambrose, G. Alex, Goodrich, Victoria, Craker, Andrew, McWilliams, Leo (2021) “Using Disaggregation & Inclusive Curriculum Analytics to Identify Barriers, Measure Outcome Disparities, and Close Achievement Gaps.” Indiana University’s 3rd Annual Learning Analytics Summit: Data-informed Stories, Transformational Journeys. Indiana.

To access and comment on the slides, click here, or click here to watch the 18-minute recorded presentation.

Description:
Although identifying “at-risk” students has been a popular field of research for introductory science courses, our study expanded the current research in two areas: 1) broadening the search criteria to students who are likely non-thriving, not necessarily “at-risk” of failing the course; and 2) utilizing early and current course performance data instead of pre-course characteristics. These two focus points allowed us to capture a more precisely defined group of students, with the goal of helping all students not just survive, but thrive, in STEM programs.

Our study is grounded in two research questions: (1) What are the best and earliest predictors of non-thriving performance in the course? (2) What data-driven methods can we provide administrators and instructors to identify these students and help them improve their course performance?

To answer those research questions, we coupled exploratory data analysis with decision-making by domain experts (course professors and coordinators, advisors, data scientists, and learning experts from the university’s teaching and learning center). This hybrid approach ensured campus context was taken into consideration when identifying non-thriving students. We used it to determine the grade cutoffs that trigger a non-thriving flag and to identify the best and earliest predictors of non-thriving performance. Our predictors caught all 6 students who dropped the course and 19 of the 33 non-thriving students. We plan to improve the accuracy of our predictive model and the effectiveness of our boosting strategies in future iterations of the study.
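As a concrete illustration of this hybrid approach, here is a minimal sketch in Python of flagging likely non-thriving students from early-course performance data. The column names, the grade cutoff, and the logistic-regression model are illustrative assumptions, not the study’s actual pipeline; in the study, the cutoff and predictors were chosen together with the domain experts.

```python
# Minimal sketch: flag likely non-thriving students from early-course data.
# Column names, the cutoff, and the model choice are illustrative assumptions,
# not the study's actual pipeline.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical gradebook: one row per student, early-course scores in percent.
grades = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "hw_avg_wk1_4": [95, 72, 88, 60, 91, 78],    # homework average, weeks 1-4
    "quiz_avg_wk1_4": [90, 65, 85, 55, 93, 70],  # quiz average, weeks 1-4
    "exam1": [92, 68, 80, 58, 95, 71],           # first midterm score
    "final_grade": [94, 70, 84, 57, 96, 73],     # known for past cohorts only
})

# "Non-thriving" cutoff agreed on with domain experts (assumed here: below 80).
NON_THRIVING_CUTOFF = 80
grades["non_thriving"] = (grades["final_grade"] < NON_THRIVING_CUTOFF).astype(int)

# For illustration we fit and score the same tiny cohort; in practice the
# model would be trained on a past cohort's complete gradebook and then
# applied to the current cohort's early scores.
predictors = ["hw_avg_wk1_4", "quiz_avg_wk1_4", "exam1"]
model = LogisticRegression().fit(grades[predictors], grades["non_thriving"])

# Flag students whose predicted probability of non-thriving is high.
grades["p_non_thriving"] = model.predict_proba(grades[predictors])[:, 1]
flagged = grades[grades["p_non_thriving"] > 0.5]
print(flagged[["student_id", "p_non_thriving"]])
```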

Inclusive Curriculum Analytics AAC&U Conference Poster

AAC&U Conference Poster eHandout, 2/11/21

Citation:

Goodrich, Victoria, McWilliams, Leo, Ambrose, G. Alex (2021) One College’s Experience: Exposing Inequities Caused by Pre-Matriculation Credit Earning Policies. AAC&U Virtual Conference on General Education, Pedagogy, and Assessment: Embracing the Best Emerging Practices for Quality and Equity.

Title:

Inclusive Curriculum Analytics for Undergraduate STEM Degrees: Using data to identify barriers, measure outcome disparities, and close achievement gaps

Abstract:

As formal credit earning opportunities grow, such as through credit by examination, it is imperative that institutions understand how their advanced placement credit acceptance policies shape their students’ experiences on campus. While most schools have focused on how students with advanced credit perform in follow-on classes, fewer have focused on how these policies affect students without the same opportunities. This case study will answer: how do credit acceptance policies shape the student academic experience within one College of Engineering? The poster will focus on how one College of Engineering identified inequities through a data-driven study of students’ college performance as it relates to credits earned prior to matriculation. It will provide a roadmap for other institutions to investigate their own student data as it pertains to current policies.

Background, Problem, & Evidence:

More and more students begin college having already earned multiple college credits. As formal credit earning opportunities grow, such as through credit by examination, it is imperative that colleges and universities understand how their credit acceptance policies shape their students’ experiences on campus. While many studies have focused on program benefits such as additional schedule flexibility, reduced time to degree, and exposure to advanced topics, few have quantified the collateral impact of credit earning opportunities on students who do not have credit when entering college. Without specifically quantifying and understanding this impact, it is easy to perpetuate or expand an achievement gap that started well before students enter college.

In this session, we will show how one College of Engineering used student performance data to identify and understand potential inequities in existing policy decisions. By accepting credit in required courses, in this case calculus, chemistry, and physics, two groups were formed: (1) students who were ahead of the stated curriculum and (2) students who were following the curriculum as published and expected. Looking at shared courses between these two tracks, such as physics or sophomore-level engineering courses, exposed real and concerning disparities in grade performance attributable to this policy. This session will present data from this study and describe a methodology for conducting similar analyses at other schools and within a wide range of programs.
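To make that comparison concrete, the sketch below contrasts grade outcomes in a shared course between students who entered with pre-matriculation credit and those following the published sequence. The column names, values, and 4.0 grade scale are assumptions for illustration, not the College’s actual data model.

```python
# Sketch: compare shared-course outcomes between students who entered with
# pre-matriculation credit and those following the published curriculum.
# Column names, values, and the 4.0 grade scale are illustrative assumptions.
import pandas as pd

records = pd.DataFrame({
    "student_id":    [1, 2, 3, 4, 5, 6, 7, 8],
    "has_ap_credit": [True, True, True, True, False, False, False, False],
    "physics_grade": [3.8, 3.5, 3.9, 3.4, 3.1, 2.8, 3.0, 2.6],  # shared course
})

# Summarize each group's performance in the shared course.
summary = (records
           .groupby("has_ap_credit")["physics_grade"]
           .agg(["mean", "median", "count"]))
print(summary)

# The gap between group means is the headline disparity to investigate further.
gap = summary.loc[True, "mean"] - summary.loc[False, "mean"]
print(f"Mean grade gap (credit-ahead minus on-sequence): {gap:.2f}")
```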

Expanded Figures from the poster:

CoursePathVis is a visual analytical tool for exploring and analyzing students’ progress through a college curriculum using a Sankey diagram. We group students in multiple ways (by their AP courses, term courses, and a user-specified funnel course) to offer different perspectives on the underlying data. With these flexible grouping techniques and the funnel-augmented Sankey diagram, CoursePathVis helps us identify patterns or outliers that affect student success.
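For readers who want to prototype a similar view, here is a minimal Sankey sketch in Python using plotly. The course names and student counts are made up, and this shows only the underlying diagram idiom, not CoursePathVis itself.

```python
# Minimal Sankey sketch of student flow through a curriculum, in the spirit
# of CoursePathVis. Course names and counts are made up for illustration.
import plotly.graph_objects as go

labels = ["AP Calculus credit", "No AP credit",  # entry groups
          "Calc II", "Calc I",                   # first-term math placement
          "Physics I"]                           # shared "funnel" course

fig = go.Figure(go.Sankey(
    node=dict(label=labels, pad=20, thickness=15),
    link=dict(
        source=[0, 1, 2, 3],          # indices into `labels`
        target=[2, 3, 4, 4],
        value=[120, 180, 115, 170],   # students flowing along each path
    ),
))
fig.update_layout(title_text="Student flow from entry credit to a shared course")
fig.show()
```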

Resources & Related Work:

Forthcoming ASEE Article (check back soon)

Instructional Umbrella IRB

Bartolini, A.; Running, C.; Duan, X.; Ambrose, G. Integrated Closed-Loop Learning Analytics Scheme in a First-Year Engineering Course. 2020 ASEE Virtual Annual Conference Content Access Proceedings. 2020.

Duan, Xiaojing, Ambrose, G. Alex, Wang, Chaoli, Abbott, Kevin, Woodard, Victoria, Young, Kelley (2020) Learning Analytics for Inclusive STEM Student Success. Learning Analytics & Knowledge Conference. Practitioner Report and Poster. Frankfurt, Germany.

Duan, Xiaojing, Ambrose, G. Alex, Wang, Chaoli, Abbott, Kevin, Woodard, Victoria, Schalk, Catlin (2020) PerformanceVis: Homework & Exam Analytics Dashboard for Inclusive Student Success. Learning Analytics & Knowledge Conference. Practitioner Demo. Frankfurt, Germany.

Ambrose, G. Alex, Duan, Xiaojing, Abbott, Kevin, Woodard, Victoria (2019) Inclusive Learning Analytics to Improve STEM Student Success. EDUCAUSE Main Conference, Chicago, IL.

Syed, M., Anggara, T., Duan, X., Lanski, A., Chawla, N. & Ambrose, G. A. (2018) Learning Analytics Modular Kit: A Closed Loop Success Story in Boosting Students. Proceedings of the International Conference on Learning Analytics & Knowledge.

Presenters’ Bios & Contact Info:

Victoria Goodrich, Associate Teaching Professor, Chemical and Biomolecular Engineering

Leo McWilliams, Assistant Dean for Undergraduate Programs, College of Engineering

G. Alex Ambrose, Director of Learning Research, ND Learning | Kaneb Center for Teaching Excellence

Acknowledgements:

Andrew Craker, Pat Miller, Kevin Abbott, Kevin Barry, Alex Oxner, Augie Freda, Shelin Mathews, Ryan Snodgrass, Keith Mcindoo, Roberto Casarez, Joel Dosmann, Chaoli Wang, Brendan O’Handley, Michael Niemier, Morgan Ludwig and Samantha Allison