ND Trains 170 Paraguayan Legal Professors in Online Teaching Pedagogies and Learning Technologies during COVID-19

A USAID grant funded by the US State Department brought the University of Notre Dame and Harvard into the "Rule of Law and Culture of Integrity" (ROLCI) program in Paraguay, led by the Paraguayan Development Institute. The ROLCI Program is an initiative of the Development Institute (ID) and the United States Agency for International Development (USAID), with the goal of strengthening Paraguayan higher education institutions to improve the rule of law and the culture of anti-corruption in Paraguay. The team, led by the Keough School of Global Affairs' Pulte Institute for Global Development, gathered trainers and experts from ND Learning | Kaneb Center and OIT's Teaching and Learning Technologies Group to create, translate, facilitate, and record an interactive webinar series on using state-of-the-art technologies and online pedagogies during and after COVID-19. In total, 170 legal professors attended and participated in the six-part live, dual-language series. They came from several law schools and training centers in Paraguay, including the National University of Asunción, the National University of Ciudad del Este, the National University of Concepción, the National University of Caaguazú, the International Center for Judicial Studies of the Supreme Court of Justice, the Public Defense Ministry's training center, the Judicial School of the Council of the Magistracy, and the Public Ministry Training Center. Individual workshops covered the following topics and goals (note: the links are to the shared Zoom recordings and translated slide decks):

Online Learning Exploratory Session
  • Country and campus exchange
  • Open dialogue and needs assessment

Flexible Teaching & Learning Course Design
  • Reassess your course design: situational factors, learning goals, and assessment structure for resilient teaching
  • Develop broad strategies for engaging your students and helping them achieve the course learning goals regardless of class modality
  • Describe general principles of resilient teaching
  • Apply principles of inclusive teaching that apply across modalities

Flexible Teaching Methods Part 1: Live and Pre-Recorded Lecturing with Zoom
  • Utilize basic pedagogical design principles for using Zoom technology for synchronous and asynchronous teaching
  • Experience, as a participant, a live Zoom session with backchannel chat, screen sharing/annotation, live polling, and shared documents

Flexible Assessment Part 1 (Summative): Alternative Assessments & Exam (Re)Design
  • Describe different exam methods and forms
  • Redesign (if needed) your traditional exam/assessment for a remote class
  • Adapt administration procedures to the online environment

Flexible Teaching Methods Part 2: Active Learning Strategies with Free Google Tools
  • Understand the reasons for incorporating active learning
  • Describe and experience possible tools and strategies for hybrid active learning
  • Select and apply active learning strategies

Flexible Assessment Part 2 (Formative): Assessing Participation, Preparation, and Attendance
  • Understand the difference and relationship between formative and summative assessment
  • Define the role and value of participation, preparation, and attendance in a resilient class
  • Apply concrete strategies for using participation, preparation, and attendance for formative assessment purposes

Key collaborators on this project included:

  • Edward Jurkovic, Program Manager, Pulte Institute for Global Development
  • Lorena Gaona Greenwood, Monitoring and Evaluation Specialist, Development Institute
  • G. Alex Ambrose, PhD, Director of Learning Research, Kaneb Center, Notre Dame Learning
  • Jennifer Zachman, Associate Professor, Modern Languages, Saint Mary's College
  • Kevin Abbott, Educational Technology Specialist, OIT
  • Kristi Rudenga, PhD, Director of Teaching Excellence, Kaneb Center, Notre Dame Learning
  • John Kuehn, Adjunct Professor of Law, Notre Dame Law School
  • Kari Gallagher, Adjunct Professor of Law, Notre Dame Law School
  • John Conway, Adjunct Professor of Law, Notre Dame Law School

For more information about this project see:

Inclusive learning analytics for identifying and boosting non-thriving students in large-enrollment general chemistry course

Schalk, Catlin, Young, Kelley, Ambrose, G. Alex, Duan, Xiaojing, Weber, Woodard, Victoria (2020) “Inclusive learning analytics for identifying and boosting non-thriving students in large-enrollment general chemistry course.” Biennial Conference on Chemical Education. Poster.

Because of the global COVID-19 pandemic, the 2020 Biennial Conference on Chemical Education was cancelled on April 2, 2020, by the Executive Committee of the Division of Chemical Education, American Chemical Society; therefore, this presentation could not be given as intended.

Our goals are to identify non-thriving students in a gateway introductory chemistry course and to develop methods that increase student success and retention rates in the College of Science and College of Engineering. General Chemistry is required for all first-semester STEM majors, which totaled 949 students in Fall 2019. Specifically, our focus is on maximizing students' potential to thrive (that is, earning a final grade of C or higher in the course), not just to survive the class. We use student background data, historical performance data, and real-time academic performance data in the development of a visual analytics dashboard. This inclusive learning platform is a tool for instructors and administrators to identify the admissions characteristics and academic performance triggers that lead to non-thriving in the course or in STEM programs. Homework and exam item analysis was conducted to identify students who are not likely to thrive based on course performance identifiers, so that early interventions can be taken during the semester to boost these students' chances of thriving in the course. Additionally, a special treatment program, the Science and Engineering (S&E) Scholars program, was implemented in an effort to close the achievement gap for underserved and underprepared students while maintaining the rigor of the course. The 45 students in this small cohort take a summer math refresher course, are enrolled in the same chemistry and calculus sections together, have a reduced course load, and attend extra graded problem-solving classes with more one-on-one time with experienced professors and TAs.

ASEE Paper & Presentation: Integrated Closed-Loop Learning Analytics Scheme in a First-Year Engineering Course

One Sentence Overview:
• This study identified students who had the potential to be "non-thriving" at the end of the semester, based on historical data, and boosted these students in an attempt to improve their performance in the course.

Key Takeaways:
• A trigger of 80% or lower on one of the first three homework assignments was successfully implemented to identify and boost potentially "non-thriving" students.
• Students who responded to the personalized action plan in their boost email performed better than those who did not respond and those who would have been boosted based on the same trigger in the 2017 and 2018 fall semesters.
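The trigger described in the takeaways above is simple to reproduce. The sketch below is a hypothetical illustration of that logic only; the field names, data shapes, and threshold handling are assumptions for this example, not the authors' actual implementation:

```python
# Hypothetical sketch of the early-warning trigger: flag a student if any
# of the first three homework scores is 80% or lower. Data layout is an
# illustrative assumption, not the course's real gradebook schema.

def flag_for_boost(homework_scores, threshold=80.0, n_early=3):
    """Return True if any of the first `n_early` homework percentages
    falls at or below `threshold`."""
    early = homework_scores[:n_early]
    return any(score <= threshold for score in early)

# Example gradebook: student id -> homework percentages in order
gradebook = {
    "s001": [95, 88, 91, 84],   # never dips to 80 on HW1-3 -> not flagged
    "s002": [78, 92, 85, 90],   # HW1 at 78 -> flagged for a boost email
    "s003": [90, 80, 88, 95],   # HW2 exactly 80 -> flagged
}

boost_list = [sid for sid, hw in gradebook.items() if flag_for_boost(hw)]
print(boost_list)  # ['s002', 's003']
```

In practice the flagged list would feed the personalized boost emails the paper describes; only the threshold rule itself is taken from the text above.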

Citation & Link to the Full Paper & Slides:
Bartolini, A.; Running, C.; Duan, X.; Ambrose, G. Integrated Closed-Loop Learning Analytics Scheme in a First-Year Engineering Course. 2020 ASEE Virtual Annual Conference Content Access Proceedings. 2020.

Click here for the presentation slides

5 Min Screencast Video Demo of the PerformanceVis Dashboard

Duan, Xiaojing, Ambrose, G. Alex, Wang, Chaoli, Abbott, Kevin, Woodard, Victoria, Schalk, Catlin (2020) PerformanceVis: Homework & Exam Analytics Dashboard for Inclusive Student Success. Learning Analytics & Knowledge Conference. Practitioner Demo. Frankfurt, Germany

PerformanceVis is a visual analytics tool developed for analyzing and visualizing students' chemistry course performance through the lens of time, homework and exams, and demographic and academic background. The Introduction to Chemical Principles course is required for all College of Science and College of Engineering programs at the university and is the second-largest course on campus, with approximately 1,000 freshmen taking it.

This engaging tool includes four main views (overall exam grade pathway, detailed exam grade pathway, detailed exam item analysis, and overall exam & homework analysis) which are dynamically linked together for user interaction and exploration. PerformanceVis enables instructors to improve their course and assessment design by visualizing students’ perceived difficulty level and topic correlation between assignments and exams. It assists instructors and administrators in evaluating the impact of a special treatment program (cohort) by reviewing the performance of regular, control, and cohort students overall and by exam. The image below shows a screenshot of PerformanceVis with the right side of the image showing a view of the gender performance gap for those students who were not thriving. The left side of the image shows Exam 1 item analysis for each test question.

Link to 5 min practitioner interactive demo on YouTube

Link to the Interactive Dashboard Tool:

Workshop: Improve your Teaching & Student Learning with Research from the Classroom

Ambrose, G. Alex, Hubert, Dan, Rouamba, Guieswende (2019) “Improve your Teaching Student Learning with Classroom Research.” Kaneb Center for Teaching Excellence Workshop, Notre Dame, IN.

Click here for the slide deck
Click here for the handout

Participants will:

  • Explore the landscape of Discipline-Based Research (DBR) and the Scholarship of Teaching and Learning (SoTL).
  • Brainstorm potential research goals, questions, and data for their own course.
  • Become familiar with applied learning research support services and resources (e.g., survey/rubric design, video observation, consent forms, and umbrella IRB).


Journal Article: PerformanceVis: Visual Analytics of Student Performance Data from an Introductory Chemistry Course

Hyperlinked citation with an open access link to the full journal article:

Deng, H., Wang, X., Guo, Z., Decker, A., Duan, X., Wang, C., Ambrose, G., & Abbott, K.  (2019). PerformanceVis: Visual analytics of student performance data from an introductory chemistry course. Visual Informatics.

We present PerformanceVis, a visual analytics tool for analyzing student admission and course performance data and investigating homework and exam question design. Targeting a university-wide introductory chemistry course with an enrollment of nearly 1,000 students, we consider the requirements and needs of students, instructors, and administrators in the design of PerformanceVis. We study the correlation between question items from assignments and exams, employ machine learning techniques for student grade prediction, and develop an interface for interactive exploration of student course performance data. PerformanceVis includes four main views (overall exam grade pathway, detailed exam grade pathway, detailed exam item analysis, and overall exam & homework analysis), which are dynamically linked together for user interaction and exploration. We demonstrate the effectiveness of PerformanceVis through case studies along with an ad-hoc expert evaluation. Finally, we conclude by pointing out future work in this direction of learning analytics research.
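As one illustration of the homework-exam item correlation analysis the abstract mentions, here is a minimal sketch of computing a Pearson correlation between one homework item's scores and exam totals. The data, score scales, and pure-Python implementation are assumptions for this example, not the paper's actual pipeline:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative per-student data: score on one homework item vs. exam total
hw_item = [4, 7, 8, 10, 6, 9]
exam    = [55, 70, 78, 95, 62, 88]

r = pearson(hw_item, exam)
print(round(r, 3))  # strongly positive here: this item tracks exam outcomes
```

An item whose scores correlate strongly with exam totals is a candidate early indicator; a dashboard like PerformanceVis surfaces such correlations visually rather than one coefficient at a time.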

Keywords: Student performance, Item analysis, Grade prediction, Learning analytics, Knowledge discovery


VI Graphic 

Practitioner Report: Learning Analytics for Inclusive STEM Student Success

Duan, Xiaojing, Ambrose, G. Alex, Wang, Chaoli, Abbott, Kevin, Woodard, Victoria, Young, Kelley (2020) Learning Analytics for Inclusive STEM Student Success. Learning Analytics & Knowledge Conference. Practitioner Report and Poster. Frankfurt, Germany

ABSTRACT: The challenge was to identify and help underserved and underprepared students in an introductory chemistry course so that they are retained and thrive in the College of Science or College of Engineering, while also supporting the general population. In this paper, we describe our methods for identifying these students, evaluate the impact of a special treatment program provided to a subset of those students, discuss our efforts to help the general population, and evaluate the short- and long-term impacts. In particular, we discuss a data-informed framework for analyzing student and outcome variables.

Keywords: STEM Retention; Learning Visualization Dashboard; Inclusive Pedagogy; Learning Analytics

Click here for a current version of the practitioner report

Educause Workshop: Building Flexible Learning Spaces Utilizing Faculty & Student Driven Design

Ambrose, G. Alex (2019) "Building Flexible Learning Spaces Utilizing Faculty & Student-Driven Design." Educause Annual Conference, Chicago, IL


Click here for the full slide deck

Click here to enter a virtual tour of a Notre Dame typical and prototype classroom.


The session will open with an introduction and problem identification within the context of higher education. Many IT, facilities management, and registrar units work in silos in designing, building, allocating, managing, and renovating classrooms. Additionally, the problem of missing key stakeholders’ voices (students & faculty) in the learning space design process will be defined. Next, a tour of our active learning classrooms will be provided.

An interactive presentation supported by a digital handout and backchannel will provide a history and evolution of the University of Notre Dame’s learning space research & design methodology. This approach captures faculty and student voices to create data-driven design decisions. Some of the questions we will be able to answer with our framework and tools are: How do you get faculty and student voices into the learning space design process? What do four semesters of learning space evaluation surveys from over a thousand students and dozens of faculty from multiple disciplines tell us? What can we learn from designing and testing medium Active Learning Classroom prototypes to guide future classroom design and budgets? Where do we get the most “bang for the buck” with regard to furniture, technology, or space? What are student/faculty learning space perceptions, recommendations, and impacts? Which data-driven design decisions were made to continually improve our process? What bottom-up partnerships, processes, and guidelines did we develop that were aligned to top-down strategic visions and goals?

Finally, the session will conclude with a Q&A discussion and a return to the digital handout with a review of the backchannel best questions and resource sharing.

Participants will:

  1. understand how faculty and student-driven data design can evolve their campus’ learning space design process
  2. engage in a learning space design methodology experience
  3. create a customized learning space design plan to take to their own campus

Related Posts:

IFALC Poster Presentation on our Learning Space Report Card

Steelcase Active Learning Classroom Grant Report

Photo from the Workshop in the Steelcase Interactive Room

Quick Link here: https://tinyurl.com/2019educauseND

Educause Poster: Inclusive Learning Analytics to Improve STEM Student Success

Ambrose, G. Alex, Duan, Xiaojing, Abbott, Kevin, Woodard, Victoria (2019) “Inclusive Learning Analytics to Improve STEM Student Success.” EDUCAUSE Main Conference, Chicago, IL

Click here to download the poster

The goal of this project is to ensure that first-year students who leave Science, Technology, Engineering, and Mathematics (STEM) majors after gateway courses do so as a discernment decision, not because they were inadequately prepared or because a needed intervention went unrecognized. The key questions this project is pursuing are: (1) What research, assessment, advising, and course design questions can we ask and answer from the course and student data we capture that will support student success? (2) How can we cross-reference student learning activity, homework analytics, and exam item analysis to help instructors improve course and assessment design? (3) Can we use analytics data to boost all students' potential to thrive and improve STEM success?

Our chemistry courses were designed to meet that challenge and help all 953 enrolled students (46% of all first-year students). The learning environment infrastructure was architected to follow the NGDLE interoperability, integration, analytics, advising, and learning assessment dimensions so that multidimensional student data can be collected seamlessly to enable comprehensive data analysis and improvement assessment (Ambrose, Abbott, & Lanski, 2017).

Our poster will describe how we built an innovative and transparent learning environment, embedded data collection in course design, captured data from multiple sources in the centralized learning records warehouse (Miller & Duan, 2018), used predictive analytics to help students identified as underperforming move toward thriving, answered key questions through visualization reports, and identified the best and earliest predictors of student success.

Related Educause Review articles that we published on this work: