Even though UMKC was seeing 1-2% bumps in overall retention, outcomes were not improving for the underrepresented minority students who are central to the school’s mission.
We spoke with Barbara A. Bichelmeyer, Provost at the University of Missouri-Kansas City, to learn how UMKC uncovered that one of its largest student success initiatives — Supplemental Instruction — improves outcomes for students and is helping the university optimize, prioritize, and allocate resources toward what works.
1. What role do analytics play in UMKC’s efforts to change institutional and student outcomes?
Our work with the Higher Learning Commission (HLC) Persistence and Completion Academy has been focused on using predictive analytics to understand patterns of attrition on campus, and on gaining greater insight into the role of non-cognitive variables that influence student success. We believe this kind of information about our students is the foundation of a developmental enterprise to build faculty and advisor capacity to meet each student where they are and bring them along — one student at a time.
As we dove into that, we were frustrated by the fact that our data systems were not talking to each other. We were going back and forth between SIS and LMS, plus legacy systems in student support services. We learned that step one was to bring data together in a way that would make it actionable.
Moreover, analytics need to be able to tell us more about the different demographics we serve. We want to be able to identify specific at-risk student groups, leverage existing initiatives in more targeted ways, and deliver personalized experiences for our students in order to change the trajectory of their journeys.
2. Tell us about Supplemental Instruction and how you were measuring its effectiveness.
UMKC pioneered Supplemental Instruction (SI) in the early 1970s. SI is offered to all students. It is a non-remedial approach to learning that supports students toward academic success by integrating “what to learn” with “how to learn.” SI consists of regularly scheduled, voluntary, out-of-class group study sessions driven by students’ needs.
SI has relied on academic units to provide funding for the program — but those budgets are diminishing every year, which puts the responsibility back on the institution to build a business case.
Previously, we had a semi-automated report that showed the courses that offered SI, the students who attended, and how many times they had participated. We would create a query that showed those students’ GPAs and grades, and if we observed positive changes, we considered SI to be effective.
We had never been able to look at a comparison group, and we wondered whether students who attended did so because they were already more motivated. We also hadn’t been able to measure the impact of SI on persistence directly. We needed more rigor in our analysis to make confident statements about the program’s ROI that we could bring to conversations with the academic units we were encouraging to offer SI.
3. What did you learn when you analyzed the impact of SI on student persistence?
We’ve used Impact to analyze more than 25 initiatives on campus, including SI. In just hours, Impact allowed us to do sweeping analysis that would have taken months to complete. Results revealed a 7.8 percentage point (pp) overall lift in persistence for students who attended SI three or more times in the semester. The lift was even larger for our African American students, students in their first semester, part-time students, and students in the second persistence probability quartile.
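A percentage-point lift of this kind is the difference in persistence rates between students who attended SI three or more times and a comparison group of similar non-attendees. The sketch below illustrates the calculation only; the cohort counts are invented to produce a 7.8 pp lift and are not UMKC data, and this is not how the Impact product itself computes results.

```python
# Illustrative sketch: percentage-point persistence lift for SI attendees
# versus a matched comparison group. All counts below are made up.

def persistence_rate(persisted: int, total: int) -> float:
    """Fraction of a cohort that re-enrolled the following term."""
    return persisted / total

# Hypothetical cohorts, chosen so the lift matches the 7.8 pp figure
si_rate = persistence_rate(persisted=430, total=500)          # attended SI 3+ times
comparison_rate = persistence_rate(persisted=391, total=500)  # matched non-attendees

lift_pp = (si_rate - comparison_rate) * 100  # lift in percentage points
print(f"Persistence lift: {lift_pp:.1f} pp")
```

The comparison group is the key ingredient: without it, a raw persistence rate for attendees cannot distinguish the effect of SI from the self-selection of already-motivated students mentioned above.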
That lift translates into nearly $600,000 in additional retained tuition for us.
4. How did you dive into those results to take action?
We looked at the impact of SI on students who attended fewer than three times, and calculated that if we could encourage those students to participate a few more times, we could retain approximately 130 additional students, which translates into nearly $1 million in additional retained tuition.
So now, we are putting together nudge campaigns in Illume to reach the students who were shown to be the most likely to benefit from SI. We are also using the Civitas student mobile application to send personalized messages about SI to our students. The ability to meet them where they are is unprecedented for us.
5. What’s next?
We are now working with academic departments to identify additional courses where SI could help elevate student success. Previously, we recommended SI for courses where more than 25% of students earned Ds, Fs, or Ws. Following the Impact results, we used the Civitas Courses application to identify “yellow flag” courses — those where a C reduces a student’s likelihood of graduating — and are now encouraging academic units to pay the associated fee to offer SI in more of their courses, particularly yellow flag courses.