As operating costs per FTE at Monroe Community College increased by 36%, retention remained flat. Expensive initiatives designed to keep students in school didn’t seem to be having the desired impact on outcomes.
We spoke with Julianna Frisch, Assistant to the President for Strategic Initiatives at Monroe Community College, to learn about the college’s work to measure the impact of one of its academic support programs, in order to build confidence and gain clarity about how the initiative was moving the needle on student outcomes.
1. How has Monroe’s level of analytics maturity evolved?
Our President is a data-driven decision-maker. The expectation is that we are going to use data to inform every decision we make — it’s been a continuous movement to make analytics part of everybody’s business.
We set an objective that, by using Civitas tools, we are going to take action to support our goals of increasing term-to-term persistence by 3 percent and fall-to-fall retention by 2 percent this year.
2. What are you doing to achieve those outcomes?
One of our first steps, taken just before we began nudging some of our student groups, has been to analyze the impact of some of our student success initiatives, in order to get clarity on how the things we were already investing in were benefiting our students from a persistence standpoint, and exactly which students were benefiting.
This is something we had never had the ability to do. It’s difficult to isolate specific initiatives for analysis, difficult to measure their direct impact on persistence, and difficult to ensure that the students who are doing well because of a program aren’t students who would have done well regardless. And our Institutional Research team doesn’t have the time to dive into the nuances that this type of analysis requires.
With the Civitas Learning Impact application, we were able to do that, with a level of precision that is driving conversations we have rarely been able to have before.
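To make the selection-bias problem Frisch describes concrete, here is a minimal sketch of one generic technique for it, propensity-score matching, run on invented data. This is an illustration only: it is not Civitas Learning’s published methodology for Impact, and every variable, column, and number below is an assumption made up for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 4000

# Invented covariates: prior GPA and credits attempted.
gpa = rng.uniform(0.0, 4.0, n)
credits = rng.integers(3, 18, n).astype(float)

# Self-selection: stronger students are more likely to visit the center.
visited = rng.random(n) < 1 / (1 + np.exp(-(gpa - 2.5)))

# Persistence depends on GPA *and* a modest true program effect.
persisted = rng.random(n) < 1 / (1 + np.exp(-(0.8 * gpa - 2.0 + 0.3 * visited)))

X = np.column_stack([gpa, credits])

# Step 1: estimate each student's propensity to visit from covariates.
propensity = LogisticRegression().fit(X, visited).predict_proba(X)[:, 1]

# Step 2: pair each visitor with the non-visitor whose propensity is closest.
treated = np.flatnonzero(visited)
control = np.flatnonzero(~visited)
nearest = np.abs(propensity[control][None, :]
                 - propensity[treated][:, None]).argmin(axis=1)
matched = control[nearest]

# Step 3: lift = visitors' persistence rate minus the matched-control rate,
# reported in percentage points (pp).
naive_pp = 100 * (persisted[treated].mean() - persisted[control].mean())
lift_pp = 100 * (persisted[treated].mean() - persisted[matched].mean())
print(f"naive gap: {naive_pp:.1f} pp vs. matched lift: {lift_pp:.1f} pp")
```

The matching step targets exactly the third difficulty named above: participants are compared only to non-participants who looked similar beforehand, so students who would have persisted anyway don’t inflate the measured lift.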
3. What did you learn when you looked into the impact of some of your programs?
An early analysis of our Center for Academic Reading (CAR) revealed an overall lift in persistence of 5.9 percentage points (pp) for students who visited the center between Fall 2013 and Spring 2017. Of the several initiatives we initially ran through Impact, that was the greatest lift on persistence we saw, which made us think about how we could get more students there, particularly the specific student groups shown to benefit the most from CAR: re-admitted undergraduate students, full-time students, and students with the lowest probability to persist.
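For readers translating that figure: a lift in percentage points is an absolute difference between two persistence rates, not a relative one. A quick hypothetical calculation (the 70 percent baseline below is invented for illustration, not Monroe’s actual rate):

```python
# Hypothetical illustration of what a 5.9-percentage-point (pp) lift means.
# The 70% baseline is an invented assumption, not Monroe's actual figure.
baseline_rate = 0.700          # matched comparison students who persisted
lift_pp = 5.9                  # absolute lift reported by the analysis
treated_rate = baseline_rate + lift_pp / 100
relative_change = (lift_pp / 100) / baseline_rate
print(f"visitors persisted at {treated_rate:.1%} "
      f"({relative_change:.1%} relative increase)")
```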
4. What have you done with these findings?
It’s important to note that CAR started out as a smaller program supported by a Title III grant, and that we had committed to expanding it and making it available to all students on campus by Spring 2017. Having these results makes us feel very confident in our decision not only to do that but to put even more momentum behind it, ultimately moving the services to the newly formed Tutoring and Academic Assistance Center (TAAC), where we are better able to support all students at Monroe Community College.
We are using the Civitas Illume application to identify and nudge several groups of students, encouraging them to visit the learning centers and helping them feel excited about the skills they can learn that will enable their success.
5. What’s changing and what’s next?
My monthly meetings with vice presidents across campus now center on learnings from this kind of analysis. It helps that the Impact results are so easy to consume, because we only have 15 minutes to go through highlights and discuss action items. In that time, we can make strong recommendations about how to improve the effectiveness of a program that is currently running, or which student groups we should be targeting to move the needle on outcomes in the current term.
We are currently compiling a list of all initiatives across campus. We recently asked initiative leaders to provide lists of participating students so that we can assess how each initiative has impacted the students it serves. More than 30 initiatives have been submitted by different groups across campus. This is significant for two reasons: it shows that teams across campus see the value of Impact and are excited to collaborate with us, and it means we are changing the culture of our college to be more transparent and data-driven rather than anecdotal.