National and regional foundations have funded significant student success innovation, practice, and policy. Increasingly, however, there is a realization that these initiatives need to be both grounded in and powered by sophisticated data strategies. At our Summit (March 9 and 10 at SXSWedu) we hosted leading foundation executives to explore the promise, possibility, and necessity of analytics in grant projects on the road ahead. The following is an excerpt from that conversation; a full-length video is also provided. Guests were William Moses, Kresge Foundation; Jason Palmer, Bill & Melinda Gates Foundation; and Wynn Rosser, Greater Texas Foundation. The conversation was moderated by Mark Milliron, Civitas Learning’s chief learning officer. You can view the entire video of this conversation here.
On the Access and Success Agenda
A lot of folks talk about this kind of push and pull between access and success. The completion agenda has been a big push around this work. Kresge Foundation has been pretty firm that both of these matter. It has to be both access and success. That’s at the heart of your metrics, right?
Yes. If we knew exactly what was going to work, we’d all be doing it, right? We’re all trying our very best to figure out where the best leverage is. For us, we look at the pipeline. We’re concerned that there are lots of students who would not be going to school at all. Even though you still have to fix the success side, if they stop coming in, they’re not going to be there succeeding. You see a lot of low-income students who don’t have very good advising. They have a poor understanding of what financial aid might be available, or of the cost of college. What we’re trying to do is really help those students figure out what their options are, so that they can make the right choices. I have a 12-year-old. If you know Lumina’s Goal 2025, those 12-year-olds are the class that’s graduating in 2025. At some point, you’d have to say, “It’s probably not going to help Lumina much if you’re doing access in 2024.”
The idea now is to actually figure out how to make the impact in both access and success, and drive that agenda. I know Jason…you’ve actually been driving that from the beginning.
Our group is called Postsecondary Success. When it was started back in 2007, even before you were there (Mark), it was all about access. Fewer than 59 percent of students are graduating from college, and that is a problem. If you go back to 2006, everyone was talking about just access. We want completion to be more front of mind. We feel like we’ve done a good job so far in making that a major part of the dialogue, as have all of you who are here. Now, it’s really about leveraging the evidence: what works to improve student outcomes and gets more students to progress, retain, and graduate. One such area is what we call Integrated Planning and Advising for Student Success (iPASS): using the power of predictive analytics and technology to really empower advisers, faculty, and front-line workers at the institution with the data they need to help the student in front of them. That’s already showing much promise and improvement in a lot of colleges.
It’s a big deal to start this dialogue and think about the end we have in mind. Wynn, you’ve been catalyzing the conversation in Texas about the data around access and success. I’d love for you to talk a bit about the cohort work you’ve done.
Texas has an amazing data system. We can track kindergartners all the way through college completion and into the workforce. One out of five Texas eighth graders goes on to complete a postsecondary credential: a certificate, an associate degree, or a baccalaureate. It’s one in 10 if you’re Latino, African American, or low income. It’s one in 14 if you’re a male from one of those three categories. In Texas, our K-12 population is 51.8 percent Latino and 60 percent on free and reduced-price lunch. The populations that are largest and fastest growing are the populations that higher education has traditionally served the least.
I think getting some insight into who those students are and what their experience is as they move through higher education is key. We have a lot of room to grow. As Jason said, much of the job growth that we’re going to see, and are already seeing, in our country is for sub-baccalaureate credentials. That’s the easiest credential to earn, and it’s the one that we’re awarding the least. There are some things that we can do there, but only if we’re willing to.
Using Data to Drive Initiatives
That’s where this conversation really gets going: diving into the initiatives. I’d love for you to talk about Achieving the Dream. Your foundation specifically said, “Let’s help trustees. Let’s help institutions think differently about data.” One of the biggest things was getting them to put a toe in the water and look at the data. It would be interesting for you to reflect on that experience.
I think absent the use of data—the kind of data that Civitas Learning provides institutions—there’s a lot of institutional mythology. I worked in higher ed for 14 years, and I still teach. We think we know what’s going on with students. We think we come from wonderful institutions. We have wonderful success stories about our students. I’ve been in those small-group settings, at round tables with two or three trustees and a CEO where, for the first time, they’re looking at disaggregated student data—their data—recent data. It’s broken out by race, ethnicity, and income, and shows the percentage of low-income and first-generation students, developmental education, and the success rates.
For the first time, the trustees see data that they’ve never seen before. The light bulb goes on over their heads. Sometimes, the CEO has been talking about the success agenda with them for years. The first step is helping them understand the policy environment that needs to be created to get the work done on campus. You can have a CEO, and often an administrative leadership team, that knows what needs to be done, but they’re fighting institutional inertia and that mythology. Having the board from the outset say, “This is important work, there’s a policy environment that we’re going to create around student success, and we’re going to be asking the questions”—that’s really important. For us, that all happens in the context of Achieving the Dream.
We heard a more complex, nuanced conversation about students than we had heard in our careers. Community colleges laid it out: “This is what we thought was going on. This is the data we collected. This is how we tested the intervention. This is what we learned, and this is what we’re doing now.” They knew the problem they were trying to solve, understood how they were going to measure it, and put a plan in place. And then they followed up with a real commitment to come back, and learn, and change.
Jason, when the Gates Foundation was leaning into a complementary initiative, Completion by Design, a lot of the findings pointed to the importance of data in driving understanding of what’s happening on that pathway from entry to completion. That whole ‘pathway’ conversation changed things. It drove the conversation around the need to develop the iPASS infrastructure.
Definitely. I remember back to the early days of Achieving the Dream, as well as Completion by Design, and getting data into the hands of the right decision-makers so they could really see what was going on, not just on their campuses, but in their states and across the entire nation. It was very eye-opening. In the first five years of our strategy, we kind of sprinkled investments all across the country with 300 different institutions. We were testing about a hundred different interventions or forms. The last five years have been, ‘Let’s distill the research that we have. What worked? What didn’t? When it did work, what were the conditions that made it work? Then how can we make some really big bets?’ Now, we have 10 focus areas that we’re really doubling and tripling down on with our investments. One of them is Integrated Planning and Advising for Student Success—Civitas Learning is part of that ecosystem. That’s a major big-bet area. It’s actually our third largest of the 10 big bets that we’re focused on right now.
You folks at Kresge have been driving this larger conversation with some of your technology investments. Making sure you have data around the technology has been a big part of your conversations.
Like you said at the beginning, there’s a lot of experimentation at the edges and this idea of trying to go for scale. One of the issues we’ve faced, and I’m sure both of you have faced it as well, is that you get a great intervention, and the organization can’t figure out how to make it happen on a broader scale and make it more sustainable. I think, for us, the issue with data is that so much of higher education has been managed by anecdote. This idea of taking a look at when students are failing, why they’re failing, and in what sequence it’s happening, and constantly asking those questions, is really critical. Just having the data doesn’t do you much good if you can’t translate it into change.
Yes. Jason, one of the things you’ve been looking at is helping institutions not only learn what to do as they get the data, but also learn how to scale, right?
That’s true. I’m not going to tell you that we have figured out the scaling thing. We have been testing it enough that we have some hunches about what helps with scaling. The biggest is that many colleges and universities have sort of a planning year, a pilot year, and then maybe they roll it out to half of the campus. It kind of trickles its way out. The faster you can make that planning and pilot year a single year, then actually rolling out completely once you get done with that year, the better chance you can accelerate it from a four-year cycle down to a two-year cycle. You have to take a startup mentality that the pilot and planning year is iterative. You’re rolling out a few features, or maybe you’re working with a certain subset of the advisor group, or maybe you’re just working with freshman students. There are always ways you can cut and slice it to make it the right size for your campus.
Texas has actually lived through a pretty large-scale change where they’ve implemented something at scale that just seems to be an interesting idea in other states: Early College High Schools. It’s been interesting to watch how Texas went big with this strategy, and how the data around it shocked people in the beginning, but now, we all see the impact data. That’s an interesting story to tell because it really is a scale story, right?
It absolutely is. This whole notion of scale: I think about it a lot. The reality is we have things, actually a lot of things, that work at scale. They just don’t work well for most students. We have scaled public education. We have scaled a lot of things, and we’re getting the results that we’re getting. We want to be really careful about what we scale. Of course, all of us have the best interests of students at heart. Texas has the largest network of Early College High Schools in the country, at about 150. We have more than 39,000 students enrolled in Early College in the state. For those of you who don’t know, Early Colleges actually started with the Bill & Melinda Gates Foundation. A number of other investors came alongside, and we were one of them here in Texas, along with the largest funder, which was the Texas Education Agency. Of those 39,000 students, 79 percent of them, I think, are Latino. I think 80 percent are low income. The intervention is aimed at students who otherwise would not have graduated from high school, much less enrolled in and completed higher education. It’s not a creaming strategy. It’s about students who, in our state, wouldn’t have been in the top 10 percent.
It started small. Over a decade later, we now have 150 of these campuses. The impact data, the success data, is incredible. The high school graduation rates are above the state average. Direct enrollment into higher education is 15 percentage points above the state average. Baccalaureate persistence and graduation rates are about 20 percentage points above the state average. I don’t know that we could have scaled in the same way if our legislature hadn’t been willing to back the initiative.
View the entire conversation from the Foundations of Change panel at Summit 2016.
Listening to the Data
A great thing about that story is that so much of it is about overcoming the belief that those students wouldn’t have been successful. By taking it to scale, you’ve got enough people excited about it and willing to put some fuel on the fire to take it to the next level.
We were talking before this session began that this is probably akin to the transition to computers on college campuses 30-plus years ago. People initially bought computers because it seemed to make sense, but they figured they would just buy one. Of course, computers became permanently embedded throughout the organization. I think the same thing will happen with data analytics. People are going to look at the entire organization and say, ‘Why are we having a problem with students in this section of this course, or in this part of the university?’ Everyone is going to be thinking about it. Ideally, the leadership of the institution has created an atmosphere where people aren’t afraid to bring up those issues.
All of the work that all of you are doing is incredibly important for a couple of reasons. One is that leadership has to believe this is possible and take it on as a priority. Two, they’ve got to start doing things differently. It’s not just about bolting new programs onto what already exists. It’s literally re-organizing and restructuring. We’ve had sessions this morning and this afternoon all around the idea of this work being about operating differently. It’s not just about traditional IR. This is actually about operational and diagnostic data, not autopsy data. That means thinking differently.
Definitely. If you go back three years, we did iPASS I, where 150 colleges applied to do some type of reform. We were not very structured about exactly what the reform needed to be, but it involved redesigning advising and using predictive analytics. iPASS I was, I would say, a more open-ended project. By iPASS II, we knew what worked, and we were able to tell the colleges that were applying, ‘You need to let us know what system you’re using. You need to let us know how you’re going to redesign your advising program. We need to know that this is an integrated approach across your entire campus.’
We told them, ‘We want you to submit three applications: one from leadership, one from IT, and one from IR. They’re all going to come to us in one package.’ Then we received more than a hundred applications. We looked at them. I read many of them myself and said, ‘Alright. Here’s what leadership is saying. This is one of the top priorities, maybe even the top priority. Here’s why they’re doing it. It’s focused on retention or progression. Here are their goals. It fits in. Alright. Let’s look at the IT one. Okay. IT is talking about whether the system is going to integrate with this system or that system, and here’s what they’ve got as a legacy system. Here’s what they’re thinking about acquiring as a new system. Does it reference what leadership is talking about, or does it seem like a silo?’ Then I’d look at the IR application the same way. It was very interesting.
With many of the applications, you could see that a committee had been formed. People got together. They parsed out exactly what they were going to say and what the main talking points were. With others, you had to wonder whether these people had even talked to each other before they applied. How could that be possible? It was all over the map. The ones that were most integrated were the ones that we actually chose as the winners.
They were the ones that said, ‘We already planned this two years ago. We’ve been doing the pilot, and we think we’re missing this additional functionality. We want to change our advising ratio from one-to-600 to one-to-300, institute a professional advising corps for the first year, and then transition to faculty when students make their major selection.’ There was that level of detail and specificity. When it occurred across all three applications, that said this institution knows what it’s doing, and we have a high likelihood of success.
Integrated Data Strategies
Do you see a shift in the grant-making world where people are now saying they want integrated data strategies? Do applicants need to think through how they’re going to leverage data in your kinds of projects? It seems like data is getting woven into a lot more of the grant submissions, where people want a formative evaluation strategy along the way. I’d love to have you reflect on that for a bit.
I’ll air some of our own lessons learned here so that you understand that just like your institutions, we don’t have it all figured out. We’re still figuring it out.
With the iPASS I program that I talked about from three years ago, about a year after the program got started, we said, ‘We should be evaluating how these colleges are doing,’ and we quickly added an evaluation to the project. This time around, we were closer: it was six months in, and we made sure the evaluation members of our team were part of the process. They were actually some of the folks who chose the winners, or at least graded the applications.
Now, what we’re realizing is that we need to go to what I would call evaluation 3.0. We don’t want the report only at the end of the project. We actually want you to come back to us every six months and give us your qualitative findings. It’s often not going to be the full quantitative churn; that will probably be on an annual basis. But qualitatively, what are you seeing on the campuses? What are you observing?
For most of our grants, we do the evaluation internally, in partnership with the organization in which we’ve invested. That’s based on a really clear understanding up front about what the milestones are, what the process outcomes are, and what the student outcomes are. What we’ve learned along the way is that this is important. We’ve also learned some of our initial assumptions weren’t very good, because the first students who were able to finish so quickly in El Paso (in the Early College High School) did it in two years or less. We thought, ‘Well, they’re earning 60 hours of college credit in the community college, so we’ll just give them two more years to earn a baccalaureate and it will be a two-year scholarship program.’ What we learned is that most students can do it in three years, but not in two, so we changed the program from a two-year model to a three-year model from a design standpoint. Now, we’re providing three years of funding after high school graduation.
I think this points to why evaluation is so valuable with data analytics. This is not just about the data. It’s about applying it correctly and applying the intervention with fidelity. Someone might say, ‘We’re going to do Early College and we’re going to do two-year scholarships,’ and they’ll end up wasting time because someone else has already run that test and figured this out. We’ve heard a lot of people talking about things like pathways, which is a huge word in postsecondary access and success today, but not everybody is doing a pathway with the same fidelity as the very best pathway organizations. Yet they think they are. The problem I see is that they undermine the value of what’s being done and say, ‘We did it. It doesn’t seem to make a difference.’
That’s where you end up with that best-practice-ism, right?
You end up adopting something because everyone else is adopting it. What’s pretty clear in the foundation world is the movement toward more formative and summative assessment. It’s tuning and testing the interventions and the grant programs to get the maximum impact. Impact philanthropy is the coin of the realm, right?
Yes. This evidence-based intervention model is now, I would say, fully embedded in our education side of the house. We take those evaluations very seriously. We think hard about the questions we want to ask. We have a lot more questions now, not just about the product and the taxonomy, but about the implementation and the fidelity of implementation. It’s driving all of our decisions about how we make big bets, which areas we do discovery research in, and which areas we prioritize with our funding.
It’s probably fair to say you’re looking for partners who think that way, too, right?
Very much so.
Data Sharing Ecosystems
So you’re looking for evidence-based kinds of interventions. The Achieving the Dream schools are building partnerships with K-12 and with universities. How is that orientation toward evidence-based practice cascading across the ecosystem?
I think in some cases it is, but there is resistance in some ways. I’m part of a four-county collective impact organization in the Rio Grande Valley called RGV Focus. I sit there and hear the superintendents, the community college president, and the university president talking about sharing data. They put the student in the middle of the conversation and want what’s best for the student: ‘How do we best serve those students?’ In other conversations with similar leaders in other places, I’m not three minutes in before I start hearing, ‘We can’t share information.’ There are places where we know how to do this, and we know we can do it. It’s not so much about ‘Is it possible?’ It’s about whether we allow the institutions to work together and share data. Will we realize that most students (taking the flagships in our state out of it) stay local for higher education? If we hope to improve our outcomes, the best way to do that would be to work with the institutions that are sending us students, and to whom we’re sending teachers, and to realize it really is an ecosystem. If we get to a place where we can share our data, we’ll do better all the way around.
The Road Ahead
Let’s talk about the combination of access and success innovations, together with the data. What has you interested in or excited about that work?
Well, the biggest thing is that we’ve moved way beyond debating the completion agenda and whether to use data. We’re done with that. We’re all on top of that now. How do we get higher-quality data faster to these users? Is the data we think we want to look at the right data? The level of the questions and answers has gotten so much more fine-grained and so much more focused on helping the student cohorts that are not succeeding on our campuses. We’re at a very different level now, where there’s so much activity happening. It’s now about how we integrate that activity and become more efficient about it. We should all take some credit. We have moved. The field has moved. You all have moved significantly in the last seven or eight years. It gives us a lot of hope that the next seven or eight years are going to be even more impressive.
I think when you start seeing institutions with high percentages of grant students and high percentages of underrepresented minorities growing their enrollments at the same time they’re driving dramatic increases in their graduation rates, that really gives me a lot of confidence and faith that what they’re doing is working in this community.
The institutions, the colleges and universities, are so autonomous and relatively powerful. The community college two miles from the public regional university acts like the two are 10,000 miles apart. This idea of trying to get communities to address these problems more comprehensively, as a local ecosystem, is also something that excites us.
Yes, that’s right. The thing I’m most excited about is the sense of urgency I see among the leadership of our state’s community colleges. I see through the Texas Association of Community Colleges, the presidents pushing for change, and I see our boards of trustees supporting that push. The thing I’ll point to that I’m most hopeful about is this notion of guided pathways. Our community colleges have said, ‘That’s the direction we’re heading wall-to-wall across 50 community college districts and 750,000 students.’
The thing I’m really most encouraged about is the faculty. We haven’t talked much about the individual faculty role today, and I think you can’t do pathways without the faculty. As Kay McClenney says, we’ve done all the easy stuff, which by the way wasn’t easy: eliminating late enrollment, requiring orientation, and mandating student success courses. We’ve got to change the way students experience the curricular pathways through our institutions. We have to bring faculty into that.
I’m pretty convinced that if and when we do that successfully, we’ll start to see changes more dramatic than we have seen, based on the other kinds of reforms or initiatives that institutions have tried.