For those working in the community college or even the access university field, take a minute to download the impressive recently released joint statement: Core Principles for Transforming Remediation within a Comprehensive Student Success Strategy.
It’s impressive not only because it provides useful and compelling student success strategy direction, but also because of the array of groups that participated in developing and promoting it, including: Achieving the Dream, American Association of Community Colleges, Complete College America, Education Commission of the States, and Jobs for the Future. Take the time to read it through. The Six Core Principles will catalyze important conversations and hopefully inspire good work in student success innovation.
Here’s a look at the six core principles they’re advancing:
In the full report they also call for greater collaboration with K-12, workforce partners, and adult basic education providers. The authors also point to the need for community college and university pathway alignment, particularly where content innovation might impact transfer relationships. Finally, they end with a call to innovate at scale, not to treat this as an excuse to create yet another student-success pilot program.
Key Takeaways on Analytics and the Core Principles
For those leading the charge to make the most of advanced analytics to help students succeed, there are two key takeaways from this report. Principle six specifically calls for institutions to leverage data and analytics tools to help students stay on track and to provide comprehensive progression data. We couldn't agree more, especially when it comes to getting data directly to the front lines. Indeed, this is the driving philosophy behind our Degree Map, Inspire for Advisors, and Inspire for Faculty apps. They were built to bring the best of data science and design thinking to bear, helping faculty, advisors, and students chart the course to completion. Far too often, the student progression data conversation stays in administrative, accreditation, board, or legislative arenas. If we are going to move the needle on student success, we have to develop a bias toward getting good data, in compelling ways, to the front lines: what we like to call the democratization of data.
The second takeaway relates to the need to innovate at scale. The report's authors rightly argue that approaching today's student learning and completion challenges with random acts of innovation or subscale pilots is not the path forward. However, our work over the last two years with institutions large and small innovating around student success suggests that, to make the most of this report's important and insightful recommendations, leaders are going to need to turn the DIAL (optimize data to maximize insight to inform action and continue learning) if they are to scale and optimize the impact of these suggestions.
Institutions have a wealth of data about students and their progression through programs. However, that data usually sits in silos, and even when it's brought together in a data warehouse, it's rarely optimized for the kind of analysis needed to try, test, and tune at-scale innovative programs. If you can bring together data from Student Information Systems (SIS, e.g., Banner, PeopleSoft, Jenzabar), Learning Management Systems (LMS, e.g., Canvas, Blackboard, D2L), Customer Relationship Management tools (e.g., EAB, Salesforce, Hobsons/Starfish), and other sources (e.g., tutoring centers, card swipes, surveys), you can begin to consolidate a variety of important variables. You can also create and test feature variables (variables calculated from combinations of data points) that end up driving a much clearer and more predictive view of students' journeys through our institutions.

Moreover, our data scientists have created a patent-pending technique that leverages student-data availability and performance clustering, so data isn't left out of your predictive models just because not all students have it. In short, pulling data together from different sources and optimizing it for analysis matters in a big way, especially if you want to build predictive insight into intake, persistence, and completion at the individual and active-student levels. Finally, in partnership with institutions, we've learned that you can leverage the cloud to create a production-quality, optimized data layer that lets you turn the DIAL without investing millions in hardware, software, and programmers and analysts.
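To make the consolidation step concrete, here is a minimal sketch of the idea: joining two sources on a shared student key and computing a simple feature variable. All table and column names here are hypothetical, and real SIS/LMS extracts would require substantial ETL work first.

```python
import pandas as pd

# Hypothetical extracts; real systems (Banner, Canvas, etc.) need real ETL.
sis = pd.DataFrame({
    "student_id": [1, 2, 3],
    "credits_attempted": [15, 12, 9],
    "credits_earned": [15, 6, 9],
})
lms = pd.DataFrame({
    "student_id": [1, 2, 3],
    "logins_last_30d": [22, 3, 14],
})

# Consolidate the sources on a common student key.
students = sis.merge(lms, on="student_id", how="left")

# A simple "feature variable" computed from combined data points:
# a credit completion ratio, often more telling than either raw count.
students["completion_ratio"] = (
    students["credits_earned"] / students["credits_attempted"]
)
print(students[["student_id", "completion_ratio", "logins_last_30d"]])
```

The feature-engineering step is where much of the predictive power tends to come from: a ratio like credits earned over credits attempted usually says more about a student's trajectory than either number alone.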
Once the data are staged and the intake, persistence, and completion predictive models are built, you can leverage insight applications to "turn the lights on" and get a sense of what's happening in your institution. Reports like the Six Core Principles are powerful tools for learning from the data and research of others; however, it's important to explore these trends and dynamics in your own context. As noted in the book Faster, Higher, Stronger, the best elite athletes learn that sage advice from nutritionists, trainers, and coaches has to be balanced against "N of 1" testing: you need to make sure it works for you, in your context. They would say that expert advice is directional and important, but must always be tested, as anyone who's tried a diet that didn't work for them can attest. To do this insight work, you'll want to leverage the data both to test hypotheses and to let machine learning expose relationships that might not have been expected. For example, when institutions use our Illume insight application, the purpose variable noted in the Six Core Principles report usually pops out as one of the most powerful predictors of persistence and completion for new student groups. And when the infrastructure is built correctly, as you find this insight you can actually pull lists of active students (students you're serving right now!) who fit the pattern. Once identified, you can decide what action to take to help these students today, not next semester.
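As an illustration of letting a model surface strong predictors, the sketch below fits a logistic regression to synthetic persistence data. The data, variable names, and effect sizes are invented for the example (Illume's actual models are more sophisticated); in this toy setup, the "purpose" measure is built to dominate, mirroring the pattern described above:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Synthetic illustration only: two candidate predictors of persistence.
purpose_score = rng.uniform(0, 1, n)   # e.g., a survey-based "purpose" measure
logins = rng.uniform(0, 30, n)         # LMS activity in a term

# In this toy data, persistence is driven mostly by purpose.
p = 1 / (1 + np.exp(-(4.0 * purpose_score + 0.05 * logins - 2.5)))
persisted = rng.binomial(1, p)

X = np.column_stack([purpose_score, logins])
model = LogisticRegression().fit(X, persisted)

# Inspect which variable the model leans on most heavily.
for name, coef in zip(["purpose_score", "logins"], model.coef_[0]):
    print(f"{name}: {coef:.2f}")
```

With a real model you would standardize features and validate out of sample before trusting a coefficient comparison; the point here is only that the fitted model recovers the dominant relationship from the data rather than from anyone's prior assumptions.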
In addition to surfacing predictors and patterns and informing immediate outreach, pulling your student-flow data together for insight analysis lets you learn important lessons about the impact of courses, course pathways, and programs; intervention strategies and initiatives; and policies and practices. When you have models built at the student level, you can leverage prediction-based propensity score matching (PPSM) to try and test each of these more rigorously (more on PPSM here). In essence, PPSM helps create useful control groups for more comprehensive studies of impact that aren't subject to typical challenges like selection bias; for example, students who opt in to orientation programs are often the ones who need them least and are more likely to be retained anyway. We now suggest that institutions audit their vast array of student success programs and conduct rigorous impact analysis, learning what's working, at what level, and for whom.
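For readers new to the technique, here is a bare-bones sketch of the propensity-matching idea: classic propensity score matching with a greedy nearest-neighbor match on synthetic data. Civitas's prediction-based PPSM builds on this idea but differs in the details, so treat this only as an illustration of the underlying concept.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400

# Hypothetical data: prior GPA drives both orientation opt-in and retention,
# which creates exactly the selection bias described above.
gpa = rng.uniform(1.5, 4.0, n)
attended = rng.binomial(1, 1 / (1 + np.exp(-(gpa - 2.75))))
retained = rng.binomial(1, 1 / (1 + np.exp(-(1.5 * gpa - 3.5 + 0.3 * attended))))

# 1. Model the propensity to receive the treatment (attend orientation).
X = gpa.reshape(-1, 1)
propensity = LogisticRegression().fit(X, attended).predict_proba(X)[:, 1]

# 2. Match each treated student to the untreated student with the
#    closest propensity score (greedy nearest-neighbor, for illustration).
treated = np.where(attended == 1)[0]
control = np.where(attended == 0)[0]
matches = [control[np.argmin(np.abs(propensity[control] - propensity[i]))]
           for i in treated]

# 3. Compare outcomes across the matched groups.
effect = retained[treated].mean() - retained[matches].mean()
print(f"Estimated effect of orientation on retention: {effect:.3f}")
```

Matching treated students to untreated students with similar propensity scores approximates the control group a randomized design would have provided, which is what blunts the opt-in selection bias: a naive comparison would credit orientation with the retention advantage that high-GPA students already had.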
After optimizing data and maximizing insight from your own student data, you're positioned to take directional information like the Six Core Principles and decide which actions your institution wants to take. The combination of this report and insights from your own data might compel you to change policy and require default course pathways for entering students (principle 2). You might decide to launch a modular, accelerated developmental math emporium (principle 4). Or you might begin leveraging analytically fueled applications on the front lines, like Degree Map or Inspire for Advisors, to help advisors and learners chart the course (principle 6). These actions are now grounded in external learning and internal insight, and the applications are powered by your active-student data. And because you're leading in the age of analytics, you instrument whatever policy, practice, initiative, or outreach you launch so that you can leverage the resulting data to tune and test it from launch into learning. See this learning brief from Austin Community College on their Degree Map initiative to get a sense of how this cycle unfolds.
For far too many student success innovations, the launch was exciting, but the learning about whether it worked was a long time coming, if it came at all. We suggest moving quickly from launch to learning by fully leveraging optimized data, maximized insight, and informed action. In the learning phase, you continue to bring in insights from outside sources; however, you combine these directional pieces (e.g., the Six Core Principles) with what you're learning in real time about your own actions in your own context. Think of it like having an organizational Fitbit. Fitness fanatics nationwide use these devices to take directional suggestions from trainers, doctors, friends, and family and combine them with real-time data from their own bodies, insights drawn from apps, and actions they decide to take: changing their sleep policy and going to bed earlier, adapting their eating strategy and balancing carbohydrates, or increasing activity and shooting for 10,000 steps a day. They use the combination of directional learning and real-time data to turn the DIAL. Their data fuels their insights, which inform their actions, which continue their learning and development and, hopefully, their health. The same principle applies to organizations. Having a home analytics system allows you to leverage important and useful pieces like the Six Core Principles and begin to turn the DIAL in your own organization, guiding the work of change toward real and significant impact on student learning and completion.
As the learning continues, leaders more often than not realize they need more and better data. Faculty begin expanding their use of the LMS. Intake counselors begin asking different, more nuanced questions. New tools are connected into the data flow. Put simply, the learning phase keeps the momentum going and speeds the turn. From learning, leaders continue to optimize data, maximize insight, inform action, and continue learning: they keep turning the DIAL. And all of this is designed to drive toward improved and expanded student learning and completion.
Turning the DIAL
Civitas Learning provides a home analytics system that enables institutions to make the most of their learning data to help students learn well and finish strong. It's our intention to help diverse institutions with diverse students learn from important external pieces like the Six Core Principles and then turn the DIAL internally to keep driving change that improves outcomes for their students. It's far from easy work, but the results are powerful. Just as we tell our students: critical thinking, creativity, and courageous action, especially when connected to continuous learning, make all the difference in the world!