A Civitas Learning colleague and I teach a class in a master’s of leadership program. The course is designed to help emerging leaders use their organizational data to inform their decisions, policies, and approach. Our class design has been informed by the work we have been privileged to do with our various college and university partners. Across all of the different kinds of higher education institutions we work with – public, private, and for-profit; big and small; urban and rural; access-oriented and selective; two-year and four-year; those that have created a successful culture of using data to improve student success and those still on the path to doing so – we have found that this culture work often starts with knocking down some inherent or tacit assumptions on the part of institutional leaders. Here is a list of assumptions higher ed leaders need to tackle:

1. Stop assuming everyone understands. When you are in a leadership role, it is quite normal – and human – to not want to expose gaps in your knowledge or skills. However, simply nodding and saying “I understand” when you don’t misses an important opportunity not only to increase collective knowledge but also to build trust across the team. When you are presented with an acronym-laden presentation full of data-informed insights, ask the questions that others may be thinking but are too afraid to ask. Make sure that everyone in the room (including you!) has a clear understanding of what is being presented. You don’t have to be the smartest data analyst in the room, but you do need to be able to follow the conversation with a functioning “BS detector.”
2. Stop assuming academic research is the same thing as data analytics. Related to the assumption above: I see a fair number of academic leaders who are skilled researchers and scholars. However, the skills, software, and approaches used in their academic research may differ from those used in current data analytics practice. For those of us who were trained to analyze data with an eye toward understanding causality, there can be some discomfort in informing decisions and actions based simply on correlation. However, given sufficient data and strong enough correlations – and given the urgency of addressing student success – this is how we can use analytics in our work today to measurably improve outcomes.
3. Stop assuming that the data will give you the answer. The insights gained from analytics will inform you, but they may not present a clear answer to your questions. That is okay. The data themselves are impotent without context and human agency. They are a starting point from which more and more questions are likely to be generated, all of which will inform how you craft an answer.
4. Stop assuming there will be an “Aha!” moment. The analytics-driven search for the surprise, the unknown, the “Aha!” can be fun, and it may shed new light on your student success initiatives. But I believe there is more power in confirmatory findings – those that solidify what you may already know through experience or “gut.” There is real power in adding analytic precision to what you already know about a given student’s likelihood to persist. Confirmation through data provides solid grounds to launch or refine interventions, as well as a solid benchmark against which progress can be measured.
5. Stop assuming the data will confirm your beliefs. I know this sounds like it contradicts #4 above, but I urge you to consider your own confirmation bias: the tendency we all share to search for, interpret, favor, and recall information in a way that confirms our pre-existing beliefs or hypotheses. Be open to letting analytics-informed insights lead you to a conclusion that may conflict with your personally held beliefs about student success. A lot can be learned when we let ourselves be open to the learning!