In this episode of 3 Big Questions we visit with Dr. Phil Ice. 3 Big Questions is a recurring feature in the Civitas Learning Space.
Should institutional data sources be used and integrated into instructional design strategies, and if yes, what’s the value proposition in doing so?
There are numerous instructional design strategies used in higher education, but to begin to answer this question I am going to focus on one – ADDIE (Analysis, Design, Development, Implementation, Evaluation) – where the E stands for Evaluation. Other models use different terminology, but the common denominator is that they all have some phase in which designers are supposed to look at the learning outcomes, determine whether the instruction was designed correctly, and make modifications accordingly.
Typically, the evaluation phase has been a lengthy process in which instructional designers review learning outcomes and collaborate with faculty to understand where there are strengths and deficiencies in the instructional design strategy employed. The assessment and design phases are then iterated upon in an attempt to strengthen the way content and pedagogical strategies are used. In large part, this process has required significant time between iterations because of the time needed to collect and analyze data. When we look at methodologies being used by entities such as Civitas Learning, we see dynamic data collection and analysis systems that allow instructional designers to quickly diagnose problem areas and rapidly iterate on design strategies. For institutions, this translates into the ability to respond quickly to student needs and improve cognitive outcomes.
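To make the idea of quickly diagnosing problem areas concrete, here is a minimal sketch of how a designer might flag course modules whose assessment results suggest a design problem. The module names, scores, and the 70% threshold are all illustrative assumptions, not a Civitas Learning method.

```python
# Hypothetical sketch: flag course modules whose average assessment score
# falls below a threshold, so designers know where to iterate first.
from statistics import mean

def flag_weak_modules(scores_by_module, threshold=0.70):
    """Return modules whose mean assessment score is below threshold."""
    return sorted(
        module
        for module, scores in scores_by_module.items()
        if mean(scores) < threshold
    )

scores = {
    "Module 1": [0.82, 0.91, 0.77],
    "Module 2": [0.58, 0.64, 0.61],  # likely a design problem area
    "Module 3": [0.74, 0.70, 0.88],
}
print(flag_weak_modules(scores))  # -> ['Module 2']
```

With assessment data flowing in continuously rather than at the end of a term, a check like this can run after every module, which is what collapses the time between design iterations.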
How has data analysis changed over the last few years and what are the implications for higher education?
When I went through my doctoral program, we were taught that research consisted of systematically combing the literature to define areas of need and then constructing elaborate research questions with equally elaborate methodologies. While there is a place for this type of work, it is not the only approach to research, and in some respects it cannot respond quickly enough to the critical challenges emerging in today's educational landscape.
With the advent of extremely robust back-end technologies, it is now possible to quickly federate large volumes of data in real time, or near real time, and implement data mining strategies to surface trends that provide mission-critical intelligence to practitioners. Notably, the ability to continuously collect and monitor data, in conjunction with appropriate interventions, is critical both to understanding core issues and to providing just-in-time support to stakeholders from students to university presidents.
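As a toy illustration of continuous monitoring, the sketch below tracks a stream of observations with a simple moving average and signals when the average drops below a threshold. The metric, window size, and threshold are illustrative assumptions; real systems would federate many sources and use far richer models.

```python
# Hypothetical sketch: near-real-time monitoring of an engagement metric.
# The window size and alert threshold are illustrative assumptions.
from collections import deque

class EngagementMonitor:
    def __init__(self, window=5, threshold=0.50):
        self.window = deque(maxlen=window)  # keeps only the newest values
        self.threshold = threshold

    def observe(self, value):
        """Record a new observation; return True if the moving average
        falls below the threshold, suggesting an intervention is needed."""
        self.window.append(value)
        avg = sum(self.window) / len(self.window)
        return avg < self.threshold

monitor = EngagementMonitor(window=3, threshold=0.5)
stream = [0.8, 0.7, 0.6, 0.4, 0.3]  # engagement declining over time
alerts = [monitor.observe(v) for v in stream]
print(alerts)  # -> [False, False, False, False, True]
```

The point is the shape of the loop, not the model: each new observation updates the picture immediately, so support can be offered while the problem is still unfolding rather than after the term ends.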
For researchers, this changes the paradigm from one in which problems are assessed retrospectively using very formal research constructs to one in which there is an expectation of providing data for the creation of solutions to immediate problems. After the fact, the more formal research techniques can be used to address causality. This dual methodological track calls for either acquiring new skill sets around data mining and its associated technologies to complement traditional research techniques, or developing deep collaborations with third-party entities that specialize in the former and can provide insight to inform the latter.
What do higher ed practitioners need to be doing now across our institutions to further catalyze a whole community of practice in analytics?
Cross-institutional work has always been difficult. In the course of my career I have been involved in several multi-institutional initiatives, with the Community of Inquiry Framework and the Predictive Analytics Reporting initiative being the two most prominent that have reached fruition. However, I have also been involved in numerous think tanks and initiatives that never achieved critical mass and, as a result, fell by the wayside. Notably, many of those ideas have since come to market through private entities or other working groups. Regardless of the level of success, the common denominator has always been that the individuals involved shared a sense of common purpose around a fixed set of questions. As I noted in the previous question, the way data mining is evolving is changing the paradigm from one of fixed goals to one in which we look at data from an exploratory perspective and develop deeper questions based on what we see.
For institutions that want to be involved in communities of practice in analytics, this means becoming much more flexible in both initial mindset and goal orientation. In short, we need to accept that data is now telling us things we might never have expected, some of which defy conventional wisdom. To be an effective member of a research community, we need to embrace these types of findings and rapidly develop research constructs that address the associated causality.
For these types of communities to be successful, practitioners also need to accept that not all truths are universal. Specifically, a given set of variables may be highly significant at one university and virtually meaningless at another. We must be willing to accept that this is not only possible but can in fact be the norm. From there, community members must develop a shared sense of purpose in understanding these differences and engage in comparative analysis. When this type of data sharing becomes routine, we can form a baseline for developing best practices that impact the largest number of stakeholders and have truly universal impact.