The LAK11 MOOC
In preparation for the Learning Analytics and Knowledge 2011 (LAK11) conference, George Siemens, Jon Dron, Dave Cormier, Sylvia Currie, and Tanya Elias hosted a massive open online course (MOOC) on the subject. A MOOC is a course in which large numbers of participants converge online to discuss and debate a subject for a set period of time, and in which all the resources and interaction data are publicly available and recorded for future use. The participants used formal means of communicating through Moodle (a learning management system) and Elluminate (an online collaboration tool), as well as less formal means such as social network groups, microblogging, aggregated personal blogs, and social bookmarking. In the LAK11 MOOC, participants used these means to become aware of and deliberate the fundamentals and the future of Learning Analytics. The LAK11 MOOC has become one of the most comprehensive sources on Learning Analytics anywhere. I experienced the LAK11 MOOC after the fact by watching the Elluminate sessions, reading the suggested content, perusing the Moodle forums, and playing with the suggested analysis tools.
What are the Learning Analytics Fundamentals?
In general, it seemed like those participating in the LAK11 MOOC conceptualized Learning Analytics as the intersection of:
1. Data: student/machine data
2. Analysis: how the data are connected and why
3. Curation: personalized and adapted content and relationships
4. Prediction: targeting remediation and interventions, recommending resources and behaviors
But my reduced conceptualization (note that this is my version of what was deliberated in the MOOC) is lacking. Other important elements of Learning Analytics were also discussed in the MOOC, though it is not yet clear to me how they fit together, such as:
- Data ownership
- Learning philosophy fit
- Open vs. closed systems
- How to deal with the necessity of closed learning systems for minors?
- What are the advantages and risks of the timing of decisions based on predictions (preemptive, just-in-time, postmortem)?
- Who consumes Learning Analytics data and recommendations?
- Who makes the Learning Analytics decisions (i.e., which algorithms to use) and for whom (students, instructors, departments, schools, etc.)?
- How are data, recommendations, and requirements best communicated to those involved?
- How can technical decisions in the Learning Analytics workflows be easily evaluated by consumers of the data and recommendations?
- What is learning?
- What is success?
- Can meaningful learning be measured and evaluated by Learning Analytics?
Dr. Linda Baer (2011) from the Gates Foundation also presented a useful hierarchy for strategic intelligence in higher education. As she cited it, the framework was from Competing on Analytics (Davenport & Harris, 2007).
| Analytics level | Question answered |
| --- | --- |
| Optimization | What's the best that can happen? |
| Predictive modeling | What will happen next? |
| Forecasting/extrapolation | What if these trends continue? |
| Statistical analysis | Why is this happening? |
| Alerts | What actions are needed? |
| Query/drill down | Where exactly is the problem? |
| Ad hoc reports | How many, how often, where? |
| Standard reports | What happened? |
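To make the middle rungs of this hierarchy concrete, here is a small sketch of my own (not from the MOOC, and the login data are made up) of the jump from standard reporting to forecasting/extrapolation: given weekly course-login counts, a least-squares trend line answers "what if these trends continue?"

```python
def linear_trend(values):
    """Fit y = a + b*x by ordinary least squares over x = 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

def forecast(values, weeks_ahead):
    """Extrapolate the fitted trend: 'what if these trends continue?'"""
    a, b = linear_trend(values)
    return a + b * (len(values) - 1 + weeks_ahead)

# Hypothetical weekly login counts for one course, declining over the term
logins = [120, 115, 108, 101, 97, 90]
print(round(forecast(logins, 3)))  # projected logins three weeks out
```

A "standard report" would only restate the six observed counts; the trend line is what lets an instructor or administrator act before the decline continues.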
There is still quite a lot to iron out in Learning Analytics, and I am beginning to wonder whether it is useful to talk about Learning Analytics for its own sake, given that it differs so much depending on what is being measured and in what context.
Learning Analytics is Broad and a Bit Overwhelming
One thing is clear to me about Learning Analytics: it is a bit overwhelming. In order to be conversant in its variety of applications, it appears that one needs familiarity and/or expertise in the areas described in the image below. What kind of program provides the opportunity to gain skills, know-how, and knowledge in all of these areas? With such a high bar for entry, it hardly seems that this field can take off. There needs to be a way for specialists to come together easily so that interdisciplinary contributions can be made. But are there some skills that should be pervasive? Are data mining and social network analysis (SNA) the new language of social research literacy? If so, it seems that we need something that does for learning analytics what blog software did for web publishing, so that learning measurement, evaluation, and prediction become available to the masses.
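As a tiny illustration of the SNA skill mentioned above (my own sketch, with made-up forum data, not an example from the MOOC), one of the simplest SNA measures is degree centrality: how many distinct people each participant has exchanged replies with. It can be computed directly from a list of reply pairs:

```python
from collections import defaultdict

def degree_centrality(reply_pairs):
    """Count distinct discussion partners per participant
    from (replier, original_poster) pairs."""
    neighbors = defaultdict(set)
    for a, b in reply_pairs:
        if a != b:                # ignore self-replies
            neighbors[a].add(b)
            neighbors[b].add(a)   # treat a reply as an undirected tie
    return {person: len(others) for person, others in neighbors.items()}

# Hypothetical forum reply pairs (who replied to whom)
replies = [("ana", "ben"), ("ben", "cho"), ("ana", "cho"), ("dia", "ana")]
print(degree_centrality(replies))  # ana touches the most distinct partners
```

Even this toy measure hints at who is central to a discussion, which is exactly the kind of question a course-level analytics dashboard might answer.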
Who Benefits from Learning Analytics?
George Siemens addressed the beneficiaries of Learning Analytics in his January 11, 2011 MOOC session (Siemens, 2011). He described which technologies or techniques are used in Learning Analytics at varying levels of the hierarchy of formalized education, and who benefits from analytics at each of those levels.
- Course level:
  - Social networks, conceptual development, language analysis
  - Beneficiaries: learners, faculty
- Departmental level:
  - Predictive modeling, patterns of success/failure
  - Beneficiaries: learners, faculty
- Institutional level:
  - Learner profiles, performance of academics, knowledge flow
  - Beneficiaries: admin, funders, marketing
- Regional level:
  - Comparisons between systems
  - Beneficiaries: funders, admin
- National and international level
I think that it would be good to also address who will be disadvantaged by Learning Analytics. In Ian Ayres's presentation to Google (2007), he mentioned some studies showing that programmed instruction had a greater positive effect on student test scores than other methods. The danger in returning to such methods is that teachers become disengaged over time by such restrictions, and disengaged teachers often produce uninspired students. There is no doubt that for certain types of learning, strict regimens are an advantage to students. But that is not to say that strict regimens should be imposed on all learners for all types of learning.
But what does this have to do with Learning Analytics? Well, the immense amount of evidence behind certain techniques of learning and instruction that will become possible through Learning Analytics may turn teacher discretion into a thing of the past. Modern technologies usually enter the educational system with the assumption, "let computers do what they do well and people do what they do well." This may turn, however, into "make people and computers do what computers do well." As school districts and legislatures look for ways to spend education money efficiently, they often turn to large standardized programs that restrict teachers' creativity and resourcefulness in the name of raising the minimum bar. Will Learning Analytics have the same effect? Will the holders of the education purse strings treat easy-to-graph-and-display data as the only data worth entertaining for decision making, thus reducing teachers' input and their freedom to educate and engender passion for learning by restricting them to prescribed, trackable methods? I am sure some of this will happen, but I surely hope that it remains a minority of outcomes in the long run.
Ayres, I. (2007, November 8). YouTube – Authors@Google: Ian Ayres. Retrieved from http://www.youtube.com/watch?v=5Yml4H2sG4U&feature=player_embedded
Baer, L. (2011, February 8). Systemic Adoption of Learning Analytics. Presented at the LAK11 MOOC. Retrieved from https://sas.elluminate.com/site/external/jwsdetect/playback.jnlp?psid=2011-02-08.1140.M.340DDA914E66190DED68B759DCF9C3.vcr&sid=2008104
Davenport, T. H., & Harris, J. G. (2007). Competing on Analytics: The New Science of Winning (1st ed.). Harvard Business School Press.
Fritz, J. (2011, January 11). Learning Analytics. Retrieved from https://sas.elluminate.com/site/external/jwsdetect/playback.jnlp?psid=2011-01-11.1101.M.340DDA914E66190DED68B759DCF9C3.vcr&sid=2008104
Siemens, G. (2011, January 10). Learning Analytics: A foundation for informed change in education. Presented at the EDUCAUSE ELI Webinar: Recording. Retrieved from http://educause.adobeconnect.com/p63014716/