Posted by: Michael Atkisson | June 2, 2011

Learning Analytics: a House Without a Foundation?

This is a reflection on the Week 1 reading list of the Learning Analytics Open Course, run in conjunction with the 2011 Learning and Knowledge Analytics conference (LAK11, at which I presented), organized by George Siemens of Athabasca University.

What is Learning Analytics? Where did it come from?

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs (Siemens, 2011, http://educause.adobeconnect.com/p63014716/). Several fields, including inquiry, design, business, web services, and psychology, have overlapped with one another enough that people in these fields and others have noticed the potential synergy among them for applications in educational entrepreneurship and research. Learning analytics is a part of, and yet apart from, its source fields. For example, as business intelligence (BI) and online marketing analytics tools and methodologies have become more available, people in other fields, such as education, have looked for ways to apply those tools so that they may make better, data-based decisions, as those in BI have done (Elias, 2011).

Different groups have approached this opportunity from different directions:

  • Academic Analytics, as Goldstein (2005) defines it, is the higher education instantiation of BI, meant to help higher education administrators make better management and strategic decisions based on ERP data.
  • Action Analytics, a trademarked name, adds the goal of improving faculty practice to the goal it shares with academic analytics: improved higher education management (Elias, 2011, p. 3).
  • Educational Data Mining (EDM) is essentially a body of workflows for finding patterns in large data sets that come from formal educational settings. EDM (Baker & Yacef, 2009) applies traditional data mining techniques alongside clustering methods in order to account for the multilevel nature of educational data: students in the same class tend to be more similar to one another than to students in other classes, and the same holds for schools, districts, states, and so on, which violates traditional assumptions of independent random sampling (a sketch of this nesting issue follows the list).
  • Networked Learning is the study of learning that people do while interacting in virtual networks and with learning resources (“Networked learning – Wikipedia, the free encyclopedia,” n.d.).
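
The multilevel issue raised in the EDM bullet above can be made concrete with a small, hypothetical example. The sketch below is my own illustration, not drawn from Baker & Yacef (2009); it simply contrasts an ordinary regression with a random-intercept model (using Python's statsmodels, with invented column names and data) to show how treating students as nested within classes changes the analysis.

```python
# Hypothetical sketch: why nesting matters in educational data.
# Students sit inside classes, so observations are not independent draws.
import pandas as pd
import statsmodels.formula.api as smf

# Invented student-level records: quiz scores nested within class sections.
data = pd.DataFrame({
    "score":    [72, 75, 71, 88, 91, 86, 64, 60, 67, 80, 83, 79],
    "minutes":  [30, 35, 28, 50, 55, 48, 20, 18, 25, 40, 45, 38],
    "class_id": ["A", "A", "A", "B", "B", "B", "C", "C", "C", "D", "D", "D"],
})

# Naive model: treats every student as an independent observation.
ols_fit = smf.ols("score ~ minutes", data=data).fit()

# Multilevel model: a random intercept per class absorbs between-class
# variation, which is the clustering concern described in the EDM bullet.
mixed_fit = smf.mixedlm("score ~ minutes", data=data, groups=data["class_id"]).fit()

print(ols_fit.params)
print(mixed_fit.params)
```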

Learning Analytics, according to Elias (2011), is a field of tool and process development in the area of teaching and learning that:

  1. enables students and instructors to make decisions based on real-time or near-real-time data (individual, class, and other data),
  2. personalizes content and learner support, and
  3. continuously refines student and instructor behavior models in order to inform practice.

The assumption is that Learning Analytics is different from other related areas of educational inquiry and intervention in that its focus is on using data to change the now, in real time, at the class and individual levels.
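
As a concrete (and entirely hypothetical) illustration of "changing the now," the sketch below flags learners whose activity in the last few days has dropped sharply relative to their own baseline, so that an instructor could intervene during the course rather than after it. The event records, thresholds, and names are assumptions made for illustration, not part of any cited framework.

```python
# Hypothetical near-real-time signal: flag learners whose recent activity
# rate falls well below their own baseline rate.
from datetime import datetime, timedelta
from collections import defaultdict

def flag_disengaged(events, now, window_days=3, baseline_days=21, drop_ratio=0.25):
    """Return learner ids whose recent event rate fell below drop_ratio of their baseline."""
    recent_start = now - timedelta(days=window_days)
    baseline_start = now - timedelta(days=baseline_days)

    recent = defaultdict(int)
    baseline = defaultdict(int)
    for learner_id, timestamp in events:
        if timestamp >= recent_start:
            recent[learner_id] += 1
        elif timestamp >= baseline_start:
            baseline[learner_id] += 1

    flagged = []
    for learner_id, past_count in baseline.items():
        past_rate = past_count / (baseline_days - window_days)   # events per day, earlier window
        recent_rate = recent.get(learner_id, 0) / window_days    # events per day, last few days
        if past_rate > 0 and recent_rate < drop_ratio * past_rate:
            flagged.append(learner_id)
    return flagged

# Example: two learners, one of whom has gone quiet in the last three days.
now = datetime(2011, 6, 2)
events = [("amy", now - timedelta(days=d)) for d in range(1, 20)]   # steady activity
events += [("ben", now - timedelta(days=d)) for d in range(4, 20)]  # nothing recent
print(flag_disengaged(events, now))  # -> ['ben']
```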

What is a Learning Analytics Workflow? Machine, Human, or Both?

Just as Learning Analytics has borrowed from other fields conceptually, it has also adopted other analytics workflows for making things happen in real time or near-real time. Elias (2011) enumerated five, which unfortunately focus mainly on what the technology infrastructure does to the data throughout its analytics life cycle (despite some minimal lip service to the contrary), rather than on what people and technology do together in the sociotechnical system (Schwen et al., 1993).

[Table: Five learning analytics workflows (Elias, 2011)]

It’s fine to describe what the machines are doing in learning analytics, but these workflow frameworks mix metaphors, describing what the people and the machines do in the same terms. This is problematic because it becomes difficult to parse which decisions a human has made versus which actions have been taken as the result of an algorithm. For example, Elias (2011) noted one framework, the knowledge continuum, attributed to Baker (2007).

The Knowledge Continuum

  1. Wisdom: “Use knowledge to establish and achieve goals”
  2. Knowledge: “Analyze and synthesize, derived from information”
  3. Information: “Give meaning to obtained data”
  4. Data: “Obtain raw facts”

Baker’s framework seems to leave out critical elements of knowledge and knowing even though it attempts to use some of the same language:

  • Wisdom: In this framing, wisdom seems to be merely a result of the skillful use of knowledge, but this does not address the real-time skills of perceiving, reacting, and making sound judgments.
  • Knowledge: Assuming that knowledge only results from analyzed and synthesized information is a serious flaw in the representation of knowledge. This is a strict adherence to the information processing (IP) metaphor, in which the brain processes things the way a computer does. Such strict “processing” metaphors crowd out a variety of other types of knowledge, such as emotional, group, cultural, and tacit knowledge. It’s as if Baker thinks the only things worth paying attention to are explicit and individual.
  • Information: “Giving meaning to obtained data” raises several questions: Who gives the meaning? What counts as data? What is the skill of giving meaning to data, and how does that happen? This stage of the IP metaphor seems to break down pretty quickly.
  • Data: “Obtain raw facts” rests on some questionable assumptions. “Raw facts” assumes that no interpretation is made in the perception of the world, as if humans could objectively record the world around them inside their brains. “Facts” again assumes that only explicit, countable things are worth paying attention to, when, to the contrary, the things that often have the most value are quite difficult to state and demonstrate explicitly. Humans are capable of much more than just obtaining facts.

“Baker suggested that predictive analytics and the development of actionable knowledge corresponded with the transformation of knowledge to wisdom. The knowledge continuum highlights that it is in the processing and use of data that it is transformed into something meaningful” (Elias, 2011).

But prediction is not understanding why. Many experiments can be replicated without understanding why and how, and reports appear in the news and in journals all the time in which the most basic things we have taken for granted are shown to be inaccurate. The use of knowledge does not make it wisdom; wisdom implies value, and sometimes the use of knowledge does not result in value. Gyroscopic motion, for example, does not explain bicycles: you can predict and model what bicycle wheels are going to do, but the assurance that comes from accurately predicting their behavior was mistakenly extended to shore up the interpretation of why and how it was happening, which is an entirely different, and human, side of science. The transformation into wisdom is not what brings meaning; rather, wisdom is developed through socially informed practice.

If we are helping people make decisions, then learning analytics is a moral and ethical endeavor. Being able to predict is not a high enough standard; we must understand why and how before we can ethically recommend. Consequently, learning analytics workflows should include human considerations and tasks in combination with what computers do. There has not yet been an articulation of what people do in learning analytics in conjunction with technology, with the possible exception of my own presentation at LAK11, which focused on the role of interpretation in learning analytics.

Is Learning Analytics Another Tool Fad?

It seems that Elias’s (2011) Process of Learning Analytics, though more comprehensive than the other workflows compared in her article, still essentially describes what the machines and algorithms do. Without a window into what people do to participate in the learning analytics process, and how that is different from other sociotechnical endeavors, it will be difficult to differentiate learning analytics from, and to transcend, fly-by-night, tool-centered fads.

What, then, would differentiate learning analytics from the rest of the pack apart from the technical process? We have learning analytics differentiated in terms of:

  • time (now/near-now),
  • users (instructors and learners),
  • activity context (teaching and learning), and
  • environment (virtual/blended).

But this does not seem like enough differentiation to keep other related fields from taking over learning analytics’ ideas. If learning analytics is going to put more than the “tool” stake in the ground, it needs to define and make contributions in the fundamentals of how we interpret and interact with teaching and learning (i.e., philosophy, epistemology, and metaphor) in addition to its technological innovations. Otherwise, we will be selecting, capturing, aggregating, reporting, predicting, using, refining, and sharing, all while not understanding why or how.

Will learning analytics align itself with the fading juggernauts of behaviorism and cognitive science (Gardner, 1987; Williams, 1987)? The eclectic round-up of theories under constructivism (Hay & Barab, 2001)? The burgeoning field of the learning sciences (“Learning sciences – Wikipedia, the free encyclopedia,” n.d.) and its boon companion, design-based research (Barab & Squire, 2004)? Or the little-known branch of hermeneutics called interpretive inquiry (Westerman, 2006)? Right now, it seems that learning analytics may differ little in its assumptions from Skinner’s teaching machines (Skinner, 1961). Are we returning to glorified programmed instruction in sheep’s clothing? Or are we expanding into a broader landscape of what it means to learn (Bruner, 1961; Collins, Brown, & Holum, 1991; Engeström, 2001; Lave, 1991; Savery & Duffy, 1996; Vygotsky, 1978; Wenger, 2007), aided by technology?

Humans participate in meaning-laden (and many times meaningful) social activity that by default requires interpretation to understand. Without addressing meaning and the whys of behavior and social practices, how can we ethically “recommend” learning prescriptions through our algorithms? Currently, the process of learning analytics seems to abdicate the field’s responsibilities by attempting to fit into just what machines can do. Maybe instead we should be tailoring the machines around what humans can do.

Learner Self-participation in Analytics Development

Maybe learning analytics should be more transparent and understandable to the general users of the analytics data. Maybe there is a way to involve user preferences in the type of analysis that is carried out, like what Google News and Netflix do to find out what you want “fed” to you. Maybe learning analytics should be elevating the role of individuals in the organization of their own analytics.

[Image: Google News story selection (“Google News,” n.d.)]

[Image: Netflix video ratings by users (“Netflix,” n.d.)]

So the learning analytics questions become not only what humans can do in the learning analytics workflow, but also where the consumers of learning data can participate in curating their own data (a minimal sketch of one such preference mechanism follows).
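
One way to picture that kind of participation is sketched below: a small, entirely hypothetical structure in which the learner’s own preferences determine which indicators are computed about them and shared. The class, field, and indicator names are invented for illustration and do not describe any existing system.

```python
# Hypothetical sketch of learner-controlled analytics preferences, loosely
# analogous to the Google News / Netflix feedback loops mentioned above.
from dataclasses import dataclass, field

@dataclass
class AnalyticsPreferences:
    """What a learner has opted into, and how their data may be used."""
    learner_id: str
    share_with_instructor: bool = True       # may the instructor see derived indicators?
    include_in_cohort_models: bool = False   # may their data train class-level models?
    enabled_indicators: set = field(default_factory=lambda: {"progress"})

    def allows(self, indicator: str) -> bool:
        return indicator in self.enabled_indicators

def report_for(prefs: AnalyticsPreferences, available_indicators: dict) -> dict:
    """Compute only the indicators this learner has opted into."""
    return {name: compute() for name, compute in available_indicators.items()
            if prefs.allows(name)}

available = {
    "progress": lambda: "7 of 10 modules complete",
    "engagement_risk": lambda: "low recent activity",
}

prefs = AnalyticsPreferences(learner_id="amy", enabled_indicators={"progress"})
print(report_for(prefs, available))  # -> {'progress': '7 of 10 modules complete'}
```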

Conclusion

Regardless of where learning analytics takes itself, it will be important for those involved in the field to define how humans guide and participate in the development of the analytics and how transparent that should be to end users. It will also be important to go beyond prediction into why things happen the way they do for certain individuals or groups vs. others. Without being able to explain “why,” learning analytics may be a house without a foundation, another tool fad to either be left behind or subsumed by a field with a more persuasive philosophy.

References

Baker, R., & Yacef, K. (2009). The State of Educational Data Mining in 2009: A Review and Future Visions. Journal of Educational Data Mining, 1(1), 3-17.

Barab, S., & Squire, K. (2004). Design-Based Research: Putting a Stake in the Ground. The Journal of the Learning Sciences, 13(1), 1–14.

Bruner, J. S. (1961). The Act of Discovery. Harvard Educational Review, 31(1), 21-32.

Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive Apprenticeship: Making Thinking Visible. American Educator, 15(3), 6-36.

Elias, T. (2011, January). Learning Analytics: Definitions, Processes, and Potential. Creative Commons. Retrieved from http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf

Engeström, Y. (2001). Expansive Learning at Work: toward an activity theoretical reconceptualization. Journal of Education & Work, 14(1), 133-156.

Gardner, H. E. (1987). The Mind’s New Science: A History of the Cognitive Revolution. Basic Books.

Goldstein, P. J. (2005). Academic Analytics: Uses of Management Information and Technology in Higher Education. Educause, ECAR Key Findings, 1-12.

Google News. (n.d.). Retrieved June 1, 2011, from http://news.google.com/

Hay, K. E., & Barab, S. A. (2001). Constructivism in Practice: A Comparison and Contrast of Apprenticeship and Constructionist Learning Environments. The Journal of the Learning Sciences, 10(3), 281–322.

Lave, J. (1991). Chapter 4: Situating Learning in Communities of Practice. Perspectives on socially shared cognition (pp. 63-82). American Psychological Association.

Learning sciences – Wikipedia, the free encyclopedia. (n.d.). Retrieved June 1, 2011, from http://en.wikipedia.org/wiki/Learning_sciences

Netflix. (n.d.). Retrieved June 1, 2011, from http://movies.netflix.com/WiHome

Networked learning – Wikipedia, the free encyclopedia. (n.d.). Retrieved June 1, 2011, from http://en.wikipedia.org/wiki/Networked_learning

Savery, J. R., & Duffy, T. M. (1996). Chapter 11: Problem Based Learning: An Instructional Model and Its Constructivist Framework. Constructivist learning environments: case studies in instructional design (pp. 135-148). Educational Technology.

Schwen, T. M., et al. (1993). On the Design of an Enriched Learning and Information Environment (ELIE). Educational Technology, 33(11), 5-9.

Skinner, B. F. (1961). Why we need teaching machines. Harvard Educational Review, 31(4), 377-398.

Vygotsky, L. S. (1978). Chapter 6: Interaction between Learning and Development. In M. Cole, V. John-Steiner, S. Scribner, & E. Souberman (Eds.), Mind in society: the development of higher psychological processes (pp. 79-91). Harvard University Press.

Wenger, E. (2007). Chapter 12: Education. Communities of practice: Learning, meaning, and identity (pp. 263-277). Cambridge University Press.

Westerman, M. A. (2006). Quantitative research as an interpretive enterprise: The mostly unacknowledged role of interpretation in research efforts and suggestions for explicitly interpretive quantitative investigations. New Ideas in Psychology, 24(3), 189-211. doi:10.1016/j.newideapsych.2006.09.004

Williams, R. N. (1987). Can Cognitive Psychology Offer a Meaningful Account of Meaningful Human Action? The Journal of Mind and Behavior, 8(2), 209-221.

