Posted by: Michael Atkisson | February 10, 2012

(Not) Learning Analytics? The Relevance of LMS Behavioral Data

Pardo and Kloos (2011) recognized that learning analytics has limited ability to illuminate learning when the data it analyzes is restricted to student behavioral data from an LMS, particularly for face-to-face classes.

“Although the latest LMSs offer an increasing set of features, students are beginning to reach the educational institutions with solid experience on how to interact with their peers in ways that are not covered by the LMS. The main consequence is that a significant part of the interaction during a learning experience is beginning to take part outside of the LMS in what it could be called a “de-centralized approach” (Pardo and Kloos, p. 164).

Another problem they recognized is the inability of LMS-centric learning environments to provide the tools necessary for authentic learning as courses’ and programs’ learning objectives become increasingly skill-based, such as statistical modeling or writing code in programming languages. In these situations an LMS can only provide content, discussion, and knowledge assessment, which are distant proxies for the verified skills gained by using the tools of the trade.

In response, they proposed and tested a method for capturing a wider variety of data outside the LMS. As part of a face-to-face system architecture class, they posted a pre-configured virtual machine that contained all the tools students would need to complete assignments, along with a mechanism to gather and report activity data for analysis. Students were asked to opt in to downloading and using the virtual machine for their coursework.
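
Neither this post nor the quoted passages spell out how the virtual machine actually logged activity, so the following is only a minimal sketch of the kind of client-side event logger such an image might bundle. The event format, event names, and file location here are illustrative assumptions, not the authors’ implementation:

```python
import json
import time
from pathlib import Path

# Hypothetical local event log; the study's actual format and
# collection mechanism are not described in this post.
LOG_FILE = Path.home() / ".course_vm_events.jsonl"

def record_event(event_type, detail):
    """Append one timestamped event (e.g. a URL opened in the browser
    or a compiler invocation) to a local JSON-lines log."""
    event = {
        "timestamp": time.time(),
        "type": event_type,   # e.g. "url_open", "tool_launch"
        "detail": detail,     # e.g. the URL or command line
    }
    with LOG_FILE.open("a") as f:
        f.write(json.dumps(event) + "\n")

def summarize(log_file=LOG_FILE):
    """Tally events by type, the kind of aggregate the paper reports."""
    counts = {}
    with log_file.open() as f:
        for line in f:
            event = json.loads(line)
            counts[event["type"]] = counts.get(event["type"], 0) + 1
    return counts

# Example: record a browser visit to the LMS, then tally the log.
record_event("url_open", "https://lms.example.edu/course/architecture")
print(summarize())
```

Counts aggregated from logs along these lines are presumably what yield the event totals quoted below.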

The results were quite extraordinary.

“The virtual machine was made available at the beginning of the course. Out of the 248 students that signed out for the course, a total of 220 downloaded the machine (88.71%). Out of the remaining 28 students (11.29%), most of them opted to use their own configured environment. The large percentage of students that decided to use the machine shows its acceptance as the course tool” (Pardo and Kloos, p. 166).

“Out of the almost 49,000 events, 15,507 (32.07%) were events in which a URL was opened with the browser. When counting the number of unique URLs, this number falls down to 8,669. Out of these, only 2,471 (28.51%) pointed to the LMS. An initial interpretation (pending a more thorough analysis) seems to suggest that students interact with a large number of resources that are outside of the LMS” (Pardo and Kloos, p. 166).

So, by the authors’ figures, only about 5% of the nearly 49,000 events recorded by the virtual machines (2,471 LMS-pointing URLs set against all recorded events) involved students accessing the LMS as part of their coursework. Clearly, for this face-to-face course, LMS-centered learning analytics would paint a severely impoverished picture of student learning. It is also remarkable how high the acceptance rate was (88.71%) for the virtual machine that enabled the analytics. This method surely has promise for providing robust analytics on student learning to students, instructors, and administrators in face-to-face, blended, and fully online classes.
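
For anyone who wants to trace where these percentages come from, here is a quick back-of-the-envelope recomputation using only the figures quoted above. Note that the 49,000 total is approximate (“almost 49,000”), so ratios against it are rough, and the last line compares unique LMS URLs to total events, as in the estimate above:

```python
# Figures reported by Pardo and Kloos (2011), as quoted above.
enrolled, downloads = 248, 220
total_events = 49_000      # "almost 49,000" -- approximate
url_open_events = 15_507   # events in which a URL was opened
unique_urls = 8_669        # distinct URLs among those events
lms_urls = 2_471           # distinct URLs pointing to the LMS

print(f"VM adoption:              {downloads / enrolled:.2%}")          # ~88.71%
print(f"URL opens among events:   {url_open_events / total_events:.2%}")  # ~32%
print(f"LMS share of unique URLs: {lms_urls / unique_urls:.2%}")          # ~28.5%
print(f"LMS URLs vs. all events:  {lms_urls / total_events:.2%}")         # ~5%
```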


Responses

  1. This is great stuff, Michael. Thanks for bringing this research, as well as your own thoughts, to our attention.
    Keith

    • Thanks Keith. Appreciate the reach out. Enjoying your thoughts as well.
      Michael

  2. […] There’s growing interest in using educational technology to create meaningful learning analytics. A tricky, but crucial area of work. Michael Atkinson: (Not) Learning Analytics? The Relevance of LMS Behavioral Data […]

