The Course Signals team presented at the Second International Conference on Learning Analytics and Knowledge (#LAK12) a few weeks ago in Vancouver, BC. They showed some very impressive results that secure their position at the head of the pack in terms of practical impact on students’ lives. They may also have found an impact from their technology that could disrupt college admission practices in a significant way (see the “A Surprising Result” heading below).

Course Signals is a student risk identification system designed by John Campbell and others at Purdue that enables faculty to intervene systematically during a class in proportion to each student’s level of risk. It facilitates three primary activities:

  • Faculty/student engagement
  • Student resource engagement
  • Learner performance analysis

It uses a predictive student success algorithm (SSA) to calculate a student’s level of risk based on four main factors:

  • Performance: “measured by percentage of points earned in course to date”
  • Effort: “as defined by interaction with Blackboard Vista, Purdue’s LMS, as compared to students’ peers”
  • Prior academic history: “including academic preparation, high school GPA, and standardized test scores”
  • Student characteristics: “such as residency, age, or credits attempted” (Arnold & Pistilli, 2012, pp. 1–2).
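
As a rough illustration only, here is a minimal sketch of how a weighted risk score over these four factors might be computed. The weights, scaling, and cut points below are hypothetical and are not Purdue’s actual SSA.

```python
# Hypothetical sketch of a Course Signals-style risk signal.
# Factor names mirror Arnold & Pistilli (2012); the weights and
# thresholds are invented for illustration, not Purdue's algorithm.

def risk_signal(performance, effort, prior_history, characteristics,
                weights=(0.45, 0.25, 0.20, 0.10)):
    """Return 'green', 'yellow', or 'red' from four normalized risk inputs.

    Each input is scaled to [0, 1], where 0 = low risk (e.g., a high
    percentage of points earned to date) and 1 = high risk.
    """
    factors = (performance, effort, prior_history, characteristics)
    score = sum(w * f for w, f in zip(weights, factors))
    if score < 0.33:
        return "green"
    if score < 0.66:
        return "yellow"
    return "red"


# Example: a student earning most points but rarely using the LMS.
print(risk_signal(performance=0.2, effort=0.7,
                  prior_history=0.4, characteristics=0.3))  # -> "yellow"
```
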
Signals Interface

The user interactions it facilitates are:

  • “Posting of a traffic signal indicator on a student’s LMS home page
  • E-mail messages or reminders
  • Text messages
  • Referral to academic advisor or academic resource centers
  • Face to face meetings with the instructor” (2012, p. 2).

In the workflow, before a course starts the instructor sets up how often the SSA should be run (e.g., once a week) and then creates the feedback (referrals to academic resource centers, to an academic advisor, to a meeting with the instructor, etc.) that students should receive for each level of risk on each run of the SSA. The instructor is then in high-touch communication with students, delivering individualized feedback with minimal effort. And it seems to work: “Most students perceive the computer-generated e-mails and warnings as personal communication between themselves and their instructor” (2012, p. 3).
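
To make the workflow concrete, here is a minimal sketch, under my own assumptions about the data involved (this is not the actual Course Signals implementation), of how pre-authored feedback might be mapped to risk levels and sent on each scheduled SSA run:

```python
# Hypothetical sketch of instructor-configured feedback per risk level.
# The message text and data structures are invented for illustration.

FEEDBACK_BY_SIGNAL = {
    "green": "Nice work so far - keep it up.",
    "yellow": "You may be falling behind; consider visiting the tutoring center.",
    "red": "Please schedule a meeting with me or your academic advisor this week.",
}

def run_ssa_cycle(students, compute_signal, send_email):
    """One scheduled SSA run: score each student and send the matching feedback."""
    for student in students:
        signal = compute_signal(student)  # e.g., a function like risk_signal above
        send_email(student["email"], FEEDBACK_BY_SIGNAL[signal])

# The instructor would schedule run_ssa_cycle to fire, say, once a week,
# with the feedback messages authored before the course starts.
```
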

Results

Course Signals’ first pilot was in 2007, so they have a significant amount of performance and evaluation data, putting them a few years ahead of any other system out there.

Retention:

It is notable that the more Course Signals classes students take, the more likely they are to be retained in subsequent years of schooling. For example, in the 2007 cohort, for students who took two or more Course Signals classes in any given year from year 1 to year 4, the percentage retained from one year to the next was mostly in the high 90s. In contrast, the percentage of students from the same cohort retained from year to year with no exposure to Course Signals classes ranged from the high 60s to the low 80s (2012, p. 3). See the tables below.

Retention Rate for the 2007 Entering Cohort (2012, p. 3)

Retention Rate for the 2008 Entering Cohort (2012, p. 3)

Retention Rate for the 2009 Entering Cohort (2012, p. 3)

Analysis of Retention by Semester of Course Signals Use for the 2007 Entering Cohort (2012, p. 3)

Analysis of Retention by Semester of Course Signals Use for the 2008 Entering Cohort (2012, p. 3)

Analysis of Retention by Semester of Course Signals Use for the 2009 Entering Cohort (2012, p. 3)

Grades:

“Combining the results of all courses using CS [Course Signals] in a given semester, there is a 10.37 percentage point increase in As and Bs awarded between CS users and previous semesters of the same courses not using CS. Along the same lines, there is a 6.41 percentage point decrease in Ds, Fs, and withdrawals awarded to CS users as compared to previous semesters of the same courses not using CS” (2012, p. 2).

Course Signals Grade Results

A Surprising Result:

“While this aspect needs to be further investigated, early indications show that lesser-prepared students, with the addition of CS to difficult courses, are faring better with academic success and retention to Purdue than their better-prepared peers in courses not utilizing Course Signals” (2012, p. 3).

Conclusion

Course Signals at Purdue is doing great work that is having a measurable and practical impact on the lives of students. Nevertheless, much work remains to be done. After their presentation at LAK12, I asked Arnold and Pistilli what was being done to measure learning and the effect of effective instructional and learning practices, rather than just grades and retention. They recognized that the learning nut had not been cracked, but noted that there was a program at Purdue to transition classes to more effective learning practices beyond direct instruction. Even so, metrics of the learning that results from these innovative practices had not yet been developed.

The surprising result, that less prepared students who took Signals-enabled classes performed better than better prepared students who did not, is potentially a paradigm-shattering finding. If students with lower entrance qualifications can perform at a much higher level and have a much higher likelihood of graduation as a result of a single technology, why would a college not be able to admit a broader range of students? How much better would the well-prepared students do with the technology as well? What could this mean for graduate education? Could entrance standards be lowered across the board, all while student performance increases?

Course Signals seems to have enabled mass replication of the support Vygotsky associated with the zone of proximal development (ZPD), allowing many students at once to work within their ZPD and raise their actual developmental level with minimal instructor resources. Is the Holy Grail of Ed Tech within reach? It will be interesting to see how well this effect endures across other institutions that use SunGard’s licensed version of Course Signals, or when the outcomes are competency- or learning-outcome-based rather than grade- or retention-based. It will also be interesting to see whether the same effect is observed (once they have the data) in the similar technologies presented at LAK12, including the U of M’s e2Coach, D2L’s Student Success System (which had the best design), Sherpa, and GLASS.

References

Arnold, K. E., & Pistilli, M. D. (2012). Course Signals at Purdue: Using learning analytics to increase student success. In Proceedings of the Second International Conference on Learning Analytics and Knowledge (LAK12). Vancouver, BC, Canada: ACM.

“WHAT WILL THE GAME-CHANGING TOOL KIT LOOK LIKE for next-generation learning? How can institutions prepare to meet the increasing demands?” (Baer & Campbell, 2012, p. 53). Those are the questions that begin the chapter “From Metrics to Analytics, Reporting to Action: Analytics’ Role in Changing the Learning Environment,” by Linda Baer, formerly of the Gates Foundation, and John Campbell, father of Purdue’s Signals. I was thinking that the chapter was going to focus on tool kit design for academic analytics. Instead, the focus was on the human side of the socio-technical infrastructure required to have effective learning analytics at your institution. Here are some highlights:

Learning Relationship Management: An Academic Analytics Socio-Technical Infrastructure

Actionable: “The goal for analytics must remain “actionable intelligence,” and as such, the capacity for analytics must go beyond data and statistics and focus on how the information must be utilized” (2012, p. 61).

Aligned: “In order to dramatically improve student outcomes, technology must be fully aligned with educational objectives, standards, curricula, assessments, interventions, and professional development” (2012, p. 55).

Embedded Culture: “Leaders need to create an institutional culture to use analytics tools to maximize the potential for improved student access, student learning, progression, and success” (2012, p. 59).

Shared Vision: “The role that analytics can play within the learning environment will largely depend on the institution’s vision of the next-generation learning environment” (2012, p. 59).

Championed: “…the decision to move forward with analytics depends on knowledgeable champions among senior administrators” (2012, p. 58).

Coalitional: “Building the appropriate models requires staff with statistics and educational research backgrounds. Creating interventions requires domain knowledge (e.g., advising, retention) and advising/counseling skills. For institutions to be successful in academic analytics projects, IT leaders [and other leaders involved] must build a coalition of people” (2012, p. 58).

Conclusion

The authors close by making a comparison to CRMs and proposing a Learning Relationship Management tool and infrastructure. Any way you look at it, most institutions have a lot of work to do before academic analytics offices inside academic institutions reach this level of sophistication. Hopefully institutions’ notions of academic analytics will move away from being a new way of saying “reporting results” and toward a systemic detection and intervention management system that is easily understood by those participating in learning and in its facilitation.

References

Baer, L., & Campbell, J. (2012). From metrics to analytics, reporting to action: Analytics’ role in changing the learning environment. In Game Changers: Education and Information Technologies (pp. 53–65). EDUCAUSE.

Posted by: Michael Atkisson | March 28, 2012

Runmycode lets scientists crowdsource code testing

Runmycode lets scientists crowdsource code testing.

Posted by: Michael Atkisson | March 21, 2012

SunGard’s Course Signals Has Some Competition: Degree Compass

University builds ‘course recommendation engine’ to steer students toward completion | Inside Higher Ed.

Austin Peay State’s Degree Compass enters the club in competition with Purdue’s (now SunGard’s) Course Signals. But supposedly, Degree Compass makes its predictions before class even begins. I am not sure what use that is, because who is going to give students customized recommendations for interventions (as is done in Course Signals) when class has not yet begun? Maybe it is just the characterization of the article’s author, but Degree Compass sounds a little like prediction for prediction’s sake. The point of predictive analytics in higher education is tailored education and well-timed interventions. Some interesting questions: does completion indeed increase on average, and if so, for whom, and at the expense of what?

Posted by: Michael Atkisson | February 28, 2012

Further dismantling of Higher Education’s Hold on Certified Learning

I thought this article was a great example of the dismantling of Higher Education’s Hold on Certified Learning.

Carnegie Mellon Developing Vendor-Neutral Cert Exam for Kenyan Coders — Campus Technology.

Does remediation need remediation? One teacher’s experience with Khan content.

“What I find frustrating is that the over-emphasis on the value of the data [Khan has] collected. I used the Khan Academy with my students (mixed in with a problem solving and project based approach) at the beginning of this year, and within 6 weeks I abandoned it (after two assessments I’d given to students external to the program) because I found no relationship between what my weakest students had ‘mastered’ according to the types of questions the Khan Academy provides, and the students ability to use their calculations in any other context. They could do 30 problems in a row flawless related to the rules of exponents, for example, but not solve any problem directly related to the rules of exponents that they hadn’t seen on the Khan Academy videos or exercises. The transferability of what my weakest had learned was extremely low. By contrast, the approach absolutely worked fine for my stronger mathematics students, who were able not only to transfer what they had learned, but to work at a pace I could not hope to match.”

via The “Mathlash” To Silicon Valley’s Move Into Education | Co.Exist: World changing ideas and innovation.

 

Happy Valentine’s Day! Great review of pedagogy theory in distance education cited in Dragan Gasevic’s LAK12 MOOC Week 4 presentation.

Anderson, T., & Dron, J. (2011). Three generations of distance education pedagogy. The International Review of Research in Open and Distance Learning, 12(3), 80-97.

Three generations of distance education pedagogy | Anderson | The International Review of Research in Open and Distance Learning.

Posted by: Michael Atkisson | February 10, 2012

(Not) Learning Analytics? The Relevance of LMS Behavioral Data

Pardo and Kloos (2011) recognized that learning analytics has limited ability to illuminate learning when the data it analyzes is restricted to student behavioral data from an LMS, particularly for face-to-face classes.

“Although the latest LMSs offer an increasing set of features, students are beginning to reach the educational institutions with solid experience on how to interact with their peers in ways that are not covered by the LMS. The main consequence is that a significant part of the interaction during a learning experience is beginning to take part outside of the LMS in what it could be called a “de-centralized approach” (Pardo and Kloos, p. 164).

Another problem they recognized is the inability of LMS-centric learning environments to provide the tools necessary for authentic learning as courses’ and programs’ learning objectives become increasingly skill-based, such as statistical modeling or writing code in programming languages. LMSs in these situations can only provide content, discussion, and knowledge assessment, which are distant proxies for verified skills gained by using the tools of the trade.

In response, they proposed and tested a method for capturing a wider variety of data outside the LMS. As part of a face-to-face system architecture class, they posted a pre-configured virtual machine that contained all the tools students would need to complete assignments and a mechanism to gather and report data for analysis. Students were asked to opt in to downloading and using the virtual machine for coursework.
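
The paper’s collection mechanism is not detailed here, but conceptually the virtual machine’s reporting hook might resemble the following sketch. The event format, file location, and logging approach are my assumptions, not Pardo and Kloos’s implementation.

```python
# Hypothetical sketch of the kind of event capture such a VM might perform.
import json
import time
from pathlib import Path

LOG_FILE = Path.home() / ".course_events.jsonl"  # assumed location inside the VM

def log_event(event_type, detail):
    """Append one timestamped event record (e.g., a URL opened in the browser)."""
    record = {"ts": time.time(), "type": event_type, "detail": detail}
    with LOG_FILE.open("a") as f:
        f.write(json.dumps(record) + "\n")

# Example: a browser wrapper inside the VM could call
#   log_event("url_open", "https://lms.example.edu/course/123")
# and the accumulated log would later be gathered for analysis.
```
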

The results were quite extraordinary.

“The virtual machine was made available at the beginning of the course. Out of the 248 students that signed out for the course, a total of 220 downloaded the machine (88.71%). Out of the remaining 28 students (11.29%), most of them opted to use their own configured environment. The large percentage of students that decided to use the machine shows its acceptance as the course tool” (Pardo and Kloos, p. 166).

“Out of the almost 49,000 events, 15,507 (32.07%) were events in which a URL was opened with the browser. When counting the number of unique URLs, this number falls down to 8,669. Out of these, only 2,471 (28.51%) pointed to the LMS. An initial interpretation (pending a more thorough analysis) seems to suggest that students interact with a large number of resources that are outside of the LMS” (Pardo and Kloos, p. 166).

So, of the unique URLs students opened in the virtual machines, only about 28.5% (2,471 of 8,669) pointed to the LMS, and URL-opening as a whole accounted for less than a third of the roughly 49,000 recorded events. Clearly, for this face-to-face course, LMS-centered learning analytics would paint a severely impoverished picture of student learning. It is also remarkable how high the acceptance rate of the virtual machines that enabled the analytics was: roughly 89% (220 of 248 students). This method surely has promise for providing robust analytics to students, instructors, and administrators on student learning in face-to-face, blended, and fully online classes.
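
For reference, the proportions above follow directly from the counts reported in the paper:

```python
# Proportions derived from the counts Pardo and Kloos report.
downloads, enrolled = 220, 248            # students who downloaded the VM
url_events, total_events = 15507, 49000   # "almost 49,000 events"
lms_urls, unique_urls = 2471, 8669        # unique URLs pointing to the LMS

print(f"VM acceptance rate: {downloads / enrolled:.1%}")                  # ~88.7%
print(f"URL-open share of all events: {url_events / total_events:.1%}")   # ~31.6%
print(f"LMS share of unique URLs: {lms_urls / unique_urls:.1%}")          # ~28.5%
```
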

Posted by: Michael Atkisson | January 31, 2012

Course Signals for Scaled Open Learning: What Do You Tie It to?

In a Google Hangout with David Wiley today, we came upon the question: what would SunGard’s Course Signals look like for MOOCs or other scaled open learning formats?

Signals is tied to grade data, but what about learning formats where you still want predictive analytics without tying them to grades? In Sebastian Thrun and Peter Norvig’s scaled Stanford engineering classes, the metric was competency based and proxied by multiple-choice tests. In a MOOC, what would you tie your multilevel structural equation models to?

Then we thought maybe we could tie the MOOC behavioral data to survey data from participants. We could see to what degree individuals are getting what they wanted out of the MOOC, and for what reasons. Then we could see which groupings or factors with latent indicators represented in the survey responses are confirmed, contradicted, etc. by those individuals’ types and levels of participation in the MOOC. There we go: a signals algorithm tied to satisfaction, efficacy, and the like.
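
As a rough sketch of what that might look like in practice, assuming a participant-level table that merges survey items with participation metrics pulled from MOOC logs (the file, column names, and two-factor model are my own invention):

```python
# Hypothetical sketch: relate latent survey factors to MOOC participation.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Assumed input: one row per participant, with Likert-scale survey items
# and simple participation metrics extracted from the MOOC's logs.
df = pd.read_csv("mooc_survey_and_logs.csv")  # hypothetical file
survey_items = ["satisfaction_1", "satisfaction_2", "efficacy_1", "efficacy_2"]

# Extract latent factors (e.g., satisfaction, efficacy) from the survey items.
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(df[survey_items])
df["satisfaction_factor"] = scores[:, 0]
df["efficacy_factor"] = scores[:, 1]

# See where the latent factors are confirmed or contradicted by what
# participants actually did in the MOOC.
print(df[["satisfaction_factor", "efficacy_factor",
          "forum_posts", "videos_watched"]].corr())
```
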

Posted by: Michael Atkisson | December 29, 2011

Learning Analytics: Ascilite 2011 Keynote

Here is a presentation on learning analytics from Simon Buckingham Shum, part of the Open University UK’s Knowledge Media Institute.

Learning Analytics: Ascilite 2011 Keynote.
