A sense of achievement and insight is common among educators who see, for the first time, a visual dashboard populated with real data about their students’ behavior and performance. It might reignite a teacher’s interest in trying new things: getting students to spend more time on the LMS, complete more activities, and achieve higher scores. But after a while of tracking behavior, a second feeling begins to take over, as gaps between what the data shows and students’ actual proficiency start to appear.
At this point, people are quick to blame the indicators, and by extension the technology, as if they were “measuring the wrong thing.” In reality, it is only natural that numbers alone cannot tell the complete story about every student.
Take a look at these examples of well-intentioned, but potentially skewed, analytics goals.
LMS activity: It’s reasonable to think that students who spend more time on the site will perform better, and this is true up to a point. Beyond it, the productivity of an extra hour spent online plummets.
Content completion: Students taking “too long” or “too little time” to finish an activity could be an insightful tip for upcoming instructional design, as could large variances between students on the same activity.
Not dropping out (soon enough): A possibly controversial statistic, but one that highlights a problem that might not be addressed sufficiently in higher ed: early academic shortfall could be a signal of vocational issues, which call for quick intervention through coaching or orientation.
Efficiency: At some point, data should start to shed light on the differences in learning methods and habits of each student. In this sense, the ideal scenario for analytics is to empower students in their own methods.
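To make one of these signals concrete, here is a minimal sketch of how the “large variance in completion time” idea from the content-completion point could be turned into a flag. All the student names, times, and the z-score threshold below are invented for illustration; this is not a built-in Moodle report, just one simple statistical approach an analyst might try.

```python
from statistics import mean, stdev

# Hypothetical completion times (in minutes) for one LMS activity.
# These values are invented for illustration only.
completion_minutes = {
    "student_a": 12, "student_b": 15, "student_c": 14,
    "student_d": 45, "student_e": 13, "student_f": 4,
}

def flag_outliers(times, z_threshold=1.5):
    """Return students whose completion time deviates strongly from the
    group mean, mapped to their z-score (rounded for readability)."""
    mu = mean(times.values())
    sigma = stdev(times.values())
    return {
        student: round((t - mu) / sigma, 2)
        for student, t in times.items()
        if abs(t - mu) / sigma > z_threshold
    }

print(flag_outliers(completion_minutes))
```

A flagged student is not a verdict, only a prompt: the one who took 45 minutes may have struggled with the content, or may simply have left the tab open, which is exactly why such numbers need a human interpretation step.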
This Moodle Practice related post is made possible by: eThink Education, a Certified Moodle Partner that provides a fully-managed Moodle experience including implementation, integration, cloud-hosting, and management services. To learn more about eThink, click here.