How successful students use LMS tools – confirming our hunches

By John Whitmer, Ed.D. (Blackboard)

Within Blackboard’s data science group, we have an active practice researching how our technologies impact student achievement. We do this research to improve our products, of course, but also to share our findings and insights with the broader education community.  Today, I’d like to share some recent research insights about patterns in tool use and learning outcomes, and how these differ by the learning tool used – in some ways that you might expect and in some that might surprise you.

Learning Analytics researchers have long held that the contexts of learning are critical in making meaningful analyses. With the large data footprint that Blackboard has to analyze, our team is able to look at some of these contexts in depth.

We took a sample of courses from Spring 2016 in North America. All of this data was anonymized at the individual and institutional level; aggregate data was used for analysis. We filtered for course features likely to indicate a substantial student learning experience through Blackboard Learn and to give enough data for robust analysis: between 10 and 500 students, a mean course time of at least 60 minutes, and use of the gradebook. We sampled 70,000 courses from 927 institutions, with 3,374,462 unique learners. After filtering, the resulting data set included 601,544 learners (16.25%) in 18,810 courses (26.87%).
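
To make the filter concrete, here is a minimal sketch in Python/pandas of the kind of selection described above; the table and column names are purely illustrative, not an actual Blackboard schema.

```python
import pandas as pd

# Hypothetical course-level summary table (illustrative columns only).
courses = pd.DataFrame({
    "course_id": ["c1", "c2", "c3", "c4"],
    "enrolled_students": [8, 120, 45, 650],
    "mean_course_minutes": [15.0, 210.5, 95.0, 300.0],
    "uses_gradebook": [True, True, False, True],
})

# Filters described above: 10-500 students, a mean course time of at least
# 60 minutes, and gradebook use.
sample = courses[
    courses["enrolled_students"].between(10, 500)
    & (courses["mean_course_minutes"] >= 60)
    & courses["uses_gradebook"]
]

print(sample)  # only "c2" meets all three criteria in this toy data
```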

Within these courses, we found the following distribution of time spent using the LMS. Most time was spent in Course Content, with less time spent in other tools, though assessments, grades, and announcements each accounted for more than 10% of total use. Some of this usage is not representative of actual activity; assignments, for example, are largely completed outside the LMS. But overall, this distribution provides an overview of how students interact with the LMS that is consistent with what you might expect.
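
As a rough illustration of how a distribution like this can be computed from activity logs, here is a short Python/pandas sketch; the event table and column names are hypothetical, not the actual Learn data model.

```python
import pandas as pd

# Hypothetical activity-log rows: one row per student session per tool.
events = pd.DataFrame({
    "tool": ["Course Content", "Assessments", "MyGrades",
             "Course Content", "Announcements", "Discussion"],
    "minutes": [42.0, 15.5, 3.0, 30.0, 6.5, 12.0],
})

# Share of total LMS time by tool, expressed as a percentage.
time_share = (
    events.groupby("tool")["minutes"].sum()
    .pipe(lambda s: 100 * s / s.sum())
    .sort_values(ascending=False)
    .round(1)
)
print(time_share)
```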

We previously reported a large range of results when looking at the relationship between time spent in the LMS and student grade. To explore this further, we analyzed patterns in tool use, looking specifically at how much time students spent using each tool and how that time related to the student’s grade. For the purpose of analysis, we categorized student grades into High (80+), Medium (60-79), and Low (<60). We also standardized tool use into quartiles describing each student’s use relative to all students in the course: High (75%-100%), Medium (50%-74%), Low (25%-49%), Really Low (0-24%), and None.
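
For illustration, the categorization might look something like the following Python/pandas sketch; the per-student table and column names are invented for the example, and the percentile-rank cut is one plausible way to standardize use within a course, not necessarily the exact procedure we used.

```python
import pandas as pd

# Hypothetical per-student records for one course (illustrative columns).
df = pd.DataFrame({
    "course_id": ["c1"] * 6,
    "student_id": ["s1", "s2", "s3", "s4", "s5", "s6"],
    "final_grade": [92, 71, 55, 88, 63, 40],
    "mygrades_minutes": [40, 12, 0, 25, 5, 0],
})

# Grade bands: High (80+), Medium (60-79), Low (<60).
df["grade_band"] = pd.cut(
    df["final_grade"], bins=[0, 60, 80, 101], right=False,
    labels=["Low", "Medium", "High"],
)

# Tool use relative to other students in the same course: percentile rank,
# cut into quartiles, with zero use kept as its own "None" category.
df["pct_in_course"] = df.groupby("course_id")["mygrades_minutes"].rank(pct=True)
df["mygrades_use"] = pd.cut(
    df["pct_in_course"], bins=[0, 0.25, 0.50, 0.75, 1.0],
    labels=["Really Low", "Low", "Medium", "High"], include_lowest=True,
).astype(str)
df.loc[df["mygrades_minutes"] == 0, "mygrades_use"] = "None"

print(df[["student_id", "grade_band", "mygrades_use"]])
```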

Conceptually, it’s important to clarify that we don’t interpret any relationship as causal; using academic technology doesn’t improve learning, any more than picking up a pencil (or typing on a keyboard) makes for a good essay. But it’s hard to write that essay without, well, writing. This use data, interpreted appropriately, can provide interesting predictors and indicators of underlying learning behaviors that we can use to improve student achievement.

Before going further, I hope you’ll indulge me in a thought experiment. I promise it’ll make the following results more interesting.

  1. Which tools were the strongest predictors (positive or negative) of student learning?
  2. What patterns do you expect to see between tool use and grade?
  3. Are these patterns different by tool, or are they consistent across tools?

Finished your thought experiment? Or perhaps you already know these results for your campus? Here’s what we found in our analysis.

The most important tools in predicting student grade were:

  • MyGrades
  • Course Content
  • Assessments
  • Assignments
  • Discussion

MyGrades (students looking at their grades)

The most consistent predictor of student achievement was how frequently a student looked at their grades; this surprised me given that other tools (like assessments) directly and tangibly influence a student’s grade. This is an independent behavioral measure and yet is a very strong predictor.

The most successful students are those who access MyGrades most frequently; students doing poorly do not access their grades. Students who never access their grades are more likely to fail than students who access them at least once. There is a direct relationship at every quartile of use – and at the risk of spoiling results for the other tools, this is the only tool for which this direct trend exists. It appears that students in the middle range of grades aren’t impacted by their use of the tool.

This finding suggests that to change student outcomes, we need to use proactive intervention strategies. Examples include grade notifications within the Activity Stream of Learn Ultra, or the alerts provided by Bb Predict.

Course Content (text/multimedia online pages or linked files)

Taken as a whole, Course Content is the Bb Learn tool in which students spend the most time. The relationship between time spent accessing content and grade is very clear between no use and some use; after initial access, however, additional access brings only a very small increase. An interesting result was that beyond the median, additional access is associated with a decline in student grade; students spending more than the average amount of time are actually less likely to achieve a higher grade!

This could be due to multiple underlying dynamics: some successful students only need to access content once; they either effectively learn it or file the resource outside of Learn (e.g. saving a PDF on their local hard drive) so that they can reference it later. Further, increased access could be caused by students cramming right before an exam or another deadline.

If this interpretation is accurate, faculty could find it helpful to have reports showing the percentage of a class that has accessed content resources. These indicators could prompt faculty to send out reminders to students. We’ve expanded the detailed reports about resources accessed in Learn Ultra and in X-Ray Learning Analytics (available for Moodlerooms). And we’ve also developed algorithms that calculate login regularity, to distinguish regular access from cramming behaviors.
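
The regularity calculation itself isn’t described in this post, but one simple way to capture the idea is to look at how variable the gaps between a student’s logins are. The sketch below uses the coefficient of variation of inter-login gaps as an assumed, illustrative metric; it is not the algorithm that actually ships in our products.

```python
from datetime import datetime
import statistics

def login_regularity(logins: list[datetime]) -> float:
    """Illustrative regularity score: coefficient of variation of the gaps
    between consecutive logins. Lower = more regular access; higher values
    suggest bursty, cram-like behavior. (Assumed metric for illustration.)"""
    times = sorted(logins)
    gaps_hours = [(b - a).total_seconds() / 3600 for a, b in zip(times, times[1:])]
    if len(gaps_hours) < 2:
        return float("inf")  # too few logins to judge regularity
    mean_gap = statistics.mean(gaps_hours)
    return statistics.stdev(gaps_hours) / mean_gap if mean_gap else float("inf")

# A student who logs in every other day vs. one who crams before a deadline.
regular = [datetime(2016, 3, day) for day in (1, 3, 5, 7, 9, 11)]
crammer = [datetime(2016, 3, 1, 9), datetime(2016, 3, 14, 20),
           datetime(2016, 3, 14, 22), datetime(2016, 3, 15, 1),
           datetime(2016, 3, 15, 3)]
print(login_regularity(regular))   # ~0.0 -> very regular access
print(login_regularity(crammer))   # much larger -> bursty access
```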

Assessments (online quizzes/tests) & Assignments (submitted electronic resources, e.g. homework)

If students don’t complete quizzes or submit assignments for a course, they have lower grades than those who do so. This was not a surprising finding. What was surprising to me is that this wasn’t the strongest predictor of a student’s grade.

The second interesting finding in this area was the strong decline in grade for students who spend more than the average amount of time taking assessments. This is an intuitive result: students who have mastered course material can answer questions quickly, while those who ponder over questions are more likely to be struggling with the material. The relationship is stronger for assessments than for assignments because the assessment tool captures all of the time spent on the assessment, whereas the assignment tool doesn’t capture the offline time spent creating the material that is submitted. Regardless, this trend, in which spending roughly the average amount of time is the most frequent behavior of successful students, is consistent across both tools, and it is a markedly different relationship than the one found in the other tools.

Conclusions

By looking at tool use, we have uncovered patterns that help explain our prior findings about the weak relationship between overall LMS use and student achievement. These patterns are consistent, or at least justifiable, within what we believe to be true about how students “do” education and how they interact with gradebooks and assessments. They are important to consider when creating analytics and interpreting the results of usage reports: useful interpretation requires more than simply looking at more or less use; it also requires considering the contexts and specific practices a student is engaged in. The other important consideration is the design of a course and which tools matter within it; but that’s a blog post for another day!

If you’re conducting research into how people are using Learn, and would like help interpreting results or feedback about how your findings might scale, I’d love to hear from you. You can reach me at [email protected], on Skype at john.whitmer, or on Twitter at johncwhitmer.

This post was originally published on the Blackboard blog on September 7, 2016.