Learning analytics and student retention

Do your eyes glaze over when learning analytics is mentioned? Recently I have noticed that it is being discussed more broadly around the campus. There are a number of reasons for this, possibly including its mention in the Green Paper, but one of them, I would like to think, is a project some colleagues and I are working on.

So what is learning analytics? Essentially, it is the use of data to improve learning. iLearn (Moodle) and other learning management systems (LMS) log student interactions, such as clicks on resources, forum posts read, and time spent online. Learning analytics aims to analyse data like this to help the teacher and student improve the learning experience.
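
To make this concrete, here is a minimal sketch in Python (with made-up event names, not iLearn/Moodle's actual log schema) of the kind of aggregation learning analytics starts from: turning raw LMS log events into per-student activity summaries.

```python
from collections import defaultdict

# Hypothetical log events; real iLearn/Moodle logs record similar fields.
events = [
    {"student": "s1", "action": "viewed_resource", "minutes_online": 5},
    {"student": "s1", "action": "read_forum_post", "minutes_online": 3},
    {"student": "s2", "action": "viewed_resource", "minutes_online": 2},
]

# Aggregate the raw events into one activity summary per student.
summary = defaultdict(lambda: {"clicks": 0, "forum_reads": 0, "minutes": 0})
for event in events:
    s = summary[event["student"]]
    s["clicks"] += 1
    s["forum_reads"] += event["action"] == "read_forum_post"
    s["minutes"] += event["minutes_online"]

print(dict(summary))
# {'s1': {'clicks': 2, 'forum_reads': 1, 'minutes': 8}, 's2': {...}}
```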

One hoped-for outcome of using learning analytics is improved student retention rates. Some research (Arnold, 2010; Macfadyen & Dawson, 2010) suggests that learning analytics can help teachers identify students who are at risk of becoming disengaged from their course and consequently failing or not completing their unit.

There is some evidence that indicators such as forum participation and assessment completion have predictive value for a student’s final grade (Macfadyen & Dawson, 2010; Smith, Lange, & Huston, 2012). Purdue University has developed a traffic-light interface that uses student activity in the LMS and has successfully increased student completion in its units (Arnold, 2010; Mattingly, 2012). A similar interface, the Moodle Engagement Analytics Plugin (MEAP), exists for Moodle.
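
As a toy illustration of what “predictive value” means here (the data and model below are assumptions, not the methods used in the cited studies), a simple model can relate indicators such as forum posts and completed assessments to whether a student passed.

```python
# A toy illustration (invented data) of giving engagement indicators
# predictive value for the final outcome.
from sklearn.linear_model import LogisticRegression

# Each row: [forum posts made, assessments completed]; label: 1 = passed.
X = [[12, 4], [3, 1], [8, 3], [0, 0], [15, 4], [1, 2]]
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# Probability that a student with 5 posts and 2 completed assessments passes.
print(model.predict_proba([[5, 2]])[0][1])
```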

The MEAP allows the teacher to select from a number of different indicators, together with their own benchmark settings, to calculate a risk score for students in their unit. The indicators are: how late an assessment is; activity in forums; and the amount of time a student spends online in iLearn. The teaching and delivery grant project that my colleagues Professor Deborah Richards, Dr Danny Liu, Amara Atif and I are working on seeks to build on the existing MEAP module.
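
As a rough sketch of the idea (not MEAP’s actual code; the indicator names, weights and benchmarks below are illustrative only), a risk score can be thought of as a weighted measure of how far a student falls short of the teacher’s benchmarks.

```python
# A simplified, illustrative indicator-plus-benchmark risk score.
def risk_score(student, benchmarks, weights):
    """Return a 0-1 risk score: 1 means every indicator misses its benchmark."""
    risks = {
        # Fraction of the allowed lateness used up (capped at 1).
        "lateness": min(student["days_late"] / benchmarks["max_days_late"], 1),
        # Shortfall against the expected number of forum posts.
        "forum": max(1 - student["forum_posts"] / benchmarks["min_forum_posts"], 0),
        # Shortfall against the expected time online in the LMS.
        "login": max(1 - student["minutes_online"] / benchmarks["min_minutes"], 0),
    }
    total = sum(weights.values())
    return sum(weights[k] * risks[k] for k in risks) / total

benchmarks = {"max_days_late": 7, "min_forum_posts": 5, "min_minutes": 120}
weights = {"lateness": 0.4, "forum": 0.3, "login": 0.3}
student = {"days_late": 3, "forum_posts": 1, "minutes_online": 60}
print(risk_score(student, benchmarks, weights))  # approx. 0.56
```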

The Project

Initially, we validated the MEAP module by comparing the risk scores it calculated with historical student data. This preliminary research showed that, in specific contexts, MEAP can identify students who are at risk of not completing their unit (Liu, Froissard, Richards, & Atif, 2015). Some teachers will trial the plugin with their classes, and we will be collecting data from both teachers and students about their experiences at the end of the semester.
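
By way of illustration (this is not the analysis reported in Liu et al., 2015, and the numbers are invented), a validation of this kind asks how well the calculated risk scores line up with what actually happened to those students.

```python
# Compare risk scores computed from historical activity with whether those
# students went on to complete the unit (invented figures).
historical = [  # (risk score calculated by the plugin, completed the unit?)
    (0.9, False), (0.7, False), (0.6, True),
    (0.3, True), (0.2, True), (0.1, True),
]

flagged = [completed for risk, completed in historical if risk >= 0.5]

# Of the students flagged as high risk, how many actually did not complete?
precision = sum(not completed for completed in flagged) / len(flagged)
print(f"{precision:.0%} of flagged students did not complete")  # 67%
```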

[Image: Desktop and data. Photo: Chris Froissard]

Next, we added the ability to use grade items from the gradebook in MEAP; we call this MEAP+. This means the teacher can not only select from the existing three indicators but can also use the marks students receive on assessments to calculate a risk score. For example, if a teacher believes that passing the weekly quizzes and participating in the forums indicates that students will pass the unit, they can have MEAP+ calculate risk scores for all their students based on these activities.
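
A rough sketch of that extension, again with invented names and marks rather than MEAP+’s real implementation, might look like the following, with the result feeding into the overall risk score alongside the activity indicators.

```python
# Illustrative only: grade items from the gradebook (e.g. weekly quizzes)
# join the activity indicators in the risk calculation.
def gradebook_risk(student_grades, grade_benchmarks):
    """Average shortfall across the grade items the teacher selected."""
    shortfalls = []
    for item, pass_mark in grade_benchmarks.items():
        mark = student_grades.get(item, 0)
        shortfalls.append(max(1 - mark / pass_mark, 0))
    return sum(shortfalls) / len(shortfalls)

grade_benchmarks = {"quiz_week_1": 5, "quiz_week_2": 5}  # pass marks out of 10
student_grades = {"quiz_week_1": 6, "quiz_week_2": 2}
print(gradebook_risk(student_grades, grade_benchmarks))  # 0.3
```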

[Screenshot: MEAP+]

The teacher can then be notified of students who do not meet the benchmarks they have set; these students receive a “high” risk score. The teacher may then opt to offer support to those students to help them address these issues.

But what happens next?

The approach we are following is to help teachers send targeted emails customised to these students at risk. The student early alert messaging system (SEAMS) will be integrated with MEAP+ and will give the teacher access to relevant information about students at risk (how many posts they have made, how late they have been in submitting assessments, etc.) when composing an email response. The system will also suggest the type of email message they may want to send for the specific issues identified. A message log will be generated so that teachers have a record of the emails they have sent. Feedback on a prototype, tested with unit convenors and student support officers, is invaluable in ensuring that the interface is as intuitive and effective as possible.
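
The sketch below illustrates the general idea with an invented template and field names (it is not SEAMS code): a suggested message is filled in with the student’s own at-risk details, and each message sent is added to a log.

```python
from datetime import datetime

# Invented template; in practice the suggested wording would depend on the
# specific issues identified for the student.
TEMPLATE = (
    "Hi {name},\n"
    "I noticed you have made {forum_posts} forum posts so far and your last "
    "assessment was {days_late} days late. Can I help with anything?"
)

message_log = []

def draft_and_log(student):
    """Fill the suggested message with the student's details and log it."""
    message = TEMPLATE.format(**student)
    message_log.append({"to": student["email"], "sent": datetime.now(), "body": message})
    return message

print(draft_and_log({"name": "Sam", "email": "sam@example.edu",
                     "forum_posts": 1, "days_late": 4}))
```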

[Screenshot: SEAMS]

We will be testing both MEAP+ and SEAMS in Session 2 as part of a pilot to see how effective this approach is at reaching students at risk and offering help, so that teachers can help more students complete their unit.

If you are interested in this project or would like to participate in the Session 2 trial, please contact me: chris.froissard@mq.edu.au

References

Arnold, K. E. (2010). Signals: Applying Academic Analytics. EDUCAUSE Quarterly, 33(1).

Liu, D.Y.T., Froissard, J.-C., Richards, D., & Atif, A. (2015). Validating the Effectiveness of the Moodle Engagement Analytics Plugin to Predict Student Academic Performance. 2015 Americas Conference on Information Systems, August 13-15. (Accepted).

Macfadyen, L. P., & Dawson, S. (2010). Mining LMS Data to Develop an “Early Warning System” for Educators: A Proof of Concept. Computers & Education, 54(2), 588-599.

Mattingly, R. A. B. (2012). Learning Analytics as a Tool for Closing the Assessment Loop in Higher Education. Knowledge Management & E-Learning: An International Journal, 4(3), 236-247.

Smith, V. C., Lange, A., & Huston, D. R. (2012). Predictive Modeling to Forecast Student Outcomes and Drive Effective Interventions in Online Community College Courses. Journal of Asynchronous Learning Networks, 16(3), 51-61.