I was fortunate enough to be supported by Macquarie University as part of a professional development grant to attend the Learning Analytics and Knowledge Conference in Edinburgh during April (LAK16).
It’s always difficult to summarise in a short article the numerous ideas that researchers, practitioners and technologists presented over the two days of workshops and three days of the conference. However, this is what stood out for me and what I think may be of interest.
What is learning analytics?
First, though, you may not know what learning analytics is or why it should be of interest to you. The Society for Learning Analytics Research (SoLAR) defines learning analytics as the “measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”
Why learning analytics?
The promise of learning analytics is to turn data into information that can be used to improve both the learning and the teaching experience. Researchers in this field look at issues such as which types of data provide the most useful information, how best to present this information so that it is meaningful to both students and teachers, and whether the information can be used for effective interventions. An informative infographic summarising the field is available from OpenColleges [http://www.opencolleges.edu.au/informed/learning-analytics-infographic/].
Learning management systems (LMSs) such as Moodle are ubiquitous. Increasingly, student learning and teaching occur through interactions in the LMS between students, peers, teachers, activities and content. Learning analytics provides educators with insights into these interactions that we can use to adapt and improve our teaching.
Now that you hopefully have a greater understanding of what learning analytics is and why we should be interested, let’s have a look at some ideas from LAK16.
One of the objectives of Macquarie University’s Learning and Teaching Strategic Framework (2015–2020) is a program-based approach to the curriculum. Learning analytics can provide insights into what is happening within programs to support analysis and decision making. At LAK16, Xavier Ochoa’s paper Simple metrics for curricular analytics proposed a set of curriculum metrics, some of which could be useful when looking at programs. They include:
- Course temporal position (CTP), the average academic period (semester or year) in which the course is taken by the students of the program.
- Temporal distance between courses (TDI), how many academic periods, on average, pass between a student taking two different courses. This can be used to establish the actual sequence of courses a student takes.
- Course duration (CDU), the average number of academic periods that students need to pass the course.
- Profile-based metrics, where the student population is divided into groups based on their performance (GPA) and the metrics are then calculated for each group, providing a more useful picture. The profile-based metrics he proposes include: Course approval profile (CAP), the number of students in the group who have passed the course divided by the number who have enrolled; Course performance profile (CPP), the average grade for students in the group; and Course difficulty profile (CDP), calculated by subtracting each student’s GPA from their grade in the course and then averaging over the group.
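To make the profile-based metrics concrete, here is a minimal sketch of how CAP, CPP and CDP might be computed for one group of students enrolled in a course. The record format (student id, GPA, grade, pass flag) and the function names are my own illustrative assumptions, not code from Ochoa’s paper.

```python
# Sketch of the profile-based metrics (CAP, CPP, CDP) for one course,
# assuming students have already been divided into GPA-based groups.
# Each record describes one student in the group, for this course.

def course_approval_profile(records):
    """CAP: students in the group who passed, divided by those enrolled."""
    return sum(1 for r in records if r["passed"]) / len(records)

def course_performance_profile(records):
    """CPP: the average grade for students in the group."""
    return sum(r["grade"] for r in records) / len(records)

def course_difficulty_profile(records):
    """CDP: average of (grade - GPA) over the group. A negative value
    suggests students do worse in this course than they usually do."""
    return sum(r["grade"] - r["gpa"] for r in records) / len(records)

# Hypothetical group of three students enrolled in one course.
group = [
    {"student_id": "s1", "gpa": 3.2, "grade": 3.0, "passed": True},
    {"student_id": "s2", "gpa": 2.8, "grade": 2.5, "passed": True},
    {"student_id": "s3", "gpa": 3.5, "grade": 0.0, "passed": False},
]

print(round(course_approval_profile(group), 2))     # 2 of 3 passed -> 0.67
print(round(course_performance_profile(group), 2))  # 1.83
print(round(course_difficulty_profile(group), 2))   # -1.33
```

In practice these figures would be calculated separately for each GPA group, so a course that is difficult only for weaker students shows up differently from one that is difficult for everyone.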
If you are interested in reading more on learning analytics you can access the full conference proceedings at the ACM Digital library.
What’s the question?
One of my observations was that people were developing dashboards or visualisations of the data without first thinking about the questions they wanted to answer. They had not taken to heart the message of Long and Siemens in their seminal paper Penetrating the Fog: Analytics in Learning and Education (EDUCAUSE Review Online, 2011), where they state that using analytics means thinking carefully about what we need to know, and about what data is most likely to tell us that.
Next week we’ll look at pedagogical issues in learning analytics, the financial implications of an early-alert system, and some more observations from LAK16.