The Association for Learning Technology (ALT) represents technologists, academics, designers, researchers and policy makers from many organisations and sectors across the UK, all with an interest in learning technology. ALT-C is ALT's annual conference, attended by around 500 members, with an impressive attendee-to-presenter ratio of about 2:1.
Like all conferences, there were plenty of chances to interact with practitioners and vendors, discover small projects, and be inspired. In this piece, I wanted to share some insights that arose around data and learning analytics and highlight some of the key platforms and tools on show.
This year’s theme was Data, Dialogue & Doing and my goal was to discover more about what the sector is doing in the space of data and learning analytics. From the many sessions I attended, several related themes arose from the use of data and/or learning analytics – but to start, a quick definition taken from JISC:
“Every time a student interacts with their university – be that going to the library, logging into their virtual learning environment or submitting assessments online – they leave behind a digital footprint. Learning analytics is the process of using this data to improve learning and teaching.”
This definition matters because, as the use of data grows, learning analytics can mean different things to different people. While there is broad consensus, the detail of what analytics means, and how it can be used, is still debated. This wasn't the theme of the conference, but it was evident across many sessions that the field is still emerging on a number of fronts.
The idea of using data to provide a dashboard or traffic-light system to show students 'at risk' is still emerging and immature. There remains a goal (or unfulfilled potential) for learning analytics: providing some level of support for students who may become lost in the system or drop out. The goal of a predictor or indicator is to take data, interpret it with algorithms, and present the result in a human-readable format.
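As a rough illustration of that data-to-indicator pipeline, here is a minimal sketch of a traffic-light indicator. The thresholds and field names (logins per week, submission rate) are hypothetical assumptions for the example, not any institution's actual model:

```python
# Hypothetical traffic-light indicator. Thresholds and input fields are
# illustrative only - real systems would draw on far richer data.

def risk_status(logins_per_week: float, assignments_submitted: int,
                assignments_due: int) -> str:
    """Map raw engagement data to a human-readable traffic-light status."""
    submission_rate = (assignments_submitted / assignments_due
                       if assignments_due else 1.0)
    if logins_per_week < 1 or submission_rate < 0.5:
        return "red"    # likely at risk: flag for human follow-up
    if logins_per_week < 3 or submission_rate < 0.8:
        return "amber"  # worth a check-in
    return "green"      # engagement looks healthy

print(risk_status(logins_per_week=0.5, assignments_submitted=1,
                  assignments_due=4))  # red
```

Even in this toy version, the limits discussed below are visible: the output is only as good as the handful of signals fed in, which is why a human should review any 'red' flag rather than acting on it automatically.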
However, at this stage the data collected is not comprehensive because courses are not consistent. There is no design standard that can be used to model data-driven outputs for fully accurate reporting. The indicators are therefore missing several important factors that would make them reliable for reporting, or for offering any kind of comprehensive window into students' learning or progress.
There was a strong call across sessions to remain critical of our increasingly data-driven world. The use of analytics as a measurement of human performance needs to be scrutinised. Much like an individual who can't take out a loan because the bank's computer has decided 'no', we must be sure that a human can intervene when a learning system flags a learner as at risk. Learning is one of the most complex processes humans undertake, so it is highly risky to draw conclusions from data alone.
There is value, however, in using data to make more informed decisions about which students may be more likely to need support. How the data is used, and the impact it can have, remains an open part of the debate. A clear example came from a session on improving student feedback at scale – using data-informed approaches while staying linked to learning design, educator support and personalised learning:
OnTask is a platform to send customised, personalised, emails to students based on data. The tool creates emails to send in bulk but includes segments of feedback based on logical decisions derived from both the course data and supporting feedback that educators want to send to their students. This may be based on participation, activities, assessments, content consumption, group work activity and more – broadly whatever data is collected can then be linked to the design and support from the course team. We’re exploring this tool and how it may be used to support FutureLearn learners.
Find out more: https://www.ontasklearning.org/
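To make the idea concrete, here is a generic sketch of rule-based feedback assembly in the spirit of tools like OnTask. The field names, rules and message text are illustrative assumptions of mine, not OnTask's actual API or data model:

```python
# Generic sketch of rule-driven personalised feedback, in the spirit of
# tools like OnTask. All field names and rules are illustrative
# assumptions, not OnTask's actual API or data model.

def build_feedback(student: dict) -> str:
    """Assemble a personalised email from rule-selected feedback segments."""
    segments = [f"Hi {student['name']},"]
    if student["quiz_score"] < 50:
        segments.append("Your recent quiz score suggests revisiting week 2; "
                        "the practice exercises may help.")
    else:
        segments.append("Great work on the recent quiz - keep it up.")
    if not student["posted_in_forum"]:
        segments.append("We'd love to hear from you in the course discussion.")
    return "\n\n".join(segments)

# Bulk generation: one personalised message per student record.
emails = [build_feedback(s) for s in [
    {"name": "Sam", "quiz_score": 42, "posted_in_forum": False},
    {"name": "Alex", "quiz_score": 88, "posted_in_forum": True},
]]
```

The key design point is that the rules are authored by the course team, linked to the learning design, and the data only selects which human-written segment each learner receives.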
Several talks covered the use of data built on poorly structured, or non-existent, learning designs. When the underlying data is unreliable or unstructured, collecting and using it yields very poor capabilities for data-driven decisions and insight. While this is more prevalent in blended and on-campus courses, it was recognised as a key challenge for organisations making use of the data generated by the platforms they use.
In addition, it was widely recognised that learners are active in three spaces:
Of these three spaces, only the institutional space generates data that can be used for any kind of analytics. Students are active in all three, each to a different extent. This alone makes it no easy challenge to use data to support students, and to use analytics to predict and support learning outcomes and success.
There are many stakeholders when it comes to data and according to a comprehensive EU-wide research project they have somewhat disjointed priorities:
There were many other themes at the conference, but given my focus on data and learning analytics I have concentrated, in this write-up, on the themes above as the core takeaways. Data-driven innovation in education continues to show a need for further dialogue about the direction of data and learning analytics. From a FutureLearn perspective, my takeaways are to look at how we can use data to help our partners deliver on their KPIs, and to support learners in achieving their goals and outcomes.