CEO Simon Nelson shares some insights on FutureLearn’s approach to online learning, based on data from the first eight courses to run on the social education platform.
When we developed FutureLearn we set out to create a simple, intuitive education platform, which supports a community of people learning through conversation.
Four months after we launched on 18 September 2013, we have found ourselves wading through the complicated business of measuring our success.
So we asked ourselves: “How do we judge the success of something which is designed to allow people to take away as much or as little as they choose?” We think any form of learning is a good thing and should be encouraged.
Our first eight courses, released in beta, came to an end late last year, giving us a first chance to do just that. They’ve given us enough meaningful data to provide some clear insights into delivering social learning at scale. By looking at the way people are using our product, and drawing conclusions from the aggregated data, we can share some numbers and insights about how we’re doing, and why we think this supports our approach of putting social learning at the heart of our platform.
This is our first attempt at answering the question we set ourselves.
So what do we measure?
1. Celebrating progress
One of FutureLearn’s core values is to celebrate every step of progress, big and small. The first time that you press the “Mark as complete” button to cross off a step feels like an accomplishment. Finishing your final test and seeing your overall score is a bigger moment to celebrate.
We recognise every step of progress on a learner’s journey, so we determine the success of our product by measuring the progress of our learners. We celebrate all levels of engagement and want to consider all of the different things learners take away from their experience with us.
2. Provoking conversation
Secondly, we’re a social learning platform. Social learning is at the heart of what we do as we know it helps people learn, makes the experience more enjoyable and removes the isolation of distance learning. We want to encourage social learning, even if it’s vicarious learning through reading the interactions of others. In time, we also want to measure how people move from being a vicarious learner to becoming more social.
Our most important numbers
The first step on the journey is joining a course. To date we have had nearly 400,000 course sign-ups from over 200,000 registered users in around four months. We began by limiting places on our first courses in order to learn from the experience and grow sustainably. Six of our first eight courses hit their cap of 10,000, with ‘Begin programming: build your first mobile game’, from the University of Reading, doing so the day after launch. We’ve now removed the limit on places and are seeing nearly 30,000 learners joining some of our latest courses.
Of the people who joined a course, we saw an average of 60% actually visiting the course once it began. Joining a course has such a low barrier to entry that it’s understandable people change their minds, or their circumstances change, between joining and beginning their course. Over the next few months, however, we will be working on how we encourage more joiners to actually become learners: testing out different promotional and scheduling approaches, and offering people more of a learning experience without the need to sign up for a course by making our content more open and discoverable.
For most of our metrics, we have decided to benchmark against learners (people who actually visited the course) since this, to us, is the most meaningful comparison.
Each step of a course has a ‘Mark as complete’ button that allows a learner to tick things off their to-do list. We class active learners as those who start to use this button. 86% of all learners have done so and, in the process, have effectively assessed for themselves that they have learnt something on our courses.
A returning learner is someone who marks steps as completed in more than one week of the course. 54% of learners are coming back and marking off steps in multiple weeks.
Fully participating learners
Our focus is on ‘participation’, rather than ‘completion’, so that we can assess the quality of the whole learning experience, rather than simply test what people know.
We define ‘full participation’ as a learner completing the majority of steps and all of the assessment (currently just tests but soon to include peer assessment). Despite setting this bar quite high in terms of how much we ask learners to do, we’re seeing 15% achieving this goal, and in so doing, qualifying for our pilot Statements of Participation. So far, we’ve only offered a statement on one of our courses, so we expect this participation figure to go up once all of our learners have something to aim for, as many have registered an interest in earning one.
For those of you interested in how we compare on some of the measures that others use: we are seeing 27% of learners marking steps as complete in the final week, 18% taking all of the available tests and 24% marking more than half of the steps as complete.
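As a rough illustration only, the learner categories described above could be derived from step-completion records along these lines. This is a hypothetical sketch, not FutureLearn’s actual code: the function name, data shapes and the “majority of steps plus all assessments” threshold are our own assumptions based on the definitions in this post, and it returns only the highest tier a learner reaches, whereas the percentages quoted above are cumulative.

```python
def classify_learner(completed_steps, total_steps, test_steps):
    """Return the highest engagement tier a learner reaches.

    completed_steps: set of (week, step_id) pairs the learner marked complete
    total_steps:     total number of steps in the course
    test_steps:      set of (week, step_id) pairs that are assessments
    """
    if not completed_steps:
        # Visited the course but never used the 'Mark as complete' button.
        return "learner"

    weeks_active = {week for week, _ in completed_steps}
    completed_tests = completed_steps & test_steps

    # 'Full participation': a majority of steps plus all of the assessment.
    if len(completed_steps) > total_steps / 2 and completed_tests == test_steps:
        return "fully participating"

    if len(weeks_active) > 1:
        # Marked steps as complete in more than one week of the course.
        return "returning"

    # Used the button, but only within a single week.
    return "active"
```

Aggregating these labels over everyone who visited a course would give percentages comparable to the ones quoted here, since all of the figures are benchmarked against learners rather than joiners.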
As outlined above, we believe that social learning is really important and we’re happy to see that on average 34% of learners are posting comments. Many more will be viewing and learning from those comments. Naturally the level of conversation varies from course to course with the least social ones seeing 24% of learners posting and the most social featuring a very talkative 45%.
Our learners also post regularly, with the average number of comments across each course being around seven.
Other numbers we’d like to share
In terms of engagement, we’re seeing some really encouraging numbers. An average visit is 25 minutes long and typically learners come back two or three times a week.
We take great pride in the fact we have designed our product to work seamlessly across different devices and screen sizes. In general, 25% of our visits are made on a tablet or mobile. Courses themselves vary greatly with some – like ‘Begin programming: build your first mobile game’ – understandably dominated by laptop and desktop machines, which are required for writing code. But we’re finding with courses like ‘The mind is flat’, from the University of Warwick, a behavioural psychology course, that many are choosing to learn on their tablets and mobiles, in the comfort of their armchair or on their commute.
Other notable things are that we attract equal numbers of men and women, have a very even spread of ages, serving as many older learners as twenty-somethings and, most encouragingly, fewer than a third of our learners have done an online course before. We are attracting a new audience and inspiring them to learn. A common comment you see around the site is “it’s so nice to get the brain cells working again”.
Recommending us and doing another course
But more than anything, what convinces us that we are on to something here is the reaction of our learners in our social environments and in their responses to our surveys.
This is reflected in the qualitative analysis we have undertaken: 88% of people surveyed at the end of our first eight courses rated the experience good or excellent, with ‘excellent’ alone accounting for nearly half.
Additionally, 71% wanted to do another course straight away or in a few weeks’ time – and over half of our beta course active learners have signed up for another course already.
92% said the course met or exceeded their expectations, and when we asked them in a survey at the end of their course whether they would recommend FutureLearn to a friend, 94% said they were either likely or very likely to.
We believe that this is the biggest compliment our learners can give and, as you might be able to tell, we’re really pleased.
Please go and sign up for one of our courses – we hope you will see what thrills us: the excitement, passion, erudition and commitment of a genuinely international community of learners.