With the development of learning analytics and predictive technology has come an opportunity, or at least the sense of one, to predict the behaviour of individuals: potential criminals, shoppers or, indeed, students.
First Steps With Bloom Thrive
As discussed in the previous post on the subject, the University of London International Programmes is engaging with ULCC to pilot Bloom Thrive, a learning analytics system to try to understand our students’ progression through their courses better. We are getting ready to go live with the system this month, but as part of the project’s development we faced a particular challenge that I’d like to address in this blog – namely, how do we know whether the system will work?
In our distance learning environment, where students are not on a campus, we don't necessarily know whether our students are continuing until they re-register for the following year. Judging the success, or otherwise, of a system that tries to predict how students may progress therefore involves playing the long game: a prediction made with your analytics system in September may not be confirmed or contradicted until the following autumn.
University of London International Programmes – Fingertips Facts
Old Data To The Rescue
Our own attempts to test the effectiveness of Bloom Thrive involved taking snapshots of student data from three points in 2014 (October, November and December – up to 40,000 students), running it through the system and testing the outputs against the actual progression for these students in the following academic year. The advantage of this test was that we knew which students progressed. The disadvantage was that the results lacked the live analytical elements that Thrive includes, such as an indication of student ‘wellness’ or whether they are using their VLE.
The results themselves – three reports of 200 students considered at risk of disengagement – consistently identified a slightly higher rate of student attrition than our general level. A positive result for the test, but only a modest one. In and of themselves these results are not convincing evidence of the effectiveness of Bloom Thrive; however, there are positive signs, and it is hoped that the live elements of the system will add more depth and sophistication to the predictions made. In time we will be able to test the live system to find out.
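The backtest described above boils down to a simple comparison: take the students the system flagged as at risk in a historical snapshot, check which of them actually re-registered the following year, and compare their attrition rate with the cohort as a whole. A minimal sketch of that calculation, using invented toy data rather than any real project figures:

```python
# Hypothetical sketch of the backtest: compare attrition among flagged
# "at risk" students with attrition across the whole cohort. All names
# and numbers below are illustrative, not real student data.

def attrition_rate(students, progressed):
    """Fraction of the given students who did NOT re-register."""
    dropped = [s for s in students if s not in progressed]
    return len(dropped) / len(students)

def evaluate_snapshot(flagged, all_students, progressed):
    """Return (flagged-cohort attrition, overall attrition, lift)."""
    flagged_rate = attrition_rate(flagged, progressed)
    overall_rate = attrition_rate(all_students, progressed)
    # Positive lift means the flagged cohort left at a higher rate
    # than the general population, i.e. the flags carried some signal.
    return flagged_rate, overall_rate, flagged_rate - overall_rate

# Toy cohort: 10 students, 6 re-registered, 3 were flagged at risk.
all_students = {f"s{i}" for i in range(10)}
progressed = {"s0", "s1", "s2", "s3", "s4", "s5"}
flagged = {"s0", "s6", "s7"}  # two of the three flagged students left

flagged_rate, overall_rate, lift = evaluate_snapshot(
    flagged, all_students, progressed
)
print(flagged_rate, overall_rate, lift)
```

In this toy cohort the flagged group's attrition (about 0.67) sits above the overall rate (0.4), which is the shape of result the 2014 snapshots showed: a consistently, if only slightly, elevated attrition rate among flagged students.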
Beyond The Data
However, there is a paradoxical element to the search for quantified proof that a system such as this works: successfully identifying and engaging with a student at risk of leaving their studies may mean they continue to study. Unless that successful intervention is recorded, the student counts against the system in the data: they were flagged as at risk yet progressed, so the prediction looks wrong.
Institutions may need to take a wider view: judgements of success could factor in an overall reduction in attrition year on year, or improvements in how students rate institutional support. And since the real benefit is a better understanding of students and better support for them, a qualitative methodology may be an equally appropriate way to gauge success.