Data-informed insight into student engagement is a hot topic across the higher education sector, and the prospect of improving student services, outcomes and retention by identifying disengagement through data proxies is exciting. But how meaningful are the engagement values computed from student activity, and how accurate are the predictive services built on this data and on machine-learning algorithms?
The University of South Wales (USW) joined the JISC Learning Analytics Service in 2016. In September 2018 the university went live with a ‘full fat’ service, giving teaching and academic coaching staff access to JISC’s ‘Data Explorer’ dashboard of ‘descriptive data’ on student activity: VLE use, grades and attendance. The Data Explorer dashboard computes daily ‘traffic light’ colours for every student’s activity using these proxies for engagement, and staff are invited to browse the data and form a view of student progress and engagement based on these values. In addition, since December 2018 JISC has been supplying USW with a weekly ‘Predictor’ service, in which a machine-learning algorithm, trained on two years of activity data, calculates a percentage likelihood of a student passing the current year of a course.
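The traffic-light idea can be sketched roughly as follows. The proxy names, weights and thresholds here are invented for illustration; JISC's actual Data Explorer algorithm is its own and is not reproduced here.

```python
# Hypothetical sketch of a traffic-light engagement flag built from three
# proxies (VLE use, attendance, grades), each normalised to the range 0-1.
# The weights and thresholds below are illustrative assumptions, not
# JISC's published method.

def traffic_light(vle_score: float, attendance: float, grade_avg: float) -> str:
    """Combine three engagement proxies (each scaled 0-1) into a colour."""
    composite = 0.4 * vle_score + 0.4 * attendance + 0.2 * grade_avg
    if composite >= 0.66:
        return "green"
    if composite >= 0.33:
        return "amber"
    return "red"

print(traffic_light(0.9, 0.8, 0.7))  # a well-engaged student -> "green"
print(traffic_light(0.1, 0.2, 0.4))  # a disengaged student  -> "red"
```

Even in this toy form, the key design question the abstract raises is visible: the colour is only as meaningful as the proxies and thresholds behind it.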
After a full year of operation over the 2018/19 session, USW wants to understand fully how this service has affected, and will affect, teaching delivery, student support and professional services, and will therefore carry out a full impact assessment of the service.
Before basing improved services on the models of engagement implicit in the service, we wanted to conduct a validation exercise. Building on previous predictive-analytics validation exercises such as Dekker (2009), USW sought to answer questions such as: do the traffic lights offer anything meaningful? How accurate is the Predictor and, where it is accurate, when and for what reasons?
In this presentation the project lead on the USW implementation will set out the context for the service implementation and the role of predictive analytics, referencing the work of Pistilli (2010), Lalitha (2012) and particularly Foster (2016). They will offer some highlights of the impact assessment, but will focus on the results of a validation exercise carried out by a Mathematics MSc research student, correlating both the ‘traffic light’ values and the ‘Predictor’ calculations with actual, known student outcomes.
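The kind of validation described here can be sketched as follows, using invented data; USW's actual analysis and choice of metrics may well differ.

```python
# Minimal sketch of validating predicted pass probabilities against known
# outcomes. The numbers below are invented for illustration only.
from statistics import mean

predicted = [0.91, 0.80, 0.35, 0.40, 0.15, 0.75]  # Predictor % / 100
actual    = [1,    1,    0,    1,    0,    1]     # 1 = passed the year

# Accuracy at a 50% decision threshold: how often does the Predictor's
# pass/fail call match the real outcome?
accuracy = mean(1 if (p >= 0.5) == bool(y) else 0
                for p, y in zip(predicted, actual))

# Brier score: mean squared error of the probabilities (lower is better),
# which also rewards well-calibrated probabilities, not just correct calls.
brier = mean((p - y) ** 2 for p, y in zip(predicted, actual))

print(f"accuracy={accuracy:.2f}, brier={brier:.3f}")
```

A validation like this can also be broken down by week of delivery, which speaks directly to the abstract's question of *when* the Predictor is accurate.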
This event will be useful for staff planning or implementing a learning analytics service, from both a strategic and an operational perspective.
Agnihotri, Lalitha; Ott, Alexander. 2012. Building a Student At-Risk Model: An End-to-End Perspective. In Proceedings of the 7th International Conference on Educational Data Mining, pp. 209–212.
Dekker, Gerben W.; Pechenizkiy, Mykola; Vleeshouwers, Jan M. 2009. Predicting Students Drop Out: A Case Study. In Proceedings of the 2nd International Conference on Educational Data Mining (EDM), Cordoba, Spain, 1–3 July 2009.
Pistilli, Matthew D.; Arnold, Kimberly E. 2010. In Practice: Purdue Signals: Mining Real-Time Academic Data to Enhance Student Success. About Campus, Vol. 15, Issue 3, pp. 22–24. DOI: 10.1002/abc.20025.
Foster, E.; Kerrigan, M.; Lawther, S. 2016 (forthcoming). What Matters is Performance, not Predicted Performance: The Impact of Student Engagement on Retention and Attainment. Journal of Learning Development in Higher Education.