Learning analytics offer widely recognised benefits for supporting and improving learner performance, but without the wider context of user interaction they carry a number of limitations and can produce misleading perceptions (Greller & Drachsler, 2012). While learning design and analytics have traditionally operated independently of each other, there is huge potential for such analytics to inform curriculum design, evaluate the impact and use of resources, and illuminate student interaction (Ifenthaler et al., 2018). Similarly, the ‘big data’ approach to user analytics can overwhelm the grassroots academic who is simply looking for a snapshot of interaction with resources (UUK, 2016).
Talis Elevate is a new product that enables staff to upload resources and track granular engagement at resource, student, and course level, and creates opportunities for staff to build activities directly into resources by using group and personal annotations in text and video.
The Faculty of Science & Technology at ARU ran a longitudinal study using Talis Elevate and LMS analytics to test a number of hypotheses about user behaviour (such as which feedback delivery method was more beneficial for students) and about the impact of specific resources based on user interaction (such as how students really make use of documents like module and study guides). By analysing student interaction with certain resources across courses with different learning styles and methods of delivery, we were able to ascertain how students approach their learning, how they interact within resources, and which methods of content delivery attracted the most engagement, allowing us to adjust and improve our teaching approach and resource focus. This has enabled the Faculty to make fundamental changes to its content delivery at both course and Faculty level, as well as alterations to resource focus and feedback mechanisms.
This session will explore some of the current research on learning design versus learning analytics, some of the critical findings of the project, and the next steps for research. It will be of interest to academics, learning technologists, instructional designers, and anyone with an interest in learning analytics.
Session content: evaluation and reflection
By utilising a new level of granular analytics showing student interaction with resources at snapshot points throughout a semester, tracking engagement with the same ‘type’ of resource across a number of courses, and delivering the same resource to students via different mediums (media and text), we were able to ascertain how students interact at both resource and course level. This gave us a very clear picture of ‘tail-off’ activity, of trends and taxonomies of learning approach over a semester, and a clear indicator of how students prefer to receive feedback. Findings from the completed work will be demonstrated in full during this session.
Gordon, D. (2015). Evaluation of student engagement with feedback: feeding forward from feedback. Anglia Learning and Teaching [online]. Available at https://myplayer.anglia.ac.uk/Play/6026 [accessed 24 March 2018].
Greller, W., & Drachsler, H. (2012). Translating learning into numbers: a generic framework for learning analytics. Educational Technology & Society, 15(3), 42–57. [online] [accessed 24 May 2018].
Ifenthaler, D., Gibson, D., & Dobozy, E. (2018). Informing learning design through analytics: applying network graph analysis. Australasian Journal of Educational Technology, 34(2). [online] [accessed 24 May 2018].
Talis Elevate [online]. Talis. [accessed 24 March 2018].
UUK (2016). Analytics in Higher Education. [online] [accessed 21 May 2018].
Resources for participants