Session Description
How do we know whether our teaching and learning is working as we planned? What are the features of a good course that maximise students’ opportunities to reach their potential? Is the new learning innovation I am trying out working as I intended? Do we need to change the structure or features of courses to reflect these findings?
Answering these kinds of questions reliably can be challenging and time-consuming: it may require obtaining ethical consent, gathering data from dispersed sources and consulting students. However, it is increasingly possible to obtain meaningful insights through the automated analysis of data on the curriculum and student activity.
Learning analytics involves the collation of data around student learning: the use of this data to identify at-risk students is already well established. However, the data also provides opportunities for curriculum analytics, exploring the relationship between curriculum design (how teachers intended the learning to happen), the actual learning activities (what the students do) and learning outcomes. In this session we will discuss our work with institutions involved in Jisc’s learning analytics service. Through a series of consultations and workshops, we have examined the following:
– What insights do teaching staff require and which questions should they be asking of the data? We have consulted with staff at a range of universities to find out what they want to know about the courses they are involved with and how they can use the insights to make improvements.
– What additional data is required to undertake curriculum analytics? There is a lot of information a teacher can take into account when reviewing how well a course is going: the course structure, sessions, resources, teaching activity and so on. However, much of this data is not readily available, as it is often tied up in course documentation and held in non-standardised formats.
– How do we build the analytics to answer these questions? We have also engaged with institutions and worked with CETIS to develop the most useful types of visualisations and the underlying data structures that are required to provide curriculum analytics to teaching staff, learning designers and other stakeholders. A simple, illustrative sketch of such a data structure follows this list.
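To make the idea of an underlying data structure more concrete, the sketch below shows one possible, simplified way of representing a module, its planned sessions and the activity the design expects each week. The classes, field names and the planned_activity_by_week helper are hypothetical illustrations only; they are not the actual data model used in the Jisc service or by any particular institution.

```python
# A minimal, illustrative sketch of structured curriculum data for analytics.
# The classes and field names here are hypothetical, not an actual Jisc model.
from collections import Counter
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Session:
    """One planned teaching session or learning activity within a module."""
    title: str
    week: int
    activity_type: str                      # e.g. "lecture", "seminar", "online quiz"
    resources: List[str] = field(default_factory=list)


@dataclass
class Module:
    """A module (course unit) as it might be captured from course documentation."""
    code: str
    title: str
    learning_outcomes: List[str]
    sessions: List[Session] = field(default_factory=list)


def planned_activity_by_week(module: Module) -> Dict[int, int]:
    """Count how many learning activities the design schedules in each week,
    ready to be compared with observed student activity from the VLE."""
    return dict(Counter(session.week for session in module.sessions))


# Example: a two-session module sketched by hand.
module = Module(
    code="EDU101",
    title="Introduction to Learning Design",
    learning_outcomes=["Describe common learning designs"],
    sessions=[
        Session("Welcome lecture", week=1, activity_type="lecture"),
        Session("Design workshop", week=2, activity_type="seminar"),
    ],
)
print(planned_activity_by_week(module))     # {1: 1, 2: 1}
```

Capturing course documentation in a structured form along these lines is what would allow the intended design to be lined up against the activity data already held in a learning analytics service.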
Participants at this session will find out about the latest developments in curriculum analytics and how some pioneering UK universities are obtaining insight about the effectiveness of their teaching and learning. They will come away with an understanding of some of the main questions about the curriculum that are being asked and how these can be answered with curriculum analytics. They will also learn about the underlying data structures that are in place and the types of visualisations that can best help to provide insight into the effectiveness of teaching and learning.
During the session participants will be asked to suggest the key questions they would like to be able to answer about the curriculum at their own institutions. The presenters will then discuss whether these questions are already being asked elsewhere and work with participants to identify the appropriate underlying data sources and the best visualisations for presenting the analytics to stakeholders.
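As one purely illustrative example of the kind of visualisation discussed above, the sketch below plots a hypothetical week-by-week comparison of the activity a course design plans against the activity students actually log in the VLE. All of the figures are invented for illustration and do not come from any institution's data.

```python
# Illustrative only: comparing planned learning activities per week (from the
# course design) with observed VLE activity. All figures below are invented.
import matplotlib.pyplot as plt

weeks = list(range(1, 11))
planned_activities = [3, 3, 4, 3, 2, 4, 3, 3, 2, 4]          # hypothetical design data
mean_vle_actions = [42, 35, 28, 20, 18, 33, 15, 14, 12, 40]  # hypothetical VLE logs

fig, ax_design = plt.subplots()
ax_design.bar(weeks, planned_activities, color="lightgrey", label="Planned activities")
ax_design.set_xlabel("Teaching week")
ax_design.set_ylabel("Planned learning activities")

ax_vle = ax_design.twinx()
ax_vle.plot(weeks, mean_vle_actions, marker="o", color="tab:blue",
            label="Mean VLE actions per student")
ax_vle.set_ylabel("Mean VLE actions per student")

ax_design.set_title("Planned vs observed learning activity by week")
fig.legend(loc="upper right")
plt.show()
```

A mismatch between the two series, such as a week with many planned activities but little observed engagement, is exactly the kind of prompt for discussion with teaching staff that curriculum analytics aims to surface.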