Session Description
This workshop explores the relationship between learning design and learning analytics. The facilitators are seeking a means to link current developments in big data and analytics (Sclater 2014a, 2014b; Sclater et al. 2016) to day-to-day academic practice by adapting learning design tools in common use (Fung 2017; Nicol 2012).
The session is intended to demonstrate the value of a data-informed approach to enhancing academic practice. We know that academic staff are time-poor and that most are not data scientists. This approach helps staff design modules and courses in such a way that the data needed to understand student progress is readily available.
We will show how the digital footprint of technology-enhanced learning activities (Salmon 2013) makes it considerably easier to measure the success of a particular approach than with traditional analogue activities. The insertion of digital activities, with a clear pedagogic purpose, into a learning design has the power to support real-time interventions that can make an important difference to learner outcomes.
Blended learning is increasingly prevalent in further and higher education (Ferrell & Smith 2018), but it is not always evident that people fully understand the benefits of including digital learning activities in the blend or, indeed, how to tell whether a particular learning design is working as intended. In some cases a quota approach is taken to moving learning online. Conversely, some institutions and teachers pride themselves on face-to-face learning and do not see how a blended approach could improve on it.
This workshop will demonstrate ways to help academic staff ensure that their blended learning designs are purposeful. We will explore how to make explicit the pedagogic intent in a learning design and then go on to show how data can enable us to understand whether learner behaviour (Palmer et al. 2017) corresponds to our expectations.
We will also explore how such small steps in individual modules and courses can build up to an even more valuable resource when viewed at an institutional level.
Workshop outline:
Short introduction and discussion of current practice amongst participants (5 mins)
Demonstration of a structured approach to learning design (in this example using the ABC method) (10 mins)
Discussion of readily available data sources and their usefulness (10 mins)
Group work to analyse a sample learning design to identify existing data points or revise the design to add useful ‘data hooks’ (20 mins)
Feedback and discussion of outcomes (15 mins)
Session content: evaluation and reflection
The session is based on the work of a group of organisations and individuals, connected via the learning design cross-institutional network (LD-CIN), who want to take existing approaches to learning design and enhance them to make better use of data to inform practice. The ideas have been discussed with peers in the network and tried out in a range of workshop sessions with c.200 participants to date, and each session has built on the feedback from previous iterations. The intended outcome is a set of enhancements to existing Creative Commons learning design resources, providing a ‘Train the Trainer’ guide to data-informed blended learning design. We are working with a particular Creative Commons learning design tool for the purposes of this research, but the approach and guidance can be applied to any structured approach to learning design. Outcomes and subsequent resources will be disseminated to participants and related networks.
References
Ferrell, G. & Smith, R. (2018). Designing learning and assessment in a digital age. Jisc. Retrieved March 26th 2018 from: https://www.jisc.ac.uk/guides/designing-learning-and-assessment-in-a-digital-age
Fung, D. (2017). A Connected Curriculum for Higher Education. London, UCL Press. Retrieved March 26th 2018 from: http://discovery.ucl.ac.uk/1558776/1/A-Connected-Curriculum-for-Higher-Education.pdf
Nicol, D. (2012). Transformational Change in Teaching and Learning: Recasting the Educational Discourse. Evaluation of the Viewpoints project. Jisc. Retrieved March 26th 2018 from: http://wiki.ulster.ac.uk/download/attachments/23200594/Viewpoints_Evaluation_Report.pdf
Palmer, E., Lomer, S. & Bashliyska, I. (2017). Overcoming barriers to student engagement with Active Blended Learning. University of Northampton. Retrieved March 26th 2018 from: https://www.northampton.ac.uk/ilt/wp-content/uploads/sites/2/2017/05/Student-Engagement-with-ABL-Interim-Report-May-2017-v2.pdf
Salmon, G. (2013). E-tivities: The key to active online learning. (2nd ed.). London and New York: Routledge.
Sclater, N. (2014a). Learning analytics: The current state of play in UK higher and further education. Jisc. Retrieved March 26th 2018 from: https://repository.jisc.ac.uk/5657/1/Learning_analytics_report.pdf
Sclater, N. (2014b). Code of practice for learning analytics: A literature review of the ethical and legal issues. Jisc. Retrieved March 26th 2018 from: http://repository.jisc.ac.uk/5661/1/Learning_Analytics_A-_Literature_Review.pdf
Sclater, N., Peasgood, E. & Mullan, J. (2016). Learning analytics in higher education: A review of current UK and international practice. Jisc. Retrieved March 26th 2018 from: https://www.jisc.ac.uk/reports/learning-analytics-in-higher-education
Resources for participants
Designing learning and assessment in a digital age. Jisc. Retrieved March 26th 2018 from: https://www.jisc.ac.uk/guides/designing-learning-and-assessment-in-a-digital-age
ABC Learning Design. Retrieved March 26th 2018 from: http://blogs.ucl.ac.uk/abc-ld/
CAIeRO. Retrieved March 26th 2018 from: http://blogs.northampton.ac.uk/learntech/tag/caiero/page/2/
Carpe Diem. Retrieved March 26th 2018 from: https://www2.le.ac.uk/projects/oer/oers/beyond-distance-research-alliance/oers/carpediem/Carpe%20Diem.rtf/view
Viewpoints Curriculum Design Resources. Retrieved March 26th 2018 from: http://wiki.ulster.ac.uk/display/VPR/Home