Description
In higher education, learning analytics is playing an increasingly significant role in informing curriculum evaluation and review. This has been enabled by new developments in analytics methods and tools, as well as increasing access to real-time data on students’ learning activities (Long & Siemens, 2011). Recent research (Corrin, Kennedy & Mulder, 2013; Stephens-Martinez, Hearst & Fox, 2014) found that academics want greater and easier access to data on student engagement with learning activities and resources to inform changes to teaching approaches and curriculum structure. This is especially important for subjects/courses that are delivered online, where it is often more difficult to obtain useful feedback on student engagement and sense of community.
In this presentation we will examine how learning analytics have been used to inform teaching, learning and evaluation in a new online subject: Designing a Curriculum. This core subject in the Graduate Certificate of University Teaching, a program for academics at the University, was redesigned for delivery in a predominantly online format in the second semester of the 2013 academic year. As part of the redesign, the teaching team used learning analytics to closely monitor student engagement with the new online learning activities. This allowed the teaching staff to ensure students were on track and completing the requisite learning tasks, and provided an important data source for evaluating the new curriculum delivery structure.
Representations of this data also formed the basis of an analytics learning activity, which required students to view reports of the class's own engagement and comment on how they would review the curriculum. This activity not only served as a useful learning experience for students in interpreting learning analytics data, but also provided valuable evaluation perspectives and feedback for the teaching team.
The presentation will include a discussion of the lessons learnt through the use of learning analytics to monitor student engagement as the subject was taught, and how this informed student support and adjustments to learning activities. It will also examine the evaluation feedback on the curriculum obtained through the analytics learning activity and the impact this had on the review of the subject design. Results from two iterations of the subject will be presented to show how changes made after the first iteration influenced the second, and the effect of those changes. The iterative development of the analysis and visualisation approaches used will also be presented, offering ideas for how teachers can use learning analytics to inform curriculum design decisions.
References
Corrin, L., Kennedy, G., & Mulder, R. (2013). Enhancing learning analytics by understanding the needs of teachers. In H. Carter, M. Gosper & J. Hedberg (Eds.), Electric Dreams: Proceedings ascilite 2013 Sydney (pp. 201-205).
Long, P., & Siemens, G. (2011). Penetrating the fog: analytics in learning and education. EDUCAUSE Review, 46(5), 31-40.
Stephens-Martinez, K., Hearst, M.A., & Fox, A. (2014). Monitoring MOOCs: Which information sources do instructors value? In Proceedings of the First ACM Conference on Learning @ Scale (pp. 79-88). New York: ACM.
Authors
Name: Victoria Millar
Affiliation: University of Melbourne
Country: Australia