This presentation explores the authors’ recent research into meaningful measures of pedagogic activity in online discussions. Learner and instructor participation in a MOOC-based comment forum was evaluated using the DiAL-e pedagogical framework (Burden & Atkinson, 2008). Results from this study indicate that measuring pedagogic activity is distinct from typical interactional measures: correlations with established Learning Analytics methods are negative or insignificant, while correlations with linguistic indicators of pedagogical activity are strong. This suggests that pedagogical and linguistic analysis, rather than interaction analysis, has the potential to automatically identify learning behaviour, and may be more useful in visualising pedagogic activity. Further, the validity of DiAL-e as a coding method has been reviewed and compared with established Computer Mediated Communication content analysis methods, including SOLO (Holmes, 2005) and Community of Inquiry (CoI; Joksimovic et al., 2014), and with Learning Analytics and Language Analysis methods. Findings suggest that these different measures of cognitive activity (DiAL-e’s engagement with pedagogic activities, SOLO’s increasing complexity of understanding, and CoI’s development of reflective discourse) are closely correlated.
Because engagement with e-learning differs from other Web activity, it is important to establish relevant measures of pedagogic engagement within short, informal comments, and to develop automated analysis and visualisation systems that are meaningful in this field and distinct from general news and social media measures.
In addition to the authors’ work, an overview of recent research in this area will be presented, including an activity and discussion to establish audience attitudes towards relevant content analysis and visualisation methods.
Burden, K. & Atkinson, S., 2008. Beyond Content: Developing Transferable Learning Designs with Digital Video Archives. Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications, pp.4041–4050.
Duval, E., 2011. Attention please!: Learning analytics for visualization and recommendation. LAK ’11 Proceedings of the 1st International Conference on Learning Analytics and Knowledge, pp.9–17. Available at: http://dl.acm.org/citation.cfm?id=2090118.
Eisenberg, M.B., 2008. Information literacy: Essential skills for the information age. Journal of Library & Information Technology, 28(2), pp.39–47. Available at: http://www.eric.ed.gov/ERICWebPortal/recordDetail?accno=ED491879.
Holmes, K., 2005. Analysis of asynchronous online discussion using the SOLO taxonomy. Australian Journal of Educational and Developmental Psychology, 5, pp.117–127.
Joksimovic, S. et al., 2014. Psychological characteristics in cognitive presence of communities of inquiry: A linguistic analysis of online discussions. The Internet and Higher Education, 22, pp.1–10. Available at: http://linkinghub.elsevier.com/retrieve/pii/S1096751614000189.
Siemens, G. et al., 2011. Open Learning Analytics: An integrated & modularized platform. Proposal to design, implement and evaluate an open platform to integrate heterogeneous learning analytics techniques. Society for Learning Analytics Research (SoLAR).
Verbert, K. et al., 2014. Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing, 18, pp.1499–1514.
Tim O'Riordan posted an update in the session “How should we measure and show online learning activity?” 2 years, 7 months ago
FYI: I’ll be using Socrative for audience feedback (eduroam permitting!) – b.socrative.com.