Session Description
Our Virtual Learning Environment (VLE) currently hosts more than 5,000 courses a year. Feedback from a number of surveys and reviews of our digital systems has consistently highlighted that inconsistencies and poor web usability cause frustration and dissatisfaction for students. Students too often report that they struggle to find important course-specific resources in the VLE, and that they spend a disproportionate amount of time on this rather than on the learning activities that they know are important. This is reflected in the work of Reed and Watmough (2015) on what they term ‘hygiene factors’, and the prevalence of the issue is illustrated in the report on the various baselines HEIs have implemented to address it (Newland and Emery).
In response we have embarked on a project which aims to ensure that courses are accessible and that students can easily find relevant information. In addition, staff should find the VLE easy to use and be well supported to build and deliver rich courses.
We are tackling the issue by focusing on six project work streams: templates, checklists, training and support, terminology, automation and, finally, measuring and evaluating the impact of the project.
What we feel makes this project unique is that a significant body of research by the University’s User Experience Service has informed these work streams. Over 3,000 responses were collected from surveys alone, with additional in-depth, qualitative research carried out through interviews. The results have been surprising and enlightening. For example, we found that many stories about the student experience were mirrored by similar stories from the staff point of view. This highlights how we need to consider the staff experience as well if we are to improve students’ experiences – or, how we need to improve “backstage” processes to improve what users experience on the “frontstage” (Miller and Flowers, 2016). We also found that students’ pain points were often centred on processes and how tools are used, rather than exclusively on the tools themselves. Our findings have directly driven the design of the new templates, as well as guiding the checklists and the development of terminology. In addition, this research is contributing directly to developing academic engagement and supporting communications.
The session will focus on:
* Why we conducted user research.
* The research programme designed to build an understanding of the elements of our users’ experience (Garrett) – using techniques including interviews with students and staff, usability testing (Krug), top tasks surveys (McGovern), card sorting (Sherwin), tree testing (Whitenton), and first-click testing (Sauro).
* How we achieved high levels of engagement from users.
* What the research told us.
* What we’ve learned from the research.
* How we’ve used the findings to inform the different streams in our project.
* Where we will go from here.
By focussing on these areas we hope to illustrate how the methodology and principles used in this project can be replicated in any project of a similar nature.
References
Garrett, J. The Elements of User Experience. Available at: http://www.jjg.net/elements/pdf/elements.pdf
Krug, S. Rocket Surgery Made Easy. Available at: http://www.sensible.com/downloads-rsme.html
McGovern, G. Transform: A rebel’s guide for digital transformation. Available at: http://gerrymcgovern.com/
Miller, M. E. and Flowers, E. The difference between a journey map and a service blueprint. Available at: https://blog.practicalservicedesign.com/the-difference-between-a-journey-map-and-a-service-blueprint-31a6e24c4a6c
Nielsen, J. Why You Only Need to Test with 5 Users, Nielsen Norman Group. Available at: https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/
Reed, P. and Watmough, S., 2015. Hygiene factors: Using VLE minimum standards to avoid student dissatisfaction, E-Learning and Digital Media, Vol. 12, Issue 1. Available at: https://doi.org/10.1177%2F2042753014558379
Sauro, J. Getting the first click right. Available at: https://measuringu.com/first-click/
Shah, D. Have you had your recommended dose of research?, User research in government blog, Gov.uk. Available at: https://userresearch.blog.gov.uk/2014/08/06/have-you-had-your-recommended-dose-of-research/
Sherwin, K. Card Sorting: Uncover Users’ Mental Models for Better Information Architecture, Nielsen Norman Group. Available at: https://www.nngroup.com/articles/card-sorting-definition/
Travis, D. How to prioritise usability problems. Userfocus. Available at: https://www.userfocus.co.uk/articles/prioritise.html
Travis, D. Red route usability: The key user journeys with your web site. Userfocus. Available at: https://www.userfocus.co.uk/articles/redroutes.html
Whitenton, K. Tree Testing: Fast, Iterative Evaluation of Menu Labels and Categories. Available at: https://www.nngroup.com/articles/tree-testing/