Session Description
Background
MyEd is the University of Edinburgh’s web portal. It is a gateway to web-based services within and beyond the University and offers a personalised set of content with single sign-on to key University services such as Bb Learn and Office 365. MyEd has over 100,000 weekly users and is used by applicants, students, staff and alumni to navigate the variety of applications and services that support their University experience.
A recent project delivered an updated technical infrastructure. This gave the Service team an opportunity to align with the personalised experience theme within the University’s web strategy (University of Edinburgh, 2018) by implementing a new information architecture (IA) and user interface, focussing on the student experience. The data and information that the portal presents to students, and how it is made available to them, is a key factor in determining its overall success (Yang et al., 2005).
Methods
Aware that the current structure wasn’t working and could sometimes be confusing for students, the team worked with user experience (UX) colleagues to undertake a number of research activities. The first was with the project team: collating everyone’s previous research and experience of the Service and agreeing on how best the project could work towards making improvements. Once this agreement was in place, the project was able to design and run activities with its users.
This created the opportunity to work with students, and we began to understand how they expect content to be organised and to behave. Running card sort activities, both face-to-face with a small number of students (<20) and online (with 1,000+ responses), gave the project team a better understanding of, and greater confidence in, how it should implement changes. These findings were validated through a small number of usability sessions (Nielsen, 2000).
Findings and Discussion
The results from the full loop of activities cover a variety of areas that have given the team confidence in applying improvements, as well as areas to continue to explore. These areas include:
– finding content – the techniques students use in finding content and services, grouping of content and interface expectations
– vocabulary – learning and using terminology that is understandable to students
– key tasks/services – quick access to the most common and important services
– use of devices – how students expect the portal to behave on different devices and how their own experience can differ
Conclusion
Through involvement in these activities, the service team gained experience in all stages, from planning and organising the activities to analysing the results. Methodologies for collating feedback and working with students are largely similar in delivery, as are the common factors underlying key satisfaction areas for students. Abdelhakim et al. (2011) note that a continual review and usability feedback loop is a key mechanism in determining the direction of university web portals.
Attendees at this presentation will hear how the project coordinated these user research activities within a higher education context, the challenges and successes that arose, and how the results were disseminated within the project and to a wider audience. It will also share insights into how students are using the portal following these improvements.
This presentation may be of interest to members of the community who work with similar central web portal services and/or those keen to hear more about carrying out user research activities within a higher education environment.
This presentation will describe:
– Initial shaping of the project; what, why and how
– User research activities with students; what went well and lessons learnt
– Results of findings, and how they have been implemented
– The impact of these
– What’s next (continual improvements)
References
Abdelhakim, M.N., Carmichael, J.N. and Ahmad, S. (2011), “Quality evaluation of university web portals: a student perspective”, International Journal of Information and Operations Management Education, Vol. 4 No. 3, pp. 229-243.
Nielsen, J. (2000). Why You Only Need to Test with 5 Users. [online] Nielsen Norman Group. Available at: https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/
University of Edinburgh (2018). University Web Strategy.
Yang, Z., Cai, S., Zhou, Z. and Zhou, N. (2005). Development and validation of an instrument to measure user perceived service quality of information presenting Web portals. Information & Management, 42(4), pp.575-589.