We propose a talk and demonstration of “AnswerBook”, a custom software system that we have developed to support online exams in the Department of Computing at Imperial College London.
We had been considering how to conduct examinations effectively via an online platform for several years, but the coronavirus pandemic accelerated our thinking. After our initial emergency response (converting all of our exams to an online format practically over one weekend), we have now reached a more stable period. We are considering how moving to electronic exams can improve the experience for both candidates and examiners going forward, including once classes return to campus (if we do return to running courses as we used to).
Developing a software system to support the exam process has been key to making this possible.
In this talk we would like to discuss:
- how the pandemic changed our approach to exams (and what aspects we did not change)
- build vs buy – why did we build our own software tool?
- supporting and streamlining the examination process for examiners, as well as for candidates
- scaling up to deal with large cohorts of candidates
- what went wrong
- beyond multiple choice – to what degree can we support automated marking?
- mathematical questions – many of our exams involve candidates hand-writing maths; how did we integrate this with an electronic system?
- student partnership – how we involved students in the development of the system, and how this yielded substantial improvements in user experience and a better match to their workflow
As part of the presentation we would like to give a live demo of AnswerBook to show it in action.