The elements of e-assessment
When people think of e-assessment, they often think of online testing, in other words, the point when a candidate takes an exam. But nowadays technology is being harnessed to support the whole journey that ‘assessment’ involves.
OCR – Oxford, Cambridge and RSA Examinations – is in the business of assessment. It is one of the UK’s largest awarding bodies, with its core remits being to develop fit-for-purpose qualifications and to assess the candidates who take them. OCR is also part of Cambridge Assessment, the University of Cambridge’s international exams group, which has a global reach in assessment.
Every year, more than three million students gain OCR qualifications. As well as school exams such as GCSEs and A Levels, OCR offers vocational qualifications in areas relevant to large sections of the workforce, such as IT, business and administration, and health and social care. Success in achieving qualifications following appropriate assessment helps to ensure a person’s progression on to the next stage in their education or career and can have a major influence on their future.
In recent years OCR has introduced technology across the breadth of the assessment process: in exam administration, marking, testing and results. For teachers and trainers delivering OCR qualifications, technology is also being developed as part of the support package that comes with qualifications from awarding bodies today, in the form of digital games, access to e-resources and membership of e-communities, to name but a few.
This article takes a look at some of the ways that OCR is using technology to enhance the whole journey of assessment across a wide range of subjects and learners. It also explains why OCR – and Cambridge Assessment – will always be cautious about an immediate transfer to universal e-testing.
From the first vital step of ensuring a candidate is entered for a particular exam, to downloading those eagerly awaited exam results, teachers and exams officers go online using OCR’s secure portal for all exam-related information: Interchange. Interchange, which has its own login and website separate from OCR’s main site, was introduced over ten years ago and has dramatically reduced the time-consuming paper-based administration involved in the exam process. Virtually all ‘centres’ – the name given by awarding bodies to institutions that can administer public exams – use Interchange throughout the year.

Ensuring that the right candidate is entered for the right exam is, even for schools, a year-round activity, carried out by a dedicated member of staff. Opportunities to take exams outside the traditional summer season have increased in recent years as a result of the growth of modularisation and resits, as well as the development of new qualifications for which there are near-monthly exam windows. With vocational qualifications, the timetabling and frequency of exams vary to fit around the delivery of training courses. Interchange is a key technological tool in handling an ever-increasing volume of exam traffic.
Trying to make the communication of exam data easier and more streamlined for centres is a constant driver of change. A major project, in which OCR is participating, is now underway to simplify the exchange of exam-related data, the process currently known as Electronic Data Interchange, between all Joint Council for Qualifications (JCQ) awarding bodies and their customers. The A2C Data Exchange project, due to launch in September 2013, will change the format and the way that data is exchanged, ensuring that both general and vocational qualifications can be supported. The new standard can potentially be used for transactions between centres and all awarding bodies, not just those belonging to JCQ.
Accurate and consistent marking is vital to the credibility of the exams system. More than 3,000 examiners marked a total of over three million scripts onscreen for OCR in 2010/2011, using ‘scoris assessor’, a tool developed in partnership with RM Education for OCR’s parent group, Cambridge Assessment, which allows examiners to mark GCSEs and A Levels on screen. Traditionally, paper exam scripts are sent to examiners, who mark the hard copy and return the scripts, plus a record of their marks, separately to OCR. With onscreen marking, examiners can download sets of exam papers – say up to 20 at a time – and mark them on their computer screens rather than having to wade through piles of paperwork. The papers they mark onscreen may originally have been paper-based or computer-based test responses. As well as making marking and the return of marks quicker and more efficient, with benefits to the environment, the new system enables stricter adherence to the marking of principal examiners, producing greater consistency across large teams of examiners spread across the country.

In summer 2011, OCR announced that the introduction of onscreen marking had also led to a beneficial side effect: a new method of picking up possible cheating in exams. Through statistical screening, small groups of exam papers where the pattern of marks is very similar can now be identified. The process starts from the assumption that when two or more candidates have been colluding or copying from each other, their marks will be similar. The marks are fed through an advanced pattern-matching routine which flags up schools and colleges where marks are so similar that they warrant further investigation. Malpractice investigators can then call up the flagged papers and visually inspect them for signs of collusion or copying.
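OCR has not published the details of its screening routine, but the general idea can be sketched in a few lines of Python. Everything here is an illustrative assumption – the similarity measure (fraction of identical per-question marks), the threshold, and the function names are invented for this sketch, not OCR's actual method.

```python
from itertools import combinations

def similarity(marks_a, marks_b):
    """Fraction of questions on which two candidates received identical marks.
    (Illustrative measure only; a real routine would be more sophisticated.)"""
    matches = sum(1 for a, b in zip(marks_a, marks_b) if a == b)
    return matches / len(marks_a)

def flag_possible_collusion(centre_scripts, threshold=0.9):
    """Return candidate pairs within one centre whose per-question marks are
    suspiciously similar. `centre_scripts` maps candidate id -> list of
    per-question marks. A flagged pair warrants human inspection of the
    actual scripts, never an automatic accusation."""
    flagged = []
    for (id_a, a), (id_b, b) in combinations(centre_scripts.items(), 2):
        if similarity(a, b) >= threshold:
            flagged.append((id_a, id_b))
    return flagged

scripts = {
    "C001": [3, 0, 2, 4, 1, 5, 2, 0, 3, 4],
    "C002": [3, 0, 2, 4, 1, 5, 2, 0, 3, 3],  # nearly identical to C001
    "C003": [1, 2, 4, 0, 3, 2, 5, 1, 0, 2],
}
print(flag_possible_collusion(scripts))  # [('C001', 'C002')]
```

Note that the comparison is restricted to candidates within the same centre, mirroring the article's point that the screening flags schools and colleges rather than scanning all candidates nationally against each other.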
Another technological tool introduced to support marking is OCR’s digital repository, a new option for centres to submit candidates’ work for moderation digitally, replacing the hassle of posting it. If centres choose this method, large quantities of work can be submitted at the same time thanks to a bulk upload facility. Moderation forms a significant part of the exam system nowadays and supports assessments other than a final end-of-course exam. Subjects which involve coursework done under controlled conditions, or assessment by teachers of their own students’ work, are moderated. This enables the awarding body to check, based on a sample of work, that a teacher or tutor is marking all candidates correctly according to a set mark scheme. As well as providing a convenient system for schools and colleges, the digital repository gives OCR’s assessors easy access to the work.
The secure portal, Interchange, is the place for centres to download their exam results. Analysing exam results is a vital activity for teachers and trainers, helping them evaluate their own activities and plan for the future. Active Results is a sophisticated new tool to help them drill down into their students’ results in greater depth. It can provide micro-analysis of the performance of an individual in different questions across one paper, or the performance of a whole class in specific questions, giving the teacher guidance on where to focus teaching in the future. It also enables teachers and heads of department to make more strategic comparisons by looking at the performance of a group within their school or college, or of their whole institution against other institutions across the country, filtered by size, funding or specialism. Active Results was developed in close collaboration with teachers, so that the technology delivers what they want.
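Active Results itself is a proprietary tool, but the per-question drill-down it offers can be illustrated with a minimal sketch. The data layout and function names below are assumptions invented for illustration only.

```python
def question_averages(class_results):
    """Mean mark per question across a class.
    `class_results`: one list of per-question marks for each student."""
    n = len(class_results)
    return [sum(q) / n for q in zip(*class_results)]

def weakest_questions(class_results, k=2):
    """Question numbers (1-based) with the lowest class averages --
    candidates for extra teaching focus."""
    avgs = question_averages(class_results)
    ranked = sorted(range(len(avgs)), key=lambda i: avgs[i])
    return [i + 1 for i in ranked[:k]]

# Three students, four questions each (marks out of 5)
results = [
    [4, 1, 3, 5],
    [5, 0, 2, 4],
    [3, 2, 3, 5],
]
print(weakest_questions(results))  # [2, 3]
```

The same aggregation, run at department or national level instead of class level, gives the kind of institution-to-institution comparison the article describes.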
OCR has worked hard to improve the support package available to centres delivering its qualifications. As recently as this September, OCR joined forces with two major educational publishers, Oxford University Press and Hodder Education, to offer free e-versions of A Level textbooks to all schools and colleges. The offer was taken up with enthusiasm: over 140,000 eBooks were ordered in the first weeks of the launch. Schools and colleges are now able to load the Flash-based eBooks onto their virtual learning environments, thereby reducing the need to transport conventional textbooks. They can still buy the paper-based versions, but now at a 25% discount.
OCR has also won awards for the development of online tools designed to provide fun ways to reinforce subject knowledge. The World of Science complements OCR’s GCSE Science qualifications by using a ‘Sims-like’ game that challenges users to test their knowledge of topics such as the environment, ecology and energy. On a more serious note, OCR created a new digital learning device in a joint project with the NSPCC on the subject of safeguarding, to test the topic knowledge of people working with children who are taking one of OCR’s new care qualifications.
Facilitating communication between teachers delivering OCR qualifications across the country is also an important aim, and OCR has been one of the first major awarding bodies to create subject-specific e-communities. Of these, the IT e-community is unsurprisingly the largest, with over 1,500 members. It is a two-way channel of information between OCR and a wide range of contacts, and an opportunity for fellow IT teachers to share examples of good practice and support each other.
A large body of research is taking place in OCR and Cambridge Assessment on computer-based testing. OCR currently offers a choice of test formats, paper-based or on-screen, in eleven GCSE subjects. Electronic versions of tests for Key Skills in literacy, numeracy and ICT, Basic Skills in literacy and numeracy, plus awards in administration and health and social care are all available. Since the introduction of e-testing of Key and Basic Skills in 2005, over one million computer-based tests in these areas have been taken.
Many people assume that e-testing is the holy grail of technological advance in assessment. Computer-based testing is widely used in professional certification in the aviation, medicine, law and finance industries. But despite OCR’s adoption of technology in e-administration, e-marking and other areas of assessment, its primary focus is not to make all existing pen-and-paper tests electronic. The awarding body remains cautious about e-testing, as there are issues of practicality and principle to be overcome. Many schools with large cohorts, for example, do not yet have enough computers, or IT support staff, to allow all candidates to sit an exam at the same time – and exams need to be taken at the same time to be fair and to avoid the temptations of malpractice. Another issue, OCR believes, is ensuring that computer-based tests measure subject knowledge rather than IT skills. These issues need to be resolved before the wholesale adoption of computer-based testing.
OCR is working hard to identify ways in which technology can be applied to benefit the assessment process. With the help of the considerable research expertise within Cambridge Assessment, it keeps abreast of the latest developments and arguments about all aspects of e-assessment. In an increasingly technologically sophisticated world – with a young generation of candidates to match – OCR is keen to continue the debate about how technology can genuinely best support and enhance educational assessment.
If you enjoyed reading this article we invite you to join the Association for Learning Technology (ALT) as an individual member, and to encourage your own organisation to join ALT as an organisational or sponsoring member.
Good news, OCR & Cambridge Assessment finally aiming at not lagging behind Pearson & Edexcel. Interesting as it is, I’d appreciate an in-depth, objective (where possible) evaluation of the full range of e-assessment options within the market for a grounded judgement.
Hi Alex, OCR hasn’t produced the kind of evaluation that you are asking for, but perhaps there would be something helpful in either the Ofqual efutures site (http://toolkit.efutures.org/) or the research project looking at computer-based test content authoring and delivery undertaken by the University of Manchester (http://www.ukcdr.manchester.ac.uk/) a few years ago?