This paper offers an overview of peer-review exercises conducted using an online system over a period of more than nine years, covering different subjects, year levels, institutions, and methodologies.
The [XXX] system has actively supported online peer-review activities since 2007, and has permitted multi-institutional access since 2012.
In this time, it has supported 992 peer-review assignments devised by 266 instructors across 17 institutions on all continents, in a range of subjects: from Accounting to Anthropology, from Palaeontology to Pharmacology.
In the last two academic years alone, over 12,000 unique students wrote reviews of their peers’ work. Since Nicol et al. (2014) argue that the most significant outcome of peer-review activity results from students writing reviews, the effect of this system on the skill development of thousands of students is considerable.
[XXX] is offered free, worldwide, and is designed, developed, and maintained by the authors of this proposal: two academic members of Computing Science staff at a UK university.
Having complete control over the system, and direct communication with the instructors who use it, we are in a unique position to report on the range and scope of the peer-review activities, including trends, emerging issues and individualisation. During the past nine years we have adapted the system in response to several instructors’ individual requests, each request revealing the instructor’s perspective on the nature of a successful peer-review activity.
This proposal falls under the theme “Collaboration and innovation in the open: taking risks, sharing lessons and the importance of open educational practice”. Providing an overview of successful peer-review activities allows us to highlight the changing issues of most importance to instructors.
Our data include: raw system data (e.g. review methodology, parameter settings, review period), instructors’ questionnaire responses (contextual information, e.g. whether the activity was compulsory), email correspondence with instructors (identifying concerns and requests for system improvements), and system logs.
Our primary findings are: (a) instructors’ priorities have changed over time (e.g. with respect to regulation frameworks); (b) there is increasing interest in requiring students to engage with their feedback (e.g. by producing a modified version of their report); and (c) students are unwilling to engage in peer-review activity unless there is extrinsic reward (e.g. course credit).
This presentation reports the results of our investigation of all activities conducted using [XXX], highlighting trends in successful peer-review practice.
Nicol, D. et al. (2014) Rethinking feedback practices in higher education. Assessment & Evaluation in Higher Education, 39:1, pp. 102–122.
Nulty, D. (2011) Peer and self-assessment in the first year of university. Assessment & Evaluation in Higher Education, 36:5, pp. 493–507.
Ramm, D. et al. (2015) Learning clinical skills in the simulation suite. Nurse Education Today, 35:6, pp. 823–827.
Walker, J. and Sampson, V. (2013) Argument-Driven Inquiry. Journal of Chemical Education, 90:10, pp. 1269–1274.