Rethinking Assessments with Generative AI: Strategies and Approaches
Co-authored by Ian Miller, Elisabetta Lando and ChatGPT.
The rise of generative AI has sparked a significant debate across the educational sector. Many teaching practitioners are grappling with the challenge of students using AI for formal assessments and presenting the work as their own, which prevents an accurate picture of students’ abilities and understanding. However, a blanket ban on AI is not the solution, argues instructional technology expert Dr. Helen Crompton in April’s Active Learning SIG webinar. Instead, we need to rethink and redesign assessments to leverage the power of generative AI as a learning tool, working with AI rather than against it.
Assessments have traditionally focused on testing what students know, but there have been long-standing arguments for a radical overhaul of assessment methods. Back in 2019, Simone Buitendijk of Imperial College stated in a TES interview that for active learning methodologies, such as the flipped classroom, to truly take off, assessments would need to evolve to reflect their interactive nature. Buitendijk went on to say that it would “require a change in the wider ecosystem to accelerate the uptake” (TES, October 2019).
AI is indeed a significant change within that wider ecosystem. Assessments need to shift radically, from merely testing knowledge to evaluating the capabilities or competencies needed to apply that knowledge. Active learning methodologies, including those that use AI itself, offer a promising avenue for addressing some of these challenges.
Key challenges:
- ChatGPT and other AI language models can easily generate written content like essays that may get passed off as a student’s original work.
- Typical plagiarism detection tools struggle to identify AI-generated text reliably.
- AI models can introduce biases and inaccuracies, and frequently omit proper citations, when generating content automatically.
- Now, more than ever, digital critical thinking skills need to be explicitly embedded in all areas of education.
Dr. Crompton suggests some short-term strategies that, while not foolproof, make cheating more difficult:
- Require citations and references to be included and verify their accuracy manually.
- Ask students to connect their work directly to class discussions, readings, and experiences.
- Encourage collaboration, group work, and peer feedback as part of the assignment.
Note: Some of these tactics, such as requiring citations and references, may eventually be outsmarted by AI, so they should be considered temporary measures.
The long-term solution, however, remains one of reimagining assessment:
- Use real-world, authentic tasks like problem-solving case studies.
- Incorporate practical components and multi-stage activities that are harder to automate.
- Have students critique AI-generated content to build critical analysis skills.
- Conduct debates with AI to develop argumentative reasoning abilities.
- Use AI for mock job interviews to practise communicating complex topics.
The key is creating active learning experiences that demonstrate deeper understanding, beyond simply reciting information. This aligns with the concept of “making thinking visible” (OU innovation reports, 2020), which lies at the heart of active learning methodologies (and indeed of this ALTSIG’s remit). However, “making thinking visible” can pose a significant challenge for educators, who often juggle time constraints and large student cohorts. Furthermore, using generative AI to create active learning assessments requires educators to master the AI tools themselves and to learn to write prompts that generate assessments aligned with learning outcomes. This process will take time, and institutions must provide the necessary support for the development of these essential staff digital skills.
While initially disruptive, generative AI also brings positives: some even suggest it could spell the end of essay mills. Embracing the technology’s capabilities prudently, and with proper support, could enrich student learning in powerful new ways.