This house believes that Education has control of the technology, policy and processes being used (Part 1 of 2)

Report on the live debate on (learning) technologies and Artificial Intelligence (AI) hosted by ALT East England.

by Uwe Richter, Neil Dixon (Anglia Ruskin University), Rob Howe (University of Northampton)

A panel of academic staff, professional service staff, sector experts, and students met on 21st June 2023 to discuss the rapid pace of AI and related technologies and the extent to which institutions were still in control.

This first part of the blog post reports on the panel’s responses to three questions: have we lost control, is AI different from other technologies, and is AI a curse?

The second part will discuss whether AI is a gift, report on keeping up to date through professional development, and cover the final vote on whether education has control.

The debate was moderated by Michael Webb (Director of Technology and Analytics) from the Jisc National Centre for Artificial Intelligence. The debate panel was:

FOR:

Dr Lee Machado (Professor of Molecular Medicine at Northampton)

Jatin Arora (Head of the Student AI Society at Northampton)

Prof. Simon Walker (Eden Centre, LSE)

Nirvana Yarger (ARU student studying MA Education with Montessori)

AGAINST:

Karl Downing (Digital Development Lead – IT Services at Northampton)

Dr Andrew Cox (Senior Lecturer, University of Sheffield) 

Dr Shaun Le Boutillier (Assessment and Feedback Practice Lead, ARU) 

Randall Lister (ARU student studying MA Education)

The discussion started by considering whether universities have control over the technology they use. The FOR group felt that they do, although it takes time to catch up with some technological changes. Universities are long-standing institutions which have always adapted to technological, economic and societal change. To retain control, there are several areas universities need to address: first, how and where students can use their own technology; second, developing standards to be followed; and finally, academic integrity and assessment. For example, “for doing some really kind of proper work with students, which might include helping them with an ethical framework to interpret what they’re getting from AI, and of course understanding how the discipline will be involved in all of that” (Panel member). They concluded that change will take time, and that universities need to adapt and possibly streamline their processes to gain or maintain control of the technology.

The AGAINST group argued that institutions have lost control of the technology. There are factors which indicate that technology is beyond the control of universities. Firstly, we are experiencing emotions like confusion and helplessness, an immediate reaction which indicates that we have lost control. Secondly, we cannot say no to technology and do not have a choice in its use, therefore we are not in control. Technologies are “influencing what information we get and how we use it and maybe ChatGPT just alerts us even more strongly to that” (Panel member).

Technological advances such as translation and transcription services have been introduced and improved over the years, but these are not always sufficient for educational purposes. However, the group felt we appear to have lost control over plagiarism detection, which is currently insufficiently accurate and is likely to become less reliable over time.

Both groups then considered whether AI was different from other technologies, feeling that on balance it probably was. Artificial intelligence such as chatbots seems easier to subvert and influence into regurgitating inappropriate content, whereas with traditional web technologies it was easier to limit the distribution of such content. An example is Wikipedia, where we had more autonomy and editorial control over the content. On the other hand, personalisation is seen as a benefit of AI that can be used to enhance education: for example, exams and teaching content can be more personalised, reducing standardisation, and AI has made this much easier to achieve.

The panel then moved on to consider whether AI is a gift or curse.

“Generative AI poses substantial challenges to education, not least to assessment, but it also provides opportunities for supporting learners and learning. Unsurprisingly it has also acted as a stimulus for the education sector to examine all aspects of what and how it teaches and to reevaluate assessment practices, things that have remained largely static for a very long time”  Heidi Fraser-Krauss, chief executive officer, Jisc (Webb et al, 2023)

Whilst there is often a black-and-white discussion around AI, the answers are actually more nuanced. During 2023, there has been a proliferation of AI tools: some require a separate sign-up, whilst others are integrated into existing products such as Microsoft 365. Most of these will have been developed on particular datasets and carry particular programming biases. The ‘black box’ nature of some of these tools (systems whose workings are not explainable) means that they produce an output based on whatever information and questions are asked of them. The output may seem very realistic and valid but may contain biases from the AI system’s training dataset. In some cases, the output may simply be a hallucination (factual errors presented as truth). If these are not detected or checked by a human, they may be interpreted as real rather than fabricated by the system. Left unchecked, these inaccuracies may become part of new information and then be referenced as fact. In this respect, AI could be seen as a curse with the potential to undermine existing knowledge sources and the credibility of anyone who relies on this information.

Furthermore, many of the established AI companies, such as Microsoft, Google and OpenAI, are very protective of what data is aggregated within their systems and how it will be used. This leads to the possibility of unintentional data breaches, where information submitted for one purpose may be shared in a way which was not intended or approved. Discussions around copyright and intellectual property of the output are also prominent, as material aggregated by AI systems may be repurposed for new outputs without any clarity about the original owner.

ALT EE would like to thank all the debate panel members, both staff and students, and the debate moderator Michael Webb, for making the event happen and for their engaging discussions. The second part is now available.

References

Webb, M., Attewell, S., Moule, T., Nicholson, H. and Fraz, A., (2023). Artificial intelligence (AI) in tertiary education. (3rd ed). Jisc. Available at: https://beta.jisc.ac.uk/reports/artificial-intelligence-in-tertiary-education (Accessed 17 October 2023)
