Notes from ARLT SIG 7th June 2023 webinar – Achieving inclusive education using AI with Dr Olatunde Durowoju

by Dr Teeroumanee Nadan

This blog was originally posted on 22 June 2023 on Dr Teeroumanee Nadan’s blog teeroumaneenadan.com.

In this blog post I provide a short summary of the Q&A session and additional links for further reading, including previous ARLT SIG events related to the topic. Sadly, there is no captioning embedded in the recording, and YouTube captioning is one of those technological features that will fail an inclusivity test.

Summary of Question & Answer

Note: This is not a word-for-word transcript of the video; it is an edited version.

Q1. Is there any data on what kinds of assessments are more equitable for disadvantaged students?

Context is very important. While undertaking this project, there was no data anywhere on the performance of students with regard to specific assessment types. We ran a statistical analysis of historical performance, looking at instant assessments, for example exams, and deferred assessments, for example where you ask students to write a report and give them the assessment brief well in advance. Group assessments and individual assessments were also looked at. Students were asked about the level of anxiety they felt towards group and individual assessments, and there were clear differences in perception between the various student groups. It is important to know how the data was disaggregated. It is not enough to just say “ethnically diverse”: there are several groups within that category with different perceptions, and that also needs to be taken into consideration. Your institution is the first place to go for that sort of data, and there is also a need to start aggregating that sort of data across universities.

One type of assessment we found to be really problematic is exams. Both student groups, White and Ethnically Diverse, expressed a level of anxiety about exams, and when we looked at the performance data, the average mark for exams was very low compared to the other assessment types. When you then look at the two groups, you can see a huge disparity: exams were the area with the biggest gap. For obvious reasons, in an exam you do not know the questions beforehand and you are given a limited period of time; for non-native English speakers, it is always challenging to interpret what is required of you. There was anecdotal evidence suggesting that exams would be the problem, and through that project we were able to confirm that this was actually the case.
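
As an illustration of the kind of disaggregation described above, here is a minimal sketch assuming a hypothetical CSV of anonymised marks with columns student_group, assessment_type and mark; the file name, column names and group labels are placeholders rather than the project's actual data.

```python
# Minimal sketch: disaggregating assessment performance by student group.
# Assumes a hypothetical CSV of anonymised marks with columns:
# student_group, assessment_type, mark.
import pandas as pd

results = pd.read_csv("assessment_results.csv")  # hypothetical file

# Mean mark per student group for each assessment type
by_type = (
    results.groupby(["assessment_type", "student_group"])["mark"]
    .mean()
    .unstack("student_group")
)

# Awarding gap per assessment type (group labels are illustrative)
by_type["gap"] = by_type["White"] - by_type["Ethnically Diverse"]

# Assessment types with the largest gap appear first
print(by_type.sort_values("gap", ascending=False))
```

Sorting by the gap makes it easier to spot which assessment types, such as exams, are driving the disparity.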

Q2. If ChatGPT has been trained on ‘biased’ data, how does it decolonise anything?

Decolonisation is far more wide-reaching than the mere search for textbooks from different countries or different cultures. I asked ChatGPT to give me one of the leading texts from, for example, Chile. It returned an abstract of the book and also gave me a translation of that abstract. My next step was then to ask where I could find this textbook. At the moment we do not have the right solution, but there are a few ways we can use ChatGPT in the spirit of decolonisation. ChatGPT is generative AI; it uses statistical learning to present text from lots of sources of information. That is very dangerous: because it is biased, it is more likely to give you biased information, and people can also fool ChatGPT into responding in a certain way. If we ask ChatGPT to look for a book, we still need to verify things ourselves, because ChatGPT cannot address decolonisation in the truest sense at the moment; but in terms of looking for texts from different countries, we can use it with caution.
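
As a rough illustration of using ChatGPT with caution in this way, here is a minimal sketch assuming the OpenAI Python client (v1.x) and an API key in the environment; the model name and prompt are placeholders, and anything the model returns still needs to be verified against a library catalogue.

```python
# Rough sketch: asking a ChatGPT model for a leading text from another country.
# Assumes the openai package (v1.x) and OPENAI_API_KEY set in the environment;
# the model name and prompt below are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[
        {
            "role": "user",
            "content": (
                "Suggest one of the leading textbooks in my subject written by "
                "an author from Chile, with a one-paragraph abstract translated "
                "into English."
            ),
        }
    ],
)

# The reply is generated text, not a verified bibliographic record:
# check the title, author and availability yourself before relying on it.
print(response.choices[0].message.content)
```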

Q3. Comparing sources and using ChatGPT to inform understanding suggests that everyone will have equal access to technology – but is that the case?

ChatGPT is a conversational AI technology. There is a disparity in terms of access to digital technologies, of course, when you look at it from that sort of ethical point of view. When COVID hit, we realised that many of our international students, for example, did not have access to laptops; many of them were doing assessments on their phones. When we provided laptops for students, they were all taken very quickly. So we have this group of students without access to all the digital technologies that we have. How do we ensure that students have access to this technology? If I am in a class and say "go on ChatGPT", many of them have a phone and can access it that way. The digital divide is still a big issue. Using AI as a conversational tool to get information is flawed at the moment, and that is an area we need to continue to look into, but at least there is still some form of access.

Q4. Have you seen that gap dropping as a result of all of this work that you have actually done?

The project is still at a very early stage. We have identified some of the gaps, and at the moment we have just introduced different solutions. One of them is a peer mentorship programme that we are rolling out across all faculties, where students in the second or final year mentor first-year students. We have not implemented the solution for long enough to be able to say it is having an impact. We need to give it at least one more year before we start to see whether there is any sort of shift.

Q5. How do you assess authenticity?

You have to look at historical performance data for your students. For us, it was very clear that exams were the biggest driver of disparity. I think we have an over-reliance on written text, and we need to start moving away from that. There are two lenses we can use here. The first lens is the one people have been talking about: how we diversify assessments. The other lens looks at what exactly we are measuring: is it just regurgitation of knowledge, or is it more skill-based? I am not saying that we should completely take exams out of the picture; what I am saying is that we cannot over-rely on them. When you look at the life of a student, the journey throughout university, what percentage of their assessment is written exams? What proportion of the credits? Then we can start to build a picture: are we relying on one kind of assessment more than the others? That is absolutely necessary moving forward.
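
To make that "build a picture" step concrete, here is a minimal sketch with hypothetical module names, credit weightings and assessment types, showing how to work out what proportion of a programme's credits is assessed by each assessment type.

```python
# Minimal sketch: proportion of a programme's credits per assessment type.
# The modules, credits and assessment types below are hypothetical.
modules = [
    {"module": "Module A", "credits": 20, "assessment": "written exam"},
    {"module": "Module B", "credits": 20, "assessment": "report"},
    {"module": "Module C", "credits": 20, "assessment": "written exam"},
    {"module": "Module D", "credits": 40, "assessment": "project"},
]

total_credits = sum(m["credits"] for m in modules)

# Sum credits by assessment type
share = {}
for m in modules:
    share[m["assessment"]] = share.get(m["assessment"], 0) + m["credits"]

# A single type taking a large share signals over-reliance on it
for assessment, credits in share.items():
    print(f"{assessment}: {credits / total_credits:.0%} of credits")
```
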
Further comments
  • Chimamanda Ngozi Adichie: The danger of a single story | TED https://www.youtube.com/watch?v=D9Ihs241zeg
  • It’s worth bearing in mind that ChatGPT does not support many minority languages, so even when asked to search for resources from a given country, it might not be able to access them because it does not support the language
  • Do AI companies share information about the provenance of, and biases in, their training data? If not, why not?
  • Richer students have always been able to pay for private tutoring, and they will also be able to afford the best paid-for versions of AI (e.g. GPT-4)
  • If you collect data, especially student experience data, it must be clear how it will be used, and students must see the impact of completing those surveys/questionnaires – making a difference or activating change [during their student life journey]
Recording & Presentation
YouTube recording for Olatunde Durowoju’s presentation https://www.youtube.com/watch?v=yO913NCCMzw

Download Olatunde Durowoju’s presentation slides (https://bit.ly/3Jv9fVD)

Further Reading

Notes from ARLT SIG 3rd Nov meeting – Anti-racist Approaches in Technology with Liza Layne (https://bit.ly/3ktSPTw)
