We must no longer stick our heads in the sand

Setting an agenda for Generative AI in education

Artificial intelligence. Photo: Pixabay

The release of ChatGPT by OpenAI in November 2022 divided educators. While some embraced generative AI, others decided to ban it from the classroom altogether. Utrecht University adopted a “hybrid” policy that prohibited the use of generative AI (GenAI) unless otherwise indicated by course coordinators. This approach turns the issue into an individual problem and places the responsibility on teaching staff, who must also be knowledgeable enough about these tools to make such decisions. Nonetheless, it correctly acknowledges that a “one-size-fits-all” solution does not exist, emphasising the importance of tailoring policies to specific learning goals.

A lot has been said and tried over the past year and a half, often based on anecdotal experience and gut feeling. Still, our understanding of daily AI practices among students and teaching staff remains limited, with very few empirical insights. In February 2024, we conducted a survey to get a better overview of these practices: how UU students and teachers use generative AI in education, how they view the technology, which risks they identify, and how competent they judge themselves in its use. Respondents were also asked about the support and training that students and teachers receive from the university. The survey was completed by 1,981 respondents from the various faculties of Utrecht University: 1,633 students and 348 teachers. It was carried out in the context of the GenAI in Education project of the Utrecht Education Incentive Fund.

First, we must no longer stick our heads in the sand. While we should remain critical and sceptical of these commercial tools, we must acknowledge that GenAI is here to stay. The survey reveals that the majority of students (86 percent) and teaching staff (84 percent) have used GenAI tools like ChatGPT. Students (9 percent) seem less likely to pay for GenAI tools than teaching staff (12 percent). There are, however, differences between faculties: at the Faculty of Law, Economics and Governance, 22 percent of students have paid subscriptions, whereas only 6 percent of staff members do. This is a worrying development, given that paid versions (such as ChatGPT Plus) outperform their free counterparts. An additional challenge is that these tools are continuously evolving, requiring ongoing efforts to understand their functions and map student practices.

Second, we need to start an open conversation among teaching staff and between teaching staff and students. The survey indicates that the topic is polarising, with some students and teaching staff strongly opposed. Indeed, there are many valid arguments against the use of AI in education, such as concerns about privacy, inequality, misinformation, bias, and the environment. Its implementation requires deliberation and oversight. Posting guidelines and tutorials online is not going to solve the problem if we cannot even agree on GenAI’s place in education. We should have that conversation and push for the development of alternatives to commercial GenAI tools that align more closely with our values!

One notable yet concerning observation is that some students report “fearmongering” and a “fear culture” around GenAI. They think it’s important to remove the stigma currently attached to using AI tools. That these conversations are dominated by fraud is understandable but tiring. We shouldn’t start from the assumption that students are only out to plagiarise or commit fraud. There can be legitimate reasons to use GenAI, and many uses have parallels to earlier practices: brainstorming with friends, getting feedback from colleagues, and using spelling and grammar checkers. Moreover, student work has always been the outcome of interactions with other people and technologies. Maintaining academic integrity is not impossible with GenAI, but the conditions of use need to be critically and openly discussed.

Third, it is necessary to determine what types of skills and competencies we want to teach future generations. The survey shows that students use GenAI for a variety of purposes: brainstorming and inspiration, writing support, programming, and searching for information, but also for understanding course material and as a personal teaching assistant (for feedback, for example). These findings raise important questions about the communicative skills we should teach students, as well as whether they can critically evaluate AI-generated output and will maintain the ability to read academic texts in the future. Are they currently over-relying on GenAI without understanding its limitations? Predicting words is not reasoning!

Fourth, we should seize this opportunity to collectively rethink education in general. From workshops and conversations with colleagues, we found that many of the issues we encounter and need to address have very little to do with these technologies per se. The real challenge is that GenAI highlights structural weaknesses in existing curricula, where at times the focus has shifted from supporting a process to grading a product. Take the writing of a final paper for a course as an example. If that process is supported and supervised throughout its various phases, then the use of generative AI poses less of a risk. Numerous bottom-up initiatives aimed at exploring and experimenting with GenAI are underway, yet they are scattered and isolated across different courses, departments, and faculties.

Finally, GenAI policies should be clearer, and students and staff require education on its ethical and responsible use. The majority of survey respondents perceive a lack of clarity in GenAI policies at Utrecht University. In the open-ended questions, students and teaching staff also indicated a need for GenAI training.

In 2024, all bachelor’s programmes within the Faculty of Humanities are introducing a digital learning trajectory, in which tool criticism (the critical reflection on the ethics of digital tools and on how they shape knowledge production) will be part of the three-year curriculum. This approach could potentially be scaled, in adjusted form, to other faculties. Teaching staff also need training with, and access to, these tools (which may also allow for better negotiation of terms). While education is the core business of a university, the BKO and SKO programmes have positioned educational training as isolated events instead of promoting the ongoing improvement of teaching abilities and professional development.

What is clear is that in some places GenAI might be useful, while in others it poses serious risks to the quality of education (and research). We need to support staff in tackling this challenge and educate students about GenAI’s limitations and ethical considerations.
