Academic departments across the country are having the same circular conversations about AI. Staff are anxious about students’ AI use, expressing concerns over academic integrity and the implications of AI as a shortcut in the learning process.
Yet beneath these discussions lie unspoken assumptions about why students use AI, with little meaningful dialogue with them about the use and role of AI in university teaching and learning.
We wanted to ask students directly how they use AI, how they believe it could be used in teaching and learning, and how it shapes their university experience. We spoke to 15 students from the School of Politics and International Relations at the University of Nottingham, including both undergraduate (years 1-3) and postgraduate taught students.
The most common explanation for AI use given by students is one that staff don’t talk about in their meetings: the deteriorating student-staff relationship.
Student explanations for AI use
Students are using AI for various aspects of their studies, including identifying and summarising readings, producing essay plans and even writing their assessments. The students we spoke to offered a range of explanations for why they use AI in these ways. For some, FOMO plays a role: they see their peers using AI to score good marks while investing less time and effort into their coursework.
However, by far the most common explanation, and one consistent across year groups, was that students feel more comfortable using AI than approaching academic staff. When asked why, students offered three explanations.
First, students described using AI in their day-to-day lives as an “unbiased” mentor that aids decision-making and provides companionship. They noted that AI does not judge their questions, so they avoid the anxiety of wondering whether they are asking something “too basic” or “too simple”, unlike when approaching academic staff.
Next, students said that using AI is far more efficient and helpful than asking tutors for support. They explained that they want quick answers to their questions, yet it can be difficult to get a response from an academic that is both timely and helpful.
We repeatedly heard stories about staff giving overly complicated explanations, or of students attempting to book office hours only to find all of the available slots were taken. In contrast, AI is immediately available at any time, and students can ask it to “dumb down” explanations without fear of shame or embarrassment until they get an answer they consider “sufficient”.
The final reason was around students fearing they were being treated “like numbers” by academics – in contrast to AI, which was seen as always polite, eager to help and, crucially, very validating. AI’s over-validation of its users is a well-documented issue. However, the appeal of this validation to students who feel neglected by academic staff and who desire positive reinforcement is significant. This dynamic encourages sustained AI use rather than engagement with tutors.
The deeper issue
The common variable across these explanations was that student AI use is driven, at least in part, by a deteriorating student-staff relationship.
The first explanation for students’ AI use that we heard aligns with broader shifts in social interaction: one in three young people turn to AI for guidance and support, and one in ten say they prefer talking to AI over talking to other people. But the anxiety expressed about approaching academic staff for help shows that the student-staff relationship is an important underpinning cause. Given this anxiety, combined with how embedded AI has become in students’ daily lives, it is unsurprising that they feel more comfortable turning to AI for support in their studies than approaching academic staff.
The second and third explanations emphasise the structural issues facing UK universities, and their significant impact on the student-staff relationship: students’ AI use reflects a diminishing capacity for academic staff to support students in a way that feels valuable and accessible.
When asked to elaborate, various students said that they felt academics were too busy to offer meaningful support or take the time to break down complex ideas into more digestible chunks. This reflects broader concerns about academic workloads and their knock-on effect on student experience, which are ultimately shaping student-staff relations.
The third explanation builds on this by highlighting what students see as the heart of the student-staff relationship and yet increasingly absent in contemporary higher education: being treated and respected as individuals. With increasing student-staff ratios, growing staff workloads, and young people already turning to AI for guidance and support, it’s increasingly difficult for staff to get to know their students and offer the individualised learning experience they desire.
Meanwhile, AI’s tendency to “over-validate”, combined with its immediate availability to provide bespoke responses, means that students are forming stronger relationships with AI than with academic staff. The issue here is that AI’s over-validation is enabling students rather than empowering them: empowering students to learn and develop critical thinking requires academic staff to challenge their arguments and views. Being challenged in this way is more uncomfortable for students, especially in the absence of a well-established relationship with their tutors.
Where do we go from here?
If we want to appeal to students and draw them away from a problematic overreliance on AI, we need to consider the student-staff relationship. Rather than focussing on concerns over academic integrity, conversations around AI should focus on addressing students’ doubts about engaging with their tutors, increasing staff availability and prioritising efforts to recognise students’ individual learning needs. This will require meaningful change at the institutional level to give staff the capacity to work effectively towards improving the student-staff relationship.
Fortunately, it’s not all doom and gloom. Students consistently emphasised the importance of a human connection and offered examples of what this looks like. They stressed the value of academic staff teaching within their areas of expertise, where their enthusiasm and passion are evident and help to motivate students to engage with their course. Students also told us that they felt far more comfortable approaching staff with questions when those staff appeared genuinely enthusiastic about teaching and being in the classroom.
If we want to address students’ AI use and have genuinely useful, informed conversations, we need to begin by talking to our students. Recognising that addressing the challenge of increasing student AI use starts with strengthening the student-staff relationship, staff and institutions need to lean into their passion for their subjects and embed that enthusiasm into their teaching. Based on what students told us, this is the most effective way to draw them away from an enabling overreliance on AI and to empower them in their education.