Disengaged AI use is a symptom of the structural squeeze on choice

As universities cut optional modules to balance the books, Mack Marshall finds that restricted choice is one of the clearest correlates of disengaged AI use

Mack Marshall is Wonkhe SUs’ Community and Policy Officer

There’s something that comes up repeatedly in our focus group transcripts – never in quite the same words, but always the same thing.

A student starts off describing an assignment. The module, or the way it’s assessed, doesn’t feel like it fits. It doesn’t feel connected to why they came to university. And then the deadline approaches. And AI becomes the obvious solution – not to learn, but to get through, to cope.

One cybersecurity student, describing an assignment on incident response, explained this clearly:

I put the assignment into my favourite LLM. Then I put the rubric in as well, and I asked it how I should format the assignment, how I should write it. I basically just asked it how to structure it, because structure in writing is really my issue. And with the rubric, I guess it’s gonna help me structure in a way that’s gonna get a very good grade.

His summary of what he was actually doing – taking his ideas and using AI to convert them into something that would hit the marking criteria – is a decent description of a rational allocation of effort in a system where interest and assessment have come apart.

What the numbers show

In the run-up to this year’s Secret Life of Students, we set out to get underneath the AI adoption statistics that already exist and ask other, more interesting questions. Do students feel they have actually learned from what they have produced? What are they weighing up when they decide how to use AI on a specific piece of work? And what wider factors are associated with (problematic) AI usage? Our research on AI was never really about misconduct and cheating – it was about returning to the purpose and practice of learning, and the role of assessment.

When we looked at the survey data in Trained to Stop Learning (from 1,055 students across 52 providers, weighted for gender and level of study) we found a clear gradient between intellectual engagement and AI use.

Among students who said their course was “not very often” intellectually stimulating, 73.6 per cent reported using AI tools on their assessed work. Among those who found it “very often” stimulating, that figure fell to 57.3 per cent.

But the more telling difference wasn’t whether students used AI, it was how.

Less stimulated students were more than twice as likely to use AI for drafting and writing (13.8 per cent versus 6.0 per cent), and significantly more likely to use it for planning and brainstorming (34.5 per cent versus 22.8 per cent). They weren’t reaching for AI to understand their subject better. They were reaching for it to get the thing done – for them, AI tools were about efficiency.

Students told us the same thing in their own words. In the focus groups and free text comments, a connection between compulsory modules, absence of choice, and heavier AI reliance came up repeatedly:

I didn’t really find it intellectually stimulating in first year, probably because the modules were compulsory and there was no choice involved. I just found it boring.

The modules I’m not interested in tend to get a “that will do” attitude.

Some compulsory modules can be very boring, and modules don’t accurately portray what’s going to be taught.

Students were describing something quite specific – not a general disengagement from their degree, but a targeted withdrawal from the parts of it over which they had no agency, and with it a disconnection from the reasons they enrolled. Where they lacked choice and interest, they sought to “get through” it.

What it looks like in practice

A game development student described how a shared design-theory assignment had landed differently across disciplines. The assessment was oriented around design thinking that sat outside the usual ways of working for both programmers and artists. The consequence was concrete:

There were some instances of programmers using it to write the introduction for their essays, and artists using it to just explain the essay itself.

Neither group was dismissing the value of learning. They were caught between the kind of thinking a module required and the kind of thinking they had come to university to develop.

A business management student described a compulsory statistics module she’d had no say in taking. She’d used AI to generate worked examples and then copied the approach, without following the underlying logic.

I just needed to get through it. I don’t use stats in any of my other modules – I’m never going to need to know this. So I basically let it do the working and I formatted it to look like mine. I probably couldn’t explain any of it now.

A second-year history undergrad described a core module on historiography that she found abstract and disconnected from the periods she’d chosen to study. She’d had no option to take something else.

I didn’t even really understand what argument I was supposed to be making, so I described the essay question to copilot and asked it what the response should look like. Then I basically wrote around that. I don’t think I learned anything…yeah I just learned what the module wanted to hear.

Another student was more analytical about what made the difference. He described genuinely engaged AI use on modules he cared about – using it to interrogate ideas, fact-check his own thinking, and push his understanding further. But he was explicit that this didn’t apply everywhere:

There are some unrelated courses or uninteresting modules in my course that I wouldn’t want to spend so much time on. But if it’s the major stuff, that’s where I learn more.

Not all of this disengagement announces itself as boredom.

Sometimes it’s a vaguer sense that an assignment exists for the institution’s benefit rather than the student’s. One physics student described a placement reflection she’d done at the last minute – she rated her learning confidence at 5 or 6, and said she still didn’t understand what the brief had actually wanted from her:

It seems like it was for their benefit, not for myself.

In her case AI wasn’t the outlet – guidance on whether she could use it was absent, so she didn’t – but the purposelessness she described is recognisably the same state of mind.

And sometimes it’s lower-grade than that. A children’s nursing student, reflecting on her last assignment, put it simply:

You do them because you kind of have to. You don’t really look back on it and learn from it.

“Something to get through?” someone else asked. She agreed.

The timing couldn’t be worse

None of this would matter quite so much if module choice were holding steady. It isn’t.

OfS polling published earlier this year found that 83 per cent of students who had noticed cost-cutting at their institution felt a gap between the experience they believed had been promised at enrolment and what they actually received.

Larger class sizes, greater use of online delivery, and reduced choice all featured. One student’s summary was:

My course is now totally different to when I came.

The financial mechanism behind those changes has been tracked in detail. OfS’ own sustainability reporting has recorded a third consecutive year of declining surpluses and liquidity, with 43 per cent of institutions expecting a deficit for 2024-25.

A Universities UK survey published in May 2025, drawing on responses from 60 institutions, found that 46 per cent had removed optional module choices in response to financial pressure – up from 29 per cent the year before.

The instinct behind consolidation is often coherence – modules that build on each other rather than running in parallel.

But our data suggests it’s ownership, not design, that protects against disengagement – and consolidation that removes elective choice trades one problem for a worse one.

In other words, module choice is contracting at exactly the moment our research shows that restricted choice is one of the drivers of disengaged AI use.

The students who told us they reached for AI on modules they didn’t choose, found boring, or couldn’t see the point of are not describing a hypothetical. They are describing a curriculum that is, across the sector, becoming more compulsory, more consolidated, and fundamentally less their own.

A symptom, not a cause

When we track intellectual stimulation across three years of successive polling waves, low stimulation doesn’t just predict different AI use. It predicts almost every negative outcome we measure, with gaps of 20 to 36 percentage points on each one.

Whether students feel part of a learning community, whether they trust assessment to test understanding rather than performance, whether they believe staff are accessible and responsive, whether the course matches what was advertised, and whether they feel confident about their career – all of these show significant gaps.

In this wave, students who find their course unstimulating are more than twice as likely to agree that you can get good grades without understanding the material (55 per cent versus 23 per cent), and nearly three times as likely to say there’s a gap between what their course says it values and what it actually rewards (64 per cent versus 23 per cent). Career confidence scores drop more than two full points on a ten-point scale. And life satisfaction and sense of purpose drop with them.

As with many other findings in Trained to Stop Learning, disengaged AI use is but one symptom of something much wider.

When students are stuck with content they didn’t choose, churning out outputs they can’t see the point of, they don’t just reach for AI – they also check out of their learning community, lose confidence in their future, stop trusting the assessment, and conclude that gaming the system is a more rational strategy than engaging with it.

It’s not so much that “bored students use ChatGPT.”

It’s that the structural squeeze on choice produces a cascade – disengagement, alienation, cynicism, collapsing confidence. AI use is just one way of handling that cascade without damaging their prospects, one that likely frees up time for more rewarding activities, or for the additional pressures and responsibilities students are carrying.

1 Comment
Matt Robb

The examples here show something else – something also highlighted in Jim’s presentation at the Secret Life of Students conference. Some – possibly many – students feel like they are “doing the work” (the reading) and then using AI “just for structuring” or drafting.

This is the worst possible use of AI. Reading and synthesising is a commodity skill for AI. If you feel like you’re “doing the work” by reading, and then outsourcing the structuring, synthesis, and decisions about significance, you’re completely wrong: you’re doing the lowest-value work and outsourcing the highest-value work.

It shows how important it is that HEIs draw up AI use policies that encourage and explain where and why AI is useful – and that they measure gains in critical thinking and creativity, so they know whether their students are actually improving these skills.