There’s a strain of thinking in current debates about AI and higher education that I find alarming.
It runs roughly like this – if AI can produce an essay, then essay-writing was never really valuable.
If a machine can summarise the reading, then doing the reading yourself was just inefficiency.
If the output can be replicated, the human labour that produced it was a kind of pretence all along.
Across schools, colleges and universities, institutions are hitting a big red panic button and shouting that the robots are taking over their homework.
Whilst I agree there should be a focus on understanding built through human connection, I’m not convinced that the replicability of essays and reading strips this work of its value, that something “harder to automate” is therefore “more valuable to develop,” or that we should on any level embrace that replacement as a productivity gain.
The biggest danger AI introduces to education is feeding an obsession with efficiency, in the process losing sight of the inherent worth of learning.
The currency problem
Some argue that our fear of AI stems from how it reveals our value was never in what we knew, but in other people not knowing it. This is only true, though, when knowledge – specifically facts – is treated as a currency.
Under these conditions, only some non-fiction writers, not artists or authors or animators or scriptwriters, have their “value” threatened.
What makes any piece of writing readable is the relatability, the stories from a lifetime as a lens to view information through, the application of those anecdotes to the topic – allowing a reader a unique insight into the writer’s perspective and a sense of conversation with them.
Of course, AI could learn to write as if it had a human lifetime to draw on, but only if it were allowed to fabricate the small human moments.
Copywriters and the small community of people who write manuals for new smart speakers might find their work entirely and indistinguishably replicated by AI. But this only proves that texts completely devoid of the seasoning of an author’s voice – texts designed to sound anonymous and universally applicable – fit an argument that treats the value of writing as nothing more than a transaction of information.
The measurement trap
There’s a reasonable critique to be made of an oversimplified view that producing something proves you learned something. However, this doesn’t mean the thing produced is therefore worthless.
The idea that AI might prompt education to think of new ways to judge learning could be a positive thing – but that doesn’t mean the original wasn’t worth judging.
“If the activities can be replaced by a machine, what were they measuring?” is a question blinded by the assumption that the only purpose or value of these activities was measurement.
To say essay-writing skills aren’t valuable is to proclaim that interpreting and analysing others’ work – the basis for practically all development of ideas ever – isn’t valuable.
AI can never (in its current capabilities as a language model fed by plagiarism and predictions) read text, watch a film, experience a political event, and come up with an entirely new response or interpretation, sparking future original developments – scientific or creative.
Only years of learning how to properly cut up and reassemble a text into an essay allows me to write this now and respond to arguments in an interesting way that adds to the discussion.
I feel lucky to be part of the last generation to go through most of our education without AI being used by our teachers and peers, but being level with the wave of change means I also have to experience its takeover, and the suffocation of anyone resisting its use.
If we accept that the skills I’ve learnt are now not worth teaching simply because there’s another way to achieve the same outcome, every generation after me will be condemned to gradually lose their abilities of interpretation.
The content delivery fallacy
AI making facts (when they are facts) and frameworks accessible is a good thing, and could benefit education. Tailored revision guidance based on analysis of your test results? Great.
Computers’ best uses have always been as apathetic analysts, churning through data millions of times faster than we can – ever since the Bombe that Turing designed to break Enigma, since the first calculators.
But to doggedly clamp your hands over your eyes and ears in every other space AI invades, specifically generative AI, is to ignore the tonnes of deadly ice floating below the surface. Although in this case it isn’t hiding underwater – it’s plainly and enthusiastically advertised as a productivity miracle by tech-oligarch-funded ads, tech-oligarch-funded thinktanks, and even the OBR.
Saying that AI could benefit education by removing or diminishing an unequal scarcity of knowledge is true in many cases, but not by summarising lectures into 300-word paragraphs devoid of nuance or connection, or replacing classes in essay technique with tutorials on “how to phrase a prompt to maximise outcomes.”
“Content delivery” can only be “automated” if the only job of lecturers is to read out facts in a monotone voice and respond to questions with a quick glance at a flowchart of possible answers.
Some acknowledge this – that the human connection between student and teacher is still necessary – but we must extend the same logic to the connection between a writer and a reader before letting AI summarise texts or write new ones.
We didn’t say storytelling was dead when books were invented – we still celebrated hearing someone lead us through the twists of a tale in an even more engaging way.
The students who suggest that, in writing an essay, you can simply “google a quote that supports your argument” with “no idea what the text means” might get full marks – but why even bother going to university if you don’t want to learn what the greatest thinkers in the subject say about it?
Similarly, to read only AI summaries of texts or lectures and say you’ve understood them is like watching an animated summary of Fight Club and saying you’ve seen it, or reading a review of Romeo and Juliet and saying you’ve been to the theatre.
The cinematography, the pacing and time spent with the work, the sonnets and the screams, will never be conveyed in the facts alone.
Go tell your grandma
Education shouldn’t be an exercise in the fastest way to inject you with the information required to become an efficient worker. AI being able to do some things humans can do only causes a “confrontation with value” if you judge someone’s value on their productivity alone, compared with every other available way to do the task.
Go tell your grandma that the jumper she’s knitting you for Christmas is worthless, her skills “obsolete,” because there’s a machine that can do it faster.
What causes people to question the value of university isn’t actually AI but uncertainty as to the ongoing purpose of higher education – whether universities exist to sort and qualify, or to form and transform.
AI doesn’t “force the question” – it simply adds a point to the argument of people who say unis are just a library for youths to sit and debate the meaning of life for a few years before graduating to a job in marketing. Look, they can say, a computer can do everything you can, so everything you do is worthless. You can watch a movie about Moldova, so you shouldn’t bother visiting.
Why should the value of a task only be determined by its efficiency and the perceived economic or social value of the outcome? Is there no point learning or developing unless you become something that can be consumed by society? And if there’s a cog that runs smoother than you, will the machine that is our work-centric world simply discard you and upgrade?
The forgotten purpose
Can we truly believe universities only serve the two purposes of “access to well-paid jobs” and “coming of age”? If so, maybe they’ve simply become sorting machines, checkboxes for climbing the capitalist ladder. Has the pursuit of knowledge become worthless?
Google’s inescapable AI “summary” tool said universities were for “career prospects” and “personal growth,” surprising no one in parroting the obsession with productivity.
Maybe the crisis of AI’s implications for education, the nihilism about the purpose of universities, could be abated if we stopped thinking of them as a rung on the ladder to more work, more money, and more work, and more money until you reach the top and can be buried in the highest, most expensive graveyard.
You can die safe in the knowledge that no one will ever have to read your work as you intended, or spend time with it, because it’s already been absorbed into the system and will be regurgitated without your consent into any response someone might ask an AI for, slightly altered to fulfil their biases.
Buddhists don’t become monks and meditate silently, patiently, for years to reach nirvana, just to put on their CV that they’ve been there, done that, and would now like to apply for a role further along the Eightfold Path.
The first university, the University of Al-Qarawiyyin, began as a mosque and, like its religious teachings, treated learning as a process of understanding and growth – study and research for its own sake.
When did we forget that learning had inherent worth? If we were to remember that, and act accordingly, AI would never be able to threaten that common goal.