Artificial Intelligence has come a long way, from Turing to Alexa and now ChatGPT – but what does all of this mean for students?
The concept of AI dates back to the 1950s when pioneers like Alan Turing and John McCarthy laid the groundwork for the field. Early AI efforts aimed to create machines that could mimic human intelligence, solving complex problems and learning from data.
However, progress was slow, and AI experienced periods of hype and disillusionment, known as “AI winters.” It was not until recent years that advancements in computing power, big data, and machine learning algorithms propelled AI into the forefront.
As a recent graduate, I have seen the rise of AI first-hand, from a passing comment to full assignments being written by AI tools.
Entering the role of a student leader opened my eyes to the larger picture of AI. The tools aren’t just an assignment production line, but an enabler for students. With AI, students can produce code in seconds, detailed AI-generated art, and complex reflections.
But for some students this is not the only use of AI – many depend on its assistive technology capabilities, using tools like Grammarly, Microsoft Dictate, and speech recognition to access their education.
For a number of student leaders across the sector, it’s clear that there’s both uncertainty and fear surrounding AI tools, with worries that their use stalls student progression – and that somehow the use of AI makes teaching staff redundant.
It’s a classic and understandable fear, one that has accompanied many other technological advances in the past. But the truth can be different – in many cases, staff and students who adopt AI have deeper-level conversations, explore learning together, and redefine their understanding of knowledge.
The problem is that the rapidity of its development has created a void in processes, with many uncertain about what the future holds.
The ChatGPT revolution raises huge questions for universities – how we assess, how we teach, how we enable equitable access to education, how we demonstrate and signal learning gain, and how we run the ongoing, over-competitive rat race in a way that students feel is fair.
Students and their representatives are keen to grapple with and embrace the puzzles this all generates. But some of the student leaders we speak to report struggling to engage on the issues with depth, and even where there’s a working group a tendency to focus excessively on threats or opportunities – but rarely both.
Generative AI and large language models are here to stay, and more and more students are using ChatGPT and similar tools as a method to cope when trying to balance life and study.
The conversation, as well as the technology, is constantly changing. As student leaders we of course encourage students to “produce and own their own work” – but we are very aware of the changing landscape, and the more our members embed the tools into their lives, the harder it is for them to know what is right and what is wrong.
Muddling through on teaching and learning is one thing – but allegations of academic misconduct are not suited to shifting sands. Some policies are out of date. Some are being “reinterpreted” in ways that students can’t reasonably be expected to have anticipated.
In some cases new policies or clauses are popping up overnight, even while students are in the middle of creating work to be assessed. And new research highlights real risks when tools (and humans) naively assume they can detect AI use.
We also know that students are more anxious than ever, and want their universities to provide as much certainty as possible. So can the circle be squared?
The integration of AI in universities has unlocked a vast array of opportunities to enhance learning experiences, support staff, and streamline administrative processes. From personalised learning to automated grading, AI-powered solutions are transforming higher education. As we look to the future, the potential for AI in universities is immense, promising new frontiers in virtual learning, intelligent campuses, and beyond.
But not only does the sector need to navigate these advancements ethically and urgently, it also needs to ensure that it doesn’t treat students as passengers when the pilots are struggling.
Students will forgive uncertainty and inconsistency if they’re treated as partners, involved in discussions both at institutional and programme level that reflect the way they live their lives now.