Eve Alcock is Director of Public Affairs at QAA

Ailsa Crum is Director of Membership, Quality Enhancement and Standards at QAA

Nick Watmough is a Quality Enhancement and Standards Specialist at QAA

Earlier this month, the governor of the Bank of England told the BBC that AI wouldn’t massively increase unemployment. He argued that in the face of technological revolutions “economies adapt, jobs adapt, and we learn to work with it.”

There won’t be fewer jobs, but there will be different ones. In January, MIT Technology Review recalled, in this context, that in 1938 MIT’s then president Karl Compton had argued that emergent technologies have “created so many new industries” that “technological unemployment is a myth.”

Today, as our post-industrial revolution sees once inviolable white-collar professions fall into step with the march of the machines, another Dr Compton – King’s College London’s Martin Compton – has stressed that universities must be unafraid to engage with the employment opportunities offered by new tech, to prepare students “for the world they’re actually going to be living in.”

The increasingly common use of AI tools is causing rapid shifts in the labour market. A key challenge for policymakers, educators and researchers is to identify which areas of work will be most affected by this technology.

People and skills

Will demand increase for creative and critical communicators, highlighting those “people skills” sometimes portrayed as among the benefits offered by arts, humanities and social science degrees?

Or will those “soft” transferable skills be all too easily simulated by AI tools, underlining our need for those rigours traditionally associated with scientific and technological innovators, people able to ask the insightful questions the tech can’t?

Or might these tools instead disrupt and transform the disciplinary paradigms of public discourse, deconstructing and resolving the dichotomies of such popular stereotypes?

In any case, it looks like successful graduates will increasingly need to operate in a hybrid workplace. What, then, is the role of higher education in preparing those graduates for that workplace? And how can it most effectively use AI tools to do so?

There are opportunities for such tools to help marginalised and differently abled students achieve their potential in their studies and their careers. But this will require not only (as a recent HEPI report proposed) that providers subsidise access to the most effective tools, but also that employers continue that support – to avoid deepening a digital divide which exacerbates disadvantage.

Meanwhile, such studies as one of our new Collaborative Enhancement Projects led by the University of Bath – exploring ways of “making human learning visible in a world of invisible generative AI” – raise broader questions beyond the scope of our reflections here. What, after all, are the things we should hold onto if we are to avoid adapting so far that we break something precious – something innate to our disciplines, our cultures and our humanity – that may not easily mend?

Thinking alongside the bots

AI isn’t all about academic integrity, although the volume of concerns in this area has sometimes felt overwhelming. In fact, higher education is increasingly embracing progressive learning, teaching and assessment strategies designed to prepare students for AI-enhanced professional environments.

Sarah Eaton has envisaged a “post-plagiarism era where we can’t know where the human ends and AI begins” – one in which hybrid outputs are the norm. To accept that, we must acknowledge the ways in which artificial intelligence might change the ways we understand concepts like cheating, and what constitutes good learning.

It may not be enough to define proper uses of these tools in generic institutional guidelines (such as the advice offered by Oregon State University). This may instead need to be tailored in individual assessment briefs.

For professionally recognised and accredited awards, this work needs to be undertaken in collaboration with professional, statutory and regulatory bodies (PSRBs). This is a matter of particular urgency in such areas as health and social care, where students need to be able to demonstrate fitness to practise at the point of graduation.

Discipline-specific considerations are crucial. Medicine, for instance, where the risk of harm is high, may need an initial emphasis on more traditional approaches to build foundational skills and promote professional ethics.

Other disciplines also call for particular consideration. Last month, Stuart Nicholson, Beverley Gibbs and Manajit Chakraborty (from the Dyson Institute of Engineering and Technology) wrote on Wonkhe that we ought to exercise “special caution in the integration of generative AI, specifically in two main areas: creative work (which includes engineering), and work-based learning”.

AI-assisted graduates

Yet despite such reservations, those providers which embrace this tech to prepare their students for a hybrid workplace will produce graduates who may, with AI assistance, significantly increase the productivity of their organisations. This will make them remarkably smart investments for employers. It may only be unprecedented financial constraints which have so far prevented providers launching into an all-out sectoral race to become the best-equipped to ready Gen Z – and then Gen Alpha – for Gen AI.

Educators must help their students build the skills to use these tools to find answers effectively and understand how they got there, to critique disinformation and to appreciate originality. That’s something which will require closer collaborations and more robust transitions between providers at all levels.

Providers will also have to grow closer to labour markets and workplace trends. Provider and employer relationships need to be strengthened, as HE steps up to its crucial role in preparing inquisitive, digitally literate graduates capable of adapting to huge changes over the courses of their lives.

Progress will require rapid investment in CPD to help staff develop effective pedagogies and assessments. To support the establishment of strategies to underpin institutions’ investment priorities, in terms of both human and physical resources, sector bodies should encourage providers to debate the challenges and potential rewards to their staff, students and graduates of working to lead this field.

The integration of these technologies into learning, teaching and assessment should soon ensure that the hybrid nature of the workplace is anticipated in the hybrid nature of educational models – models which, from traditional and vocational degrees through to apprenticeships and lifelong learning initiatives, encompass the acquisition, understanding and application of skills essential for people of all ages, backgrounds and abilities to flourish in this fast-evolving environment.

The Quality Assurance Agency’s latest edition of Quality Compass, on the topic of navigating the complexities of the artificial intelligence era in higher education, can be downloaded here.
