Stepping up to the leadership challenge of AI

The rise of AI creates a testing time for higher education leadership. Shân Wareing reflects on what a leader might do in response - and why it matters

Shân Wareing is vice chancellor of Middlesex University London

AI is everywhere in the media and HE conversations at present, and it’s possibly more embedded already in our personal and professional lives than we are aware.

It’s a testing moment for leadership both in wider society and in higher education. Tech developments and particularly artificial intelligence require leaders who can tolerate uncertainty, accept they are not experts and that they don’t need to be experts in order to lead.

We need leaders who, amid the ambiguity and complexity inherent in AI debates, are able to chart a constructive and optimistic route for staff and students, despite their very different experiences, skills and expectations of digital change.

Leadership failure in the face of external challenge is common. One such failure is delay: a refusal to set a clear direction in the face of complexity. The Johnson government, via the Covid inquiry, has provided us with the image of a shopping trolley swerving without a plan. The animal kingdom offers the frog swimming in a pan of steadily warming water, oblivious to its existential danger, or the perennial ostrich with its head in the sand. Ignoring AI while waiting to find out what all the fuss is about is an approach we don’t want.

Another unwanted response would be the use of AI as an opportunity for personal gain, revving up the situation into a crisis with the aim of disrupting the stabilising processes of governance, and frightening people into rapid ill-informed decisions, to accrue more personal power. We might think of Frank Underwood, fictional US President in House of Cards saying “We don’t submit to terror. We make the terror,” or Supreme Chancellor Palpatine persuading the Senate in Star Wars to vote for his emergency powers, both leaders who generate and harness fear to shape the actions of frightened people.

In higher education, this technique is observable in sales pitches and reports whose authors stand to profit, for example, the 2013 report An Avalanche is Coming, which predicted the replacement of local physical universities by Massive Open Online Courses (MOOCs) and called for immediate action.

To complete this trio of undesirable leadership approaches, we do well to avoid blaming staff or students for a university’s inability to rise to the challenges of AI, by suggesting their aversion to risk, their laziness, or their lack of skills or imagination are at the root of the problem. This is the equivalent of saying “I’m unable to select and develop people, set realistic goals, motivate people or create an environment in which they can thrive” and, we might add, “thus demonstrating I am unfit for this leadership role I hold.”

These three approaches all sidestep the responsibilities of leaders for their students and staff, and indeed leaders’ roles in the stewardship of higher education through global changes.

Being better

So if you were aiming to lead well in a world which includes AI, what would be the requirements?

Leaders must want to develop their staff, to support them and see good outcomes for them, as well as for students. A leader, like a teacher, creates an environment for growth, establishes safety during change, and offers a vision of a possible future that we can create together. In practice this means finding out what staff already know about AI and how they are using it or managing its use, and drawing more staff into learning about, utilising and regulating the use of AI, and into supporting their colleagues.

Successfully creating a holistic and community-owned university response to AI means bringing together staff and students with different expertise. Learning technologists, IT students, journalists, multimedia artists, and many more groups of students and staff will have well informed views about AI in their spheres of expertise. It can be a welcome opportunity for students and junior staff to cut their leadership teeth, and for co-creation with students and staff.

A leader needs to balance what is urgent with allocating a reasonable amount of time to bring about effective long term change. The ability to address pressing needs while setting realistic time frames is important for giving staff the space to work with a sense of purpose but not panic. To achieve this, a leader needs to understand the tech well enough, but more importantly, they need to understand risk management, strategy and change.

And, like Canute demonstrating his inability to command the waves, leaders need to accept that there will be aspects of AI we can’t regulate and that lie outside our sphere of influence. We need to mitigate the risks of these rather than implement futile knee-jerk bans on usage. Again, leaders need sufficient understanding to know which aspects of AI this applies to. Anyone thinking they can ban the use of AI in student assignments is – to return to the animal kingdom for an image – shutting the stable door after the horse has well and truly bolted.

Why it matters

In the midst of profound uncertainty about the impact of AI on the future of work, on the nature of knowledge and education, on research and the creative industries, and with the apocalyptic visions of the business entrepreneurs reverberating around us, our university leaders have a responsibility to chart a course which maintains the values and mission of higher education and moves us forward, while being slow to blame staff and students for lack of progress and looking first to our own leadership responsibilities.

This responsibility is more profound than we might always fully appreciate. On Armistice Day, my family was discussing the two minute silence and the importance of remembering the sacrifice of those who fought in the world wars and other conflicts – sacrifice that has enabled us to live the way we do now. When we uphold education, social mobility through access to professional work, independent thought and creativity, we act with respect towards those who fought and died in conflicts for the principles of democratic society.

AI potentially threatens those principles: it may end or reduce certain kinds of work which may or may not be replaced by other work opportunities; it may undermine the integrity of news reporting, and the role of the creative arts in challenging the status quo, by recycling and amalgamating existing text and images. Or it could be harnessed to support and improve democracy and social justice. University leaders therefore have a responsibility to engage with AI and shape wider society’s response to it. In doing so we honour the legacy of those who fought or died for our way of life.