Learning lessons from medical imaging about what AI transformation really means

Gary F Fisher and Dean Fido explain what the experience of medical imaging professionals can teach academics about the future impacts and opportunities of generative AI

Gary F. Fisher is the University of Derby’s Learning Design and Online Practice Manager

Dean Fido is an associate professor of Forensic Psychology at the University of Derby

It’s just over a year since OpenAI released its large language model ChatGPT for open use. Higher education, like many other sectors, was left reeling from the potential of this new tool. Within a week of ChatGPT’s public release, the first articles appeared outlining the risk it could pose to assessment integrity.

This dialogue swiftly moved from decrying the risks to extolling the opportunities, with academic practitioners from multiple institutions and disciplines identifying innovative ways of deploying ChatGPT and similar generative AI tools to support different areas of academic practice. So useful and effective did these tools appear that some academic practitioners found themselves asking the question many professions have confronted in the face of the ever-evolving capabilities and proliferation of AI: will AI replace me?

This, evidently, has not happened. Nor does it seem likely that academic professionals will find their jobs at risk from the proliferation of open artificial intelligence tools. Rather, ChatGPT and similar tools seem to be assimilating into the broader landscape of HE practice, as users discover that capabilities which once seemed so impressive have limitations that it falls to the individual user to recognise and work around. The question remains, then: if ChatGPT won’t replace academics, what will it do? What impact, if any, will generative AI have on the academic profession?

New roles for humans

To answer this question, it is useful to look beyond the walls of the academy towards another profession: radiography, and the wider field of medical imaging. For upwards of a decade now, this field has arguably been at the forefront of applying and assimilating artificial intelligence technologies into professional practice within medicine and allied healthcare. Medical imaging professionals have found that, through the application of various commercial and open artificial intelligence tools, professional competencies that were once the sole domain of the human professional can – to an extent – be “outsourced” to an artificial actor.

Many of the commonplace, process-driven tasks executed by professional radiographers, such as the analysis and reporting of images, the adjustment of radiation dosage, and the planning of a patient’s treatment, can be achieved more quickly, and in some cases to a higher standard of consistency, when performed with the support of medical AI than when performed solely by human professionals.

Voices within the medical imaging field have thus spent the last decade grappling with the same existential question so many professions face: will their role continue to exist in the face of advancing technologies?

Radiographers do still exist. In fact, there is a notable shortfall of trained radiographers, with approximately 10.5 per cent of radiographer posts within the UK unfilled. This is because the medical imaging professional has not been replaced by AI. Rather, they have been augmented, informed and, most importantly, changed as a result of its application and assimilation. Rather than disappearing as certain commonplace tasks and processes have been outsourced to AI, the role of human professionals has expanded in a new direction. Their responsibilities have shifted towards steering these tools, inputting commands, interpreting and quality-assuring results, and communicating outcomes to patients. Importantly, as increasing numbers of radiologists, oncologists and radiographers become involved in clinical trials using AI, they have retained and expanded their position as knowledge creators – a realm that extends beyond the capabilities of large language models.

Community analysis

This transformation did not happen overnight. A search using the terms “medical imaging” and “artificial intelligence” yields thousands of articles and publications rigorously examining the proficiency and reliability of AI tools when applied to different areas of practice. For over a decade, the medical imaging community has been engaged in a continual, iterative dialogue evaluating the exact capabilities and limitations of the tools at their disposal. It has calculated in granular detail precisely which elements of the profession can be supported by an AI and which must necessarily be completed by a human actor.

This ongoing dialogue has built a bank of knowledge that ensures that when imaging professionals trust AI to perform a certain task – augmenting their capacity and freeing them to direct their attention elsewhere – they can be confident in the validity of that decision. This meticulous care is necessary because, after all, this is healthcare. Lives are at stake.

As higher education continues on its own journey with generative AI, the experience of the medical imaging community provides both insight into the shape of things to come and, potentially, a call to action.

This development within the imaging community has been possible only through rigorous, precise and granular analysis of the specific capabilities of AI tools across different aspects of clinical practice, identifying not just opportunities but also limitations and risks. Only through this robust dialogue has the field been able to fully exploit the opportunities AI poses to augment its working capacity in the face of high workloads and limited staffing. Many in the higher education sector face these same challenges, with 30 per cent of staff reporting daily symptoms of burnout.

For us to follow their example, we need to evaluate precisely what AI can do in higher education settings and what, at present, it cannot do. We should exploit the opportunities posed by AI to positively reshape our roles for our benefit and, most importantly, for the benefit of our students.
