AI could be our radiologists of the future, amid a healthcare staff crisis

There are more radiological scans than ever, but too few radiologists to interpret them. The Medical Futurist, CC BY

It is almost 40 years since a full-body magnetic resonance imaging machine was used for the first time to scan a patient and generate diagnostic-quality images. The scanner and the signal processing methods needed to produce an image were devised by a team of medical physicists including John Mallard, Jim Hutchinson, Bill Edelstein and Tom Redpath at the University of Aberdeen, leading to the widespread use of MRI scanners, now ubiquitous in radiology departments across the world.

MRI was a game changer in medical diagnostics because it didn’t require exposure to ionising radiation (such as X-rays) and could generate images of multiple cross-sections of the body with superb definition of soft tissues. This allowed, for example, the direct visualisation of the spinal cord for the first time.

Most people today will have undergone an MRI or know somebody who has. Along with the other tools available to radiologists, MRI has become essential to confirm the extent of disease, to identify whether a patient has responded to treatment, to demonstrate complications and, in some cases, to guide intervention.

But radiology has become a victim of its own success, with an exponential rise in the number of imaging examinations requested within increasingly complex healthcare systems that serve an ageing population. Demand outstrips the supply of radiographers and radiologists available to produce these scans in publicly funded healthcare systems such as the NHS.

In Scotland, in particular, the number of consultant radiologists has flat-lined over the past ten years, while the range and complexity of imaging methods grows with each generation of scanners. Radiologists are running in order to stand still, with even the most efficient departments outsourcing some of their workload to external agencies.

Interpreting the extraordinary detail from MRI scans is something that could be automated using AI. mgdtgd, CC BY-SA

The potential and problems of AI

Meanwhile, innovators in industry have seen the potential opportunities that artificial intelligence (AI) might bring to healthcare, particularly radiology and pathology, which are based on digital images. Machine learning algorithms fed with large numbers of past diagnoses can generate new rules for classifying scans based on those examples. Applying this technique to diagnostic scans is known as radiomics.
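As a toy illustration of the idea, and emphatically not clinical software: the feature names, values and labels below are invented, but they show how a classifier can label a new scan by comparing its reduced feature vector with those of previously diagnosed examples.

```python
import math

# Each past scan is reduced to a small "radiomic" feature vector
# (here, two invented features: mean intensity and a texture score),
# paired with its confirmed diagnosis.
labelled_scans = [
    ((0.82, 0.10), "normal"),
    ((0.79, 0.15), "normal"),
    ((0.45, 0.60), "abnormal"),
    ((0.40, 0.72), "abnormal"),
]

def classify(features):
    """Label a new scan by its nearest past example (1-nearest-neighbour)."""
    _, label = min(labelled_scans,
                   key=lambda example: math.dist(example[0], features))
    return label

print(classify((0.80, 0.12)))  # → normal
print(classify((0.43, 0.65)))  # → abnormal
```

Real radiomics pipelines extract hundreds of quantitative features per scan and train far more sophisticated models, but the principle is the same: past examples, not hand-written rules, determine how a new scan is classified.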

A barrier to wider use is the lack of secure access to the sensitive patient data needed to develop and test AI models. Another is the public’s lack of trust in new methods – even though computerised decision-making in healthcare dates back to the early 1970s. Finally, there is the problem of evaluating new methods against real-world data.

We might ask whether we need artificial intelligence in patient care at all. But the power of these new techniques could offer huge opportunities. No matter how skilled, humans are subject to fatigue, boredom and regular interruptions, and it is then that errors can occur.

Machines can work without tiring, but whether they can make intuitive decisions, or draw on years of experience to recognise when an abnormality poses an urgent risk, is unknown. Even without relying on artificial intelligence for complex matters, using it for mundane tasks such as booking appointments, allocating staff and equipment, prioritising radiologists’ jobs, or incorporating data from healthcare records would free up clinicians’ time for other tasks.
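One of those mundane tasks, prioritising a reporting worklist, can be sketched with a simple priority queue. The urgency scores and study names here are invented for illustration; a real system would derive urgency from referral details and clinical context.

```python
import heapq

# Hypothetical reporting worklist: each waiting scan gets an urgency
# score (lower = more urgent); a priority queue hands the most urgent
# study to the next free radiologist.
worklist = []
for urgency, study in [(3, "routine knee MRI"),
                       (1, "suspected stroke CT"),
                       (2, "chest X-ray from A&E")]:
    heapq.heappush(worklist, (urgency, study))

reporting_order = []
while worklist:
    urgency, study = heapq.heappop(worklist)
    reporting_order.append(study)

print(reporting_order)
# → ['suspected stroke CT', 'chest X-ray from A&E', 'routine knee MRI']
```

Even this trivial ordering step, done automatically, spares a clinician from manually re-sorting a queue every time a new urgent case arrives.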

A testbed for future healthcare

In the UK, iCAIRD, the Industrial Centre for Artificial Intelligence Research in Digital Diagnostics, brings together experts from the Universities of Aberdeen, Edinburgh, Glasgow and St Andrews with the NHS and industry partners such as Canon and Philips in a £15m centre based in Glasgow.

Launched last year, the project will test how artificial intelligence algorithms compare with human expertise by providing secure access to anonymised clinical data in areas including breast cancer screening, stroke diagnosis and treatment, chest X-rays from A&E, and cervical and endometrial cancer pathology. Using an established approach for secure access to anonymised images, reports and relevant clinical data, AI researchers will be able to develop and test their methods. iCAIRD will also create a national digital pathology database.

Cancer care typically involves multidisciplinary team meetings between clinicians from different specialisms: in the same way, the aim at iCAIRD is that multiple artificial intelligence applications can be integrated to create an AI-based virtual multidisciplinary team meeting, where knowledge from radiology and pathology can direct personalised management of cancer patients.

Just as new drugs must be properly evaluated before use, so must new artificial intelligence methods. We are fortunate to be able, through iCAIRD, to evaluate the performance of these new algorithms with real-world data. It is clearly crucial to bring the public along on this journey of evaluating AI as a potential solution.

Any new way of working is likely to come at a price, whether that is profit for the firms developing AI (just as the pharmaceutical industry profits from new drugs) or a cost to the public in the loss of absolute patient data privacy. How to balance these, and ensure good governance of AI in healthcare, should be a matter for public debate, not the role of a single sector or a handful of companies.

Ultimately the benefits will be maximised if we, as healthcare staff, patients and members of the public, are involved in determining the direction of the journey. The responsibility lies with us all.

Alison Murray, Roland Sutton Professor and Chair of Radiology, University of Aberdeen

This article is republished from The Conversation under a Creative Commons license. Read the original article.
