- NHSX-commissioned investigation identifies “scepticism” among clinicians and patients over health technology
- NHS urged to develop “robust approach” towards informing public on use of technology in healthcare
- Public concerned over losing “emotional judgement” of doctors to automated services
A report commissioned by NHSX found a belief among patients and clinicians that new technology was sometimes foisted on the NHS to “satisfy a political or commercial imperative” rather than improve healthcare for patients and clinicians.
The research on behalf of health and social care secretary Matt Hancock’s new centralised tech agency added: “Understanding and agreeing on purpose of technological implementation is the starting point for a more constructive conversation.”
The report “Patient AI” – published by the Royal Society for Arts, Manufactures and Commerce (RSA) – urges the NHS to develop a “robust approach” to informing the public how technology is used in healthcare.
Last year the Information Commissioner’s Office ruled that the Royal Free London NHS Foundation Trust’s trial of Google’s DeepMind system failed to comply with data protection law.
The report also said medics can view “radical” digital technology as a “second class intervention” that weakens doctor-patient relationships.
The RSA was commissioned by NHSX to investigate the use of AI in healthcare in order to improve understanding of how commissioners, clinicians and patients interact with technology.
The research – carried out last year – included interviews with clinicians who are involved with introducing new technologies into the NHS.
The RSA noted there was a “lack of clarity” among those interviewed over where technology should be introduced in the health system and how it would be managed.
AI uses computer systems to carry out tasks or solve problems that usually require human intelligence.
One trial examined in the research used AI to identify patients, often with long-term conditions, who were at risk of an unplanned hospital admission. These patients then received coaching from nurses for up to six months to help them take greater control of their health.
However, the RSA investigation concluded that if AI is to be adopted “at scale” across the NHS then patients need to be better informed on the use of technology in healthcare.
During a separate piece of research referred to in Patient AI, an NHS clinical chair told the RSA: “I had one patient ask me: what if the radiologist goes against the AI system? Does the patient have a right to know that the radiologist has gone against the AI and, if so, why they went against it? And if so, and the AI is right and the doctor is wrong, what happens then? And vice versa, who is responsible?”
The report noted that members of the public are “highly worried” about losing the “emotional judgement” of doctors and nurses if automation is used more widely across the NHS.
It also stressed that the public and patients need to be involved in research so that the need for new technologies can be properly identified.
The report states: “Indeed, the public were often noted to be sceptical. Radical digital technologies were often seen as ‘second class interventions’ used to save time and money, while diminishing doctor-patient relationships.
“Many interviewees expressed concern about these trends. They spoke of the need to involve the public and patients in research, design and use, so that the tools and services being deployed within the NHS are ones that people want, rather than what a commissioner thinks service users want.”
The report went on to say: “There are considerable challenges among the clinical community. Interviewees discussing technology in patient-facing roles suggested that their colleagues at times see digital interventions as second-class interventions in certain contexts.”
The RSA concluded that new automated technologies need to be built on a “robust evidence base” if they are to be trusted by patients and clinicians. It also called for the NHS to set up a network of clinical AI champions to help shape attitudes and practices towards innovations.
NHS England was approached for comment.