The fortnightly newsletter that unpacks system leaders’ priorities for digital technology and the impact they are having on delivering health services. Contact Ben Heather in confidence here.

To use a computer in the NHS is to know frustration.

Multiple logins, slow speeds, the “spinning wheel of doom”. The complaints are many and oft-repeated, from the health secretary down.

Sometimes staff have not been properly trained. Sometimes the software is deficient. Sometimes it is both.

Often, these shortcomings are no more than an irritant in the smooth running of a hospital. But sometimes people die who might otherwise have lived.

Last week, HSJ revealed NHS Pathways, a piece of NHS software used to help triage millions of patient calls to 999 and 111 every year, had been implicated in the deaths of 11 people.

In many of the cases examined by coroners, there were failures in how the software was used: poor staff training, basic human error. But in all these cases there was also evidence suggesting the software was deficient. In at least three cases, evidence was provided to the coroner indicating the deficiencies were already known months, sometimes years, before people died.

This is not the first time that concerns about NHS Pathways have been raised. Nor is it the first time questions have been asked about whether, when mistakes are made, improvements follow.

Russell Hibberd, whose six-year-old son Sebastian Hibberd died after NHS Pathways failed to identify an obstruction of his bowel, told HSJ the management of the software was “farcical”.

After sitting through the inquest into his son’s death, he said he felt NHS Digital was “resistant to change”, discounting evidence its product had failed in its core task of identifying the seriousness of Sebastian’s condition. NHS Digital has said it assessed the evidence “to ensure that any necessary lessons are learned”.

With algorithms increasingly taking over clinical heavy lifting in the NHS, the story of NHS Pathways holds important lessons for the future and how we ensure digital technology prevents, rather than contributes to, patient harm.

Early wobbles

NHS Pathways was created in 2008 by Connecting for Health (now effectively NHS Digital) as a digital tool to allow call handlers at 999 services to direct patients to GPs and other health services rather than send an ambulance.

However, the software really took off as the main (and soon only) system behind the NHS 111 service that rolled out from 2011.

Concerns about its effectiveness were raised as early as 2012, but initially the criticism was that it was too risk-averse. In 2012, HSJ revealed that NHS 111 services using the software actually increased the number of ambulance call-outs (a 2018 study eventually found 999 services that used NHS Pathways were less risk-averse than the alternative system). In 2013, a review found widespread concerns about the effectiveness of the software. In Bristol, local GPs reported “numerous” inappropriate responses, including the police being called to the death of a palliative care patient, which appeared to be due to Pathways.

In 2014, an NHS England review criticised the governance group overseeing NHS Pathways for meeting too infrequently and responding slowly to 111/999 providers’ concerns. It said there was a lack of clarity about what was expected of 111 or 999 services using the software.

By 2015, stories were emerging about NHS Pathways not picking up symptoms of serious illness. NHS Pathways advice was implicated in the death of baby William Mead, who died of sepsis after both NHS 111 and GPs failed to pick up on his illness.

Learning from mistakes

NHS Pathways is the most frequently used clinical support software in the country. It triages 16 million patients a year, advising call handlers with no clinical expertise whether a patient’s symptoms require an immediate ambulance response or a visit to the GP the next day. It covers more than 800 symptom pathways.

However, the way NHS Pathways is governed is opaque, so much so that coroners have often sent reports raising concerns about the software to the wrong organisation.

An examination of NHS Digital board papers and minutes shows that since 2015 there has been almost no discussion at board level of NHS Pathways, or of the safety concerns raised by software users. The only references are to whether it was on budget and to service outages. NHS Digital says it assesses every concern within 24 hours and updates the content of NHS Pathways twice a year, but there has been no broader safety review of the system since at least 2015.

It is even unclear who runs the NHS Pathways team. About 18 months ago, NHS Digital tried to recruit a clinical director for NHS Pathways, but there has been no public mention of a person in this role since. Deputy clinical director Darren Worwood has responded to the more recent coronial concerns raised about the software. NHS England has overall responsibility for the software but, since the 2014 review, there is little record of proactive oversight.

The Royal College of GPs hosts a governance group that is meant to independently review NHS Pathways’ clinical advice. But what precisely it reviews is unclear. When approached by HSJ, the RCGP was unable to say how many of the 11 coroners’ reports raising concerns about NHS Pathways uncovered by HSJ the group had considered (the closest it got was “some but not all”). It was also unable to say what changes the group recommended after the deaths it did know about, referring questions about its independent functions to NHS Digital. The RCGP was not even able to list the group members’ names, only the royal colleges to which they belong.

This is the same group that, in 2014, NHS England criticised for only meeting twice a year and not responding quickly enough to concerns raised by providers. It now meets three times a year.

Nothing to see

NHS Digital has taken exception to concerns raised about its responsiveness to NHS Pathways complaints. It has said changes were made after concerns were raised and, when changes were not made, it was only after careful consideration had deemed them unnecessary. It has often drawn attention to local failures in care where the fault lay with local 999/111 services (its more detailed response to each death can be found here).

On two occasions, the agency told coroners concerns raised about NHS Pathways, based on ambulance trusts’ evidence after a patient’s death, were so baseless they should be struck from the record.

It has also said it has acted on all NHS England’s recommendations from the 2014 review, including setting up a new “service management board” that has been meeting every three months since July 2016.

Who oversees digital clinical safety?

It is difficult to assess these claims: whether a piece of clinical software was safe when ambulance trusts said it was not, whether changes were made when ambulance trusts said they were not, and whether the changes were made quickly enough when ambulance trusts have said they were not.

But it is increasingly important that someone does.

Currently, the job of assessing the safety of clinical software hovers somewhere between NHS Digital itself, the Care Quality Commission and the Medicines and Healthcare products Regulatory Agency.

NHS Digital, as the owner of NHS Pathways, is clearly conflicted in this instance, while the CQC only regulates how software is used, not the product itself. The MHRA has such a light-touch approach to regulating clinical software that it does not require companies to disclose their product’s name (NHS Digital’s unnamed software is listed here).

Health secretary Matt Hancock has pledged stronger safety regulation for clinical software but so far there have been few specifics, other than a testing “sandbox” for software developers.

If there is an overriding theme to HSJ’s investigation into NHS Pathways, it is fragmentation. Coroners write reports in isolation, unaware colleagues in other parts of the country are doing the same. Ambulance trusts log concerns with NHS Pathways but often seem unaware of the outcome. Patients’ deaths are treated as isolated failings, their families unaware that another family had grieved after similar failings.

Software will inevitably support – and perhaps one day make – more decisions about everyone’s medical care. Other medical software, such as Babylon Health’s chatbot, has already attracted criticism over the advice it offers patients, and similar cases in the future are almost inevitable.

As this new world looms, it is important we have confidence these products are at least as safe as what they are replacing, usually human clinical judgement.

But it is equally important we have confidence that when the wrong decision is made, changes are made for the better. 

Considering the contradictions in the official responses to these cases, that confidence is in short supply.