Technology should be embraced by healthcare organisations as a tool for clinicians - and not confined to the realms of admin, argues Matthew Swindells.
Having stumbled into the world of IT after a career in NHS management, I am tempted to think that HSJ wanting a regular technology column sums up the NHS’s problem with the field.
It’s always in a box – something for leaders to worry about alongside trying to run the NHS. But there are many ways that NHS leaders can use IT to address the quality, innovation, productivity and prevention drive, and radically lower the total cost of healthcare while also improving quality.
The potential for IT to help drive out errors and support evidence based care is dramatic. Adverse drug reactions account for 3-5 per cent of all hospital admissions and cost the NHS £500m a year. The National Patient Safety Agency estimated that medication errors cost the NHS £750m a year in 2005-06, and reported that between 2005 and 2007 the number of reported medication incidents doubled, with at least 100 patients dying or suffering serious harm each year.
IT-based closed loop medicines management can have a dramatic impact on this. One US study found that “computerised physician order entry” reduced medication error rates by 55 per cent; a subsequent study by the same group found that rates of serious medication errors fell by 88 per cent. A different study demonstrated a 70 per cent reduction in avoidable antibiotic-related drug errors after decision support for these drugs was implemented.
Medication is only one area where errors harm patients and waste resources. Evidence based care is proven to be cheaper and more effective.
In a five-year unpublished study, a US hospital chain and Cerner client compared 25,000 patients treated on an evidence based pathway for pneumonia with 35,000 treated outside the pathway. It found that patients on the pathway had a mortality rate of 5-7 per cent, compared with 9-10 per cent for those who were not. Patients on the pathway had a length of stay of around six days and their care cost an average of $4,721 in the final year of the study, compared with seven days and $6,841 for the others.
Why is it so hard to ensure that evidence based practices are routinely applied? Why does it take 15-17 years for half of all doctors to incorporate new knowledge into their practices?
Clinicians want to apply good practice, but it is unreasonable to expect them always to know the current best practice. It is also impossible for a paper and memory-based system to ensure that all the members of a care team, across multiple institutions, know and are following the right care pathway. A 2005 paper calculated that if a diligent doctor finished medical school and residency knowing everything there was to know, and then read and retained two articles a night, after 12 months they would be 1,225 years behind on their reading.
Only the right kind of IT can support clinicians. In 2005, a study in the BMJ showed that the provision of decision support significantly improved clinical practice in 94 per cent of cases, and that “automatic provision of decision support as part of the workflow” was seven times as effective as the second best approach. In other words, real-time decision support in an electronic medical record makes a difference; an intranet with guidelines on it does not.
So why has the NHS been so unsuccessful in adopting IT? The truth is, it hasn’t been. The NHS has the highest level of digitised primary care of any health system in the world, driven by clinicians who adapted their practice to take advantage of new technology.
The challenge is to get clinicians to pull IT into hospitals to profoundly change the way medicine is practised. That’s why the British Computer Society has been campaigning for every hospital to have a chief clinical information officer, reporting to the medical director, to make information technology a clinical tool, not an administrative one.