The world of public information about the NHS seems to have moved a long way in the past few days.
First, on Thursday and Friday, the reaction to Basildon and Thurrock Foundation Trust’s problems. On top of the expected criticism of the trust and the system it operates in, there was particular fury that the public had not been told earlier.
The Care Quality Commission faced a barrage of anger that its website still said the hospital’s services were “good” – almost as if the blood spatters and understaffing would not have mattered much had they been listed online.
The Independent wondered: “How can we choose a hospital if the ratings prove to be so wrong?”
On Today, Evan Davis insisted to Barbara Young: “[The ‘good’ rating] is on your website as I speak, I’m looking at it… Probably people should not look at your website. The public have absolutely no idea who to believe or who to trust.”
It seemed out of kilter with the usual public demand for high standards to be universal, and the usual lack of enthusiasm for choosing based on data about quality. Barbara reasonably insisted the focus should be on the boards, who can improve their trusts, rather than on the regulator.
Then over the weekend the Dr Foster group published its 2009 hospital guide. It is very different to its previous hospital guides: for the first time it gives trusts a patient safety points score and a “banding” out of five, and lists them in a league table.
Public and media reaction, partly because of the changes to the information, was much louder than before. As well as labelling the dozen band-one trusts “underperforming”, and criticising the “postcode lottery” of variation, commentators directly contrasted the new information with what had gone before.
The Observer said the underperformance came “despite the fact that last month the CQC, the health service regulator, judged overall care at eight of the trusts to be good or excellent”. Equally, five of the eight trusts the annual healthcheck found to have “weak” services were in Dr Foster bands four or five.
The Dr Foster information (whose validity has been hotly disputed by many in the service) has put huge pressure on the badly rated organisations to work out why they rated badly and to try to change that. Being a well-used public information provider is a platform for winning custom from trusts, who want to understand the methodology and improve. It has also pressured the government and regulators to act.
For this week at least, following the widespread Dr Foster coverage, there are two high-profile and apparently authoritative providers of public information about the NHS: the government and its arm’s-length bodies, and the (for now, semi-independent) Dr Foster.
NHS Choices, which is meant to be the primary NHS source, lists the annual healthcheck rating first, along with a fairly small collection of other figures. It includes hospital mortality ratios, currently defined and provided by Dr Foster. The Information Centre is in the process of tendering out that function, along with the provision of other quality indicators for Choices.
Dr Foster, which ran Choices until it lost the contract to Capita around a year ago, has since added a lot to its own website. It is more comprehensive than the official portal and now features “quality accounts” with measures covering safety, effectiveness and patient experience for each trust. These include the new patient safety score and banding, which presumably are outside anything that will be recreated on NHS Choices.
Questions raised include: Is there one authoritative source of public information? Should there be one? And does it matter whether the sources are independent of the Department of Health?