news focus Much NHS data is not fit for the uses to which it is put, says the Audit Commission. Tash Shifrin reports

Published: 14/03/2002, Volume 112, No. 5796, pages 11-12

Some things are meant to be full of holes - like colanders, Swiss cheese and fishing nets.

Some are not: jumpers, buckets and road surfaces spring to mind.

But NHS data, definitely part of the 'not meant to be full of holes' category, is looking distinctly perforated. Data Remember, the punningly titled Audit Commission report on the quality of patient-based information, to be published tomorrow, reveals the extent of the gaps.

'Most trusts need to improve basic process,' the report states mildly, over a chart that shows that less than 45 per cent of trusts have 'satisfactory procedures to record the information for two-hour accident and emergency waits'. Is the correct procedure followed for suspending patients from waiting lists? Not in more than 10 per cent of trusts, the same chart shows.

These are headline areas of activity, where the NHS is under huge pressure to show improved performance. The fact that part of the service cannot accurately state what its current performance is does not inspire confidence.

At 18 per cent of trusts, there were 10 or more areas in which 'little reliance could be placed on the data collected', Audit Commission reviewers found.

And they point out that this is non-clinical, mainly administrative data 'which should (in theory) be relatively straightforward to define and collect'.

The chart shows the sharpest example of a problem that has surfaced repeatedly. 'As the coverage of NHS information systems has grown so, too, have concerns about the quality of data on which the information is based,' Data Remember says.

Allegations of waiting-list 'fiddles' at a small number of trusts and plans to publish details of surgeons' performance (see box, overleaf) have made the question of data quality a crucial one.

Audit Commission health strategy director Peter Wilkinson says: 'Data quality sounds like a fairly boring topic but it is vital for patient care and essential for clinical governance and effective management. It is a very important foundation for everything the NHS does.'

The problems with NHS data are not new. 'From the early 1990s it was apparent that returns that should have shown similar results actually varied widely,' the report says. And Commission for Health Improvement reviews have found 'a worrying number of examples of incomplete and inaccurate coding, missing data sets, double-entry of data and a lack of awareness of information among senior staff'.

Mr Wilkinson says: 'The issue needs higher priority. There are a number of basic things that most trusts can improve, from making sure NHS numbers are used, to making sure the right start times are used for procedures.

'There is not a quick fix. An awful lot of people are involved in patient information. They all need to understand the importance of their role. It is a big cultural issue.'

He stresses the importance of training and making staff understand how their data will be used.

'If you do not realise the information is going up the chain, how much attention will you pay to getting it right?'

There is an issue of investment in IT, he says, but adds: 'It is important to make sure the basic system is right before [the data] goes into whizzy IT.'

A problem that has grown over time is that NHS data is now being put to a far wider range of uses than those for which it was originally collected. The Department of Health's hospital episode statistics (HES) database collects 12 million records a year and has been used to identify trends and plan at a national level.

But the Audit Commission says there are 'widespread doubts' about whether the data is fit for the purposes to which it is now being put. A 'much higher standard of accuracy' is needed if decisions are to be taken based on far smaller local data sets, which can be thrown out by a few significant errors.

'Some of these doubts are well founded,' the reviewers found.

Data quality assurance arrangements were found to be satisfactory at only 61 per cent of trusts, while 9 per cent had no effective quality assurance processes.

Even if accuracy was bang up to the mark, there is a 'longstanding' relevance problem.

There has been no complete review of how information is classified since the Körner steering group on health services information in 1982. This means that the NHS is using definitions based on practice in the 1970s and has not caught up with what the report calls, with some understatement, the 'far-reaching changes' since then.

The concept of a finished consultant episode, for example, is seen as 'irrelevant' in an era of teamworking, when care might be led by a different professional.

There is material in Data Remember to give pause for thought at every level, from the need to update Körner, to the advice that 'most trusts could improve their recording of patients' NHS numbers'.

Peter Wilkinson stresses that Data Remember is not accusing anyone of 'fiddling' information.

The commission has been asked by NHS chief executive Nigel Crisp to carry out some spot checks after a National Audit Office report on suspension of patients from waiting lists. He says: 'You may have a few people who have manipulated information - that is what the NAO tackled. But there is a big issue about the accuracy of information that is not about people fiddling. These are highly motivated people doing a good job.'

It is that bigger issue that the commission has tried to help with, providing feedback from its review of data on a trust-by-trust basis. 'The approach we have taken, in line with CHI and the DoH, is that it is a long-term issue and we have all got to work together towards a common end.'

Cut and thrust: making a difference

'I think it is fair to say the surgical community has no confidence whatsoever in hospital episode statistics data.' So says Society of Cardiothoracic Surgeons secretary Bruce Keogh (left). The question of data quality is crucial for the society: it is in negotiations with the chief medical officer, Professor Sir Liam Donaldson, and health secretary Alan Milburn over publishing information about individual surgeons' performance. Mr Keogh believes the public will be able to see performance data by a named surgeon by 2004 - but not before a lot of work is done. In HES data, he says, information on which surgeon has carried out which procedure can be out by 30 to 50 per cent. Mr Keogh says: 'We have been collecting data voluntarily at unit and surgeon level. We now have agreement to fund appropriate hardware and software and a data manager in each unit.'

Surgeons will have input into the process, 'and in return the health secretary will expect the public release of that data'.

To safeguard against abuse of a data collection system where surgeons will monitor themselves, the society has worked with the Nuffield Trust, the US Rand organisation and the Californian department of health on a methodology for validating the data: independent observers will go in to check a random sample of about 10 per cent of notes.

The other key issue for the surgeons is the way performance data will be presented. The data should be presented as survival rates, not as mortality rates, he says. 'If a surgeon has a mortality rate of 2 per cent in a year and another has a mortality rate of 4 per cent, the reality is there is no difference between them, but it looks as though one is twice as bad as the other.' Citing survival rates of 96 and 98 per cent gives a more realistic impression, he feels.
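Mr Keogh's point about framing is a simple matter of arithmetic. A hypothetical sketch (the figures below are the illustrative ones from his quote, not data from the report) shows how the same two surgeons look very different depending on whether their results are expressed as mortality or survival:

```python
# Two hypothetical surgeons with annual mortality rates of 2% and 4%.
mortality_a = 0.02
mortality_b = 0.04

# Framed as mortality, surgeon B appears twice as bad as surgeon A.
ratio_mortality = mortality_b / mortality_a

# Framed as survival, the same figures become 98% and 96% -
# a gap of only two percentage points.
survival_a = 1 - mortality_a
survival_b = 1 - mortality_b
ratio_survival = survival_a / survival_b

print(f"Mortality ratio: {ratio_mortality:.1f}x")          # 2.0x
print(f"Survival rates: {survival_a:.0%} vs {survival_b:.0%}")  # 98% vs 96%
```

The underlying performance is identical in both framings; only the presentation changes, which is why the society favours survival rates.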

The agreement with the CMO and Mr Milburn also means that performance will be recorded for surgery based on low-risk patient cases. 'You have got to find a way that doesn't encourage surgeons to turn down high-risk patients - which is happening now,' he says. A separate column could record the numbers of innovative procedures on high-risk patients in a way that brings kudos and encourages surgeons to take on special trials, Mr Keogh adds.

Cold snap

The Audit Commission undertook a 'light touch' review of the quality of non-clinical data between 1 July and 31 October last year.

The results provide 'a snapshot' of data quality arrangements across 279 English trusts. A second-stage review will look at arrangements for producing clinical data.