Michelle Rhodes and Jim Hatton explain how their trust’s emergency department acted decisively to bring performance reporting and data quality up to the mark
Data assessment uses explicit measures and an overall judgement
The emergency department at Nottingham University Hospitals Trust treats more patients than any other in the country. In August 2009 we became aware of inconsistencies in the reporting of our performance against the national emergency access standard. Some patients had been in our emergency department longer than we had reported.
An independent investigation into what went wrong, and how, concluded that staff always put the care of patients first, and that no patients had been harmed or disadvantaged. However, it described an unclear understanding of the rules that should have been applied to the timing of discharges from the department.
These reporting inconsistencies led the trust board to question the accuracy of other trust data. We needed to make improvements to assure our patients, commissioners, board and staff that we were reporting accurate information on all standards.
We clarified the emergency department rules, checked staff's interpretation of them, and changed many established working practices to make them more responsive, expert and patient-centred.
The need to provide enhanced assurance of the authenticity of our wider data and reports prompted the most far-reaching change: to the generation and presentation of our board performance report.
The new integrated board report, developed with the support of management consultants McKinsey, includes a set of indicators covering all aspects of the trust’s performance, including quality, operational, workforce and financial performance.
Finding the best
To identify the most appropriate indicators, we reviewed hundreds that were already being tracked internally, as well as others used by UK and international peers. We spoke to managers, clinicians and commissioners and narrowed the list to 136 measures to be included in a standardised format in the monthly board report.
The report distinguishes the highest priority indicators from those that require less urgent attention. The highest priority indicators are included in every monthly report; the others are monitored monthly and brought to the board’s attention only if they are underperforming.
It is one thing to describe problems in the report, another to use the report and its generation to identify problems as or before they happen. The new level of clarity and forecasting now contained in the report allows more intelligent performance tracking. The report describes not only what has happened but what is anticipated for performance and what interventions are proposed. It includes an assessment of data quality and robustness for each measure included.
The Nottingham data quality badge is a visual indicator that acknowledges the variability of data and makes an explicit assessment of the quality of evidence on which the performance measurement is based. Each measure is assessed as “exemplary”, “sufficient” or “not sufficient” on six distinct elements of data quality. For each element - timeliness, completeness, granularity, audit, source and validation - a colour code shows the strength of data quality.
In addition to the six elements, which are each assessed against an agreed rubric (see chart), the centre of the mark requires the judgement of an executive director. This provides a balanced overall judgement, recognises the executive directors’ responsibility for the quality of information used to manage the trust and to assure the board, and is particularly helpful where we have not completed an assessment for each of the six individual elements.
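The badge described above can be thought of as a small data structure: six element ratings plus an overall executive judgement at the centre. The sketch below is a minimal illustration only, with hypothetical names and an assumed colour mapping; the trust's actual rubric and thresholds are its own.

```python
from dataclasses import dataclass, field
from enum import Enum


class Rating(Enum):
    """Three-level assessment used for each element of the badge
    (colour mapping here is an assumption for illustration)."""
    EXEMPLARY = "green"
    SUFFICIENT = "amber"
    NOT_SUFFICIENT = "red"


# The six distinct elements of data quality named in the article.
ELEMENTS = ("timeliness", "completeness", "granularity",
            "audit", "source", "validation")


@dataclass
class DataQualityBadge:
    """One badge per performance measure. The element ratings may be
    incomplete; the executive judgement provides the balanced overall
    view at the centre of the mark."""
    measure: str
    elements: dict = field(default_factory=dict)  # element name -> Rating
    executive_judgement: Rating = Rating.NOT_SUFFICIENT

    def weakest_elements(self):
        """Elements rated 'not sufficient' -- candidates for
        targeted improvement work."""
        return [name for name, rating in self.elements.items()
                if rating is Rating.NOT_SUFFICIENT]


# Hypothetical example: strong timeliness, weak validation.
badge = DataQualityBadge(
    measure="emergency access standard",
    elements={
        "timeliness": Rating.EXEMPLARY,
        "completeness": Rating.SUFFICIENT,
        "validation": Rating.NOT_SUFFICIENT,
    },
    executive_judgement=Rating.SUFFICIENT,
)
print(badge.weakest_elements())  # -> ['validation']
```

Separating the per-element ratings from the executive judgement mirrors the article's point: the rubric is explicit and mechanical, while the centre of the mark remains a human, accountable assessment.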
Had the Nottingham data quality badge been in place this time last year, we would have seen that our performance against the emergency access standard was good (green) but that our data quality was poor (red). Armed with this information we might have more quickly discovered and solved the under-reporting.
We are ratifying the kite mark for each of our top-level performance measures. Where quality elements are weak we will improve them. The work will then be extended to all measures.
Among the benefits of this new board level attention to data quality is an increased willingness by clinicians to invest time and resources in providing the necessary information. Access to high quality data across the trust is vital to track authentic improvements in patient care.
We believe other NHS trusts, and potentially organisations in other sectors, can learn from our difficult experience. The changes in our board report are timely given the new government’s determination for greater transparency across the public sector, and increasing demand for readily available online data.
Being ever clearer about which information is most helpful for judging performance, and maximising the quality of that information, is vital for public service management and improvement.
Michelle Rhodes is former chief operating officer and Jim Hatton is deputy director of information and performance at Nottingham University Hospitals Trust.