How are different health systems performing?

By Dave West • 2 May 2018

A comparison of health systems’ overall performance: which are leading the pack and which are trailing?

How do different health systems’ overall performances compare? Here’s a chart showing all sustainability and transformation partnerships against three important dimensions: type 1 emergency department performance in the final three months of 2017-18; forecast year-end deficit relative to overall system budget; and whether any trusts are rated inadequate for quality.

If you can’t see the charts, try viewing on a larger screen or zooming out in your web browser.

Does looking at performance on the planned care waiting time target give a similar picture? A few systems sit substantially higher or lower in the pack, but there’s quite a lot of crossover.

This discussion should probably start by underlining that the quadrant lines are set at averages – not at standards for good performance, such as 95 per cent for the emergency department target, 92 per cent for planned care and financial balance at zero or above. No systems met the 95 per cent standard in this period and only two were forecasting overall financial balance. The large majority of the NHS is still trying to deliver the basics; deep problems are not confined to isolated areas.
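The average-based quadrant placement described above can be sketched as a simple classification. This is a minimal illustration only – the system names, figures and `quadrant` helper are invented for the example, not the article’s actual dataset:

```python
from statistics import mean

# Hypothetical STP data: (name, A&E % seen within 4 hours, finance as % of budget)
systems = [
    ("STP A", 88.0, 0.5),   # stronger A&E performance, small surplus
    ("STP B", 78.0, -4.0),  # weaker A&E performance, large deficit
    ("STP C", 85.0, -1.0),
]

# Quadrant lines sit at the averages of the group,
# not at the national standards (95% for A&E, balance at zero).
aande_avg = mean(s[1] for s in systems)
finance_avg = mean(s[2] for s in systems)

def quadrant(aande, finance):
    """Top right = at or above the group average on both dimensions."""
    vert = "top" if finance >= finance_avg else "bottom"
    horiz = "right" if aande >= aande_avg else "left"
    return f"{vert} {horiz}"

for name, aande, finance in systems:
    print(name, quadrant(aande, finance))
```

Note that every system here misses the 95 per cent standard and only one is in surplus, yet two still land in the “top right” quadrant – which is exactly why quadrant position should not be read as meeting the standards.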

There are many provisos for any measurement of performance, and I would particularly stress one: financial performance here is based on absolute underspend/overspend, not distance from control totals – using control totals might substantially change where some systems sit. Views will differ about which is the fairest way.

Integrated care systems and the top end

What does this spread of performance tell us about integrated care systems – the advanced guard of STPs? They were chosen partly based on strength of performance, so it’s no great surprise that many make it to the first (top right) quadrant. It’s speculative but some would argue that in these areas good relationships, joined up care, better financial management and better performance can be mutually reinforcing.

Two ICS sit outside the top pack – there’s no strict requirement on finance or performance as there was in the early foundation trust tests (however much that fell apart for FTs in subsequent years).

The best-performing quadrants include some known favourites to be in the imminent next wave of ICS (West Yorkshire, Suffolk and North East Essex) and might point to other viable candidates: the North East, Gloucestershire, Somerset, Derbyshire, Birmingham perhaps. But performance is not the only ICS test – relationships, commitment to working as a system and the ability to deliver Five Year Forward View objectives are given substantial weight and not measured here.

Trailing the pack and system intervention

In the opposite corner, at the bottom left, are some health and care systems with the deepest problems on several fronts. Some are familiar names from national interventions in recent years: Staffordshire, Cambridgeshire, Lincolnshire and Northamptonshire.

But there are some newer names in the bottom left too. Problems in Herefordshire and Worcestershire, Shropshire and Cornwall have become stark over the past year or so. Lancashire, the ICS in the toughest spot, has real problems with emergency care among other things (its leaders have never pretended otherwise).

The problems are dominated by big counties, some very rural, more than by the coastal areas often known for difficulties, though those are still present. It’s hard to read too much into that, however, as there are also large rural counties at the top end.

The West Midlands is overrepresented.

NHS England is (with encouragement from government) considering a higher-profile system intervention in problem STPs, and these patches would be candidates. They have already received plenty of under-the-radar intervention, of course – nearly all have seen their STP leaders change, for example.

A noisy new intervention would presumably be along the lines of the 2015 “success regime”. On which point, there are some notes of hope and recovery in the current spread of performance.

The 2015 success regime action was in Essex, Devon and Cumbria. None are among the leading pack, but all three are now clear of CQC inadequate ratings (the most serious of these measures, surely) and two are above average in their emergency departments. Deficits have been harder to shift: Cumbria has the biggest of all, and all three have deficits above the average.

London, where in 2014 several systems were among the most troubled, does not look so bad now, having shaken off inadequate ratings and with north east and north central London in the top quadrant on A&E/finance.

A gloomy question is whether these systems have moved decisively in the right direction or whether everyone else’s standards have dropped around them. Perhaps a bit of both.

Population health and prevention

Readers may be forgiven for thinking “wasn’t this health system stuff meant to be about wellbeing, prevention and integration – looking beyond acute performance and trust finance?”

That opens up a massive can of worms: a huge range of indicators can be used to consider population health, relationships, joined up care, or how preventive the services may be.

Just to demonstrate a few, here’s under-75 heart disease mortality mapped against the level of use of hospital beds after emergency admission, and highlighting STPs that have the lowest per capita levels of general practice staffing:

This doesn’t say a lot, though it shows that in terms of these two population health outcomes, some ICS have big problems and some do not have much to worry about. It can be expected that age, deprivation and other non-NHS drivers of health need play a big part.

Many of the areas with the lowest per capita general practice workforce (GPs and nurses) are, according to these measures, healthier. This highlights the complexity of the factors involved, and the limited impact of the NHS, particularly in the short term.