opinion: OVER THE WALL: 'The agendas of health, social services and local government are so intermeshed that it no longer makes sense to think of a handful of key intersections for monitoring'

The plethora of government initiatives exhorting or requiring joined-up working is probably at saturation point, and unlike in previous eras the government not only lays down what ought to be done but wants to know whether it has been achieved. Inspectors will be at a premium.

The Audit Commission and Social Services Inspectorate's joint review teams will in future cover each social services department every five years instead of the current seven, and talks are already under way to relate this work to the Commission for Health Improvement.

Links between SSI staff and the NHS Executive regional offices have never been closer, and the new regional commissions for care standards will at last bring scrutiny of residential care and nursing homes under one agency.

Performance indicators are Labour's tool for ensuring that modernising agendas are implemented. Both the NHS and social services have been given their own performance assessment frameworks by which they will be monitored. With the new emphasis on partnership, some explicit links are made between the two frameworks, but to what extent do these amount to a shared agenda?

At first sight, the similarities are plain. Six areas are identified for the NHS framework: health improvement, fair access, effective delivery of appropriate healthcare, efficiency, patient/carer experience and health outcomes of NHS care.1 These have been in use since April, and should already be reflected in the first round of health improvement programmes. The draft framework for social services came out in February, and is due to come into effect in September.2 It has five dimensions: national priorities and strategic objectives; cost and efficiency; effectiveness of service delivery and outcomes; quality of services for users and carers; and fair access.

The two frameworks' most novel feature is that they actually relate to one another through a number of specific interface indicators. So far so good, but the limited scope of these partnership indicators is disappointing.

Of the 41 NHS indicators and the 46 for social services, only four are picked out as interface indicators: emergency admissions for older people; emergency psychiatric readmissions; delayed discharge; discharge from hospital.

The organisational interdependence running through both performance assessment frameworks is far more pronounced than the interface indicators imply. Social services and the wider local authorities, for example, could certainly be said to have a stake in measures relating to deaths from all causes, deaths from accidents, disease prevention and health promotion, chronic care management and mental health in primary care.

Equally, the NHS is hardly immune from social services indicators relating to child protection, the unit costs of residential and nursing care for older people, the health of children being looked after, intensive homecare, suicide, falls and hypothermia.

In truth, the agendas of health, social services and the rest of local government are so deeply intermeshed that it no longer makes sense to think of a handful of key intersections singled out for monitoring.

But it is not just that the four interface indicators themselves are limited: other measures are questionable, and other possible indicators are not included at all. The NHS goal of reducing waiting lists, for example, may mean that people with less serious conditions are treated sooner than those whose needs are more urgent.

Considering the emphasis in Saving Lives: our healthier nation on local government's role in improving local health, it is disappointing that the Audit Commission dropped its proposal to measure local authorities' performance.

As for the NHS, a feeling persists that the indicator for patient and carer experience is too focused on acute rather than community services. The indicators tend to be backward-looking and standardised, leaving insufficient scope for innovatory, multidisciplinary initiatives.

Commenting on the NHS framework, the Association of Directors of Social Services questioned whether the six performance dimensions sufficiently emphasised the benefits of a joined-up approach.3 Indeed, it proposed a seventh dimension of national performance: partnership working. Exactly what this might include is debatable: the ADSS goes for such things as joint data definitions and data collection, joint data requirements in HImPs, and a joint approach to user, carer and patient feedback.

There is much to commend in this proposal. If national priorities guidance can be joint - and if national service frameworks are expected to apply to both health and social care - it seems reasonable to expect a more joined-up approach to the very things which will get measured.

But a point arrives in the partnership debate when the whole thing gets terribly frustrating. The greatest exponents of joint working recognise that the NHS and local government agendas are deeply interconnected.

These are often the people who feel least threatened by the logic of their position: that organisational interdependence can reach a point where the only sensible conclusion is some form of integration. Perhaps one day that message will also permeate political thinking.

REFERENCES

1 The NHS Performance Assessment Framework. NHS Executive, 1998.

2 A New Approach to Social Services Performance: consultation document. Department of Health, 1999.

3 The New NHS: national performance frameworks. ADSS, 1998.

Bob Hudson is a senior research fellow at Leeds University's Nuffield Institute for Health.