We demand that other industries deliver standards all customers recognise, so we should also insist on patient experience benchmarking across the NHS

The other evening I met my builder, played football, and went for a beer. My extension will fit on the plot, the opposition’s goalmouth was not narrower than ours, and my pint was as full and satisfying as the next man’s. What’s more, the builder’s firm, the sports club and the pub company all knew they were responsible.

Measurement, standardisation, accountability - they go together seamlessly. But not, it seems, with regard to patients’ experience of the NHS. For all the welcome progress made in the last year to embed patient experience in the mainstream, and which Joan Saddler outlined in HSJ, the government continues to fight shy of standardisation and accountability.

High Quality Care for All said the first two steps towards making quality the organising principle for the NHS were to “bring clarity to quality” and to “measure quality [in a framework] that enables us to publish comparable information on key measures”. Is this being fully achieved?

First, clarity. When we talk about patient experience, do we all understand the same things? It is apparent that we do not.

Some relate patient experience to “customer care” issues such as the environment in reception areas. Others confuse patient experience measures with patient reported outcomes. Some clinicians think experience data is only useful as a direct add-on to clinical outcomes.

High Quality Care for All did not define or elaborate on the term “patient experience”, and the Department of Health dived straight into step 2 - producing indicators. But to know which indicators we require, we first need to know what is most important to measure.

There is still a pressing need to identify “core domains” of patient experience that everyone - clinicians, providers, commissioners and regulators - can put at the heart of quality measurement. Picker Institute Europe will shortly publish a discussion paper suggesting core domains of inpatients’ experience, based on a detailed analysis of survey data.

These domains should be defined by patients. Most patients say that the most important aspects of care are their interactions with health professionals: communication, consistency, co-ordination, respect and dignity, and involvement in decisions. Once patient experience measurement is focused on core domains, it becomes easier to select specific indicators that should be used widely across organisations.

The national quality indicators include a first set of patient experience measures, drawn mainly from the national inpatient survey, which Picker Institute Europe develops and co-ordinates. These are based on what patients have told us they value, but their use is to be voluntary.

How can we compare?

So what happens when we get an information channel that is supposed to help patients and the public to compare quality? Quality accounts, which every healthcare provider will soon publish, are not standardised. The national indicator set can be ignored. Measurement can be entirely “local”. And the accounts do not have to be independently validated. How can we compare and choose?

If measurement is not standardised and data are not comparable, accountability is undermined.

In any case, it seems there is little appetite to ensure that boards and managers are required to look regularly at, and take action to improve, the experiences of their patients.

The draft regulations for registration omitted this, stating only that providers should seek the “views” of their users. There will be a legal duty to publish quality accounts, but no accompanying duty to raise standards. The Department of Health’s reports on the Mid Staffordshire scandal rejected using poor patient experience results as a trigger for investigation and enforcement.

The Care Quality Commission, to its credit, is emphasising service user experience as a key component of compliance with registration (even if it is omitted from the regulations).

However, regulators are limited in their capacity to help drive up quality if the statutory tools are not designed to make patient experience a governance priority on a par with finance and safety.

Picker Institute Europe wants patient experience to have more teeth.

It should be defined as the FA defines a football pitch - with common components whose dimensions are understood by all in the game. It should be measured the way builders measure space - with common yardsticks. Its usage should be enforced the way the Weights and Measures people hold my local pub to account. We don’t want the patient’s glass to be half full.