With a number of pilot trusts having undergone the new look acute hospital inspections, Kieran Walshe and colleagues assess what has changed and what still needs improvement in the revised approach

In the wake of NHS reviews such as the Francis report, the Care Quality Commission has completely revised its approach to inspections.

‘The new model is seen as transformative in comparison with the forms of regulation it replaces’

Those working in the 40 or so NHS trusts where the CQC piloted its new approach from September 2013 to April 2014 will have had first hand experience of the new inspections. So how have things changed, and what can be learned from the pilot programme?

Previously, the CQC inspected acute hospitals against its generic regulatory standards, using small teams of two or three inspectors who often had limited acute sector experience. Each inspection lasted a couple of days on site and usually covered only one or two specialties or a few ward areas.

The new approach is very different and:

  • uses larger and more expert inspection teams with both CQC inspectors and external advisers;
  • involves a much more detailed and extensive set of inspection processes, drawing on a wider range of data sources and fieldwork;
  • focuses the inspections on eight defined core service areas – accident and emergency, surgery, medicine including care of older people, children and young people, maternity and family planning, end of life care, intensive/critical care and outpatients;
  • assesses and rates performance in each of these core service areas and at hospital and trust level in five domains (safe, effective, caring, responsive and well led), using a four point rating scale (inadequate, requires improvement, good or outstanding); and
  • produces a much more detailed and comprehensive inspection report with full narrative description of services in each core service area alongside quantitative ratings.

A typical inspection of a single site acute hospital trust involves around 30 inspectors visiting for three to four days, amounting to about 90 person days of inspection fieldwork. This is scaled up for multisite acute trusts.

The CQC commissioned an independent evaluation of the new approach to inspection, which involved directly observing inspections, interviewing around 80 CQC and hospital staff, online surveys of CQC inspection team members and hospital staff, observing meetings and reviewing documents.

Overall, the CQC’s new acute hospital regulatory model receives more or less universal endorsement from stakeholders, not least from acute hospitals themselves, and is seen as transformative in comparison with the forms of regulation it replaces.

It is regarded as more credible, rigorous and much less likely to miss any issues of significant concern.

Broad scope

But there are concerns about some aspects of the regulatory model, such as its cost, pace and timing, and about the consistency and reliability of assessment.

‘The large inspection teams allow the CQC to cover the eight core service areas in depth, but are hard to manage’

The new acute hospital regulatory model has been implemented at scale and pace, and that has given rise to some challenges that should be resolved in the medium and longer term as it matures.

One chief executive noted that the inspection was far broader than anything they had experienced. Another felt it was subjective rather than objective: “[Inspections are] not analytical in their approach – because they have so many different teams, the whole question of regulatory consistency, they don’t really have a means of addressing that.”

The CQC has brought into its inspections a new cadre of external inspectors, known as specialist advisers, who provide content knowledge, seniority and expertise the regulator does not have in-house and who work alongside its own inspectors with regulatory experience.

Many of the CQC’s specialist advisers, who range from medical and nursing directors at board level to junior doctors and student nurses, have experience of other forms of inspection such as the Keogh, royal college or deanery reviews. The large inspection teams provide “feet on the ground” in numbers that allow the CQC to cover the eight core service areas in depth, but are hard to manage.

Sustainable team sizes

The costs and sustainability of these very large inspection teams are problematic, and the regulator could move in future to make use of smaller, more expert teams with strong regulatory content and data analysis expertise.

‘Hospitals have generally welcomed the return of ratings and their use at clinical service level’

Inspection teams also need more formal training and development; it was evident that hospital staff perceived the inspectors as the public face of the CQC, and work on this is under way.

There is scope to do more preparatory work before inspections, to focus data collection more closely around the key lines of enquiry that the CQC has developed to guide inspections, and to allow more time during inspections for analysis and reporting.

At the moment, CQC inspections collect more data than is needed or than can be used in performance ratings and reporting.

Although the CQC has only piloted the use of ratings in some trusts, hospitals have generally welcomed the return of ratings and their use at clinical service level.

Hospital staff largely agree with the ratings as assessed by CQC inspection teams. Overall agreement is about 77 per cent, although on the whole hospital staff rate their services higher than the inspectors do, and levels of agreement are lower when the CQC has given a "requires improvement" or "inadequate" rating.

Consistency concerns

Hospital staff often have some concerns about the consistency of the rating process. As one trust interviewee put it: “I think there does need to be a bit of calibration of these teams; I think there has to be some guidance given about the extent to which they are at liberty or not to make qualitative judgements, the extent to which they actually need to triangulate and base on evidence, and there’s probably something about how you kind of calibrate across the reports as well.”

‘This is a much more costly and intensive form of inspection, but these are very complex and high risk organisations’

The rating process is highly implicit and very reliant on professional judgement. In tests of individual rating consistency with CQC inspection team members, we found that levels of agreement about which domain particular items of evidence should be assigned to varied widely.

Published ratings go through an extensive quality review process led by the chief inspector of hospitals, but the CQC could further improve reliability and consistency without sacrificing the scope for appropriate professional judgement: through more detailed definitions and guidance for inspectors, more training in how to rate, simplification of the domains and rating scales, and ongoing monitoring.

For some hospitals, the written report has seemed more negative than initial verbal feedback, and they have been quick to spot any inconsistencies either within their own report or between it and those for other hospitals.

After each inspection, a "quality summit" has been held shortly after the written report is published, at which the regulator presents its findings and the trust and other stakeholders respond. We found these were important capstone events, but they came too soon after publication to provide an effective mechanism for action planning and follow up.

The NHS Trust Development Authority and Monitor have a key responsibility for oversight of performance following inspection, but their engagement in the process has seemed quite variable.

Work in progress

Overall, the CQC’s new approach to hospital inspection is a work in progress, as the regulator acknowledges. It has already been adapting and changing its methodology during and after the pilot programme, and more improvements can undoubtedly be made.

This is a much more costly and intensive form of inspection, but these are very large, complex, multiservice, high risk and expensive organisations, which require a high level of scrutiny.

If the new approach is to be seen as good value for money in the long term, the key issue may be how the CQC’s inspections are going to be used and acted upon by trusts and other parts of the complex system architecture created by the Health and Social Care Act (particularly Monitor, the TDA and NHS England). 

Ultimately, the impact of the inspections on hospital performance is likely to be the most important metric in future evaluations.

Kieran Walshe is professor of health policy and management and Alan Boyd is a research associate at Manchester Business School; Rachael Addicott is senior research fellow, Ruth Robertson is a fellow in health policy and Shilpa Ross is senior researcher at the King’s Fund