I find HSJ's reporting of the concept of 'no blame' in patient safety incidents incredibly disappointing and disquieting (opinion, page 16, 12 April).

The National Patient Safety Agency debated this issue at length in its formative years and rejected 'no blame'. The agency has been consistent in encouraging an 'open and fair' culture. This exists where there is recognition that humans are prone to error, and systems in which people have confidence to report and learn from incidents. But it also recognises that people remain accountable for their actions.

The point is that errors need to be reduced and their effects mitigated. This is best done by learning from incidents and taking remedial action. We can only learn when incidents are reported, and reports will only come when there is no fear of retribution.

Clearly, if negligent behaviour has taken place, this needs recognition and appropriate action. But most errors are the result of unintended human failings, and these failings frequently occur when there are other pressures on the individual.

The concept of latent conditions, those factors which predispose an individual, in any given situation, to make a mistake, is really important (see Managing the Risks of Organizational Accidents by Professor James Reason).

The ultimate aim in incident reporting is to enable learning so action will be taken to reduce patient harm. In aviation it has been demonstrated that an increase in reporting of incidents leads to a reduction in serious incidents over time. This is important. The NHS must enable people to feel that reporting will not lead to retribution unless, of course, there is evidence of negligent behaviour.

Surely if this concept is understood by aviators, it is not too much to ask of the NHS.

Jeremy Butler, retired general operational standards manager, British Airways and former non-executive director, NPSA.

Straight talk on patient safety

Frank Burns' article takes a robust approach to the patient safety problem, but he is rather selective in his arguments as to where the effort should go. He quite rightly identifies the 'signal lack of emphasis' given to patient safety among all the current national priorities, but does not follow this through, preferring to claim that inadequate discipline is the real problem.

It's true of risk management that some airlines are exemplars of safety culture, but they rely as much on technological improvements and trained teams as they do on individuals. In fact, the teamwork and technology are there because the air industry recognises that individuals make mistakes, and the effect of these needs to be engineered out wherever possible.

There are technological solutions to a number of common patient safety problems - barcoding for identification, prescription support software, and drug administration processes that prevent errors. But there is no enforced national programme or finance for these - they are left to individual care providers to introduce and manage as their resources permit, with little or no standardisation. This is as logical as suggesting all local authorities come up with their own systems of traffic control, including choosing the colour of their local traffic lights.

Just a couple of points in detail - he cites the staff survey figure that only 50 per cent of staff feel their trust takes effective action to reduce errors. In my experience as a clinical risk manager over the last eight years, I know the frustration of seeing a problem for which there is no easy solution, and also the danger of leaping in to solve one issue but inadvertently creating another. Many risks identified by a single incident may take time to analyse and address, and sometimes the solution is technologically or financially beyond our reach.

The second point is that, in our trust, and I suspect in many others, the 'no blame' culture (a phrase we never liked, preferring 'just culture') has never been used to protect repeat offenders, the irresponsibly careless, or those who outright fail to follow agreed procedure. But we do not blame staff for systems failures, and we as an organisation are never as hard on staff who have made a human error as they are on themselves.

Sarah Williamson, clinical risk manager, Sheffield Teaching Hospitals foundation trust (writing in a personal capacity).

Slwilliamson49@hotmail.com