A misunderstanding of what ‘human factors’ means often leads organisations to try to change their people rather than the processes that could transform patient safety, finds Claire Read

Janet Anderson is used to misconceptions about the area in which she works. A senior lecturer in the faculty of nursing and midwifery at King’s College London, Ms Anderson seeks to apply human factors to improve the quality and safety of healthcare.

While she feels awareness of the area has grown significantly in recent years, she says the next challenge is ensuring nurses, medics and managers actually understand what the field is all about.


"'Human factors' has that connotation that people immediately think it's something to do with people – and it's something to do with humans, perhaps, that makes them prone to error or not able to do the right thing. But that's really not what it's about," says Ms Anderson, who will be speaking at the 2016 Patient Safety Congress, run by HSJ and Nursing Times.

“It’s really about supporting people to optimise their work and remain safe. It’s not about trying to change something necessarily about humans.”

Supporting people

This is a nuance she fears is often missed amid discussion about patient safety. “Often in healthcare, we find that efforts to improve quality come down to encouraging people to be careful and be aware of the risks.


“That’s focused on somehow changing people, whereas human factors would be about how we can design the task – or the devices, or the packaging, or whatever it happens to be – to make it more likely that people are able to do it correctly.”

Professor Peter Buckle, principal research fellow at Imperial College London and another of the experts speaking at Patient Safety Congress, puts the distinction simply. "I always argue that the human completes the system," he says.

“Whatever you’re doing, you actually have to complete the system, and you’re doing so with all the bits and pieces you’ve been given – it could be software, it could be hardware, it could be a poor working environment where you can’t see properly – and you have to make up for all the deficiencies elsewhere.

“What we really need to do is to turn it around and say: how would we design things to make you work at the highest possible level of your performance? Instead of which, as humans we normally find ourselves having to overcome the deficiencies of the designs around us.”

Other industries

In other safety-critical industries, such as aviation, there has been a long-term effort to apply the scientific discipline of human factors.

Trevor Dale, a former pilot who now offers safety training to healthcare organisations, says it first became a key area of focus for British Airways following the Kegworth air disaster in 1989. The plane involved was new and, unknown to the pilots, had a different ventilation system to previous models.

When smoke appeared on the flight deck, their knowledge of the previous design led them to shut down the wrong engine.


“If you were generous, you’d say healthcare is 10 to 15 years behind [on this agenda],” reports Mr Dale. “A friend, who is shortly to leave British Airways, recently sat in on one of our training courses for anaesthetists and operating department practitioners. He listened and at the end said: ‘They’re 20 to 30 years behind us’.”

That is not to say progress has not been made. In 2003, Professor Buckle co-authored a report entitled Design for Patient Safety.

Commissioned by the Department of Health and the Design Council – and written with a professor of engineering at Cambridge University and a professor of design at the Royal College of Art – the paper served to underscore “how little human factors design was actually taking place in the healthcare system”.

Adds Professor Buckle: “After that there was an explosion of research around this, which I think has led to some very interesting, different approaches to how you incorporate the human in the design of equipment and systems right from the beginning.”

His own current work at Imperial centres on in vitro diagnostic devices.

“Companies come to us with ideas that they want to follow up, and we’re able to look at the usability of it, we’re able to look at the cost effectiveness of it, we’re able to do clinical trials because we’re based within a huge healthcare trust and we’ve got academic clinicians who lead different areas,” he explains.

“What comes across very, very clearly is that people often come up with well intentioned designs, but hadn’t really thought through where they’re going to be used and how they’re going to be used.”

Conventional wisdom

He gives the example of miniaturisation and portability, often seen as highly desirable but not always appropriate in a healthcare setting. "When you examine it in detail, you realise actually you often don't want that because you can't see the device properly or your fingers are too big to input information.


“For most of us, the worst [outcome] is perhaps you turn up at your meeting at the wrong time, or you send something rude to a friend when you meant to say something polite. But in the context of healthcare, that could be a bit of information that’s now electronically in a system which someone else will act on.

“It could be anything from the wrong drug dose to predictive text misspelling the name of somebody or, more often, a dropdown menu where you’ve accidentally selected the wrong thing.

“What we do is actually start from the point of view of how can we make this thing more usable to the point whereby it’s very hard to make mistakes?”

Another area of focus in human factors research is understanding how people overcome the problems they encounter in system design. Ms Anderson, for instance, is currently looking at the issue of resilience.

“It’s particularly pertinent in healthcare, because it’s really about looking at how we can help systems and organisations to adapt to pressures,” she argues.

A deviation

Interestingly, the outcome may be a slight deviation from conventional wisdom. “Previous approaches to patient safety are much more about: let’s standardise this, let’s make sure it’s done exactly the same way, every time, and it’s associated with this mindset that if people just followed the rules, everything would be okay.


“Resilience is about saying well, there’s a reason people don’t follow the rules and that’s usually because the situation they’re faced with doesn’t fit the rules, and therefore they have to adapt. So the focus of our resilience work is how we can help them to adapt safely rather than keep emphasising that you shouldn’t have adapted, you should have just followed the rules.”

Suggests Mr Dale: "If you're getting repeated breaches of procedures, you've got a few possibilities – one is that your training's poor, another is that your staff have become demotivated and it's become normalisation of deviance, or the third thing is your procedures might not be fit for purpose."

Ask Professor Buckle what key point he hopes nurses, medics and managers would take away from his Patient Safety Congress presentation, and he offers one word: participation.

“You shouldn’t buy anything, you shouldn’t design anything, unless you’ve really had a conversation with the people who are doing the job – or with patients themselves – about whether it’s going to make things better,” he says. “You can’t impose something without consulting and working with others to participate.”

He continues: “It sounds a bit fluffy, but without it you’ve got a disaster on your hands. And it’s not easy to do.

"I think people think, well, that's fine, we'll have a focus group. That's not how it works – actually, you do need trained professionals who know how to develop this participatory idea and come out with proper design constructs. It's not just a 'nice to have', it's an essential to have – and the expertise is available."