There's a scene in Monty Python and the Holy Grail where a king has an errant son under guard. After a father-son exchange, he prepares to leave and asks the guards to stay with his son. After walking a short distance, he turns to find them walking behind him. There follows an absurd conversation in which he fails to get them to do what he asks.

I sometimes mention this to groups of specialist registrars and new consultants to illustrate the contest between rational thought and the collective mental programming of groups that they will encounter in their careers. It isn't enough to have a good idea and a powerful argument. We have to get alongside the mental programming of those with whom we work.

This may explain why we often struggle to get learning into practice. It does not help if groups are not able to get out enough for mental software upgrades. We can see what happens when they do. There was no need for a Darzi review to tell vascular surgeons that the district general hospital was not a sensible basis for a service. They worked it out as a professional community and in recent years have worked with their trusts to create networks so that work is done in the right places. Contrast that story with cancer, where external prescription prompted responses ranging from acquiescence to resistance.

We are not the only ones grappling with the conversion of knowledge into practice. It is often assumed that the cure for scurvy among sailors was identified by James Lind in the mid-18th century, but it now seems likely it was known to James Lancaster at the beginning of the 17th. Unfortunately, he didn't have the chance to present a paper at the annual sea captains' congress.

Recently I heard a distinction drawn between medicine and management regarding the use of evidence. Medicine, it is argued, is evidence based while management is evidence light. Fine, until colleagues tell you how often they cannot find a randomised controlled trial in the Cochrane Collaboration database. This argument also misunderstands the formative nature of the evidence available to the manager. In organisations it is difficult to divide your efforts into control and study groups. The key is often to assess the value (and research quality) of individual attempts at innovation.

If converting evidence into practice sometimes leaves a vacuum, it will often be filled by politics. It is all very well moaning about the lack of an evidence base for responses to healthcare-acquired infections, but where was everyone before it became a matter of public concern?

A fundamental issue is the historical lack of emphasis on knowledge transfer. We have assumed that communication is a simple matter of "transmit and receive". The research community has tended to think of communication as a matter of papers, conferences and seminars. And for the practitioner, getting to the evidence means getting your head above the heat and fire of daily working life, be that as a clinician or a manager.

One National Institute for Health Research programme, commissioning research into the service delivery and organisation of health services, has taken the issue seriously enough to change its terms of reference. It will now commission activities to improve knowledge transfer. This includes commissioning the NHS Confederation to develop a network of organisations committed to being more involved in research and putting research into practice.

You may ask, is this only for the social sciences and organisational studies? Absolutely not. Let's look at organisational governance, for instance. If the trust scorecard and risk process does not capture sufficient data on mortality and morbidity, how do we know whether well-intentioned innovations produce improvement? Worse, does the absence of such measures create a sort of corporate hearing impairment? And does that alter expectations throughout the trust about what is important, with potentially disastrous consequences? Poor governance gets in the way of transferring knowledge into practice.

All this highlights the lack of incentives at the organisational level. I am keen to see the inspection and regulation process help here. The Healthcare Commission and its successor could require us to demonstrate how we adopt research findings. Of course, this needs to be scaled to the type of organisation. It would be unfair to expect the same of primary care trusts as of large university hospitals. But all of us need to keep an ear open to what is coming through as better practice.

But then I hear my chief executive colleagues (and part of myself) say: don't Standards for Better Health and performance assessment give us enough to do on this? Well yes, but I would certainly volunteer for work on proposals about research adoption and knowledge transfer. To use a quaint phrase, it's about identifying the learning organisation.

The National Institute for Health Research is grappling with the incentives question and has been using the think tank Rand Europe to develop ideas on reward.

Meanwhile, there are instances of evidence producing local change, where organisations and services help. Compliance with guidance on managing venous thrombosis, derived from King's College Hospital's own research, has been important in an area of clinical risk much greater than healthcare-acquired infection. Evidence on acute stroke management prompted the inception of a 24/7 service over two years ago. The sharing of evidence and results with PCTs and trusts has prompted movement towards a stroke network, which I imagine would have happened even without the Darzi review recommendations for London.

My guess is that anyone reading this will have their own stories of the adoption of research findings. Hearing those stories is probably at least as important as hearing review proposals and recommendations. And how they dealt with the mental programming challenges is key.