Not too long ago, a disturbing case occurred in which a patient sustained a thermal injury from overheated irrigating fluid used during a bladder tumour resection. Sadly, there was no happy ending. The bladder never recovered from the trauma but became shrunken and contracted, producing debilitating symptoms. A cystectomy and urinary diversion became necessary, but further complications set in and eventually proved fatal. To the surprise of all concerned, the family did not initiate litigation, although the hospital involved did offer a profound apology.
So what lessons are to be learned from so tragic an episode? First, an enquiry into the facts of the case revealed a ‘chain of errors’ culminating in the fatal mistake of instilling irrigant close to boiling point into the bladder. A member of the nursing staff had deliberately reset the temperature gauge in the warming cabinet to nearly 100°C, with the best of intentions, to warm an especially cold bag of glycine solution rapidly, and had then forgotten to return it to the usual setting. An inexperienced agency nurse had removed the overheated bag of fluid and set it up for irrigation without recognising the ‘red flag’ that this represented. Moreover, on the day in question none of the regular theatre nursing staff was present, so no one fully appreciated the importance of the irrigant temperature. The operating surgeon had no idea that anything was wrong until the temperature of his resectoscope started to rise, but by then the damage had been done.
In other ‘high risk’ industries, such as aviation, a great deal of attention is focused on the so-called ‘human factors’ that lie behind similar catastrophes. Indeed, crew resource management (CRM) training in error avoidance is mandatory.1 This was introduced largely in response to the 1977 disaster at Tenerife, in the Canary Islands, when a KLM jet collided with a Pan Am aeroplane that was taxiing on the runway in foggy conditions. It transpired that an impatient KLM captain had initiated take-off without clearance from air traffic control. The very junior co-pilot did not feel confident enough to overrule the captain’s precipitate and foolhardy decision. As a consequence, everyone aboard the KLM aircraft and most of those on the Pan Am plane perished.
There are a number of teaching features in these CRM courses that apply to medicine just as much as aviation.2 The concept of the ‘error chain’ that develops as a series of mistakes culminating in the final, fatal error is a useful one. Not only can ‘system errors’ be identified, such as poor labelling or storage of potentially hazardous drugs, but ‘red flag’ danger signs, for example two patients with the same name and of similar age, can be recognised as potential sources of mistake, and an error chain can be interrupted before the fatal mistake occurs.
The ‘undue deference’ of juniors towards their seniors, which was partially responsible for the Tenerife runway disaster, and which certainly exists in medicine, is also emphasised in these CRM courses. Attention is also paid to the importance of senior doctors heeding the warnings of those around them. In a case in Wales, where the wrong (normal) kidney was inadvertently removed with fatal consequences, the potential error was actually pointed out by a medical student, but ignored by the operating surgeons. Parallels have been drawn between that ‘wrong-side’ error and the crash of the British Midland Boeing airliner at Kegworth, in Leicestershire, when the pilots responded to an in-flight engine failure by incorrectly shutting down the unaffected engine, with fatal consequences.
One enduring feature of the medical profession is a reluctance to face up to and admit freely that an error has occurred. This form of denial probably stems from several sources: first, in medicine criticism is not accepted as a gift. Nobody comes to work aiming to harm a patient, but when harm does occur, feelings of guilt and embarrassment often prevail. This leads to a reluctance to admit to the mistake, apologise to the patient and share the learning experience with others. Consequently, important lessons remain unlearned. The critical incident reporting system set up in the UK by the National Patient Safety Agency (NPSA) has gone some way to improving this situation, but a ‘blame culture’ still tends to persist, with managers quick to suspend doctors suspected of inadvertently harming a patient, and the ‘feral pack’ of the media even quicker to seize upon and report an untoward incident in a biased and sensationalist manner. Moreover, hospitals vary markedly in the enthusiasm with which they monitor and report accidents, with a more than five-fold variation in the number of critical incidents and near misses recorded by comparable institutions.
The overall impact of medical errors is far from trivial, not only on the patient and his or her family, but also on the staff and institution perceived as perpetrating the mistake. Almost 4000 patients treated by the NHS in England died last year following ‘safety incidents’ in which some aspect of their care went wrong. A further 7500 patients suffered severe harm as a result of accidents or botched treatment. Figures for the final six months of 2008–09, published recently by the NPSA, show that over the year 11,054 patients died or suffered harm as a result of medical errors, a rate of almost 1000 a month.
The financial cost of fighting medical negligence claims against the NHS is not to be ignored, and, to make matters worse, a great deal of the money paid out is in fact going into the pockets of lawyers. In one recent case, the law firm received 58 times as much as the victim. There have been 52,000 clinical negligence claims over the past five years, costing the NHS more than £5 billion. During this time, the NHS has spent £700 million on fees to lawyers for clinical negligence claims. Sadly, almost half (48 per cent) of the total compensation paid goes to the lawyers, compared with 35 per cent five years ago.
To a considerable extent the remedy for these problems lies in our own hands. For error avoidance to occur, a safety culture has to prevail. The recent introduction of the preoperative checklist is a useful practical development.3 Clinicians, and those who work with them, need to be imbued with a cross-checking mentality, not only in the operating theatre, but also in clinic and on the ward. Situational awareness, risk perception, cautious decision-making, avoidance of undue reliance on memory and management of stress responses are all relevant factors, as are teamwork and leadership. All of these areas can and should be included in continuing professional development programmes.4–6
If and when you inadvertently fill your diesel-engined car with petrol, you may curse and wonder how on earth you could have done it, but at least the problem can be resolved, albeit at some considerable expense. If, however, we inadvertently perform wrong-side surgery, or make a similarly critical clinical mistake, a simple remedy may not be possible.7
Professor R. Kirby, MA, MD, FRCS (Urol), FEBU, Director, The Prostate Centre, London;
Mr P. Dasgupta, MSc(Urol), MD, DLS, FRCS(Urol), FEBU, Consultant Urological Surgeon,
MRC Centre for Transplantation, NIHR Biomedical Research Centre, King’s College London, Guy’s Hospital, King’s Health Partners, London;
Mr C. Beacock, MBChB, FRCS, MD, Consultant Urologist, Royal Shrewsbury Hospital, Shropshire
1. O’Connor P, Campbell J, Newon J, et al. Crew resource management training effectiveness: a meta-analysis and some critical needs. Int J Aviat Psychol 2008;18:353–68.
2. Helmreich RL. On error management: lessons from aviation. BMJ 2000;320:781–5.
3. World Health Organization. World Alliance for Patient Safety. Implementation manual. Surgical safety checklist, June 2008. www.who.int/patientsafety/safesurgery/tools_resources/SSSL_Manual_finalJun08.pdf
4. Morey J, Simon R, Jay G, et al. Error reduction and performance improvement in the emergency department through formal teamwork training. Health Serv Res 2002;37:1553–81.
5. McCulloch P, Mishra A, Handa A, et al. The effects of aviation-style non-technical skills training on technical performance and outcome in the operating theatre. Qual Saf Health Care 2009;18:109–15.
6. Taylor C, Hepworth J, Buerhaus PI, et al. Effect of crew resource management on diabetes care and patient outcomes in an inner city primary care clinic. Qual Saf Health Care 2007;16:244–7.
7. Kirby RS. Learning the lessons from medical errors. BJU Int 2003;92:4–5.
(Source: Trends in Urology & Men’s Health)