Reporting of medical errors: time for a reality check
Lucian L Leape
Harvard School of Public Health, Boston, MA 02115, USA
leape@hsph.harvard.edu


    Earlier this summer an expert group chaired by the Chief Medical Officer in the UK produced a comprehensive and thoughtful analysis of the current unacceptable state of identifying, analysing, and learning from medical mishaps.1 Although this report, provocatively named An Organisation with a Memory, applies specifically to the UK National Health Service (NHS), its analysis—and prescriptions—apply to health organisations the world over. The report embraces the insight from industrial safety research pioneered in the UK by Reason and others that human errors typically result, not from carelessness or incompetence, but from systems failures that are sometimes complex and difficult to analyse and correct.2

    The call for better reporting, a more open culture, better mechanisms for ensuring that necessary changes are made, and a much wider appreciation of the value of the systems approach is welcome. The cornerstone of the recommendations is a greatly enhanced system of national reporting of adverse events. Although the benefits of such a programme seem self-evident, two questions must be addressed before proceeding with such a plan—namely: “Why aren't these events being reported now?” and “What would be the cost of such a system?”

    Charles Billings, architect of the highly successful Aviation Safety Reporting System in the USA, has pointed out that there are two major reasons why people don't report adverse events: fear and lack of belief that reporting will lead to improvement.3 Fear is multidimensional—fear of embarrassment, fear of punishment of self, fear of punishment of others, fear of litigation. Fear arises from the belief that errors and mishaps are caused by carelessness for which the responsible individual should be punished. Doctors and nurses have been taught to believe this, so they fear both making a mistake and being caught. They and the public are quick to blame individuals when they make errors.

    The expert group notes that “blame cultures . . . can encourage people to cover up errors for fear of retribution.” This masterful understatement conceals the heavy price that the blaming culture exacts from doctors and nurses whose errors are discovered.4 Interestingly, these punishments are usually calibrated to the gravity of the injury, not the gravity of the error. The nurse who administers a tenfold overdose of morphine that is fatal will be severely punished, but the same dosing error with a harmless drug may barely be noted. For a severe injury, loss of the right to practise or a malpractice suit may result. Moderate injuries may result in a reprimand or some restriction in practice. Punishments for less serious infractions are more varied: retraining, reassignment, or sometimes just shunning or other subtle forms of disapproval.

    But the worst punishments are often self-inflicted: shame and guilt.5 The expectation of perfect performance is deeply ingrained in doctors and nurses, beginning in school and continually reinforced in everyday practice. Shame results when we fail, which we inevitably do. Not surprisingly, doctors and nurses often will not admit errors—to themselves or to others. They don't report errors they can hide.

    Reporting also rarely leads to improvement. Typically, the inquiry stops with the identification of the person who made the mistake; organisations learn little about underlying causes and are not motivated to make changes that would prevent the error recurring. Medical staff are aware of this and react accordingly. Why expose yourself or a colleague to the risk of punishment when no benefit will result? Curing this dysfunctional system—creating the learning organisation that the report calls for—will not come easily.

    But if these obstacles could be overcome and a national reporting system were implemented, what would it cost to collect and analyse reports and make recommendations? The Aviation Safety Reporting System spends about $3 million annually to analyse roughly 30 000 reports, or about $100 (£66) per case. These “near miss” situations are far simpler to analyse than actual accidents, thorough investigation of which would almost certainly cost far more. It would be interesting to know, for example, the cost per case of investigations reported to the confidential enquiries. However, if we applied the figure from the Aviation Safety Reporting System to the 850 000 adverse events that are estimated to occur annually in the UK,1 the cost of investigation would be about £56 million annually.
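    As a reality check on these figures, the arithmetic can be laid out explicitly. The minimal sketch below uses only the numbers quoted above (the $3 million budget, 30 000 reports, the £66 per-case sterling equivalent, and the 850 000-event estimate); the variable names are illustrative and nothing here is an official costing.

```python
# Back-of-envelope projection of national investigation costs.
# All figures come from the text above; none are official estimates.

ASRS_BUDGET_USD = 3_000_000   # annual Aviation Safety Reporting System budget
ASRS_REPORTS = 30_000         # reports analysed per year
COST_PER_CASE_GBP = 66        # the article's sterling equivalent of ~$100
UK_ADVERSE_EVENTS = 850_000   # estimated annual adverse events in the UK

cost_per_case_usd = ASRS_BUDGET_USD / ASRS_REPORTS         # = $100 per report
national_cost_gbp = UK_ADVERSE_EVENTS * COST_PER_CASE_GBP  # = £56,100,000

print(f"Cost per case: ${cost_per_case_usd:.0f} (about £{COST_PER_CASE_GBP})")
print(f"Projected national cost: £{national_cost_gbp / 1e6:.1f} million per year")
```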

    Assuming that such expenditure is unlikely to be forthcoming, what are the alternatives? One might be to randomly sample and analyse, say, 10% of events. While such a sample might not be truly representative, it could produce useful information. Alternatively, analysis could focus only on fatal injuries, which probably represent about 5–10% of all events. This might produce the most reliable data since deaths are easy to identify and hard to conceal. Another option is to identify a group of egregious—or “sentinel”—events that suggest a serious breakdown of safety such as surgery on the wrong part of the body, suicide of a patient under precautions, or maternal deaths. This would provide a more manageable number and have the advantage of possibly leading to changes that would be universally appreciated. Yet another approach is to identify a target condition for study—for example, patient falls or mishaps associated with use of certain types of drugs such as anticoagulants, chemotherapy, or insulin. All institutions would be asked to identify all target events during a one-year period, conduct internal investigations, and report findings for national collation and learning. The costs of the investigations would be borne by the reporting institutions.
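    To give a sense of scale, the same per-case assumption can be applied to the first two alternatives (the sentinel-event and target-condition options shift investigation costs to the reporting institutions, so they are omitted). The fractions below are the illustrative ones mentioned above, not measured rates.

```python
# Rough costing of the sampling alternatives, reusing the assumptions above.

UK_ADVERSE_EVENTS = 850_000   # estimated annual adverse events in the UK
COST_PER_CASE_GBP = 66        # per-case analysis cost borrowed from the ASRS

alternatives = {
    "10% random sample": 0.10,
    "fatal injuries only, lower bound (5%)": 0.05,
    "fatal injuries only, upper bound (10%)": 0.10,
}

for name, fraction in alternatives.items():
    cases = UK_ADVERSE_EVENTS * fraction
    cost_millions = cases * COST_PER_CASE_GBP / 1e6
    print(f"{name}: {cases:,.0f} cases, roughly £{cost_millions:.1f} million per year")
```

    On these assumptions the annual bill falls from tens of millions to a few million pounds, which makes the trade-off between coverage and affordability a practical question rather than a fiscal impossibility.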

    Whichever approach is taken, the NHS would be wise to test the method before implementing it by assembling a group of expert analysts to process a batch of cases to determine both the yield and the cost of collecting and analysing data and of making recommendations. Consultation with managers of the British Airways Safety Information System on the costs of running that highly successful reporting system would also be worthwhile. The costs of a properly performed investigation are probably such that only a few can be afforded annually. If that is so, then great care must be exercised in deciding what reports are required to be filed by whom, for unanalysed reports are worse than no reports, breeding discouragement, cynicism, and distrust.

    Although the fiscal constraints to implementing any of these alternatives are considerable, the more formidable barrier remains the punitive environment that pervades our institutions. Changing that will be difficult indeed, for it is so deeply embedded in our hearts and minds. One way of changing hearts and minds is to change behaviour. Vincent and his colleagues at University College, London have pioneered the use of a medical accident investigative tool that leads hospital staff through a comprehensive and rigorous examination of all of the factors that could have played a part in causing an injury.6 Not only does this process invariably uncover multiple systems defects, but the act of using it also imprints on the users the inescapable fact that accidents result from multiple causes, of which the obvious human error is often the least important. If the findings are used by the hospital to correct defects, internal reporting will skyrocket. This tool, a protocol for the investigation and analysis of clinical incidents,7 should be in the safety armamentarium of every hospital.

    The NHS has a historic opportunity. An Organisation with a Memory gives much needed guidance and issues a mandate that must not be ignored. Wisely implemented and adequately funded, it can lead to substantial improvements in the safety of health care.
