Diagnostic Error & Normalization of Deviance Mitigation

The Impact of Diagnostic Error on Patient Care Outcomes

Graber et al.1 define diagnostic error as “a diagnosis that was unintentionally delayed (sufficient information was available earlier), wrong (another diagnosis was made before the correct one), or missed (no diagnosis was ever made), as judged from the eventual appreciation of more definitive information.” Graber et al.1 found that diagnostic errors occur in approximately 5% of radiology or pathology diagnoses and in up to 12% of diagnoses in clinical practice, with even higher rates in emergency care, where complex decisions are required on average every 72 seconds2.

Since then, a literature review by Graber3 highlighted autopsy studies identifying major diagnostic discrepancies in 10–20% of cases, internists misdiagnosing 13% of common conditions, 10–30% of breast cancers being missed on mammography, 1–2% of cancer biopsies being misread, and 12–51% of subarachnoid hemorrhages being misdiagnosed.

Graber4 discussed three issues that make mitigating diagnostic errors complex:

  1. They are “fundamentally obscure”
  2. They have not been viewed as a healthcare system problem
  3. Physicians rarely perceive their own error rates as problematic

Why are diagnostic errors a leading cause of harm in hospitals?

The aetiology of diagnostic errors comprises no-fault, systems-related and cognitive errors5. No-fault errors include incidents in which the disease presents with masked or unusual manifestations, or in which the diagnosis is obscured because the patient is uncooperative with the evaluation; no-fault errors are the sole cause in approximately 7% of cases.

A systems analysis perspective attributes diagnostic errors to failures in care delivery or system function: for example, failure to flag and act on an abnormal or critical laboratory result, productivity pressures, discontinuous care (i.e. handovers), poorly standardized processes and protocols, poor communication, or the lack of essential patient-relevant information6. A corrective systems approach would focus on redesigning the work environment, technology, communication tools and organizational structure to optimize their ability to provide safe care.

A cognitive psychology perspective attributes diagnostic errors to how physicians reason, formulate judgments and make clinical decisions. Before a decision is selected, the necessary data must be acquired, analysed, synthesized and verified. Each of these steps presents a risk of error, ranging from information perception deficits, failed heuristics and biases to other cognitive or affective states that impact correct decision selection. See Bordini et al.5 for more information about cognitive biases that contribute to diagnostic errors.

Research by Graber4 identified that system-related factors were the sole contributor in 19% of diagnostic errors, while cognitive factors alone accounted for 28% of cases. In reality, the majority of diagnostic errors are multifactorial and multicausal. As human beings we have a tendency to be swayed by initial impressions (whether our own or others’). As clinicians we may discount, fail to detect, or be slow to act on subtle but significant signs that suggest a misdiagnosis. Sometimes we may even improperly interpret frankly abnormal signs as “normal” for a particular patient or circumstance.

This ‘normalization of deviance’, and diagnostic error itself, should be less likely in settings where tracking the response to treatment and questioning patient progress is a team activity, or where collaborative cross-checking and second-guessing ourselves become a normal daily routine.

None of us is as smart as all of us!

To reduce an individual’s propensity to be influenced by system or cognitive factors, the natural counterbalance is to adopt a team-based approach to decision making. Including additional team members reduces the potential for individual biases, failed heuristics, perception deficits or environmental factors to produce a diagnostic error, because each of those additional people would also need to succumb to the same erroneous reasoning. The risk, or limitation, of teams is of course the potential for groupthink, where overconfidence in the group’s ability creates complacency or confirmation bias. This can, however, be moderated through closed-loop communication processes, collaborative cross-checking, checklists and associated treatment protocols.

Re-consider the slippery slope of deterioration with these additional processes placed earlier in the resilience time spectrum, embedding proactive care through processes such as the nursing change-of-shift huddle, bedside handover and interdisciplinary bedside rounds. These processes require staff, at a minimum, to input the latest information on the patient and their risk states, but should also prompt staff to question the patient’s response to treatment, cross-check possible emergent risk states or deviations from the expected response to the plan of care, and plan for discharge. This overcomes, as a matter of routine, the deficiencies in situation awareness that can contribute to diagnostic error or normalization of deviance. Consequently, a new normal of the continuum of care is created, in which proactive nursing change-of-shift processes coupled with Structured Interdisciplinary Bedside Rounds (SIBR) subtly coerce staff into proactive patient management, with identify-and-mitigate strategies built into the normal flow of care.

The Re-designed Slippery Slope of Deterioration

The structure, process, and team culture of an Accountable Care Unit offer great potential to re-examine assumptions and act on unexplained irregularities.

To explore these concepts further and discover more practical strategies for overcoming clinical inertia and integrating other advanced teamwork concepts in your teams, visit our Resources.

References

1 Graber, M. L., Franklin, N. & Gordon, R. Diagnostic Error in Internal Medicine. Arch. Intern. Med. 165, 1493-1499, doi:10.1001/archinte.165.13.1493 (2005).
2 Fitzgerald, M. et al. Trauma Resuscitation Errors and Computer-Assisted Decision Support. Arch. Surg. 146, 218-225, doi:10.1001/archsurg.2010.333 (2011).
3 Graber, M. L. The incidence of diagnostic error in medicine. BMJ Quality & Safety 22, ii21-ii27, doi:10.1136/bmjqs-2012-001615 (2013).
4 Graber, M. Diagnostic errors in medicine: a case of neglect. The Joint Commission Journal on Quality and Patient Safety 31, 106-113 (2005).
5 Bordini, B. J., Stephany, A. & Kliegman, R. Overcoming Diagnostic Errors in Medical Practice. The Journal of Pediatrics 185, 19-25.e11, doi:10.1016/j.jpeds.2017.02.065 (2017).
6 Singh, H., Petersen, L. A. & Thomas, E. J. Understanding diagnostic errors in medicine: a lesson from aviation. Qual. Saf. Health Care 15, 159-164, doi:10.1136/qshc.2005.016444 (2006).