Read First

Disclaimer

  • Medicine is not a recipe; not all patients are the same.
  • Always assess and manage the patient in front of you.
  • The following information will act as a guide. 

Aims

  • To assist doctors new to the ED (e.g. JMOs, overseas-trained doctors, doctors from other fields of medicine undertaking a term in ED) with assessing and managing common presentations.
  • To ensure you do not miss life-threatening causes of the patient's presenting complaint.
  • To ensure you rule out life-threatening causes of the patient's presenting complaint before ruling in more benign causes.
  • To assist with structuring your ED case presentations.
  • To assist with structuring your ED case notes.
  • To avoid cognitive biases.

Remember

  • More is missed by not looking than by not knowing.
  • If the patient does not respond to first-line treatment, ask yourself ‘Am I missing something?’

Cognitive Processes

  • System 1
    • Intuitive, fast and easy
    • Based on personal mindlines, heuristics, beliefs, judgement and preferences
    • Accurate for many decisions but vulnerable to cognitive biases
  • System 2
    • Analytical, slow and takes effort
    • Based on science and rational analysis

Cognitive Bias

  • Affective error
    • Convincing yourself that what you want to be true is true, rather than accepting a less appealing alternative.
    • Example: A friend with a headache is given a benign diagnosis rather than being subjected to an LP to exclude SAH.
    • Example: An annoying patient with SOB is diagnosed with anxiety rather than a PE.
  • Aggregate bias
    • The belief that the aggregate data underpinning validated clinical decision instruments do not apply to the patient in front of you.
    • Example: increased CT usage when PECARN or PERC rules are ignored.
  • Ambiguity effect
    • Tendency to select diagnoses whose probability is known over those whose probability is unknown.
    • Example: A febrile returned traveler from the Caribbean is more likely to be diagnosed with influenza than chikungunya because we know how to test for and treat influenza.
  • Anchoring bias
    • Prematurely settling on a single diagnosis based on a few important features of the initial presentation and failing to adjust as more information becomes available.
    • Example: Pleuritic chest pain = PE; the D-dimer returns negative, but it still ‘has to be’ PE.
  • Ascertainment bias
    • Your thinking is shaped by prior expectations.
    • Example: A known homeless IVDU with reduced LOC must have overdosed, when in fact he has severe hypoglycemia.
  • Availability bias
    • Tendency to judge likelihood of a disease by the ease with which relevant examples come to mind.
    • Example: In the middle of flu season it is easier to diagnose every patient with SOB as having influenza, potentially missing a PE.
  • Base rate neglect
    • Failure to incorporate the true prevalence of a disease into diagnostic reasoning.
    • Example: Overestimating the pre-test probability of PE leads to working it up in essentially no-risk patients, increasing costs, false positives and patient harm.
  • Belief bias
    • Tendency to accept/reject data based on personal beliefs
    • Example: A personal belief that tPA works in stroke despite evidence to the contrary.
  • Blind obedience
    • Inappropriate deference to recommendations of authority (superiors) in absence of sound rationale.
  • Blind spot bias
    • Failure to recognise our own weaknesses or cognitive errors while readily recognising those of others.
  • Commission and omission biases
    • Commission = tendency towards action rather than inaction.
    • Omission = tendency towards inaction rather than action.
    • Example: When working up low-risk patients we make errors of commission by over-investigating when we would be better off doing nothing.
    • Example: In resuscitation situations, hesitancy to act leads to errors of omission.
  • Confirmation bias
    • Once you have formed an opinion, you tend to notice only evidence that supports it and to ignore contrary evidence.
    • Example: Unilateral throbbing headache, photophobia, nausea and family hx of migraine = migraine, yet the fact that the patient described the onset as thunderclap is unconsciously discounted.
  • Framing effect
    • Decisions are affected by the way questions are asked or by where the patient is seen.
    • Example: You are more likely to miss a dx of AAA if you assess the patient on the ambulance stretcher than in a resus bay.
  • Gambler's fallacy
    • Erroneous belief that chance is self-correcting.
    • Example: The last 3 patients with SOB had a PE; therefore the next patient with SOB will not have a PE.
  • Overconfidence
    • Tendency to think one knows more than one does.
  • Premature closure
    • Tendency to stop too early in diagnostic process, accepting a diagnosis before gathering all necessary information or exploring all important alternatives
    • Example: When the diagnosis is made, the thinking stops.
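The base rate neglect entry above lends itself to a short worked example. The sketch below (illustrative numbers only, not real test characteristics) applies Bayes' theorem to show how the same positive result from a sensitive but non-specific test yields very different post-test probabilities depending on the prevalence of disease in the group being tested:

```python
def post_test_probability(prevalence, sensitivity, specificity):
    """Probability of disease after a positive test (Bayes' theorem)."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# Hypothetical D-dimer-like test: very sensitive, poorly specific.
SENS, SPEC = 0.95, 0.45

# Essentially no-risk patient (assumed 2% prevalence of PE):
low = post_test_probability(0.02, SENS, SPEC)

# Moderate-risk patient (assumed 30% prevalence of PE):
moderate = post_test_probability(0.30, SENS, SPEC)

print(f"Low-risk post-test probability: {low:.1%}")            # ~3.4%
print(f"Moderate-risk post-test probability: {moderate:.1%}")  # ~42.5%
```

Even after a positive result, the disease remains unlikely in the very-low-prevalence group; working up essentially no-risk patients therefore mostly generates false positives.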

When the Patient Deteriorates

  1. Activate System 2
    • During my initial assessment, was I distracted, fatigued or cognitively overloaded, or did I like/dislike the patient?
    • Was the diagnosis suggested to me by someone else?
    • Did I just accept the first diagnosis that came to mind?
  2. Repeat the history and exam
  3. Review the investigations
  4. Review collateral history and old records
  5. Does the repeat assessment match the initial diagnosis?
  6. Has the worst case scenario been assessed for and excluded?
  7. What else could this be (think of 3 alternatives)?
  8. What other organ systems have I not considered?
  9. Is this an atypical presentation of something common?
  10. What other investigations are required?
  11. Is there a checklist I can use?
  12. What can’t I explain?
  13. Get a second opinion.