A deep dive into veterinary diagnosis: Part 2 Managing Cognitive Error

Posted in Guest Blogger @ May 5th 2022 - By The AMR Vet Collective Team
A deep dive into how we think through diagnoses - cognitive biases that might trip you up.

We hope that you have been enjoying thinking about ‘thinking’ after the last blog post: A deep dive into veterinary diagnosis: Part 1 Intuitive & Analytical Systems. Yes, we are talking “metacognition” for all of those nerds out there (we, of course, use ‘nerd’ as the greatest term of endearment…)

Our last blog post summarised the first of three papers published by Canfield and colleagues (https://doi.org/10.1177/1098612X15623116) and covered the basics of System 1 (immediate and unconscious) and System 2 (effortful and analytical) thinking. This second post will cover the second paper in the series (https://doi.org/10.1177/1098612X16631233) and explores cognitive error.

So, what IS cognitive error?

In the paper it is described as “flawed clinical reasoning, due to faulty knowledge, data gathering and synthesis”.

Cognitive error is often driven by ‘biases’, which can occur with both types of thinking (although they are most common with System 1 and with flawed System 2 thinking).

Cognitive biases that might trip you up...

So, let’s cover a few of the more important cognitive biases that might trip you up in your diagnoses…

Confirmation bias

The first is ‘confirmation’ bias. This is where you tend to interpret, recall and focus on information that confirms your prior or early beliefs (preconceptions) about a case. This bias can result in the discounting of other possibilities once you latch on to an early diagnosis.

Availability bias

Similar but different is ‘availability’ bias. When this bias occurs, diagnoses or events that have greater ‘availability’ in your memory because you have seen these cases recently, or they were associated with an unusual or emotional presentation, are more likely to be called on than others.

Anchoring bias

Another important bias is ‘anchoring’ bias where a single piece of information (often the first) is relied on too heavily when making decisions, acting as an anchor to which all other information is referenced, rather than part of the big picture. This bias also causes us to stick with a diagnosis, despite additional information (including lack of treatment response) that might discredit it.

Gambler's fallacy

Finally, ‘gambler’s fallacy’ is where we think that a certain event or diagnosis is more or less likely to happen based on the outcome of previous events (every vet in the history of the world has had ‘runs’ of certain presentations, breeds or owners). While this may very well be true for infectious or environmental diseases, most presentations can and should be considered as independent events.

No doubt some, if not all, of these biases are familiar to you in some way.

Sound familiar?

Can you recall instances when you have fallen prey to one or more? Or, conversely, can you think of times where the use of these biases has actually helped you to work through a diagnosis? Yes, it’s true that while biases can mislead us, they can also provide “short cuts” for thinking about a diagnosis, leading to rapid results.

BUT, because these results aren’t always correct, the next step is to harness our knowledge about biases and systems thinking to ensure that we are aware of their presence and can minimise their misleading effects while capitalising on their benefit.

Harnessing the simple “de-biasing” strategy

To do this, Canfield and colleagues discuss a simple “de-biasing” strategy that focuses on asking yourself two questions when you become aware that you may have fallen prey to bias in your diagnostic process:

  1. What do I need to do next to support my presumptive diagnosis?

  (…and how do I ensure that this supporting evidence is objective and accurate?)

  2. If my diagnosis is wrong, what other possibilities exist?

  (…and how can I remain both open-minded AND sceptical to ensure that what I have read or heard is NOT misleading me?)

If you consistently ask yourself these simple questions in relation to each case you address, you will be well on the way to managing cognitive error.

Your take home message

So, the take home messages here are:

  1. Be AWARE of the biases that you are (or might be) operating under;
  2. Take time to REFLECT on your cases and how you approach them (any great diagnostician is a great reflector and accepts that they will be wrong from time to time);
  3. Ensure that you use trained System 2 thinking to support any System 1 thinking; and
  4. Introduce some ACCOUNTABILITY, seek ADVICE and develop CHECKLISTS where necessary to support these approaches.

In short, if you ensure that all processes that you utilise combine a healthy dose of objectivity, open-mindedness and scepticism you can’t go wrong (OK, you can always go wrong, but the probability will be vastly reduced!).

Stay tuned for our third and final blog in this series that will discuss the use of mental shortcuts (heuristics) and illness scripts in diagnostic reasoning.

Click here to read:

A deep dive into veterinary diagnosis: Part 1 Intuitive & Analytical Systems

A deep dive into veterinary diagnosis: Part 3 the use of heuristics in clinical reasoning

This post first appeared on the AMR Vet Collective website 6.4.2022

About The AMR Vet Collective

At The AMR Vet Collective, we translate the science around AMR and stewardship into meaningful and practical information that veterinarians can call upon to make informed, evidence-based decisions in their daily practice.

 

 

Click here to visit the AMR Vet Collective website
