Why ancient errors can aid in modern medical diagnosis

Astronomer Percival Lowell’s errors can remind practitioners to not take “facts” for granted

There is not actually life on Mars, but astronomer Percival Lowell’s errors can remind practitioners not to take “facts” for granted.

I recently wrote an article extolling the virtues of the SOAP method (subjective, objective, assessment, and plan) as the ultimate tool for helping veterinarians recognize patterns in the practice of medicine. I would go so far as to say pattern recognition is the aim of all medical learning. It follows that adding new facts to your brain’s cache should broaden your ability to recognize patterns and yield greater clinical success.

However, not so fast. Recent studies suggest many medical mistakes occur not because of a lack of knowledge, but because of insidious defects in the way we think.1 To delve deeper into these defects, known as cognitive biases, we are going to take a journey through the cosmos.

The Martians are coming

Our guide is Percival Lowell, an American astronomer of the 1890s. He built an observatory2 to study the dark smudges on Mars’ surface that his contemporary, Italian astronomer Giovanni Virginio Schiaparelli, had termed “channels” (“canali” in Italian). When Lowell looked through his state-of-the-art telescope, he did not see just a few of Schiaparelli’s canals. Rather, he saw hundreds, straight as a ruler, crisscrossing to form a network of supposed waterways connecting the polar ice caps to myriad oases in the arid equatorial plains.3 Given the recent, labor-intensive completion of the 120-mile-long Suez Canal, Lowell concluded a planet-wide canal system was present, and that it evidenced a staggeringly advanced civilization.

We now know the Red Planet has no canals and no advanced alien civilization. So, what happened?

Lowell was subject to two common misfires of thinking: apophenia and priming. Apophenia is the tendency to perceive a meaningful pattern among unrelated objects. In this case, Lowell’s high-magnification telescope caused his mind to perceive linear connections between topographical shadows. In other words, his brain created an optical illusion of discrete canals connecting smudgy dots on the planet’s surface. We all fall victim to seeing patterns where there are none: the face of the “Man in the Moon,” clouds forming animal shapes, the conviction that once you have performed two fat, friable spays in a week, you must brace yourself for the third to walk through the door.

Wait … I just made the case that pattern recognition is the bread and butter of our work. Yet we must also be on guard against patterns we think we see but that are not real.

In addition to apophenia, the second mistake Lowell committed was succumbing to priming. Schiaparelli described channels (a natural topographical feature), but the Italian “canali” translated too easily to the English “canal” (a man-made ditch engineered to direct water).3 Primed with a preconception of canals, Lowell looked at Mars expecting to see straight-lined public infrastructure, and lo and behold, he thought he found it. This priming produced a cognitive bias called anchoring, in which a person relies too heavily on the first piece of information about a topic and draws inaccurate conclusions.4

Clinicians can easily be swayed by those first pieces of information on a topic. Years ago, I remember coming home from a CE lecture by Dr. Todd Tams on ulcerative gastritis and the role Helicobacter might play. Thenceforth, I vowed to be on the lookout for the bug. The following week, a sour-breathed Chihuahua with intermittent hematemesis came in, and all I could think about was sending home an amoxicillin/metronidazole cocktail for Helicobacter. I was anchored. Was I right? Playing the numbers, probably not. Yet, I was firmly in the grip of a cognitive bias.

Blind spots in the practice

Lowell serves as an example of the availability bias as well, thanks to another planetary error he made, this one about Venus (see “Mistakes are from Venus”). Availability bias is the belief that the answer that comes most easily to mind is the right one. Most of the time, the first thing we think of (the information most immediately available to us) is, in fact, correct; this is how we navigate through life. The ease and availability of information becomes a “bias” only when it steers us wrong.

An example with a positive outcome: when you see a dehydrated, 12-week-old puppy with vomiting and diarrhea, you reach for the parvo test, because parvo is so often the cause and so often serious.

Here are some examples of that first (most available) answer leading you astray:

1) The parvo pup tests negative. We had a saying for that in my clinic: “Diagnosis: Parvo-Not-Parvo.” Meaning, either we thought the parvo test had given a false negative, or we were going to treat this suspected viral enteritis like parvo anyway, so what did it matter? (Except now we had to explain to the paying owner why we wasted money on the parvo test; don’t you just hate that?) I did not miss coccidia, because we always ran a fecal on these pups. However, it took great leaps of mental effort, and only after treatment failed to progress as expected over a few days, for me to remember intussusception should be on my list, too. Heaven forbid we get congenital Hirschsprung’s disease causing intestinal stenosis.7 (By the way, I had never heard of Hirschsprung’s before; I just looked it up for this article. I mean, there are zebras, and then there are zebras.)

2) Or take the recent case study you read: suddenly, the Labrador vomiting a week after her spay is automatically a tied-off ureter until proven otherwise. You run to do an ultrasound when you could take a beat, gather more history, and recognize this for the garbage gut it is.

3) How about the missed diagnosis? Do you carry that around your neck like a brick for months, promising yourself you will never make that mistake again? For better or worse, I have run more ACTH stimulation tests on puny, barfy, normokalemic dogs than perhaps I ought to, because I did not want to miss another atypical Addisonian. However, do I always treat those same patients for whipworms? When the fecal is negative, whipworms are not even on my radar.

The availability bias makes you exaggerate the likelihood of a diagnosis that comes readily to mind, and dismiss the likelihood of a diagnosis that is harder to recall.8 While we cannot expect a crystal ball to land us on the correct diagnosis by magic, cognitive biases like apophenia, priming/anchoring, and availability can lead each of us into diagnostic and treatment mistakes.

Three strategies

Diagnosis and treatment should strive for an analytical ideal. Evidence-based medicine increases our confidence that certain diagnostics and treatments will yield repeatable results. Yet, we still need to recognize our blind spots.

Knowing you are at risk of cognitive bias can help you avoid mistakes, but to truly overcome it, you must take your thinking one level deeper. Here are three strategies to help you refocus your mental telescope.

Each of these tactics requires you to disengage from the fast, efficient, effortless thinking you have developed through hard-won experience and plug into a labor-intensive, self-aware, analytical frame of mind.9 You do not have to do it for long, but you should do it long enough to spot-check yourself. This process is appropriately called “paying” attention because it costs you energy and time to reap the reward of accuracy. The more practiced you are at thinking about your thinking, the faster you can pivot through bias to adjustment.

We use cognitive shortcuts all the time to navigate life on autopilot. However, cognitive bias can mislead even the best scientific minds. When you are tempted to blaze through your thinking because you believe you do not have the time or energy to slow down, remember the famous astronomer who cried, “Mars has canals; Venus has spokes!” Far from making me laugh, Percival Lowell’s mistakes humble me. I hope they humble you, too.


Mistakes are from Venus

American astronomer Percival Lowell got it wrong about another planet, too: Venus. Here is how. The cloudy atmosphere of Venus made it impossible to obtain focused images of the planet’s surface, until Lowell closed his telescope’s 24-in. aperture down to 1.6 in. Then he locked onto an astonishingly clear image of the planet. The topography revealed a dark central spot radiating spoke-like extensions. Every time he looked, he saw the same thing. He spread the news.

Alas, the spokes of Venus do not exist. What was Lowell seeing this time? The answer is tantalizingly ironic. The extreme narrowing of the lens aperture unexpectedly turned the telescope into an ophthalmoscope that projected the shadows of Lowell’s retinal blood vessels onto the image of the planet. Every time Lowell viewed Venus, he was looking at the fundus of his own eye.5,6 Because the retinal hub-and-spoke pattern was familiar to him from the wheels of everyday life, he placed inordinate confidence in the first explanation that came to mind: this finally-in-focus pattern accurately represented the surface features of Venus.

Holly Sawyer, DVM, worked 19 years in small animal private practice before joining GuardianVets to train and coach the veterinary professionals who deliver 24-hour triage support to hospitals. In her free time, she snowshoes and hikes with her two dogs and bakes chocolate chip cookies, not always in that order. Writers’ opinions do not necessarily reflect those of Veterinary Practice News.


  1. Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak. 2016;16:138. doi: 10.1186/s12911-016-0377-1
  2. Mooney M. Secrets of Lowell: The Clark Telescope. Lowell Observatory. September 17, 2020. https://lowell.edu/the-unique-history-of-the-clark-telescope/
  3. NASA. The ‘Canali’ and the first Martians. NASA.gov. https://www.nasa.gov/audience/forstudents/postsecondary/features/F_Canali_and_First_Martians.html Accessed August 2, 2022.
  4. The Decision Lab. Why we tend to rely heavily upon the first piece of information we receive: Anchoring Bias, explained. https://thedecisionlab.com/biases/anchoring-bias Accessed August 2, 2022.
  5. Eschner, K. The bizarre beliefs of astronomer Percival Lowell. Smithsonian Magazine. March 13, 2017. https://www.smithsonianmag.com/smart-news/bizarre-beliefs-astronomer-percival-lowell-180962432/
  6. Sheehan, W. Venus spokes: An explanation at last? Sky & Telescope: The Essential Guide to Astronomy. July 23, 2003. https://skyandtelescope.org/astronomy-news/venus-spokes-an-explanation-at-last/
  7. Morales-Miranda, A. Congenital intestinal stenosis and Hirschsprung’s disease: two extremely rare pathologies in a newborn puppy. BMC Vet Res 2019;92(15). doi.org/10.1186/s12917-019-1806-z
  8. Morgenstern, J. Decision making in emergency medicine: Availability Bias. First10EM. March 7, 2022. doi.org/10.51684/FIRS.125778
  9. Kahneman, D. Thinking, Fast and Slow. 2011. New York, New York: Farrar, Straus, and Giroux.
