‘Trust Me, I'm an Expert’: The Pandemic Parade of Pompous Professionals

By Chuck Dinerstein, MD, MBA — Jul 30, 2024
In a world where experts are our go-to folks for solving everything, the COVID-19 pandemic exposed just how fallible these specialists can be. Despite impressive titles and years of experience, many experts were just as clueless as the rest of us, often with unwarranted confidence. A recent study delves into the uncomfortable truth: the more knowledge you think you have, the less aware you might be of your actual ignorance.

We rely on experts and their expertise to help us solve problems, be it our individual health concerns with a physician or policy issues that affect regional or national populations. The missteps of expertise in the COVID pandemic generated a tidal wave of articles, ostensibly analyzing, but more frequently blaming, poor health or financial outcomes on the intent of experts. But what if these missteps, made at the edge of our knowledge, lie more in the nature of expertise itself than in alleged deep-state conspiracies?

A recent study in the Journal of Behavioral Decision Making, which looks at experts and their knowledge, begins with an important but generally overlooked question: What is an expert? Merriam-Webster defines an expert as

“one with the special skill or knowledge representing mastery of a particular subject.”

The researchers point out that this definition is too vague for quantitative research and that, while the field offers no authoritative alternative, expertise is frequently measured by

  • Professional titles or degrees
  • Years of experience
  • Performance on “domain-specific” tasks

They also consider a more philosophical dimension to expertise – the ability to know what you do not know. 

“To know what you know and what you do not know, that is true knowledge” 

– Confucius

The researchers suggest that this true knowledge, what we might call wisdom, is what we most desire. Experts, as COVID reveals, can profoundly influence our lives, but they are human, and their judgments are “not always accurate, and they can be overconfident.” 

“We seek to understand whether experts, as classified by these criteria, can live up to the idealized conception of an expert from a philosophical standpoint.”

Dunning-Kruger effect

This study examines a bias in our perceptions, the Dunning-Kruger effect, in which people with limited competence overestimate their abilities (Dunning was one of the two authors of the current study). While Dunning-Kruger is often applied to the non-expert, this work zeroes in on how the cognitive bias affects experts and their opinions. The researchers introduced two measures of “metacognition” – knowing what we know and what we do not.

  • Murphy’s Resolution – the ability to distinguish between correct and incorrect responses based on the respondent’s confidence. A high Murphy’s Resolution means that the expert’s confidence in an answer is a reliable indicator of its correctness.
  • Yates’ Separation – “the gap between average confidence for correct versus incorrect responses.” A large Yates’ Separation suggests that the respondent’s confidence is aligned with correctness – the individual knows what they do and do not know. (A rough sketch of both measures follows the quote below.)

In short, the wise expert will have 

“a high Murphy's Resolution, a large Yates' Separation, as well as high confidence for correct answers and low confidence for incorrect ones.”
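The paper works with quantitative versions of these ideas. As a minimal sketch – assuming the standard calibration-literature definitions rather than the authors’ exact implementation, and using made-up data – here is how a Murphy-style resolution and Yates’ Separation can be computed from a set of answers, each scored correct or incorrect and paired with a stated confidence:

from collections import defaultdict

# Hypothetical responses: (answered_correctly, stated_confidence from 0 to 1)
responses = [
    (True, 0.9), (True, 0.8), (False, 0.8),
    (True, 0.7), (False, 0.6), (False, 0.9),
]

def murphys_resolution(responses):
    # Murphy-style resolution: how much accuracy varies across stated
    # confidence levels; higher means confidence sorts right from wrong.
    overall_accuracy = sum(correct for correct, _ in responses) / len(responses)
    bins = defaultdict(list)
    for correct, confidence in responses:
        bins[confidence].append(correct)          # group answers by confidence
    return sum(
        len(group) * (sum(group) / len(group) - overall_accuracy) ** 2
        for group in bins.values()
    ) / len(responses)

def yates_separation(responses):
    # Yates' Separation: average confidence on correct answers minus
    # average confidence on incorrect answers.
    right = [conf for correct, conf in responses if correct]
    wrong = [conf for correct, conf in responses if not correct]
    return sum(right) / len(right) - sum(wrong) / len(wrong)

print(f"Murphy's resolution: {murphys_resolution(responses):.3f}")  # ~0.083
print(f"Yates' separation:   {yates_separation(responses):.3f}")    # ~0.033

On these toy numbers the separation is tiny – the respondent is nearly as confident when wrong as when right, the very pattern the study flags as poor awareness of one’s own ignorance.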

To measure an expert’s wisdom, they conducted studies involving climate scientists, psychologists, and investors. Each study identified a group of experts by title or degree and a group of non-experts recruited through online aggregators. Participants answered a series of domain-specific questions, recording both their answer and their confidence in it. [1]

Unsurprisingly, experts demonstrated greater knowledge and more accurate self-assessments than non-experts. They showed greater confidence in the questions they answered correctly than in those they answered incorrectly. However, “they still made misjudgments that they held with confidence.” Those mistakes reflected unfounded confidence in what they did not know – experts were less aware of the gaps in their knowledge than the non-experts, leading the researchers to conclude that

“Awareness of error appeared blunted by expertise.”

The findings shifted once the researchers accounted for the difficulty of the questions – flattening the “hard-easy effect,” the tendency to be overconfident on hard questions and underconfident on easy ones. The experts’ improved calibration disappeared, and they exhibited “greater overconfidence than citizens.” It may be that experts’ unfounded confidence comes from their day-to-day experience with easy tasks, unlike those confronted during the COVID pandemic.

Defining expertise by years of experience rather than academic titles and degrees made no difference. While experts were better calibrated to what they know, the Dunning-Kruger bias toward overconfidence was in full effect for knowing what they do not know. Expertise can generate unfounded confidence. We have seen this in several Nobel Laureates who have opined in areas outside their Nobel expertise; Linus Pauling’s views on Vitamin C are a prime example. [2]

“That is, experts had better metaknowledge regarding what they knew but equal or worse metaknowledge regarding what they did not know.”

Why might experts be just as affected by the Dunning-Kruger effect as mere mortals? One offered explanation is a positively biased reward system – one that rewards those “showcasing” their knowledge with correct answers on tests. Tests that do not penalize wrong answers accelerate the bias. It is more challenging to say “I don’t know” or “I’m not sure” than to weave some plausible ideas into a coherent-sounding fabric. No one wants their physician to end the conversation with “I don’t know.” [3]

The answer may lie in another ancient concept, hubris – the quality of “extreme or excessive pride, dangerous overconfidence, and complacency, often in combination with arrogance.”

“In a sense, lacking knowledge itself is not a disaster, lacking the awareness of one's lack of knowledge is. It hinders one from gaining knowledge, listening to others' good advice, and making efficient decisions, and this is true for both misinformed bottom performers and experts. In sum, the work herein provides a cautionary tale about guarding against error in judgment and action: Being an expert means making more correct decisions, but it does not mean the eradication of all errors. Those errors may not be anticipated, so one must stay on guard against making errors, whatever one's level of expertise might be.”

The unanticipated Dunning-Kruger effect seen in experts, as demonstrated in this study, provides a reasonable understanding of why our experts got COVID wrong without positing some nefarious intent or cabal. Experts are human, and humans err. The real message of the COVID “root-cause analysis” is being open to divergent, dare I say diverse, views.

 

[1] Sample questions from the three studies:

  • “Science says that the global average temperature in the air has increased approx. 3.1 °C in the past 100 years. [A. True; B. False] How certain are you that your answer is correct? [Rate Confidence]%”
  • “If a test of the null hypothesis gave p = 0.03, it means that the null hypothesis has a 3% chance of being true. [TRUE] or [FALSE] How likely do you think your response is correct? [Rate Confidence]%”
  • “You invest $500 to buy $1000 worth of stock on margin. The value of the stock drops by 50%. You sell it. Approximately how much of your original $500 investment are you left with in the end? [A. $500; B. $250; C. $0] [Rate Confidence]%” (The arithmetic is worked out below.)
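For the curious, the arithmetic behind the margin question works out as follows – a sketch assuming no interest or trading fees, which the question does not mention:

own_cash = 500                      # your own $500
borrowed = 500                      # borrowed on margin
stock_value = own_cash + borrowed   # $1,000 of stock purchased

stock_value *= 0.5                  # the stock drops by 50%, to $500
left_over = stock_value - borrowed  # selling repays the $500 loan

print(left_over)                    # 0 -> answer C: the original $500 is gone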

 

[2] The problem is ubiquitous enough to have been named Nobel Disease. Among the afflicted are Kary Mullis, winner of the 1993 Nobel Prize in Chemistry for the development of the polymerase chain reaction (PCR), who did not believe AIDS was caused by HIV, and Nikolaas Tinbergen, the 1973 Nobel Laureate in Physiology or Medicine for discoveries concerning the organization and elicitation of individual and social behavior patterns in animals. In his Nobel acceptance speech, Tinbergen promoted the belief that autism was due to a lack of maternal warmth, “thereby setting a nearly unbeatable record for shortest time between receiving the Nobel Prize and saying something really stupid about a field in which the recipient had little experience.”

 

[3] I would argue that “I don’t know” is often a better way to begin a conversation with a patient because it acknowledges uncertainty and frames the discussion around what the physician believes is the best path forward.

 

Source: Metaknowledge of Experts Versus Nonexperts: Do Experts Know Better What They Do and Do Not Know? Journal of Behavioral Decision Making DOI: 10.1002/bdm.2375


Chuck Dinerstein, MD, MBA

Director of Medicine

Dr. Charles Dinerstein, M.D., MBA, FACS is Director of Medicine at the American Council on Science and Health. He has over 25 years of experience as a vascular surgeon.
