The Flawed, Idealized Metrics of Lists Like WalletHub's '2019 Best & Worst States for Doctors'

Our culture likes lists. Websites and media outlets know people click on them. The problem is that lists routinely skew reality rather than reveal it.

Our culture likes lists, especially when they are distilled-down versions of unwieldy, complex topics. Who among us didn’t find value in CliffsNotes study guides, even when we did read the entire book for high school English literature class? But at least those tackled the nuances and layered themes, allowing for greater understanding of both the close-up and the 30,000-foot views. Websites and media sources today recognize that people click on lists a lot, and as our attention spans evaporate under nonstop stimulation, lists are ripe for consumption. The problem is that lists like WalletHub’s just-released 2019’s Best & Worst States for Doctors skew reality more than they enhance it.

The personal finance site describes its methodology this way:

“In order to help doctors decide where to practice...compared the 50 states and the District of Columbia across 18 key metrics. Our data set ranges from average annual wage of physicians to hospitals per capita to quality of public hospital system.”

Just because eighteen metrics are used doesn’t mean they are genuinely “key” to optimal medical practice. Salary estimates, for instance, are routinely inaccurate because compensation varies so widely between and within regions, depending on everything from specialty to employment model. And ignoring factors that erode physician autonomy or intensify administrative burdens makes it tough to consider WalletHub’s “medical environment” assessment comprehensive or representative.

One metric used was the number of continuing medical education (CME) credits required to maintain a state medical license. But physicians face those requirements and more through specific hospital affiliations and their malpractice insurance obligations. And it is typically maintenance of certification (MOC) for specialty board renewal, not CME, that has come most under fire for onerous stipulations.

Then there is the medical resident retention rate, the share of physicians who stay to practice in the state where they trained. This is highly variable and often has nothing to do with greater satisfaction with a state’s way of practice. Social influences may be at play: doctors may have selected a residency location in the first place because it is where they grew up, because it was known to be best for their specialty, or because an entrenched, respected mentor offered them a position. The reasons doctors end up in particular places are many and deeply personal.

What a list includes and excludes shifts its methodology substantially, and with it the list’s value. Does that mean this list and others like it are useless? Not entirely, but it does mean that using a list like this to guide your future is more of a gamble than a sure thing. Taking it with a grain of salt, as a spark for considerations that might matter to you, is a safer bet. Teasing good information from bad is always a worthwhile exercise, and reviewing which indicators were actually used is the best first step in weighing how much this publication should count in your career decision-making.

And that is different for everyone.