Risk Calculators Need Maintenance Just Like All Infrastructure

When you mention infrastructure maintenance, what comes to mind: roads, bridges, a political football, something as exciting as watching paint dry, job creation? Have you considered the possibility that a form of infrastructure maintenance can change whether you are considered an “at-risk” patient in need of aspirin or statins? That question and its answer are the subject of a new paper in the Annals of Internal Medicine.

Because we have been collecting medical data for some time, we can create mathematical models of a patient’s risk of developing complications from a variety of diseases, especially cardiovascular disease. The classic complications (heart attack, death from coronary disease, and stroke) have been extensively modeled by following large groups of patients over time and applying statistical techniques to predict which patients will suffer these events. These pooled cohort equations, which most of us refer to as risk calculators, are used to guide therapy, and that is a good thing. There is no sense prescribing medications for people who will not be helped; it is better to target our treatment to those with real risks. Many of today’s guidelines are couched as recommendations for patients “at risk” during the next ten years, so it becomes crucial to have accurate risk calculators; otherwise, some patients will be overtreated and others undertreated (the more worrisome of the two).
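
To make the idea concrete, here is a minimal sketch, in Python, of how a risk calculator’s output drives a treatment decision. The predictors, coefficients, baseline survival, and the 7.5% cutoff are illustrative placeholders, not the actual pooled cohort equations; only the general shape (a Cox-style score converted to a 10-year risk, compared against a guideline threshold) is the point.

```python
import math

# Toy 10-year risk model in the spirit of a pooled-cohort-style calculator.
# The coefficients, predictors, and baseline survival below are made-up
# placeholders for illustration -- NOT the actual 2013 or revised equations.
TOY_COEFFS = {
    "age": 0.065,          # per year
    "total_chol": 0.004,   # per mg/dL
    "hdl": -0.012,         # per mg/dL
    "systolic_bp": 0.010,  # per mm Hg
    "smoker": 0.55,        # 1 if current smoker
    "diabetes": 0.45,      # 1 if diabetic
}
TOY_BASELINE_SURVIVAL = 0.95   # hypothetical 10-year event-free probability
TOY_MEAN_SCORE = 6.0           # hypothetical population-average linear score

def ten_year_risk(patient: dict) -> float:
    """Estimate 10-year event risk, Cox-style: 1 - S0 ** exp(score - mean)."""
    score = sum(TOY_COEFFS[k] * patient[k] for k in TOY_COEFFS)
    return 1.0 - TOY_BASELINE_SURVIVAL ** math.exp(score - TOY_MEAN_SCORE)

def recommend(risk: float, threshold: float = 0.075) -> str:
    """Guidelines phrase treatment as a cutoff on predicted risk (7.5% here for illustration)."""
    return "consider statin therapy" if risk >= threshold else "observe and reassess"

patient = {"age": 58, "total_chol": 210, "hdl": 45,
           "systolic_bp": 138, "smoker": 1, "diabetes": 0}
risk = ten_year_risk(patient)
print(f"Predicted 10-year risk: {risk:.1%} -> {recommend(risk)}")
```

A patient near the cutoff is exactly the person whose label, and therefore whose prescription, can flip when the equations are updated.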

The study looks at the risk calculator developed in 2013 and considers two maintenance strategies: updating the data and improving the mathematics behind the model. Updating the data makes intuitive sense; improving the statistical analysis is a bit harder to comprehend. The problem here is twofold. First, there is overfitting, a mathematical term for using too many variables to explain the findings, making the simple overly complex. Second, these models make an assumption about how risk and treatment interact: to simplify the calculation, treatment is given a fixed relationship to risk reduction, which is a poor assumption for a dynamic biologic process like a disease.
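
Overfitting is easier to see than to describe. The sketch below, using simulated data rather than anything from the study, fits models with progressively more terms to a small, noisy sample; the fit to the data in hand keeps improving, while the fit to held-out data does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "cohort": the true relationship between a single predictor and
# outcome risk is simple; the observations carry noise, as real data do.
def true_risk(x):
    return 0.1 + 0.3 * x

x_train = rng.uniform(0, 1, 15)
y_train = true_risk(x_train) + rng.normal(0, 0.05, 15)
x_test = rng.uniform(0, 1, 200)
y_test = true_risk(x_test) + rng.normal(0, 0.05, 200)

# Fit models with more and more terms (polynomial degree stands in for the
# number of variables) and compare fit on the training data vs. held-out data.
for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train error {train_err:.4f}, holdout error {test_err:.4f}")

# Typically, training error keeps shrinking as terms are added while holdout
# error stops improving or grows: the overly complex model has memorized
# noise instead of the underlying risk relationship.
```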

The authors compared the risk calculations from 2013 to two new models, the first using the same modeling technique but newer data, the second using newer data and modeling. They found:

  • Updating the data alone still overestimated risk for low-risk adults and underestimated risk for high-risk adults (a calibration problem; see the sketch after this list).
  • Updating the data and the modeling methods improved on these numbers, especially lowering the overestimate for low-risk adults.
  • Updating the model alone did not improve performance.
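
In practice, “overestimated” and “underestimated” are statements about calibration: group people by predicted risk and compare the average prediction with the fraction who actually had an event. Here is a sketch of that check, again with simulated numbers rather than the study’s data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated cohort: each person's true 10-year risk, the model's prediction
# (deliberately biased upward here to mimic overestimation), and whether an
# event actually occurred. Purely illustrative numbers, not the study's data.
n = 50_000
true_risk = rng.beta(2, 18, n)              # mostly low-risk population
predicted = np.clip(true_risk * 1.4, 0, 1)  # model overestimates by ~40%
events = rng.random(n) < true_risk          # observed outcomes

# Calibration check: compare mean predicted risk with the observed event rate
# within groups (here, quintiles of predicted risk).
edges = np.quantile(predicted, [0, 0.2, 0.4, 0.6, 0.8, 1.0])
groups = np.digitize(predicted, edges[1:-1])
for g in range(5):
    mask = groups == g
    print(f"quintile {g + 1}: predicted {predicted[mask].mean():.1%}, "
          f"observed {events[mask].mean():.1%}")

# If predicted risk consistently exceeds the observed event rate, the
# calculator overestimates, and some low-risk people get labeled as
# treatment candidates.
```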

While this may seem a bit boring, like bridge maintenance, it has real-world implications, like a rusted-out bridge support. Merely updating the risk calculator moves 11.8 million patients from high-risk to low-risk, from treatment to observation. That is a lot of reduction in medication, cost, and worry.

As with any model, there are limitations; the authors’ new data reflected lower-risk patients, in part because of concerns that the 2013 risk calculator overestimated risk. But the study demonstrates that we need processes to update our assessment of risk; we need infrastructure maintenance. Otherwise, we may well find that our healthcare system, which is increasingly evidence-based [1] and data-driven, looks more and more like our bridges.


[1] The dynamic nature of our beliefs is one reason I prefer the term evidence-informed over evidence-based – the evidence keeps changing.

Source: Clinical Implications of Revised Pooled Cohort Equations for Estimating Atherosclerotic Cardiovascular Disease Risk, Annals of Internal Medicine. DOI: 10.7326/M17-3011