What I'm Reading (Jan. 4)

Science speaks on behalf of the Radium Girls
Fun Facts 2023
Geometry as art
Data Leakage, Oh My!

“Once pretty, vivacious young women in their late teens and early twenties awaiting marriage and children, one by one, they sickened. On X-ray, their bones looked moth-eaten; their teeth fell out, leaving pockets of pus – every dental effort to treat them caused more tooth loss. Eventually, their jawbones broke or splintered in their mouths, or they suffered cancerous sarcomas of their limbs, requiring amputation. Their spines crumbled, their legs shortened, so they painfully limped. For years no one could determine what ailed them. They were the ‘Radium Girls.’”

This introduces Dr. Billauer’s two-part article on the Radium Girls and occupational safety law. It is a story that lingers in your thoughts. Last week, Jack Devanney of Gordian Knot News returned to that story to argue, based on the science, that the linear no-threshold (LNT) approach to radiation regulation is more than deeply flawed; it is wrong. Politics played as significant a role in the adoption of LNT as movie-goers saw it play in Oppenheimer. Devanney’s article, An ugly hypothesis shot down by a beautiful fact, provides the science behind those horrible Radium Girl deaths.


Nearly every website and media platform ended the old year with a “Best of” list. ACSH was not immune to this bit of “journalistic respite.” But the list of “fun facts” reported by The Atlantic was special.

“Mars has seasons, and in the winter, it snows.

Bats are arguably the healthiest mammals on Earth.

The genetic mutation behind “Asian glow” might help protect people against certain pathogens—including tuberculosis.

The overwhelming majority of sweaters available on the American mass market are made at least partly of plastic.

Mice and rats can’t vomit.”

From The Atlantic, 81 Things That Blew Our Minds in 2023


One of the paradoxical qualities of creativity is that, very frequently, constraints produce more creative work than “free rein.” There is something to be said for being placed in a box that you must think your way out of. Islamic tradition prohibits images of Allah, Muhammad, and all the major Judeo-Christian prophets, and the figurative depiction of living creatures is discouraged, as all of these may be considered forms of idol worship. As a result, much of Islamic art is built on geometric patterns. Istanbul’s unique geographic position has made it a cultural and religious melting pot for centuries. Here is a short video of some “images spanning Islamic, Ottoman, Greek, and Byzantine designs.”


From Aeon, by way of Vimeo, Takrar


Data leakage sounds like some vaguely worrisome bodily condition, but it actually refers to an error in machine learning: when the data used to train an AI model is not kept entirely separate from the data used to test it, the model, in effect, “knows” the answers before the test. This is just one of the issues arising as AI is applied to medicine and a host of other screening, administrative, and regulatory activities. The taxonomy quoted below identifies three failure modes; a short code sketch of the first follows the list.

“1. Lack of clean separation of training and test set: If the training dataset is not separated from the test dataset during all pre-processing, modeling and evaluation steps, the model has access to information in the test set before its performance is evaluated.

2. Model uses features which are not legitimate: The model has access to features that should not be legitimately available for use in the modeling exercise, for instance if they are a proxy for the outcome variable.

3. Test set is not drawn from the distribution of interest: The distribution of data on which the performance of an ML model is evaluated differs from the distribution of data about which the scientific claims are made.”
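To make the first failure mode concrete, here is a minimal sketch, assuming scikit-learn; the dataset, variable names, and scores are synthetic and purely illustrative. Fitting a preprocessing step such as a scaler on the full dataset before splitting lets statistics from the test rows leak into training; splitting first, and fitting the scaler on the training rows alone, keeps the evaluation honest.

```python
# Minimal sketch of leakage type 1: preprocessing before the train/test split.
# Assumes scikit-learn and NumPy; all data here is synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                        # synthetic features
y = (X[:, 0] + rng.normal(size=1000) > 0).astype(int)  # synthetic labels

# LEAKY: the scaler is fit on ALL rows, so test-set statistics
# influence the transformed training data before evaluation.
X_scaled = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X_scaled, y, random_state=0)
leaky = LogisticRegression().fit(X_tr, y_tr).score(X_te, y_te)

# CLEAN: split first, then fit the scaler on the training rows only
# and merely apply it to the held-out test rows.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)
model = LogisticRegression().fit(scaler.transform(X_tr), y_tr)
clean = model.score(scaler.transform(X_te), y_te)

print(f"leaky: {leaky:.3f}  clean: {clean:.3f}")
```

With a simple scaler on well-behaved synthetic data the two scores barely differ, but the same pattern applied to feature selection, imputation, or target encoding fit on the full dataset can inflate reported performance dramatically, which is the reproducibility problem the Nature piece describes.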

The new tool in town comes without an instruction manual, and to understand what AI is “telling” us, we need to educate ourselves on some of its flaws. From Nature, Is AI leading to a reproducibility crisis in science?

And a bit of worrisome bonus reading material.

“While generative AI has the potential to help patients, and save money on administrative tasks, it also poses significant dangers. It can perpetuate biases and skew decision-making with inaccurate, or hallucinated, outputs. Large technology businesses, including OpenAI and Microsoft, have not been transparent about the data fed into their models or the error rates of specific applications.

The creeping consolidation harkens to a similar process that unfolded in the business of selling electronic health records, where a couple of companies, Epic and Oracle, own most of the market and can set prices and the pace of innovation. In recent months, Microsoft has aligned itself with Epic to begin to embed generative AI tools and capabilities in health systems across the country.”

From Stat, How the shakeup at OpenAI underscores the need for AI standards in health care