9/11 Dust Report: CNN Should Dust Off Its Editorial Standards


Junk epidemiological studies are rather like pigeons in New York—an omnipresent nuisance that you learn to live with and ignore and hope they don't make too much of a mess. It's too bad that CNN didn't have the judgment to do this. Instead, the network took what is just about the worst study to ever fly the coop and not only ran with it but also sensationalized it by using both children's health and the September 11th terrorist attacks as manipulative hooks. Shameless. 

The study, which (somehow) managed to make its way into a journal (Environment International), is a textbook example of "epidemiology by darts": throw enough meaningless data at a giant target and you're bound to hit something. When you toss in selection bias, serious flaws in data gathering, lack of biological plausibility, and obviously meaningless results, the conclusion is the Quadruple Crown of junk science. A consummate mess.

This was CNN's seriously lame attempt to lend credence to a worthless study. It is tragic that these children lost parents on 9/11, but that does not mean that the dust in the air altered their lipid profiles.

In this quintessential example of junk epidemiology, no fewer than 13 authors from five different institutions managed to waste time, research funds, and paper by attempting to tie the presence of a group of ubiquitous chemicals, which were some of the thousands that were released into the air when the towers collapsed, to cholesterol levels in teens. They failed spectacularly, from a scientific standpoint. Here's why. 

The ludicrous premise of the study is that a common class of chemicals, called "perfluoroalkyl substances" (PFASs), somehow managed to affect the cholesterol levels in children who were exposed to them, which, as the authors state, "might be an early marker of atherosclerosis and cardiovascular diseases." It is difficult to even know where to start debunking this, so please forgive me if I just dive right in.

The authors measured the levels of 11 PFASs, which are found in a host of common products, including paper, cookware, carpeting, furniture, and clothing, in the blood of children who were close to the towers and compared them to children who were not. They attempted to establish a relationship between the concentration of the PFASs and markers of future heart disease: triglycerides, total cholesterol, LDL cholesterol, insulin resistance, and brachial artery distensibility (the elasticity of arteries). By any measure, this study is a magnificent abomination. It would take longer than the construction of the Freedom Tower to list all the flaws in the study, so here are some of the more grievous ones:

  • Thousands of children were exposed to varying amounts of a huge number of chemicals following the collapse of the towers. A total of 123 were evaluated with regard to exposure to 11 (out of thousands) of these chemicals. This flaw is an example of both selection bias and small sample size. The choice of which children to include or exclude could easily alter, nullify, or even reverse the results.
  • The protocol of sample collection was disclosed in an earlier study: "Samples were collected from February 20, 2014 to March 21, 2016, and study visits were scheduled to occur after at least six hours of fasting." Are they kidding??? One sample of blood was drawn 13-15 years after the towers collapsed, and this is supposed to have any meaning? No, it cannot. A single time-point measurement in any 15-year study is a joke. In this one, even more so. During those 15 years, these kids were exposed to these same chemicals every day of their lives. Trying to tease out whatever may or may not have happened more than a decade before is impossible.
  • The fact that the blood samples were collected after fasting borders on hilarious. Fifteen years after exposure the samples could have been collected after eating, bowling, a macrame lesson, or, (better still) not at all. It would have made no difference. This may be the single dumbest thing in the entire study. 

On top of that, the "evidence" which is given in the paper not only doesn't show any harmful effect, but it could be argued that it shows evidence of no effect whatsoever. 

"We observed a significant, positive association of perfluorooctanoic acid (PFOA) with triglycerides (beta coefficient = 0.14, 95% CI: 0.02, 0.27, 15.1% change), total cholesterol (beta coefficient = 0.09, 95% CI: 0.04, 0.14, 9.2% change), and LDL cholesterol (beta coefficient = 0.11, 95% CI: 0.03, 0.19, 11.5% change). Perfluorohexanesulfonic acid levels were associated with decreased insulin resistance (beta coefficient = − 0.09, 95% CI: − 0.18, − 0.003, − 8.6% change); PFOA and perfluorononanoic acid were associated with increased brachial artery distensibility."


  • They measured 11 chemicals. One of them (PFOA) resulted in a "modest" (9-15%) (1) effect on triglycerides, cholesterol, and LDL. The other 10 did not affect any of the three markers.
  • One chemical (perfluorohexanesulfonic acid) had an "even more modest" effect (8.6%) on insulin resistance, but not any of the other parameters. The other 10 chemicals did not affect insulin resistance.
  • Two chemicals (PFOA and perfluorononanoic acid) did something or other (read: nothing) to arterial elasticity; the other nine did not. 
  • These were the only statistically significant changes. They can be attributed entirely to chance.
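For readers wondering how a beta coefficient like 0.14 becomes the quoted "15.1% change": the outcome variables appear to have been log-transformed, in which case the percent change implied by a coefficient is (e^beta − 1) × 100. A quick sketch of that conversion (the log-transform reading is my assumption, not stated in the excerpt; small mismatches with the quoted figures presumably reflect rounding of the coefficients):

```python
import math

# Beta coefficients as quoted in the paper's results passage.
# Assumption: outcomes were log-transformed, so the implied
# percent change is (e^beta - 1) * 100.
betas = {
    "triglycerides (PFOA)": 0.14,
    "total cholesterol (PFOA)": 0.09,
    "LDL cholesterol (PFOA)": 0.11,
    "insulin resistance (PFHxS)": -0.09,
}

for outcome, beta in betas.items():
    pct = (math.exp(beta) - 1) * 100
    print(f"{outcome}: beta = {beta:+.2f} -> {pct:+.1f}% change")
```

The computed values (about +15.0%, +9.4%, +11.6%, and -8.6%) line up closely with the paper's reported percent changes, which is consistent with this reading.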

American Council on Science and Health Advisor and biostatistician Dr. Stan Young explains: "11 chemicals times 5 outcomes = 55 statistical comparisons. With that many tests, you expect a few nominally statistically significant results. Doing the math, 55 x .05 (p value) = 2.75 significant numbers. You would expect 3 or so statistically significant results by chance alone. That's what they got."
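Dr. Young's back-of-the-envelope arithmetic is easy to verify. Under the null hypothesis (no real effects), p-values are uniformly distributed, so each of the 55 tests has a 5% chance of crossing the 0.05 threshold by luck alone. A minimal simulation (the 55-test and 0.05 figures are from the article; the simulation itself is my illustration):

```python
import random

random.seed(0)

N_TESTS = 55   # 11 chemicals x 5 outcomes
ALPHA = 0.05   # significance threshold
N_SIMS = 10_000

# Under the null, each test's p-value is uniform on [0, 1],
# so a "significant" result occurs with probability ALPHA.
false_positives = [
    sum(random.random() < ALPHA for _ in range(N_TESTS))
    for _ in range(N_SIMS)
]

expected = N_TESTS * ALPHA
simulated = sum(false_positives) / N_SIMS
print(f"Expected false positives per study: {expected:.2f}")
print(f"Simulated average:                  {simulated:.2f}")
```

Both the arithmetic and the simulation land at roughly 2.75 "significant" results per 55 tests — about the three that the study actually reported.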

And even if the "significant" numbers were real, the study is still a mess because of the lack of biological plausibility. Within a class of 11 similar chemicals, a few did something. A few did something else. Most did nothing. No trend of any kind, other than randomness, is apparent. It is biologically implausible to have such a haphazard response to a series of chemically related compounds.

In the end, after six pages of irrelevant data, useless statistics, and pure speculation, there is nothing there. Zero. Ten percent changes in a few surrogate measures of heart disease 13-15 years after an event? How this made it past a reviewer is beyond comprehension.

Let's finish with the authors' conclusion:

"This research adds to our knowledge of the physical health impacts in a large group of children who were exposed to the WTC disaster, and pinpoints the potential high risk of atherosclerosis and cardiovascular diseases in these children as a result of PFASs exposure."

And mine:

"These guys should be locked out of their labs. Except they don't have labs, they just do bad statistics and they could do that on the subway."

This is as bad a study as I've ever seen. Had anyone in the approval chain at CNN done an hour's worth of critical thinking, there would have been a different headline, one that would have been more accurate:

"9/11 dust NOT tied to heart risk in children, study says."

Though that wouldn't have gotten them a whole lot of readers, or a full-page ad from an environmental lawyer.


(1) By "modest" I mean zero. Minuscule changes like these are not real.