March Madness, Medical Edition: Public Health’s Sweet Sixteen

By Chuck Dinerstein, MD, MBA — Mar 13, 2026
Public health has its own bracket of champions: breakthroughs that eliminated deadly diseases, revolutionized surgery, and opened entirely new doors in medicine. From vaccines to mRNA technology, these sixteen advances didn’t just win scientific matchups—they helped humanity cut down the nets against some of its toughest biological opponents.
Image: ACSH

Across these public health “sweet sixteen,” America’s influence often came less from the initial discovery than from turning promising ideas into functioning systems of medicine and public health. The United States repeatedly scaled fragile laboratory discoveries into mass-produced therapies, built durable scientific institutions and research-focused medical schools, and used its industrial capacity to manufacture drugs and medical technologies on a global scale. Philanthropic models, from the March of Dimes to modern biomedical foundations, mobilized public support and funding for research, while regulatory reforms, anchored by the randomized controlled trial, established strict standards for testing and approving new treatments. Here are sixteen breakthroughs that transformed medicine, saved millions of lives, and reshaped how societies prevent and treat disease.

1. Vaccination

Smallpox was a major threat to the Continental Army, with some studies suggesting the disease killed ten soldiers for every one lost in battle. In early 1777, to protect his forces, George Washington ordered the inoculation of Continental Army recruits, making it one of the first large-scale public health efforts in the United States. In 1800, Benjamin Waterhouse, a cofounder of Harvard Medical School, introduced the new smallpox vaccine to America after testing it on his own family. The federal government soon got involved as well: the Vaccine Act of 1813 appointed Dr. James Smith to distribute the vaccine free of charge, marking the first nationwide vaccination campaign. More than a century later, the Centers for Disease Control and Prevention (CDC) helped lead the global effort that eventually eradicated smallpox worldwide.

2. Anesthesia 

The first public surgical demonstration of ether anesthesia took place in 1846 at Boston’s Massachusetts General Hospital, conducted by William T.G. Morton, a dentist, and John Collins Warren, a surgeon and the first dean of Harvard Medical School, in what is now known as the “Ether Dome.” The successful demonstration proved that surgery could be done without unbearable pain, leading to longer, more complex operations and transforming modern medicine.

3. Germ Theory of Disease

While European scientists like Louis Pasteur and Robert Koch made groundbreaking advances in germ theory, American medicine quickly established institutions to apply these ideas in practice. The Johns Hopkins School of Medicine, founded by pathologist William Henry Welch and influenced by leaders such as physician William Osler, surgeon William Stewart Halsted, and gynecologist Howard Atwood Kelly, became a model for research-focused medical education in the United States. Their approach combined laboratory science, clinical training, and specialized fields. Soon after, Major Walter Reed’s 1900 discovery that mosquitoes transmit yellow fever provided strong evidence for germ-based explanations of disease and changed the understanding of tropical infections.

4. Antiseptic & Aseptic Surgery

American surgeon William Halsted of Johns Hopkins established sterile techniques, surgical gloves, and modern residency training. Boston surgeon Ernest Amory Codman later advanced surgical quality through his “End Result Idea,” arguing that hospitals should track every patient outcome in order to identify errors and improve care. Together, these innovations helped transform surgery from a high-risk procedure into a safer and more systematic practice.

5. Public Health & Sanitation

Public outrage has frequently driven public health reform. In 1906, Upton Sinclair’s novel The Jungle revealed brutal working conditions in Chicago’s meatpacking industry, but readers were most disturbed by its descriptions of contaminated meat and unsanitary practices. This backlash prompted President Theodore Roosevelt to investigate, leading Congress to pass the Pure Food and Drug Act and the Federal Meat Inspection Act, establishing federal oversight of food safety.

Earlier epidemics had already begun shaping the development of national public health systems. After cholera outbreaks in the 1870s, the National Quarantine Act expanded the authority of the Marine Hospital Service—originally created under President John Adams—to inspect ships and immigrants for infectious diseases. The agency later evolved into the U.S. Public Health Service, and its small Hygienic Laboratory eventually became the National Institutes of Health.

6. X-Rays & Medical Imaging

Within a year of Wilhelm Röntgen’s discovery of X-rays in 1895, physicist Edwin Frost at Dartmouth used the technology to image a patient’s fractured wrist, demonstrating its immediate medical value. The new imaging method spread rapidly across American hospitals. Over the following decades, collaborations between academic researchers and industry helped develop increasingly sophisticated imaging tools, including CT and MRI scanning, allowing physicians to see inside the body without surgery.

7. Insulin 

Insulin was first isolated in 1921 by Canadian researchers Frederick Banting and Charles Best, providing a lifesaving treatment for diabetes. American pharmaceutical company Eli Lilly quickly addressed the challenge of large-scale production by developing a purification process called isoelectric precipitation, which enabled reliable insulin manufacture from animal pancreases. Decades later, Lilly and Genentech helped pioneer the production of genetically engineered insulin made by microbes. Approved by the FDA in 1982, Humulin® became the first recombinant DNA drug used in humans. 

8. Antibiotics

Penicillin was first discovered in 1928 by Alexander Fleming in London, but he could neither stabilize the drug nor produce it in useful quantities. A decade later, Oxford scientists Howard Florey, Ernst Chain, and Norman Heatley demonstrated its medical potential, though their makeshift production methods—using bedpans and milk churns to grow mold—were far too inefficient to meet wartime needs. The breakthrough came when the team brought their work to the United States in 1941, partnering with researchers at the USDA laboratory in Peoria, Illinois. American scientists and pharmaceutical companies turned penicillin into a mass-produced medicine by developing deep-tank fermentation, using corn steep liquor to boost yields, and discovering a high-producing mold strain from a Peoria cantaloupe. With companies like Pfizer opening large production plants in New York, penicillin became widely available to Allied troops by 1943, marking one of the most significant American industrial achievements of World War II.

9. Blood Typing & Safe Transfusion

In the late 1930s, Dr. Charles R. Drew developed innovative blood preservation techniques while studying at Columbia University, playing a crucial role in establishing one of the first modern blood banks. His work on separating and storing plasma, a more practical alternative to whole blood for storage and transport, saved many soldiers' lives through successful overseas shipments. Despite urgent blood shortages, the US military mandated that the Red Cross exclude Black donors and later segregate their blood. Drew, a Black man, strongly opposed this discrimination. In a 1944 letter, he called the policy “a bad mistake,” stating that “no official department of the Federal Government should willfully humiliate its citizens… There is no scientific basis for the order…”

10. Randomized Controlled Trials 

Randomized controlled trials (RCTs) are regarded as the gold standard in medical research because they evaluate treatments in ways that reduce bias. The first modern RCT was carried out in 1948 by the British Medical Research Council to assess streptomycin for tuberculosis. In the United States, similar studies were quickly adopted by the National Institutes of Health, and subsequent drug-regulation reforms—especially after the thalidomide tragedy—made controlled trials a legal requirement for approving new medicines.

11. Polio Vaccine

In 1938, President Franklin D. Roosevelt, who was paralyzed by polio, launched the March of Dimes, mobilizing millions of small donations from Americans to fund research against the disease. The organization supported scientists like Jonas Salk, who developed the first successful inactivated polio vaccine, tested in 1954 in a large trial involving over a million American schoolchildren, the “Polio Pioneers.” After the vaccine was declared safe and effective in 1955, nationwide immunization campaigns sharply reduced polio cases. Albert Sabin later developed an oral, live-attenuated vaccine, tested in large trials in the Soviet Union; easier to administer and capable of conferring community-wide protection, it became widely used in the 1960s.

12. Organ Transplantation

The first successful transplants were skin grafts in 1869. The first cadaveric skin graft was performed during World War II on an airman with a 70% burn. Joseph Murray, a physician on that team, went on to perform the first successful organ transplant in 1954, transplanting a kidney between identical twin brothers. Dr. Thomas Starzl performed the first human liver transplant in 1963, while physicians at Stanford advanced heart transplantation techniques. The National Organ Transplant Act of 1984 established a national system to recover and allocate organs fairly based on medical criteria. The United Network for Organ Sharing (UNOS), an independent nonprofit, received the federal contract to manage the Organ Procurement and Transplantation Network (OPTN) and maintain the national transplant registry.

13. The Birth Control Pill

The birth control pill was developed in the United States during the 1950s through a partnership among activists, scientists, and philanthropists. Margaret Sanger, a well-known advocate for birth control, along with philanthropist Katharine McCormick, funded research by biologist Gregory Pincus and physician John Rock, who created the first oral contraceptive using synthetic hormones to prevent ovulation. After successful clinical trials in Puerto Rico, the FDA approved Enovid in 1960, making it the first oral contraceptive available in the United States. The pill quickly became popular, playing a significant role in increasing reproductive freedom, shaping social change, and transforming women’s health in the years that followed.

14. Cardiovascular Advances

In the mid-20th century, American physicians made important advances in treating coronary artery disease and atherosclerosis through surgery and medication. At the University of Minnesota, C. Walton Lillehei helped pioneer open-heart surgery, while in Texas, Michael DeBakey and Denton Cooley developed innovative heart surgery techniques and devices that expanded options in cardiovascular care. In 1977, Swiss cardiologist Andreas Grüntzig introduced balloon angioplasty, which helped establish the field of interventional cardiology by allowing doctors to open blocked arteries without major surgery; he later brought the technique to Emory University. Meanwhile, American pharmaceutical companies, building on Japanese biochemist Akira Endo’s discovery of the first statin, developed the cholesterol-lowering statins that have become some of the most widely used drugs worldwide.

15. HIV/AIDS Therapy

In 1981, the Centers for Disease Control and Prevention (CDC) first identified unusual clusters of infections among young men, marking the initial recognition of AIDS in the United States. Over the next decade, scientists and doctors, largely supported by NIH research funding, worked to understand the virus and develop treatments. By 1996, breakthroughs in combination antiretroviral therapy transformed HIV from a rapidly fatal disease into a manageable chronic condition. US biotech and pharmaceutical companies, including Gilead Sciences, played a key role in developing these lifesaving medicines. Building on these advances, President George W. Bush launched the President’s Emergency Plan for AIDS Relief (PEPFAR) in 2003, a historic global initiative that expanded access to HIV treatment and prevention for millions of people worldwide.

16. Genomics & mRNA Technology

The Human Genome Project began in 1990 as a large international effort led in part by the U.S. National Institutes of Health and Department of Energy. Its goal was to map the entire human genome, approximately three billion DNA base pairs, creating a reference blueprint for medical research. The project boosted the growth of the biotech industry, with companies like Celera Genomics, founded by J. Craig Venter to accelerate private genome sequencing, and 23andMe, which developed technologies and services based on genome data. A working draft was announced in 2000, and the project was declared complete in 2003, laying the foundation for modern genomics and personalized medicine.

Around the same time, researchers were investigating whether messenger RNA (mRNA) could be used as medicine by instructing cells to produce specific proteins. Early experiments showed promise but faced significant challenges because mRNA was unstable and caused strong immune reactions. Breakthrough work by scientists including Katalin Karikó and Drew Weissman demonstrated that modified nucleosides could stabilize mRNA and reduce inflammation. These discoveries ultimately enabled the rapid development of mRNA vaccines and helped launch biotechnology companies such as Moderna.

Do you feel a different breakthrough should have made the bracket? Post your thoughts in the comments.

Chuck Dinerstein, MD, MBA

Director of Medicine

Dr. Charles Dinerstein, M.D., MBA, FACS is Director of Medicine at the American Council on Science and Health. He has over 25 years of experience as a vascular surgeon.