What the Safety Net Can Catch
The previous posts found that the foundational evidence for aluminium adjuvant safety rests on four rabbits and eighty-four adults. But thin pre-market evidence is not unique to aluminium. For most vaccine components, the standard response is the same: post-market surveillance would have caught a problem by now. Billions of doses. Decades of monitoring.
That response assumes the monitoring systems would produce a signal if one existed. This post examines whether they could.
The Systems
Vaccine safety monitoring takes two forms. Passive systems collect reports that clinicians or the public choose to submit. Active systems search medical databases for diagnoses that appear more frequently after vaccination than expected.
| System | Country | Type | Coverage | How it works |
|---|---|---|---|---|
| VAERS | US | Passive | National | Anyone can file a report |
| Yellow Card | UK | Passive | National | Healthcare workers and public submit reports |
| EudraVigilance | EU/EEA | Passive | 450M+ | National agencies forward reports |
| VSD | US | Active | ~12M (~3% of US pop.) | Links vaccination records to electronic health records |
| PRISM/BEST | US | Active | 100M+ | Insurance claims and EHR data |
Passive systems depend on someone recognising that an adverse event might be connected to a recent vaccination, then choosing to report it. Active systems scan for statistical signals automatically, typically within predefined risk windows. Some approaches, like the VSD's TreeScan, broaden the outcome space but still operate within defined follow-up windows. Both types monitor whole vaccine products.[1]
The Capture Rates
The commonly cited figure for passive surveillance underreporting is "fewer than 1% of adverse events are reported." This number comes from the introduction of a 2010 AHRQ report, where it was extrapolated from 1980s studies of drug adverse reactions, not vaccines.[2] It was never measured for vaccines. The AHRQ report set out to measure it by building automated VAERS reporting from electronic health records at Harvard Pilgrim Health Care. The project was terminated before validation. The authors wrote that "the necessary CDC contacts were no longer available and the CDC consultants responsible for receiving data were no longer responsive." No subsequent study has completed a direct measurement of overall VAERS reporting completeness.
What we actually know about VAERS capture rates comes from three studies that compared reports against independent reference datasets:
| Event | Capture rate | Source |
|---|---|---|
| Rash after MMR | <1% | Rosenthal & Chen 1995 [3] |
| GBS after seasonal flu | 12% | Miller et al. 2020 [4] |
| Anaphylaxis after seasonal flu | 13% | Miller et al. 2020 |
| Anaphylaxis after MMR | 25% | Miller et al. 2020 |
| Intussusception after RotaShield | 47% | Verstraeten et al. 2001 [5] |
| GBS after HPV | 64% | Miller et al. 2020 |
| VAPP after OPV | 68% | Rosenthal & Chen 1995 |
| Anaphylaxis after H1N1 pandemic flu | 76% | Miller et al. 2020 |
The pattern: the more dramatic and media-visible the event, the higher the capture rate. The same adverse event in different contexts makes this explicit. Anaphylaxis after routine seasonal flu vaccination: 13% captured. Anaphylaxis after 2009 H1N1 pandemic flu: 76%.[4]
The UK's MHRA estimates that "10% of serious reactions and between 2 and 4% of non-serious reactions" are reported via Yellow Card. When asked for the basis of that estimate under Freedom of Information, the MHRA responded: "it is not possible for the estimated rate of under-reporting to be calculated."[6]
A serious adverse event is a regulatory category: hospitalisation, disability, or threat to life.[7] A condition that develops gradually, never triggering a discrete hospitalisation event, never crosses this threshold.
The Easy Catches
Every successful post-market safety detection shares the same profile:[8]
| Case | Risk rate | Time to signal | Condition |
|---|---|---|---|
| RotaShield / intussusception | ~1:10,000 | 9 months | Acute bowel obstruction in infants |
| Pandemrix / narcolepsy | ~1:18,000 | 10 months | Sudden-onset sleep disorder in children |
| 1976 Swine Flu / GBS | ~1:100,000 | 2 months | Ascending paralysis |
| OPV / VAPP | ~1:2.7M | Known from start | Paralysis |
Four features in common:
- Distinctive: clinically unusual conditions (intussusception, narcolepsy, ascending paralysis)
- Acute: onset within days to weeks of vaccination
- Temporally clustered: a clinician connects the event to a recent vaccine
- Dramatic: hospitalisation, paralysis, or death
RotaShield was caught because 15 intussusception cases reported to VAERS within nine months triggered an investigation. The CDC deployed 40 epidemiologists across 19 states.[9] Pandemrix was caught because neurologists in Finland and Sweden independently noticed an unusual spike in childhood narcolepsy and connected it to a new vaccine administered months earlier. The signal was geographically concentrated in those two countries; trials conducted elsewhere would not have found it.[10]
None of these signals was detected in pre-licensure trials. RotaShield trials enrolled approximately 15,000 infants; detecting a 1:10,000 event requires approximately 60,000.[9] Detecting Pandemrix's 1:18,000 risk in children would need over 55,000 subjects per arm. Every historical case that led to withdrawal or restriction was caught by post-market surveillance, not by the trials that approved the product.
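The arithmetic behind those trial sizes is worth making concrete. A minimal sketch in Python (the ~60,000 figure for 80% power depends on further assumptions about background rates; this only shows expected case counts and the chance a trial sees no cases at all):

```python
def expected_cases(n, risk):
    """Expected number of adverse events among n trial subjects."""
    return n * risk

def prob_zero_cases(n, risk):
    """Probability a trial of n subjects observes no events at all."""
    return (1 - risk) ** n

# RotaShield-style numbers: intussusception risk ~1:10,000.
risk = 1 / 10_000
for n in (15_000, 60_000):
    print(f"n={n}: expect {expected_cases(n, risk):.1f} cases, "
          f"P(zero cases)={prob_zero_cases(n, risk):.2f}")
```

With 15,000 infants the trial expects about 1.5 cases and has roughly a one-in-five chance of seeing none at all; one or two cases are also hard to distinguish from background incidence, which is why formal power calculations land near 60,000.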
These are the system's successes. They are genuine. They also describe the boundaries of what the system can see.
What Would Be Invisible
Now consider a signal with the opposite profile. Not acute, but gradual. Not distinctive, but common-background. Not temporally clustered, but distributed across months or years. A small increase in asthma, eczema, food allergy, or autoimmune disease, spread across millions of children.
No clinician connects a diagnosis of asthma at age 3 to a vaccination at 8 weeks. No parent files a VAERS report for eczema that appeared gradually over months. No Yellow Card is submitted for a food allergy that emerged at weaning. The passive systems never see it, because no one thinks to report it.
The active systems have their own limits. The VSD's Rapid Cycle Analysis uses risk windows of days to weeks.[11] The self-controlled case series (SCCS), the dominant analytical design in active surveillance, compares a person's risk during a defined window after vaccination to their risk during a control period. It was designed for acute, transient outcomes with a clear temporal onset.[12] A child who develops asthma gradually between ages 1 and 4 has no 42-day post-vaccination window to fall into.
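The window problem can be shown with a toy version of the SCCS comparison (illustrative numbers, not real data): the relative incidence is just the event rate inside the risk window divided by the rate outside it.

```python
def relative_incidence(events_risk, days_risk, events_control, days_control):
    """Crude SCCS-style relative incidence: event rate in the
    post-vaccination risk window over the rate in the control period."""
    return (events_risk / days_risk) / (events_control / days_control)

# An acute outcome clustering in a 42-day risk window stands out:
acute = relative_incidence(events_risk=6, days_risk=42,
                           events_control=10, days_control=323)

# A gradual-onset condition spreads its diagnosis dates evenly across
# the year, so the window rate matches background and the signal vanishes:
gradual = relative_incidence(events_risk=1, days_risk=42,
                             events_control=8, days_control=323)

print(f"acute RI = {acute:.1f}, gradual RI = {gradual:.1f}")
```

The acute pattern yields a relative incidence well above 1; the gradual pattern yields roughly 1 regardless of whether a true effect exists, because the method can only see clustering in time.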
Then there is the structural gap. Adverse event reports name the product, and any product's ingredients are public. But no surveillance system routinely analyses by shared ingredient.1 Signals are flagged for Infanrix Hexa or Bexsero, not for "aluminium hydroxide across all products." Even if a system did group by ingredient, the schedule would limit what it could find: nearly every child on the routine schedule receives aluminium at every primary visit. There is no aluminium-free control group within the vaccinated population.
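Nothing about the data makes ingredient-level grouping technically hard; it simply is not done. A sketch of what such an analysis would look like, using hypothetical report counts and deliberately simplified ingredient labels ("ProductX" is an invented aluminium-free comparator):

```python
from collections import Counter

# Hypothetical signal counts per product (illustrative, not real data).
reports_per_product = {
    "Infanrix Hexa": 120,
    "Bexsero": 80,
    "ProductX": 45,   # invented aluminium-free comparator
}

# Simplified adjuvant labels; real products list many more components.
adjuvant_of = {
    "Infanrix Hexa": "aluminium hydroxide/phosphate",
    "Bexsero": "aluminium hydroxide",
    "ProductX": "none",
}

# Re-key product-level counts by shared ingredient.
by_ingredient = Counter()
for product, n in reports_per_product.items():
    by_ingredient[adjuvant_of[product]] += n

print(dict(by_ingredient))
```

Even with such a grouping in place, interpreting the aggregate would still require an aluminium-free comparison group, which the schedule does not provide.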
The One Time Someone Asked
In 2023, Daley et al. used the VSD to estimate cumulative aluminium exposure from vaccines in 326,991 children and tested for associations with health outcomes. They found a positive association with persistent asthma: an adjusted hazard ratio of 1.19 per 1 mg increase in aluminium exposure.[13]
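A hazard ratio reported "per 1 mg" compounds multiplicatively under the model's log-linear dose-response assumption, so the ratio implied across a wider exposure gap can be computed directly. A sketch under that assumption (the 3 mg difference is an illustrative figure, not a value from the study):

```python
def implied_hr(delta_mg, hr_per_mg=1.19):
    """Hazard ratio implied across a delta_mg exposure difference,
    assuming the log-linear dose-response of the reported model."""
    return hr_per_mg ** delta_mg

# Illustrative: comparing children 3 mg apart in cumulative exposure.
print(f"{implied_hr(3):.2f}")
```

This is purely an extrapolation of the reported coefficient, not an additional finding of the study.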
This is the only published study to use an active surveillance system to examine aluminium exposure as a specific variable. A subsequent Danish study of 1.2 million children found no such association.[14] Both studies, their methods, and what they can tell us are examined in a future post.
The Blind Spots
These are structural features of the surveillance system, not limitations specific to aluminium. Any chronic, diffuse, or delayed signal from any vaccine component would face the same blind spots. "Billions of doses and no one noticed" is only as strong as the systems doing the noticing.
If surveillance cannot close the question, does the silence prompt new research, or does it become the answer?
If you spot an error in my reasoning, data, or sources, tell me. I'll correct it publicly.
1. The VSD (Vaccine Safety Datalink) covers approximately 12 million people, including roughly 2.1 million children, representing about 3% of the US population. PRISM/BEST is the FDA's Sentinel system, covering over 100 million lives through insurance claims and electronic health records. Neither system codes for vaccine ingredients; both monitor outcomes by vaccine product. The VSD also uses TreeScan, a hypothesis-free data mining tool that scans all ICD-10 diagnosis codes for unexpected clustering after vaccination. This is not limited to predefined outcomes, but still operates within defined post-vaccination time windows. For system descriptions: CDC, "Vaccine Safety Datalink (VSD)," accessed 2026-03-20; FDA, "BEST Initiative," accessed 2026-03-20.
2. Lazarus R, Klompas M. "Electronic Support for Public Health: Vaccine Adverse Event Reporting System (ESP:VAERS)." AHRQ, 2010. Grant ID: R18 HS 017045. The "fewer than 1%" figure appears in the introduction, extrapolated from Scott HD et al. 1987 and Rogers AS et al. 1988, both studies of physician reporting of drug (not vaccine) adverse reactions. The ESP:VAERS project aimed to automate VAERS reporting from electronic health records but reported that CDC cooperation ended before the study could validate its findings.
3. Rosenthal S, Chen R. "The reporting sensitivities of two passive surveillance systems for vaccine adverse events." Am J Public Health 1995;85(12):1706-1709. PubMed 7503351.
4. Miller ER, Moro PL, Cano M, Lewis P, Bryant-Genevier M, Shimabukuro TT. "Post-licensure safety surveillance of vaccines: completeness of the Vaccine Adverse Event Reporting System." Vaccine 2020;38(47):7458-7463. PubMed 33039207. Capture rates were estimated by comparing VAERS reports against confirmed cases identified through active surveillance or medical records review. The sixfold difference in anaphylaxis capture between seasonal flu (13%) and H1N1 pandemic flu (76%) occurred during the same reporting period, suggesting that heightened public and clinical awareness during the pandemic drove the difference.
5. Verstraeten T, Baughman AL, Cadwell BL, Zanardi L, Haber P, Chen RT. "Enhancing the sensitivity of the Vaccine Adverse Event Reporting System (VAERS)." Am J Epidemiol 2001;154(11):1006-1012. PubMed 11724716.
6. MHRA Drug Safety Update, May 2019: "It is estimated that only 10% of serious reactions and between 2 and 4% of non-serious reactions are reported." The estimate is unsourced in the publication. MHRA Freedom of Information response 22/032 stated: "it is not possible for the estimated rate of under-reporting to be calculated."
7. The ICH-GCP (International Council for Harmonisation, Good Clinical Practice) definition of a Serious Adverse Event includes: death, life-threatening event, inpatient hospitalisation or prolongation of existing hospitalisation (the 24-hour threshold is commonly applied), persistent or significant disability/incapacity, or congenital anomaly. ICH E2A, "Clinical Safety Data Management: Definitions and Standards for Expedited Reporting," 1994.
8. Signal detection timelines compiled from CDC archival records, EMA regulatory actions, and published case analyses. For RotaShield: CDC MMWR 1999;48(27):577-581. For Pandemrix: Nohynek H et al. PLoS ONE 2012;7(3):e33536. For 1976 Swine Flu: Schonberger LB et al. Am J Epidemiol 1979;110(2):105-123. For OPV/VAPP: CDC, "Vaccine-Associated Paralytic Polio," cdc.gov.
9. CDC. "Intussusception among recipients of rotavirus vaccine: United States, 1998-1999." MMWR 1999;48(27):577-581. RotaShield was licensed August 1998 and withdrawn October 1999. Pre-licensure trials enrolled approximately 15,000 infants; detecting a 1:10,000 event with 80% power requires approximately 60,000 subjects. Later rotavirus vaccines (RotaTeq, Rotarix) expanded trial sizes to 60,000+ specifically because of the RotaShield experience.
10. Nohynek H, Jokinen J, Partinen M, et al. "AS03 adjuvanted AH1N1 vaccine associated with an abrupt increase in the incidence of childhood narcolepsy in Finland." PLoS ONE 2012;7(3):e33536. Risk approximately 1:18,400 in children and adolescents; 5- to 14-fold increase. First suspected by Finnish neurologist Dr. Markku Partinen in December 2009, approximately two months after mass vaccination began. The signal was geographically concentrated in Finland and Sweden; trials conducted elsewhere would not have detected it.
11. The VSD's Rapid Cycle Analysis (RCA) performs weekly sequential hypothesis testing, comparing observed-to-expected rates of predefined outcomes in risk windows typically spanning 1 to 42 days after vaccination. Outcomes and risk windows are specified in advance; conditions with gradual or delayed onset fall outside these windows.
12. The self-controlled case series design, introduced by Farrington (1995), uses each case as their own control, comparing event rates during defined post-vaccination risk windows to rates during control periods. It is optimised for acute, transient exposures and temporally defined outcomes. For conditions where onset is gradual or where the exposure is chronic (such as cumulative aluminium from multiple vaccine doses), the design has no natural risk window to test.
13. Daley MF, Reifler LM, Glanz JM, et al. "Association Between Aluminum Exposure From Vaccines Before Age 24 Months and Persistent Asthma at Age 24 to 59 Months." Academic Pediatrics 2023;23(1):37-46. PubMed 36180331. Study population: 326,991 children born 2008-2014 in the VSD. Adjusted hazard ratio for persistent asthma: 1.19 (95% CI 1.14-1.25) per 1 mg increase in cumulative aluminium exposure. The authors noted that the finding "should be interpreted with caution" and called for replication.
14. The largest study often cited here is Andersson et al. 2025, a Danish register-based cohort of 1.2 million children. The main analysis measures dose-response: hazard ratios per 1-mg increase in cumulative aluminium. This can detect whether more aluminium is worse than less. It cannot detect effects shared by all doses, for the same reason that comparing 20 cigarettes a day to 10 cannot detect harms common to all smokers. The cohort does include 15,237 children (1.2%) with no aluminium-adsorbed vaccines, but they anchor the low end of the continuous dose-response model rather than serving as a comparison group, and 82% had the lowest rate of GP visits, making direct comparison unreliable. Andersson NW, Bech Svalgaard I, Hoffmann SS, Hviid A. "Aluminum-Adsorbed Vaccines and Chronic Diseases in Childhood: A Nationwide Cohort Study." Ann Intern Med 2025;178(10):1369-1377. doi:10.7326/ANNALS-25-00997. Design and limitations examined in a future post.