Quantitative Skepticism

Extraordinary claims deserve rigorous analysis. This section contains projects where I apply statistical methods, data collection, and critical thinking to evaluate popular beliefs, viral claims, and persistent myths.

The goal is not cynicism, but clarity. Sometimes the data supports the claim. Often it does not. Either way, the methodology matters more than the conclusion.


Projects

Missing 411: A Bayesian Analysis

Do “mysterious” wilderness disappearances defy statistical explanation?

Proponents of Missing 411 claim that thousands of people have vanished under unexplained circumstances in US National Parks. I gathered official NPS visitation data, compiled Search and Rescue statistics from peer-reviewed sources, and applied Bayesian inference to test whether these disappearances are statistically anomalous.

Spoiler: When adjusted for the 300+ million annual park visits, the numbers are exactly what base rates predict.

Tools: Python, SciPy, NPS IRMA API, Bayesian Poisson-Gamma models
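
Here is a minimal sketch of the conjugate Poisson-Gamma update at the heart of the analysis. Every figure in it is an illustrative placeholder, not the actual NPS visitation or SAR data:

    # Sketch of a Bayesian Poisson-Gamma model for disappearance counts.
    # All numbers below are hypothetical placeholders, not real data.
    import numpy as np
    from scipy import stats

    counts = np.array([4, 7, 5, 6, 3, 8, 5])                # cases per year (hypothetical)
    visits = np.array([290, 305, 298, 310, 250, 280, 300])  # annual visits, millions (hypothetical)

    # Gamma(alpha, beta) prior on the rate per million visits; with a
    # Poisson likelihood and exposure, the posterior stays Gamma:
    #   alpha_n = alpha + sum(counts),  beta_n = beta + sum(visits)
    alpha0, beta0 = 2.0, 100.0
    post = stats.gamma(a=alpha0 + counts.sum(), scale=1.0 / (beta0 + visits.sum()))
    print(f"posterior mean rate: {post.mean():.4f} per million visits")

    # Posterior predictive check: how surprising would 9 cases be in a
    # 300-million-visit year under the fitted model?
    rng = np.random.default_rng(0)
    lam = post.rvs(size=100_000, random_state=rng)
    pred = rng.poisson(lam * 300)
    print(f"P(count >= 9) ~= {(pred >= 9).mean():.3f}")

The punchline is the exposure adjustment: a count that sounds alarming in isolation becomes unremarkable once it is expressed as a rate per million visits.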


More Coming Soon

Future analyses may include:

  • Shark attack risk perception vs. reality
  • “Cluster” disappearances and spatial statistics
  • Other viral claims that warrant quantitative scrutiny

Why This Matters

This type of analysis demonstrates:

  • Data collection skills: API integration, FOIA data, academic sources
  • Statistical rigor: Bayesian inference, base rate analysis, Monte Carlo methods (see the sketch after this list)
  • Critical thinking: Identifying methodological flaws, selection bias, scope creep
  • Science communication: Making technical results accessible
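
As a concrete illustration of the Monte Carlo and base-rate points, here is a hedged sketch (all parameters are hypothetical, not drawn from any real dataset) of why a striking “cluster” in one park can be pure chance once you account for how many parks there are to find a cluster in:

    # Monte Carlo sketch of the look-elsewhere effect behind "cluster" claims.
    # Every number here is a made-up placeholder, not real park data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n_parks = 63        # about the number of US national parks
    mean_cases = 2.0    # assumed average unresolved cases per park per decade
    n_sims = 100_000

    # Naive question: how unlikely is 6+ cases in one specific park?
    p_single = 1 - stats.poisson.cdf(5, mean_cases)
    print(f"P(a specific park shows 6+ cases) ~= {p_single:.3f}")

    # Right question: how often does *some* park show 6+ cases by chance?
    sims = rng.poisson(mean_cases, size=(n_sims, n_parks))
    p_any = (sims.max(axis=1) >= 6).mean()
    print(f"P(at least one park shows a 6+ case 'cluster') ~= {p_any:.2f}")

Under these assumed numbers, the per-park probability is under 2%, but the chance that some park somewhere shows a “cluster” is roughly 65%: selecting the surprising case after the fact manufactures apparent anomalies.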

These are the same skills required to evaluate AI model outputs, identify hallucinations, and distinguish genuine patterns from statistical noise.


“The first principle is that you must not fool yourself, and you are the easiest person to fool.” — Richard Feynman