RFK Jr. and Probability
When humans are this bad at estimating risk, what does that mean for vaccine hesitancy?
The new Secretary of Health and Human Services, Robert F. Kennedy, Jr., has made it his mission to dismantle the public health system in the United States.
This week, RFK Jr. came out urging new research into treatments for measles infections. On its face, this sounds like it could be good: measles patients are generally offered supportive therapy rather than a ‘miracle drug’ that cures the infection itself. But the solutions he’s proposing (cod liver oil, vitamins, existing drugs) have little to no chance of working, and they draw attention away from the obvious public health response:
The measles vaccine (given as part of the MMR trio) is up to 97% effective at preventing infection.
Today’s post is not going to be an exhaustive refutation of anti-vaccine myths: better and more thorough reviews abound, with some examples here, here, and here. Instead, we’ll walk through some of the research on how we humans think about and weigh risk.
Research has consistently found that humans are not naturally good at understanding very, very big or very, very small numbers.
This is why you see folks popping an Ativan and taking deep breaths as a commercial airplane taxis but not thinking twice about hopping in a cab to drive home. Purely by the numbers, this is backwards: from 2002 to 2022, there were fewer than 700 serious injuries in US air travel, compared to 48 million injuries from automobiles. (This point was much easier to make to my students before the spate of deadly plane crashes that took place in the first months of the second Trump presidency.)
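To put those two figures side by side, here’s a quick back-of-the-envelope calculation in Python. These are the raw counts cited above, not per-mile or per-trip rates, so treat it as an illustration of scale rather than a proper risk model:

```python
# Raw injury counts cited above, 2002-2022. These are totals, not
# exposure-adjusted rates, so this only gives a rough sense of scale.
air_injuries = 700          # upper bound: "fewer than 700" serious air-travel injuries
car_injuries = 48_000_000   # automobile injuries over the same 20-year window

print(f"Cars produced roughly {car_injuries / air_injuries:,.0f}x more injuries.")
# -> Cars produced roughly 68,571x more injuries.
```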
The same is true when we talk about probabilities. No matter how many times you pull the same slot machine, your chances of winning stay the same each go. Yet people will insist they’re on a ‘hot streak’ or that ‘their turn is coming up,’ and they can lose huge amounts of money in the process.
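This is the gambler’s fallacy, and you can watch it fail to materialize in a quick simulation. Below is a minimal sketch in Python, assuming a made-up 1-in-20 payout probability (the real odds don’t matter for the point): a long losing streak does nothing to the odds of the next pull.

```python
import random

random.seed(42)
WIN_PROB = 0.05  # hypothetical 1-in-20 machine; any fixed probability works

pulls = [random.random() < WIN_PROB for _ in range(1_000_000)]

# Compare the overall win rate to the win rate immediately after
# ten consecutive losses ("my turn is coming up").
wins_after_streak = 0
chances_after_streak = 0
for i in range(10, len(pulls)):
    if not any(pulls[i - 10:i]):   # the previous ten pulls were all losses
        chances_after_streak += 1
        wins_after_streak += pulls[i]

print(f"Overall win rate:         {sum(pulls) / len(pulls):.4f}")
print(f"Win rate after 10 losses: {wins_after_streak / chances_after_streak:.4f}")
# Both print ~0.05: the machine has no memory, and no streak changes that.
```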
Three sources of cognitive bias
Cognitive biases are ways our brains convince us to act in irrational or non-normative ways. The biases behind these mis-assessments of risk were part of the research that earned Daniel Kahneman the 2002 Nobel in economics. (For those interested, Kahneman expounds on these biases for a non-academic audience in Thinking, Fast and Slow.)
To see this for yourself, I want you to think about the room you’re sitting in. Looking at the floor layout, try to draw a line in your head halfway down the room. For most of us, this isn’t too difficult. We can fairly easily visually divide a space into thirds or quarters or even tenths.
It becomes much trickier when I ask you to picture 1/100th of the room, or 1/1,000th, or 1/1,000,000th.
The same is true in the other direction: most of us could easily picture what it would look like if the room were doubled or tripled in square footage. But what about 20x? 100x? 1,000x? How far would you need to go before your room was the size of a city block, or the state of Colorado?
These biases are also why we have a hard time comparing very small or very big numbers. If I asked you to picture a line with 0 on one end and a billion on the other, how close would you place the one-million marker to either end? What does it mean for something to have a 1 in 1,000 chance of death versus a 1 in 10,000 chance?
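To make those scales concrete, here’s a rough calculation in Python. The 150-square-foot room is a made-up example, the city-block figure is a very loose approximation (block sizes vary widely), and the Colorado figure is its approximate published area:

```python
# Rough scale comparisons. The room is a made-up 150 sq ft; the block
# size is city-dependent, and Colorado is ~104,094 square miles.
room_sqft = 150
city_block_sqft = 100_000             # roughly 2+ acres, varies by city
colorado_sqft = 104_094 * 5280 ** 2   # square miles -> square feet

print(f"Room -> city block: ~{city_block_sqft / room_sqft:,.0f}x")   # ~667x
print(f"Room -> Colorado:   ~{colorado_sqft / room_sqft:.1e}x")      # ~1.9e+10x

# And the billion line: if it spanned one meter, where would one million sit?
print(f"{1_000_000 / 1_000_000_000 * 1000:.0f} mm from the zero end")
# -> 1 mm: a million and a billion feel adjacent, but they aren't even close.
```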
Finally, we respond differently to events that will happen with certainty versus those that are uncertain (e.g., the guarantee of getting $1 versus a 10% chance of getting $10), and differently again when a low-probability event carries very high consequences (e.g., dying in an airplane accident).
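That first comparison is what economists call expected value, and the arithmetic shows why preferring the sure dollar is an intuition rather than a calculation:

```python
# Expected value of the two offers in the example above
certain_dollar = 1.00      # 100% chance of $1
risky_ten = 0.10 * 10.00   # 10% chance of $10

print(certain_dollar, risky_ten)  # 1.0 1.0 -> identical expected value
# Yet most people reliably take the sure dollar. The two offers are
# mathematically equivalent; our intuitions insist they are not.
```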
These are only a few of the cognitive biases that impact our choices every day. We’ve also seen that people demand far more money to give up something they already own than they would pay to acquire the very same thing (the endowment effect). We often weight evidence that supports what we already believe (or want to believe) over comparable evidence against it. We weigh the short term more heavily than the longer term (like the uncomfortable 24-hour side-effect window for the COVID-19 vaccine over the potential of developing long COVID). We think ourselves special and discount the risk that something bad will happen to us. (This bias is behind knowing that biking and skiing helmets are important but choosing to go without.)
The goal here isn’t to convince anyone that they’re making a ‘mistake’ or that they’re ‘bad at math’: no matter how much math you know, these tiny and massive numbers are hard to grasp. Instead, it’s important that we understand how these biases impact the life-and-death choices we make every day.
What does this have to do with vaccines?
Everything.
Scientific research lives in these numbers—we celebrate the COVID-19 vaccines for reducing our chances of catching the virus by 54%, needing to go to urgent care or the ER for COVID-19 by 33%, and dying by over 90%. Yet research has shown that we are inherently bad at understanding how those odds and numbers turn into our own personal risk and the risk to our loved ones—and what choices we should make for our own and our community’s health.
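Part of the difficulty is that those figures are relative risk reductions, not personal probabilities. Here’s a sketch of the translation: the 54% is the infection estimate cited above, but the 10% baseline risk is a made-up number purely for illustration:

```python
# Translating a relative risk reduction into personal terms.
# The 54% comes from the effectiveness estimate cited above; the
# 10% baseline risk is a hypothetical number for illustration only.
baseline_risk = 0.10            # assumed chance of catching COVID-19 this season
vaccine_effectiveness = 0.54    # relative reduction in infection risk

vaccinated_risk = baseline_risk * (1 - vaccine_effectiveness)
print(f"Unvaccinated: {baseline_risk:.1%} -> Vaccinated: {vaccinated_risk:.1%}")
# -> Unvaccinated: 10.0% -> Vaccinated: 4.6%
```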
Anti-vaccine arguments often focus on ‘adverse events’ (when a patient has a negative reaction to a vaccine, like an allergic reaction) or on disproven claims linking vaccines and autism. The latter is a myth used to justify not protecting kids from deadly childhood illnesses like measles and polio. The former, adverse events, are uncommon by construction: if they were common, the vaccine would never have made it out of clinical trials.
Our cognitive biases around risk and probability mean that, every day, parents choose not to vaccinate their children out of a disproportionate fear of unlikely (or impossible) unintended consequences. Confirmation bias also plays a role here, with folks seeking out the one website or study that agrees with them instead of weighing the overwhelming body of evidence saying vaccines work and are safe. And getting the flu despite having a flu shot doesn’t mean it ‘didn’t work.’ Breakthrough infections happen, and it sucks to take an hour out of your day getting vaccinated yet still end up sneezing. But at a community level, all of those vaccinations mean that fewer people overall get sick, even if some still do.
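That community-level effect shows up even in a toy simulation. Everything below is a made-up illustration (10,000 people, a 30% chance of infection when unprotected, a 60% effective shot), not an epidemiological model; it ignores transmission dynamics and herd immunity entirely, so if anything it understates the benefit:

```python
import random

random.seed(0)
N = 10_000                # toy community; all numbers here are illustrative
attack_rate = 0.30        # hypothetical infection chance if unprotected
effectiveness = 0.60      # hypothetical flu-shot effectiveness

def infections(coverage: float) -> int:
    """Count infections when `coverage` fraction of the community is vaccinated."""
    count = 0
    for _ in range(N):
        vaccinated = random.random() < coverage
        risk = attack_rate * (1 - effectiveness) if vaccinated else attack_rate
        count += random.random() < risk
    return count

print(f"  0% vaccinated: {infections(0.0):,} infections")
print(f" 80% vaccinated: {infections(0.8):,} infections")
# Breakthrough cases still happen in the vaccinated scenario,
# but the community ends up with far fewer infections overall.
```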
RFK’s new position at the head of the US public health apparatus was a low-probability event—one that many of us were disproportionately worried about because of how consequential it would be…and is already proving to be.
Last thoughts
Many of us are very reasonably scared right now—of the dismantling of the social safety net, of becoming ill, of unknowingly exposing our children to irreparable harm. But, just as with airplane travel, we don’t weigh these fears in proportion to their actual risks. This means that, sometimes, we let the fear of an adverse vaccine reaction (a very unlikely event) overrule our fear of contracting a preventable illness (which, as the CDC is gutted, is becoming a more likely event).
In the face of rampant misinformation about healthcare, research, and safety, it’s important that we push back against our cognitive biases. We must work to recognize when we’re making decisions based on biased judgments and gut instincts, especially when bad-faith actors are hijacking and misusing our very real fears to obfuscate fact and reason.
Thanks are due to Esteban J. Quiñones for feedback on this post.
I like to joke that humans are so bad at measuring risk we had to develop a science to counter our hubris.