Ever Wonder? from the California Science Center

...how to look out for vaccine misinformation? (with Tara Haelle)

June 09, 2021 California Science Center Season 2 Episode 10

A few weeks ago, we spoke to journalist Tara Haelle (@tarahaelle), who’s written about vaccines and vaccine hesitancy for a decade. If you haven’t already listened to her first episode, where she breaks down how to make sense of all the news around COVID vaccines, please go check it out. We had such a great conversation with Tara that we couldn’t fit everything in a single episode—so now we’re airing more. 

 There’s so much news coming out about COVID vaccines, it can be hard to keep up. And it doesn’t help that our best efforts to make rational decisions can be thwarted at every turn. It turns out that our own brains can sometimes mislead us, making it harder to judge whether what we’re learning is likely to be true or not.

 Do you ever wonder how to look out for vaccine misinformation?

 Tara had some great advice for dealing with misinformation. And she explained why humans are so bad at evaluating risk and what we can try to do to be better at it.

Have a question you've been wondering about? Send an email or voice recording to everwonder@californiasciencecenter.org to tell us what you'd like to hear in future episodes.

Follow us on Twitter (@casciencecenter), Instagram (@californiasciencecenter), and Facebook (@californiasciencecenter).

Perry Roth-Johnson:

Hello! This is Ever Wonder? from the California Science Center. I'm Perry Roth-Johnson. A few weeks ago, we spoke to journalist Tara Haelle, who's written about vaccines and vaccine hesitancy for a decade. If you haven't already listened to her first episode, where she breaks down how to make sense of all the news around COVID vaccines, please go check it out. We had such a great conversation with Tara that we couldn't fit everything in a single episode—so now we're airing more. There's so much news coming out about COVID vaccines, it can be hard to keep up. And it doesn't help that our best efforts to make rational decisions can be thwarted at every turn. It turns out that our own brains can sometimes mislead us, making it harder to judge whether what we're learning is likely to be true or not. Do you ever wonder how to look out for vaccine misinformation? Tara had some great advice for dealing with misinformation. And she explained why humans are so bad at evaluating risk and what we can try to do to be better at it. Take a listen.

Devin Waller:

Evaluating risk has been an important part of navigating the pandemic in our daily lives. We've had to make risk assessments about what activities and choices to make in order to slow the spread of this disease. And now, with the vaccines available, we have to evaluate the benefits versus the risks of getting vaccinated. So, first of all, why are humans so bad at evaluating risks to begin with? And how can we get better at that?

Tara Haelle:

I love the way you phrase that, because it acknowledges right up front that we are really, really lousy at assessing risks. We just aren't. Well, okay, we're good at assessing the risks that our ancestors had to worry about. We're good at hearing a lion roar and knowing that we need to get into that cave fast. That kind of risk we're good at, because that's how our brains developed to assess risk. But today there are not lions chasing us into caves; we don't have to worry about that anymore. We have to decide if getting onto a giant, several-ton piece of metal that supposedly stays in the air by itself to go from New York to LA is safe. And if I described it that way, people are like, oh no, that's not safe. But hey, airplanes are safer than cars.

Our brains rely more heavily on concrete things. When you say that there is a one in 500,000 chance of something happening, that's abstract. Even more abstract than that is if we say there's a 0.0002% chance of something happening. What does that mean? Sometimes what helps is to do the analogy. So if someone says there's a one in 500,000 chance of something happening, I think about where I went to high school, which is Arlington, Texas, and that city is about 350,000 people. And I think, okay, let me add another 150,000 onto that. Well, I don't know how big Mansfield is next door; let's say Mansfield is 150,000. So one in 500,000 is one person out of the entire cities of Arlington and Mansfield combined. Okay, well, that actually makes me understand it more. Now I have something to grab onto that makes sense to me. So you really have to use that.

Our sense of risk is also affected by those same biases I've talked about before, and one of the most common ones is availability bias. Availability bias means that we fixate more on something we heard about recently or that sounds more dramatic, and that stands out as a bigger risk than the more mundane things we're familiar with. So if you're going to the beach and you watched Jaws a couple of weeks earlier, which I don't advise, by the way, your friends are probably going to have to drag you kicking and screaming into that water, because you're going to be worried that there's a shark in the water. Never mind the fact that sharks are mostly out around dawn and dusk and are more scared of you than you are of them, or the fact that we kill maybe a hundred million sharks a year and they kill maybe five of us a year, right? Those numbers aren't what's in your head. What's in your head is, oh my God, I'm going to die of a shark attack. Right? And also not in your head is the fact that the undertow might cause you to drown, or that the sunburn is going to give you a higher risk of skin cancer, or that simply driving there you have a risk of getting into a car accident. All of those are much, much, much greater risks than a shark encounter, but that's not what you're thinking about, because it's not the dramatic thing that stands out in your head. And that happened a lot with vaccines, with the vaccines-and-autism thing. Well, I heard that my next-door neighbor's cousin's daughter got autism after she had the vaccine. First of all, we know that that was a coincidence.
Now we know that there's no connection, but even if we didn't know that yet, you haven't heard from the other 200,000 or so daughters of cousins of neighbors who got a vaccine and nothing happened, right? You've only heard about that one. So our brains latch onto those risks, and it warps our sense of risk.

How can we get better at it? One is what I said already: use those analogies to try and make some sense of the numbers. And, this is hard, try to focus on the numbers rather than the emotions, or examine your emotions and try to pinpoint where those emotions are coming from. And I say that's hard because it's kind of like telling somebody who's having an anxiety attack, have you tried not having an anxiety attack? Right? If you're uptight or anxious about something and someone says, calm down, what's the first thing you do? [inaudible] So I hesitate to say that, because, you know, easier said than done, right?

But it is hard, and you have to compare risks. With the blood clots with the Johnson & Johnson vaccine, we do know that those were almost certainly caused by the vaccine, and they occurred in, what was it, six cases in 7 million shots or something like that. Compare that to this: of the people who get COVID and are not hospitalized, one in 100 to 125 will have a blood clot. When you're hospitalized for COVID, it's one in 20; for every 20 people in the hospital for COVID, one of them is going to get a blood clot. If you're in the ICU, it's one in five. Wow. So, you know, put that into perspective; compare those risks. And it's hard to do that, but that's what you have to do. You have to really look at those risks by the numbers and then say, it is totally rational in a human sense for me to be scared that I'm going to be the one in a million, but the odds are much higher that I'm going to get COVID and develop a blood clot. And it's hard to do. I don't have an easy solution to that, because it is a natural human inclination to worry that you're going to be the one. That makes sense; I'm not going to say that's irrational. I mean, I guess it is technically irrational, but it is built into our DNA. So, you know, you're not crazy if that's the way you think. You just have to find ways to overcome it. Yeah.
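
To put the numbers Tara just cited side by side, here's a quick illustrative sketch in Python. It simply converts each rate she mentions (six clots per 7 million J&J shots; one in 125, one in 20, and one in 5 for non-hospitalized, hospitalized, and ICU COVID patients) into cases per million. The figures are as stated in the conversation, not drawn from a primary source.

```python
# Side-by-side comparison of the risk figures cited in the conversation.
# These numbers are as stated in the episode, not authoritative data.

def per_million(probability: float) -> float:
    """Express a probability as expected cases per million people."""
    return probability * 1_000_000

risks = {
    "Blood clot after J&J shot (~6 in 7,000,000)": 6 / 7_000_000,
    "Blood clot with COVID, not hospitalized (~1 in 125)": 1 / 125,
    "Blood clot with COVID, hospitalized (~1 in 20)": 1 / 20,
    "Blood clot with COVID, in ICU (~1 in 5)": 1 / 5,
}

for label, p in risks.items():
    print(f"{label}: {p:.6%}, or about {per_million(p):,.1f} per million")

# The earlier one-in-500,000 example, expressed as a percentage:
p = 1 / 500_000
print(f"1 in 500,000 = {p:.4%}, or {per_million(p):.0f} per million")
```

Run as-is, it shows the gap at a glance: on these figures, roughly one clot per million J&J doses versus thousands per million among people who actually catch COVID.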

Perry Roth-Johnson:

It's not unreasonable. I mean, I've been in those conversations personally. I think another analogy I heard that I liked, because it involved such a common drug: I think it was Dr. Francis Collins, the head of the NIH, on Meet the Press. He said, and I'm paraphrasing here, that you're less likely to get a blood clot from J&J than to be struck by lightning next year, and that the risk of aspirin inducing a significant intestinal bleed is much higher than what we're talking about here, and people take aspirin all the time.

Tara Haelle:

Yeah, exactly. And it's always a benefit-risk calculation. I think one thing that helps with risk is to keep in mind that there's no such thing as a risk-free option. Right?

Actually, if you'll indulge me for a minute, I'm going to read a quote from someone I think your audience will know no matter who's in the audience: one of our founding fathers, the inventor of loads of things, quite the dilettante in France, Benjamin Franklin. Benjamin Franklin had the opportunity to inoculate his son against smallpox. Vaccination is a form of inoculation, but this was different; this was variolation, which predated vaccination, and it did carry risks, many more risks than vaccination. But his quote was: "In 1736 I lost one of my sons, a fine boy of four years old, by the smallpox taken in the common way." In other words, he caught smallpox in the wild and died. "I long regretted bitterly, and still regret, that I had not given it to him by inoculation." And he says, you know, for those who are making the same decision, know that the pain is just as bad either way.

What he was describing is something called omission bias, which is when you believe that you'll escape the risk by omitting the action. If you don't do the action, you feel less responsible if something bad happens to you, because you chose not to act. So if I don't get the vaccine and I get COVID, well, that's just bad luck. But if I get the vaccine and I get an adverse event, well, that's my fault because I got the vaccine. That's faulty thinking, because not acting is also an active choice. And the best comparison is seatbelts. If you put on the seatbelt and then you get injured in an accident as a result of having put on your seatbelt, the right response to that is not, I shouldn't have put on the seatbelt, right? Because far more often, the seatbelt's going to save your life. It's not that no one's ever been hurt by a seatbelt, but the relative risk says it's definitely better to be wearing one.

So that's a really hard one. Just keep in mind that there is no such thing as a risk-free choice in life. Staying in your house nonstop... well, not during a pandemic. I'm not going to use that example; scratch that, I can't use that example anymore. Yeah. Drinking water, you could choke on the water. And a lot of times, the way to overcome those fears is to do something right away. You know, if you fall off your bike, you get back on the bike as soon as you can. If you go into the water and swallow it and it gets up your nose in the pool, the best thing to do is to get back in the pool, as long as someone's there with you. One time I was eating fruit at a bicycle workshop I was at, and I choked on some honeydew. It was stuck in my throat and I couldn't breathe, and someone had to do the Heimlich on me. And the very first thing I did after I got it out was take some breaths, drink some water, and then eat another piece of honeydew, because I knew that if I didn't, I would never touch honeydew again. So you have to examine your own fears and really think about, why am I scared of this? And even if it feels this way, what's the rational response to that?
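
Omission bias can also be framed as a simple expected-value comparison. The sketch below uses purely hypothetical numbers (they come from neither the episode nor any study) just to show the structure of the argument: not acting still carries a risk, so the honest move is to compare expected harm across both choices.

```python
# Illustrating omission bias with an expected-value comparison.
# All probabilities and harm values here are hypothetical, chosen only
# to show the shape of the reasoning -- not real seatbelt or vaccine data.

def expected_harm(p_bad_outcome: float, severity: float) -> float:
    """Expected harm = probability of a bad outcome times its severity."""
    return p_bad_outcome * severity

# Acting (e.g., buckling the seatbelt): a tiny chance the action itself hurts you.
act = expected_harm(p_bad_outcome=1e-6, severity=100.0)

# Omitting the action: you keep the much larger underlying risk.
omit = expected_harm(p_bad_outcome=1e-3, severity=100.0)

print(f"expected harm if you act:  {act:.4f}")
print(f"expected harm if you omit: {omit:.4f}")
# Omitting "feels" safer because no decision seems to have been made,
# but with these numbers it is 1,000 times worse. Not acting is also a choice.
```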

Devin Waller:

There's been an onslaught of misinformation and even disinformation online. What should people look for to avoid falling into these traps?

Tara Haelle:

Well, first of all, it's helpful for people to understand the difference between misinformation and disinformation. Misinformation is just what it sounds like: it's inaccurate information, regardless of intent. It's just not factual. It may be accidentally unfactual, it may be intentionally unfactual, but it's just not factual information. Disinformation is an intentional attempt to mislead. The person misleading either knows that it's not accurate information, they're intentionally lying, or they have made a conscious effort not to find contradictory information. They haven't made an intellectually honest attempt to confirm that it's accurate, right? They've been neglectful. So that's one thing to keep in mind. And there are lots of motivations for disinformation agents; the two biggest ones are usually political and financial. You know, in the anti-vaccine community, there's a lot of money in being an anti-vaccine advocate.

In terms of being able to recognize it, the first thing you should do is cross-check anything you find. What are other outlets saying? If you're only seeing it in a certain type of news, if you're only seeing it on super liberal blogs or super conservative blogs, or if you're seeing it on a site that's selling something, I don't trust it. If I find a doctor who is selling something on their site, that doesn't mean that doctor is not a good doctor or is pseudoscientific or anything, but I'm not going to trust what's on his site, because I can't be sure that what he's telling me isn't connected to what he's selling. So I look at what they're selling, you know, what is their motivation? It's sort of like being back in school: author's purpose. What's the purpose of the writer? Why is this person communicating this information? What do they have to gain from it?

And a lot of people think this is kind of ironic, because there are people who think that journalists have this agenda, right? And there are some journalists who have agendas, and there are opinion journalists out there, and there are journalism outlets that might have a particular point of view. But in terms of straight news, the average journalist's goal is to serve the public by providing them accurate and thorough information, and then getting a paycheck for doing that. And that paycheck doesn't come from a company beyond the company that's hired them to write the story. So if I write something for The Washington Post, I get my paycheck from The Washington Post. Now, The Washington Post is owned by Jeff Bezos, but I don't have anything to do with him. He doesn't know me, I don't know him. It's only my editors at The Washington Post that I'm dealing with.

So: think about author's purpose, cross-check, use fact-checkers, and look out for things that seem too good to be true or too scary. Now, that last one is tricky, because at the start of the pandemic we were all terrified, and frankly, we had good reason to be. I'm not saying that just because the news is scary, it's automatically not true, but it means you should have a higher threshold of skepticism about it. In the case of the pandemic, the reason I was like, okay, there's good reason to be scared, is because the infectious disease experts I followed were scared.
In fact, a friend asked me, oh gosh, this was maybe six or eight months ago, she said, Tara, how do I know when to worry about the variants, or something like that? And I said, when Anthony Fauci worries, then you worry. And it was funny, because about a month ago she sent me a text with a headline about Fauci, and she was like, Fauci is worried, now I'm worried. And I said, you have a right to be. Yeah. So you find the experts and see, you know, when they worry, then you worry. If they're feeling good, then you should feel good.

So I think where it gets most challenging with misinformation is when it has to do with political or ideological things, where there isn't a single right answer: there are facts that can support one position and facts that can support another position, but neither of those positions can be considered the, quote unquote, right position. When you have things like that, it's really hard to pick apart when you might be misinformed, because you might be following someone who shares your values and shares your ideas and worldview and ideology. That doesn't mean they're not misleading you sometimes, or misleading themselves and sharing that with you without realizing it.

So I would encourage people to regularly examine their own biases. We all have biases; you cannot escape bias, it is a human trait. In fact, dogs have biases, okay? It's how they get trained. How do you train? You train with association, right? So it's a reality. It's in our DNA, it's in our neurons. We can't not have bias. What we can do is be aware that that bias is there, look for it, and then find ways to examine it. That doesn't mean counter it necessarily, but examine it: am I being as thoughtful about this issue as I could be? Expose yourself to people who are different from you and people who have different ideas from you, who are respected in their circles. If you're a Republican, look at who the Democrats support; if you're a Democrat, who the Republicans look up to, and look at what they're saying. And I'm not saying you won't get angry sometimes by looking at it, or upset or frustrated, but expose yourself to that so that you know you're not staying in an echo chamber, and that will help you.

One of the most insidious types of bias is confirmation bias. Confirmation bias means that you are unconsciously looking for things, or clicking on things, or exposing yourself to things that already fit with what you believe, so it reaffirms, it confirms, what you already believe. So you have to make a conscious effort to look for things that do not necessarily fit with what you already agree with. It doesn't mean it's going to change your mind, but you have to examine them and make sure that you're not leaving out things that you hadn't thought of. Otherwise, you won't be exposed to new information that might change your mind.

I laugh, because someone in my family would often joke about the fact that Tara is always right, Tara thinks she's always right. And I actually got really offended when they said that, because it's a good day when I find out I'm wrong. It's good to find out you're wrong; it's good to tell people you're wrong. One time someone asked on Twitter, who here has ever admitted they were wrong about something? Find some tweets. And I went back and found like 15 tweets easily.
Look, I was wrong about this, and I was wrong about that. I like being wrong, because it means that I'm still learning, still bringing in new information. And, I hate to use the word arrogant, but I am humble enough to realize that I am never going to know everything. Sometimes I'm going to have ideas about things based on information that's limited to a certain point, and then when I get more information, that might change what I believe. And changing my mind is not a weakness. It doesn't make me a flip-flopper, like we hear in politics sometimes. It means that I'm constantly taking in new information and re-evaluating what I think and what I believe based on that new information. And guess what, that's how science works, and that's why scientists change their minds. You know, I never understood the idea of blaming a politician for being a flip-flopper, because to me, a politician changing their mind is evidence that they're taking in new information and reconsidering things, and I think that's a positive attribute.

So when I was working on my book, The Informed Parent, there were two topics that I knew I had my own confirmation bias on. I knew that I felt strongly about them, and I knew that my opinions had opposing opinions in the community. And when I wrote those sections, I set out to prove myself wrong. I acted like I was a debate candidate having to prepare to argue the opposite side of what I believed, and I explicitly looked for evidence to support the other side. As it turned out, one of my beliefs was further affirmed; I felt even more strongly about it after doing that exercise. On the other one, I did a 180: I realized I had been wrong, I completely changed my perspective on it, and it changed the way I wrote about it. So I think that's an important exercise, and it shows that it can happen both ways. In one case, I found more evidence that I didn't even know about that supported what I thought. On the other, I found evidence that I wasn't aware of, and flaws in the evidence I had been relying on that I hadn't considered. Both of those made me a better science consumer, and both of those made me a better journalist.

Perry Roth-Johnson:

Yeah. Got it. That's our show, and thanks for listening! Until next time, keep wondering. Ever Wonder? from the California Science Center is produced by me, Perry Roth-Johnson, along with Devin Waller. Liz Roth-Johnson is our editor. Theme music provided by Michael Nickolas and Pond5. We'll drop new episodes every other Wednesday. If you're a fan of the show, be sure to subscribe and leave us a rating or review on Apple Podcasts—it really helps other people discover our show. Have a question you've been wondering about? Send an email or voice recording to everwonder@californiasciencecenter.org to tell us what you'd like to hear in future episodes.