Keeping the problem of vaccine misinformation in perspective


The most prominent vaccine-skeptical public figures, like Tucker Carlson or Senator Ron Johnson (R-Wisconsin), understand this. They don't need to spread obvious lies. They can simply focus, night after night, on unusual cases of serious side effects. Or they can selectively present the results of scientific studies or government communications in ways that seem to suggest something ominous about the virus or the vaccines. Or they can bypass the scientific question entirely by ranting about how the government's vaccine push is really about social control. Like all illusionists, they know that the most powerful tool available to them is not misinformation but misdirection.

This subtle distinction is often lost on members of the media and the political establishment. At times, "misinformation" becomes a catchall term for any material used to discourage people from getting the shot, whether it is objectively false or not. A recent New York Times article about an influential anti-vaxxer, Joseph Mercola, titled "The Most Influential Spreader of Coronavirus Misinformation Online," concluded by noting that Mercola had posted on Facebook suggesting that the Pfizer vaccine is only 39 percent effective against Delta infection. Mercola accurately conveyed the finding of the actual study, which had already been covered in the mainstream news. The Times article faulted him, however, because he did not mention the study's second finding: that the vaccine was 91 percent effective against serious illness.

No doubt Mercola, an osteopathic physician who has gotten rich selling "natural" health products often marketed as alternatives to vaccines, would have done his followers a service by sharing that second data point. Cherry-picking real statistics to sow doubt about vaccines is pernicious. But sweeping that example under the umbrella of misinformation indulges in concept creep. Misleading is not the same as misinformation, and the difference is not merely semantic. Facebook, YouTube, and Twitter are under enormous pressure, not least from prominent media organizations, to do more to prevent the spread of dangerous untruths on their platforms. It would be worrying for online free speech if, in the name of preventing real-world harm, platforms routinely suppressed as "misinformation" posts that contain nothing objectively false. Distinguishing truth from untruth is hard enough. It would be reckless to ask platforms to take on the further job of judging whether a user's interpretation of the facts, that is, their opinion on a question of public policy, is acceptable or not.

"Surely misinformation makes things worse," said Gordon Pennycook, a behavioral scientist at the University of Regina. "There are people who believe false things, and they read them online. That must be happening." But, Pennycook continued, "the more you focus on that, the less you talk about the ways people become hesitant that have nothing to do with misinformation."

In his research, Pennycook runs experiments to find out how people actually respond to misinformation online. In one study, he and his co-authors tested whether people were more likely to believe the claim in a fake-news headline after being exposed to it online. (Sample headline: "Mike Pence: Gay Conversion Therapy Saved My Marriage.") In one phase of the experiment, exposure to a fake headline increased the number of people who rated its claim as accurate from 38 to 72. You could look at that and say that online misinformation increased belief by 89 percent. Or you could note that there were 903 participants in total, which means the headlines swayed only about 4 percent of them.
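The gap between those two framings is just arithmetic on the same three numbers. A minimal sketch, using the figures quoted above (the variable names are illustrative, not from the study):

```python
# Two framings of the same experimental result.
believers_before = 38   # rated the fake headline accurate without prior exposure
believers_after = 72    # rated it accurate after exposure
participants = 903      # total sample size

swayed = believers_after - believers_before  # 34 people changed their rating

# Relative framing: belief rose by roughly 89 percent.
relative_increase = swayed / believers_before
print(f"relative increase in belief: {relative_increase:.0%}")

# Absolute framing: only about 4 percent of all participants were swayed.
absolute_share = swayed / participants
print(f"share of participants swayed: {absolute_share:.1%}")
```

Both statements are true; they differ only in the denominator, which is why the same result can sound either alarming or modest.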

The current debate about vaccine misinformation sometimes seems to assume that we live in the 89 percent world, but the 4 percent figure is probably the more useful guide. It would still be a serious problem if even a small share of Facebook or YouTube users were susceptible to vaccine misinformation: they would be more likely to refuse vaccination, get sick, and spread the virus, and perhaps their false beliefs, to others. At the same time, it's important to keep in mind that roughly one third of adult Americans are still choosing not to get vaccinated. Even if Facebook and YouTube could scrub all anti-vax content from their platforms overnight, that would take only a small bite out of a much bigger problem.
