Covid Test Misinformation Spikes Along With Spread of Omicron

The added demand for testing and the higher prevalence of breakthrough cases have created an “opportune moment” to exploit.

On Dec. 29, The Gateway Pundit, a far-right website that often spreads conspiracy theories, published an article falsely implying that the Centers for Disease Control and Prevention had withdrawn authorization of all P.C.R. tests for detecting Covid-19. The article collected 22,000 likes, comments and shares on Facebook and Twitter.

On TikTok and Instagram, videos of at-home Covid-19 tests displaying positive results after being soaked in drinking water and juice have gone viral in recent weeks and have been used to push the false narrative that coronavirus rapid tests don’t work. Some household liquids can make a test show a positive result, health experts say, but the tests remain accurate when used as directed. One TikTok video showing a home test that came out positive after being placed under running water was shared at least 140,000 times.

And on YouTube, a video titled “Rapid antigen tests debunked” was posted on Jan. 1 by the Canadian far-right website Rebel News. It generated over 40,000 views, and its comments section was a hotbed of misinformation. “The straight up purpose of this test is to keep the case #’s as high as possible to maintain fear & incentive for more restrictions,” said one comment with more than 200 likes. “And of course Profit.”

Misinformation about Covid-19 tests has spiked across social media in recent weeks, researchers say, as coronavirus cases have surged again worldwide because of the highly infectious Omicron variant.

The burst of misinformation threatens to further stymie public efforts to keep the health crisis under control. Previous spikes in pandemic-related falsehoods focused on the vaccines, masks and the severity of the virus. The new falsehoods help undermine best practices for controlling the spread of the coronavirus, health experts say, and misinformation remains a key factor in vaccine hesitancy.

The falsehoods fall into several categories: that P.C.R. tests don’t work; that the counts for flu and Covid-19 cases have been combined; that P.C.R. tests are vaccines in disguise; and that at-home rapid tests have a predetermined result or are unreliable because different liquids can turn them positive.

These themes jumped into the thousands of mentions in the last three months of 2021, compared with just a few dozen in the same period of 2020, according to Zignal Labs, which tracks mentions on social media, on cable television and in print and online outlets.

The added demand for testing due to Omicron and the higher prevalence of breakthrough cases have given purveyors of misinformation an “opportune moment” to exploit, said Kolina Koltai, a researcher at the University of Washington who studies online conspiracy theories. The false narratives “support the whole idea of not trusting the infection numbers or trusting the death count,” she said.

The Gateway Pundit did not respond to a request for comment. TikTok pointed to its policies prohibiting misinformation that could cause harm to people’s physical health. YouTube said it was reviewing the videos shared by The New York Times in line with its Covid-19 misinformation policies on testing and diagnostics.
Twitter said that it had applied a warning to The Gateway Pundit’s article in December for violating its coronavirus misinformation policy, and that tweets containing false information about widely accepted testing methods would also violate the policy. But the company said it does not take action on personal anecdotes.

Facebook said it had worked with its fact-checking partners to label many of the posts with warnings directing people toward fact checks of the false claims, and that it had reduced the posts’ prominence in users’ feeds.

“The challenges of the pandemic are constantly changing, and we’re consistently monitoring for emerging false claims on our platforms,” Aaron Simpson, a Facebook spokesman, said in an email.

No medical test is perfect, and legitimate questions about the accuracy of Covid-19 tests have abounded throughout the pandemic. There has always been a risk of a false positive or a false negative result. The Food and Drug Administration says antigen tests can return false positive results when users do not follow the instructions. The tests are generally accurate when used correctly but can in some cases appear to show a positive result when exposed to other liquids, said Dr. Glenn Patriquin, who published a study on false positives produced by various liquids in antigen tests in a publication of the American Society for Microbiology.

“Using a fluid with a different chemical makeup than what was designed means that result lines might appear unpredictably,” said Dr. Patriquin, an assistant professor of pathology at Dalhousie University in Nova Scotia.

Complicating matters, there have been some defective products. Last year, the Australian company Ellume recalled about two million of the at-home testing kits it had shipped to the United States.

But when used correctly, coronavirus tests are considered reliable at detecting people carrying high levels of the virus. Experts say that evolving knowledge of the tests should be treated as a distinct issue from the lies about testing that have spread widely on social media, though that evolving picture does make debunking those lies more challenging.

“Science is inherently uncertain and changes, which makes tackling misinformation exceedingly difficult,” Ms. Koltai said.

YouTube Bans Anti-Vaccine Misinformation

YouTube said on Wednesday that it was banning several prominent anti-vaccine activists from its platform, including the accounts of Joseph Mercola and Robert F. Kennedy Jr., as part of an effort to remove all content that falsely claims that approved vaccines are dangerous.

In a blog post, YouTube said that it would remove videos claiming that vaccines do not reduce transmission or contraction of disease, as well as content that includes misinformation about the contents of the vaccines. Claims that approved vaccines cause autism, cancer or infertility, or that the vaccines contain trackers, will also be removed.

The platform, which is owned by Google, already had a similar ban on misinformation about the Covid-19 vaccines. But the new policy expands the rules to misleading claims about other approved vaccines, such as those against measles and hepatitis B, as well as to falsehoods about vaccines in general, YouTube said. Personal testimonies relating to vaccines, content about vaccine policies and new vaccine trials, and historical videos about vaccine successes or failures will be allowed to remain on the site.

“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board” in policies that bring its users high-quality information, the company said in its announcement.

Misinformation researchers have for years pointed to the proliferation of anti-vaccine content on social networks as a factor in vaccine hesitancy, including slowing rates of Covid-19 vaccine adoption in more conservative states. Reporting has shown that YouTube videos often act as the source of content that subsequently goes viral on platforms like Facebook and Twitter, sometimes racking up tens of millions of views.

YouTube said that in the past year it had removed over 130,000 videos for violating its Covid-19 vaccine policies. But that figure did not include what the company called “borderline videos” discussing vaccine skepticism. In the past, the company simply removed such videos from search results and recommendations, while promoting videos from experts and public health institutions.

Facebook Groups Promoting Ivermectin as a Covid-19 Treatment Flourish

Facebook has become more aggressive at enforcing its coronavirus misinformation policies in the past year. But the platform remains a popular destination for people discussing how to acquire and use ivermectin, a drug typically used to treat parasitic worms, even though the Food and Drug Administration has warned people against taking it to treat Covid-19.

Facebook has taken down a handful of the groups dedicated to these discussions. But dozens more remain up, according to recent research, and in some of those groups, members discuss strategies to evade the social network’s rules.

Media Matters for America, a liberal watchdog group, found 60 public and private Facebook groups dedicated to ivermectin discussion, with tens of thousands of members in total. After the organization flagged the groups to Facebook, 25 of them closed down. The remaining groups, which were reviewed by The New York Times, had nearly 70,000 members. Data from CrowdTangle, a Facebook-owned social network analytics tool, showed that the groups generate thousands of interactions daily.

Facebook said it prohibited the sale of prescription products, including drugs and pharmaceuticals, across its platforms, including in ads. “We remove content that attempts to buy, sell or donate for ivermectin,” Aaron Simpson, a Facebook spokesman, said in an emailed statement. “We also enforce against any account or group that violates our Covid-19 and vaccine policies, including claims that ivermectin is a guaranteed cure or guaranteed prevention, and we don’t allow ads promoting ivermectin as a treatment for Covid-19.”

In some of the ivermectin groups, the administrators — the people in charge of moderating posts and determining settings like whether the group is private or public — gave instructions on how to evade Facebook’s automated content moderation. In a group called Healthcare Heroes for Personal Choice, an administrator instructed people to remove or misspell buzzwords and to avoid using the syringe emoji.

An administrator added, referring to video services like YouTube and BitChute: “If you want to post a video from you boob or bit ch ut e or ru m b l e, hide it in the comments.” Facebook rarely polices the comments sections of posts for misinformation.

Facebook said that it broadly looks at the actions of administrators when determining whether a group breaks the platform’s rules, and that rule-breaking by moderators counts as strikes against the group as a whole.

The groups also funnel members into alternative platforms where content moderation policies are more lax. In a Facebook group with more than 5,000 members called Ivermectin vs. Covid, a member shared a link to join a channel on Telegram, a messaging service, for further discussion of “the latest good news surrounding this miraculous pill.”

“Ivermectin is clearly the answer to solve covid and the world is waking up to this truth,” the user posted.

After The Times contacted Facebook about the Ivermectin vs. Covid group, the social network removed it from the platform.

Virus Misinformation Spikes as Delta Cases Surge

Researchers have recorded a new burst of false and misleading information about the coronavirus after a decline in the spring.

In late July, Andrew Torba, the chief executive of the alternative social network Gab, claimed without evidence that members of the U.S. military who refused to get vaccinated against the coronavirus would face a court-martial. His post on Gab amassed 10,000 likes and shares.

Two weeks earlier, the unfounded claim that at least 45,000 deaths had resulted from Covid-19 vaccines circulated online. Posts with the claim collected nearly 17,000 views on BitChute, an alternative video platform, and at least 120,000 views on the encrypted chat app Telegram, where it was shared mostly in Spanish.

Around the same time, Britain’s chief scientific adviser misstated that 60 percent of patients hospitalized with Covid-19 had been double-vaccinated. He quickly corrected the statement, saying the 60 percent had been unvaccinated. But antivaccine groups online seized on his mistake, translating the quote into French and Italian and sharing it on Facebook, where it collected 142,000 likes and shares.

Coronavirus misinformation has spiked online in recent weeks, misinformation experts say, as people who peddle falsehoods have seized on the surge of cases from the Delta variant to spread new and recycled unsubstantiated narratives.

Mentions of some phrases prone to vaccine misinformation jumped in July to as much as five times the June rate, according to Zignal Labs, which tracks mentions on social media, on cable television and in print and online outlets. Some of the most prevalent falsehoods are that vaccines don’t work (up 437 percent), that they contain microchips (up 156 percent), that people should rely on their “natural immunity” instead of getting vaccinated (up 111 percent) and that the vaccines cause miscarriages (up 75 percent).

Such claims had tailed off in the spring as the number of Covid cases plummeted. Compared with the beginning of the year and with 2020, there was an observable dip in the volume of misinformation in May and June. (Zignal’s research isn’t an accounting of every single piece of misinformation out there, but the spiking of certain topics can be a rough gauge of which themes are most frequently used as vehicles for misinformation.)

The latest burst threatens to stymie efforts to increase vaccination rates and beat back the surge in cases. The vast majority of people testing positive for the virus in recent weeks, and nearly all of those hospitalized with the coronavirus, were unvaccinated. Public health experts, as well as doctors and nurses treating the patients, say misinformation is driving some of the vaccine hesitancy.

Disinformation researchers say the spike shows that efforts by social media platforms to crack down on misinformation about the virus have not succeeded.

“These narratives are so embedded that people can keep on pushing these antivaccine stories with every new variant that’s going to come up,” said Rachel E. Moran, a researcher at the University of Washington who studies online conspiracy theories. “We’re seeing it with Delta, and we’re going to see it with whatever comes next.”

In the past few weeks, the vast majority of the most highly engaged social media posts containing coronavirus misinformation came from people who had risen to prominence by questioning the vaccines over the past year.

In July, the right-wing commentator Candace Owens jumped on the misstatement from Britain’s scientific adviser. “This is shocking!” she wrote. “60% of people being admitted to the hospital with #COVID19 in England have had two doses of a coronavirus vaccine, according to the government’s chief scientific adviser.”

After the scientific adviser, Patrick Vallance, corrected himself, Ms. Owens added the correct information at the bottom of her Facebook post. But the post was liked or shared over 62,000 times — two-thirds of its total interactions — in the three hours before her update, a New York Times analysis found. In all, the rumor collected 142,000 likes and shares on Facebook, most of them coming from Ms. Owens’s post, according to a report by the Virality Project, a consortium of misinformation researchers from outfits like the Stanford Internet Observatory and Graphika.

When reached for comment, Ms. Owens said in an email: “Unfortunately, I’m not interested in The New York Times. The people that follow me don’t take your hit pieces seriously.”

Also in July, Thomas Renz, a lawyer, appeared in a video claiming that 45,000 people had died from coronavirus vaccines. The claim, since debunked, relied on unverified information from the Vaccine Adverse Event Reporting System, a government database. The baseless claim had been included in a lawsuit that Mr. Renz filed on behalf of an anonymous “whistle-blower,” in coordination with America’s Frontline Doctors, a right-wing group that has spread misinformation about the pandemic in the past.

Mr. Renz’s video got more than 19,000 views on BitChute. The unfounded claim was repeated by top Spanish-language Telegram channels, Facebook groups and the conspiracy website Infowars, collecting over 120,000 views across the platforms, according to the Virality Project.

In an email, Mr. Renz said his practice had “performed the due diligence necessary” to believe in the accuracy of the allegations in the lawsuit he had filed. “We actually do not believe that the Biden administration is responsible for this, rather we believe that President Biden, like President Trump before him, was misled by the same group of conflicted bureaucrats,” Mr. Renz said.

On Thursday, Mr. Torba, the Gab chief executive, claimed that he was “getting flooded” with text messages from members of the military who said they would be court-martialed if they refused a coronavirus vaccine. Though military leaders have pushed to vaccinate troops and Defense Secretary Lloyd J. Austin will seek to mandate coronavirus vaccines by September, there is no evidence that the military plans to court-martial troops who do not get vaccinated.

Mr. Torba’s post collected 10,000 likes and shares on Gab, according to data from the Virality Project.
Documents that Mr. Torba pushed on Gab’s news site to help service members request vaccine exemptions, including for religious reasons, also contained misinformation. One of the documents made use of an old antivaccine talking point that aborted fetal cell lines were used in the development of the Covid-19 vaccines, even though Catholic and anti-abortion groups have said the vaccines are “morally acceptable.” The documents reached up to 2.2 million followers on Facebook, according to CrowdTangle data.

“I’m telling the truth,” Mr. Torba said in an email. “Your Facebook-funded ‘fact checkers’ like Graphika are wrong and are the people peddling disinformation here.”

Facebook, which has become more aggressive at enforcing its coronavirus misinformation policy in the past year, remains a popular destination for people discussing the misinformation.

Media Matters for America, a liberal watchdog group, found over 200 public and private Facebook groups, with around 400,000 members, that were dedicated to antivaccine discussion. The groups, which The Times reviewed, added 13,000 members in the last seven days, according to Media Matters.

Many of the most popular posts in the groups did not include explicit falsehoods. One was an image of a Scooby-Doo character unmasking a ghost, with a caption that read, “Let’s see what makes you scarier than all the other variants.” The unmasking revealed the logos of MSNBC and CNN, implying that the cable channels were overstating the severity of the Delta variant.

But like the comments on many of the other pages, those beneath the Scooby-Doo item did contain unfounded claims. They also included calls to violence.

“China is completely to blame,” one comment said. “We’re going to have to fight them eventually, so I advocate a preemptive nuclear strike.”

Facebook said that it removed confirmed violations of its coronavirus misinformation policy from comments, and that it had connected people with authoritative information about the virus.

“We will continue to enforce against any account or group that violates our Covid-19 and vaccine policies,” Aaron Simpson, a Facebook spokesman, said in an email.

Ms. Moran, the researcher, predicted there would be a “natural attention cycle” for this new round of misinformation. “After this spike, like with the original Covid strain, we’ll see it simmer down to normal levels of misinformation for a little while,” she said.

But the coronavirus misinformation will not go away anytime soon, Ms. Moran predicted. “Unfortunately it’s not spikes and troughs, but steady levels of misinformation,” she said.

Jacob Silver
