LONDON/NEW YORK (Reuters) – Since the World Health Organization declared the novel coronavirus an international health emergency in January, Facebook Inc. (FB.O) has removed more than 7 million pieces of content with false claims about the virus that could pose an immediate health risk to people who believe them.
FILE PHOTO: A 3D-printed Facebook logo is seen in front of displayed coronavirus disease (COVID-19) words in this illustration taken March 24, 2020. REUTERS/Dado Ruvic/Illustration
The social media giant, which has long been under fire from lawmakers over how it handles misinformation on its platforms, said it had banned claims such as "social distancing does not work" in recent months because they pose a risk of "imminent harm." Under those rules, Facebook on Wednesday removed a video post by U.S. President Donald Trump in which he claimed children were "almost immune" to COVID-19.
But in most cases, Facebook does not remove misinformation about the new COVID-19 vaccines that are still under development, according to the company's vaccine policy head Jason Hirsch, who argues that such claims do not meet its imminent-harm threshold. Hirsch told Reuters the company is "grappling" with the dilemma of how to police claims about vaccines that are still untested.
“There is a ceiling on how much we can do until the facts on the ground become more concrete,” Hirsch said in an interview with Reuters, speaking publicly for the first time about how the company is trying to address the coronavirus vaccine issue.
Tom Phillips, editor at Full Fact, one of Facebook's fact-checking partners, put the conundrum this way: "How do you fact-check a vaccine that doesn't exist yet?"
Right now, misinformation ranging from baseless claims to elaborate conspiracy theories about the vaccines under development is spreading on a platform with more than 2.6 billion monthly active users, a review of posts by Reuters, Facebook fact-checkers and other researchers found.
The concern, public health experts told Reuters, is that the spread of misinformation on social media could discourage people from eventually taking the vaccine, which is seen as the best chance of curbing a pandemic that has infected millions and killed hundreds of thousands worldwide, including 158,000 people in the United States alone.
At the same time, free speech advocates worry about increased censorship during a time of uncertainty and lasting consequences long after the virus has subsided.
Drawing the line between truth and falsehood is also more complex for the new COVID-19 vaccines, fact-checkers told Reuters, than for content about vaccines with an established safety record.
Facebook representatives said the company has been consulting with about 50 experts in public health, vaccines and free expression on how to shape its response to claims about the new COVID-19 vaccines.
Although the first vaccines are not expected to hit the market for months, polls show that many Americans are already worried about getting a new COVID-19 vaccine, which is being developed at a record rate. About 28% of Americans say they are not interested in getting the vaccine, according to a Reuters / Ipsos poll conducted between July 15-21. Among them, more than 50% said they were nervous about the speed of development. More than a third said they did not trust the people behind the vaccine.
The UK-based nonprofit Center for Countering Digital Hate (CCDH) reported in July that anti-vaccination content is flourishing on social media sites. Facebook groups and pages account for more than half of the total anti-vaccine following across all the social media platforms the CCDH studied.
A public Facebook group called "REJECT CORONA V@X AND SCREW BILL GATES", referring to the billionaire whose foundation is helping to fund vaccine development, was started in April by Michael Schneider, a 42-year-old city contractor in Waukesha, Wisconsin. The group grew to 14,000 members in less than four months. It was one of more than a dozen groups created in recent months and dedicated to opposing the COVID-19 vaccine, and to the idea that it could be mandated by governments, that Reuters found.
Schneider told Reuters he is suspicious of the COVID-19 vaccine because he thinks it is being developed too fast to be safe. "I think a lot of people are resting," he said.
Posts about the COVID-19 vaccine that have been labeled on Facebook as "false information" but not removed include one by Schneider linking to a YouTube video claiming that the COVID-19 vaccine would alter people's DNA, and a post claiming the vaccine would give people the coronavirus. [See Reuters fact check: reut.rs/30t1toW]
Facebook said these posts do not violate its policies on imminent harm. "If we simply removed all conspiracy theories and hoaxes, they would exist elsewhere on the internet and the wider social media ecosystem. It helps to give more context when these hoaxes appear elsewhere," a spokeswoman said.
Facebook does not label or remove posts or ads expressing opposition to vaccines unless they contain false claims. Hirsch said Facebook believes users should be able to express such personal views, and that more aggressive censorship of anti-vaccine views could also push people who are hesitant about vaccines toward the anti-vaccine camp.
"KIND OF ON STEROIDS"
At the heart of Facebook's decisions on what to remove are two considerations, Hirsch said. If a post is judged to be merely false information, it is labeled and Facebook may reduce its reach by limiting how many people see it. It took this approach, for example, with the video Schneider posted suggesting the COVID-19 vaccine could alter human DNA.
If the false information is likely to cause imminent harm, it is removed altogether. Last month, under those rules, the company took down a video touting hydroxychloroquine as a coronavirus cure – albeit only after it had garnered millions of views.
In March 2019, Facebook said it would begin reducing the rankings and search recommendations of groups and pages that spread misinformation about any vaccine. Facebook's algorithms also point to organizations such as the WHO when people search for vaccine information on the platform.
Some public health experts want Facebook to lower its removal threshold when weighing false claims about the upcoming COVID-19 vaccines. "I think there is a duty on platforms to make sure they are removing anything that could lead to harm," said Rupali Limaye, a social scientist at the Johns Hopkins Bloomberg School of Public Health who has been in talks with Facebook. "Because it's such a deadly virus, I think it should not just be 'immediate.'"
But Jacob Mchangama, executive director of a Copenhagen-based think tank who was consulted by Facebook about its approach to vaccines, fears the consequences of mass removals: "This could have long-term consequences for free speech once the virus is, hopefully, contained," he said.
Misinformation about other vaccines has rarely met Facebook's threshold of risking imminent harm.
In Pakistan last year, however, the company intervened to remove false claims about the polio vaccine that were fueling violence against health workers. In the Pacific island nation of Samoa, Facebook deleted vaccine misinformation because low vaccination rates were exacerbating a dangerous measles outbreak.
"When it comes to vaccines, it's not a theoretical line ... we try to determine when imminent harm is likely to result from misinformation, and we try to act in those situations," Hirsch told Reuters.
To combat misinformation that does not meet its removal criteria, Facebook pays fact-checkers – including a Reuters unit – who can rate posts as false and attach an explanation. The company has said that 95 percent of the time, people who saw a fact-checking warning label did not click through to the content. [bit.ly/33z7Jh6]
The fact-checking program, however, has been criticized by some researchers as an inadequate response to the volume and speed of viral misinformation on the platform. Fact-checkers also do not rate politicians' posts, nor do they review posts that appear exclusively in private or hidden groups.
Determining what constitutes a false claim about the COVID-19 shots is far harder than fact-checking an established vaccine with a proven safety record, Facebook fact-checkers told Reuters.
"There's a lot of content we see and we don't even know what to do with it," said Emmanuel Vincent, founder of Science Feedback, another Facebook fact-checking partner, who said the number of vaccines in development made it hard to debunk claims about how a shot would work.
In a study published in the May issue of Nature, a research group led by physicist Neil Johnson found that there were nearly three times as many active anti-vaccination groups on Facebook as pro-vaccination groups during a global measles outbreak from February to October 2019, and that the anti-vaccination groups were growing faster.
Since the study was published, anti-vaccination views and COVID-19 vaccine conspiracy theories have flourished on the platform, Johnson said, adding, "it's kind of on steroids."
Reporting by Elizabeth Culliford and Gabriella Borter; editing by Ross Colvin and Edward Tobin