WASHINGTON (AP) — In March, as claims about the dangers and ineffectiveness of coronavirus vaccines spun across social media and undermined attempts to stop the spread of the virus, some Facebook employees thought they had found a way to help.

By altering how posts about vaccines are ranked in people’s newsfeeds, researchers at the company realized they could curtail the misleading information individuals saw about COVID-19 vaccines and offer users posts from legitimate sources like the World Health Organization.

“Given these results, I’m assuming we’re hoping to launch ASAP,” one Facebook employee wrote in response to the internal memo on the study.

Instead, Facebook shelved some of the study’s suggestions. Other changes weren’t made until April.

A Facebook researcher had suggested in March that comments be disabled on vaccine posts until the platform could do a better job of addressing the anti-vaccine messages in them. That suggestion was ignored.

Critics say the reason Facebook was slow to take action on the ideas is simple: The tech giant worried it might impact the company’s profits.

“Why would you not remove comments? Because engagement is the only thing that matters,” said Imran Ahmed, the CEO of the Center for Countering Digital Hate, an internet watchdog group. “It drives attention and attention equals eyeballs and eyeballs equal ad revenue.”

In an emailed statement, Facebook said it has made “considerable progress” this year with downgrading vaccine misinformation in users’ feeds.

Facebook’s internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

The trove of documents shows that during the COVID-19 pandemic, Facebook carefully investigated how its platform spread misinformation about life-saving vaccines. The documents also show that rank-and-file employees regularly suggested ways to counter the anti-vaccine content. The Wall Street Journal reported on some of Facebook’s efforts to deal with anti-vaccine comments last month.

Facebook’s response raises questions about whether the company prioritized controversy and division over the health of its users.

“These people are selling fear and outrage,” said Roger McNamee, a Silicon Valley venture capitalist and early investor in Facebook who is now a vocal critic. “It is not a fluke. It is a business model.”

Typically, Facebook ranks posts by engagement — the total number of likes, dislikes, comments, and reshares. This ranking system may be useful for simple topics like dog photos or recipes. But Facebook’s own documents show that when it comes to divisive public health issues like vaccines, engagement-based ranking only emphasizes polarization, disagreement, and doubt.
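
The difference can be illustrated with a short sketch. The post fields, scores and sources below are invented for this example — they are not Facebook’s actual ranking code — but they show how sorting a feed by raw engagement can surface the most argued-over post, while reranking by a trustworthiness signal, as the researchers’ experiment did, surfaces authoritative sources instead.

```python
# Illustrative sketch only: these posts, fields and scores are made up
# and do not reflect Facebook's real ranking system.

posts = [
    {"source": "WHO",           "likes": 120, "comments": 40,   "reshares": 15,  "trust": 0.95},
    {"source": "anti-vax page", "likes": 900, "comments": 2500, "reshares": 700, "trust": 0.05},
    {"source": "local news",    "likes": 300, "comments": 90,   "reshares": 60,  "trust": 0.80},
]

def engagement_score(post):
    # Engagement-based ranking: raw popularity decides placement, so a post
    # that provokes thousands of comments floats to the top of the feed.
    return post["likes"] + post["comments"] + post["reshares"]

def trust_score(post):
    # Trust-based reranking: an authoritativeness signal outweighs popularity.
    return post["trust"]

print([p["source"] for p in sorted(posts, key=engagement_score, reverse=True)])
# -> ['anti-vax page', 'local news', 'WHO']

print([p["source"] for p in sorted(posts, key=trust_score, reverse=True)])
# -> ['WHO', 'local news', 'anti-vax page']
```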

To find ways of reducing vaccine misinformation, Facebook researchers changed how vaccine posts are ranked for more than 6,000 users in Mexico, Brazil and the Philippines. Instead of seeing vaccine posts chosen for their popularity, these users were shown posts selected for their trustworthiness.

The results showed a dramatic decrease in content making claims that had been debunked by fact-checkers, and an 8% rise in content from authoritative public health organizations such as the WHO and the U.S. Centers for Disease Control. Those users also saw a 7% decrease in negative interactions on the site.

Employees at the company reacted to the study with exuberance, according to internal exchanges included in the whistleblower’s documents.

“Is there any reason we wouldn’t do this?” one Facebook employee wrote in response to an internal memo outlining how the platform could rein in anti-vaccine content.

Facebook said it did implement many of the study’s findings — but not for another month, a delay that came at a pivotal stage of the global vaccine rollout.

In a statement, company spokeswoman Dani Lever said the internal documents “don’t represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove more harmful COVID and vaccine misinformation.”

The company also said it took time to review the findings and implement the changes.

Yet the need to act urgently couldn’t have been clearer: At that time, states across the U.S. were rolling out vaccines to their most vulnerable — the elderly and sick. Public health authorities were concerned. Just 10% of Americans had received a COVID-19 vaccine, and a third were thinking about skipping the shot entirely, according to a poll from The Associated Press-NORC Center for Public Affairs Research.

Despite this, Facebook employees acknowledged they had “no idea” just how bad anti-vaccine sentiment was in the comments sections on Facebook posts. But company research in February found that as much as 60% of the comments on vaccine posts were anti-vaccine or vaccine-reluctant.

“That’s a huge problem and we need to fix it,” the presentation on March 9 read.

Even worse, company employees admitted they didn’t have a handle on catching those comments. And if they did, Facebook didn’t have a policy in place to take the comments down. The free-for-all allowed users to swarm vaccine posts from news outlets and humanitarian organizations with negative comments about vaccines.

“Our ability to detect (vaccine hesitancy) in comments is bad in English — and basically non-existent elsewhere,” another internal memo posted on March 2 said.

Derek Beres, a fitness instructor and author in Los Angeles, sees anti-vaccine comments thrive every time he encourages vaccinations on his Instagram account, which is owned by Facebook. Beres began hosting a podcast with friends after they noticed conspiracy theories about COVID-19 vaccines swirling on the social media accounts of well-known health and wellness influencers.

Earlier this year, when Beres posted a picture of himself receiving the COVID-19 shot, some on social media told him he would likely drop dead in six months’ time.

“The comments section is a dumpster fire for so many people,” Beres said.

The atmosphere on Facebook had become so hostile toward vaccination that prominent health organizations such as UNICEF and the World Health Organization, which were encouraging people to get the vaccine, would not use the free advertising Facebook had given them, according to the documents.

Some Facebook employees suggested an alternative: disabling comments on vaccine posts altogether while the company worked out a plan to tackle anti-vaccine messages.

“Very interested in your proposal to remove ALL in-line comments for vaccine posts as a stopgap solution until we can sufficiently detect vaccine hesitancy in comments to refine our removal,” one Facebook employee wrote on March 2.

This suggestion was not accepted.

Instead, Facebook CEO Mark Zuckerberg announced on March 15 that the company would begin labeling posts about vaccines, noting that the vaccines are safe.

The move allowed Facebook to continue to get high engagement — and ultimately profit — off anti-vaccine comments, said Ahmed of the Center for Countering Digital Hate.

“They were trying to find ways to not reduce engagement but at the same time make it look like they were trying to make some moves toward cleaning up the problems that they caused,” he said.

It’s unrealistic to expect a multi-billion-dollar company like Facebook to voluntarily change a system that has proven to be so lucrative, said Dan Brahmy, CEO of Cyabra, an Israeli tech firm that analyzes social media networks and disinformation. He said government regulation may be the only thing that forces Facebook to change.

“The reason they didn’t do it is because they didn’t have to,” Brahmy said. “If it hurts the bottom line, it’s undoable.”

Bipartisan legislation in the U.S. Senate would require social media platforms to give users the option of turning off algorithms tech companies use to organize individuals’ newsfeeds.

Sen. John Thune of South Dakota, a sponsor of the bill, asked Haugen, the Facebook whistleblower, to describe the dangers of engagement-based ranking during her testimony before Congress this month.

She said there are other ways of ranking content — for instance, by the quality of the source, or chronologically — that would serve users better. The reason Facebook won’t consider them, she said, is that they would reduce engagement.

“Facebook knows that when they pick out the content … we spend more time on their platform, they make more money,” Haugen said.

Haugen’s leaked documents also reveal that a relatively small number of Facebook’s anti-vaccine users are rewarded with big pageviews under the tech platform’s current ranking system.

Internal Facebook research presented on March 24 warned that most of the “problematic vaccine content” was coming from a handful of areas on the platform. In Facebook communities where vaccine distrust was highest, the report pegged 50% of anti-vaccine pageviews on just 111 — or .016% — of Facebook accounts.

“Top producers are mostly users serially posting (vaccine hesitancy) content to feed,” the research found.

That same day, the Center for Countering Digital Hate published a study of social media posts that found just 12 Facebook users were responsible for 73% of anti-vaccine posts on the platform between February and April. In August, Facebook’s leaders told the public the study was “faulty,” despite the internal research from months earlier confirming that a small number of accounts drive anti-vaccine sentiment.

Earlier this month, an AP-NORC poll found that most Americans blame social media companies, like Facebook, and their users for the spread of misinformation.

But Ahmed said Facebook shouldn’t get to share that blame with its users.

“Facebook has taken decisions which have led to people receiving misinformation which caused them to die,” Ahmed said. “At this point, there should be a murder investigation.”

___

Seitz reported from Columbus, Ohio.

___

See full coverage of “The Facebook Papers” here: https://apnews.com/hub/the-facebook-papers

