While Facebook claims to have made “considerable progress” with downgrading misinformation regarding COVID vaccines in users’ feeds, the internal company documents obtained by a former Facebook product manager turned whistleblower raise pertinent questions about the platform’s business model and ethics.
By Jisha Krishnan
“Honestly, I wasn’t surprised. It has been an open secret,” says a self-confessed Facebook addict and techie based in Bengaluru, as we discuss ‘The Facebook Papers’. Thanks to the powerful collaboration among journalists from across the globe, the world is now privy to how the social network has driven the spread of misinformation. Facebook is accused of prioritising profits over public health and safety.
The allegations, based on the internal company documents obtained by Frances Haugen – the former Facebook product manager-turned-whistleblower who testified before a U.S. Senate subcommittee earlier this month – are grave. The most damning evidence, perhaps, is that the tech giant was hiding the findings of its own research from investors and the public.
Blame the algorithm
In March 2021, the documents show, researchers at Facebook found a way to curtail the growing misinformation about COVID-19 vaccines. Instead of ranking posts by engagement – the number of likes, dislikes, comments, and reshares – employees believed that by prioritising information from credible sources like the World Health Organization (WHO), the company could curb the spread of misinformation.
When Facebook researchers changed how posts were ranked (based on trustworthiness) for more than 6,000 users in the U.S., Mexico, Brazil, and the Philippines, they found a 12 per cent decrease in content that made claims debunked by fact-checkers. There was also an eight per cent increase in content from sources such as the WHO and U.S. Centers for Disease Control. What’s more, those users also had a seven per cent decrease in negative interactions on the site.
Yet Facebook took a month to incorporate some of the suggestions from the study, while the remaining recommendations were put on hold. The proposal to disable comments on vaccine posts until the company figured out a way to tackle anti-vaccine messages didn’t pass muster at the time. This was despite company research finding that almost 60 per cent of the comments on vaccine posts were anti-vaccine or vaccine reluctant.
The fundamental question to be asked is whether Facebook was reluctant to act, as the critics claim, because it could impact user engagement and put a dent in the company’s profits. Was facilitating polarisation and vaccine hesitancy more profitable? Was the company merely paying lip service to the idea of fighting vaccine-related misinformation?
While Facebook maintains that it has made “considerable progress” with downgrading misinformation regarding COVID vaccines in users’ feeds, ‘The Facebook Papers’ raise pertinent questions about the platform’s business model and ethics. According to the internal company documents, some of Facebook’s anti-vaccine users were rewarded with big page views under the platform’s prevailing ranking system.
“A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us,” the company tweeted earlier this month. “The truth is we’ve invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook,” it said in a recent statement.
Facebook CEO Mark Zuckerberg has famously said, “The question I ask myself like almost every day is, ‘Am I doing the most important thing I could be doing?’” Right now, the overwhelming response to that question for the founder of the world’s largest social network seems far from satisfactory.