“Do not accept WhatsApp’s new privacy policy!” warned one viral message.
“In a few months, WhatsApp will launch a new version that shows you ads based on your chats,” said another. “Do not accept the new policy!”
Thousands of similar messages went viral on WhatsApp, the instant messaging app owned by Facebook, in the ensuing days. Encouraged by public figures like Tesla CEO Elon Musk and whistleblower Edward Snowden, millions of people rushed to download WhatsApp alternatives like Signal and Telegram.
There was only one problem: The 4,000-word policy made clear that the new changes applied only to chats between people and businesses, not to private conversations with friends and family.
No, the new terms would not let Facebook read your WhatsApp chats, the company explained to anyone who asked. Top executives posted long threads on Twitter and gave interviews to major publications in India, the company’s largest market. WhatsApp spent millions buying front-page ads in major newspapers and published graphics debunking the rumors on its website, complete with a large “Share to WhatsApp” button, in hopes of injecting some truth into the stream of misinformation running through its platform. The company also encouraged Facebook employees to share these infographics, according to posts on its internal bulletin board, Workplace.
“There has been a lot of misinformation and confusion, so we are working to provide accurate information on how WhatsApp protects people’s personal conversations,” a WhatsApp spokesperson told BuzzFeed News. “We are using our status feature to communicate directly with people within WhatsApp, as well as sharing accurate information on social media and our site in dozens of languages. Of course, we have also made these resources available to people at our company so they can answer questions directly from friends and family if they wish.”
None of it worked.
For years, rumors and hoaxes spread via WhatsApp have given rise to a misinformation crisis in some of the world’s most populous countries such as Brazil and India, where the app is the primary way most people talk to each other. Now this crisis has reached the company itself.
“Trust in platforms is at rock bottom,” Claire Wardle, cofounder and CEO of First Draft, a nonprofit that investigates misinformation, told BuzzFeed News. “We have had many years where people are becoming more and more concerned about the power of technology companies, especially an awareness of how much data they collect about us. So when privacy policies change, people are rightly concerned about what that means.”
Wardle said people are worried that WhatsApp will link their in-app behavior to the data from their Facebook accounts.
“Facebook and WhatsApp have a huge trust deficit,” said Pratik Sinha, founder of Alt News, a fact-checking platform in India. “Once you have it, any misinformation attributed to you is easily consumed.”
What does not help, both Sinha and Wardle added, is the lack of understanding among ordinary people of how technology and privacy work. “Confusion is where misinformation thrives,” Wardle said, “so people saw the policy changes, jumped to conclusions, and not surprisingly, many people believed the rumor.”
These patterns of misinformation, which have thrived on WhatsApp for years, have often led to harm. In 2013, a video that allegedly showed two young men being lynched went viral in Muzaffarnagar, a city in northern India, prompting riots between Hindu and Muslim communities in which dozens of people died. A police investigation revealed that the video was over two years old and had not even been recorded in India. In Brazil, fake news flooded the platform and was used to favor the far-right candidate Jair Bolsonaro, who won the country’s 2018 presidential election.
But the company did not take its misinformation problem seriously until 2018, when rumors of child kidnappers sweeping through the platform led to a series of violent lynchings across India. In a statement released at the time, India’s IT ministry warned WhatsApp of legal action, saying the company would be treated as an “abettor” if it did not resolve the issue. The warning sent WhatsApp into crisis mode: It flew top executives from its Menlo Park, California, headquarters to New Delhi to meet with officials and journalists, and ran high-profile public awareness campaigns about misinformation.
It also built new features into the app to directly counter misinformation for the first time, such as labeling forwarded messages and limiting the number of people or groups a piece of content could be forwarded to in order to curb viral content. Last August, it also started letting people in a handful of countries search the web for the text of a forwarded message to check whether it was fake. That feature is not yet available to WhatsApp users in India.
In 2019, the company also announced it was working on a tool that would let users search the web for images they received in the app with a single tap, a step meant to make checking them easier. But almost two years later, there is no sign of the feature, even though the text-based version is available in over a dozen countries, a list that so far does not include India.
“We’re still working on the search feature,” a WhatsApp spokesperson told BuzzFeed News.
This week, the company put a status message, WhatsApp’s equivalent of a Facebook story, at the top of people’s status sections. Tapping it revealed a series of messages from the company debunking the rumors.
“WhatsApp does not share your contacts with Facebook,” said the first. Two additional status updates clarified that WhatsApp cannot see people’s location and cannot read or listen to encrypted personal conversations. “We are committed to your privacy,” the last message said.
On Thursday, employees had several questions for Facebook CEO Mark Zuckerberg ahead of a weekly Q&A, according to internal communications seen by BuzzFeed News. Some wanted to know whether the growing exodus to Signal and Telegram had affected WhatsApp’s usage and growth metrics. Others wanted the CEO to clarify whether Facebook uses WhatsApp metadata to show ads.
“Public is furious @ WhatsApp PrivPolicy is changing,” one employee commented. “Distrust of FB is so high that we should be more careful about this.”
Ryan Mac contributed reporting.