CHANGE IS COMING. Today, more than 1 billion people use the Facebook-owned messaging app WhatsApp to share messages, photos, and videos. With the tap of a button, they can forward a funny meme or send a party invite to groups of friends and family. They can also easily share “fake news,” rumors and propaganda disguised as legitimate information.
In India — the nation where people forward more WhatsApp content than anywhere else — WhatsApp-spread fake news is inciting mob violence and literally getting people killed. On Thursday, WhatsApp announced in a blog post that it plans to make several changes in an effort to prevent more violence.
Some of the changes will only apply to users in India. They will no longer see the “quick forward” button next to photos and videos, which made it particularly easy to pass that content along without any information about where it came from. They’ll also no longer be able to forward content to more than five chats at a time. In the rest of the world, the new limit for forwards will be 20 chats. The previous cap was 250.
THE ELEPHANT IN THE ROOM. Over the past two months, violent mobs have attacked two dozen people in India after WhatsApp users spread rumors that those people had abducted children. Some of those people even died from their injuries.
The Indian government has been pressuring WhatsApp to do something to address these recent bouts of violence; earlier on Thursday, India’s Ministry of Electronics and Information Technology threatened the company with legal action if it didn’t figure out some effective way to stop the mob violence.
The WhatsApp team, however, never mentions that violence is the reason for the changes in its blog post, simply asserting that the goal of the control changes is to maintain the app’s “feeling of intimacy” and “keep WhatsApp the way it was designed to be: a private messaging app.”
TRY, TRY AGAIN. This is WhatsApp’s third attempt in the last few weeks to address the spread of fake news in India. First, the company added a new label to the app to indicate that a message is a forward (and not original content from the sender). Then, it published full-page ads in Indian newspapers to educate the public on how to spot fake news.
Neither of those efforts appears to have worked, and it’s hard to believe the latest move will have the intended impact either. Each WhatsApp chat can include up to 256 people, so a message forwarded to five chats (per the new limit) could still reach 1,280 people. If each of those 1,280 people then forwards the message to five more chats, it’s not hard to see how fake news could still spread like wildfire across the nation.
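The back-of-envelope math above can be sketched as a quick calculation. This is not from WhatsApp or the article’s author; it assumes the worst case, where every chat has the maximum 256 members, no member appears in more than one chat, and every recipient forwards onward once per “hop”:

```python
# Worst-case reach of a forwarded message under the new five-chat limit.
# Assumptions (hypothetical upper bound, not real usage data): every chat
# is at its 256-member maximum, memberships never overlap, and each
# recipient forwards the message to five more chats at every hop.
MAX_CHAT_SIZE = 256
FORWARD_LIMIT = 5

def potential_reach(hops: int) -> int:
    """Upper bound on new recipients after `hops` rounds of forwarding."""
    senders = 1
    for _ in range(hops):
        # Each sender forwards to FORWARD_LIMIT chats of MAX_CHAT_SIZE people.
        senders = senders * FORWARD_LIMIT * MAX_CHAT_SIZE
    return senders

print(potential_reach(1))  # 1280 — the article's figure for one forward
print(potential_reach(2))  # 1638400 — after just two hops
```

Even with the cap, the ceiling grows by a factor of 1,280 per hop, which is why a per-message forward limit alone does little to slow viral spread.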
READ MORE: WhatsApp Launches New Controls After Widespread App-Fueled Mob Violence in India [The Washington Post]
More on fake news: Massive Study of Fake News May Reveal Why It Spreads so Easily
The post WhatsApp Updates Controls in India in an Effort to Thwart Mob Violence appeared first on Futurism.