Encryption Not Compromised, Only Abuse Reports Inspected: WhatsApp
WhatsApp’s director of communications, Carl Woog, acknowledged that teams of contractors in Austin and elsewhere review WhatsApp messages to identify and remove abusers.

Update: A WhatsApp spokesperson has clarified to News18 that WhatsApp inspects user messages only when a co-user reports a particular message as abusive. Such a report forwards the message to WhatsApp, allowing it to check for abusers and spammers on the app. The spokesperson claims that its end-to-end encryption of messages is not broken as a result, as only voluntary reports of abuse are forwarded to the company, under exceptional circumstances. The source report by ProPublica has also reflected WhatsApp’s stance on the matter. The original report continues below.

Social media giant Facebook touts WhatsApp as a safe messaging platform where users’ chats are end-to-end encrypted. A recent report has now found that WhatsApp may allow content moderators to access users’ messages in certain cases. According to a report in ProPublica, more than 1,000 contract workers operate in office buildings in Austin, Texas, Dublin, and Singapore. These hourly workers, according to the report, can view users’ messages, images, and videos only when the recipient hits the report button to flag the message to WhatsApp.

The report in ProPublica says that this message review is one element of a broader monitoring operation in which the company also reviews material that is not encrypted, including data about the sender and their account. A 49-slide internal marketing presentation from December 2020 accessed by ProPublica emphasizes the “fierce” promotion of WhatsApp’s “privacy narrative.” It compares the brand character to “the Immigrant Mother.” The marketing material does not mention the company’s content moderation efforts.

WhatsApp’s director of communications, Carl Woog, acknowledged that teams of contractors in Austin and elsewhere review WhatsApp messages to identify and remove abusers. However, he told the publication that Facebook does not consider this work to be content moderation. “The decisions we make around how we build our app are focused around the privacy of our users, maintaining a high degree of reliability and preventing abuse,” Woog was quoted in the report as saying.

A ProPublica investigation that draws on data, documents, and dozens of interviews reveals how WhatsApp’s security has been compromised since Facebook’s purchase of the platform in 2014.

