The investigation raises new questions about WhatsApp’s privacy and encryption


TL;DR

  • A new report sheds light on the inner workings of WhatsApp’s content review system.
  • The report suggests that, despite claims that no one at the company can read messages, WhatsApp employs contractors to review content.
  • WhatsApp says its reviewers can only read messages that have been reported to the company.

WhatsApp’s privacy-centric claims may not be as watertight as users might expect, according to an in-depth new report. ProPublica revealed the inner workings of the company’s moderation system, suggesting that WhatsApp contractors may, under certain circumstances, be able to read messages sent between users.

According to the report, WhatsApp employs at least 1,000 contractors who use “special Facebook software” to scan content flagged by the company’s machine learning system or reported by users. This content ranges from child abuse material to spam, terrorist activity, and beyond.
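To illustrate the pipeline the report describes, here is a rough Python sketch of the two intake paths, machine learning flags and user reports, feeding a human review queue. The function and field names are assumptions for illustration, not Facebook’s actual software.

    from queue import Queue

    # Hypothetical review queue fed by the two intake paths the report
    # describes: machine learning flags and user reports.
    review_queue = Queue()

    def flag_by_ml(content, score, threshold=0.9):
        """Enqueue content whose (hypothetical) ML abuse score crosses a threshold."""
        if score >= threshold:
            review_queue.put({"content": content, "source": "ml_flag", "score": score})

    def report_by_user(content, reporter_id):
        """Enqueue content that a user explicitly reported."""
        review_queue.put({"content": content, "source": "user_report", "reporter": reporter_id})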

WhatsApp regularly notes that, thanks to end-to-end encryption, which debuted on the platform in 2016, only senders and recipients can see their chats. Encryption has since been an important marketing point for the Facebook-owned service. The existence of a content review system, however, arguably contradicts the company’s privacy claims.

WhatsApp content review system

However, WhatsApp has good reason to implement a message reporting and review system. The company told ProPublica that the process allows it to ban abusive and malicious users from the platform, and it stressed that users must initiate a report themselves. When a message is reported, only the offending message and the four previous messages in that thread are sent to WhatsApp in decrypted form. Moderators can see those messages, but they do not gain access to the user’s entire chat library, nor can the machine learning system read it. Reviewers can dismiss the report, ban the reported user’s account, or place it on a “watch list”.
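To make the mechanics concrete, here is a minimal Python sketch of what such a client-side report payload could look like. Everything here is hypothetical, reconstructed from ProPublica’s description; the names are not WhatsApp’s actual code.

    from dataclasses import dataclass

    @dataclass
    class Message:
        sender_id: str
        plaintext: str  # already decrypted on the reporting user's own device

    def build_report_payload(thread, flagged_index):
        """Collect the flagged message plus up to four preceding messages.

        Illustrative only: the reporting client already holds the decrypted
        thread, so it can forward this small window to WhatsApp without
        exposing the rest of the chat history.
        """
        start = max(0, flagged_index - 4)
        return thread[start:flagged_index + 1]

    # Possible reviewer outcomes, per the report: dismiss the report, ban
    # the account, or place it on a "watch list".
    REVIEW_OUTCOMES = ("dismiss", "ban", "watch_list")

The important design point is that the payload is assembled on the reporter’s device, which already holds the messages in plaintext, so forwarding them to WhatsApp does not require breaking end-to-end encryption in transit.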

Some unencrypted information can also be scanned, though. According to the report, unencrypted metadata from accounts placed on a “proactive” list can be compared against patterns of suspicious behavior. That information ranges from the details of a user’s groups and their phone number to their status message, unique mobile device ID, battery level, and signal strength.
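For a concrete sense of what that metadata might look like, here is an illustrative Python record; the field names and values are assumptions based only on the report’s description, not WhatsApp’s actual schema.

    # Unencrypted account metadata the report says can be examined for
    # accounts on a "proactive" list. All values here are made up.
    proactive_account_record = {
        "phone_number": "+15551234567",
        "groups": ["group_abc", "group_xyz"],  # details of the user's groups
        "status_message": "Hey there! I am using WhatsApp.",
        "device_id": "a1b2c3d4",               # unique mobile device identifier
        "battery_level": 0.62,                 # fraction of charge remaining
        "signal_strength_dbm": -75,            # cellular signal strength
    }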

See also: Here is everything you need to know about encryption

It is understandable that a chat platform would want a review and reporting system so that users can flag abuse; perhaps WhatsApp’s bigger problem is its lack of clarity about how that system works. In a statement to ProPublica, Facebook maintained that the content review system is not a concern for users. “Based on the feedback we’ve received from users, we’re confident that when people report to WhatsApp, we’re receiving the content they’re sending us,” it said.

Still, the report is likely another blow to WhatsApp’s privacy optics, especially given the company’s divisive privacy policy changes. Announced in January, the changes would allow some user data to be shared with Facebook, though WhatsApp has since revised its rollout plans. WhatsApp has also been fined $267 million for violating data protection laws in the EU.
