TL;DR
- A new report details the inner workings of WhatsApp's content review system.
- According to the report, WhatsApp hires contractors to review message content, despite the company's claims that its employees cannot read users' messages.
- WhatsApp maintains that reviewers can only read messages that users report to the company.
According to a lengthy new report, WhatsApp's privacy promises may not be as airtight as users expect. ProPublica detailed the inner workings of the company's moderation system, which shows that WhatsApp contractors can read messages sent between users under certain circumstances.
According to the report, WhatsApp employs at least 1,000 contractors who use "special Facebook software" to scan content flagged by the company's machine learning systems or reported by users. That content ranges from child abuse material to spam, terrorist activity, and more.
WhatsApp often points out that, thanks to end-to-end encryption, only senders and recipients can see their chats. The feature first appeared on the platform in 2016 and has been a key marketing point for the Facebook-owned service ever since. The existence of a content review system, however, arguably runs counter to the company's privacy messaging.
WhatsApp’s content review system
However, WhatsApp has good reasons to implement a message reporting and review system. It told ProPublica that the process allows the company to ban abusive and harmful users from the platform. It also notes that users must initiate the reporting process themselves. When a user files a report, only the offending content and the four previous messages in the thread are sent to WhatsApp in "unscrambled" form. While moderators can see those messages, they do not have access to a user's entire chat history, and neither does the machine learning system. Reviewers can dismiss a report, ban the reported user's account, or place the account on a "watch" list.
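To make that flow concrete, here is a minimal illustrative sketch in Python of what a report payload might bundle, based only on the details described above. The class, field, and action names are assumptions made for illustration; they do not reflect WhatsApp's actual code or internal APIs.

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch only: names and structure are assumptions,
# not WhatsApp's actual implementation.

@dataclass
class ReportPayload:
    """What a user-initiated report might bundle, per the report:
    the offending message plus up to four preceding messages in the
    thread, delivered to reviewers in "unscrambled" (decrypted) form."""
    reporter_id: str
    reported_user_id: str
    offending_message: str          # decrypted text of the flagged message
    preceding_messages: List[str]   # up to four prior messages for context

def build_report(reporter_id: str, reported_user_id: str,
                 thread: List[str], flagged_index: int) -> ReportPayload:
    """Collect the flagged message and up to four messages before it."""
    start = max(0, flagged_index - 4)
    return ReportPayload(
        reporter_id=reporter_id,
        reported_user_id=reported_user_id,
        offending_message=thread[flagged_index],
        preceding_messages=thread[start:flagged_index],
    )

# Possible reviewer outcomes described in the report:
REVIEWER_ACTIONS = ("dismiss", "ban_account", "add_to_watch_list")
```

The point of the sketch is simply that the reviewer only ever receives this small, user-initiated slice of a conversation, not the full chat history.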
Some unencrypted information can also be scanned, however. According to the report, unencrypted data from accounts placed on the watch list can be compared against patterns of suspicious behavior. That information ranges from group details and phone numbers to status messages, unique mobile IDs, and even battery level or signal strength.
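As a rough illustration of the kind of unencrypted signals the report lists, the following sketch shows such metadata being checked against a toy rule. The field names and the matching logic are assumptions for illustration only, not a description of how WhatsApp's system works.

```python
from dataclasses import dataclass
from typing import List, Set

# Illustrative sketch only: field names and the example rule are
# assumptions; this is not WhatsApp's actual scanning system.

@dataclass
class AccountMetadata:
    """Unencrypted signals the report says can be scanned for
    accounts on the watch list."""
    phone_number: str
    group_names: List[str]
    status_message: str
    device_id: str        # unique mobile ID
    battery_level: float  # 0.0 to 1.0
    signal_strength: int  # e.g. in dBm

def looks_suspicious(meta: AccountMetadata, flagged_groups: Set[str]) -> bool:
    """Toy rule: flag accounts that belong to any already-flagged group.
    A real system would combine many such signals."""
    return any(name in flagged_groups for name in meta.group_names)
```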
It is understandable that a chat platform would want a review and reporting system so users can flag abuse, but WhatsApp's lack of clarity about that system may be the bigger problem here. In a statement to ProPublica, Facebook said it does not believe the content review system is an issue for users. "Based on the feedback we have received from users, we believe people understand that when they report to WhatsApp, we will receive what they send us," it noted.
Nevertheless, the report may still be a blow to WhatsApp's privacy image, especially in the context of its divisive privacy policy changes. The company announced changes in January that allow some data to be shared with Facebook, and it has since adjusted its rollout plans. WhatsApp was also fined US$267 million for violating EU privacy laws.