Sun Sep 12 2021
Facebook has consistently claimed that WhatsApp’s end-to-end encryption means no one but the sender and recipient can read the messages.
WhatsApp’s much-vaunted end-to-end encryption, and the unparalleled privacy it supposedly offers its users, may not actually be as private as Facebook has been telling everyone, at least according to a report by ProPublica.
According to the investigation, Facebook’s moderation contractor, Accenture, which employs over 1,000 moderators in offices around the globe (in locations as diverse as Texas, Dublin and Singapore), is paid to sift through users’ private messages on the platform after WhatsApp’s algorithm flags them or a user reports them.
Any message flagged (either manually or through the algorithm) as spam, blackmail, hate speech, disinformation, a potential terrorist threat or ‘sexually oriented business’ is sent to a moderator for review.
Based on what they see, moderators can then choose to block the account, place it on ‘watch’ for heightened vigilance, or leave it alone.
Once a message reaches an Accenture employee for moderation, they can see both the reported message and the previous five messages in the conversation for context.
WhatsApp moderators told ProPublica that the platform’s machine learning algorithm often misidentifies content as needing moderation… for instance, images of children in baths sent between parents are frequently flagged as possibly abusive.
If ProPublica’s findings are to be believed, they directly contradict both WhatsApp’s and parent company Facebook’s claims that they both don’t, and in fact can’t, see end-to-end encrypted messages between users.
In fact, back in 2018, when the US started probing Facebook’s activities in some depth, CEO Mark Zuckerberg went on record before the US Senate saying: "we don't see any of the content in WhatsApp, it's fully encrypted."
Facebook has responded to ProPublica’s investigation, telling another news outlet that all WhatsApp messages are end-to-end encrypted and that the ProPublica report was based on a misunderstanding.
They said that WhatsApp allows users to report abusive messages to moderators, who can then view the reports… not the original messages. They didn’t, however, clarify whether those reports included message logs, or how a moderator was supposed to ‘moderate’ without the context of the messages being reported.
WhatsApp’s statement in full reads: "WhatsApp provides a way for people to report spam or abuse, which includes sharing the most recent messages in a chat. This feature is important for preventing the worst abuse on the internet. We strongly disagree with the notion that accepting reports a user chooses to send us is incompatible with end-to-end encryption. We build WhatsApp in a manner that limits the data we collect while providing us tools to prevent spam, investigate threats, and ban those engaged in abuse, including based on user reports we receive. This work takes extraordinary effort from security experts and a valued trust and safety team that works tirelessly to help provide the world with private communication."
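The mechanism at the heart of this dispute can be sketched in a few lines. The sketch below is purely illustrative and hypothetical (the names `Message`, `AbuseReport` and `build_report` are not WhatsApp’s actual code): the key point it demonstrates is that a user-initiated report forwards messages that are already decrypted on the reporter’s own device, one of the conversation’s endpoints, which is why WhatsApp argues the feature does not break end-to-end encryption.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Message:
    sender: str
    plaintext: str  # already decrypted on the reporting user's device

@dataclass
class AbuseReport:
    reporter: str
    reported_account: str
    messages: List[Message]  # recent messages forwarded for review

def build_report(chat_history: List[Message], reporter: str,
                 reported_account: str, context_size: int = 5) -> AbuseReport:
    """Bundle the most recent messages from the reporter's device.

    Because the reporting client is an endpoint of the end-to-end
    encrypted conversation, it already holds the plaintext; forwarding
    these messages to moderators requires no decryption in transit.
    """
    return AbuseReport(
        reporter=reporter,
        reported_account=reported_account,
        messages=chat_history[-context_size:],
    )

# Example: the reporter forwards the five most recent messages for review.
history = [Message("alice", f"msg {i}") for i in range(10)]
report = build_report(history, reporter="bob", reported_account="alice")
print(len(report.messages))  # 5
```

On this reading, the encryption itself stays intact end to end; what ProPublica’s report highlights is that one endpoint (the reporting user) can voluntarily hand plaintext to moderators.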