This article points out that Facebook’s planned content moderation scheme will result in an encryption backdoor into WhatsApp:
In Facebook’s vision, the actual end-to-end encryption client itself such as WhatsApp will include embedded content moderation and blacklist filtering algorithms. These algorithms will be continually updated from a central cloud service, but will run locally on the user’s device, scanning each cleartext message before it is sent and each encrypted message after it is decrypted.
The company even noted that when it detects violations it will need to quietly stream a copy of the formerly encrypted content back to its central servers to analyze further, even if the user objects, acting as a true wiretapping service.
Facebook’s model entirely bypasses the encryption debate: it globalizes the current practice of compromising devices by building encryption bypasses directly into the communications clients themselves, deploying what amounts to machine-based wiretaps to billions of users at once.
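The mechanism described above can be sketched in a few lines. This is a hypothetical illustration, not Facebook's actual design: the names (`BlocklistFilter`, `send_message`, the hash-matching rule format, and the `report` callback) are all assumptions made for clarity. The point it demonstrates is that the scan runs on the cleartext before encryption, so the end-to-end encryption itself is never "broken" on the wire.

```python
# Hypothetical sketch of client-side scanning as described in the quoted
# article. All names and the hash-based rule format are illustrative
# assumptions, not any real WhatsApp or Facebook API.

import hashlib


class BlocklistFilter:
    """Local filter whose rules are silently refreshed from a central server."""

    def __init__(self, banned_hashes):
        self.banned_hashes = set(banned_hashes)

    def update_from_cloud(self, new_hashes):
        # In the described scheme, updated rules arrive from the vendor's
        # cloud service without user involvement.
        self.banned_hashes |= set(new_hashes)

    def matches(self, plaintext: bytes) -> bool:
        return hashlib.sha256(plaintext).hexdigest() in self.banned_hashes


def send_message(plaintext: bytes, filt: BlocklistFilter, encrypt, report):
    # The scan happens on the cleartext, *before* end-to-end encryption.
    # The wire still carries ciphertext, but the check has already run.
    if filt.matches(plaintext):
        report(plaintext)  # a copy streamed back to the vendor, per the article
    return encrypt(plaintext)
```

Note that nothing in this sketch touches the encryption algorithm itself; adding a new filter is just another `update_from_cloud` call, which is what makes the scheme attractive as a wiretapping mechanism.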
Once this is in place, it’s easy for the government to demand that Facebook add another filter — one that searches for communications that they care about — and alert them when it gets triggered.
Of course, alternatives like Signal will exist for those who don’t want to be subject to Facebook’s content moderation, but what happens when this filtering technology is built into operating systems?
The problem is that if Facebook’s model succeeds, it will only be a matter of time before device manufacturers and mobile operating system developers embed similar scanning tools directly into devices themselves, making them impossible to escape. Content scanning at the operating system level would reach every app on the phone, including ones like Signal, effectively ending the era of encrypted communications.
I don’t think this will happen (why would AT&T care about content moderation?), but it is something to watch.