An international group of law enforcement agencies is urging Meta not to standardize end-to-end encryption on Facebook Messenger and Instagram, which they say will harm their ability to fight child sexual abuse material (CSAM) online.
The Virtual Global Taskforce was formed in 2003 and is currently chaired by Britain’s National Crime Agency. The VGT consists of 15 law enforcement bodies, including Interpol, the FBI, the Australian Federal Police and other law enforcement agencies from around the world. In its letter [PDF], the VGT said reports from tech industry partners play a key role in fighting CSAM content, with Meta being its leading reporter of abuse material.
But the taskforce thinks that will end if Meta continues its encryption push. “The VGT has not yet seen any indication from META that any new safety systems implemented post-E2EE will effectively match or improve their current detection methods,” the taskforce said.
Meta’s WhatsApp platform has long used E2EE by default, and the company has said for years that it planned to implement E2EE on Facebook Messenger and Instagram, with the most recent estimates indicating it would become the default sometime this year.
As Reg readers know, end-to-end encryption theoretically makes it impossible for an intermediary to read the contents of messages – even if served with a subpoena, a provider could hand over only scrambled ciphertext.
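The principle can be sketched in a few lines. This is a toy illustration only – a one-time-pad XOR, not the real protocol Meta uses – but it shows the key point: the relaying server (or anyone subpoenaing it) sees only ciphertext, because the key lives exclusively on the two endpoints.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy one-time-pad cipher: XOR each message byte with a key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at noon"
key = os.urandom(len(message))        # shared only between the two endpoints

ciphertext = xor_bytes(message, key)  # this is all the server ever stores/relays

# Without the key, the intermediary cannot recover the plaintext;
# an endpoint holding the key decrypts trivially.
decrypted = xor_bytes(ciphertext, key)
assert decrypted == message
```

Real systems such as WhatsApp use the far more sophisticated Signal protocol, but the property is the same: decryption keys never leave the endpoints, so the platform operator has nothing readable to scan or to hand over.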
“The announced implementation of E2EE on META platforms Instagram and Facebook is an example of a purposeful design choice that degrades safety systems and weakens the ability to keep child users safe,” the VGT said.
The organization cited the arrest and conviction of David Wilson in the UK as one example it claimed wouldn’t have been possible with E2EE in place. Wilson, a Facebook user who groomed hundreds of children using fake Facebook and Instagram profiles, was sentenced to 25 years in prison in 2021. Of his conviction, the VGT said “it is highly unlikely this case would have been detected” if E2EE had already been implemented.
“The VGT calls for all industry partners to fully appreciate the impact of implementing system design decisions that result in blindfolding themselves to CSA occurring on their platforms, or reduces their capacity to identify CSA and keep children safe,” the taskforce said.
In an email to The Register, Meta disputed the VGT’s claims that Wilson’s arrest wouldn’t have happened with E2EE in place, telling us it submits CSAM tips using both public and private information.
“We have developed detection systems using behavioral signals and other account activity that are not reliant on the content of private messages to identify malicious actors,” Meta said, adding that “It’s misleading and inaccurate to say that encryption would have prevented us from identifying and reporting accounts like David Wilson’s to the authorities.”
Without going into any details, Meta told us it’s committed to continuing to work with law enforcement as it rolls out E2EE. “We don’t think people want us reading their private messages, so have developed safety measures that prevent, detect and allow us to take action against this heinous abuse, while maintaining online privacy and security,” a Meta spokesperson told The Register.
Earlier this week, the UK’s professional computing body, the BCS, issued its own statement urging the exact opposite of the VGT's: it wants parliament to shoot down the Online Safety Bill, a proposed piece of legislation that would require tech platforms to identify and remove CSAM or face fines.
Under the bill, companies would be required to remove content “whether communicated publicly or privately,” which, as The Register previously pointed out, would mean messages either couldn’t be encrypted at all, or scanning for CSAM would have to occur before encryption. Critics argue this would be tantamount to adding a government-sanctioned back door into encrypted communications, which BCS chief executive Rashik Parmar told us “is exactly what many bad actors want.”
“Building confidence in technology is a global priority in 2023. A bill aimed at keeping us safe online should protect encrypted messaging,” Parmar said.
The VGT said that it wants industry partners “only to implement platform design choices, including E2EE, at scale alongside robust safety systems that maintain or increase child safety.”
How that could be accomplished without also weakening encryption is something the law enforcement agencies have yet to answer. We’ve asked the VGT whether it supports the Online Safety Bill, or whether it would back a different approach, but the taskforce has yet to respond to our email. ®