Signal CEO: We “1,000% won’t participate” in UK law to weaken encryption

February 24, 2023
Signal app on a phone. Credit: Getty Images

The nonprofit responsible for the Signal messenger app is prepared to exit the UK if the country requires providers of encrypted communications to alter their products to ensure user messages are free of material that’s harmful to children.

“We would absolutely exit any country if the choice were between remaining in the country and undermining the strict privacy promises we make to the people who rely on us,” Signal CEO Meredith Whittaker told Ars. “The UK is no exception.”

Whittaker’s comments came as the UK Parliament is in the process of drafting legislation known as the Online Safety Bill. The bill, introduced under former Prime Minister Boris Johnson, is a sweeping piece of legislation that requires virtually any service hosting user-generated content to block child sexual abuse material, often abbreviated as CSAM or CSA. Providers must also ensure that any legal content accessible to minors, including material on self-harm, is age appropriate.

E2EE in the crosshairs

Provisions in the bill specifically take aim at end-to-end encryption, a form of encryption that allows only the senders and recipients of a message to access the human-readable form of the content. Typically abbreviated as E2EE, it uses a mechanism that prevents even the service provider from decrypting messages. Robust E2EE that’s enabled by default is Signal’s top selling point to its more than 100 million users. Other services offering E2EE include Apple’s iMessage, WhatsApp, Telegram, and Meta’s Messenger, although not all of them provide it by default.
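
In practical terms, E2EE means the messaging server only ever relays ciphertext; plaintext exists only on the users’ devices. Signal’s actual protocol involves considerably more machinery (key ratcheting, forward secrecy, and so on), but the core property can be sketched with an off-the-shelf public-key library. The snippet below is a minimal illustration using PyNaCl, not Signal’s implementation; the key names and message are hypothetical.

```python
# Minimal sketch of the E2EE property, using PyNaCl (libsodium bindings).
# This is NOT the Signal Protocol, just an illustration that a relaying
# server which only sees ciphertext cannot recover the plaintext.
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device; private keys never
# leave the device.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their own private key and the recipient's
# public key.
ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"hello")

# The service provider forwards the ciphertext but holds neither private
# key, so it cannot decrypt. Only the recipient can:
plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
assert plaintext == b"hello"
```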

Under one provision of the Online Safety Bill, a service provider runs afoul of the law when it provides information that is “encrypted such that it is not possible for [UK telecommunications regulator] Ofcom to understand it, or produces a document which is encrypted such that it is not possible for Ofcom to understand the information it contains,” and the intention is to prevent the British watchdog agency from understanding that information.

An impact assessment drafted by the UK’s Department for Digital, Culture, Media & Sport explicitly says that E2EE is within the scope of the legislation. One section of the assessment states:

The Government is supportive of strong encryption to protect user privacy, however, there are concerns that a move to end-to-end encrypted systems, when public safety issues are not taken into account, is eroding a number of existing online safety methodologies. This could have significant consequences for tech companies’ ability to tackle grooming, sharing of CSA material, and other harmful or illegal behaviours on their platforms. Companies will need to regularly assess the risk of harm on their services, including the risks around end-to-end encryption. They would also need to assess the risks ahead of any significant design changes such as a move to end-to-end encryption. Service providers will then need to take reasonably practicable steps to mitigate the risks they identify.

The bill doesn’t provide a specific way for providers of E2EE services to comply. Instead, it funds five organizations to develop “innovative ways in which sexually explicit images or videos of children can be detected and addressed within end-to-end encrypted environments, while ensuring user privacy is respected.”
