The Online Safety Bill and the Power to Scan Encrypted Messages

The Online Safety Bill, which has been discussed more extensively in other Corporate Law Journal articles, primarily focuses on improving online safety for children. Following its third reading in the House of Lords on 6 September 2023, the Bill has reached its final stages and is expected to enter into English law in autumn 2023.


Controversy & the Online Safety Bill

One of the methods outlined to achieve its goals is the imposition of stricter rules on social media platforms, making it more difficult for children to sign up for these sites. Another, more controversial, element of the Bill is the power to order an inspection of encrypted, private messages. Michelle Donelan, Secretary of State for Science, Innovation and Technology, has recently defended this section of the Bill, emphasising its potential to mitigate child abuse online. Despite these aims, large technology companies have voiced strong opposition to this section of the Bill.


Meta, the owner of WhatsApp, has been particularly vocal on this issue. WhatsApp markets itself on privacy, highlighting its end-to-end encryption guarantee. In response to this invasive policy, WhatsApp has declared that the company would prefer to withdraw its service from the UK rather than encroach on the private messages of its users.


OFCOM’s Powers

Under the Bill, OFCOM will be able to require technology companies to use “accredited technology” to scan encrypted messages for the purpose of detecting abusive content.


The proposed limitations to OFCOM’s powers include a requirement that an expert produce a written report (as proposed by Lord Parkinson in July). Such reports may discuss the potential effect on freedom of expression and privacy, as well as alternatives to the use of the outlined technology. However, the current iteration of the Bill merely requires OFCOM to take these reports into account in its decision-making. The reports are therefore non-binding and lack legal authority.


During its most recent hearing, the House further restricted OFCOM’s powers under the Bill. It acknowledged that, at present, technology that meets the required privacy and accuracy standards does not exist. This appears to temporarily prevent the use of the powers to scan end-to-end encrypted communications until it becomes “technically feasible… (to only detect) child sexual abuse and exploitation content”.


Justifying the Bill

The purpose behind the controversial clauses relates to uncovering and erasing child abuse material that is circulated through encrypted messaging services. According to the National Society for the Prevention of Cruelty to Children (NSPCC), the number of reported cases of online grooming has increased drastically since 2017, with 34,000 cases documented by UK police. Of this total, 6,350 were cases concerning sexual communication with a child. These alarming statistics indicate that legislative intervention is required to protect children online.


The legislators are faced with a complex balancing act: they are attempting to increase the protection of young and vulnerable people online without penalising large technology companies through sizeable obligations. Millions of UK citizens currently rely on messaging services, in particular WhatsApp, so the withdrawal of these services from the UK would be highly damaging, regardless of the legislators’ intent.


Clarifications and changing sentiments

Although the Bill remains controversial, some technology companies have welcomed the most recent clarifications, which are likely to postpone the use of OFCOM’s powers due to the lack of adequate technology. Following the statements of the House of Lords, Meredith Whittaker, the president of Signal, described the guidelines for the implementation of the new law as a “victory” for technology companies. For now, WhatsApp remains more sceptical.


Conclusion

Technology companies have long disapproved of the Bill due to its potential to cause severe harm to freedom of expression and privacy. Moreover, they argue that monitoring illegal content in this manner is unrealistic due to the absence of a universal list of illegal materials. The level of privacy enjoyed by users would fluctuate globally despite the legislation, and this would be further complicated by international communication between users.


Tech companies frame the dilemma in binary terms, implying a choice between privacy and child safety. The UK government has emphasised that this is not the case and that both can be ensured. Notably, satisfying this balance will require any interference with encrypted messages to be proportionate and justifiable in the given circumstances.

By Alexander McLean