European Commission Launches Investigation Into TikTok’s Addictive Architecture

Introduction

ByteDance’s global phenomenon TikTok is currently under scrutiny from the European Commission. Thierry Breton, the EU’s industry chief, announced the investigation on 19 February 2024 after reviewing TikTok’s recently published risk assessment report. Taking to X (formerly known as Twitter), Breton explained that action had been taken due to a ‘suspected breach of transparency and obligations to protect minors: addictive design and screen-time limits, rabbit hole effect, age verification, default privacy settings’.

The Commission’s Inquiry

The Commission is relying on the European Union’s recently introduced Digital Services Act (DSA). The first investigation under the DSA was opened in December 2023, with X (formerly Twitter) as its subject. Initially applicable only to the largest platforms, the Act has covered all online platforms since 17 February 2024.

The DSA is designed to regulate online spaces, including social media platforms and marketplaces, with the aim of preventing harmful online activity. The Act also seeks to protect consumers’ rights by imposing proportionate obligations on providers.

This DSA investigation is not the only instance of TikTok being scrutinised in Europe. In September 2023, TikTok was fined €345 million (roughly $370 million) by Ireland’s Data Protection Commission for failing to adequately safeguard children’s accounts and data on its platform.

Is TikTok’s Architecture Addictive?

Fundamentally, user well-being lies at the heart of the Commission’s inquiry. The Commission is particularly concerned with TikTok’s advanced recommendation algorithm. The app monitors a user’s engagement with a particular topic, video style or creator and then serves up more content that is identical, or sufficiently similar, in order to sustain engagement with the platform.

Most social media platforms operate on a similar endless loop. However, TikTok’s algorithm appears to be significantly more sophisticated than those of its competitors, which has allowed it to keep more users active for longer periods. The Commission is concerned that the effectiveness of this system “may stimulate behavioural addictions” or create “rabbit-hole effects”.

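To make this feedback loop concrete, the sketch below shows a deliberately simplified, hypothetical recommender written in Python. It is not TikTok’s actual system, and the topics, starting weights and update rule are all invented for illustration; the only point is to show how a feed that boosts whatever a user watches can narrow around a single interest over time.

# Illustrative sketch only: a toy engagement-weighted recommender.
# This is NOT TikTok's actual system; the topics, weights and update rule
# are invented purely to show how an engagement feedback loop can narrow
# a feed over time (the "rabbit-hole effect" discussed above).

import random
from collections import defaultdict

TOPICS = ["cooking", "football", "gaming", "fitness", "news"]  # hypothetical catalogue

engagement = defaultdict(lambda: 1.0)  # every topic starts with equal weight


def recommend() -> str:
    """Pick the next topic, weighted by how much the user has engaged with it."""
    weights = [engagement[t] for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=1)[0]


def record_watch(topic: str, watch_fraction: float) -> None:
    """Boost a topic's weight in proportion to how long the user watched."""
    engagement[topic] += 2.0 * watch_fraction


# Simulate a user who tends to watch "gaming" videos to the end and skips the rest.
random.seed(0)
for _ in range(200):
    topic = recommend()
    record_watch(topic, watch_fraction=0.9 if topic == "gaming" else 0.1)

# After a few hundred videos the feed is dominated by the single favoured topic.
total = sum(engagement[t] for t in TOPICS)
for t in TOPICS:
    print(f"{t:>8}: {engagement[t] / total:.0%} of feed weight")

Running the loop for a few hundred iterations, the single favoured topic ends up dominating the feed weights, which is the mechanical essence of the “rabbit hole” the Commission describes.
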
TikTok’s terms and conditions specify that users must be at least 13 years old to register with the platform. However, the account-creation stage does not appear to adequately prevent children under 13 from accessing and using the app, as can be discerned from the wide range of creators on the platform who cater to younger audiences. “Behavioural addictions” can be particularly harmful to young children while their brains are still developing.

It must be acknowledged that parents also have an integral role to play in monitoring their children’s online activity. Nonetheless, the Commission’s inquiry may lead to stricter rules governing account creation in order to protect young children.

Another concerning feature of TikTok’s architecture is its integrated marketplace. Introduced in November 2022, the storefront is accessible at all times while the user is on the platform.

The DSA also regulates online marketplaces, bringing this feature within the ambit of the Commission’s inquiry. The marketplace’s existence may not itself be problematic; rather, it is the ability of young children to access it and make purchases that is concerning. It is therefore likely that the Commission will call for tighter age restrictions on the marketplace in particular, if not on the platform as a whole.

Conclusion

At the time of writing, it is uncertain what results the Commission’s inquiry will yield. This article has identified the Commission’s concerns and has explored the question of whether TikTok’s design can be accurately described as “addictive”.

By Alexander McLean