University of Southampton Institutional Repository

Toxic recommender algorithms: immunities, liabilities and the regulated self-regulation of the digital services act and the online safety act

Kohl, Uta (2024) Toxic recommender algorithms: immunities, liabilities and the regulated self-regulation of the digital services act and the online safety act. Journal of Media Law, 1-30. (doi:10.1080/17577632.2024.2408912).

Record type: Article

Abstract

This article critiques the apparently complementary nature of the intermediary immunities and the regulated self-regulation of the EU Digital Services Act 2022 [DSA] and the Online Safety Act 2023 [OSA]. Taking toxic recommender algorithms as the regulated target, the paper argues that while the DSA/OSA’s regulatory regimes purport to fill the regulatory vacuum created by the platform immunities granted in the early 2000s, they cannot compensate for the civil and criminal liability regimes foreclosed by those immunities and are in fact undermined by their continuing presence. Whilst the DSA/OSA’s regulatory regimes recognise platforms as active participants in, and shapers of, the online content sphere, particularly through their recommender algorithms, the immunities remain stuck in the early view of the same platforms as passive providers of neutral infrastructure and thus as innocent messengers. The immunities therefore continue to allow platforms to adopt the toxic but profitable algorithms that the DSA/OSA seek to restrain. By the same token, statutory duties and administrative oversight can replace neither an individual’s right to redress nor the deliberative public engagement engendered by civil or criminal court cases. The article concludes that the tension between the two regimes may be resolved by restricting the immunities to truly neutral platforms, that is, those not substantially invested in the content they are meant to regulate. For all other platforms, the DSA/OSA self-regulatory regimes and standard liabilities could and should run in parallel.

Text
ssrn-4947282 - Accepted Manuscript
Restricted to Repository staff only until 5 March 2026.
Text
Toxic recommender algorithms immunities liabilities and the regulated self-regulation of the Digital Services Act and the Online Safety Act - Version of Record
Available under License Creative Commons Attribution.
Download (935kB)

More information

Accepted/In Press date: 9 September 2024
e-pub ahead of print date: 1 October 2024
Keywords: Digital Services Act, Online Safety Act, online harms, recommender algorithms, host immunity, s.230 CDA, platform liability, platform immunities, polarisation, amplification

Identifiers

Local EPrints ID: 493785
URI: http://eprints.soton.ac.uk/id/eprint/493785
ISSN: 1757-7632
PURE UUID: db0cc8be-3f37-41e5-a8fe-2b92dde98ecc
ORCID for Uta Kohl: orcid.org/0000-0002-8616-9469

Catalogue record

Date deposited: 12 Sep 2024 16:54
Last modified: 03 Oct 2024 01:55

Download statistics

Downloads from ePrints over the past year. Other digital versions may also be available to download, e.g. from the publisher's website.


