Moderating manipulation: demystifying extremist tactics for gaming the (regulatory) system
Mattheis, Ashley A. and Kingdon, Ashton
(2023)
Moderating manipulation: demystifying extremist tactics for gaming the (regulatory) system.
Policy and Internet, 15 (4), 478-497.
(doi:10.1002/poi3.381).
Abstract
Due to its ease of scalability and broad applicability, the use of artificial intelligence (AI) and machine learning in platform management has gained prominence. This has led to widespread debates about the use of deplatforming as the default tool for repeated or severe violations of terms of service. But technologically deterministic approaches are not infallible, and their behaviour can become predictable. This opens the door for manipulation of media content and technological affordances to become tactical options for actors seeking to subvert regulation. Existing discussions often neglect the manipulation of content, algorithms, or platform affordances as a primary aspect of extremists’ strategies and of the difficulties it poses for moderation from a policy perspective. This study argues that it is essential to understand how extremists and conspiracy theorists use manipulation tactics to ‘game’ the current policy, regulatory and legislative systems of content moderation. Developing approaches that attend to manipulation as a strategy and focus on platform- and context-specific tactics will generate more effective policies, platform rules, AI developments and moderation procedures. This study analyses and demystifies three primary tactics, which the authors categorize as numerology, borderlands and merchandising, regularly used by extremists online in their strategies to ‘game’ content moderation. We provide case examples from a variety of ideologies, including the far right, QAnon and male supremacism, to highlight the tactical rather than ideological nature of such manipulation. We conclude with a discussion of how demystification processes could be incorporated into content moderation settings. This study contributes new insights about evasion tactics to the content moderation discussion and expands current understanding of how platforms can develop sociotechnical remedial measures.
Text
Policy Internet - 2023 - Mattheis - Moderating manipulation Demystifying extremist tactics for gaming the regulatory - Version of Record
More information
Accepted/In Press date: 26 October 2023
Published date: December 2023
Additional Information:
Funding Information:
The authors wish to thank Dr. Pamela Ugwudike for providing feedback on earlier versions of this study. The authors gratefully acknowledge the support received from Swansea University's Legal Innovation Lab Wales (which is in part funded by the European Regional Development Fund through the Welsh Government).
Publisher Copyright:
© 2023 The Authors. Policy & Internet published by Wiley Periodicals LLC on behalf of Policy Studies Organization.
Keywords:
artificial intelligence (AI), content moderation, digital culture, extremism, machine learning, manipulation tactics, propaganda
Identifiers
Local EPrints ID: 485143
URI: http://eprints.soton.ac.uk/id/eprint/485143
ISSN: 1944-2866
PURE UUID: f39d9c36-000a-46f6-a82b-5de81bdec080
Catalogue record
Date deposited: 30 Nov 2023 17:36
Last modified: 18 Mar 2024 04:02
Contributors
Author:
Ashley A. Mattheis
Author:
Ashton Kingdon