University of Southampton Institutional Repository

Explainable AI and the philosophy and practice of explanation

O'Hara, Kieron
0a64a4b1-efb5-45d1-a4c2-77783f18f0c4

O'Hara, Kieron (2020) Explainable AI and the philosophy and practice of explanation. Computer Law & Security Review, 39, [105474]. (doi:10.1016/j.clsr.2020.105474).

Record type: Article

Abstract

Considerations of the nature of explanation and the law are brought together to argue that computed accounts of AI systems’ outputs cannot function on their own as explanations of decisions informed by AI. The important context for this inquiry is set by Article 22(3) of GDPR. The paper looks at the question of what an explanation is from the point of view of the philosophy of science – i.e. it asks not what counts as explanatory in legal terms, or what an AI system might compute using provenance metadata, but rather what explanation as a social practice consists in, arguing that explanation is an illocutionary act, and that it should be considered as a process, not a text. It cannot therefore be computed, although computed accounts of AI systems are likely to be important inputs to the explanatory process.

Text
ohara explanation CLSR paper - Accepted Manuscript
Download (271kB)

More information

Accepted/In Press date: 8 September 2020
e-pub ahead of print date: 5 October 2020
Published date: November 2020
Additional Information: Funding Information: This work is supported by the UK EPSRC as part of the PETRAS National Centre of Excellence for IoT Systems Cybersecurity under Grant Number EP/S035362/1 (https://www.petrashub.org/). Publisher Copyright: © 2020 Kieron O'Hara
Keywords: AI, Artificial Intelligence, Explainable AI, Explanation, GDPR, General Data Protection Regulation (GDPR), XAI

Identifiers

Local EPrints ID: 444111
URI: http://eprints.soton.ac.uk/id/eprint/444111
ISSN: 2212-4748
PURE UUID: c04872a8-36d0-42bb-9aed-0d6b2ed3d051
ORCID for Kieron O'Hara: orcid.org/0000-0002-9051-4456

Catalogue record

Date deposited: 25 Sep 2020 16:35
Last modified: 17 Mar 2024 05:56


