University of Southampton Institutional Repository

The dual function of explanations: Why computing explanations is of value

Tsakalakis, Niko
eae42e98-58b8-45b9-8c11-35a798cc9671
Stalla-Bourdillon, Sophie
c189651b-9ed3-49f6-bf37-25a47c487164
Carmichael, Laura
3f71fb73-581b-43c3-a261-a6627994c96e
Huynh, Dong
70469d1c-2272-47b5-8dd8-4d848582e84e
Moreau, Luc
724ac396-044d-4599-b420-7ecdb974b54c
Helal, Ayah
67f2f72b-cd02-4d6c-8586-398d6b198e09
Hallinan, Dara
Leenes, Ronald
De Hert, Paul

Tsakalakis, Niko, Stalla-Bourdillon, Sophie, Carmichael, Laura, Huynh, Dong, Moreau, Luc and Helal, Ayah (2021) The dual function of explanations: Why computing explanations is of value. In, Hallinan, Dara, Leenes, Ronald and De Hert, Paul (eds.) Data Protection and Privacy: Enforcing Rights in a Changing World. (Computers, Privacy and Data Protection, 14) Bloomsbury Publishing, 127–156. (doi:10.5040/9781509954544.ch-005).

Record type: Book Section

Abstract

The increasing dependence of decision-making on some level of automation has naturally led to discussions about the trustworthiness of such automation, calls for transparent automated decision-making and the emergence of ‘explainable Artificial Intelligence’ (XAI). Although XAI research has produced a number of taxonomies for the explanation of Artificial Intelligence (AI) and Machine Learning (ML) models, the legal debate has so far focused mainly on whether a ‘right to explanation’ exists in the GDPR. Lately, a growing body of interdisciplinary literature has concentrated on the goals and substance of explanations produced for automated decision-making, with a view to clarifying their role and improving their value against unfairness, discrimination and opacity, for the purposes of ensuring compliance with Article 22 of the GDPR. At the same time, several researchers have warned that transparency of the algorithmic processes is not in itself enough, and that tools for better and easier assessment and review of the whole socio-technical system that includes automated decision-making are needed. In this paper, we suggest that generating computed explanations would be useful for most of the obligations set forth by the GDPR and can assist towards a holistic compliance strategy when used as detective controls. Computing explanations to support the detection of data protection breaches facilitates the monitoring and auditing of automated decision-making pipelines. Carefully constructed explanations can empower both the data controller and external recipients such as data subjects and regulators, and should be seen as key controls for meeting accountability and data protection-by-design obligations. To illustrate this claim, this paper presents the work undertaken by the PLEAD project towards ‘explainable-by-design’ socio-technical systems.
PLEAD acknowledges the dual function of explanations as internal detective controls (to benefit data controllers) and external detective controls (to benefit data subjects) and leverages provenance-based technology to compute explanations and support the deployment of systematic compliance strategies.

Text
Dual_function_of_Explanations-accepted - Accepted Manuscript

More information

Published date: 2021
Keywords: Explainability, Explainable AI, Automated decisions, Data Protection by Design

Identifiers

Local EPrints ID: 454424
URI: http://eprints.soton.ac.uk/id/eprint/454424
PURE UUID: de064136-1640-4adb-a432-c9079203e859
ORCID for Niko Tsakalakis: orcid.org/0000-0003-2654-0825
ORCID for Sophie Stalla-Bourdillon: orcid.org/0000-0003-3715-1219
ORCID for Laura Carmichael: orcid.org/0000-0001-9391-1310

Catalogue record

Date deposited: 09 Feb 2022 17:35
Last modified: 10 Feb 2022 02:42


Contributors

Author: Niko Tsakalakis
Author: Sophie Stalla-Bourdillon
Author: Laura Carmichael
Author: Dong Huynh
Author: Luc Moreau
Author: Ayah Helal
Editor: Dara Hallinan
Editor: Ronald Leenes
Editor: Paul De Hert
