The dual function of explanations: Why computing explanations is of value
Tsakalakis, Niko, Stalla-Bourdillon, Sophie, Carmichael, Laura, Huynh, Dong, Moreau, Luc and Helal, Ayah (2021) The dual function of explanations: Why computing explanations is of value. In: Hallinan, Dara, Leenes, Ronald and De Hert, Paul (eds.) Data Protection and Privacy: Enforcing Rights in a Changing World. (Computers, Privacy and Data Protection, 14) Bloomsbury Publishing, pp. 127–156. (doi:10.5040/9781509954544.ch-005)
Record type: Book Section
Abstract
The increasing dependence of decision-making on some level of automation has naturally led to discussions about the trustworthiness of such automation, calls for transparent automated decision-making and the emergence of ‘explainable Artificial Intelligence’ (XAI). Although XAI research has produced a number of taxonomies for the explanation of Artificial Intelligence (AI) and Machine Learning (ML) models, the legal debate has so far been mainly focused on whether a ‘right to explanation’ exists in the GDPR. Lately, a growing body of interdisciplinary literature is concentrating on the goals and substance of explanations produced for automated decision-making, with a view to clarifying their role and improving their value against unfairness, discrimination and opacity for the purposes of ensuring compliance with Article 22 of the GDPR. At the same time, several researchers have warned that transparency of algorithmic processes is not in itself enough, and that tools for better and easier assessment and review of the whole socio-technical system that includes automated decision-making are needed. In this paper, we suggest that generating computed explanations would be useful for most of the obligations set forth by the GDPR and can assist towards a holistic compliance strategy when used as detective controls. Computing explanations to support the detection of data protection breaches facilitates the monitoring and auditing of automated decision-making pipelines. Carefully constructed explanations can empower both the data controller and external recipients such as data subjects and regulators, and should be seen as key controls in order to meet accountability and data protection-by-design obligations. To illustrate this claim, this paper presents the work undertaken by the PLEAD project towards ‘explainable-by-design’ socio-technical systems. PLEAD acknowledges the dual function of explanations as internal detective controls (to benefit data controllers) and external detective controls (to benefit data subjects) and leverages provenance-based technology to compute explanations and support the deployment of systematic compliance strategies.
Text: Dual_function_of_Explanations-accepted (Accepted Manuscript)
More information
Published date: 2021
Keywords:
Explainability, Explainable AI, Automated decisions, Data Protection by Design
Identifiers
Local EPrints ID: 454424
URI: http://eprints.soton.ac.uk/id/eprint/454424
PURE UUID: de064136-1640-4adb-a432-c9079203e859
Catalogue record
Date deposited: 09 Feb 2022 17:35
Last modified: 06 Sep 2024 01:47
Contributors
Author: Niko Tsakalakis
Author: Sophie Stalla-Bourdillon
Author: Laura Carmichael
Author: Dong Huynh
Author: Luc Moreau
Author: Ayah Helal
Editor: Dara Hallinan
Editor: Ronald Leenes
Editor: Paul De Hert