University of Southampton Institutional Repository

Bias in data‐driven artificial intelligence systems: An introductory survey

Ntoutsi, Eirini
53ea9bed-328d-45cb-a80a-2e6916f816b0
Fafalios, Pavlos
fd154b7a-9ba5-48d0-bd4b-b5f90b31c3e2
Gadiraju, Ujwal
91e34693-d3ab-4469-a8f5-4c42bda42805
Iosifidis, Vasileios
02b2e367-dd0a-4074-a68b-54de5faf8226
Nejdl, Wolfgang
e9f800fe-27db-470a-bf02-9e8ab128d500
Vidal, Maria-Esther
3f7938e0-39e2-49c9-8a60-2be9b016f574
Ruggieri, Salvatore
1b1c73e9-3462-48d2-9864-e395c4daf337
Turini, Franco
4d637b00-8f5e-4164-a237-b543f4078879
Papadopoulos, Symeon
818a6f28-8102-45b4-8e95-53be585ec20a
Krasanakis, Emmanouil
0d4d020e-50c7-4729-a5f4-18c4d139e826
Kompatsiaris, Ioannis
192fd35c-6c84-43c8-ae1a-49cbb6f1a926
Kinder-Kurlanda, Katharina
d41094c1-41ca-4c9b-9d6c-696ab932e2b7
Wagner, Claudia
bd90ec7e-14cc-415f-a0f3-692510baaf58
Karimi, Fariba
5e3871f3-6549-4ce0-9d09-df05b6951b10
Fernández, Miriam
4f16df70-9ffd-44d0-8e3b-a00c2e6d666d
Alani, Harith
70cdbdce-1494-44c2-9dae-65d82bf7e991
Berendt, Bettina
191543ac-805e-4209-b84c-7d8c42ced4c1
Krügel, Tina
5c313292-4224-4257-9418-e8679de400f0
Heinze, Christian
301f48ee-5337-4189-b9da-e5f078bd4064
Broelemann, Klaus
591f0927-c503-465e-8b27-e615a564733c
Kasneci, Gjergji
2991f0cd-5693-4843-9da5-af09e489ce7d
Tiropanis, Thanassis
d06654bd-5513-407b-9acd-6f9b9c5009d8
Staab, Steffen
bf48d51b-bd11-4d58-8e1c-4e6e03b30c49

Ntoutsi, Eirini, Fafalios, Pavlos, Gadiraju, Ujwal, Iosifidis, Vasileios, Nejdl, Wolfgang, Vidal, Maria-Esther, Ruggieri, Salvatore, Turini, Franco, Papadopoulos, Symeon, Krasanakis, Emmanouil, Kompatsiaris, Ioannis, Kinder-Kurlanda, Katharina, Wagner, Claudia, Karimi, Fariba, Fernández, Miriam, Alani, Harith, Berendt, Bettina, Krügel, Tina, Heinze, Christian, Broelemann, Klaus, Kasneci, Gjergji, Tiropanis, Thanassis and Staab, Steffen (2020) Bias in data‐driven artificial intelligence systems: An introductory survey. WIREs Data Mining and Knowledge Discovery, 10 (3), 1-14, [e1356]. (doi:10.1002/widm.1356).

Record type: Article

Abstract

Artificial Intelligence (AI)-based systems are widely employed nowadays to make decisions that have far-reaching impacts on individuals and society. Their decisions may affect everyone, everywhere, at any time, raising concerns about potential human rights issues. It is therefore necessary to move beyond traditional AI algorithms optimized for predictive performance and to embed ethical and legal principles in their design, training, and deployment, so as to ensure social good while still benefiting from the huge potential of AI technology. The goal of this survey is to provide a broad multidisciplinary overview of the area of bias in AI systems, focusing on technical challenges and solutions, and to suggest new research directions towards approaches well grounded in a legal frame. We focus on data-driven AI, as a large part of AI is nowadays powered by (big) data and powerful machine learning algorithms. Unless otherwise specified, we use the general term bias to describe problems related to the gathering or processing of data that might result in prejudiced decisions on the basis of demographic features such as race, sex, and so forth.

This article is categorized under:
Commercial, Legal, and Ethical Issues > Fairness in Data Mining
Commercial, Legal, and Ethical Issues > Ethical Considerations
Commercial, Legal, and Ethical Issues > Legal Issues
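
To make concrete what "prejudiced decisions on the basis of demographic features" can look like in practice, the following is a minimal, hypothetical Python sketch computing the statistical parity difference, one of the simplest group-fairness measures discussed in the fairness-aware machine learning literature. The toy data and the column names sex and decision are assumptions introduced for illustration; this is not code from the survey itself.

    import pandas as pd

    # Hypothetical decision log: one row per individual, with a protected
    # attribute ("sex") and the binary outcome of an automated decision.
    data = pd.DataFrame({
        "sex":      ["female", "male", "female", "male", "male", "female"],
        "decision": [0, 1, 1, 1, 1, 0],   # 1 = positive outcome (e.g., loan granted)
    })

    # Rate of positive outcomes per demographic group.
    rates = data.groupby("sex")["decision"].mean()

    # Statistical parity difference: gap in positive-outcome rates between
    # the two groups. A value of 0 indicates parity; larger absolute values
    # indicate a stronger disparity in how decisions are distributed.
    spd = rates["male"] - rates["female"]
    print(f"Positive-outcome rates:\n{rates}\n")
    print(f"Statistical parity difference (male - female): {spd:.2f}")

Measures of this kind only quantify disparity in outcomes; whether a given disparity constitutes unlawful or unethical discrimination is one of the legal and ethical questions the survey discusses.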

Text
Ntoutsi et al 2020 Wiley Interdisciplinary Reviews Data Mining and Knowledge Discovery - Version of Record
Available under License Creative Commons Attribution.

More information

Accepted/In Press date: 31 December 2019
e-pub ahead of print date: 3 February 2020
Published date: 1 May 2020
Additional Information: Funding Information: This work is supported by the project "NoBias - Artificial Intelligence without Bias," which has received funding from the European Union's Horizon 2020 research and innovation programme, under the Marie Skłodowska-Curie (Innovative Training Network) grant agreement no. 860630. Publisher Copyright: © 2020 The Authors. WIREs Data Mining and Knowledge Discovery published by Wiley Periodicals, Inc.
Keywords: fairness, fairness-aware AI, fairness-aware machine learning, interpretability, responsible AI

Identifiers

Local EPrints ID: 437566
URI: http://eprints.soton.ac.uk/id/eprint/437566
ISSN: 1942-4795
PURE UUID: 7ad486fc-27cb-43df-b0de-9716abd7d7b2
ORCID for Thanassis Tiropanis: orcid.org/0000-0002-6195-2852
ORCID for Steffen Staab: orcid.org/0000-0002-0780-4154

Catalogue record

Date deposited: 05 Feb 2020 17:34
Last modified: 06 Jun 2024 01:54

Contributors

Author: Eirini Ntoutsi
Author: Pavlos Fafalios
Author: Ujwal Gadiraju
Author: Vasileios Iosifidis
Author: Wolfgang Nejdl
Author: Maria-Esther Vidal
Author: Salvatore Ruggieri
Author: Franco Turini
Author: Symeon Papadopoulos
Author: Emmanouil Krasanakis
Author: Ioannis Kompatsiaris
Author: Katharina Kinder-Kurlanda
Author: Claudia Wagner
Author: Fariba Karimi
Author: Miriam Fernández
Author: Harith Alani
Author: Bettina Berendt
Author: Tina Krügel
Author: Christian Heinze
Author: Klaus Broelemann
Author: Gjergji Kasneci
Author: Thanassis Tiropanis
Author: Steffen Staab

Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2
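
For programmatic access to this record's metadata over OAI-PMH, a minimal sketch using Python's requests library is shown below. The base URL is the one stated above; the OAI identifier oai:eprints.soton.ac.uk:437566 is an assumption based on the common EPrints naming pattern and the EPrints ID 437566, as is the availability of the oai_dc (Dublin Core) metadata format.

    import requests

    # OAI-PMH 2.0 base URL as stated on this page.
    BASE_URL = "http://eprints.soton.ac.uk/cgi/oai2"

    # Assumed OAI identifier, built from the EPrints ID 437566 using the
    # common EPrints pattern oai:<hostname>:<eprintid>.
    params = {
        "verb": "GetRecord",
        "identifier": "oai:eprints.soton.ac.uk:437566",
        "metadataPrefix": "oai_dc",   # Dublin Core, mandatory in OAI-PMH 2.0
    }

    response = requests.get(BASE_URL, params=params, timeout=30)
    response.raise_for_status()

    # The response is an XML document containing the Dublin Core record.
    print(response.text[:1000])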

This repository has been built using EPrints software, developed at the University of Southampton, but available to everyone to use.
