University of Southampton Institutional Repository

AI3SD Video: Interpreting opacity: understanding gaps in our explanations of artificial neural networks


Mcneill, William (2022) AI3SD Video: Interpreting opacity: understanding gaps in our explanations of artificial neural networks. Frey, Jeremy G., Kanza, Samantha and Niranjan, Mahesan (eds.) AI4SD Network+ Conference, Chilworth Manor, Southampton, United Kingdom. 01 - 03 Mar 2022. (doi:10.5258/SOTON/AI3SD0198).

Record type: Conference or Workshop Item (Other)

Abstract

We know everything that goes on within artificial neural networks. We tend to know all the data such systems have been trained on. And designers will be aware of the various design decisions, training algorithms and techniques that went into their construction, too. At the same time, leading AI designers tell us that their systems are in some sense uninterpretable, inexplicable or opaque. That’s puzzling. Drawing on discussions in the philosophy of neuroscience, and of science more generally, I will make use of this puzzle to try to advance our understanding of what explanations we lack with respect to ANNs, and hence of the nature and scope of explanation. The puzzle helps us to distinguish different phenomena in need of explanation, and some limits to the mechanistic explanatory strategies so often helpfully employed in the cognitive neurosciences.

Video
Video file: ai4sd_march_2022_day_2_WillMcNeill (14MB)
Available under a Creative Commons Attribution License.

More information

Published date: 2 March 2022
Additional Information: William has been a lecturer in Philosophy at the University of Southampton since 2016 and is part of the Philosophy of Language, Philosophy of Mind and Epistemology Research Group. Prior to this he lectured at King's College London, the University of York and Cardiff University. His research interests are centred on the epistemology of perception, social cognition and inferential knowledge.
Venue - Dates: AI4SD Network+ Conference, Chilworth Manor, Southampton, United Kingdom, 2022-03-01 - 2022-03-03

Identifiers

Local EPrints ID: 470014
URI: http://eprints.soton.ac.uk/id/eprint/470014
PURE UUID: 3d575357-3cba-4f68-ae00-8a5f88ddecdc
ORCID for William Mcneill: orcid.org/0000-0002-3647-0720
ORCID for Jeremy G. Frey: orcid.org/0000-0003-0842-4302
ORCID for Samantha Kanza: orcid.org/0000-0002-4831-9489
ORCID for Mahesan Niranjan: orcid.org/0000-0001-7021-140X

Catalogue record

Date deposited: 30 Sep 2022 16:38
Last modified: 17 Mar 2024 03:52

Contributors

Author: William Mcneill
Editor: Jeremy G. Frey
Editor: Samantha Kanza
Editor: Mahesan Niranjan



