University of Southampton Institutional Repository

How convolutional neural network architecture biases learned opponency and colour tuning


Harris, Ethan William Albert, Mihai, Andreea Daniela and Hare, Jonathon (2020) How convolutional neural network architecture biases learned opponency and colour tuning. Neural Computation. (In Press)

Record type: Article

Abstract

Recent work suggests that changing Convolutional Neural Network (CNN) architecture by introducing a bottleneck in the second layer can yield changes in learned function.
Fully understanding this relationship requires a way of quantitatively comparing trained networks.
The fields of electrophysiology and psychophysics have developed a wealth of methods for characterising visual systems which permit such comparisons.
Inspired by these methods, we propose an approach to obtaining spatial and colour tuning curves for convolutional neurons, which can be used to classify cells in terms of their spatial and colour opponency.
We perform these classifications for a range of CNNs with different depths and bottleneck widths.
Our key finding is that networks with a bottleneck show a strong functional organisation: almost all cells in the bottleneck layer become both spatially and colour opponent, while cells in the layer following the bottleneck become non-opponent.
The colour tuning data can further be used to form a rich understanding of how colour is encoded by a network.
As a concrete demonstration, we show that shallower networks without a bottleneck learn a complex non-linear colour system, whereas deeper networks with tight bottlenecks learn a simple channel opponent code in the bottleneck layer.
We further develop a method of obtaining a hue sensitivity curve for a trained CNN, which enables high-level insights that complement the low-level findings from the colour tuning data.
We go on to train a series of networks under different conditions to ascertain the robustness of the discussed results.
Ultimately, our methods and findings coalesce with prior art, strengthening our ability to interpret trained CNNs and furthering our understanding of the connection between architecture and learned representation.
Trained models and code for all experiments are available at https://github.com/ecs-vlc/opponency.
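The opponency classification sketched in the abstract can be illustrated with a minimal example. This is not the authors' implementation: the linear RGB cell, the uniform colour-circle stimulus, and the sign-change criterion are simplifying assumptions standing in for probing a real convolutional neuron.

```python
import numpy as np

def hue_to_rgb(theta):
    """Map a hue angle (radians) to an RGB triple on a simple colour circle."""
    return np.array([
        0.5 + 0.5 * np.cos(theta),
        0.5 + 0.5 * np.cos(theta - 2 * np.pi / 3),
        0.5 + 0.5 * np.cos(theta + 2 * np.pi / 3),
    ])

def colour_tuning_curve(weights, n_hues=64):
    """Mean-subtracted response of a linear cell at each probe hue."""
    thetas = np.linspace(0, 2 * np.pi, n_hues, endpoint=False)
    responses = np.array([weights @ hue_to_rgb(t) for t in thetas])
    return thetas, responses - responses.mean()

def is_colour_opponent(responses, tol=1e-6):
    """A cell is classed opponent if its tuning curve is both excited and
    suppressed relative to its mean response (the curve changes sign)."""
    return bool((responses > tol).any() and (responses < -tol).any())

# Hypothetical cells: a red-green opponent filter and a luminance filter.
rg_opponent = np.array([1.0, -1.0, 0.0])
luminance = np.array([1.0, 1.0, 1.0])

_, rg_curve = colour_tuning_curve(rg_opponent)
_, lum_curve = colour_tuning_curve(luminance)
print(is_colour_opponent(rg_curve))   # sign-changing curve -> opponent
print(is_colour_opponent(lum_curve))  # flat curve -> non-opponent
```

For a real network, `weights @ hue_to_rgb(t)` would be replaced by the recorded activation of a convolutional cell when the coloured stimulus is presented to the trained model.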

Text: color-accepted - Accepted Manuscript (2MB)
Available under License Creative Commons Attribution.

More information

Accepted/In Press date: 5 October 2020

Identifiers

Local EPrints ID: 444364
URI: http://eprints.soton.ac.uk/id/eprint/444364
ISSN: 1530-888X
PURE UUID: c26712bf-9403-4d8a-9fec-9548b3e798e2
ORCID for Ethan William Albert Harris: orcid.org/0000-0003-3545-1349
ORCID for Andreea Daniela Mihai: orcid.org/0000-0003-3368-9062
ORCID for Jonathon Hare: orcid.org/0000-0003-2921-4283

Catalogue record

Date deposited: 14 Oct 2020 16:31
Last modified: 17 Mar 2024 03:05

Contributors

Author: Ethan William Albert Harris
Author: Andreea Daniela Mihai
Author: Jonathon Hare
