University of Southampton Institutional Repository

Degree of symmetry: using human perception to guide symmetry operators

University of Southampton
Forrest, Peter
44b92a54-79d0-4070-abbc-a8da45cb9994
Nixon, Mark
2b5b9804-5a81-462a-82e6-92ee5fa74e12

Forrest, Peter (2018) Degree of symmetry: using human perception to guide symmetry operators. University of Southampton, Doctoral Thesis, 183pp.

Record type: Thesis (Doctoral)

Abstract

Symmetry is often treated as a binary property. In contrast, this study demonstrates that symmetry (specifically reflection), as perceived by human vision, can be represented as a continuous-valued measure, which this work terms the ‘Degree of Symmetry’ (DoS). It is suggested that human perception can determine one axis to have a greater, lesser, or the same DoS compared with another axis. If human vision has evolved to pre-process image features to optimise visual cognition, then computational algorithms could benefit from aiming to replicate human performance. Using crowd-sourced pairwise comparisons of symmetry axes, datasets of axes can be ranked to give a relative DoS for each axis. Two new datasets are ranked using human perception, and the analysis is compared with mathematical models of symmetry and with existing symmetry operators. Comparison is made by a new performance measure that correlates rank-ordered lists, termed the ‘Symmetry Similarity’: an estimate of how similarly to human vision an operator performs. These comparisons led to two main conclusions. First, that shape is more important than intensity to human perception. Second, that the existing operators tested are not good predictors of human perception: the highest Symmetry Similarity among them is only 17.6%. Two new DoS operators are demonstrated based on these findings. Using scale- and rotation-invariant features around an axis, a Deep Belief Network (DBN) is trained to make the same pairwise comparisons as human perception. It achieves a Symmetry Similarity of 52.3%, suggesting that it is a better predictor of human perception than the previously tested algorithms. Analysis of this operator prompted the dataset to be extended.
Two key conclusions from the analysis of the extended dataset are that the DoS values for sections of an axis can be linearly combined to form the DoS of the combined axis, and that the decrease in DoS when an axis is offset parallel to the optimal axis can be approximated by a Gaussian. The DBN-based operator achieved a lower Symmetry Similarity of 44.1% against the extended dataset.
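The pipeline described above — aggregating pairwise human judgements into a rank order, then scoring an operator by correlating its ranking against the human one — can be sketched as follows. This is an illustrative reconstruction, not the thesis's implementation: the exact aggregation model and the correlation behind ‘Symmetry Similarity’ are defined in the thesis itself, so a simple win-count ranking and Kendall's tau are used here purely as stand-ins.

```python
from itertools import combinations

def rank_from_pairwise(items, prefer):
    """Rank items by number of pairwise wins (Copeland-style score).

    `prefer(a, b)` returns True if axis `a` is judged to have a greater
    Degree of Symmetry than axis `b` (e.g. a crowd-sourced majority
    vote). Hypothetical aggregation, for illustration only.
    """
    wins = {x: 0 for x in items}
    for a, b in combinations(items, 2):
        if prefer(a, b):
            wins[a] += 1
        else:
            wins[b] += 1
    return sorted(items, key=lambda x: wins[x], reverse=True)

def kendall_tau(rank_a, rank_b):
    """Kendall's tau between two rankings of the same items:
    (concordant - discordant) / total pairs. A stand-in for the
    thesis's 'Symmetry Similarity' measure, which also correlates
    rank-ordered lists."""
    pos_a = {x: i for i, x in enumerate(rank_a)}
    pos_b = {x: i for i, x in enumerate(rank_b)}
    concordant = discordant = 0
    for x, y in combinations(rank_a, 2):
        # Pair is concordant if both rankings order x and y the same way.
        if (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) > 0:
            concordant += 1
        else:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)

# Toy example: four axes with invented DoS scores standing in for
# human judgements, compared against a hypothetical operator's ranking.
dos = {"axis1": 0.9, "axis2": 0.4, "axis3": 0.7, "axis4": 0.1}
human_rank = rank_from_pairwise(list(dos), lambda a, b: dos[a] > dos[b])
operator_rank = ["axis1", "axis3", "axis4", "axis2"]
print(kendall_tau(human_rank, operator_rank))
```

A perfect operator would score 1.0 against the human ranking; chance-level ordering scores near 0.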
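The two findings from the extended dataset suggest a simple model: the DoS of a whole axis as a linear combination of its sections' DoS values, and DoS decaying as a Gaussian of the parallel offset from the optimal axis. The sketch below illustrates that model; the length-weighted combination and the falloff width `sigma` are assumptions chosen for illustration, not values taken from the thesis.

```python
import math

def combined_dos(section_dos, section_lengths):
    """Linearly combine per-section DoS into the DoS of the whole axis.
    A length-weighted mean is assumed here; the abstract only states
    that the combination is linear."""
    total = sum(section_lengths)
    return sum(d * l for d, l in zip(section_dos, section_lengths)) / total

def offset_dos(peak_dos, offset, sigma=1.0):
    """Model the decrease in DoS for an axis offset parallel to the
    optimal axis as a Gaussian in the offset distance. `sigma`
    (the falloff width) is an illustrative parameter."""
    return peak_dos * math.exp(-offset**2 / (2 * sigma**2))

print(combined_dos([0.8, 0.4], [2.0, 1.0]))  # length-weighted mean of the sections
print(offset_dos(0.9, 0.0))                  # zero offset: the full peak DoS
```

Under this model, DoS falls off smoothly and symmetrically as the measured axis drifts from the optimal one, which matches the Gaussian approximation reported for the extended dataset.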

Text
Final Thesis - Version of Record
Available under License University of Southampton Thesis Licence.
Download (135MB)
Text
PTD
Restricted to Repository staff only

More information

Submitted date: June 2017
Published date: 28 March 2018

Identifiers

Local EPrints ID: 455849
URI: http://eprints.soton.ac.uk/id/eprint/455849
PURE UUID: 1cb0f9c6-fabd-427a-b999-c4a2f04a3d7e
ORCID for Mark Nixon: orcid.org/0000-0002-9174-5934

Catalogue record

Date deposited: 06 Apr 2022 16:57
Last modified: 17 Mar 2024 02:33

Contributors

Author: Peter Forrest
Thesis advisor: Mark Nixon

