University of Southampton Institutional Repository

3D Motion Estimation By Evidence Gathering

Abuzaina, Anas
8e37a6fc-d659-45a0-8b8b-e885f385ece5
Nixon, Mark
2b5b9804-5a81-462a-82e6-92ee5fa74e12
Carter, John
e05be2f9-991d-4476-bb50-ae91606389da

Abuzaina, Anas, Nixon, Mark and Carter, John (2017) 3D Motion Estimation By Evidence Gathering. In 2016 23rd International Conference on Pattern Recognition (ICPR 2016): Proceedings of a meeting held 4-8 December 2016, Cancun, Mexico. IEEE. (doi:10.1109/ICPR.2016.7899890).

Record type: Conference or Workshop Item (Paper)

Abstract

In this paper, we introduce an algorithm for 3D motion estimation in point clouds based on Chasles’ kinematic theorem. The proposed algorithm estimates 3D motion parameters directly from the data by exploiting the geometry of rigid transformations, using an evidence gathering technique in a Hough-voting-like approach. The algorithm provides an alternative to the feature description and matching pipelines commonly used by numerous 3D object recognition and registration algorithms, as it does not involve keypoint detection or feature descriptor computation and matching. To the best of our knowledge, this is the first work to use kinematic theorems in an evidence gathering framework for motion estimation and surface matching without any given correspondences. Moreover, we propose a method for voting for 3D motion parameters using a one-dimensional accumulator space, which enables more efficient voting for motion parameters than other methods that use up to seven-dimensional accumulator spaces.
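The kinematic fact the abstract builds on can be made concrete. Chasles' theorem states that any rigid displacement is a screw motion: a rotation by some angle about an axis in space, plus a translation along that same axis. As an illustrative sketch only (this is standard screw-decomposition mathematics, not the paper's voting algorithm, and all names here are assumptions), the following NumPy function recovers the screw parameters from a rotation matrix R and translation vector t:

```python
import numpy as np

def screw_decomposition(R, t):
    """Decompose a rigid transform (R, t) into Chasles' screw parameters:
    a rotation by `theta` about an axis with direction `axis` passing
    through `point`, plus a translation `d` along that axis.
    Assumes a generic rotation (0 < theta < pi)."""
    # Rotation angle from the trace of R
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    # Rotation axis from the skew-symmetric part of R
    w = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]])
    axis = w / np.linalg.norm(w)
    # Translation component along the axis (the screw displacement)
    d = axis @ t
    # A point on the screw axis: solve (I - R) p = t - d*axis.
    # (I - R) is singular along the axis, so take the least-squares solution.
    point, *_ = np.linalg.lstsq(np.eye(3) - R, t - d * axis, rcond=None)
    return axis, theta, d, point
```

For example, a 90° rotation about the z-axis combined with t = (1, 1, 2) decomposes into axis (0, 0, 1), theta = π/2, axial translation d = 2, and an axis passing through (0, 1, 0). The low dimensionality of such a parameterisation is what makes compact accumulator spaces, of the kind the abstract describes, feasible.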

Text: bare_conf.pdf - Accepted Manuscript (976kB)

More information

Accepted/In Press date: 4 December 2016
e-pub ahead of print date: 24 April 2017
Published date: 24 April 2017
Venue - Dates: 23rd International Conference on Pattern Recognition 2016, Cancun, Mexico, 2016-12-04 - 2016-12-08
Organisations: Vision, Learning and Control

Identifiers

Local EPrints ID: 405242
URI: http://eprints.soton.ac.uk/id/eprint/405242
PURE UUID: 2170f228-1dc5-4f06-b1d4-a003bc81db91
ORCID for Mark Nixon: orcid.org/0000-0002-9174-5934

Catalogue record

Date deposited: 30 Jan 2017 11:30
Last modified: 16 Mar 2024 05:48

Contributors

Author: Anas Abuzaina
Author: Mark Nixon
Author: John Carter


Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

This repository has been built using EPrints software, developed at the University of Southampton, but available to everyone to use.
