University of Southampton Institutional Repository

Evidence-based object tracking via global energy maximization

Carter, J. N.
e05be2f9-991d-4476-bb50-ae91606389da
Lappas, P.
67163f13-e443-47e5-b384-6606c49909d2
Damper, R. I.
6e0e7fdc-57ec-44d4-bc0f-029d17ba441d

Carter, J. N., Lappas, P. and Damper, R. I. (2003) Evidence-based object tracking via global energy maximization. In: Proceedings of the 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03), Hong Kong, China, 6-10 April 2003. IEEE, pp. 501-504. (doi:10.1109/ICASSP.2003.1199521).

Record type: Book Section

Abstract

This paper describes a robust algorithm for arbitrary object tracking in long image sequences. This technique extends the dynamic Hough transform proposed in our earlier work to detect arbitrary shapes undergoing affine motion. The proposed tracking algorithm processes the whole image sequence globally. First, the object boundary is represented in lookup-table form, and we then perform an operation that estimates the energy of the motion trajectory in the parameter space. We assign an extra term in our cost function to incorporate smoothness of deformation. The object is actually rigid, so by 'deformation' we mean changes due to rotation or scaling of the object. There is no need for training or initialization, and an efficient implementation can be achieved with coarse-to-fine dynamic programming and pruning. The method, because of its evidence-based nature, is robust under noise and occlusion.
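For illustration only (the record contains no code), the following is a minimal Python sketch of the kind of global optimisation the abstract describes: given per-frame, Hough-style evidence scores over a discretised parameter space, a Viterbi-like dynamic programme recovers the trajectory that maximises accumulated evidence less a smoothness penalty. The function name, the one-dimensional state space and the absolute-difference penalty are simplifying assumptions, not details taken from the paper.

import numpy as np

def best_trajectory(evidence, smoothness_weight=1.0):
    # evidence[t, s]: boundary-evidence score for parameter state s at frame t
    # (a stand-in for the Hough-style votes described in the abstract).
    T, S = evidence.shape
    states = np.arange(S)
    # Penalty for jumping from state i at frame t-1 to state j at frame t;
    # this plays the role of the smoothness term in the cost function.
    jump_penalty = smoothness_weight * np.abs(states[:, None] - states[None, :])

    score = np.full((T, S), -np.inf)
    back = np.zeros((T, S), dtype=int)
    score[0] = evidence[0]
    for t in range(1, T):
        # candidate[i, j]: best total score of reaching state j at frame t via state i.
        candidate = score[t - 1][:, None] - jump_penalty + evidence[t][None, :]
        back[t] = candidate.argmax(axis=0)
        score[t] = candidate.max(axis=0)

    # Backtrack the globally optimal trajectory through the whole sequence.
    path = np.zeros(T, dtype=int)
    path[-1] = score[-1].argmax()
    for t in range(T - 1, 0, -1):
        path[t - 1] = back[t, path[t]]
    return path

# Toy usage: 5 frames, 20 candidate states.
rng = np.random.default_rng(0)
print(best_trajectory(rng.random((5, 20)), smoothness_weight=0.5))

A coarse-to-fine variant of this idea would run the programme on a coarsely subsampled parameter space first and then refine only around promising trajectories, which is in the spirit of the coarse-to-fine dynamic programming and pruning the abstract mentions.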

Text: Evidence-based_object_tracking_via_global_energy_maximization (272kB). Available under License Creative Commons Attribution.

More information

Published date: 21 May 2003
Additional Information: Organisation: IEEE. Address: Toronto, Canada
Venue - Dates: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03), Hong Kong, China, 2003-04-06 - 2003-04-10

Identifiers

Local EPrints ID: 257958
URI: http://eprints.soton.ac.uk/id/eprint/257958
ISSN: 1520-6149
PURE UUID: dd19bf9b-457f-4aaf-b97a-2003b3eabb68

Catalogue record

Date deposited: 27 Jun 2003
Last modified: 17 Mar 2024 03:16

Contributors

Author: J. N. Carter
Author: P. Lappas
Author: R. I. Damper

