University of Southampton Institutional Repository

Attentive visual tracking and trajectory estimation for dynamic scene segmentation
Roberts, J.M.
Charnley, D.

Roberts, J.M. (1994) Attentive visual tracking and trajectory estimation for dynamic scene segmentation. University of Southampton, Doctoral Thesis.

Record type: Thesis (Doctoral)

Abstract

Intelligent Co-Pilot Systems (ICPS) offer the next challenge to vehicle-highway automation. The key to ICPSs is the detection of moving objects (other vehicles) from a moving observer using a visual sensor. The aim of the work presented in this thesis was to design and implement a feature detection and tracking strategy capable of tracking image features independently, in parallel, and in real time, and to cluster/segment features using the inherent temporal information contained within feature trajectories. Most images contain areas that are of little or no interest to vision tasks. An attentive, data-driven approach to feature detection and tracking is therefore proposed, which aims to increase the efficiency of detection and tracking by focusing attention onto relevant regions of the image likely to contain scene structure. This attentive algorithm lends itself naturally to parallelisation, and results from a parallel implementation are presented. A scene may be segmented into independently moving objects on the assumption that features belonging to the same object move in an identical way in three dimensions (i.e. objects are rigid). A model for scene segmentation is proposed that uses the information contained within feature trajectories to cluster, or group, features into independently moving objects. This information includes the image-plane position, the time-to-collision of a feature with the image plane, and the type of motion observed. The Multiple Model Adaptive Estimator (MMAE) algorithm is extended to cope with constituent filters having different state vectors (MMAE2), in an attempt to estimate the time-to-collision of a feature accurately and to provide a reliable indication of the type of motion observed (in the form of a model belief measure). Finally, poor state initialisation is identified as a likely prime cause of poor Extended Kalman Filter (EKF) performance (and hence poor MMAE2 performance) when using high-order models. The neurofuzzy-initialised EKF (NF-EKF) is introduced, which attempts to reduce the time taken for an EKF to converge by improving the accuracy of its initial state estimates.
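The estimation machinery described in the abstract centres on the Multiple Model Adaptive Estimator: a bank of Kalman filters, each matched to a different motion model, whose innovations are converted into a belief (model probability) for each model. The sketch below illustrates only the standard MMAE belief update from per-filter innovation likelihoods; the function and variable names (mmae_belief_update, beliefs, innovations, innovation_covs) are assumptions for illustration, and the thesis's MMAE2 extension for filters with differing state dimensions and the NF-EKF initialisation are not reproduced here.

    import numpy as np

    def mmae_belief_update(beliefs, innovations, innovation_covs):
        # beliefs         : (M,) prior belief in each motion model (sums to 1)
        # innovations     : length-M list of filter residuals nu_i = z - H_i x_i
        # innovation_covs : length-M list of innovation covariance matrices S_i
        likelihoods = np.empty(len(beliefs))
        for i, (nu, S) in enumerate(zip(innovations, innovation_covs)):
            k = nu.shape[0]
            # Gaussian likelihood of the current measurement under model i
            norm = 1.0 / np.sqrt((2.0 * np.pi) ** k * np.linalg.det(S))
            likelihoods[i] = norm * np.exp(-0.5 * nu @ np.linalg.solve(S, nu))
        posterior = beliefs * likelihoods   # Bayes' rule, then normalise
        return posterior / posterior.sum()

    # Hypothetical example: two motion models (e.g. constant image-plane velocity
    # vs. constant acceleration) observing the same scalar feature measurement.
    beliefs = np.array([0.5, 0.5])
    innovations = [np.array([0.2]), np.array([1.5])]
    innovation_covs = [np.array([[0.5]]), np.array([[0.5]])]
    print(mmae_belief_update(beliefs, innovations, innovation_covs))
    # Belief shifts toward the model whose filter best predicts the feature's motion.

In the framework the abstract describes, such belief measures, together with image-plane position and time-to-collision estimates from the individual filters, would feed the clustering stage that groups features into independently moving objects; the code above is illustrative only.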

This record has no associated files available for download.

More information

Published date: 1994
Additional Information: Address: Faculty of Engineering and Applied Science
Organisations: University of Southampton, Electronics & Computer Science

Identifiers

Local EPrints ID: 250163
URI: http://eprints.soton.ac.uk/id/eprint/250163
PURE UUID: 158e5b78-e97c-4fa9-9495-eb2d20dd7a4d

Catalogue record

Date deposited: 04 May 1999
Last modified: 10 Dec 2021 20:07

Contributors

Author: J.M. Roberts
Thesis advisor: D. Charnley



