Grant, Michael G.
Extraction of arbitrarily moving arbitrary shapes by evidence gathering
University of Southampton, Electronics and Computer Science
Many approaches are currently available for tracking objects moving through image sequences. These approaches can suffer in the presence of occlusion and noise, and often require initialisation. These factors can be handled by techniques that extract objects from image sequences, especially when phrased in terms of evidence gathering. As yet, the newer approaches to arbitrary-shape extraction avoid discretisation effects but do not include motion, while the moving-object evidence-gathering approach has yet to include arbitrary shapes and can require a high-order description for complex motions.
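Evidence gathering in this sense is Hough-transform-style accumulation: each image feature votes for every parameter value consistent with it, and peaks in the accumulator identify the extracted shape. The following minimal Python sketch illustrates the idea for straight lines only; the function name, discretisation choices, and NumPy dependency are our own assumptions, not taken from the thesis.

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=100, rho_max=10.0):
    """Accumulate evidence for lines rho = x*cos(theta) + y*sin(theta).

    Each point votes for all (theta, rho) pairs it could lie on;
    the accumulator peak identifies the best-supported line.
    """
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in points:
        # rho for this point at every candidate angle
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        bins = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (bins >= 0) & (bins < n_rho)
        acc[np.arange(n_theta)[ok], bins[ok]] += 1   # cast one vote per angle
    return acc, thetas

# Points on the vertical line x = 2: all votes coincide at theta = 0.
pts = [(2.0, y) for y in np.linspace(-3, 3, 25)]
acc, thetas = hough_lines(pts)
ti, ri = np.unravel_index(np.argmax(acc), acc.shape)
best_theta = thetas[ti]   # angle of the line's normal; 0 for a vertical line
```

The same accumulation principle extends to richer parameter spaces, which is why the dimensionality of the parameterisation dominates the computational cost.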
Since the template approach is proven for arbitrary shapes, we re-deploy it for moving arbitrary shapes, in a way designed to avoid discretisation problems. As the template approach has already been shown to reduce the computational demand of extracting arbitrary shapes, we further deploy it to describe the motion of moving arbitrary shapes. As with the shape templates, we use Fourier descriptors for the motion templates, yielding an integrated framework for the representation of shape and motion. This prior specification of motion avoids the need for an expensive parametric model to capture data that is already known. Furthermore, as the complexity of the motion increases, a parametric model would require ever more parameters, leading to a rapid and catastrophic increase in computational requirements, whereas the cost and complexity of the motion-template model are unchanged. The new approach, combining moving arbitrary-shape description with motion templates, achieves the objective of low-dimensionality extraction of arbitrarily moving arbitrary shapes, with a performance advantage reflected in the results the new technique achieves.
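To picture the shared shape/motion representation, recall the standard Fourier-descriptor encoding: a closed contour's boundary points are treated as complex numbers, and a few low-frequency Fourier coefficients summarise the shape compactly. The sketch below is a generic illustration of that encoding under our own assumptions (NumPy, helper names ours), not the thesis's implementation.

```python
import numpy as np

def fourier_descriptors(contour, n_coeffs):
    """Low-order Fourier descriptors of a closed 2D contour.

    contour: (N, 2) array of boundary points (x, y), ordered around the shape.
    Returns the n_coeffs lowest-frequency complex coefficients and their bins.
    """
    z = contour[:, 0] + 1j * contour[:, 1]     # encode (x, y) as x + iy
    coeffs = np.fft.fft(z) / len(z)            # normalised Fourier coefficients
    freqs = np.fft.fftfreq(len(z)) * len(z)    # integer frequency of each bin
    keep = np.argsort(np.abs(freqs))[:n_coeffs]  # lowest |frequency| first
    return coeffs[keep], keep

def reconstruct(coeffs, keep, n_points):
    """Rebuild an approximate contour from the selected descriptors."""
    full = np.zeros(n_points, dtype=complex)
    full[keep] = coeffs
    z = np.fft.ifft(full) * n_points
    return np.stack([z.real, z.imag], axis=1)

# A centred unit circle is captured exactly by its frequency-one descriptor.
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
fd, keep = fourier_descriptors(circle, 4)
approx = reconstruct(fd, keep, 64)
err = np.max(np.abs(approx - circle))   # near machine precision for a circle
```

Applying the same descriptor machinery to a trajectory over time is what lets shape and motion share one low-dimensional parameterisation, rather than paying for a separate parametric motion model.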
University of Southampton, Faculty of Engineering and the Environment
Deposited: 01 Feb 2012 14:31
Last modified: 18 Apr 2017 00:29