University of Southampton Institutional Repository

Pothole detection based on disparity transformation and road surface modeling

Fan, Rui, Ozgunalp, Umar, Hosking, Brett, Liu, Ming and Pitas, Ioannis (2019) Pothole detection based on disparity transformation and road surface modeling. IEEE Transactions on Image Processing, 29, 897-908. (doi:10.1109/TIP.2019.2933750).

Record type: Article

Abstract

Pothole detection is one of the most important tasks for road maintenance. Computer vision approaches are generally based on either 2D road image analysis or 3D road surface modeling. However, these two categories have typically been used independently, and pothole detection accuracy remains far from satisfactory. Therefore, in this paper, we present a robust pothole detection algorithm that is both accurate and computationally efficient. A dense disparity map is first transformed to better distinguish between damaged and undamaged road areas. To make the disparity transformation more efficient, golden section search and dynamic programming are used to estimate the transformation parameters. Otsu's thresholding method is then applied to extract potential undamaged road areas from the transformed disparity map. The disparities in the extracted areas are modeled by a quadratic surface using least squares fitting. To improve the robustness of the disparity map modeling, the surface normal information is also integrated into the surface modeling process, and random sample consensus (RANSAC) is used to reduce the effect of outliers. By comparing the difference between the actual and modeled disparity maps, potholes can be detected accurately. Finally, the point clouds of the detected potholes are extracted from the reconstructed 3D road surface. Experimental results show that the successful detection accuracy of the proposed system is around 98.7% and the overall pixel-level accuracy is approximately 99.6%.
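
For illustration, the Python sketch below follows the general pipeline described in the abstract under simplifying assumptions: it takes a disparity map that is assumed to have already been transformed, applies Otsu's thresholding to obtain candidate undamaged road pixels, fits a quadratic road surface to those pixels with least squares inside a basic RANSAC loop, and labels pixels whose actual disparity falls well below the modeled surface as pothole pixels. The function name detect_potholes, the thresholds n_iters, inlier_tol, and pothole_tol, and the residual sign convention are illustrative choices, not the authors' implementation; the golden-section-search/dynamic-programming parameter estimation and the surface-normal integration from the paper are omitted here.

# Minimal sketch (not the authors' implementation) of the abstract's pipeline:
# Otsu thresholding -> robust quadratic surface fit -> disparity-difference check.
import numpy as np
import cv2

def detect_potholes(disparity, n_iters=200, inlier_tol=1.0, pothole_tol=3.0):
    """disparity: float32 H x W (assumed already transformed) disparity map."""
    h, w = disparity.shape

    # 1) Otsu's thresholding to extract candidate undamaged road pixels.
    disp_u8 = cv2.normalize(disparity, None, 0, 255,
                            cv2.NORM_MINMAX).astype(np.uint8)
    _, road_mask = cv2.threshold(disp_u8, 0, 255,
                                 cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    v, u = np.nonzero(road_mask)          # v = row index, u = column index
    d = disparity[v, u]
    # Quadratic model: d ~ a0 + a1*u + a2*v + a3*u^2 + a4*u*v + a5*v^2
    A = np.column_stack([np.ones_like(u, dtype=np.float64),
                         u, v, u**2, u * v, v**2]).astype(np.float64)

    # 2) RANSAC + least squares for a robust quadratic road surface model.
    rng = np.random.default_rng(0)
    best_inliers = None
    for _ in range(n_iters):
        idx = rng.choice(len(d), size=6, replace=False)
        coeffs, *_ = np.linalg.lstsq(A[idx], d[idx], rcond=None)
        inliers = np.abs(A @ coeffs - d) < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    coeffs, *_ = np.linalg.lstsq(A[best_inliers], d[best_inliers], rcond=None)

    # 3) Compare the actual and modeled disparity maps over the whole image;
    #    pixels lying well below the modeled road surface are flagged as potholes.
    uu, vv = np.meshgrid(np.arange(w, dtype=np.float64),
                         np.arange(h, dtype=np.float64))
    modeled = (coeffs[0] + coeffs[1] * uu + coeffs[2] * vv
               + coeffs[3] * uu**2 + coeffs[4] * uu * vv + coeffs[5] * vv**2)
    return (modeled - disparity) > pothole_tol

In practice the returned binary mask would be cleaned with connected-component analysis before the pothole point clouds are extracted from the reconstructed 3D road surface, as the paper describes.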

This record has no associated files available for download.

More information

Accepted/In Press date: 24 July 2019
e-pub ahead of print date: 22 August 2019

Identifiers

Local EPrints ID: 435964
URI: http://eprints.soton.ac.uk/id/eprint/435964
ISSN: 1057-7149
PURE UUID: b809b91b-b41b-4d49-a40d-1878a25e8af8

Catalogue record

Date deposited: 25 Nov 2019 17:30
Last modified: 16 Mar 2024 05:16

Contributors

Author: Rui Fan
Author: Umar Ozgunalp
Author: Brett Hosking
Author: Ming Liu
Author: Ioannis Pitas
