University of Southampton Institutional Repository

An empirical review of uncertainty estimation for quality control in CAD model segmentation

Vidanes, Gerico, Toal, David, Keane, Andy, Zhang, Xu, Nunez, Marco and Gregory, Jonathan (2025) An empirical review of uncertainty estimation for quality control in CAD model segmentation. Iliadis, Lazaros, Maglogiannis, Ilias, Kyriacou, Efthyvoulos and Jayne, Chrisina (eds.) In Engineering Applications of Neural Networks: 26th International Conference, EANN 2025, Limassol, Cyprus, June 26–29, 2025, Proceedings, Part I. vol. 2581, Springer Cham. pp. 45-58. (doi:10.1007/978-3-031-96196-0_4).

Record type: Conference or Workshop Item (Paper)

Abstract

Deep neural networks are able to achieve high accuracy in semantic segmentation of geometries used in computational engineering. Being able to recognise abstract and sometimes hard-to-describe geometric features has applications for automated simulation, model simplification, structural failure analysis, meshing, and additive manufacturing. However, for these systems to be integrated into engineering workflows, they must provide some measure of predictive uncertainty such that engineers can reason about and trust their outputs. This work presents an empirical study of practical uncertainty estimation techniques that can be used with pre-trained neural networks for the task of boundary representation model segmentation. A point-based graph neural network is used as a base. Monte-Carlo Dropout (MCD), Deep Ensembles, test-time input augmentation, and post-processing calibration are evaluated for segmentation quality control. The Deep Ensemble technique is found to perform best: the error of a human-in-the-loop system across a dataset can be reduced from 3.8% to 0.7% for MFCAD++ and from 16% to 11% for Fusion360 Gallery when 10% of the most uncertain predictions are flagged for manual correction. Models trained on only 5% of the MFCAD++ dataset were also tested, with the uncertainty estimation technique reducing the error from 9.4% to 4.3% with 10% of predictions flagged. Additionally, a point-based input augmentation is presented which, when combined with MCD, is competitive with the Deep Ensemble while having lower computational requirements.
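The quality-control scheme described above (flag the 10% most uncertain predictions for manual correction) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `predictive_entropy` helper, the use of entropy of the mean softmax as the uncertainty score, and the toy random data are all assumptions. The same aggregation applies whether the stacked probability tensors come from MC-Dropout passes or Deep Ensemble members.

```python
import numpy as np

def predictive_entropy(probs):
    # probs: (T, N, C) — T stochastic forward passes (MC-Dropout samples
    # or ensemble members), N predictions, C classes.
    # Average the softmax outputs over passes, then take the entropy of
    # the mean distribution as a per-prediction uncertainty score.
    mean_p = probs.mean(axis=0)
    return -(mean_p * np.log(mean_p + 1e-12)).sum(axis=1)

def flag_most_uncertain(probs, frac=0.10):
    # Indices of the top-`frac` most uncertain predictions, to be
    # routed to a human for manual correction.
    h = predictive_entropy(probs)
    k = max(1, int(np.ceil(frac * h.shape[0])))
    return np.argsort(h)[-k:]

# Toy example: 8 stochastic passes over 100 predictions with 4 classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 100, 4))
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
flagged = flag_most_uncertain(probs, frac=0.10)  # 10 indices flagged
```

With a fixed review budget, the dataset-level error then depends only on how well the uncertainty score ranks the model's actual mistakes, which is what the paper's comparison of MCD, ensembles, and augmentation evaluates.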

Text
GV_uncertaintypaper_250325 - Accepted Manuscript
Restricted to Repository staff only until 22 June 2026.

More information

Published date: 22 June 2025

Identifiers

Local EPrints ID: 503318
URI: http://eprints.soton.ac.uk/id/eprint/503318
ISSN: 1865-0929
PURE UUID: ea91938c-9d95-4376-b7b1-ade484ed57f3
ORCID for David Toal: orcid.org/0000-0002-2203-0302
ORCID for Andy Keane: orcid.org/0000-0001-7993-1569
ORCID for Xu Zhang: orcid.org/0000-0002-6918-1861

Catalogue record

Date deposited: 29 Jul 2025 16:37
Last modified: 30 Jul 2025 01:42

Contributors

Author: Gerico Vidanes
Author: David Toal
Author: Andy Keane
Author: Xu Zhang
Author: Marco Nunez
Author: Jonathan Gregory
Editor: Lazaros Iliadis
Editor: Ilias Maglogiannis
Editor: Efthyvoulos Kyriacou
Editor: Chrisina Jayne
