Recurrent graph convolutional multi-mesh autoencoder for unsteady transonic aerodynamics
Massegur, David
Da Ronch, Andrea
28 October 2024
Massegur, David and Da Ronch, Andrea (2024) Recurrent graph convolutional multi-mesh autoencoder for unsteady transonic aerodynamics. Journal of Fluids and Structures, 131, [104202]. (doi:10.1016/j.jfluidstructs.2024.104202)
Abstract
Unsteady, high-fidelity aerodynamic load predictions around a three-dimensional configuration will remain computationally expensive for the foreseeable future. Data-driven algorithms based on deep learning are an attractive option for reduced-order modelling of complex, nonlinear systems. However, a dedicated approach is needed for applicability to the large, unstructured domains that are typical in engineering. This work presents a geometric-deep-learning multi-mesh autoencoder framework to predict the spatial and temporal evolution of aerodynamic loads for a finite-span wing undergoing different types of motion. The novel framework leverages: (a) graph neural networks for aerodynamic surface grids, embedded with a multi-resolution algorithm for dimensionality reduction; and (b) a recurrent scheme for time-marching the aerodynamic loads. The test case is the BSCW wing in transonic flow undergoing a combination of forced motions in pitch and plunge. A comprehensive comparison between a quasi-steady and a recurrent approach is provided. The model training requires four unsteady, high-fidelity aerodynamic analyses, each requiring about two days of HPC computing time. For any common engineering task that involves more than four cases, a clear benefit in computing cost is achieved by using the proposed framework as an alternative predictive tool: new cases are computed in seconds on a standard GPU.
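To make the architecture described in the abstract concrete, the following is a minimal sketch of the general idea: a graph-convolutional encoder compresses the nodal loads on a surface mesh to a low-dimensional latent vector, a recurrent cell marches that vector in time, and a decoder maps it back to nodal loads. This is an illustrative reconstruction under stated assumptions, not the paper's implementation: the dense row-normalised adjacency, the GRU cell, the flatten-to-latent step, and all class and parameter names are hypothetical, and the paper's multi-mesh (multi-resolution) pooling between graph levels is omitted for brevity.

```python
# Hypothetical sketch of a recurrent graph-convolutional autoencoder;
# NOT the authors' code. Assumes a dense row-normalised adjacency and
# a GRU cell as the recurrent time-marching scheme.
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    """One graph convolution: average neighbour features through a
    normalised adjacency, then apply a shared linear map."""
    def __init__(self, in_dim, out_dim, activate=True):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.activate = activate

    def forward(self, x, adj):
        # x: (N, in_dim) nodal features; adj: (N, N) row-normalised
        y = self.lin(adj @ x)
        return torch.relu(y) if self.activate else y

class RecurrentGraphAE(nn.Module):
    """Graph-convolutional autoencoder whose latent vector is marched
    in time by a recurrent cell, then decoded back to nodal loads."""
    def __init__(self, n_nodes, feat_dim, hid_dim=32, lat_dim=16):
        super().__init__()
        self.n_nodes, self.hid_dim, self.lat_dim = n_nodes, hid_dim, lat_dim
        self.enc = GraphConv(feat_dim, hid_dim)
        self.to_latent = nn.Linear(n_nodes * hid_dim, lat_dim)
        self.gru = nn.GRUCell(lat_dim, lat_dim)   # recurrent time-marching
        self.from_latent = nn.Linear(lat_dim, n_nodes * hid_dim)
        self.dec = GraphConv(hid_dim, feat_dim, activate=False)

    def forward(self, x_seq, adj):
        # x_seq: (T, N, F) sequence of surface-load snapshots
        h = x_seq.new_zeros(1, self.lat_dim)      # initial latent state
        outs = []
        for x in x_seq:
            g = self.enc(x, adj)                  # (N, hid) node embeddings
            z = self.to_latent(g.reshape(1, -1))  # compress to latent vector
            h = self.gru(z, h)                    # advance latent state in time
            g = self.from_latent(h).reshape(self.n_nodes, self.hid_dim)
            outs.append(self.dec(g, adj))         # reconstruct nodal loads
        return torch.stack(outs)                  # (T, N, F) predicted loads

# Toy usage: 10 time steps on a random 50-node surface graph with 4
# load components per node (e.g. pressure plus three skin-friction terms).
N, F, T = 50, 4, 10
adj = torch.rand(N, N)
adj = adj / adj.sum(dim=1, keepdim=True)          # row-normalise
model = RecurrentGraphAE(N, F)
pred = model(torch.randn(T, N, F), adj)
print(pred.shape)                                 # torch.Size([10, 50, 4])
```

In the paper, a multi-resolution hierarchy of graph convolutions performs the dimensionality reduction; the naive flatten-to-latent step above is a placeholder that only illustrates the encode, march, decode loop.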
Text: 1-s2.0-S0889974624001373-main (Version of Record)
More information
Accepted/In Press date: 8 October 2024
e-pub ahead of print date: 28 October 2024
Published date: 28 October 2024
Identifiers
Local EPrints ID: 502928
URI: http://eprints.soton.ac.uk/id/eprint/502928
ISSN: 0889-9746
PURE UUID: 090c1690-599e-4e47-b369-23b32083ecf8
Catalogue record
Date deposited: 14 Jul 2025 16:38
Last modified: 22 Aug 2025 02:30
Contributors
Author:
David Massegur
Author:
Andrea Da Ronch