Extremely randomized neural networks for constructing prediction intervals
Mancini, Tullio, Calvo-Pardo, Hector and Olmo, Jose (2021) Extremely randomized neural networks for constructing prediction intervals. Neural Networks, 144, 113-128. (doi:10.1016/j.neunet.2021.08.020)
Abstract
This paper proposes a novel prediction model based on an ensemble of deep neural networks that adapts the extremely randomized trees method originally developed for random forests. The extra randomness introduced in the ensemble reduces the variance of the predictions and improves out-of-sample accuracy. As a byproduct, we can quantify the uncertainty of the model predictions and construct interval forecasts. Because no data resampling is performed, the method overcomes some of the limitations of bootstrap-based algorithms and remains suitable in low- and mid-dimensional settings, or when the i.i.d. assumption does not hold. An extensive Monte Carlo simulation exercise demonstrates the good performance of the method in terms of mean square prediction error and the accuracy of its prediction intervals, measured by out-of-sample prediction interval coverage probabilities. The proposed approach delivers better out-of-sample accuracy in experimental settings, improving upon state-of-the-art methods such as MC dropout and bootstrap procedures.
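The abstract describes the construction at a high level: train an ensemble of networks whose architectures and initialisations are randomised, rather than resampling the data, then read point forecasts and interval forecasts off the ensemble. Below is a minimal sketch of that idea, assuming scikit-learn's MLPRegressor as the base learner and toy data; the ensemble size, hyperparameter ranges and the simple quantile-based interval are illustrative assumptions, not the authors' exact construction.

```python
# Minimal sketch, not the authors' implementation: an ensemble of neural
# networks with randomised architectures and seeds, all trained on the same
# sample (no bootstrap resampling), with a naive interval read off the spread
# of the member predictions. Data, ensemble size and hyperparameter ranges
# below are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration).
X = rng.uniform(-3.0, 3.0, size=(500, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.3, size=500)
X_new = rng.uniform(-3.0, 3.0, size=(100, 4))

M = 30  # ensemble size (illustrative)
members = []
for _ in range(M):
    # Extra randomness: each member draws its own depth, width and seed,
    # but every member sees the full training sample.
    depth = int(rng.integers(1, 3))    # 1 or 2 hidden layers
    width = int(rng.integers(16, 65))  # 16-64 units per layer
    net = MLPRegressor(hidden_layer_sizes=(width,) * depth,
                       random_state=int(rng.integers(1_000_000)),
                       max_iter=2000)
    members.append(net.fit(X, y))

# Member predictions stacked into shape (M, n_new).
preds = np.stack([net.predict(X_new) for net in members])

point = preds.mean(axis=0)  # ensemble point forecast
# Simple 95% band from the ensemble spread; it reflects model uncertainty
# only and is not the interval construction used in the paper.
lower, upper = np.percentile(preds, [2.5, 97.5], axis=0)

print(point[:3], lower[:3], upper[:3])
```

Because every member is fit to the full sample, the spread of the ensemble comes purely from the randomisation of the learners, which is what lets this style of method sidestep the resampling-related limitations mentioned in the abstract.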
Text: Ensemble_MCPO (Accepted Manuscript)
More information
Accepted/In Press date: 13 August 2021
e-pub ahead of print date: 19 August 2021
Published date: December 2021
Keywords: Dropout, Ensemble methods, Neural networks, Prediction interval, Uncertainty quantification
Identifiers
Local EPrints ID: 453786
URI: http://eprints.soton.ac.uk/id/eprint/453786
ISSN: 0893-6080
PURE UUID: c9fd015d-7057-4128-890e-8ef9a35e9cc7
Catalogue record
Date deposited: 24 Jan 2022 17:48
Last modified: 17 Mar 2024 06:51
Contributors
Author: Tullio Mancini (PURE UUID: 3e5a59a2-e184-4996-a7d6-7b4394bec08c)
Author: Hector Calvo-Pardo (PURE UUID: 07a586f0-48ec-4049-932e-fb9fc575f59f)
Author: Jose Olmo (PURE UUID: 706f68c8-f991-4959-8245-6657a591056e)