Stochastic fractional Hamiltonian Monte Carlo
Ye, Nanyang and Zhu, Zhanxing (2018) Stochastic fractional Hamiltonian Monte Carlo. In Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, pp. 3019-3025. (doi:10.24963/ijcai.2018/419).
Record type: Conference or Workshop Item (Paper)
Abstract
In this paper, we propose a novel stochastic fractional Hamiltonian Monte Carlo approach that generalizes the Hamiltonian Monte Carlo method within the framework of fractional calculus and Lévy diffusion. Owing to the large "jumps" introduced by the Lévy noise and the momentum term, the proposed dynamics can explore the parameter space more efficiently and effectively. We show that fractional Hamiltonian Monte Carlo can sample multi-modal and high-dimensional target distributions more efficiently than existing methods driven by Brownian diffusion. We further extend our method to the optimization of deep neural networks. Experimental results show that, when training deep neural networks, the proposed stochastic fractional Hamiltonian Monte Carlo converges faster than other popular optimization schemes and generalizes better.
This record has no associated files available for download.
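Since the full paper is not attached to this record, the sketch below is only an illustrative reading of the abstract, not the authors' algorithm: it swaps the Gaussian momentum noise of stochastic-gradient HMC for heavy-tailed symmetric alpha-stable (Lévy) noise. The alpha-stable sampler uses the standard Chambers-Mallows-Stuck construction; the update rule, the step size eta, the friction gamma, and the stability index alpha are illustrative assumptions rather than the paper's actual discretization.

import numpy as np

def symmetric_alpha_stable(alpha, size, rng):
    """Draw symmetric alpha-stable noise via the Chambers-Mallows-Stuck method."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform "angle" variable
    w = rng.exponential(1.0, size)                 # unit-rate exponential variable
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

def sfhmc_step(theta, momentum, stoch_grad, eta=1e-3, gamma=0.1, alpha=1.7, rng=None):
    """One Euler-style update of Levy-driven Hamiltonian dynamics (illustrative only).

    theta      : current parameters (np.ndarray)
    momentum   : auxiliary momentum variable (np.ndarray)
    stoch_grad : callable returning a stochastic gradient of the potential U(theta)
    alpha      : stability index in (1, 2]; alpha = 2 recovers Gaussian (Brownian) noise
    """
    if rng is None:
        rng = np.random.default_rng()
    noise = symmetric_alpha_stable(alpha, theta.shape, rng)
    # Momentum update: gradient force, friction, and heavy-tailed Levy kicks.
    momentum = (momentum - eta * stoch_grad(theta) - eta * gamma * momentum
                + eta ** (1.0 / alpha) * noise)
    # Parameter update driven by the momentum.
    theta = theta + eta * momentum
    return theta, momentum

# Example: sample from a 2-D standard Gaussian potential U(x) = 0.5 * ||x||^2,
# whose gradient is simply x.
theta, p = np.zeros(2), np.zeros(2)
for _ in range(2000):
    theta, p = sfhmc_step(theta, p, stoch_grad=lambda x: x, alpha=1.7)

Setting alpha = 2 reduces the Lévy increments to Gaussian ones, i.e., the Brownian-diffusion case the abstract compares against.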
More information
Published date: 2018
Identifiers
Local EPrints ID: 486249
URI: http://eprints.soton.ac.uk/id/eprint/486249
PURE UUID: cac6db4c-65dd-4370-ae73-5c39faaaa0ae
Catalogue record
Date deposited: 16 Jan 2024 17:31
Last modified: 17 Mar 2024 06:51
Contributors
Author: Nanyang Ye
Author: Zhanxing Zhu