MonoFlow: rethinking divergence GANs via the perspective of differential equations
Pages: 39984–40000
Publisher: Association for Computing Machinery
Yi, Mingxuan, Zhu, Zhanxing and Liu, Song
(2023)
MonoFlow: rethinking divergence GANs via the perspective of differential equations.
Krause, Andreas, Brunskill, Emma, Cho, Kyunghyun, Engelhardt, Barbara, Sabato, Sivan and Scarlett, Jonathan
(eds.)
In Proceedings of the 40th International Conference on Machine Learning (ICML).
Association for Computing Machinery.
(doi:10.5555/3618408.3620079).
Record type: Conference or Workshop Item (Paper)
Abstract
The conventional understanding of adversarial training in generative adversarial networks (GANs) is that the discriminator is trained to estimate a divergence and the generator learns to minimize this divergence. We argue that, although many variants of GANs were developed following this paradigm, the current theoretical understanding of GANs and their practical algorithms are inconsistent. In this paper, we leverage Wasserstein gradient flows, which characterize the evolution of particles in the sample space, to gain theoretical insights and algorithmic inspiration for GANs. We introduce a unified generative modeling framework, MonoFlow: the particle evolution is rescaled via a monotonically increasing mapping of the log density ratio. Under our framework, adversarial training can be viewed as a procedure that first obtains MonoFlow's vector field by training the discriminator, after which the generator learns to draw the particle flow defined by that vector field. We also reveal the fundamental difference between variational divergence minimization and adversarial training. This analysis helps us identify which types of generator loss functions lead to successful GAN training, and it suggests that GANs admit more loss designs than appear in the literature (e.g., the non-saturating loss), as long as they realize MonoFlow. Consistent empirical studies are included to validate the effectiveness of our framework.
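To make the abstract's idea concrete, below is a minimal toy sketch of a MonoFlow-style particle flow. It is not the authors' implementation, and it makes two labeled assumptions: (1) the flow has the form dx/dt = h'(log r(x)) · ∇ log r(x), i.e. the Wasserstein gradient flow rescaled through a monotonically increasing mapping h of the log density ratio (here h = softplus, so h' = sigmoid); (2) whereas a GAN's discriminator would estimate log r at each step, here the ratio between the data distribution and the current particles is computed in closed form by approximating the particles as a unit-variance Gaussian.

```python
import numpy as np

def sigmoid(u):
    # h' for h = softplus: a positive, bounded rescaling of the flow
    return 1.0 / (1.0 + np.exp(-u))

MU_DATA = 2.0                       # target: p_data = N(2, 1)
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 500)       # particles start at p_0 = N(0, 1)

for _ in range(300):
    m = x.mean()                    # stand-in for retraining the discriminator:
                                    # refit the ratio against N(m, 1)
    # log r(x) = log N(x; MU_DATA, 1) - log N(x; m, 1), and its x-gradient
    log_r = (MU_DATA - m) * x + 0.5 * (m**2 - MU_DATA**2)
    grad_log_r = MU_DATA - m
    # Euler step along the rescaled vector field h'(log r) * grad log r
    x = x + 0.1 * sigmoid(log_r) * grad_log_r

# The particle mean has flowed to (approximately) the data mean of 2.0.
```

Because h is monotonically increasing, h' stays positive, so rescaling never reverses the direction of the underlying gradient flow; any such h yields particles that drift toward the data distribution, which is the abstract's point about the breadth of admissible generator losses.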
This record has no associated files available for download.
More information
Published date: 23 July 2023
Identifiers
Local EPrints ID: 486549
URI: http://eprints.soton.ac.uk/id/eprint/486549
PURE UUID: 8037fe6c-82c3-4213-b46b-7fc92654c56a
Catalogue record
Date deposited: 25 Jan 2024 17:35
Last modified: 17 Mar 2024 06:51
Contributors
Author: Mingxuan Yi
Author: Zhanxing Zhu
Author: Song Liu
Editor: Andreas Krause
Editor: Emma Brunskill
Editor: Kyunghyun Cho
Editor: Barbara Engelhardt
Editor: Sivan Sabato
Editor: Jonathan Scarlett