Wasserstein divergence for GANs
Pages: 673-688
7 October 2018
Wu, Jiqing, Huang, Zhiwu, Thoma, Janine, Acharya, Dinesh and Van Gool, Luc (2018) Wasserstein divergence for GANs. In: Ferrari, Vittorio, Hebert, Martial, Sminchisescu, Cristian and Weiss, Yair (eds.) Computer Vision – ECCV 2018, vol. 11209, Springer, Cham, pp. 673-688. (doi:10.1007/978-3-030-01228-1_40)
Record type: Conference or Workshop Item (Paper)
Abstract
In many domains of computer vision, generative adversarial networks (GANs) have achieved great success. Among them, the family of Wasserstein GANs (WGANs) is considered state-of-the-art, owing to its theoretical contributions and competitive qualitative performance. However, it is very challenging to approximate the k-Lipschitz constraint required by the Wasserstein-1 metric (W-met). In this paper, we propose a novel Wasserstein divergence (W-div), a relaxed version of W-met that does not require the k-Lipschitz constraint. As a concrete application, we introduce a Wasserstein divergence objective for GANs (WGAN-div), which faithfully approximates W-div through optimization. Under various settings, including progressive growing training, we demonstrate the stability of the proposed WGAN-div owing to its theoretical and practical advantages over WGANs. We also study the quantitative and visual performance of WGAN-div on standard image synthesis benchmarks, showing its superiority over state-of-the-art methods.
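As a hedged sketch of the idea described in the abstract (the notation below is assumed for this summary and is not quoted from the record: f denotes the critic, P_r and P_g the real and generated distributions, P_u a distribution of points at which the gradient term is evaluated, and k, p hyperparameters of the divergence), a WGAN-div-style critic objective replaces the hard k-Lipschitz constraint with a gradient-norm power term:

```latex
% Sketch of a W-div-style critic objective (assumed notation):
% f is the critic, P_r the real and P_g the generated distribution,
% P_u a sampling distribution for the gradient term, k > 0 and p > 1
% hyperparameters of the divergence.
L(f) \;=\; \mathbb{E}_{x \sim P_r}\!\big[f(x)\big]
      \;-\; \mathbb{E}_{\tilde{x} \sim P_g}\!\big[f(\tilde{x})\big]
      \;+\; k\,\mathbb{E}_{\hat{x} \sim P_u}\!\big[\lVert \nabla_{\hat{x}} f(\hat{x}) \rVert^{p}\big]
```

Unlike gradient-penalty formulations that push the critic's gradient norm toward a fixed constant to enforce a Lipschitz constraint, a term of this form only penalizes large gradients, which is consistent with the abstract's claim that W-div does not require the k-Lipschitz constraint.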
This record has no associated files available for download.
More information
Published date: 7 October 2018
Identifiers
Local EPrints ID: 501237
URI: http://eprints.soton.ac.uk/id/eprint/501237
ISSN: 0302-9743
PURE UUID: ed8af227-7b27-478a-a5f5-c77a474c7d02
Catalogue record
Date deposited: 27 May 2025 18:06
Last modified: 28 May 2025 02:12
Contributors
Author: Jiqing Wu
Author: Zhiwu Huang
Author: Janine Thoma
Author: Dinesh Acharya
Author: Luc Van Gool
Editor: Vittorio Ferrari
Editor: Martial Hebert
Editor: Cristian Sminchisescu
Editor: Yair Weiss