Direct differentiable augmentation search
Liu, Aoming, Huang, Zehao, Huang, Zhiwu and Wang, Naiyan (2021) Direct differentiable augmentation search. In International Conference on Computer Vision, pp. 12219-12228.
Record type: Conference or Workshop Item (Paper)
Abstract
Data augmentation has been an indispensable tool for improving the performance of deep neural networks; however, augmentation policies rarely transfer across tasks and datasets. Consequently, a recent trend is to adopt AutoML techniques to learn proper augmentation policies without extensive hand-crafted tuning. In this paper, we propose an efficient differentiable search algorithm called Direct Differentiable Augmentation Search (DDAS). It exploits meta-learning with a one-step gradient update and a continuous relaxation of the expected training loss for efficient search. DDAS achieves efficient augmentation search without relying on approximations such as Gumbel-Softmax or second-order gradients. To further reduce the adverse effect of improper augmentations, we organize the search space into a two-level hierarchy, in which we first decide whether to apply augmentation, and then determine the specific augmentation policy. On standard image classification benchmarks, DDAS achieves a state-of-the-art performance-efficiency tradeoff while reducing the search cost dramatically, e.g. 0.15 GPU hours for CIFAR-10. In addition, we also use DDAS to search augmentations for the object detection task, achieving performance comparable to AutoAugment [8] while being 1000× faster. Code will be released at https://github.com/zxcvfd13502/DDAS_code.
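As a rough illustration of the mechanism the abstract describes, the sketch below shows a single DDAS-style search step in PyTorch. This is a hypothetical reconstruction, not the authors' implementation (see their repository for that): the names ddas_step, apply_logits, policy_logits, and policies are assumptions, and a single joint gradient step stands in for the paper's one-step meta-update. What it does show is the continuous relaxation, where the expected training loss is a probability-weighted sum over the two-level search space (first "augment or not", then "which policy"), so the distribution parameters receive exact gradients without Gumbel-Softmax sampling.

    import torch
    import torch.nn.functional as F

    def ddas_step(model, opt_model, opt_aug, x, y,
                  policies, apply_logits, policy_logits):
        # Level 1: probability of applying any augmentation at all.
        p_apply = F.softmax(apply_logits, dim=0)   # [p_no_aug, p_aug]
        # Level 2: categorical distribution over candidate policies.
        p_policy = F.softmax(policy_logits, dim=0)

        # Continuous relaxation: the expected training loss is a
        # weighted sum over the discrete augmentation choices.
        loss = p_apply[0] * F.cross_entropy(model(x), y)   # no augmentation
        for w, aug in zip(p_policy, policies):             # each candidate policy
            loss = loss + p_apply[1] * w * F.cross_entropy(model(aug(x)), y)

        # One gradient step updates both the network weights and the
        # augmentation distribution parameters directly.
        opt_model.zero_grad()
        opt_aug.zero_grad()
        loss.backward()
        opt_model.step()
        opt_aug.step()
        return loss.item()

Here opt_aug would be an optimizer over [apply_logits, policy_logits] (both created with requires_grad=True), and each element of policies is a callable transform applied to the input batch. After search, the learned probabilities can be discretized, e.g. by keeping the highest-weighted policies.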
More information
Published date: 11 October 2021
Venue - Dates: International Conference on Computer Vision, 2021-10-11
Identifiers
Local EPrints ID: 501675
URI: http://eprints.soton.ac.uk/id/eprint/501675
PURE UUID: 5cd2b2fb-505d-4159-a287-529564047dcc
Catalogue record
Date deposited: 05 Jun 2025 16:55
Last modified: 06 Jun 2025 02:06
Contributors
Author: Aoming Liu
Author: Zehao Huang
Author: Zhiwu Huang
Author: Naiyan Wang