Arbitrary-scale texture generation from coarse-grained control
Gan, Yanhai, Gao, Feng, Dong, Junyu and Chen, Sheng
(2022)
Arbitrary-scale texture generation from coarse-grained control.
IEEE Transactions on Image Processing, 31, 5841-5855.
(doi:10.1109/TIP.2022.3201710).
Abstract
Existing deep-network based texture synthesis approaches all focus on fine-grained control of texture generation by synthesizing images from exemplars. Because the networks employed by most of these methods are tied to individual exemplar textures, a large number of separate networks have to be trained to model a variety of textures. In this paper, we propose to generate textures directly from coarse-grained control or high-level guidance, such as texture categories, perceptual attributes and semantic descriptions. We fulfill the task by parsing the generation process of a texture into a three-level Bayesian hierarchical model. A coarse-grained signal first determines a distribution over Markov random fields. A Markov random field is then used to model the distribution of the final output textures. Finally, an output texture is generated from the sampled Markov random field distribution. At the bottom level of the Bayesian hierarchy, the isotropic and ergodic characteristics of textures favor a construction that consists of a fully convolutional network. The proposed method integrates texture creation and texture synthesis into one pipeline for real-time texture generation, and enables users to readily obtain diverse textures at arbitrary scales from high-level guidance only. Extensive experiments demonstrate that the proposed method is capable of generating plausible textures that are faithful to user-defined control, and of achieving impressive texture metamorphosis by interpolation in the learned texture manifold.
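The abstract describes a generative pipeline in which a coarse-grained control signal (a category, attribute vector or description embedding) is mapped to a latent code that parameterizes a Markov-random-field-like texture distribution, and a fully convolutional network then decodes that code, together with a noise field, into a texture whose size is set only by the size of the noise field. The snippet below is a minimal PyTorch-style sketch of this idea, not the authors' implementation: the module names, layer widths, and the Gaussian re-parameterization used for sampling the latent code are illustrative assumptions.

import torch
import torch.nn as nn


class ControlToLatent(nn.Module):
    """Top/middle levels of the hierarchy (sketch): map a coarse-grained
    control vector to a Gaussian over latent codes and sample one code.
    The re-parameterization trick used here is an assumption for illustration."""

    def __init__(self, control_dim: int, latent_dim: int):
        super().__init__()
        self.mu = nn.Linear(control_dim, latent_dim)
        self.log_var = nn.Linear(control_dim, latent_dim)

    def forward(self, control: torch.Tensor) -> torch.Tensor:
        mu, log_var = self.mu(control), self.log_var(control)
        return mu + torch.randn_like(mu) * torch.exp(0.5 * log_var)


class FullyConvTextureGenerator(nn.Module):
    """Bottom level (sketch): a fully convolutional decoder. With no fully
    connected layers, the spatial size of the input noise field alone fixes
    the output resolution, which is what allows arbitrary-scale synthesis."""

    def __init__(self, latent_dim: int, noise_ch: int = 8):
        super().__init__()
        self.noise_ch = noise_ch
        self.net = nn.Sequential(
            nn.Conv2d(latent_dim + noise_ch, 128, 3, padding=1), nn.ReLU(),
            nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, code: torch.Tensor, height: int, width: int) -> torch.Tensor:
        b = code.shape[0]
        noise = torch.randn(b, self.noise_ch, height, width, device=code.device)
        tiled = code[:, :, None, None].expand(-1, -1, height, width)  # broadcast code spatially
        return self.net(torch.cat([tiled, noise], dim=1))


# Usage: one hypothetical 32-dim control vector, two output scales from the same code.
control = torch.randn(1, 32)
code = ControlToLatent(32, 16)(control)
generator = FullyConvTextureGenerator(16)
print(generator(code, 128, 128).shape)  # torch.Size([1, 3, 128, 128])
print(generator(code, 512, 512).shape)  # torch.Size([1, 3, 512, 512])

Because the same latent code is reused at both scales, the two outputs share texture statistics while differing only in the particular noise realization, which mirrors the arbitrary-scale property claimed in the abstract.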
Text: TextureGeneration-F - Accepted Manuscript
Text: TIP2022-Sept - Version of Record (Restricted to Repository staff only)
More information
Accepted/In Press date: 13 August 2022
Published date: 9 September 2022
Additional Information:
Funding Information:
This work was supported by the National Key Research and Development Program of China under Grant 2018AAA0100602.
Publisher Copyright:
© 1992-2012 IEEE.
Keywords:
Bayes methods, Bayesian hierarchy, Markov random field, Markov random fields, Pipelines, Real-time systems, Solid modeling, Task analysis, Texture synthesis, Visualization, fully convolutional network
Identifiers
Local EPrints ID: 469929
URI: http://eprints.soton.ac.uk/id/eprint/469929
ISSN: 1057-7149
PURE UUID: 878e80a7-c53b-4798-8db5-aa449fd70838
Catalogue record
Date deposited: 28 Sep 2022 17:08
Last modified: 16 Mar 2024 21:45
Contributors
Author: Yanhai Gan
Author: Feng Gao
Author: Junyu Dong
Author: Sheng Chen