Texture Mixer: A Network for Controllable Synthesis and Interpolation of Texture

CVPR 2019


Ning Yu1,2,4      Connelly Barnes3,4      Eli Shechtman3      Sohrab Amirghodsi3      Michal Lukáč3
1. University of Maryland      2. Max Planck Institute for Informatics      3. Adobe Research      4. University of Virginia

Abstract


This paper addresses the problem of interpolating visual textures. We formulate this problem by requiring (1) by-example controllability and (2) realistic and smooth interpolation among an arbitrary number of texture samples. To solve it, we propose a neural network trained simultaneously on a reconstruction task and a generation task: it projects texture examples onto a latent space where they can be linearly interpolated and projected back onto the image domain, thus ensuring both intuitive control and realistic results. We show that our method outperforms a number of baselines according to a comprehensive suite of metrics as well as a user study. We further demonstrate several applications of our technique, including texture brush, texture dissolve, and animal hybridization.
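To make the interpolation idea concrete, below is a minimal Python/PyTorch sketch of latent-space texture interpolation. The encoder and decoder arguments are hypothetical stand-ins for the trained projection networks; the actual Texture Mixer pipeline is more involved (it also shuffles and tiles latent tensors), which this sketch omits.

import torch

def interpolate_textures(encoder, decoder, textures, weights):
    """Project texture samples onto the latent space, blend them
    linearly, and project the blend back onto the image domain.

    textures: list of image tensors of shape (1, 3, H, W)
    weights:  list of floats summing to 1, one per texture
    """
    assert len(textures) == len(weights)
    # Encode each example onto the latent manifold.
    latents = [encoder(t) for t in textures]
    # Linearly interpolate among an arbitrary number of samples.
    blended = sum(w * z for w, z in zip(weights, latents))
    # Decode the interpolated latent back into a texture image.
    return decoder(blended)

# Example: a 50/50 blend of two texture samples.
# result = interpolate_textures(enc, dec, [tex_a, tex_b], [0.5, 0.5])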

Demos

Texture Interpolation 128x1024

Texture Dissolve 1024x1024


Texture Brush 512x2048



Animal Hybridization



Materials




Paper



Poster

Code

Press coverage


QbitAI Academia News

Citation

@inproceedings{yu2019texture,
  author = {Yu, Ning and Barnes, Connelly and Shechtman, Eli and Amirghodsi, Sohrab and Luk\'{a}\v{c}, Michal},
  title = {Texture Mixer: A Network for Controllable Synthesis and Interpolation of Texture},
  booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = {2019}
}

Acknowledgement


This research was supported by Adobe Research funding. We thank the Flickr photographers for licensing their photos under Creative Commons or placing them in the public domain.

Related Work


T. Karras, T. Aila, S. Laine, J. Lehtinen. Progressive Growing of GANs for Improved Quality, Stability, and Variation. ICLR 2018.
Comment: The state-of-the-art GAN architecture that serves as the backbone of our implementation.
U. Bergmann, N. Jetchev, R. Vollgraf. Learning Texture Manifolds with the Periodic Spatial GAN. ICML 2017.
Comment: A texture synthesis baseline method that is based on GAN manifold interpolation and does not support user control.
S. Darabi, E. Shechtman, C. Barnes, D. Goldman, P. Sen. Image Melding: Combining Inconsistent Images Using Patch-based Synthesis. SIGGRAPH 2012.
Comment: A texture synthesis baseline method that predates the deep learning era; it uses hand-crafted features and is inefficient because it requires per-example optimization.
Y. Li, C. Fang, J. Yang, Z. Wang, X. Lu, M.H. Yang. Universal Style Transfer via Feature Transforms. NeurIPS 2017.
Comment: A texture synthesis baseline method that is based on style transfer and does not support input reconstruction.