ByteDance Proposes ‘DyStyle’: A Novel Dynamic Neural Network For Style Editing

In the last few years, AI researchers have used generative adversarial networks (GANs) to create images with unprecedented diversity and photorealism; StyleGAN is one prominent example. When it comes to the semantic controllability of StyleGANs, existing style manipulation methods work well when editing the style codes along a single attribute, but they become problematic when several attributes must be manipulated jointly.

ByteDance (the parent company of TikTok) introduces the Dynamic Style Manipulation Network (DyStyle), a dynamic neural network that addresses this problem. DyStyle's structure and parameters vary with the input sample to achieve flexible and precise attribute control, and the network is trained with a novel easy-to-hard procedure that keeps training efficient and stable.
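The article does not spell out the schedule, but one natural reading of "easy to hard" is a curriculum over how many attributes are edited at once: single-attribute edits first, joint multi-attribute edits later. The sketch below illustrates that reading in Python; the attribute names and the sample_edit_targets helper are hypothetical, not from the paper.

```python
import random

# Illustrative attribute names; the actual attribute set is an assumption here.
ATTRIBUTES = ["age", "smile", "pose", "glasses"]

def sample_edit_targets(stage: int) -> dict:
    """Sample attribute targets for one training batch.

    Stage 1 ("easy"): edit a single attribute at a time.
    Stage 2 ("hard"): edit a random subset of attributes jointly.
    """
    k = 1 if stage == 1 else random.randint(2, len(ATTRIBUTES))
    chosen = random.sample(ATTRIBUTES, k)
    return {name: random.uniform(-1.0, 1.0) for name in chosen}

print(sample_edit_targets(stage=1))  # e.g. {'pose': 0.37}
print(sample_edit_targets(stage=2))  # e.g. {'age': -0.61, 'smile': 0.92}
```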

The primary purpose of the DyStyle network is to perform multi-attribute-conditioned editing of StyleGAN latent codes. Because its parameters and structure adapt to the input, it accommodates a wide variety of attribute configurations. The research group trained DyStyle with the two-stage easy-to-hard procedure described above, which improves training stability and prevents the network from getting trapped in a poor local minimum. In the evaluations, DyStyle outperformed existing static architectures in both attribute-control accuracy and identity preservation.
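To make the idea of a dynamic, input-dependent architecture concrete, here is a minimal sketch, assuming one small expert branch per attribute where only the branches for the attributes actually being edited are executed, so the effective structure changes from sample to sample. This is not the authors' code; the class name, attribute names, and layer sizes are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class DynamicLatentEditor(nn.Module):
    """Hypothetical dynamic editor: one expert branch per attribute."""

    def __init__(self, latent_dim=512, attributes=("age", "smile", "pose")):
        super().__init__()
        # One small MLP expert per controllable attribute.
        self.experts = nn.ModuleDict({
            name: nn.Sequential(
                nn.Linear(latent_dim + 1, latent_dim),
                nn.LeakyReLU(0.2),
                nn.Linear(latent_dim, latent_dim),
            )
            for name in attributes
        })

    def forward(self, w, targets):
        # w: (batch, latent_dim) StyleGAN latent codes.
        # targets: dict mapping attribute name -> (batch, 1) target value.
        # Only the experts for the requested attributes run, so the
        # executed computation graph varies with the input sample.
        delta = torch.zeros_like(w)
        for name, value in targets.items():
            delta = delta + self.experts[name](torch.cat([w, value], dim=1))
        return w + delta  # edited latent code, decoded by a frozen StyleGAN

editor = DynamicLatentEditor()
w = torch.randn(4, 512)
w_edited = editor(w, {"age": torch.full((4, 1), 0.8)})
print(w_edited.shape)  # torch.Size([4, 512])
```

Editing a single attribute executes only that attribute's expert, which is what makes the architecture dynamic rather than static.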

The proposed dynamic DyStyle network enables nonlinear and adaptive style manipulation for multi-attribute-conditioned image generation. Because it only manipulates the latent space of a pretrained StyleGAN2, DyStyle inherits the generator's very high image quality instead of having to learn image synthesis from scratch, which is a significant advantage over conditional GANs. Beyond that, DyStyle shows higher average attribute-control precision and better identity preservation than other linear or nonlinear style manipulation approaches such as InterFaceGAN and StyleFlow.
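For contrast, a linear method like InterFaceGAN moves every latent code along one fixed direction per attribute with a chosen strength. The sketch below shows that baseline with a random placeholder direction (real directions are learned from labeled latents), whereas DyStyle predicts a nonlinear, sample-dependent edit.

```python
import torch

def linear_edit(w: torch.Tensor, direction: torch.Tensor, alpha: float) -> torch.Tensor:
    # InterFaceGAN-style edit: the same normalized step for every sample.
    return w + alpha * direction / direction.norm()

w = torch.randn(4, 512)            # a batch of StyleGAN2 W-space latents
age_direction = torch.randn(512)   # placeholder; real directions are learned
w_older = linear_edit(w, age_direction, alpha=3.0)
```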

Paper: https://arxiv.org/pdf/2109.10737.pdf

Github: https://github.com/phycvgan/DyStyle/
