IMAGE STYLE MIGRATION BASED ON CYCLEGAN WITH SAME MAPPING LOSS

Xiangquan Gui, Yuxin Zhang, and Li Li

Keywords

Deep learning, image style migration, cyclic consistent adversarial network

Abstract

Image style migration remaps the content of a given image with the style of a reference image. Performing it automatically with a generative adversarial network (GAN) reduces manual workload and yields rich results, but the paired datasets that conventional GAN methods require are difficult to obtain in many settings. Most existing work on image style migration studies and improves the loss function, exploiting the fact that the deep-network representation of an image carries style-related features. Building on this line of research, this paper instead innovates on the structural characteristics of the generator. To remove the paired-dataset limitation of traditional GANs and to improve the efficiency of style migration, we implement style migration with a modified cycle-consistent adversarial network (CycleGAN): the deep residual network (ResNet) in the original generator is replaced with a densely connected convolutional network (DenseNet), and the style migration loss is measured by a function combining the same (identity) mapping loss with a perceptual loss. These changes improve network performance, eliminate the restriction to paired samples, and raise the quality of the images generated by style migration; they also further improve training stability and speed up convergence. Experimental results on style migration show that the PSNR of the generated images improves by 6.27% on average and the SSIM improves by about 10%.
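As a minimal sketch of the objective described above, the snippet below combines CycleGAN's cycle-consistency term with the same (identity) mapping term. The toy generators `G` and `F` and the weights `lam_cyc` and `lam_id` are illustrative assumptions, not the paper's trained networks or hyperparameters, and the adversarial and VGG-based perceptual terms are omitted because they need trained models.

```python
import numpy as np

def l1(a, b):
    """Mean absolute error, used for both the cycle and identity terms."""
    return float(np.mean(np.abs(a - b)))

# Toy stand-ins for the two generators: G maps domain X -> Y, F maps Y -> X.
# In the paper these are DenseNet-based networks; here they are simple
# invertible functions so the cycle term is visibly near zero.
G = lambda x: x * 0.9 + 0.05
F = lambda y: (y - 0.05) / 0.9

def cyclegan_losses(x, y, lam_cyc=10.0, lam_id=5.0):
    """Cycle-consistency plus same (identity) mapping terms of the objective.

    The adversarial losses and the perceptual (VGG-feature) loss are
    deliberately left out of this sketch; the weights are assumptions.
    """
    cyc = l1(F(G(x)), x) + l1(G(F(y)), y)   # x -> y -> x and y -> x -> y
    ident = l1(G(y), y) + l1(F(x), x)       # G should leave y-domain images unchanged
    return lam_cyc * cyc + lam_id * ident

x = np.random.rand(4, 4)
y = np.random.rand(4, 4)
total = cyclegan_losses(x, y)
```

The identity term penalises the generator for altering images that are already in its target domain, which in practice helps preserve colour composition during style migration.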
Therefore, the improved CycleGAN image style migration method proposed in this paper generates better stylised images. It also provides a reference for future researchers of image style migration, and the results can be applied to film and television production, fashion design, game development, and other areas where style migration technology is promising.

∗ Lanzhou University of Technology, Lanzhou 730050, China; e-mail: zhangyasin@163.com
Corresponding author: Yuxin Zhang
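For reference, the PSNR and SSIM metrics used to judge the quality of the generated images can be computed as in the sketch below. This is a minimal NumPy version: the `ssim_global` function uses a single global window, whereas standard SSIM evaluations slide a Gaussian window over the image, so the values here are only indicative.

```python
import numpy as np

def psnr(ref, img, max_val=255.0):
    """Peak signal-to-noise ratio in dB between a reference and a test image."""
    mse = np.mean((ref.astype(np.float64) - img.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)

def ssim_global(ref, img, max_val=255.0):
    """Single-window SSIM (simplified: no sliding Gaussian window)."""
    c1, c2 = (0.01 * max_val) ** 2, (0.03 * max_val) ** 2  # stabilising constants
    x, y = ref.astype(np.float64), img.astype(np.float64)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

clean = np.full((8, 8), 100.0)
noisy = clean + 10.0          # uniform error of 10 grey levels
db = psnr(clean, noisy)       # finite PSNR; identical images give infinity
```

Higher PSNR and SSIM values indicate that a stylised output is closer to its reference, which is how the reported 6.27% and ~10% improvements are measured.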
