Transformer-Based Neural Texture Synthesis and Style Transfer
Jiahao, Lu 1,2
2022-02
Conference Date: 2022-02
Conference Location: Virtual Event, Thailand
Keywords: low-level vision, style transfer
English Abstract

Texture modeling has long been a research hotspot; its subtopics of neural texture synthesis and neural style transfer have gained significant attention from both industry and academia. Prior art has predominantly used Convolutional Neural Networks as the basis for neural texture synthesis and neural style transfer, and has hardly explored other deep neural architectures. Is a convolutional network a must for texture modeling tasks? In this work, we explore this question by introducing a novel framework, along with novel optimization objectives, for Transformer-based texture synthesis and style transfer. We propose a novel texture description metric that works well in the feature space of Transformers and is more lightweight than Gram-based texture descriptors. We also propose pixel-level and patch-level smoothing regularizations to aid the generative process. Our approach shows significant improvement over the baseline and generates favorable results, demonstrating that Transformers' long-range dependencies can be exploited for texture modeling and style transfer without the help of convolutional layers.
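For background, the sketch below illustrates the standard Gram-based texture descriptor that the abstract contrasts against, together with a total-variation term as one common form of pixel-level smoothing. This is not the paper's novel metric or its specific regularizers (the abstract does not detail them); it is a minimal illustration, assuming PyTorch and a generic (B, C, H, W) feature tensor such as Transformer tokens reshaped to a spatial grid.

```python
# Illustrative background only: Gram-based texture descriptor (Gatys-style
# baseline) and a total-variation smoothing term. Shapes and names are
# assumptions for the sketch, not the paper's implementation.
import torch


def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    """Channel-correlation (Gram) descriptor of a feature map.

    features: (B, C, H, W) feature maps from any backbone, e.g. CNN
              activations or Transformer tokens reshaped to a grid.
    Returns a (B, C, C) matrix normalized by the number of spatial
    positions; note the O(C^2) size that a lighter descriptor avoids.
    """
    b, c, h, w = features.shape
    flat = features.reshape(b, c, h * w)          # (B, C, N)
    return flat @ flat.transpose(1, 2) / (h * w)  # (B, C, C)


def gram_texture_loss(gen_feats: torch.Tensor, ref_feats: torch.Tensor) -> torch.Tensor:
    """Match Gram descriptors of generated and reference textures."""
    return torch.mean((gram_matrix(gen_feats) - gram_matrix(ref_feats)) ** 2)


def total_variation(image: torch.Tensor) -> torch.Tensor:
    """A common pixel-level smoothing regularizer: penalize differences
    between neighboring pixels of the generated image (B, C, H, W)."""
    dh = (image[:, :, 1:, :] - image[:, :, :-1, :]).abs().mean()
    dw = (image[:, :, :, 1:] - image[:, :, :, :-1]).abs().mean()
    return dh + dw
```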

Language: English
Content Type: Conference Paper
Source URL: [http://ir.ia.ac.cn/handle/173211/48936]
Collection: Brain-Inspired Chips and Systems Research
Author Affiliations: 1. School of Artificial Intelligence, University of Chinese Academy of Sciences
2. Institute of Automation, Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Jiahao, Lu. Transformer-Based Neural Texture Synthesis and Style Transfer[C]. In: Virtual Event, Thailand, 2022-02.
