Generating Non-Stationary Textures Using Self-Rectification

Yang Zhou, Rongjun Xiao, Dani Lischinski, Daniel Cohen-Or, Hui Huang*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper addresses the challenge of example-based non-stationary texture synthesis. We introduce a novel two-step approach wherein users first modify a reference texture using standard image editing tools, yielding an initial rough target for the synthesis. Subsequently, our proposed method, termed 'self-rectification', automatically refines this target into a coherent, seamless texture, while faithfully preserving the distinct visual characteristics of the reference exemplar. Our method leverages a pretrained diffusion network and uses self-attention mechanisms to gradually align the synthesized texture with the reference, ensuring the retention of the structures in the provided target. Through experimental validation, our approach exhibits exceptional proficiency in handling non-stationary textures, demonstrating significant advancements in texture synthesis when compared to existing state-of-the-art techniques. Code is available at https://github.com/xiaorongjun000/Self-Rectification
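The alignment idea described in the abstract — steering the synthesized target toward the reference via self-attention — can be illustrated with a minimal key/value-injection sketch. This is not the authors' implementation (see the linked repository for that); it is a hedged NumPy toy in which queries come from target features while keys and values are injected from the reference, a common way to transfer appearance between images inside an attention layer. All names (`kv_injected_attention`, the feature shapes) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def kv_injected_attention(q_target, k_ref, v_ref):
    """Attention where queries come from the synthesized target but keys
    and values are injected from the reference exemplar, so each target
    position is rendered as a mixture of reference appearances."""
    d = q_target.shape[-1]
    attn = softmax(q_target @ k_ref.T / np.sqrt(d))  # (n_target, n_ref)
    return attn @ v_ref                              # (n_target, d)

# Toy features: 16 target positions attending over 32 reference positions.
rng = np.random.default_rng(0)
q = rng.normal(size=(16, 8))   # target-patch query features
k = rng.normal(size=(32, 8))   # reference key features
v = rng.normal(size=(32, 8))   # reference value features
out = kv_injected_attention(q, k, v)
print(out.shape)  # (16, 8)
```

In the paper's setting this kind of swap would happen inside the self-attention layers of a pretrained diffusion network during denoising, rather than on raw NumPy arrays as here.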

Original language: English
Title of host publication: Proceedings - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
Publisher: IEEE Computer Society
Pages: 7767-7776
Number of pages: 10
ISBN (Electronic): 9798350353006
DOIs
State: Published - 2024
Event: 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024 - Seattle, United States
Duration: 16 Jun 2024 - 22 Jun 2024

Publication series

Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
ISSN (Print): 1063-6919

Conference

Conference: 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
Country/Territory: United States
City: Seattle
Period: 16/06/24 - 22/06/24

Funding

Funders and funder numbers:

- Guangdong Laboratory of Artificial Intelligence and Digital Economy
- Scientific Development Funds of Shenzhen University
- National Natural Science Foundation of China: 62161146005, U2001206, U21B2023
- Israel Science Foundation: 3441/21
- Science, Technology and Innovation Commission of Shenzhen Municipality: RCJC20200714114435012, KQTD20210811090044003
- Department of Education of Guangdong Province: 2022 KCXTD025
- National Science Foundation: 2022A1515010221

Keywords

- Non-stationary Textures
- Self-attention mechanism
- Texture Synthesis
- diffusion network
