Example-based style synthesis

Iddo Drori*, Daniel Cohen-Or, Hezy Yeshurun

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review


We introduce an example-based synthesis technique that extrapolates novel styles for a given input image. The technique is based on separating the style and content of image fragments. The input image is first adaptively partitioned into fragments. Stitching together novel fragments then produces a coherent image in a new style for the given content. The aggregate of synthesized fragments approximates a globally non-linear model with a set of locally linear models. We show results of our method for various artistic, sketch, and texture filters and painterly styles applied to different classes of image content.
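The idea of approximating a globally non-linear style mapping by a set of locally linear, per-fragment models can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes fixed-size square tiles instead of adaptive partitioning, grayscale images, L2 nearest-neighbor fragment matching, and a scalar affine (gain and offset) model per fragment fitted by least squares; all function names are hypothetical.

```python
import numpy as np

def split_fragments(img, size):
    """Partition a grayscale image into non-overlapping square tiles
    (a simplification: the paper partitions adaptively)."""
    h, w = img.shape
    return [((y, x), img[y:y + size, x:x + size].ravel())
            for y in range(0, h - size + 1, size)
            for x in range(0, w - size + 1, size)]

def synthesize(content, example_content, example_style, size=8):
    """For each fragment of `content`, find the nearest example content
    fragment and fit a local linear (affine) map from that example's
    content to its styled version, then apply the map to the input
    fragment. Stitching the results approximates a globally non-linear
    style transfer with a set of locally linear models."""
    ex_c = split_fragments(example_content, size)
    ex_s = split_fragments(example_style, size)
    out = np.zeros_like(content, dtype=float)
    for (y, x), f in split_fragments(content, size):
        # Nearest example content fragment under L2 distance.
        i = int(np.argmin([np.linalg.norm(f - c) for _, c in ex_c]))
        c, s = ex_c[i][1], ex_s[i][1]
        # Local linear model s ~ a*c + b, fitted by least squares.
        A = np.vstack([c, np.ones_like(c)]).T
        a, b = np.linalg.lstsq(A, s, rcond=None)[0]
        out[y:y + size, x:x + size] = (a * f + b).reshape(size, size)
    return out
```

As a sanity check, if the example style is an exact affine transform of the example content, every local model recovers that transform and the output is the same transform applied to the input.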

Original language: English
Pages (from-to): II/143-II/150
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
State: Published - 2003
Event: 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 2003 - Madison, WI, United States
Duration: 18 Jun 2003 – 20 Jun 2003

