SketchPatch: Sketch Stylization via Seamless Patch-level Synthesis

Noa Fish, Lilach Perry, Amit Bermano, Daniel Cohen-Or

Research output: Contribution to journal › Article › peer-review


The paradigm of image-to-image translation is leveraged for the benefit of sketch stylization via transfer of geometric textural details. Lacking the necessary volumes of data for standard training of translation systems, we advocate operating at the patch level, where a handful of stylized sketches provide ample mining potential for patches featuring basic geometric primitives. Operating at the patch level necessitates special consideration of full-sketch translation, as translating patches individually, with no regard to their neighbors, is likely to produce visible seams and artifacts at patch borders. Aligned pairs of styled and plain primitives are combined to form input hybrids containing styled elements around the border and plain elements within, which are fed to a seamless translation (ST) generator whose output patches are expected to reconstruct the fully styled patch. An adversarial addition promotes generalization and robustness to diverse geometries at inference time, forming a simple and effective system for arbitrary sketch stylization, as demonstrated on a variety of styles and sketches.
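The hybrid-input construction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `make_hybrid` and the `border` width are assumptions chosen for clarity, and the patches are stand-in NumPy arrays rather than real sketch data.

```python
import numpy as np

def make_hybrid(styled, plain, border=4):
    """Compose a hybrid patch from an aligned (styled, plain) pair:
    styled pixels in a ring of width `border` around the edge,
    plain pixels in the interior. (Illustrative; the actual border
    width used in the paper is not specified here.)"""
    assert styled.shape == plain.shape, "patches must be aligned and equal-sized"
    hybrid = styled.copy()
    hybrid[border:-border, border:-border] = plain[border:-border, border:-border]
    return hybrid

# Stand-in 16x16 single-channel patches: ones mark "styled", zeros "plain".
styled = np.ones((16, 16), dtype=np.float32)
plain = np.zeros((16, 16), dtype=np.float32)
hybrid = make_hybrid(styled, plain, border=4)
```

During training, the ST generator would receive such a hybrid and be asked to reproduce the fully styled patch, so that at inference time a styled border context guides seamless stylization of the plain interior.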

Original language: English
Article number: 227
Journal: ACM Transactions on Graphics
Issue number: 6
State: Published - 26 Nov 2020


  • image-to-image translation
  • neural texture synthesis


