Quantification of local symmetry: application to texture discrimination

Yoram Bonneh, Daniel Reisfeld, Yehezkel Yeshurun*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Symmetry is one of the most prominent cues in visual perception as well as in computer vision. We have recently presented a Generalized Symmetry Transform that receives an edge map as input and outputs a symmetry map, in which every point marks the intensity and orientation of the local generalized symmetry. In the context of computer vision, this map emphasizes points of high symmetry, which, in turn, are used to detect regions of interest for active vision systems. Many psychophysical experiments in texture discrimination use images that consist of various micro-patterns. Since the Generalized Symmetry Transform captures local spatial relations between image edges, it is used here to predict human performance in discrimination tasks. Applying the transform to the micro-patterns used in several well-studied quantitative experiments on human texture discrimination, we show that symmetry, as characterized by the present computational scheme, can account for most of the reported results.
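The pairwise edge-based computation the abstract describes can be illustrated with a minimal sketch. This is not the paper's exact implementation; it follows the commonly cited formulation of the Generalized Symmetry Transform (distance weighting, a phase term that peaks for gradients facing each other, and logarithmic gradient magnitudes), with the Gaussian width `sigma` and the gradient-input convention chosen here for illustration:

```python
import numpy as np

def generalized_symmetry(gx, gy, sigma=2.0):
    """Sketch of a generalized-symmetry map (assumed formulation).

    Every pair of edge pixels contributes symmetry 'intensity' to the
    pixel at its midpoint; the contribution is large when the two
    gradients face each other across the joining line.
    """
    h, w = gx.shape
    mag = np.hypot(gx, gy)
    theta = np.arctan2(gy, gx)            # gradient orientations
    r = np.log1p(mag)                     # logarithmic gradient weighting
    ys, xs = np.nonzero(mag > 0)          # treat nonzero gradient as edges
    pts = list(zip(ys, xs))
    sym = np.zeros((h, w))
    for a in range(len(pts)):
        for b in range(a + 1, len(pts)):
            (yi, xi), (yj, xj) = pts[a], pts[b]
            dy, dx = yj - yi, xj - xi
            alpha = np.arctan2(dy, dx)    # direction of the joining line
            gi = theta[yi, xi] - alpha
            gj = theta[yj, xj] - alpha
            # phase term: maximal when the gradients point toward each other
            P = (1 - np.cos(gi + gj)) * (1 - np.cos(gi - gj))
            # distance weighting: nearby pairs count more
            D = np.exp(-np.hypot(dy, dx) / (2 * sigma))
            sym[(yi + yj) // 2, (xi + xj) // 2] += D * P * r[yi, xi] * r[yj, xj]
    return sym
```

For example, two edge pixels with horizontally opposed gradients deposit all of their symmetry contribution at the pixel midway between them, which is the behavior the transform exploits to highlight centers of symmetric micro-patterns.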

Original language: English
Pages (from-to): 515-530
Number of pages: 16
Journal: Spatial Vision
Issue number: 4
State: Published - 1994


