From parallel to serial processing: A computational study of visual search

Eyal Cohen, Eytan Ruppin

Research output: Contribution to journal › Article › peer-review

Abstract

A novel computational model of a preattentive system performing visual search is presented. The model processes displays of lines, reproduced from Wolfe, Friedman-Hill, Stewart, and O'Connell's (1992) and Treisman and Sato's (1990) visual-search experiments. The response times measured in these experiments suggest that some of the displays are searched serially, whereas others are scanned in parallel. Our neural network model operates in two phases. First, the visual displays are compressed via standard methods (principal component analysis), to overcome assumed biological capacity limitations. Second, the compressed representations are further processed to identify a target in the display. The model succeeds in fast detection of targets in experimentally labeled parallel displays, but fails with serial ones. Analysis of the compressed internal representations reveals that compressed parallel displays contain global information that enables instantaneous target detection. However, in representations of serial displays, this global information is obscure, and hence, a target detection system should resort to a serial, attentional scan of local features across the display. Our analysis provides a numerical criterion that is strongly correlated with the experimental response time slopes and enables us to reformulate Duncan and Humphreys's (1989) search surface, using precise quantitative measures. Our findings provide further insight into the important debate concerning the dichotomous versus continuous views of parallel/serial visual search.
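The first phase of the model compresses visual displays with principal component analysis. A minimal sketch of such a PCA bottleneck is shown below; the display dimensions, the bottleneck size `k`, and all variable names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical sketch of the model's first phase: compressing visual
# displays via principal component analysis (PCA). Sizes are illustrative.
rng = np.random.default_rng(0)

# A "display" is flattened to a vector; here, 50 training displays,
# each a 16x16 grid of local line-orientation features.
displays = rng.standard_normal((50, 16 * 16))

# Center the data and obtain principal axes via SVD.
mean = displays.mean(axis=0)
centered = displays - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)

k = 20                     # assumed bottleneck size (capacity limitation)
components = vt[:k]        # top-k principal axes

# Compress one display to k numbers, then reconstruct it from the bottleneck.
x = displays[0]
code = (x - mean) @ components.T      # compressed internal representation
x_hat = code @ components + mean      # reconstruction

print(code.shape)    # (20,)
print(x_hat.shape)   # (256,)
```

In the model, it is this compressed representation (not the raw display) that the second, target-detection phase operates on; whether a display is "parallel" or "serial" is then diagnosed from properties of the compressed code.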

Original language: English
Pages (from-to): 1449-1461
Number of pages: 13
Journal: Perception and Psychophysics
Volume: 61
Issue number: 7
DOIs
State: Published - Oct 1999

