Range imaging with adaptive color structured light

Dalit Caspi*, Nahum Kiryati, Joseph Shamir

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

339 Scopus citations

Abstract

In range sensing with time-multiplexed structured light, there is a trade-off between accuracy, robustness and the acquisition period. The acquisition period is lower bounded by the product of the number of projection patterns and the time needed for acquiring a single image. In this paper a novel structured light method is described. Adaptation of the number and form of the projection patterns to the characteristics of the scene takes place as part of the acquisition process. Noise margins are matched to the actual noise level, thus reducing the number of projection patterns to the necessary minimum. Color is used for light plane labeling. The dimension of the pattern space (and the noise margins) is thus increased without raising the number of projection patterns. It is shown that the color of an impinging light plane can be identified from the image of the illuminated scene, even with colorful scenes. Identification is local and does not rely on spatial color sequences. Therefore, in comparison to other color structured light techniques, assumptions about smoothness and color neutrality of the scene can be relaxed. The suggested approach has been implemented and the theoretical results are supported by experiments.
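For context, the classical binary Gray-code scheme that time-multiplexed structured light builds on (and which the paper's adaptive multilevel color code generalizes) can be sketched as follows. This is an illustrative sketch only, not the authors' method: function names and structure are hypothetical, and it uses a plain two-level (black/white) code rather than the paper's color patterns. With n patterns it labels 2^n light planes, and the Gray-code property means adjacent stripes differ in exactly one pattern, which confines boundary decoding errors to a single stripe.

```python
def gray(x: int) -> int:
    """Binary-reflected Gray code of x: adjacent stripe indices
    differ in exactly one bit, giving a one-stripe error bound
    at stripe boundaries."""
    return x ^ (x >> 1)

def gray_to_index(g: int) -> int:
    """Invert the Gray code to recover the stripe index."""
    x = g
    while g:
        g >>= 1
        x ^= g
    return x

def make_patterns(n_bits: int) -> list[list[int]]:
    """One binary stripe pattern per bit plane (most significant
    bit first); n_bits projected patterns label 2**n_bits stripes."""
    n_stripes = 1 << n_bits
    return [[(gray(s) >> b) & 1 for s in range(n_stripes)]
            for b in reversed(range(n_bits))]

def decode(bits_per_pattern: list[int]) -> int:
    """Assemble the bits observed at one pixel across the pattern
    sequence back into a stripe (light plane) index."""
    g = 0
    for b in bits_per_pattern:
        g = (g << 1) | b
    return gray_to_index(g)
```

The paper's contribution replaces the two intensity levels with several colors, enlarging the alphabet per pattern so the same number of stripes needs fewer projections, and adapts the number of patterns to the measured noise level.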

Original language: English
Pages (from-to): 470-480
Number of pages: 11
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 20
Issue number: 5
DOIs
State: Published - 1998
Externally published: Yes

Funding

Funder: Technion-Israel Institute of Technology

Keywords

• Color
• Computer vision
• Multilevel gray code
• Range sensor
• Shape from X
• Structured light
• Video and data projector
