Real-time depth map manipulation for 3D visualization

Ianir Ideses*, Barak Fishbain, Leonid Yaroslavsky

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review



One of the key aspects of 3D visualization is the computation of depth maps. Depth maps enable synthesis of 3D video from 2D video and the use of multi-view displays. Depth maps can be acquired in several ways. One method is to measure the real 3D properties of the scene objects. Other methods rely on using two cameras and computing the correspondence for each pixel. Once a depth map is acquired for every frame, it can be used to construct an artificial stereo pair for that frame. There are many known methods for computing the optical flow between adjacent video frames. The drawback of these methods is that they require extensive computational power and are therefore not well suited to high-quality real-time 3D rendering. One efficient alternative for computing depth maps is extraction of motion-vector information from standard video encoders. In this paper we present methods to improve the 3D visualization quality acquired from compression codecs by spatial/temporal and logical operations and manipulations. We show how an efficient real-time implementation of spatial-temporal local order statistics, such as the median, and local adaptive filtering in the 3D-DCT domain can substantially improve the quality of depth maps, and consequently of 3D video, while retaining real-time rendering. Real-time performance is achieved by exploiting multi-core technology through standard parallelization algorithms and libraries (OpenMP, IPP).
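The spatial-temporal order-statistics filtering the abstract refers to can be illustrated with a minimal sketch. The code below is not the authors' OpenMP/IPP implementation; it is a plain-Python assumption of the general idea: depth values estimated per pixel (e.g. from codec motion-vector magnitudes) are noisy, and a 3×3×3 median over two spatial dimensions plus time suppresses isolated outliers while preserving depth edges. All function names and the toy data are hypothetical.

```python
def median_3x3x3(frames, t, y, x):
    """Median over a 3x3x3 spatio-temporal neighbourhood, clamped at borders.

    `frames` is a list of 2D depth maps (lists of lists), indexed [t][y][x].
    """
    T, H, W = len(frames), len(frames[0]), len(frames[0][0])
    vals = []
    for dt in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                # Clamp indices so border voxels reuse edge samples.
                tt = min(max(t + dt, 0), T - 1)
                yy = min(max(y + dy, 0), H - 1)
                xx = min(max(x + dx, 0), W - 1)
                vals.append(frames[tt][yy][xx])
    vals.sort()
    return vals[len(vals) // 2]  # 27 samples -> element 13 is the median


def filter_depth_sequence(frames):
    """Apply the 3D median filter to every voxel of a depth-map sequence."""
    T, H, W = len(frames), len(frames[0]), len(frames[0][0])
    return [[[median_3x3x3(frames, t, y, x) for x in range(W)]
             for y in range(H)]
            for t in range(T)]


# Toy demo: a constant-depth sequence with one spurious motion-vector spike.
frames = [[[5] * 4 for _ in range(4)] for _ in range(3)]
frames[1][2][2] = 90          # isolated outlier in the middle frame
cleaned = filter_depth_sequence(frames)
```

In a real-time setting, the outer loop over frames and rows is what the paper parallelizes across cores (e.g. an OpenMP `parallel for`); the per-voxel median itself is independent of its neighbours' outputs, so the work splits trivially.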

Original language: English
Article number: 72440J
Journal: Proceedings of SPIE - The International Society for Optical Engineering
State: Published - 2009
Event: Real-Time Image and Video Processing 2009 - San Jose, CA, United States
Duration: 19 Nov 2009 - 20 Nov 2009


  • 3D
  • 3D-DCT
  • Depth-maps
  • Real-time


