In this paper we describe multidisciplinary experimental research focused on the stereoscopic presentation of geospatial imagery obtained from various sensors. The source data differed in scale, texture, geometry, and content, and no single image-processing technique allows such heterogeneous data to be processed simultaneously. However, an augmented reality system allows subjects to fuse multi-sensor, multi-temporal data and the real terrain into a single model. An augmented reality experimental setup based on a head-mounted display was designed to superimpose LIDAR point clouds efficiently for comfortable stereoscopic perception. The practical experiment performed indicates the feasibility of stereoscopic perception of data obtained on the fly. One of the most interesting findings is that the source LIDAR point clouds do not have to be preprocessed or enhanced to be used in the experiments described.