Direct Disparity Space: Robust and Real-time Visual Odometry
We present a direct visual odometry formulation based on a warping function in disparity space. In disparity space, measurement noise is well modeled by a Gaussian distribution, in contrast to the heteroscedastic noise in 3D space. In addition, the Jacobian of the warp separates the rotation and translation terms, enabling motion to be estimated from all image points, even those located at infinity. Furthermore, we show that direct camera tracking achieves accurate and robust performance using only a fraction of the image pixels, selected by a simple and efficient strategy. Our approach allows faster-than-real-time computation on a single CPU core with unoptimized code.
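To make the formulation concrete, a standard disparity-space warp for a rectified stereo pair can be sketched as follows (the notation here is ours and may differ from the paper's exact parameterization): with focal length $f$, baseline $b$, and principal point $(c_u, c_v)$, a pixel $(u, v)$ with disparity $d$ is back-projected, moved by the rigid motion $(\mathbf{R}, \mathbf{t})$, and re-projected into disparity space:
\[
\mathbf{X}(u,v,d) \;=\; \frac{b}{d}
\begin{bmatrix} u - c_u \\ v - c_v \\ f \end{bmatrix},
\qquad
\mathbf{X}' = \begin{bmatrix} X' \\ Y' \\ Z' \end{bmatrix} = \mathbf{R}\,\mathbf{X} + \mathbf{t},
\qquad
w(u,v,d;\,\mathbf{R},\mathbf{t}) \;=\;
\begin{bmatrix}
c_u + f X'/Z' \\
c_v + f Y'/Z' \\
f b / Z'
\end{bmatrix}.
\]
Direct tracking then minimizes the photometric error $\sum_{\mathbf{p}} \left( I_t\!\left(w(\mathbf{p})\right) - I_{t-1}(\mathbf{p}) \right)^2$ over the selected pixel set. In this form the translation-induced image motion scales with disparity, so points near infinity ($d \to 0$) contribute little to translation but still constrain rotation, consistent with the separation noted above.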
As our approach does not rely on feature extraction, the pixels selected over successive frames are often distinct. Hence, triangulating the selected pixels into the world frame produces an accurate and dense 3D reconstruction with minimal computational cost, making the approach appealing for robotics and embedded applications. We evaluate the performance of our approach against state-of-the-art methods on a range of urban and indoor datasets. We show that our algorithm produces competitive performance, requires no specialized tuning, and continues to produce competitive results even when run on low-resolution images where other techniques fail to operate.
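As a minimal sketch of one plausible pixel selection strategy (the abstract does not specify the exact criterion; selection by 3x3 non-maximum suppression of gradient magnitude with a threshold is our assumption), the following Python code illustrates how a small fraction of informative pixels might be chosen per frame:

import numpy as np
from scipy.ndimage import maximum_filter

def select_pixels(image, grad_thresh=10.0):
    """Select pixels for direct tracking (hypothetical criterion).

    Keeps pixels whose gradient magnitude dominates their 3x3
    neighborhood and exceeds an absolute threshold.
    """
    img = image.astype(np.float32)
    gy, gx = np.gradient(img)                # finite-difference gradients
    mag = np.hypot(gx, gy)                   # gradient magnitude
    local_max = maximum_filter(mag, size=3)  # 3x3 non-maximum suppression
    mask = (mag == local_max) & (mag > grad_thresh)
    vs, us = np.nonzero(mask)                # row (v) and column (u) indices
    return us, vs

# Usage: track only the selected pixels, typically a small
# fraction of the image, e.g.
#   us, vs = select_pixels(gray_frame)

Because such a rule needs only gradients and a sliding-window maximum, it adds negligible overhead per frame, which is consistent with the faster-than-real-time claim above.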