Robotic navigation algorithms increasingly make use of the panoramic field of view provided by omnidirectional images to assist with localization tasks. Since the images taken by a particular class of omnidirectional sensors can be mapped to the sphere, the problem of attitude estimation arising from 3D motions of the camera can be treated as a problem of estimating the camera motion between spherical images. This problem has traditionally been solved by tracking points or features between images. However, there are many natural scenes where the features cannot be tracked with confidence. We present an algorithm that uses image features to estimate ego-motion without explicitly searching for correspondences. We formulate the problem as a correlation of functions defined on the product of spheres S² × S², which are acted upon by elements of the direct product group SO(3) × SO(3). We compute this correlation efficiently and obtain our solution from the spectral information of functions on S² × S².
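The core idea of correlation-based attitude estimation can be illustrated with a small sketch. The toy below is not the paper's spectral SO(3) × SO(3) method; it is a hypothetical brute-force version restricted to a single rotation about the z-axis, where rotating a function on the sphere reduces to a circular shift in longitude. A function with a Gaussian-like bump is correlated against shifted copies of its rotated version, and the shift maximizing the correlation recovers the rotation angle without any point correspondences.

```python
import numpy as np

def sphere_grid(n_theta=40, n_phi=80):
    """Colatitude/longitude grid on the sphere."""
    theta = np.linspace(0.0, np.pi, n_theta)
    phi = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
    return np.meshgrid(theta, phi, indexing="ij")

def bump(T, P, center, sharpness=5.0):
    """Gaussian-like bump on the sphere centered at (theta0, phi0)."""
    t0, p0 = center
    # cosine of the angular distance from each grid point to the center
    cosd = np.sin(T) * np.sin(t0) * np.cos(P - p0) + np.cos(T) * np.cos(t0)
    return np.exp(sharpness * (cosd - 1.0))

n_phi = 80
T, P = sphere_grid(n_phi=n_phi)
dphi = 2.0 * np.pi / n_phi

f = bump(T, P, (np.pi / 3, 0.8))            # reference image on the sphere
true_alpha = 1.1                             # ground-truth z-axis rotation
g = bump(T, P, (np.pi / 3, 0.8 + true_alpha))  # rotated copy of f

# Correlate f against all longitudinal shifts of g; sin(theta) is the
# quadrature weight for integration over the sphere.
w = np.sin(T)
scores = [np.sum(w * f * np.roll(g, -k, axis=1)) for k in range(n_phi)]
recovered_alpha = np.argmax(scores) * dphi
print(recovered_alpha)  # close to true_alpha, up to grid resolution
```

The paper's approach replaces this exhaustive search with a spectral computation: expanding the spherical functions in harmonics turns the correlation over the rotation group into products of Fourier coefficients, which is what makes the full SO(3) × SO(3) search tractable.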
Keywords: robotic navigation, navigation algorithms, tracking, computer science
Date Posted: 19 August 2004