Date of this Version
Ameesh Makadia and Kostas Daniilidis, "Correspondenceless Ego-Motion Estimation Using an IMU", April 2005.
Mobile robots can easily be equipped with numerous sensors that aid in the tasks of localization and ego-motion estimation. Two such examples are Inertial Measurement Units (IMUs), which provide a gravity vector via pitch and roll angles, and wide-angle or panoramic imaging devices. As the number of powerful devices on a single robot increases, an important problem arises: how to fuse the information coming from multiple sources into an accurate and efficient motion estimate. The IMU provides real-time readings that can be employed in orientation estimation, while in principle an omnidirectional camera provides enough information to estimate the full rigid motion (up to translational scale). However, in addition to being computationally overwhelming, such an estimation is traditionally based on the sensitive search for feature correspondences between image frames. In this paper we present a novel algorithm that exploits information from an IMU to reduce the five-parameter motion search to a three-parameter estimation. For this task we formulate a generalized Hough transform which processes image features directly to avoid searching for correspondences. The Hough space is computed rapidly by treating the transform as a convolution of spherical images.
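To illustrate the dimensionality reduction the abstract describes, the sketch below shows how IMU-supplied roll and pitch can be used to de-rotate unit bearing vectors on the sphere, fixing two rotational degrees of freedom so that only yaw and the translation direction (three parameters) remain to be searched. This is a minimal illustration under assumed conventions (x-then-y Euler rotations, rows as bearing vectors), not the paper's actual formulation; the helper names are hypothetical.

```python
import numpy as np

def rotation_from_imu(roll, pitch):
    """Hypothetical helper: build the rotation implied by IMU roll/pitch
    (radians), composing a rotation about x, then about y. The paper's
    gravity vector fixes these same two rotational degrees of freedom."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    return Ry @ Rx

def derotate(bearings, roll, pitch):
    """Apply the IMU rotation to each unit bearing vector (one per row),
    compensating roll and pitch so the remaining unknowns are yaw and the
    direction of translation -- a three-parameter search."""
    R = rotation_from_imu(roll, pitch)
    return bearings @ R.T  # row b -> R b
```

With roll and pitch compensated this way for both frames, any remaining rotation between them is a pure rotation about the gravity axis, which is why the motion search collapses from five parameters to three.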
Keywords: omnidirectional vision, localization, inertial sensors
Date Posted: 08 February 2007
This document has been peer reviewed.