Visual rotation detection and estimation for mobile robot navigation
Abstract
There are a number of sensing options for mobile robots. Unfortunately, many of these are relatively expensive (e.g., laser scanners) or provide only sparse information (e.g., sonar rings). As an alternative, vision-based navigation is very attractive because cameras are inexpensive and computing power is plentiful. The challenge is to extract valuable information from at least some fraction of the copious pixel stream. In this paper we demonstrate how environmental landmarks can be visually extracted and tracked in order to estimate the rotation of a mobile robot. This method is superior to odometry (wheel-turn counting) because it works with a wide range of environments and robot configurations. In particular, we have applied this method to a very simple motorized base in order to make it drive in straight lines. As expected, this works far better than ballistic (open-loop) control. We present quantitative results from several experiments to support this conclusion.
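To make the core idea concrete, the following is a minimal sketch (not the paper's actual implementation) of estimating a robot's yaw change from landmarks tracked across two camera frames. It assumes a pinhole camera model with hypothetical calibration values (`cx`, `focal_px`): each landmark's image column is converted to a bearing angle, and the rotation estimate is the average per-landmark bearing shift.

```python
import math

def bearing(x_pixel, cx, focal_px):
    """Bearing angle (radians) of a landmark, computed from its image
    column under a pinhole model with principal point cx and focal
    length focal_px in pixels (both hypothetical calibration values)."""
    return math.atan2(x_pixel - cx, focal_px)

def estimate_rotation(cols_before, cols_after, cx=320.0, focal_px=500.0):
    """Estimate the camera's yaw change between two frames from the
    image columns of the same landmarks tracked in each frame, by
    averaging the per-landmark change in bearing."""
    deltas = [bearing(after, cx, focal_px) - bearing(before, cx, focal_px)
              for before, after in zip(cols_before, cols_after)]
    return sum(deltas) / len(deltas)
```

For example, a landmark that starts at the principal point (column 320) and moves to column 820 changes bearing by `atan2(500, 500)`, i.e., 45 degrees; averaging over several landmarks suppresses tracking noise from any single one.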