
Estimating heading direction from monocular video sequences using biologically-based sensor

Abstract
The determination of one’s movement through the environment (visual odometry or self-motion estimation) from monocular sources such as video is an important research problem because of its relevance to robotics and autonomous vehicles. The traditional computer vision approach to this problem tracks visual features across frames in order to obtain 2-D image motion estimates from which the camera motion can be derived. We present an alternative scheme which uses the properties of motion sensitive cells in the primate brain to derive the image motion and the camera heading vector. We tested heading estimation using a camera mounted on a linear translation table with the line of sight of the camera set at a range of angles relative to straight ahead (0° to 50° in 10° steps). The camera velocity was also varied (0.2, 0.4, 0.8, 1.2, 1.6 and 2.0 m/s). Our biologically-based method produced accurate heading estimates over a wide range of test angles and camera speeds. Our approach has the advantage of being a one-shot estimator and not requiring iterative search techniques for finding the heading.
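The abstract does not give implementation detail of the biologically-based sensor, so purely as an illustration of the general idea behind template-style heading estimation from optic flow, the sketch below scores a set of candidate headings (foci of expansion) against a sampled flow field and picks the best match. The function name, the candidate grid, and the synthetic flow are hypothetical and are not taken from the paper; this is a minimal sketch assuming pure camera translation, not the authors' method.

```python
import numpy as np

def heading_from_flow(points, flow, candidates):
    """Return the candidate focus of expansion (FOE) whose radial flow
    template best matches the observed optic-flow directions.

    points     : (N, 2) image coordinates of flow samples
    flow       : (N, 2) optic-flow vectors at those points
    candidates : (M, 2) candidate FOE (heading) locations in the image
    """
    # Unit flow directions; drop samples with negligible motion.
    speed = np.linalg.norm(flow, axis=1)
    valid = speed > 1e-6
    d = flow[valid] / speed[valid, None]
    p = points[valid]

    scores = np.empty(len(candidates))
    for i, foe in enumerate(candidates):
        # Radial directions pointing away from the candidate FOE.
        r = p - foe
        r_norm = np.linalg.norm(r, axis=1)
        ok = r_norm > 1e-6
        r_unit = r[ok] / r_norm[ok, None]
        # Mean cosine between observed flow and the radial template.
        scores[i] = np.mean(np.sum(r_unit * d[ok], axis=1))
    return candidates[np.argmax(scores)], scores


# Synthetic check: expansion flow radiating from a known FOE should be
# recovered by the template matcher (one-shot, no iterative refinement).
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(500, 2))
true_foe = np.array([0.3, -0.1])
flw = (pts - true_foe) * 0.05
grid = np.stack(np.meshgrid(np.linspace(-0.5, 0.5, 21),
                            np.linspace(-0.5, 0.5, 21)), -1).reshape(-1, 2)
est, _ = heading_from_flow(pts, flw, grid)
print("estimated FOE:", est)   # expect roughly (0.3, -0.1)
```

The winner-take-all selection over precomputed templates is what makes such a scheme a one-shot estimator: no iterative search over heading parameters is needed, only an evaluation of each candidate against the measured flow.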
Type
Conference Contribution
Citation
Cree, M. J., Perrone, J. A., Anthonys, G., Garnett, A. C., & Gouk, H. (2016). Estimating heading direction from monocular video sequences using biologically-based sensor. Presented at Image and Vision Computing New Zealand (IVCNZ 2016), Palmerston North, New Zealand. https://doi.org/10.1109/IVCNZ.2016.7804435
Date
2016
Rights
This is an author’s accepted version of an article presented at the Image and Vision Computing New Zealand (IVCNZ 2016), 21-22 November 2016, Palmerston North, New Zealand. © 2016 Crown.