Research Commons

      Estimating heading direction from monocular video sequences using biologically-based sensor

      Cree, Michael J.; Perrone, John A.; Anthonys, Gehan; Garnett, Aden C.; Gouk, Henry
      Files
      ivcnz16-cree-etal-2016.pdf
      Accepted version, 3.70 MB
      DOI
       10.1109/IVCNZ.2016.7804435
      Citation
      Cree, M. J., Perrone, J. A., Anthonys, G., Garnett, A. C., & Gouk, H. (2016). Estimating heading direction from monocular video sequences using biologically-based sensor. Presented at the Image and Vision Computing New Zealand, Palmerston North, New Zealand. https://doi.org/10.1109/IVCNZ.2016.7804435
      Permanent Research Commons link: https://hdl.handle.net/10289/10884
      Abstract
      The determination of one’s movement through the environment (visual odometry or self-motion estimation) from monocular sources such as video is an important research problem because of its relevance to robotics and autonomous vehicles. The traditional computer vision approach to this problem tracks visual features across frames in order to obtain 2-D image motion estimates from which the camera motion can be derived. We present an alternative scheme which uses the properties of motion sensitive cells in the primate brain to derive the image motion and the camera heading vector. We tested heading estimation using a camera mounted on a linear translation table with the line of sight of the camera set at a range of angles relative to straight ahead (0° to 50° in 10° steps). The camera velocity was also varied (0.2, 0.4, 0.8, 1.2, 1.6 and 2.0 m/s). Our biologically-based method produced accurate heading estimates over a wide range of test angles and camera speeds. Our approach has the advantage of being a one-shot estimator and not requiring iterative search techniques for finding the heading.
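
      One way to realise a one-shot heading estimate of the kind described in the abstract is to compare measured image motion against a bank of candidate heading templates and keep the best match. The sketch below is not the authors' implementation (which builds on models of primate motion-sensitive cells); it is a minimal, hypothetical illustration of the template-matching idea under simplifying assumptions of pure camera translation, a known focal length, and a small candidate set spanning the 0° to 50° range used in the experiments. All names and parameter values are illustrative.

import numpy as np

def heading_by_template_match(points, flow, focal_length, candidate_headings):
    """Pick the candidate heading whose radial (FOE-centred) flow template
    best matches the measured flow directions."""
    # For pure translation the *direction* of optic flow at each image point
    # is independent of scene depth, so only directions are compared.
    flow_dirs = flow / np.maximum(np.linalg.norm(flow, axis=1, keepdims=True), 1e-12)
    best_score, best_heading = -np.inf, None
    for heading in candidate_headings:
        tx, ty, tz = heading
        # Focus of expansion (FOE) on the image plane for this candidate
        # heading (assumes a forward translation component, tz > 0).
        foe = np.array([focal_length * tx / tz, focal_length * ty / tz])
        # Template: translational flow points radially away from the FOE.
        radial = points - foe
        radial = radial / np.maximum(np.linalg.norm(radial, axis=1, keepdims=True), 1e-12)
        # Score the template by its mean alignment with the measured flow.
        score = np.mean(np.sum(flow_dirs * radial, axis=1))
        if score > best_score:
            best_score, best_heading = score, np.asarray(heading)
    return best_heading, best_score

# Synthetic check: translational flow for a 30 degree heading, candidates 0-50 degrees.
rng = np.random.default_rng(0)
f = 500.0                                          # hypothetical focal length in pixels
pts = rng.uniform(-200.0, 200.0, size=(300, 2))    # image-plane sample points
depths = rng.uniform(2.0, 20.0, size=(300, 1))     # arbitrary scene depths
az = np.deg2rad(30.0)
t = np.array([np.sin(az), 0.0, np.cos(az)])        # true heading direction
flow = (np.array([-f * t[0], -f * t[1]]) + pts * t[2]) / depths
candidates = [(np.sin(np.deg2rad(a)), 0.0, np.cos(np.deg2rad(a))) for a in range(0, 51, 10)]
best, _ = heading_by_template_match(pts, flow, f, candidates)
print(np.rad2deg(np.arctan2(best[0], best[2])))    # expect 30.0

      With exact translational flow the correct candidate aligns perfectly with every measured flow direction (score 1.0), so the 30° heading is recovered in a single pass over the candidates, without iterative refinement.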
      Date
      2016
      Type
      Conference Contribution
      Rights
      This is an author’s accepted version of an article presented at the Image and Vision Computing New Zealand (IVCNZ 2016), 21-22 November 2016, Palmerston North, New Zealand. © 2016 Crown.
      Collections
      • Science and Engineering Papers [3124]