Far-field terrain evaluation using geometric and toposemantic vision

Alec Avedisyan, David Wettergreen, Terrence W. Fong, and Charles Baur
Workshop on Advanced Space Technologies for Robotics and Automation (ASTRA), November, 2004.



Abstract
In this paper, we describe how passive vision can be used to improve far-field navigation of planetary rovers, especially for detecting negative obstacles such as cliff edges, ditches, and escarpments. Far-field navigation fills the gap between close-up sensing for obstacle avoidance and satellite imagery. In particular, we can identify dangerous or scientifically interesting areas in distant terrain by processing color camera images.

Keywords
computer vision, navigation, mobile robots

Text Reference
Alec Avedisyan, David Wettergreen, Terrence W. Fong, and Charles Baur, "Far-field terrain evaluation using geometric and toposemantic vision," Workshop on Advanced Space Technologies for Robotics and Automation (ASTRA), November, 2004.

BibTeX Reference
@inproceedings{Avedisyan_2004_4740,
   author = "Alec Avedisyan and David Wettergreen and Terrence W. Fong and Charles Baur",
   title = "Far-field terrain evaluation using geometric and toposemantic vision",
   booktitle = "Workshop on Advanced Space Technologies for Robotics and Automation (ASTRA)",
   publisher = "ESA",
   month = "November",
   year = "2004",
}