
DROAN – Disparity-space Representation for Obstacle AvoidaNce

Geetesh Dubey, Sankalp Arora, and Sebastian Scherer
Conference Paper, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1324-1330, September 2017

Abstract

Agile MAVs are required to operate in cluttered, unstructured environments at high speeds and low altitudes for efficient data gathering. Given the payload constraints and long-range sensing requirements, cameras are the preferred sensing modality for MAVs. The computational burden of using cameras for obstacle sensing has forced state-of-the-art methods to construct world representations on a per-frame basis, leading to myopic decision making. In this paper we propose a long-range perception and planning approach using cameras. By utilizing FPGA hardware for disparity calculation and image space to represent obstacles, our approach and system design allow for the construction of a long-term world representation while accounting for highly non-linear noise models in real time. We demonstrate these obstacle avoidance capabilities on a quadrotor flying through dense foliage at speeds of up to 4 m/s for a total of 1.6 hours of autonomous flight. The presented approach enables high-speed, low-altitude navigation for MAVs performing terrestrial scouting.
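To illustrate why representing obstacles in disparity space is attractive, the sketch below uses the standard pinhole stereo model (not necessarily the paper's exact noise model): depth is inversely proportional to disparity, so a roughly constant pixel-level disparity error becomes a depth error that grows approximately quadratically with range. The focal length, baseline, and disparity noise values are placeholder assumptions for the example.

import numpy as np

def depth_and_sigma(disparity_px, focal_px=600.0, baseline_m=0.25, sigma_d_px=0.5):
    """Pinhole stereo model: depth z = f * B / d.
    A constant 1-sigma disparity error sigma_d maps to a depth error
    sigma_z ~= (z^2 / (f * B)) * sigma_d, i.e. it grows with range squared.
    focal_px, baseline_m, and sigma_d_px are illustrative values only."""
    z = focal_px * baseline_m / disparity_px
    sigma_z = (z ** 2) / (focal_px * baseline_m) * sigma_d_px
    return z, sigma_z

if __name__ == "__main__":
    # Near, mid, and far obstacles expressed as disparities (in pixels).
    for d in np.array([40.0, 10.0, 2.5]):
        z, s = depth_and_sigma(d)
        print(f"disparity {d:5.1f} px -> depth {z:6.2f} m, 1-sigma {s:6.2f} m")

Keeping obstacles in disparity (image) space means the measurement noise stays roughly Gaussian and constant per pixel, whereas projecting into metric depth first would require handling the strongly range-dependent uncertainty shown above.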

BibTeX

@conference{Dubey-2017-105932,
author = {Geetesh Dubey and Sankalp Arora and Sebastian Scherer},
title = {DROAN - Disparity-space Representation for Obstacle AvoidaNce},
booktitle = {Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems},
year = {2017},
month = {September},
pages = {1324 - 1330},
publisher = {IEEE},
}