
DROAN – Disparity-space Representation for Obstacle AvoidaNce

Geetesh Dubey, Sankalp Arora, Sebastian Scherer
Conference Paper, September 2017



Abstract

Agile Micro Aerial Vehicles (MAVs) are required to operate in cluttered, unstructured environments at high speeds and low altitudes for efficient data gathering. Given the payload constraints and long-range sensing requirements, cameras are the preferred sensing modality for MAVs. The computational burden of using cameras for obstacle sensing has forced state-of-the-art methods to construct world representations on a per-frame basis, leading to myopic decision making. In this paper, we propose a long-range perception and planning approach using cameras. By utilizing FPGA hardware for disparity calculation and image space to represent obstacles, our approach and system design allow for the construction of a long-term world representation while accounting for highly non-linear noise models in real time. We demonstrate these obstacle avoidance capabilities on a quadrotor flying through dense foliage at speeds of up to 4 m/s for a total of 1.6 hours of autonomous flight. The presented approach enables high-speed navigation at low altitudes for MAVs performing terrestrial scouting.
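The appeal of reasoning in disparity space can be seen from the standard pinhole-stereo relation Z = fB/d: roughly constant-variance disparity noise maps to a depth error that grows quadratically with range, i.e., a highly non-linear noise model in metric space. The minimal Python sketch below (not from the paper; the focal length, baseline, and disparity-noise values are hypothetical) illustrates this first-order error propagation.

```python
# Sketch of first-order propagation of disparity noise into metric depth,
# under a standard pinhole stereo model. All parameter values are assumed
# for illustration only and do not come from the DROAN paper.

f = 400.0       # focal length in pixels (assumed)
B = 0.12        # stereo baseline in meters (assumed)
sigma_d = 0.25  # disparity noise std-dev in pixels (assumed)

def depth_from_disparity(d):
    """Convert disparity d (pixels) to metric depth Z = f * B / d (meters)."""
    return f * B / d

def depth_sigma(d):
    """First-order depth uncertainty: sigma_Z = (Z^2 / (f * B)) * sigma_d.
    Depth error grows quadratically with range even though sigma_d is constant."""
    Z = depth_from_disparity(d)
    return (Z ** 2) / (f * B) * sigma_d

for d in [32.0, 8.0, 2.0]:
    Z = depth_from_disparity(d)
    print(f"disparity {d:5.1f} px -> depth {Z:6.2f} m, sigma_Z {depth_sigma(d):6.2f} m")
```

Keeping obstacles in disparity (image) space sidesteps this blow-up: the sensor noise stays simple there, which is one reason an image-space representation can remain consistent over long ranges.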

BibTeX Reference
@conference{Dubey-2017-27200,
  title = {DROAN – Disparity-space Representation for Obstacle AvoidaNce},
  author = {Geetesh Dubey and Sankalp Arora and Sebastian Scherer},
  month = {September},
  year = {2017},
}