Autonomous Flight in GPS-Denied Environments Using Monocular Vision and Inertial Sensors

A. D. Wu, E. N. Johnson, Michael Kaess, F. Dellaert and G. Chowdhary
Journal Article, AIAA J. of Aerospace Information Systems (JAIS), Vol. 10, No. 4, pp. 172-186, April, 2013


Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.


A vision-aided inertial navigation system that enables autonomous flight of an aerial vehicle in GPS-denied environments is presented. In particular, feature-point information from a monocular vision sensor is used to bound the drift that results from integrating accelerations and angular-rate measurements from an Inertial Measurement Unit (IMU) forward in time. An Extended Kalman filter framework is proposed for performing the tasks of vision-based mapping and navigation separately. When GPS is available, multiple observations of a single landmark point from the vision sensor are used to estimate the point's location in inertial space. When GPS is not available, points that have been sufficiently mapped out can be used for estimating vehicle position and attitude. Simulation and flight-test results of a vehicle operating autonomously in a simplified loss-of-GPS scenario verify the presented method.
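The core idea of the abstract, that landmark observations bound the drift of pure IMU integration, can be illustrated with a toy example. The sketch below is not the paper's EKF (which handles full 6-DOF pose and monocular bearing measurements); it is a minimal 1-D Kalman filter with hypothetical noise parameters, comparing dead-reckoned position from a biased accelerometer against the same integration corrected by periodic range observations of a previously mapped landmark.

```python
import numpy as np

# Toy 1-D illustration (not the paper's full EKF): integrating a biased
# accelerometer drifts without bound, while periodic range observations
# of a landmark at a known (previously mapped) position bound the error.
# All numbers here are hypothetical.

dt = 0.01          # 100 Hz IMU
steps = 2000       # 20 s of flight
landmark = 50.0    # mapped landmark position (assumed known)
bias = 0.05        # accelerometer bias (m/s^2) driving the drift

rng = np.random.default_rng(0)
true_p, true_v = 0.0, 1.0             # true position and velocity

x = np.array([0.0, 1.0])              # filter state [p, v]
P = np.eye(2) * 0.01                  # state covariance
F = np.array([[1.0, dt], [0.0, 1.0]]) # constant-velocity transition
Q = np.diag([1e-6, 1e-4])             # process noise per IMU step
H = np.array([[-1.0, 0.0]])           # z = landmark - p, so dz/dp = -1
R = np.array([[0.04]])                # range measurement variance

drift_only = np.array([0.0, 1.0])     # pure dead reckoning for comparison

for k in range(steps):
    accel_meas = 0.0 + bias           # true acceleration is zero; sensor
                                      # reports only its bias
    true_p += true_v * dt
    # Propagate both the filter state and the unaided dead reckoning
    # with the same biased IMU measurement.
    for s in (x, drift_only):
        s[0] += s[1] * dt + 0.5 * accel_meas * dt**2
        s[1] += accel_meas * dt
    P = F @ P @ F.T + Q
    # Landmark update at 10 Hz: noisy range to the mapped point.
    if k % 10 == 0:
        z = (landmark - true_p) + rng.normal(0.0, 0.2)
        y = z - (landmark - x[0])             # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain (2x1)
        x = x + K[:, 0] * float(y)
        P = (np.eye(2) - K @ H) @ P

print(f"drift-only position error:   {abs(drift_only[0] - true_p):.2f} m")
print(f"vision-aided position error: {abs(x[0] - true_p):.2f} m")
```

After 20 s the unaided integration has drifted by roughly 0.5 * bias * t^2 = 10 m, while the landmark-aided estimate stays within the measurement noise level, which is the drift-bounding behavior the abstract describes.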

author = {A. D. Wu and E. N. Johnson and Michael Kaess and F. Dellaert and G. Chowdhary},
title = {Autonomous Flight in GPS-Denied Environments Using Monocular Vision and Inertial Sensors},
journal = {AIAA J. of Aerospace Information Systems (JAIS)},
year = {2013},
month = {April},
volume = {10},
number = {4},
pages = {172-186},
}