An Autonomous Vision-Guided Helicopter

Omead Amidi
Doctoral dissertation, tech. report, Robotics Institute, Carnegie Mellon University, 1996


Helicopters are indispensable air vehicles for many applications, ranging from rescue and crime fighting to inspection and surveillance. They are most effective when flown in close proximity to objects of interest while performing tasks such as delivering critical supplies, rescuing stranded individuals, or inspecting damaged buildings. These tasks require dangerous flight patterns that put human pilots at risk. An unmanned helicopter that operates autonomously can carry out such tasks more effectively without risking human lives. The work presented in this dissertation develops an autonomous helicopter system for such applications. The system employs on-board vision for stability and for guidance relative to objects of interest in the environment.

Developing a vision-based helicopter positioning and control system is challenging for several reasons. First, helicopters are inherently unstable and capable of high acceleration rates. They are highly sensitive to control inputs and require high-frequency, low-delay feedback for stability. For stable hovering, for example, vision-based feedback rates must be at least 30-60 Hz with no more than 1/30-second latency. Second, since helicopters rotate at high angular rates to direct main rotor thrust for translational motion, it is difficult to disambiguate rotation from translation with vision alone when estimating helicopter 3D motion. Third, helicopters have limited on-board power and payload capacity; vision and control systems must be compact, efficient, and lightweight for effective on-board integration. Finally, helicopters are extremely dangerous and present major obstacles to the safe, calibrated experimentation needed to design and evaluate on-board systems.

This dissertation addresses these issues by developing: a "visual odometer" for helicopter position estimation, a real-time, low-latency vision machine architecture to implement an on-board visual odometer machine, and an array of innovative indoor testbeds for calibrated experimentation to design, build, and demonstrate an airworthy vision-guided autonomous helicopter. The odometer visually locks on to ground objects viewed by a pair of on-board cameras. Using high-speed image template matching, it estimates helicopter motion by sensing object displacements in consecutive images. The visual odometer is implemented with a custom-designed real-time, low-latency vision machine which modularly integrates field-rate (60 Hz) template matching processors, synchronized attitude sensing and image tagging circuitry, and image acquisition, convolution, and display hardware. The visual odometer machine, along with a carrier-phase differential Global Positioning System receiver, a classical PD control system, and human augmentation and safety systems, is integrated on-board a mid-sized helicopter, the Yamaha R50, for vision-guided autonomous flight.
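The template-matching step at the heart of the odometer can be sketched in software as follows. This is a minimal illustration only, assuming grayscale frames as NumPy arrays and an exhaustive sum-of-absolute-differences (SAD) search; the function name and search-window parameter are hypothetical and not taken from the dissertation, which implements matching in dedicated field-rate hardware.

```python
import numpy as np

def match_template(frame, template, center, search=8):
    """Locate `template` in `frame` by exhaustive SAD search in a
    (2*search+1)^2 window around `center` (top-left corner, (row, col)).
    Returns the best-matching top-left position."""
    th, tw = template.shape
    cy, cx = center
    best_sad, best_pos = None, center
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = cy + dy, cx + dx
            patch = frame[y:y + th, x:x + tw]
            if patch.shape != template.shape:
                continue  # candidate window ran off the image edge
            # Cast to int before subtracting to avoid uint8 wrap-around
            sad = np.abs(patch.astype(int) - template.astype(int)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_pos = sad, (y, x)
    return best_pos

# The displacement of a locked-on ground template between consecutive
# frames approximates image-plane motion; combined with attitude sensing
# and range from the stereo pair, it scales to helicopter motion.
```

In the dissertation this search runs at field rate (60 Hz) in custom hardware; a software loop like the one above illustrates the computation, not the performance.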

Associated Center(s) / Consortia: Vision and Autonomous Systems Center
Associated Lab(s) / Group(s): Helicopter Lab
Associated Project(s): Autonomous Helicopter

Text Reference
Omead Amidi, "An Autonomous Vision-Guided Helicopter," doctoral dissertation, tech. report, Robotics Institute, Carnegie Mellon University, 1996

BibTeX Reference
@phdthesis{Amidi_1996,
   author = "Omead Amidi",
   title = "An Autonomous Vision-Guided Helicopter",
   school = "Robotics Institute, Carnegie Mellon University",
   year = "1996",
}