Helicopters are indispensable air vehicles for applications ranging from rescue and crime fighting to inspection and surveillance. They are most effective when flown in close proximity to objects of interest while performing tasks such as delivering critical supplies, rescuing stranded individuals, or inspecting damaged buildings. These tasks require dangerous flight patterns that put human pilots at risk. An unmanned helicopter that operates autonomously can carry out such tasks more effectively without risking human lives. The work presented in this dissertation
develops an autonomous helicopter system for such applications. The system
employs on-board vision for stability and guidance relative to objects of interest in
the environment.
Developing a vision-based helicopter positioning and control system is challenging for several reasons. First, helicopters are inherently unstable and capable of high acceleration rates; they are highly sensitive to control inputs and require high-frequency, low-delay feedback for stability. For stable hovering, for example, vision-based feedback rates must be at least 30-60 Hz with no more than 1/30 second of latency. Second, since helicopters rotate at high angular rates to direct main rotor thrust for translational motion, it is difficult to disambiguate rotation from translation with vision alone when estimating helicopter 3D motion (illustrated below). Third, helicopters have limited on-board power and payload capacity; vision and control systems must be compact, efficient, and lightweight for effective on-board integration. Finally, helicopters are extremely dangerous, which poses major obstacles to the safe, calibrated experimentation needed to design and evaluate on-board systems.
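To make the second challenge concrete, consider a simple small-angle sketch (the pinhole camera model and the numbers here are illustrative assumptions, not the dissertation's derivation). For a downward-looking camera with focal length f at height h above a ground point on the optical axis, a small pitch rotation δθ and a small lateral translation δx shift the point's image by nearly identical amounts:

\[
\Delta u_{\mathrm{rot}} \approx f\,\delta\theta,
\qquad
\Delta u_{\mathrm{trans}} \approx \frac{f}{h}\,\delta x,
\]

so from a single image displacement, a rotation δθ is indistinguishable from a translation of roughly h δθ. At h = 10 m, for instance, a 0.5° pitch change mimics about 9 cm of lateral motion, which is why an independent attitude measurement is needed to separate the two.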
This dissertation addresses these issues by developing a “visual odometer” for helicopter position estimation, a real-time, low-latency vision machine architecture to implement the odometer on board, and an array of innovative indoor testbeds for calibrated experimentation to design, build, and demonstrate an airworthy vision-guided autonomous helicopter. The odometer visually locks on to ground objects viewed by a pair of on-board cameras. Using high-speed image template matching, it estimates helicopter motion by sensing object displacements in consecutive images.
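A minimal software sketch of this idea follows, using a sum-of-absolute-differences (SAD) template search; the function name, parameters, and search strategy are illustrative assumptions, and the dissertation's odometer performs this kind of matching in custom hardware at field rate rather than in software.

    import numpy as np

    def track_template(prev_img, curr_img, top_left, size, search=8):
        """Locate a template from the previous frame in the current frame.

        Returns the (dy, dx) pixel displacement that minimizes the sum of
        absolute differences (SAD) over a +/- search window. Illustrative
        software sketch only, not the dissertation's hardware matcher.
        """
        y, x = top_left
        h, w = size
        template = prev_img[y:y + h, x:x + w].astype(np.int32)
        best_score, best_disp = None, (0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                yy, xx = y + dy, x + dx
                if (yy < 0 or xx < 0 or
                        yy + h > curr_img.shape[0] or xx + w > curr_img.shape[1]):
                    continue  # candidate window falls outside the image
                candidate = curr_img[yy:yy + h, xx:xx + w].astype(np.int32)
                score = np.abs(candidate - template).sum()
                if best_score is None or score < best_score:
                    best_score, best_disp = score, (dy, dx)
        return best_disp

In an odometer built on this principle, each measured displacement would be combined with camera attitude and height above ground (available here from the attitude sensors and the stereo camera pair) to produce a 3D motion increment, and increments would be accumulated over frames to track position.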
The visual odometer is implemented with a custom-designed real-time, low-latency vision machine that modularly integrates field-rate (60 Hz) template matching processors, synchronized attitude sensing and image tagging circuitry, and image acquisition, convolution, and display hardware. The visual odometer machine, along with a carrier-phase differential Global Positioning System receiver, a classical PD control system, and human augmentation and safety systems, is integrated on board a mid-sized helicopter, the Yamaha R50, for vision-guided autonomous flight.
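As a closing illustration of the control side, a classical PD position law computes a command proportional to position error and damped by velocity. The single-axis sketch below, with made-up gains, is a generic illustration and not the R50 system's actual control law.

    def pd_command(pos_ref, pos, vel, kp=0.4, kd=0.25):
        """Single-axis PD law: command proportional to position error,
        damped by measured velocity. Gains are illustrative."""
        return kp * (pos_ref - pos) - kd * vel

    # Holding pos_ref = 0.0 m from pos = 1.5 m while drifting at 0.2 m/s:
    # pd_command(0.0, 1.5, 0.2) -> -0.65 (opposes both error and drift)

In the flight system, such a loop would run per axis on the odometer's position estimates, with the differential GPS providing an independent position reference.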