Aurora employs a downward-looking vision system consisting of a color video camera with a wide-angle lens, a digitizer, and a Sun Sparc portable workstation.
By applying a novel template-correlation method, it reliably tracks lane markers on the road at 60 Hz and estimates the vehicle's lateral displacement with an average absolute error of 0.8 cm.
Based on this estimate, the time-to-lane-crossing (TLC) is calculated for each image field, and a warning alarm is triggered when the TLC falls below a threshold.
Currently there are three warning modalities: visual, audible, and haptic (vibrating the steering wheel).
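The TLC logic described above can be sketched as follows. This is a minimal illustration, not Aurora's actual implementation: the lane half-width, the warning threshold, and the idea of estimating lateral velocity by differencing successive displacement estimates at the 60 Hz field rate are all assumptions made for the example.

```python
def time_to_lane_crossing(lateral_offset_m, lateral_velocity_mps,
                          half_lane_width_m=1.8):
    """Estimate seconds until the vehicle crosses the nearer lane boundary.

    lateral_offset_m: displacement from lane center (positive = rightward).
    lateral_velocity_mps: rate of change of that displacement; in a field-rate
        system this could be the difference of successive 60 Hz estimates
        divided by the field period (an assumption for this sketch).
    half_lane_width_m: assumed distance from lane center to either boundary.
    """
    if lateral_velocity_mps == 0.0:
        return float("inf")  # no lateral drift, so no crossing is predicted
    # Distance to the boundary the vehicle is drifting toward.
    if lateral_velocity_mps > 0:
        distance_m = half_lane_width_m - lateral_offset_m
    else:
        distance_m = half_lane_width_m + lateral_offset_m
    tlc_s = distance_m / abs(lateral_velocity_mps)
    return max(tlc_s, 0.0)  # clamp: already past the boundary


def should_warn(tlc_s, threshold_s=1.5):
    """Trigger the alarm when the predicted crossing is imminent."""
    return tlc_s < threshold_s
```

For example, a vehicle 1.0 m right of center and drifting right at 0.8 m/s has 0.8 m left to the boundary, giving a TLC of 1.0 s, which would trip a 1.5 s threshold; evaluating this every image field gives the per-field triggering behavior described above.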
The Robotics Institute is part of the School of Computer Science, Carnegie Mellon University.