Cooperative Perception and Localization for Cooperative Driving

Aaron Miller, Kyungzun Rim, Parth Chopra, Paritosh Kelkar, and Maxim Likhachev
Conference Paper, Proceedings of the International Conference on Robotics and Automation (ICRA), pp. 1256-1262, May 2020

Abstract

Fully autonomous vehicles are expected to share the road with less advanced vehicles for a significant period of time. Furthermore, an increasing number of vehicles on the road are equipped with a variety of low-fidelity sensors, which provide some perception and localization data but not at a high enough quality for full autonomy. In this paper, we develop a perception and localization system that allows a vehicle with low-fidelity sensors to incorporate high-fidelity observations from a vehicle in front of it, allowing both vehicles to operate with full autonomy. The resulting system generates perception and localization information that is low-noise in regions covered by high-fidelity sensors and avoids false negatives in areas observed only by low-fidelity sensors, while handling latency and dropout on the communication link between the two vehicles. At its core, the system uses a set of extended Kalman filters that incorporate observations from both vehicles' sensors and extrapolate them using information about the road geometry. The perception and localization algorithms are evaluated both in simulation and on real vehicles as part of a full cooperative driving system.
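
The paper itself does not include code, but as a rough illustration of the filtering idea described in the abstract, the sketch below fuses position observations of a single obstacle from the ego vehicle's low-fidelity sensor and the lead vehicle's high-fidelity sensor, compensating for communication latency by predicting the filter forward to each measurement's time stamp. The class name, noise parameters, and constant-velocity model are assumptions made for illustration; with these linear models the extended Kalman filter update reduces to the standard Kalman form shown here, and the paper's road-geometry extrapolation is omitted.

```python
"""Minimal sketch (not from the paper) of fusing ego- and lead-vehicle
observations of a tracked obstacle with a Kalman-style filter.
All models and noise values are illustrative assumptions."""
import numpy as np


class CooperativeTrackFilter:
    def __init__(self, x0, P0):
        self.x = np.asarray(x0, dtype=float)   # state: [px, py, vx, vy]
        self.P = np.asarray(P0, dtype=float)   # state covariance
        self.t = 0.0                           # filter time (s)

    def predict(self, t_new, q=0.5):
        """Constant-velocity prediction up to time t_new (assumed model)."""
        dt = max(t_new - self.t, 0.0)
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        Q = q * dt * np.eye(4)                 # simple process noise (assumed)
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q
        self.t = t_new

    def update_position(self, z, t_meas, sigma):
        """Fuse a 2-D position measurement taken at time t_meas.

        A delayed message from the lead vehicle is handled by predicting
        the filter to the measurement time stamp before updating;
        out-of-sequence measurements are simply dropped in this sketch.
        """
        if t_meas < self.t:
            return                             # stale packet (latency/dropout)
        self.predict(t_meas)
        H = np.array([[1, 0, 0, 0],
                      [0, 1, 0, 0]], dtype=float)
        R = (sigma ** 2) * np.eye(2)           # per-sensor measurement noise
        y = np.asarray(z, dtype=float) - H @ self.x
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P


# Example: both vehicles observe the same obstacle. The lead vehicle's
# observation is accurate but arrives with delay over the V2V link; the
# ego vehicle's observation is local but noisy.
ekf = CooperativeTrackFilter(x0=[0, 0, 10, 0], P0=np.eye(4))
ekf.update_position(z=[1.0, 0.1], t_meas=0.1, sigma=0.1)  # high-fidelity, delayed
ekf.update_position(z=[2.3, 0.4], t_meas=0.2, sigma=1.0)  # low-fidelity, local
print(ekf.x)
```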

BibTeX

@conference{Miller-2020-125506,
author = {Aaron Miller and Kyungzun Rim and Parth Chopra and Paritosh Kelkar and Maxim Likhachev},
title = {Cooperative Perception and Localization for Cooperative Driving},
booktitle = {Proceedings of the International Conference on Robotics and Automation (ICRA)},
year = {2020},
month = {May},
pages = {1256--1262},
}