Some Core Techniques for Safe Autonomous Driving
Reliable curb detection is critical for safe autonomous driving in urban contexts; curb detection and tracking also support vehicle localization and path planning. Past work used a 3D LiDAR sensor to obtain accurate distance information and the geometric attributes of curbs. However, such an approach requires dense point cloud data and is vulnerable to false positives from obstacles present in both road and off-road areas. In this effort, the research team proposes an approach that detects and tracks curbs by fusing data from multiple sensors: sparse LiDAR data, a mono camera, and low-cost ultrasonic sensors. The detection algorithm uses a single 3D LiDAR and a mono camera to identify candidate curb features while effectively removing false positives arising from surrounding static and moving obstacles. The tracking algorithm boosts detection accuracy through Kalman filter-based prediction fused with lateral distance information from the low-cost ultrasonic sensors.

The team will also conduct a complementary effort with the following goals. Autonomous vehicles promise significant advances in transportation safety, efficiency, and comfort. However, achieving full autonomy is impeded by several operational challenges encountered in practice. Recognizing the gestures of flagmen on roads is one such challenge: an autonomous vehicle must make safe decisions and maintain forward progress in the presence of road construction workers and flagmen, yet human gestures under diverse environmental conditions vary widely and exhibit significant complexity. In this effort, the team proposes (1) a taxonomy of challenges for organizing traffic gestures, (2) a sizeable flagman gesture dataset, and (3) extensive experiments on practical algorithms for gesture recognition.
The team will categorize traffic gestures according to their semantics, flagman appearances, and the environmental context. The team will then collect a dataset covering a range of common flagman gestures with and without props such as signs and flags. Finally, the team will develop a recognition algorithm using different feature representations of the human pose and perform extensive ablation experiments on each component.
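The Kalman filter-based curb tracking described above can be sketched in miniature as a one-dimensional filter: the state is the curb's lateral distance, which is predicted each cycle and then corrected (fused) with a noisy ultrasonic range reading. This is an illustrative sketch only; the class name, noise values, and simulated readings are assumptions for the example, not parameters of the project.

```python
class LateralCurbTracker:
    """Hypothetical 1D Kalman filter tracking lateral distance to a curb."""

    def __init__(self, initial_distance, process_var=0.05, ultrasonic_var=0.10):
        self.x = initial_distance   # estimated lateral distance to curb (m)
        self.p = 1.0                # variance of the estimate
        self.q = process_var        # process noise: drift between cycles (assumed)
        self.r = ultrasonic_var     # ultrasonic measurement noise (assumed)

    def predict(self, lateral_motion=0.0):
        # Propagate the estimate by the vehicle's lateral motion since the
        # last cycle; uncertainty grows by the process noise.
        self.x += lateral_motion
        self.p += self.q

    def update(self, ultrasonic_range):
        # Fuse the prediction with the ultrasonic lateral-distance reading.
        k = self.p / (self.p + self.r)            # Kalman gain
        self.x += k * (ultrasonic_range - self.x)
        self.p *= (1.0 - k)
        return self.x


tracker = LateralCurbTracker(initial_distance=1.5)
for z in (1.48, 1.52, 1.47, 1.50):                # simulated readings (m)
    tracker.predict(lateral_motion=0.0)
    estimate = tracker.update(z)
```

The fusion step weights the prediction against each measurement by their relative variances, so a noisy ultrasonic reading nudges, rather than overwrites, the LiDAR/camera-derived estimate.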
Language
- English
Project
- Status: Active
- Funding: $224,572
- Contract Numbers: 69A3551747111
- Sponsor Organizations:
  National University Transportation Center for Improving Mobility (Mobility21)
  Carnegie Mellon University
  Pittsburgh, PA United States 15213

  Office of the Assistant Secretary for Research and Technology
  University Transportation Center Program
- Managing Organizations:
  National University Transportation Center for Improving Mobility (Mobility21)
  Carnegie Mellon University
  Pittsburgh, PA United States 15213
- Project Managers: Kline, Robin
- Performing Organizations: Carnegie Mellon University
- Principal Investigators: Rajkumar, Raj
- Start Date: 20191101
- Expected Completion Date: 20230630
- Actual Completion Date: 0
- USDOT Program: University Transportation Centers
Subject/Index Terms
- TRT Terms: Algorithms; Autonomous vehicles; Curbs; Data fusion; Detection and identification system applications; Flaggers; Kalman filtering; Laser radar; Sensors; Work zones
- Subject Areas: Data and Information Technology; Highways; Safety and Human Factors; Vehicles and Equipment
Filing Info
- Accession Number: 01767054
- Record Type: Research project
- Source Agency: National University Transportation Center for Improving Mobility (Mobility21)
- Contract Numbers: 69A3551747111
- Files: UTC, RIP
- Created Date: Mar 17 2021 4:23PM