CVPR2019 Tutorial
Apollo, Open Autonomous Driving Platform
Apollo is the largest open autonomous driving platform, with a full stack of hardware and software developed by the autonomous driving community. We will present ongoing research in Apollo and discuss future directions of autonomous driving, focusing on five topics: perception, simulation, sensor fusion, localization, and control. 1) Perception: we will review the pros and cons of each sensor and discuss what functionality and level of autonomy can be achieved with such sensors. We will also discuss the main issues in reaching L4 autonomy with cameras. 2) Simulation: we will demonstrate game-engine-based simulation for training and evaluating perception algorithms using camera and lidar sensors. 3) Sensor fusion: we will present how to learn a prior and a belief function for each sensor and fuse all sensor outputs using Dempster-Shafer theory. 4) Localization: we will present large-scale automated HD map generation and show a highly precise localization algorithm integrating GNSS, IMU, camera, and lidar. 5) Control: we will explain how to perform multiple-iteration optimization in planning and introduce learning-based dynamic control modeling and its application in simulation.
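To give a flavor of the sensor-fusion topic, the sketch below shows Dempster's rule of combination, the core operation of Dempster-Shafer theory mentioned above. The sensor names and mass values are illustrative only; the tutorial itself covers how such belief functions are learned per sensor.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0  # K: total mass landing on empty (conflicting) intersections
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict")
    # Normalize the surviving mass by 1 - K
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical example: camera and lidar each assign belief mass to
# {vehicle}, {pedestrian}, or the whole frame (i.e., "don't know").
FRAME = frozenset({"vehicle", "pedestrian"})
camera = {frozenset({"vehicle"}): 0.6, FRAME: 0.4}
lidar = {frozenset({"vehicle"}): 0.7, frozenset({"pedestrian"}): 0.1, FRAME: 0.2}

fused = dempster_combine(camera, lidar)
```

After combination, the fused masses again sum to one, and agreement between the two sensors concentrates belief on the vehicle hypothesis while conflicting mass is discarded and renormalized.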
The tutorial will be held on June 17 at the Long Beach Convention Center, Room 203B.
Course Outline
09:00 - 09:15
Introduction of Apollo – Tae Eun Choe
09:15 - 09:45
Perception for low-cost self-driving (L3) – Tae Eun Choe
09:45 - 10:15
Perception for fully autonomous driving (L4) – Liang Wang
10:15 - 10:25
Sensor fusion – Liang Wang
10:25 - 10:45
Q/A and break
10:45 - 11:15
Mapping/Localization – Shiyu Song
11:15 - 11:45
Prediction/Planning/Control – Jiangtao Hu
11:45 - 12:15
ApolloScape – Open data and tools for autonomous driving research – Ruigang Yang
12:15 - 12:45
Synthetic data generation for camera-based perception training and validation – Jaewon Jung
12:45 - 12:55
Discussion
Confirmed Speakers