Larry Jackel (NVIDIA) - CS Special Guest Talk
Thursday, 19.10.2017, 14:30
This talk describes a system, known as PilotNet, that provides perception and control for self-driving cars. PilotNet learns from data, emulating the behavior of human drivers; the use of hand-crafted rules is kept to a minimum. As we gather more data, we find that PilotNet's performance keeps improving. Our test car, controlled by PilotNet, can drive, on average, over 20 miles on a highway before a human intervention is required. The core technology in PilotNet is convolutional neural networks (CNNs). These nets can perform lane keeping, lane changes, and turns at intersections. Similar nets have shown promise in identifying the presence and location of stop signs using only weakly supervised learning. To better understand how PilotNet works, we built a visualization tool that runs in real time on our test car and highlights the objects in the input image that are most salient in determining PilotNet's output.
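The salience-visualization idea described above can be illustrated with a small sketch. The code below is not the talk's actual tool; it is a minimal, hypothetical illustration of one common way to build such a salience map from a CNN's intermediate activations (channel-average each layer's feature maps, then repeatedly upsample and multiply from the deepest layer back toward input resolution, in the spirit of NVIDIA's VisualBackProp method). The layer shapes, the factor-of-2 upsampling, and the normalization step are all assumptions for the example.

```python
import numpy as np

def upsample(mask, factor):
    # Nearest-neighbor upsampling via a Kronecker product with a block of ones.
    return np.kron(mask, np.ones((factor, factor)))

def salience_map(feature_maps):
    """Combine per-layer CNN activations into a single salience map.

    feature_maps: list of arrays shaped (channels, H, W), ordered from the
    deepest (smallest) layer to the shallowest; each layer is assumed to be
    twice the resolution of the previous one (an assumption of this sketch).
    """
    # Start from the channel-averaged activations of the deepest layer.
    mask = feature_maps[0].mean(axis=0)
    for fmap in feature_maps[1:]:
        # Upsample the running mask to this layer's resolution, then gate
        # the layer's channel-averaged activations by pointwise multiplication.
        mask = upsample(mask, 2) * fmap.mean(axis=0)
    # Normalize to [0, 1] so the mask can be overlaid on the input image.
    mask -= mask.min()
    if mask.max() > 0:
        mask /= mask.max()
    return mask

# Example: three layers of dummy activations at 4x4, 8x8, and 16x16.
rng = np.random.default_rng(0)
maps = [rng.random((8, 4, 4)), rng.random((16, 8, 8)), rng.random((3, 16, 16))]
print(salience_map(maps).shape)  # (16, 16)
```

Regions where strong activations coincide across layers survive the repeated multiplication, so the final mask highlights the input pixels that most influenced the network's output.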