Prof. Daniel Cremers, Computer Science, Technical University of Munich
Real-Time Direct Visual SLAM for Autonomous Systems


The reconstruction of the 3D world from moving cameras has seen enormous progress over the last few years. Already in the 2000s, researchers pioneered algorithms that reconstruct camera motion and sparse feature points in real time. In my talk, I will introduce direct methods for camera tracking and 3D reconstruction which do not require feature point estimation, exploit all available input data, and recover dense or semi-dense geometry rather than sparse point clouds. Experimental results confirm that direct approaches yield a drastic boost in precision and robustness. I will present recent developments in Simultaneous Localization and Mapping (SLAM) using monocular and stereo cameras, inertial sensors, and deep neural networks, with applications to autonomous systems.
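To make the distinction concrete, direct methods minimize a photometric error (intensity differences between frames) rather than the reprojection error of matched feature points. The sketch below is a minimal, assumed illustration of a single photometric residual under a pinhole camera model; all names, values, and the nearest-neighbour lookup are simplifications, not the speaker's actual implementation.

```python
import numpy as np

# Illustrative pinhole intrinsics (assumed values).
K = np.array([[100.0, 0.0, 32.0],
              [0.0, 100.0, 32.0],
              [0.0, 0.0, 1.0]])

def warp(u, v, inv_depth, R, t):
    """Back-project pixel (u, v) with its inverse depth, apply the
    rigid-body motion (R, t), and re-project into the second camera."""
    p = np.linalg.inv(K) @ np.array([u, v, 1.0]) / inv_depth  # 3D point
    q = K @ (R @ p + t)                                       # project
    return q[0] / q[2], q[1] / q[2]

def photometric_residual(I1, I2, u, v, inv_depth, R, t):
    """Direct-method residual r = I1(u, v) - I2(warp(u, v)).
    Nearest-neighbour lookup for brevity; real systems interpolate."""
    u2, v2 = warp(u, v, inv_depth, R, t)
    return I1[v, u] - I2[int(round(v2)), int(round(u2))]

# Toy check: identical intensity ramps and identity motion
# should give a zero photometric residual.
I1 = np.tile(np.arange(64, dtype=float), (64, 1))
I2 = I1.copy()
r = photometric_residual(I1, I2, 40, 30, 0.5, np.eye(3), np.zeros(3))
```

A direct SLAM system stacks many such residuals (over all usable high-gradient or dense pixels) and jointly optimizes pose and geometry, which is why it can exploit far more of the image than sparse feature matching.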