Accurate and robust state estimation, together with an efficient representation of the robot's environment, lies at the core of many robotic tasks such as obstacle avoidance or planning. These tasks are essential for robotic applications in real-world scenarios, such as inspection or agricultural robotics. Recent advances have brought new types of sensors to the field, each unlocking new ways to improve the performance of the perception system, but also bringing its own set of challenges. Finding ways to wisely fuse these new sensors with conventional ones is a key challenge in current robotics research.
In this seminar, we will cover topics from the research areas introduced above. The presented works will range from classical, widely used publications to very recent developments in the field. In the first part, we will discuss how new sensor types (e.g., event cameras) or the fusion of additional sensor modalities (e.g., LiDAR, GPS) can enhance the performance of classical visual-only approaches. In the second part, we will investigate how novel map representations can be used and applied in real-world systems.
All course-related material can be found here (password required).