Robotics III: Sensors and Perception in Robotics
- type: Lecture (V)
- chair: IAR Asfour
- semester: SS 2026
- time:
Thu, 14:00 - 15:30, weekly
Dates: 2026-04-23, 2026-04-30, 2026-05-07, 2026-05-21, 2026-06-11, 2026-06-18, 2026-06-25, 2026-07-02, 2026-07-09, 2026-07-16, 2026-07-23, 2026-07-30
Location: 50.34 Raum -102 (UG), INFORMATIK, Kollegiengebäude am Fasanengarten (1. Untergeschoss)
- lecturer:
Prof. Dr.-Ing. Tamim Asfour
Prof. Dr. Rudolph Triebel
- sws: 2
- lv-no.: 2400067
- information: On-Site
| Content | This lecture complements the lecture “Robotics I” and provides a comprehensive overview of sensors and perception methods in robotics. It is divided into two main parts. The first part introduces fundamental concepts of perception in robotics, including the distinction between sensation and perception within the perception-cognition-action loop. It covers sensor fundamentals, such as sensor characteristics (resolution, range, accuracy, bandwidth), analog-to-digital conversion, and common sensor classification schemes. Proprioceptive sensors for measuring the robot’s internal state are discussed, including encoders (optical, magnetic, absolute, and incremental), force/torque sensors, and inertial measurement units (IMUs). The lecture then covers exteroceptive sensors, including proximity sensors, range sensors (LiDAR, time-of-flight cameras, ultrasonic sensors), visual sensors (monocular, stereo, RGB-D cameras), and tactile sensing technologies (capacitive, resistive, and optical). The second part focuses on processing and interpreting sensor data. Topics in robot vision include image representation, feature detection and matching, object detection and recognition, semantic segmentation, and corresponding deep learning approaches. Point cloud processing is covered with emphasis on data structures, registration, surface reconstruction, and semantic segmentation. The lecture also addresses perception for manipulation, covering object pose estimation, grasp detection, visual servoing, active vision, and haptic exploration. It concludes with the simultaneous localization and mapping (SLAM) problem, including EKF SLAM, Graph SLAM, and FastSLAM. |
| Learning outcomes | Students can explain the main sensor principles used in robotics and distinguish between proprioceptive sensors (encoders, force/torque sensors, IMUs) and exteroceptive sensors (proximity, range, visual, and tactile sensors). They understand and can characterize sensor properties such as resolution, range, accuracy, and bandwidth, and can explain the principles of analog-to-digital conversion. Students are able to propose, analyze, and justify suitable sensor concepts for specific robotic tasks, taking into account trade-offs between different sensor modalities. Students can apply fundamental perception methods for robotic applications, including robot vision techniques such as feature detection and matching, object detection and recognition, and semantic segmentation, including deep learning-based approaches. They can describe and analyze perception methods for manipulation tasks, including object pose estimation, grasp detection, active vision strategies, and haptic exploration. In addition, students can explain point cloud processing methods for registration and surface reconstruction, as well as principles of simultaneous localization and mapping (SLAM). |
| Language of instruction | English |
| Bibliography | Lecture slides will be provided during the course. Accompanying literature references regarding the individual topics of the lecture will be provided. |
| Organisational issues | The assessment is carried out as a written examination (§ 4 Abs. 2 No. 1 SPO) of, in general, 60 minutes. |
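As an illustration of one of the proprioceptive sensing topics listed above (incremental encoders), the following is a minimal, hedged sketch of quadrature decoding in Python. It is not course material; the function name and data representation are our own assumptions. An incremental encoder outputs two square waves (channels A and B) in quadrature, and the order of state transitions determines the direction of rotation:

```python
# Illustrative sketch (not from the course): decoding an incremental
# quadrature encoder. Channels A and B are 90 degrees out of phase;
# the 2-bit state (A << 1) | B steps through a Gray-code cycle, and the
# direction of each transition gives the sign of the tick.

# Valid transitions: (previous_state, new_state) -> +1 (one direction)
# or -1 (the other). Invalid jumps (e.g. both channels flipping at
# once) are ignored as noise.
_TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def count_ticks(samples):
    """Accumulate a signed tick count from a sequence of (A, B) samples."""
    count = 0
    prev = (samples[0][0] << 1) | samples[0][1]
    for a, b in samples[1:]:
        state = (a << 1) | b
        if state != prev:
            count += _TRANSITIONS.get((prev, state), 0)
            prev = state
    return count
```

One full electrical cycle in one direction yields +4 ticks, in the other direction -4; dividing the accumulated count by four times the encoder's pulses per revolution gives the shaft angle.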