WALK-MAN: Whole-body Adaptive Locomotion and Manipulation
- Contact:
- Funding: EU-FP7
- Startdate: 2013
- Enddate: 2017
Description
WALK-MAN aims to enhance the capabilities of existing humanoid robots, permitting them to assist or replace humans in emergency situations, including rescue operations in damaged or dangerous sites such as destroyed buildings or power plants. The WALK-MAN robot will demonstrate human-level capabilities in locomotion, balance, and manipulation. The scenario challenges the robot in several ways: walking on unstructured terrain, in cluttered environments, and among crowds of people, as well as crawling over debris piles. The project's results will be evaluated in realistic scenarios, in consultation with civil defence bodies.
KIT leads the tasks concerning multimodal perception for loco-manipulation and the representation of whole-body affordances. The partly unknown environments in which the robot has to operate motivate an exploration-based approach to perception, integrating whole-body actions with multimodal sensory information such as visual, haptic, inertial, and proprioceptive data. For the representation of whole-body affordances, i.e. conjoint perception-action representations of whole-body actions associated with objects and/or environmental elements, we will rely on our previous work on Object-Action Complexes (OACs): a grounded representation of sensorimotor experience that binds objects, actions, and attributes in a causal model and links sensorimotor information to symbolic information. We will investigate the transferability of grasping OACs to balancing OACs, inspired by the analogy between a stable whole-body configuration and a stable grasp of an object, as illustrated by the sketch below.
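To make the OAC idea more concrete, the following is a minimal illustrative sketch of an OAC-like data structure. It is not the project's implementation; all class names, fields, and the "brace against a wall" example are hypothetical. It only shows the binding of an action to its target, its preconditions, and its predicted attribute changes, together with a grounding function that maps sensorimotor measurements to symbolic attributes.

```python
# Hypothetical OAC-like structure (illustrative only, not the WALK-MAN implementation).
# An OAC binds an action to the attributes it requires before execution and the
# attributes it predicts afterwards, grounding symbolic labels in sensorimotor data.
from dataclasses import dataclass
from typing import Callable, Dict

SensorimotorState = Dict[str, float]   # e.g. {"wall_distance": 0.5, "com_margin": 0.01}
SymbolicState = Dict[str, bool]        # e.g. {"wall_reachable": True, "posture_stable": False}


@dataclass
class ObjectActionComplex:
    """Binds an object/environmental element, an action, and a causal model."""
    name: str                                              # e.g. "brace_against_wall"
    target: str                                            # object or environmental element acted on
    precondition: Callable[[SymbolicState], bool]          # attributes required before execution
    prediction: Callable[[SymbolicState], SymbolicState]   # predicted attribute changes
    ground: Callable[[SensorimotorState], SymbolicState]   # sensorimotor -> symbolic link

    def applicable(self, sensed: SensorimotorState) -> bool:
        return self.precondition(self.ground(sensed))

    def expected_outcome(self, sensed: SensorimotorState) -> SymbolicState:
        return self.prediction(self.ground(sensed))


# Example: a "balance by bracing" OAC, constructed by analogy to a grasping OAC.
brace_oac = ObjectActionComplex(
    name="brace_against_wall",
    target="wall",
    precondition=lambda s: s.get("wall_reachable", False),
    prediction=lambda s: {**s, "posture_stable": True},
    ground=lambda m: {
        "wall_reachable": m.get("wall_distance", 1e9) < 0.8,
        "posture_stable": m.get("com_margin", 0.0) > 0.05,
    },
)

if __name__ == "__main__":
    sensed = {"wall_distance": 0.5, "com_margin": 0.01}
    print(brace_oac.applicable(sensed))        # True: the wall is within reach
    print(brace_oac.expected_outcome(sensed))  # predicts a stable posture after bracing
```

In this sketch the analogy between grasping and balancing shows up as a shared structure: the same precondition/prediction/grounding triple could describe a stable grasp of an object or a stable whole-body configuration against a support surface.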