Videos

(more recent videos can be found here and here)

Intelligent whole-body reaching
The robot learns a motor representation of its workspace (reachable/non-reachable space) and uses it to perform whole-body reaching. Article online.

From eye-hand to eye-tool coordination
The robot learns internal sensorimotor models that support eye-limb coordination and allow the inclusion of tools in the body schema. Article online.

Learning how to reach for a visually identified object
The robot combines visual and proprioceptive information to incrementally learn sensorimotor models for eye-neck-arm-hand coordination. Article online.

After some motor experience, the learned models can be successfully used to track objects with the eyes and head, and to reach for them with the hand.

Machine-learning-based control of a tendon-driven flexible neck
After some training movements, during which online machine learning is performed on the gathered sensory data, the head orientation can be precisely controlled. Article online.

Hand obstacle avoidance based on tactile sensing
Compliant control of the hand based on the tactile feedback measured by the robot's pressure-sensitive fingertips. Article online.