Perception

The perception group's research focuses on developing the sensing capabilities of the quadruped robot HyQ, in order to make the robot more autonomous during navigation.

HyQ (Hydraulic Quadruped) has been designed to perform highly dynamic motions such as trotting, running and jumping, as well as locomotion over rough terrain.

The perception system is required to provide localization and mapping capabilities, as well as the recognition skills that will be needed once the robot is equipped with arms.

The perception sensors are:

  • A Lidar, for large-scale navigation (computing a global path, avoiding obstacles); see the path-planning sketch below.
  • A Kinect, for foothold planning, localization and mapping in indoor conditions.
  • A pan-and-tilt stereo camera, for path planning, localization and mapping in indoor/outdoor operations, and for semantic mapping.

A vision computer equipped with a GPU currently performs all of the on-board vision processing.
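
As a concrete illustration of the global path computation the Lidar supports, here is a minimal A* sketch over a 2-D occupancy grid, assuming such a grid is accumulated from the scans; the grid, start and goal are illustrative, and A* stands in for whichever planner actually runs on the robot.

```python
import heapq

def astar(grid, start, goal):
    """grid[r][c] == 1 marks an obstacle cell; returns a list of (row, col)
    cells from start to goal, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    def h(cell):  # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    open_set = [(h(start), start)]
    came_from = {start: None}
    g_cost = {start: 0}
    while open_set:
        _, cell = heapq.heappop(open_set)
        if cell == goal:  # walk the parent links back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_cost[cell] + 1
                if ng < g_cost.get(nb, float("inf")):  # cheaper route found
                    g_cost[nb] = ng
                    came_from[nb] = cell
                    heapq.heappush(open_set, (ng + h(nb), nb))
    return None

# Illustrative grid: 1 = obstacle. Prints the cell sequence of the path.
grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))
```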

The main tasks of the perception system are:

  • Simultaneous Localization and Mapping (SLAM), which includes state estimation,
  • Vision-based terrain modeling for locomotion with foothold planning (an accurate 3-D model and a traversability map that includes the terrain type); see the sketch after this list,
  • Large-scale mapping and localization (Lidar), and fusion with the camera map,
  • Terrain/obstacle classification for semantic mapping,
  • Object recognition and manipulation.
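
As a sketch of how a traversability map can be derived from a vision-based terrain model, the following scores each cell of a 2.5-D elevation map by local slope and roughness; the thresholds and cell size are illustrative assumptions, not HyQ's actual tuning, and terrain-type classification is omitted.

```python
import numpy as np

def traversability(elevation, cell_size=0.02, max_slope=0.4, max_roughness=0.03):
    """elevation: 2-D array of terrain heights [m], one value per grid cell.
    Returns a boolean map: True where the cell looks safe to step on.
    max_slope (rise/run) and max_roughness (height std in metres) are
    illustrative thresholds."""
    # Slope: magnitude of the height gradient between neighbouring cells.
    gy, gx = np.gradient(elevation, cell_size)
    slope = np.hypot(gx, gy)
    # Roughness: standard deviation of heights over a 3x3 neighbourhood.
    pad = np.pad(elevation, 1, mode="edge")
    rows, cols = elevation.shape
    windows = np.stack([pad[r:r + rows, c:c + cols]
                        for r in range(3) for c in range(3)])
    roughness = windows.std(axis=0)
    return (slope <= max_slope) & (roughness <= max_roughness)

# Flat ground with a single 10 cm step: cells along the edge become unsafe.
z = np.zeros((8, 8))
z[:, 4:] = 0.10
print(traversability(z).astype(int))
```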

Key barriers:

  • Coupling SLAM with posture control and foothold planning,
  • A highly dynamic platform (the impact of slippage),
  • Limited computational power,
  • Unstructured environments,
  • Changing environments,
  • Limited a priori knowledge.

To date, a few of the robot's behaviors already include vision:

  • Auto-calibration of the camera position and orientation by leg tracking,
  • Colored object tracking for heading control (first sketch below),
  • Trotting with vision-based step height adjustment and gait adjustment (second sketch below),
  • Pan-and-tilt vision-based motion compensation,
  • Mapping while trotting.
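
First sketch: colored object tracking for heading control, assuming an OpenCV HSV-threshold pipeline; the color bounds, field of view and the linear pixel-to-bearing mapping are illustrative assumptions, not the HyQ implementation.

```python
import cv2
import numpy as np

HFOV_RAD = np.deg2rad(60.0)          # assumed horizontal field of view
LOWER_HSV = np.array([0, 120, 80])   # illustrative bounds for a red target
UPPER_HSV = np.array([10, 255, 255])

def heading_error(bgr_frame):
    """Returns the bearing [rad] from the optical axis to the target
    centroid (positive = target to the right), or None if it is lost."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)   # binary mask, 0 or 255
    m = cv2.moments(mask)
    if m["m00"] < 1e3:          # m00 is 255 x matching pixels: target lost
        return None
    cx = m["m10"] / m["m00"]    # centroid column of the target in pixels
    half_w = bgr_frame.shape[1] / 2
    # Linear pixel-to-bearing mapping (small-angle pinhole approximation).
    return (cx - half_w) / half_w * (HFOV_RAD / 2)

# Synthetic test: a red square right of centre gives a positive error that
# the heading controller would steer towards.
frame = np.zeros((480, 640, 3), np.uint8)
frame[200:280, 450:530] = (0, 0, 255)  # BGR red patch
print(heading_error(frame))
```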
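
Second sketch: vision-based step height adjustment, assuming an elevation profile ahead of the swing foot is available from the terrain map; the clearance margin, apex limit and function interface are all illustrative.

```python
import numpy as np

def swing_apex(profile, foot_height, clearance=0.05, apex_max=0.25):
    """profile: terrain heights [m] sampled ahead of the foot along the
    swing direction. Raises the swing apex just enough to clear the
    highest point plus a safety margin, capped at a mechanical limit."""
    apex = (float(np.max(profile)) - foot_height) + clearance
    return float(np.clip(apex, clearance, apex_max))

profile = np.array([0.00, 0.02, 0.12, 0.12])  # a 12 cm step ahead
print(swing_apex(profile, foot_height=0.0))   # -> 0.17
```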