Robotics and Biology Laboratory

Robotic Hands

For the Meka arms and the Barrett WAM arm we have three hands for different uses. In addition, we have an Allegro hand and, most recently, the soft hands developed in the SoMa project, among them the PISA/IIT SoftHand and the RBO Hand 3, which we manufacture in our lab (for more information, see the Soft Hand Building Lab below).

Soft Hand Building Lab

To build the RBO Hand 2 and other soft robotic parts, we have a workshop with tools for silicone molding and for the assembly of the hands.

We also have the infrastructure to design and build our custom PneumaticBox for the pneumatic control of the hands.

A tutorial on building the RBO Hand can be found here: PneuFlex tutorial
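The PneumaticBox regulates the air pressure in each chamber of a soft hand. A minimal sketch of what such a per-chamber control loop could look like is shown below; the class, gains, and valve interface are hypothetical illustrations, not the actual PneumaticBox firmware.

```python
# Hypothetical single-chamber pressure controller: a proportional law that
# converts the pressure error into duty cycles for an inflate and a deflate
# valve. All names and values are illustrative, not the real PneumaticBox API.

class ChamberController:
    """Proportional controller driving inflate/deflate valve duty cycles."""

    def __init__(self, gain=0.5, max_duty=1.0):
        self.gain = gain          # proportional gain (illustrative value)
        self.max_duty = max_duty  # valves accept duty cycles in [0, 1]

    def update(self, target_kpa, measured_kpa):
        """Return (inflate_duty, deflate_duty) for one control step."""
        error = target_kpa - measured_kpa
        duty = min(abs(error) * self.gain, self.max_duty)
        if error > 0:      # pressure too low: open the inflate valve
            return duty, 0.0
        elif error < 0:    # pressure too high: open the deflate valve
            return 0.0, duty
        return 0.0, 0.0


# Example: chamber at 8 kPa, target 10 kPa -> inflate valve fully open
ctrl = ChamberController(gain=0.5)
print(ctrl.update(10.0, 8.0))  # (1.0, 0.0)
```

In a real system this loop would run at a fixed rate against a pressure sensor; the sketch only shows the control law itself.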

Barrett WAM Arm

Our mobile manipulator is built on a mobile base, a modified XR4000, on which we can mount different manipulators. The image on the right shows a Barrett WAM mounted on the base.


Panda Arms

Our two Franka Emika Panda arms are used in various research projects.

Meka T2

Our Meka T2 humanoid torso is equipped with two compliant arms (A2) and can be mounted on the mobile platform.

Puma 560

We have two Puma 560 robot arms for teaching. Students learn how to implement joint-space and operational-space controllers. Visual servoing and motion planning are also part of the Puma lectures.
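A joint-space controller of the kind the students implement can be sketched as a simple PD law; the gains and joint values below are illustrative and ignore gravity compensation and dynamics.

```python
# Sketch of a joint-space PD control law: tau = Kp (q_des - q) - Kd qd.
# Gains and joint values are illustrative; a real Puma controller would add
# gravity compensation and run inside a fixed-rate servo loop.
import numpy as np

def pd_joint_control(q, qd, q_des, kp, kd):
    """Return joint torques for current angles q, velocities qd,
    and desired angles q_des (all in radians / rad/s)."""
    q, qd, q_des = map(np.asarray, (q, qd, q_des))
    return kp * (q_des - q) - kd * qd


# Example: robot at rest, two joints commanded to move
tau = pd_joint_control(q=[0.0, 0.0], qd=[0.0, 0.0],
                       q_des=[0.5, 1.0], kp=2.0, kd=1.0)
print(tau)  # [1. 2.]
```

An operational-space controller would apply the same idea to an end-effector pose error mapped through the Jacobian transpose.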

12 iRobots for Teaching

Each robot is equipped with a Hokuyo laser scanner, a USB webcam, and a netbook. The iRobot platform has a differential drive and a front bumper; the base is connected to the netbook via a USB interface. The netbook runs Linux, and we use ROS to control the robot.
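The differential drive mentioned above turns a commanded body velocity into two wheel speeds. A minimal sketch of this kinematics is shown below; the wheel radius and axle length are hypothetical values, not the actual iRobot dimensions.

```python
# Differential-drive inverse kinematics: convert a body velocity command
# (v forward, omega turn rate) into left/right wheel angular speeds.
# wheel_radius and axle_length are illustrative, not the iRobot's real values.

def wheel_speeds(v, omega, wheel_radius=0.03, axle_length=0.26):
    """Return (left, right) wheel angular speeds in rad/s for a body
    velocity of v m/s and a turn rate of omega rad/s."""
    v_left = v - omega * axle_length / 2.0   # inner wheel slows in a turn
    v_right = v + omega * axle_length / 2.0  # outer wheel speeds up
    return v_left / wheel_radius, v_right / wheel_radius


# Driving straight: both wheels spin at the same speed
print(wheel_speeds(0.1, 0.0))
```

In ROS this computation typically sits in the base driver, which subscribes to velocity commands and talks to the motor controller over the USB link.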

CyberGlove Systems - CyberGlove II + III

In order to provide intuitive control over our manipulators and actuators, we use the CyberGlove II and III made by CyberGlove Systems to track the motion of the hand and the individual fingers.
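Before the glove's raw bend-sensor readings can drive a robot hand, they must be calibrated to joint angles. The sketch below shows one simple way to do this, a linear map fitted from an open-hand and a closed-hand pose; it is an illustration, not the CyberGlove SDK's actual calibration procedure.

```python
# Hypothetical per-sensor calibration: fit a linear map from raw glove
# readings to joint angles using two recorded poses (hand open, hand closed).
# This is a simplification of real glove calibration, shown for illustration.

def calibrate(raw_open, raw_closed, angle_open=0.0, angle_closed=1.57):
    """Return a function mapping a raw sensor value to a joint angle (rad),
    given readings recorded at a fully open and a fully closed pose."""
    scale = (angle_closed - angle_open) / (raw_closed - raw_open)
    return lambda raw: angle_open + (raw - raw_open) * scale


# Example: sensor reads 10 when open and 110 when closed;
# a reading of 60 then maps to roughly half flexion.
finger_joint = calibrate(raw_open=10, raw_closed=110)
print(finger_joint(60))
```

One such map per sensor, applied at the glove's sampling rate, already gives a usable joint-angle stream for teleoperating a hand.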

RGB-D Sensors

The Kinect sensor projects a known infrared dot pattern onto the scene; this pattern is invisible to the human eye. An infrared camera records the pattern and estimates depth from the shift of the projected dots. A color image then assigns each point its color value, so the output of the Kinect sensor is a colored 3D point cloud of the scene.
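The depth estimation described above is a triangulation: the shift (disparity) of each projected dot is inversely proportional to its depth. A minimal sketch of the relation follows; the focal length and projector-camera baseline are rough, illustrative values, not calibrated Kinect parameters.

```python
# Structured-light triangulation: depth Z = f * b / d, where f is the
# camera focal length in pixels, b the projector-camera baseline in meters,
# and d the observed disparity (dot shift) in pixels.
# The default f and b are rough illustrative values, not a real calibration.

def depth_from_disparity(disparity_px, focal_px=580.0, baseline_m=0.075):
    """Return the depth in meters of a dot observed with the given shift."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


# Example: with these parameters, a 43.5-pixel shift corresponds to 1 m.
print(depth_from_disparity(43.5))
```

Applying this per pixel, and looking up each pixel's color in the RGB image, yields the colored point cloud the sensor outputs.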

Motion Capturing

In order to be able to analyze movements, we maintain our own motion capturing laboratory. This allows us not only to map possible paths of movement, but also to perform detailed analyses of, for example, grasping strategies.



Compute Cluster

Head node: Intel Server System R2308WFTZS (2x Xeon Gold 5118 (12 cores, 2.3 GHz))

84 nodes: IBM System x iDataPlex dx360 M2 (each with 2x quad-core Xeon E5540, 2.5 GHz), dx360 M3 (each with 2x Xeon E5620 (4 cores, 2.4 GHz)), and dx360 M4 (each with 2x 6-core Xeon E5-2630) servers

20 nodes: Intel H2312XXLR3 server system (each with 2x Xeon Gold 5118 (12 cores, 2.3 GHz))

4 nodes: Supermicro SuperServer 1029GP-TR (each with 2x Xeon Gold 5118 (12 cores, 2.3 GHz); 2x2 NVIDIA Tesla V100 32 GB, 2x1 NVIDIA Tesla A100 80 GB)

1 node: Gigabyte R282-G30 (2x Xeon Gold 5317 (12 cores, 3 GHz), 1x NVIDIA Tesla A100 80 GB)

Deep Learning Computers

Two computers, each equipped with an NVIDIA GeForce GTX TITAN X GPU.

One high-performance computer, "Deep Thought", equipped with 4x GeForce GTX 1080 Ti GPUs.