Acoustic Sensing is our latest exploration of adding tactile sensing to the PneuFlex actuators. By embedding a microphone in the air chamber, we can learn to recognize different contact states from sound alone. This tutorial explains how you, too, can get started with acoustic sensing.
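One way such a contact classifier could look (a minimal sketch under our own assumptions, not the tutorial's actual method or code) is to pool the magnitude spectrum of short audio frames into coarse frequency bins and assign new frames to the nearest class centroid. All signals and labels below are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(1)

def spectral_features(frame, n_bins=32):
    # Magnitude spectrum of one audio frame, pooled into coarse frequency bins.
    mag = np.abs(np.fft.rfft(frame))
    return np.array([chunk.mean() for chunk in np.array_split(mag, n_bins)])

# Hypothetical labelled recordings for two contact states:
# "free" is plain noise, "contact" adds a strong tonal component.
free = [rng.normal(size=1024) for _ in range(10)]
contact = [3.0 * np.sin(np.linspace(0, 200 * np.pi, 1024))
           + rng.normal(size=1024) for _ in range(10)]

# One feature centroid per contact state.
centroids = {
    "free": np.mean([spectral_features(f) for f in free], axis=0),
    "contact": np.mean([spectral_features(c) for c in contact], axis=0),
}

def classify(frame):
    # Nearest-centroid classification in feature space.
    feats = spectral_features(frame)
    return min(centroids, key=lambda k: np.linalg.norm(feats - centroids[k]))
```

A real pipeline would record labelled audio from the actuator's microphone and likely use a stronger classifier, but the feature-then-classify structure stays the same.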
Welcome to the RBO Hand 3 (RH3) Compilation. The goal of this page and the accompanying folder here is to get you started building the RH3, as developed at the Robotics and Biology Lab at Technische Universität Berlin. To simplify the rather extensive process, we'll break it down into pieces wherever possible.
To control the inflation of the soft hand's pneumatic actuators, we developed a custom controller board, which we call the "PneumaticBox". Here we provide an overview of the system, describe the hardware components, and link to our software stack.
Our Online Interactive Perception system extracts patterns of motion at different levels (point feature motion, rigid body motion, kinematic structure motion) and infers the kinematic structure and state of the articulated objects being interacted with. Optionally, it can reconstruct the shape of the moving parts and use it to improve tracking.
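The step from point feature motion to rigid body motion can be illustrated with the standard Kabsch algorithm, which recovers the least-squares rotation and translation between two sets of tracked 3D points. This is a generic sketch of that idea, not code from our system:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares R, t with R @ P[i] + t ~= Q[i] (Kabsch algorithm).

    P, Q: (N, 3) arrays of corresponding 3D feature positions before
    and after the observed motion.
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

Clustering features whose motion is explained by the same (R, t) is one way to segment rigid bodies, from which kinematic structure can then be inferred.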
Instead of relying on human-defined perception (a mapping from observations to the current state) for a specific task, robots must be able to autonomously learn which patterns in their sensory input are important. We think that robots can learn this by interacting with the world: performing actions, observing how the sensory input changes, and noting which situations are rewarding. Here we provide the code related to our work on learning state representations with robotic priors.
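As a concrete illustration, one such prior, temporal coherence (states should change gradually over time), can be written as a penalty on consecutive state differences. The linear mapping and toy data below are hypothetical stand-ins, not the released code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 100 consecutive 16-dimensional observations,
# and the parameters of a linear observation-to-state mapping.
obs = rng.normal(size=(100, 16))
W = rng.normal(size=(16, 2)) * 0.1

def states(W, obs):
    # Learned representation: state s_t = W^T o_t for each observation o_t.
    return obs @ W

def temporal_coherence_loss(W, obs):
    # Prior: states change gradually over time, so we penalize the
    # squared magnitude of consecutive state differences.
    s = states(W, obs)
    deltas = np.diff(s, axis=0)
    return np.mean(np.sum(deltas ** 2, axis=1))

loss = temporal_coherence_loss(W, obs)
```

In the full approach, several such priors (e.g. relating state changes to actions and rewards) are combined into one objective and minimized over the mapping's parameters.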
In May 2015, our Team RBO won a prestigious international robotics competition: the Amazon Picking Challenge. This challenge aims to solve one of the last unsolved problems in warehouse automation: identifying and grasping objects from a warehouse shelf. Here we provide the code and data for the object perception method of our winning entry.