Robust Motion Generation for Mobile Manipulation — Integrating Control and Planning under Uncertainty
This thesis contributes algorithmic approaches to the motion generation problem for mobile manipulators. This problem remains unsolved in unstructured environments, where the robot does not have access to precise models but must infer the state of the world with its sensors. The challenges for motion generation in these settings arise from the uncertainty prevalent in real-world sensing, the different modalities that need to be considered, and real-time constraints. Our approach in this thesis is to combine local feedback control with global planning under uncertainty to solve three different applications in manipulation.
In the first part of this thesis we show the feasibility of a feedback-driven approach on a real-world manipulation problem. We present an autonomous mobile manipulation system for bin picking. This system was an entry in the “Amazon Picking Challenge”, where it outperformed 25 contenders. We evaluate the strengths and weaknesses of feedback- and planning-based methods by comparing our system to the other entries.
In the second part, we review planning-based approaches, which use sampling to efficiently search high-dimensional spaces. We present a novel motion planner that exploits contact to reduce uncertainty. We propose a particle-based uncertainty model and search the combined space of configurations in free space and in contact. Our experiments show that the strategies found by our planner are more robust than the solutions of traditional sampling-based planners, because contact is used to reduce uncertainty. We extend this planner with a model of tactile feedback, which allows it to localize objects using only the signals of contact sensors.
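The core idea of using contact to reduce uncertainty can be illustrated with a minimal sketch. The following is an illustrative 1-D toy example, not the thesis implementation: each particle is a hypothesis about the robot's true position, and a guarded move toward a wall at a known position (an assumption of this sketch) ends in contact, collapsing the belief along that axis.

```python
import random

random.seed(0)

WALL = 1.0  # known wall position (assumption for this sketch)

# Initial belief: position known only up to +/- 0.2 around 0.5.
particles = [0.5 + random.uniform(-0.2, 0.2) for _ in range(100)]

def spread(ps):
    """Width of the belief: distance between extreme hypotheses."""
    return max(ps) - min(ps)

def guarded_move(ps, step=2.0):
    """Move until contact: every hypothesis ends exactly at the wall,
    so the belief collapses to a single point along this axis."""
    return [min(p + step, WALL) for p in ps]

before = spread(particles)
after = spread(guarded_move(particles))
assert after < before  # contact reduced uncertainty
```

A sampling-based planner without such a contact model would avoid the wall entirely and keep the full initial uncertainty.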
In the third part, we discuss the continuous integration of sensor data into plans. To move efficiently in unstructured environments, robots must continuously adapt their plans in response to sensor data. We review trajectory optimization as a tool for path adaptation. We propose a novel approach to sensor-based motion generation based on a factorization into three tasks: 1) continuous path adaptation, 2) continuous local planning of new motion alternatives, and 3) global planning with a model of the uncertain environment. This factorization allows us to generate robust motion in initially unknown environments with dynamic obstacles. In addition, we introduce an online learning method for manipulation control based on multi-modal sensor feedback. We conclude this thesis by combining all introduced techniques into a novel unifying framework for motion generation.
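The three-task factorization can be sketched as follows. This is a hedged toy illustration in a 1-D world, with function names and numbers invented for exposition, not taken from the thesis: the three tasks would run at different rates, with fast path adaptation, medium-rate local planning of alternatives, and slow global (re)planning over a coarse world model.

```python
def adapt_path(path, obstacles, margin=0.5):
    """Task 1: continuously deform the current path away from
    newly sensed obstacles (here: push 1-D waypoints aside)."""
    out = []
    for p in path:
        for o in obstacles:
            if abs(p - o) < margin:
                p = o + margin if p >= o else o - margin
        out.append(p)
    return out

def local_alternatives(path, offsets=(-1.0, 1.0)):
    """Task 2: propose nearby motion alternatives around the path."""
    return [[p + d for p in path] for d in offsets]

def global_plan(start, goal, n=5):
    """Task 3: plan in a coarse model of the (uncertain) world;
    here simply a straight line of n waypoints."""
    return [start + (goal - start) * i / (n - 1) for i in range(n)]

path = global_plan(0.0, 4.0)           # slow: coarse global plan
sensed = [2.0]                         # obstacle discovered online
path = adapt_path(path, sensed)        # fast: deform around it
candidates = local_alternatives(path)  # medium: fallback options
assert all(abs(p - 2.0) >= 0.5 for p in path)
```

The point of the factorization is that the fast adaptation loop keeps the robot safe while the slower planning layers supply globally sensible alternatives.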