Sensing and Control for Robust Grasping With Simple Hardware

author: Leif Patrick Jentoft
adviser: Robert D. Howe
degree: Ph.D.
institution: Harvard University

Robots can move, see, and navigate in the real world outside carefully structured factories, but they cannot yet grasp and manipulate objects without human intervention. Two key barriers are the complexity of current approaches, which require complicated hardware or precise perception to function, and the difficulty of understanding system performance in a tractable way given the wide range of factors that affect grasping success. This thesis presents sensors and simple control algorithms that relax the requirements on robot hardware, and a framework for understanding the capabilities and limitations of grasping systems.

The sensors and algorithms build on the recent success of underactuated hands, which use passive mechanics to adapt to object shape and position rather than trying to perceive a precise model of the object and control the grasp to match it. They include piezoelectric contact sensors that expand the range of positioning offsets the hand can tolerate, joint-angle sensors for compliant flexure joints that enable full-finger contact detection and object-shape estimation, and tactile sensors based on MEMS barometers that enable the hand to adapt more gently to object shape.

The framework poses the grasping problem as "overcoming" variation. It is not tractable to list every source of variation that might affect a grasp, but a small subset is dominant in each context (such as object geometry or object mass), and listing these explicitly allows different systems to be compared clearly and the contributions of different subsystems to be understood in the same terms. This motivates a design methodology centered on a template grasp: a reference around which local variation can be analyzed to determine a "basin of attraction" within which the grasp succeeds. The resulting variation budget encompasses object variation, perception variation, and robot positioning errors, and increasing the size of this budget serves as a target for system design.
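The idea of a variation budget can be sketched as a Monte Carlo estimate: sample perturbations around a template grasp and measure what fraction still succeeds. The success model below is a toy stand-in for a real grasp trial or simulator (the thresholds and the `grasp_succeeds` predicate are hypothetical, not taken from the thesis).

```python
import random

def grasp_succeeds(offset_cm, mass_kg):
    # Toy stand-in for a grasp trial: assume the hand tolerates
    # positioning offsets up to 3 cm for objects under 1.5 kg.
    # A real system would run a physical or simulated grasp here.
    return abs(offset_cm) <= 3.0 and mass_kg <= 1.5

def variation_budget(trials=1000, max_offset=5.0, max_mass=3.0, seed=0):
    """Estimate the fraction of sampled variation (positioning error
    crossed with object mass) that still yields a successful grasp."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        offset = rng.uniform(-max_offset, max_offset)  # positioning error
        mass = rng.uniform(0.0, max_mass)              # object variation
        if grasp_succeeds(offset, mass):
            successes += 1
    return successes / trials

print(f"estimated success fraction: {variation_budget():.2f}")
```

Comparing this fraction across hand designs or sensor configurations, with the same sampled variation, is one concrete way to compare systems "in the same terms" as the abstract describes.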
