News Release

This soft robotic gripper can screw in your light bulbs for you

Peer-Reviewed Publication

University of California - San Diego

Robotic Gripper on a Fetch Robotics Robot

Image: Researchers installed the soft robotic gripper on a Fetch Robotics robot in their lab.

Credit: University of California San Diego

How many robots does it take to screw in a light bulb? The answer: just one, assuming you're talking about a new robotic gripper developed by engineers at the University of California San Diego.

The engineering team has designed and built a gripper that can pick up and manipulate objects without needing to see them and without prior training. The gripper is unique because it brings together three capabilities: it can twist objects, sense them, and build models of the objects it's manipulating. This allows the gripper to operate, for example, in low-light, low-visibility conditions.

The engineering team, led by Michael T. Tolley, a roboticist at the Jacobs School of Engineering at UC San Diego, presented the gripper at the International Conference on Intelligent Robots and Systems (IROS), held Sept. 24 to 28 in Vancouver, Canada.

Researchers tested the gripper on an industrial Fetch Robotics robot and demonstrated that it could pick up, manipulate and model a wide range of objects, from light bulbs to screwdrivers.

"We designed the device to mimic what happens when you reach into your pocket and feel for your keys," said Tolley.

The gripper has three fingers. Each finger is made of three soft, flexible pneumatic chambers that move when air pressure is applied. This gives the gripper more than one degree of freedom, so it can actually manipulate the objects it's holding. For example, thanks to this design the gripper can turn screwdrivers, screw in light bulbs and even hold pieces of paper.
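To make the actuation idea concrete, here is a minimal Python sketch of how per-chamber air pressures could be combined to produce both a grip and a twist. The pressure values, the chamber_pressures() mapping and the send_pressures() stub are illustrative assumptions, not the team's published control code.

from typing import List

def chamber_pressures(grip: float, twist: float) -> List[float]:
    """Map a grip level (0-1) and a twist bias (-1 to 1) to three chamber
    pressures in kPa. Equal pressures curl the finger; biasing one chamber
    relative to the others skews the motion so a held object rotates."""
    base = 40.0 * grip                  # common-mode pressure -> curling (assumed scale)
    offsets = [twist, 0.0, -twist]      # differential pressure -> twisting
    return [max(0.0, base + 15.0 * o) for o in offsets]

def send_pressures(finger_id: int, pressures: List[float]) -> None:
    # Stand-in for whatever valve/regulator interface the real hardware uses.
    print(f"finger {finger_id}: chamber pressures {pressures} kPa")

# Example: close all three fingers on a light bulb, then add a twist to turn it.
for finger in range(3):
    send_pressures(finger, chamber_pressures(grip=0.8, twist=0.0))
for finger in range(3):
    send_pressures(finger, chamber_pressures(grip=0.8, twist=0.6))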

In addition, each finger is covered with a smart sensing skin. The skin is made of silicone rubber embedded with sensors made of conducting carbon nanotubes. The sheets of rubber are rolled up, sealed and slipped onto the flexible fingers to cover them like skin.

The conductivity of the nanotubes changes as the fingers flex, which allows the sensing skin to detect and record when the fingers are moving and coming into contact with an object. The data the sensors generate is transmitted to a control board, which puts the information together to create a 3D model of the object the gripper is manipulating. It's a process similar to a CT scan, where 2D image slices add up to a 3D picture.
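The slice-by-slice modeling described above can be illustrated with a short Python sketch that turns simulated resistance readings into a crude, cross-section-by-cross-section estimate of a grasped object. The resistance numbers, the bend-angle calibration and the contact_radius() estimate are assumptions made for illustration only, standing in for the team's actual signal processing.

import math

# Assumed calibration: radians of bend per unit relative resistance change.
GAIN_RAD_PER_DELTA_R = 2.0

def bend_angle(r_rest: float, r_now: float) -> float:
    """Estimate a finger's bend angle from the relative change in sensor resistance."""
    return GAIN_RAD_PER_DELTA_R * (r_now - r_rest) / r_rest

def contact_radius(angles, finger_length=0.08):
    """Rough estimate of the grasped object's radius at one 'slice' (meters):
    the more the fingers curl, the smaller the enclosed cross-section."""
    mean_curl = sum(angles) / len(angles)
    return max(0.005, finger_length * (1.0 - mean_curl / math.pi))

rest = 1000.0  # ohms, assumed resting resistance of each sensor
slices = [
    [1080.0, 1075.0, 1082.0],  # readings from the three fingers near the fingertips
    [1060.0, 1058.0, 1061.0],  # mid-finger
    [1035.0, 1040.0, 1033.0],  # near the palm
]

# Stack the 2D "slices" into a simple 3D profile, echoing the CT-scan analogy.
model = []
for height, readings in enumerate(slices):
    angles = [bend_angle(rest, r) for r in readings]
    model.append((height, contact_radius(angles)))

for height, radius in model:
    print(f"slice {height}: estimated radius ~{radius * 100:.1f} cm")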

The breakthroughs were possible because of the team's diverse expertise and their experience in the fields of soft robotics and manufacturing, Tolley said.

Next steps include adding machine learning and artificial intelligence to data processing so that the gripper will actually be able to identify the objects it's manipulating, rather than just model them. Researchers also are investigating using 3D printing for the gripper's fingers to make them more durable.

###

This work was supported by the Office of Naval Research grant number N000141712062, the UC San Diego Frontiers of Innovation Scholars Program (FISP) and the National Science Foundation Graduate Research Fellowship Grant No. DGE-1144086.

Video of the gripper in action: https://youtu.be/Hs14LALfmnQ

Flickr photo gallery: https://www.flickr.com/photos/jsoe/albums/72157689277342576

A Soft Robotics Gripper Capable of In-Hand Manipulation Augmented with Soft Sensor Skin for Tactile Sensing
Benjamin Shih, Dylan Drotman, Caleb Christianson, Ruffin White, Zhaoyuan Huo, Henrik I. Christensen and Michael T. Tolley, University of California San Diego
https://drive.google.com/file/d/0ByukIhRDgCTjZXlLTVJBMjlwVWc/view

