MIT’s new robot can identify objects using sight and touch

Researchers at MIT have moved a step closer to human-like perception with a new development that allows robots to touch and feel in much the same way humans do.

To achieve this, the team at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has taken a KUKA robot arm and added a tactile sensor called GelSight.

The information collected by GelSight is then fed to an AI so it can learn the relationship between visual and tactile information.

To teach the AI to identify objects by touch, the team recorded 12,000 videos of 200 items, including fabrics, tools and other household objects, being touched. The videos were broken down into still images, and the AI used this dataset to connect tactile and visual data.
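
As a rough sketch of how such paired training data might be assembled, the snippet below breaks a touch video into still frames and pairs each kept frame with the GelSight reading captured during that touch. The file layout, names and frame-sampling step are assumptions for illustration, not details from the MIT paper.

```python
# Hypothetical sketch of assembling paired visual/tactile training data.
# Assumes each recorded touch is a video file plus a GelSight image
# captured at the moment of contact; paths and layout are invented.
import os
import cv2  # pip install opencv-python

def extract_paired_frames(video_path, gelsight_path, out_dir, step=10):
    """Break a touch video into stills and pair each kept frame
    with the GelSight tactile image from the same touch."""
    os.makedirs(out_dir, exist_ok=True)
    tactile = cv2.imread(gelsight_path)          # tactile reading for this touch
    cap = cv2.VideoCapture(video_path)
    frame_idx = saved = 0
    while True:
        ok, frame = cap.read()
        if not ok:                               # end of video
            break
        if frame_idx % step == 0:                # sample every `step` frames
            cv2.imwrite(os.path.join(out_dir, f"visual_{saved:04d}.png"), frame)
            cv2.imwrite(os.path.join(out_dir, f"tactile_{saved:04d}.png"), tactile)
            saved += 1
        frame_idx += 1
    cap.release()
    return saved  # number of visual/tactile pairs written
```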

"By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge", says Yunzhu Li, CSAIL PhD student and lead author on a new paper about the system.

"By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects."

The system is still in the early research stage, but by connecting these two senses digitally the team may have given AI a new way of experiencing the world.

The breakthrough could lead to far more sensitive and practical robotic arms, improving any number of delicate or mission-critical operations.

For now, the robot can only identify objects in a controlled environment. The next step is to build a larger dataset so the robot can work in more diverse settings.

“This is the first method that can convincingly translate between visual and touch signals”, says Andrew Owens, a postdoc at the University of California, Berkeley.

“Methods like this have the potential to be very useful for robotics, where you need to answer questions like ‘is this object hard or soft?’, or ‘if I lift this mug by its handle, how good will my grip be?’

“This is a very challenging problem, since the signals are so different, and this model has demonstrated great capability.”
