MIT CSAIL teams propose grippers with a humanlike sense of touch
In a pair of recently published technical papers, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) propose new applications of soft robotics — the subfield dealing with machines made from tissue-like materials — that aim to tackle the challenge of grasping objects of different shapes, weights, and sizes. One builds on an existing work that employs a cone-shaped origami-inspired structure designed to collapse in on objects, while the other gives a robotic gripper more nuanced, humanlike senses in the form of LEDs and two cameras.
Despite the promise of soft robotics technologies, they’re limited by their lack of tactile sense. Ideally, a gripper should be able to feel what it’s touching and sense the positions of its fingers, but most soft robots can’t. The MIT CSAIL teams’ approaches ostensibly fix that.
“We wish to enable seeing the world by feeling the world. Soft robot hands have sensorized skins that allow them to pick up a range of objects, from delicate, such as potato chips, to heavy, such as milk bottles,” said MIT professor and CSAIL director Daniela Rus in a statement.
Venus flytrap
Last year, scientists at MIT CSAIL and Harvard demonstrated a gripper design capable of lifting a wide range of household objects. The team’s hollow, cone-shaped device comprises three parts that together surround items as opposed to clutching them. In one experiment where the gripper was mounted on a robot to test its strength, it managed to lift and grasp objects that were 70% of its diameter and up to 120 times its weight without damaging them.
A new MIT CSAIL team thought there was room for improvement in the existing gripper design. To give it versatility and adaptability closer to that of a human hand, they added tactile sensors made from latex bladders (balloons) connected to pressure transducers. The sensors let the gripper pick up objects as delicate as potato chips while classifying them, enabling it to better understand what it’s grasping.
The silicone-adhered sensors — one placed on the outer circumference of the gripper to capture its changing diameter, and four attached to the inside to measure contact forces — experience internal pressure changes when force or strain is applied. The team measured each of these changes, using them to train an object-detecting algorithm running on an Arduino Due.
In 10 experiments during which the sensors captured and averaged together 256 samples (at a rate of 20Hz), the algorithm classified some objects — including a bottle, an apple, a box, and a Pringles can — with 100% accuracy. Other objects it classified with between 80% and 90% accuracy, including another bottle, a scrubber, a can, and a bag of cookies. (One bottle was misidentified as a can, which had a similar profile, and a toothbrush was misclassified as a box.)
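The pipeline described above — average a window of pressure readings, then classify the resulting feature vector — can be sketched as follows. This is an illustrative stand-in, not the team's actual algorithm: the sensor values, object signatures, and the nearest-centroid rule are all assumptions for demonstration.

```python
# Hypothetical sketch of the pressure-based classification pipeline:
# average a window of 5-channel sensor readings (one outer sensor,
# four inner), then label the result with a nearest-centroid rule.
# All numbers below are made up for illustration.

def average_window(samples):
    """Average a list of 5-channel pressure readings."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(5))

def classify(feature, centroids):
    """Return the label whose centroid is nearest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(feature, centroids[label]))

# Illustrative per-object mean pressure signatures (hypothetical)
centroids = {
    "bottle": (4.0, 3.8, 3.9, 4.1, 1.2),
    "apple":  (2.5, 2.6, 2.4, 2.5, 0.9),
    "chip":   (0.3, 0.3, 0.2, 0.3, 0.1),
}

# A short window stands in for the 256 samples captured at 20Hz
window = [(3.9, 3.7, 4.0, 4.2, 1.1)] * 4
print(classify(average_window(window), centroids))  # → bottle
```

In the real system the classifier runs on an Arduino Due; a distance-based rule like this one is cheap enough for such a microcontroller, which is one reason the averaging step matters — it reduces 256 noisy samples to a single feature vector.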
In separate experiments, the researchers tested the sensor-equipped gripper's ability to grasp delicate objects and detect when those objects might be slipping. They observed that the success rate over the course of 100 trials varied with the rate of the slip, reaching 100% when the slip rates were higher. And they report that, when tasked with picking up 20 randomly selected kettle chips, the gripper grasped 80% without damage.
GelFlex
In the second paper, a CSAIL team describes GelFlex, a gripper consisting of a soft, transparent silicone finger with one camera near the fingertip, a second camera near the middle, reflective ink on the front and side, and LED lights affixed to the back.
The cameras, which are equipped with fisheye lenses, capture the finger’s deformations in great detail, enabling AI models trained by the team to extract information like bending angles and the shape and size of objects being grabbed. These models and GelFlex’s design allow it to pick up various items such as a Rubik’s cube, a DVD case, or a block of aluminum. During experiments, the average positional error while gripping was less than 0.77 millimeters — better than that of a human finger — and the gripper successfully recognized various cylinders and boxes 77 out of 80 times.
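The quantity GelFlex's models estimate — the finger's bend — can be illustrated geometrically. The real system feeds fisheye camera images through trained neural networks; this sketch only shows what a bending angle is, computed from three hypothetical tracked points along the finger.

```python
import math

# Illustrative proprioception stand-in: estimate the finger's bend
# from three points along its length (base, middle, tip). GelFlex's
# actual estimates come from learned models over camera images; the
# points and the geometric rule here are assumptions for illustration.

def bend_angle(base, mid, tip):
    """Bend in degrees between the base→mid and mid→tip segments."""
    a1 = math.atan2(mid[1] - base[1], mid[0] - base[0])
    a2 = math.atan2(tip[1] - mid[1], tip[0] - mid[0])
    return abs(math.degrees(a2 - a1))

print(bend_angle((0, 0), (0, 10), (0, 20)))  # straight finger: 0.0
print(bend_angle((0, 0), (0, 10), (7, 17)))  # curled finger: 45.0
```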
Above: The proposed gripper holding a glass box.
Image Credit: MIT CSAIL
In the future, the team hopes to improve the proprioception (i.e., sense of self-movement) and tactile sensing algorithms, while utilizing vision-based sensors to estimate more complex finger configurations, such as twisting or lateral bending. They’re scheduled to present their research virtually at the 2020 International Conference on Robotics and Automation, alongside the other gripper team.
“Our soft finger can provide high accuracy on proprioception and accurately predict grasped objects, and also withstand considerable impact without harming the interacted environment and itself,” lead author on the GelFlex paper Yu She said in a statement. “By constraining soft fingers with a flexible exoskeleton, and performing high resolution sensing with embedded cameras, we open up a large range of capabilities for soft manipulators.”
This article was written by Kyle Wiggers from VentureBeat and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.