Researchers Develop a Robot Hand that Can Feel Objects

In a recent breakthrough, a team of engineers has developed an artificial fingertip that reacts to textures and shapes much like a human fingertip. According to the researchers at the University of Bristol in the UK, the development could make robots considerably more agile and even help improve prosthetics in the longer run.

Nathan Lepora, a professor of robotics and artificial intelligence (AI) at the University of Bristol, along with his colleagues designed the artificial fingertip using a 3D-printed mesh. This mesh imitates the movement observed between the inner and outer layers of human skin.

The researchers first began designing an artificial fingertip back in 2009, using human skin as a model. Their first fingertip, assembled by hand, was about the size of a soda can. Then by 2018, they had shifted to 3D printing, which made it possible to make the tip and all its components about the size of an adult’s big toe and create a series of layers resembling the multilayered construction of human skin.

More recently, the scientists incorporated neural networks into the fingertip, which they have named TacTip. The neural network interprets the fingertip's sensory data, sharpening the robot's sense of touch and prompting it to react accordingly, much like an actual finger.

In a human fingertip, a layer of nerve endings deforms whenever the skin comes into contact with an object and sends a signal to the brain about what's happening. These nerves send "fast" signals to help us avoid dropping something and "slow" signals to convey information about an object's shape.

How it Works

TacTip’s equivalent signals come from an array of pin-like projections underneath a rubbery surface layer that moves whenever the surface is touched. The array’s pins are like hairbrush bristles, stiff but bendable. Underneath that array is a small camera that detects when and how the pins move.

The amount of bending of the pins provides the slow signal, while the speed of bending provides the fast signal. The neural network then translates those signals into the fingertip's movements, either making it grip the object more tightly or adjusting the angle of the fingertip.
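The idea of deriving two signals from one set of camera-tracked pins can be illustrated with a toy sketch. This is not the researchers' actual code; the function name, thresholds, and the decision rule are hypothetical, chosen only to show how displacement (slow signal) and its rate of change (fast signal) could be separated from the same pin positions.

```python
import math

def classify_pin_signals(positions, dt=1.0, slip_speed=5.0):
    """Toy illustration (not TacTip's real pipeline): derive a 'slow'
    signal from how far a pin has bent and a 'fast' signal from how
    quickly it is currently bending.

    positions: list of (x, y) pin-tip coordinates over successive
    camera frames; the first entry is the pin's rest position.
    dt: time between frames; slip_speed: hypothetical threshold
    above which rapid pin motion is treated as the object slipping.
    """
    # Slow signal: total displacement from the pin's rest position,
    # analogous to the nerves that report an object's shape.
    x0, y0 = positions[0]
    xn, yn = positions[-1]
    slow = math.hypot(xn - x0, yn - y0)

    # Fast signal: speed of the most recent movement between frames,
    # analogous to the nerves that warn us something is dropping.
    xp, yp = positions[-2]
    fast = math.hypot(xn - xp, yn - yp) / dt

    # A sudden pin movement suggests slip, so tighten the grip;
    # otherwise hold and let the slow signal inform shape sensing.
    action = "tighten_grip" if fast > slip_speed else "hold"
    return slow, fast, action

# A pin drifting gradually produces a small fast signal (hold),
# while an abrupt jump between frames triggers a grip response.
print(classify_pin_signals([(0, 0), (0.1, 0), (0.2, 0)]))
print(classify_pin_signals([(0, 0), (0, 0), (10, 0)]))
```

In a real system the camera would track many pins at once and a trained network, rather than a fixed threshold, would map their motion to motor commands; the sketch only shows why the same pin array can yield both kinds of signal.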

This research could lead to robotic hands that better replicate human dexterity, gripping objects more securely and sensing the shapes they touch. Lepora and his team are now looking to the future, aiming to make this artificial skin as capable as human skin.
