AI-Powered Robotic Skin Mimics Human Touch



So many technological advances must still be made before we can build a truly human-like robot that the goal is likely to remain out of reach for many years to come. There are the control systems that must approximate the function of the brain, the actuators that will stand in for muscles, and, of course, all of the perception systems required to mimic the human senses. While all of these areas have progressed significantly in recent years, they still lag far behind their biological counterparts.

Artificial vision and hearing are perhaps the most advanced of the artificial senses, with high-quality sensing systems combining with artificial intelligence algorithms to deliver impressive performance. But the sense of touch, despite being incredibly important to understanding the world around us, has proven especially difficult to reproduce in machines. So many different cell types and signaling mechanisms are densely packed into the skin that recreating it artificially remains beyond the reach of our best technology.

That is likely to remain the case for some time, but a group of researchers at the University of Cambridge and University College London has moved the field forward with the development of a durable and highly sensitive robotic skin. Rather than embedding a slew of traditional sensors, as most robotic skins do, this new solution is made of just a single material, which allows the entire surface of the skin to act as if it were a dense array of sensors.

Constructed from a conductive gelatin-based hydrogel, the skin is soft, stretchable, and can be easily molded into complex shapes like the contours of a human hand. Using electrical impedance tomography, the researchers were able to access over 860,000 unique conductive pathways across its surface. This effectively turns the hydrogel into a giant multimodal sensor, capable of detecting different types of physical input such as touch, pressure, heat, and even damage.
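To get a feel for why so few electrodes can yield so many distinct readings, consider how electrical impedance tomography measurements multiply with electrode count. The sketch below is illustrative only: it counts four-terminal measurements under the standard adjacent-pair protocol and under an all-pairs scheme for a hypothetical 32-electrode setup. The paper's figure of over 860,000 pathways presumably reflects additional factors (such as multiple drive patterns or frequencies) that are not modeled here.

```python
from itertools import combinations

def adjacent_protocol_measurements(n_electrodes):
    """Four-terminal measurements in the standard adjacent-pair EIT
    protocol: current is injected across each neighbouring electrode
    pair, and voltage is read across every other adjacent pair not
    involved in the injection, giving n * (n - 3) readings per frame."""
    return n_electrodes * (n_electrodes - 3)

def all_pair_combinations(n_electrodes):
    """Upper bound when any electrode pair may drive current and any
    disjoint pair may sense voltage -- the count grows roughly with
    the fourth power of the electrode count."""
    pairs = list(combinations(range(n_electrodes), 2))
    count = 0
    for drive in pairs:
        for sense in pairs:
            if set(drive).isdisjoint(sense):  # sense pair must not reuse drive electrodes
                count += 1
    return count

print(adjacent_protocol_measurements(32))  # 928 readings per frame
print(all_pair_combinations(32))           # 215760 drive/sense combinations
```

The point of the comparison is that the measurement count, and hence the effective spatial resolution of the reconstructed conductivity map, scales steeply with how many injection and sensing patterns the hardware can cycle through, even though the electrode count stays fixed.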

A machine learning model was developed to teach a robotic hand the meaning of different types of touch sensations. This model was trained by subjecting the robotic skin to a variety of interactions. The team pressed it with fingers and robotic tools, applied localized heat, and even cut into it with a scalpel. The model then analyzed the massive amount of data collected (over 1.7 million data points from just 32 electrodes at the wrist of a molded hydrogel hand) to determine which signals were most useful in identifying the nature of each stimulus.
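A toy version of that classification step can be sketched with a nearest-centroid classifier. Everything here is invented for illustration: the class list, the 16-dimensional feature vectors standing in for processed electrode frames, and the synthetic class means. The researchers' actual model and features are not described at this level of detail in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stimulus classes; real inputs would be voltage frames
# from the 32 wrist electrodes, not these synthetic vectors.
classes = ["touch", "pressure", "heat", "cut"]
true_means = {c: rng.normal(loc=i, scale=0.1, size=16)
              for i, c in enumerate(classes)}

def make_samples(cls, n):
    """Draw n noisy synthetic 'frames' around a class's true mean."""
    return true_means[cls] + rng.normal(scale=0.2, size=(n, 16))

# "Training": estimate one centroid per class from noisy samples.
centroids = {c: make_samples(c, 100).mean(axis=0) for c in classes}

def classify(frame):
    # Nearest-centroid rule: label with the class whose training
    # centroid is closest in Euclidean distance.
    return min(centroids, key=lambda c: np.linalg.norm(frame - centroids[c]))

test_frame = make_samples("heat", 1)[0]
print(classify(test_frame))
```

A real system would replace the centroids with a model trained on the millions of recorded frames, and would also have to learn which of the many redundant conductive pathways carry the most discriminative signal, which is the feature-selection problem the team describes.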

This data-driven approach allows the robotic skin to distinguish between a range of contacts, such as gentle touches, sharp impacts, or localized heating, with a degree of sensitivity unmatched by previous systems. Importantly, the material achieves this without needing separate embedded sensors for each modality, making it simpler and more cost-effective to manufacture.

Looking ahead, the team aims to improve the durability of the material and further validate its performance in real-world robotic applications. As tactile sensing moves closer to the richness of biological skin, we inch closer to being able to build machines that can truly interact with the world as we do.
