Human skin is soft and stretchy and has millions of nerve endings that sense heat and touch. This makes it a superb instrument for detecting and responding to the outside world. Engineers have been working to reproduce these abilities in a synthetic version for the past 40 years, but such attempts have always fallen far short of the versatility and adaptability of living skin. Now, however, new research is adding more abilities and complexities to bring this field closer to its ultimate goal: an electronic skin, or e-skin, with uses ranging from covering robots to sticking wearable devices onto humans. One day, these devices may even let humans remotely control robots and “feel” the signals they detect.

“It’s in the 1980s that we started seeing some touch sensors that you can call a crude version of skin,” says Ravinder Dahiya, a professor of electronics and nanoengineering and leader of the Bendable Electronics and Sensing Technologies group at the University of Glasgow. The first so-called flexible sensor arrays were built in the mid-1980s. One such array used Kapton, a flexible but not stretchable film invented in the 1960s, to support an arrangement of infrared sensors and detectors. This “skin” was wrapped around a simple robotic arm, which enabled the limb to “dance” with a human ballerina: if she was within 20 centimeters of the arm, it could sense her movements and respond by spontaneously modifying its own actions.

But these abilities were still extremely rudimentary, compared with those of biological skin. Available materials and electronics advanced through the 2000s to become softer, increasingly flexible and, most importantly, stretchable. These improvements allowed researchers to incorporate new sensors and electronics into a fully developed skin system, Dahiya says. Such a system involves a skinlike base that can flex and stretch, equipped with a power supply, various sensors and ways to send the sensor information to a central processor.

Touch and temperature sensors were the first to be developed for this kind of system. Wei Gao, a biomedical engineer at the California Institute of Technology, decided to try combining these sensors with ones that could detect chemicals. “We wanted to create a robotic skin that has the physical sensing capability—basically what people already do,” Gao says. “And in addition, we wanted to give it powerful chemical-sensing capability.” His team’s work was published in Science Robotics earlier this month.

Gao’s lab used an inkjet printer to layer a specialized ink made of nanomaterials—mixtures of microscopic bits of metals, carbon or other compounds—within a soft hydrogel base. By printing with different nanomaterial inks, each formulated to detect a specific chemical, Gao’s team developed skins that could sense explosives, nerve agents such as those used in chemical warfare and even viruses such as the COVID-causing SARS-CoV-2. The researchers also incorporated previously developed pressure and temperature sensors. The resulting e-skin looks like a transparent Band-Aid with metallic designs embedded in its surface.

Sensing its environment is not all this skin can do. “We also want to make sure human-machine interaction can be involved,” Gao says. To achieve this, the team developed an artificial intelligence program to enable a connection between two electronic skin patches—one on a robot and another on a human. The skin printing process is scalable, so the researchers were able to print a fingertip-size patch for a robotic hand and a larger one for a human’s forearm. This skin enabled the robot to “feel” how tightly it gripped something and to sense if the object was coated in specific chemicals. Meanwhile the humans gained the ability to control the connected robot from afar and to feel electrical signals from the robot if it detected those chemicals. The researchers say this interaction may one day let a robot stand in for a human controller, like a physical avatar, in places that are inhospitable to people.

Gao’s project required an external device to process the e-skin’s sensor data: its multiple layers of metallic ink handle the sensing, lend the patch stability, and wirelessly transmit readings to a nearby computer or phone for collection and processing. But this is not the only way for robotic skin to analyze the information it picks up. Other labs are working on skins that sort through information themselves, similar to the way a human nervous system would.

Dahiya used human skin as the inspiration for his electronic skin’s data processing, described in two separate Science Robotics articles that were also published this month. Using electronic building blocks, such as transistors and capacitors, he says, “we can develop something that is analogous to a peripheral nervous system.” In his system, a signal from the sensors has to reach a certain threshold before being sent to a central processor. This reduces the amount of data being sent at any one time. “You cannot send unlimited data,” Dahiya explains. “If you want to send large data, then you have to have some arrangement where data can queue and can wait for those in the front.”
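The threshold-and-queue behavior Dahiya describes can be illustrated with a minimal sketch in Python. This is a hypothetical toy model, not his group's actual circuitry: each "peripheral" node forwards a reading to a bounded queue only when it crosses a threshold, so data waits its turn instead of flooding the central processor.

```python
from collections import deque

class ThresholdSensorNode:
    """Toy model of a peripheral 'nerve': it forwards a reading only
    when the signal crosses a threshold, queuing events for a central
    processor (names and values here are illustrative assumptions)."""

    def __init__(self, threshold, max_queue=8):
        self.threshold = threshold
        # Bounded queue: later readings must wait for those in front
        self.queue = deque(maxlen=max_queue)

    def sense(self, reading):
        # Sub-threshold signals are dropped locally, cutting the data sent
        if reading >= self.threshold:
            self.queue.append(reading)

    def transmit_one(self):
        # The central processor pulls one queued event at a time
        return self.queue.popleft() if self.queue else None

node = ThresholdSensorNode(threshold=0.5)
for r in [0.1, 0.7, 0.3, 0.9]:
    node.sense(r)
print(list(node.queue))  # → [0.7, 0.9]
```

Only the two supra-threshold readings ever reach the queue; the rest are filtered at the "skin," which is the data-reduction idea Dahiya describes.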

Dahiya points to a touch sensor his group developed that uses tiny transistors—devices that control the flow of electricity to and from other electronic components—to help robotic skin feel and learn. Pressing on the transistors in the skin causes a change in electrical current, which makes the robot “feel” pressure. Over time, it can adapt its responses to the amount of pressure detected. “These are all neurallike transistors, which can learn, which can adapt,” he says. The skin learns the robotic equivalent of pain, he adds, so it will not transmit the signal until it feels something “painful.”
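As a rough illustration of that adapt-to-pressure idea, one could model a single "neural-like" touch cell in a few lines of Python. Everything below (class name, the habituation rule, the numbers) is an assumption chosen for clarity, not a description of the actual transistor physics: the cell fires only when pressure exceeds its "pain" threshold, and repeated painful stimuli nudge that threshold upward.

```python
class NeuralLikeTouchCell:
    """Toy model of an adaptive touch element: output is transmitted only
    when pressure feels 'painful', and repeated painful presses raise the
    threshold (a simple habituation rule, assumed for illustration)."""

    def __init__(self, pain_threshold=5.0, adapt_rate=0.1):
        self.pain_threshold = pain_threshold
        self.adapt_rate = adapt_rate

    def press(self, pressure):
        painful = pressure > self.pain_threshold
        if painful:
            # Move the threshold a fraction of the way toward the stimulus,
            # so the cell gradually adapts to sustained pressure
            self.pain_threshold += self.adapt_rate * (pressure - self.pain_threshold)
        return painful  # the signal is sent onward only when True

cell = NeuralLikeTouchCell()
cell.press(6.0)   # painful: fires, and the threshold creeps up to 5.1
cell.press(4.0)   # below threshold: no signal is transmitted
```

The point of the sketch is the gating, not the specific update rule: like the skin Dahiya describes, the cell stays silent until something "painful" happens.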

In addition to remotely controlling robots or teaching them to adapt to their environments, electronic skins could have many other applications. “A lot of the opportunities, I think, aren’t for robots,” says Carmel Majidi, a mechanical engineer at Carnegie Mellon University, whose lab specializes in developing soft materials for human-compatible electronics. Majidi envisions e-skins making good sensors for robots but also for more mundane objects. They could become the basis of soft, flexible touch pads for interactive electronic devices, for example, or for sensitive clothing or upholstery capable of detecting extreme temperatures and other environmental conditions. Such skins could also be helpful in medicine. “The idea there is that [you] want these robotic skins as stickers that you can put on the body, and you can immediately track your vitals,” Majidi says.

When it comes to commercial uses, current e-skin prototypes still have issues to address. Durability is an important one, Gao notes. “There are a lot of developments. People are getting very close,” he says. “But one of the key challenges for [electronic skins] is reliability and robustness over long-term operation.” Even with such challenges, Gao says there may be robotic skins in industrial settings within the next five years.

“The limiting factor is actually not so much the robotic skin—those technologies exist. I think it’s more on the demand,” Majidi says in regard to commercial availability. “We still don’t have robots in people’s homes.” But with all the possible applications of electronic skin, he says it is crucial to have collaborations with parties outside the engineering field. “Folks who are not roboticists, folks who are not engineers, should not feel that there are hard barriers to them getting engaged in the field,” he says. Majidi suggests that potential collaborators might be people who use a prosthetic limb that could be equipped with electronic skin sensors or those who have a chronic illness and might benefit from continuous monitoring via a wearable patch.

“Soft robotics is so interdisciplinary,” he says. “You don’t need a degree from [an engineering] department or robotics institute to make important contributions and to make sure that these are successfully adopted in real life.”