Robotic Finger With Skin

A robotic finger coated with living human skin heals itself after researchers covered it with a collagen bandage.
Shoji Takeuchi

From the Six Million Dollar Man to RoboCop to the Terminator, Hollywood has produced a pantheon of memorable cyborgs. These hybrids tried to destroy society, or save it, according to their own goals. But they fascinate for the same reason: they blur the lines between humans and robots in ways that have never happened in our history—but just might be part of our future.

Fully functional cyborgs are still a long way off, but scientists are pioneering a new way to commingle human and machine. A Japanese team has designed a robotic finger that’s covered with living skin grown from actual human skin cells. The process gives the robotic appendage an extremely lifelike look, not least because the skin can move and flex naturally as the three-joint digit does. To the touch, the skin also feels far more like human skin than silicone robot skins do, and it can even heal when cut or split. Covering a single finger is a far cry from cloaking an entire humanoid robot in artificially produced human skin. But the groundbreaking proof of concept, detailed in a study published today in Matter, raises some incredible possibilities.

Shoji Takeuchi, an engineer specializing in biohybrid systems at the University of Tokyo, Japan, says that while some silicone-skinned robots look very human at some distance, close inspection reveals them to be artificial. That’s why his team turned to biohybrid robotics. “Our goal is to develop robots that are truly human-like,” he says. “We think that the only way to achieve an appearance that can be mistaken for a human being is to cover it with the same material as a human being—living skin cells.”

To create the lifelike appendage, Takeuchi and colleagues crafted a kind of skin-tissue cocktail, and then molded the material around the artificial finger to produce seamless and natural looking coverage.

Application of the skin was a two-part process. The team first mixed collagen and human dermal fibroblasts, the two main ingredients in our skin’s connective tissues. The finger was submerged in this solution, and while culturing in an incubator for three days, this artificial ‘dermis’ adhered to the digit as the tissues naturally shrank to produce a solid, close-fitting coating over the finger. This coating served as a foundation for the molding and application of a second coat, an ‘epidermis,’ made up of the same human skin cells that comprise some 90 percent of our own skin’s outer layer. The second solution was poured on the finger multiple times, from different angles, and left to culture for two weeks to produce the finished product.

The resulting skin has a human-like texture, and when split or cut it can be healed by the application of a collagen bandage, which gradually becomes part of the skin itself—a technique inspired by the use of hydrogel grafts to treat severe burns.

The robot skin was created with commercially available experimental human skin cells. “Research on mass production is being actively conducted in other fields such as regenerative medicine and cultured meat research,” Takeuchi says, adding that ongoing skin production research in those areas will help his own work on clothing robots in human skin.

Other advances in the production of skin that might be applied to robots have involved creating sheets of living human skin, which then have to be cut and tailored to the various shapes of a body. Researchers at Caltech recently unveiled a printable artificial skin, made of soft hydrogel, embedded with sensors that can detect pressure, temperature or even dangerous chemicals. But it may be difficult to conform printed skin to the unique shapes of human anatomy, like a finger or a hand. Takeuchi’s method creates a form fit without the need for such efforts.

The finger, moved by an electric motor, is only one small part of the human anatomy, but its movements do represent a way to explore how the skin can cover moving parts in a lifelike way. Scaling up the experiment presents some challenges, beginning with finding more efficient ways to produce the skin in larger quantities.

The product is also still a lot weaker than our own skin, Takeuchi notes, and so far it must be constantly tended to in order to survive. “To maintain it for a long period of time, it needs a system that has a vascular-like structure inside that provides a constant supply of nutrients,” he explains. To solve the problem the team is mulling over how to mimic blood vessels and the equivalent of sweat glands to help deliver water to the skin.

Of course, appearance isn’t everything. Humans don’t just see one another’s skin, they touch it, and the living skin provides a much more natural feel than silicone.

Maria Paola Paladino, who has studied human attitudes towards robots at the University of Trento, Italy, points out that a lot of scientific literature exists on touch and its impact in building relationships and well-being. “There is research suggesting for example that if someone touches you, in a way you’re receptive to, you become kinder towards this person,” she says. “If you touch this robot skin, will you be able to feel a human touch? In terms of human experience that could be really interesting.”

The robot’s own sense of touch is another key feature that must be developed if robots are to interact more naturally, and safely, as they become a more common part of our everyday human environment. Scientists have tried various electronic sensors and other methods to create the sense of touch in robots. For his own finger experiment, Takeuchi plans to explore reproducing a natural nerve system to instill a sense of touch in the skin.

Robots have sparked a lot of debate about the future of artificial intelligence. Just how smart do we want robots to become, some ask, and what are the implications? Similar questions are raised when it comes to the appearances of intelligent machines—just how human do we want robots to look?

Human reactions to robots vary. A study from the Georgia Institute of Technology found that most college-aged adults preferred their robots to look like robots, while older adults preferred those with more human faces. A given robot’s role is also a factor. Most individuals in the study preferred housecleaning robots to look more like machines, for example, while those communicating with us and performing ‘smart’ tasks like giving information, were preferred to look more like us.

Increasingly, we’ll be interacting meaningfully with social robots in our daily lives. (Robots can already check you into a hotel, lead you through a workout, or conduct your funeral.) And some very humanlike robots are already among us, including Hanson Robotics’ Sophia, which boasts its own social media accounts. Founder David Hanson expounds on the benefits of making machines much like ourselves. “In designing human-inspired robotics, we hold our machines to the highest standards we know—humanlike robots being the apex of bio-inspired engineering,” he writes in IEEE Spectrum, a technology publication.

Neuroscience studies have delved into human feelings for robots, and found our empathy for them when they are treated harshly isn’t yet on the same level as what we feel for other humans. We view robots as less than human, so making them more humanlike may strengthen our relationships. That might be useful as robots are increasingly socially tasked with things like caregiving or dispensing important information and advice.

“On the other hand, there are some very good examples of humanoids, like NAO, where it’s clearly a machine but it’s cute and people really like it,” says Paladino. Hollywood robots like R2-D2 and WALL-E have also engendered legions of fans without looking all that much like humans. (The Smithsonian museums are home to their own group of humanoid robots, four-foot-tall guides known as the Pepper robots, which engage visitors by dishing out information and answering questions.)

Part of the debate about robot appearance revolves around the concept of the “uncanny valley,” an idea floated by roboticist Masahiro Mori back in 1970 that also applies to creepy dolls. Mori suggested that as robots become more lifelike, humans respond favorably—up until a point when the exact opposite becomes true. When they become too lifelike, the theory goes, the subtle but noticeable inhuman attributes become especially eerie and disturbing to humans who notice that something isn’t quite right. Disagreement over how to quantify the “uncanny valley,” or the extent to which it even exists, continues in earnest.

Paladino has studied human reactions and attitudes to social robots that look increasingly like ourselves. She describes our evolving relationship to such robots as a paradox. On one hand, humans want social robots to be human enough in appearance and behavior to fulfill our relationship needs. On the other hand, robots that are “too human” can threaten our sense of human identity and uniqueness—a fear that might be fueled by minds unaccustomed to the blurred boundaries between human and machine.

“If you have machines that are too similar to us, you start to have this blurring of human identity and people can be threatened by that,” she says. “If they are as human as I am, then what does it mean to be human?”

Another question may lie near the core of such doubts: Can we ever really trust robots? Right now, perhaps in part because of Hollywood creations like the Terminator and Number Six, some individuals remain very wary. Paladino believes that our relationship and attitudes towards robots will continue to evolve, for better or worse, as humans have more and more experiences with intelligent machines. In that way, the robots we produce will really shape our attitudes towards them. “What social psychology teaches us,” she says, “is that humans can change their minds.”
