
Bionic Skin.

3D-printed bionic skin can give bots a sense of touch, protect soldiers from explosions

University of Minnesota engineers have developed an innovative 3D printing process which can lay down stretchable electronics. The devices could be used to coat robots in a sensitive layer or to provide feeling-from-a-distance for surgeons.


Image credits Shuang-Zhuang Guo, Michael McAlpine / University of Minnesota

The fabric-like material is created using a unique multifunctional 3D printer the team designed and built in their lab. It has four different nozzles and uses specialized inks to print the device layer by layer. The process starts with the printer laying down an initial layer of silicone to give the device mechanical resilience, followed by two conductive layers (one top, one bottom) which act as electrodes, a coil-shaped pressure sensor, and a final ‘sacrificial’ layer to hold everything in place while the inks set. This last layer won’t be part of the final device; it is washed away in the final steps of the process.
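The layer sequence above can be pictured as a simple print plan. This is purely illustrative — the layer names and nozzle assignments are assumptions for the sketch, not the team’s actual tool-path code:

```python
# Illustrative print plan for the layered tactile sensor described above.
# Layer names and nozzle numbers are assumed; the sacrificial support
# layer is washed away after the other inks have set.

LAYERS = [
    {"name": "silicone base",        "nozzle": 1, "permanent": True},
    {"name": "bottom electrode",     "nozzle": 2, "permanent": True},
    {"name": "coil pressure sensor", "nozzle": 3, "permanent": True},
    {"name": "top electrode",        "nozzle": 2, "permanent": True},
    {"name": "sacrificial support",  "nozzle": 4, "permanent": False},
]

def final_device(layers):
    """Return the layers that remain after the sacrificial ink is washed off."""
    return [layer["name"] for layer in layers if layer["permanent"]]

print(final_device(LAYERS))
# ['silicone base', 'bottom electrode', 'coil pressure sensor', 'top electrode']
```

The point of the sketch is simply that one print job interleaves structural, conductive, sensing, and temporary support inks, and only the first four survive into the finished device.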


The device can ‘feel’ and relay pressure, functioning very much like our skin’s tactile sensors. And this outsourcing of tactile sense could lend itself well to many applications — for example, installing this bionic skin on surgical robots would enable surgeons to actually feel around during procedures, instead of relying solely on cameras as they do now. It could also be used to give robots a sense of touch, helping them walk or better interact with their environment.

But one of the most exciting aspects of this printing process is that the layers are fully flexible and the whole printing process can be performed at room temperature. Each layer can also stretch to up to three times its original size. Taken together, this means that unlike conventional 3D printing systems — which use hot, molten materials — the team’s device can print directly on live human skin. The layers are also flexible and resilient enough not to hinder motion or break during use. Here’s the printer in action on a dummy.

Michael McAlpine, a University of Minnesota mechanical engineering associate professor and lead researcher on the study, says the technology could eventually lead to wearable electronics that perform a wide range of tasks, from monitoring a user’s health all the way to protecting soldiers from explosives or chemicals in combat.

“While we haven’t printed on human skin yet, we were able to print on the curved surface of a model hand using our technique,” McAlpine said. “We also interfaced a printed device with the skin and were surprised that the device was so sensitive that it could detect your pulse in real time.”

“This is a completely new way to approach 3D printing of electronics,” he adds. “We have a multifunctional printer that can print several layers to make these flexible sensory devices. This could take us into so many directions from health monitoring to energy harvesting to chemical sensing.”

Another advantage of the bionic skin is that every step of manufacturing is built into the current process, so there’s no need to scale anything up for industrial production. The skin is “ready to go now,” according to McAlpine.

Next, the team plans to expand the process to include semiconductor inks and work on printing on a real body.

The full paper “3D Printed Stretchable Tactile Sensors” has been published in the journal Advanced Materials.

Artificial skin can feel pressure, then tell your brain about it

Prosthetics has come a long way from its humble beginnings – the crude wooden legs of yore are a far cry from the technological marvels we can create to replace our limbs today. However, there is one thing that, with all our know-how, we haven’t yet been able to incorporate in them: a sense of touch. A research team from Stanford University aims to fix this shortcoming, and has developed technology that can “feel” when force is exerted upon it, then transmit the sensory data to neurons – in essence, they’ve created an artificial skin.

Image via factor-tech


Tactile sense is a very important source of information for our brains, and having an otherwise functioning limb that doesn’t feel what it’s touching is something most of us can’t even imagine. Sit on your hand till it goes numb, then try to tie your laces – it’s frustratingly hard, and personally, I find the sensation disturbing.

Now imagine that numbness persisting for your whole life. That’s what prosthetic users have to live with, a serious limitation imposed on even the most effective prosthetic. Without tactile sensitivity, it’s hard to maintain optimal motor control, and it’s impossible to know how much force you’re exerting on an object, or its temperature and texture, for example. To make matters worse, having a sense of touch (even the illusion of it) is one of the best ways to alleviate phantom limb pain, which affects nearly 80% of amputees.

The human skin is a superbly complex and well-tuned sensory organ – so much so, in fact, that we may never be capable of creating something that reacts to stimulus in quite the same way it does. But the Stanford team, led by electrical engineer Benjamin Tee, recently performed a proof-of-concept experiment that brought artificial tactile sense one step closer from the realm of sci-fi to reality. They used flexible organic circuits and innovative pressure sensors to create a skin-like interface that can sense the force of static objects. Data recorded by the device was transmitted via optogenetics to cultured mouse brain cells. Their work was published in the journal Science.

The DiTact

Artificial mechanoreceptors mounted on the fingers of a model robotic hand.
Image via phys

The system, dubbed “DiTact” (Digital Tactile System), relies on low-power, flexible organic transistor circuitry that can translate pressure into the same signals our natural mechanoreceptors generate. To make the sensors precise and to give them a wide enough range of pressure recording, the team created them out of carbon nanotubes shaped into tiny pyramidal structures.

“Our sensor was made of tiny pyramids of rubber with carbon nanotubes distributed in it,” noted study co-author Alex Chortos. “This structure was very useful because it allowed us to easily change a few things, like the distance between the pyramids, the size of the pyramids, and the concentration of carbon nanotubes in order to get the ideal pressure sensing characteristics in the right range.”

The nanostructure of the pyramids allowed the researchers to increase the sensors’ precision close to the levels of our own cutaneous receptors.
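As a rough intuition for why those geometry knobs matter, here is a toy model of a piezoresistive pyramid sensor. The functional form and all constants are invented for illustration — they are not taken from the paper:

```python
def sensor_response(pressure_kpa, pyramid_size=1.0, spacing=1.0, cnt_fraction=0.1):
    """Toy piezoresistive response: output grows with applied pressure.

    In this sketch, smaller and more closely spaced pyramids, and a higher
    carbon-nanotube fraction, all increase sensitivity -- mirroring the
    tuning knobs Chortos describes. Units and the linear functional form
    are illustrative assumptions only.
    """
    sensitivity = cnt_fraction / (pyramid_size * spacing)
    return sensitivity * pressure_kpa  # arbitrary conductance units

# Doubling the nanotube fraction doubles the response in this toy model.
low = sensor_response(10.0, cnt_fraction=0.1)
high = sensor_response(10.0, cnt_fraction=0.2)
print(high / low)  # 2.0
```

The real device’s response is certainly nonlinear, but the sketch captures the design idea: geometry and material composition are independent dials for placing the sensor’s sensitivity in the right pressure range.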


But just having sensors isn’t enough – all the magic happens in the brain. To create sensation, the researchers took the signals from the pressure sensors and transferred them via optic cables to mouse cortical neurons – as the technology is still in an early stage of development, the cells were cultured in vitro rather than using the brains of live animals.

But using the same technique, signals from a prosthesis coated with DiTact could be fed directly to the brain of a living human – optogenetics has been successfully used on live subjects before. All that is needed is for a number of neurons to be genetically altered to respond to light signals. Using a transgene obtained from certain algal strains, neurons can be made to fire electrical signals when exposed to blue light, or to yellow light using a bacterial transgene.

However, because of the rate at which sensory information is processed by neurons, the team had to implement a few of their own changes to the classical method.

“Biological mechanoreceptors are able to produce signals as fast as several hundred electrical pulses per second,” says Chortos. “Previous optogenetic technologies were only capable of stimulating brain cells much slower than we need to mimic real mechanoreceptors.”
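The rate constraint Chortos describes can be pictured as a simple pressure-to-pulse-rate mapping. This is a sketch with made-up scaling constants, not the actual DiTact encoding:

```python
MAX_RATE_HZ = 300  # "several hundred electrical pulses per second"

def pulses_per_second(pressure_kpa, gain=10.0):
    """Map applied pressure to an optogenetic stimulation pulse rate.

    The linear gain is an assumed placeholder; real mechanoreceptor
    encoding is more complex. The rate is clamped at the biological
    firing ceiling quoted in the article.
    """
    return min(MAX_RATE_HZ, gain * pressure_kpa)

print(pulses_per_second(5.0))    # 50.0 -- light touch, slow firing
print(pulses_per_second(100.0))  # 300  -- firm press saturates the rate
```

The point is that any optogenetic tool driving the neurons has to keep up with pulse rates in the hundreds per second, which is exactly why the team needed the faster stimulation method described next.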

Luckily, Chortos knew of the work of Andre Berndt and Karl Deisseroth, who developed a new type of optogenetic treatment that allows brain cells to be stimulated rapidly enough to be compatible with the speed of real mechanoreceptors. Using the new optogenetic proteins, the neurons were able to sustain longer intervals of stimulation, suggesting that the system could also work with other fast-firing neurons, including peripheral nerves. This, the team says, means that DiTact will likely work in live mice or humans, and the good results they’ve seen so far mean they will test the system on a live mouse as soon as possible.

Getting a feel for the future

“We could validate that our sensor is conveying the correct information to [a live] animal by using behavioral cues, i.e. how the animal behaves in response to pressure,” said Chortos. “The ultimate test will be to attach the sensor to a human and ask them what they feel. In order to get truly natural touch sensing, we may need to modify and tweak our design.”

“We envision our artificial mechanoreceptors making the greatest impact via integration for sensory feedback with prosthetic systems in development by other groups,” noted co-author Amanda Nguyen. “As our sensor would be mounted alongside artificial limb systems, the primary safety concerns are centered around nerve stimulation patterns and interface.”

Nguyen notes that while the early work on sensory feedback with neurally interfaced prosthetics shows great promise, we need to truly understand how to effectively and safely stimulate nerves in order to provide realistic sensory feedback.

“As a greater understanding of stimulation parameters is gained, the output of our artificial mechanoreceptor will be tuned to follow these stimulation paradigms,” she said. “With demonstrated efficacy and safety, the potential for improving the quality of life for individuals with tactile impairments can be balanced with the ethical concerns raised by neuroprosthetics. Accessibility of this type of technology in humans will grow as both our understanding of neuroscience grows and prosthetic technology advances to provide nuanced sensory perceptions.”