Prototype tactile recording and display system for real-time reproduction of touch (credit: UCSD)

Next level user-interface tech: recording and rendering human touch

Since touch screen interfaces were introduced at mass scale, the way people interact with technology has arguably been revolutionized. Still, there is much more to explore in how the sense of touch can be harnessed to enrich user interaction with technology. Recording and playing back information for the sense of sound (audio files) or sight (photo and video files) has been part of technology for more than a century now. The other senses, however, seem to have been bypassed by the digital revolution, in part because they are so much more difficult to reproduce. This might change in the future, ultimately leading to user interfaces that offer a complete, lifelike experience stimulating all the human senses.

Recent research by scientists at the University of California, San Diego explores the sense of touch in user interfaces. The researchers devised a setup of sensor and actuator arrays capable of electronically recording and playing back human touch. They envision far-reaching implications for health and medicine, education, social networking, e-commerce, robotics, gaming, and military applications, among others.

In their recently demonstrated prototype, an 8 × 8 active-matrix pressure-sensor array made of transparent zinc-oxide (ZnO) thin-film transistors (TFTs) records the contact and pressure produced by a user's fingertips. The electrical signal is sent to a PC, which processes the data and forwards it to a tactile feedback display: an 8 × 8 array of diaphragm actuators made of an acrylic-based dielectric elastomer with the structure of an interpenetrating polymer network (IPN). The polymer actuators play back the recorded touch, dynamically shaping the force and level of displacement by adjusting both the voltage and the charging time.
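The record-process-playback loop described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the function names, the linear pressure-to-voltage mapping, and the voltage range are all assumptions made for the example; only the 8 × 8 grid comes from the paper.

```python
GRID = 8  # 8 x 8 active-matrix array, as in the prototype

def record_frame(read_pixel):
    """Scan the sensor array row by row, returning one 8x8 pressure frame.
    read_pixel(row, col) stands in for reading one ZnO TFT sensor element."""
    return [[read_pixel(r, c) for c in range(GRID)] for r in range(GRID)]

def pressure_to_drive(pressure, max_pressure=1.0, max_voltage=3000.0):
    """Map a normalized pressure reading to an actuator drive voltage.
    A simple linear mapping is assumed here; the real actuators shape force
    and displacement by adjusting both voltage and charging time."""
    p = min(max(pressure, 0.0), max_pressure)  # clamp to the valid range
    return (p / max_pressure) * max_voltage

def playback_frame(frame, drive_pixel):
    """Send one recorded frame to the actuator array.
    drive_pixel(row, col, volts) stands in for driving one elastomer cell."""
    for r, row in enumerate(frame):
        for c, p in enumerate(row):
            drive_pixel(r, c, pressure_to_drive(p))
```

In a running system, `record_frame` and `playback_frame` would be called once per sampling interval, with the PC in between free to store, transform, or transmit each frame.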

“One of the critical challenges in developing touch systems is that the sensation is not one thing. It can involve the feeling of physical contact, force or pressure, hot and cold, texture and deformation, moisture or dryness, and pain or itching. It makes it very difficult to fully record and reproduce the sense of touch,” the researchers write in the paper documenting their work, published in the journal Scientific Reports.

In addition to simply playing back touch, the touch data can be stored in memory and replayed at a later time or sent to other users. “It is also possible to change the feeling of touch, or even produce synthesized touch by varying temporal or spatial resolutions,” said Deli Wang, a professor of Electrical and Computer Engineering (ECE) in UC San Diego’s Jacobs School of Engineering and the study’s senior author. “It could create experiences that do not exist in nature.”
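The idea of altering a recorded touch by varying its spatial or temporal resolution can be illustrated with two toy transforms on a sequence of pressure frames. Both functions are hypothetical examples, not anything described in the paper: nearest-neighbour upsampling and frame-repetition retiming are just the simplest ways to demonstrate the concept.

```python
def upsample_frame(frame, factor=2):
    """Nearest-neighbour spatial upsampling of an NxN touch frame: each
    sensor pixel drives a factor x factor block of actuator cells, one
    crude way to change the spatial resolution of a recorded touch."""
    return [[frame[r // factor][c // factor]
             for c in range(len(frame[0]) * factor)]
            for r in range(len(frame) * factor)]

def retime(frames, speed=0.5):
    """Change temporal resolution by replaying frames at a different rate:
    speed < 1 slows the touch down by repeating frames, speed > 1 skips
    frames to speed it up."""
    out, t = [], 0.0
    while int(t) < len(frames):
        out.append(frames[int(t)])
        t += speed
    return out
```

Feeding the transformed frames into the playback stage instead of the originals would produce a touch that was never physically recorded, which is the sense in which synthesized touch "could create experiences that do not exist in nature."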

The technology is still in its incipient form, so the scientists’ work is better viewed as a proof of concept than a complete solution. It will need to improve drastically to reproduce all the intricate parameters that shape the sense of touch, yet this work signals that we’re on our way there. What about smell and taste? Well, that should be really interesting to see. [READ: Music for the nose: the olfactory organ]

  • Siarhei Vishniakou et al., “Tactile Feedback Display with Spatial and Temporal Resolutions,” Scientific Reports, 2013. DOI: 10.1038/srep02521 (open access)
