Tag Archives: user interface

Next level user-interface tech: recording and rendering human touch

Prototype tactile recording and display system for real-time reproduction of touch (credit: UCSD)

Since touch screen interfaces were introduced on a mass scale, the way people interact with technology has arguably been revolutionized. Still, there is much more to be explored in how the sense of touch can be manipulated to enrich user interaction with tech. Recording and playing back information for the sense of sound (audio files) or sight (photo and video files) has been successfully incorporated into technology for more than a century now. The other senses, however, seem to have been bypassed by the digital revolution, in part because they’re so much more difficult to reproduce. This might change in the future, ultimately leading to user interfaces that offer a complete, real-life experience stimulating all human senses.

Recent research by scientists at the University of California, San Diego explores the sense of touch in user interfaces. The researchers devised a setup of sensors and sensor arrays capable of electronically recording and playing back human touch. They envision far-reaching implications for health and medicine, education, social networking, e-commerce, robotics, gaming, and military applications, among others.

In their recently demonstrated prototype, an 8 × 8 active-matrix pressure sensor array made of transparent zinc-oxide (ZnO) thin-film transistors (TFTs) records the contact and pressure produced by a user’s fingertips. The electrical signal is sent to a PC, which processes the data and passes it on to a tactile feedback display: an 8 × 8 array of diaphragm actuators made of an acrylic-based dielectric elastomer with the structure of an interpenetrating polymer network (IPN). The polymer actuators play back the touch recorded by the ZnO TFTs, dynamically shaping the force and level of displacement by adjusting both the voltage and the charging time.
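To get a feel for the record-and-playback loop, here’s a minimal Python sketch of the pipeline described above. The 8 × 8 grid matches the prototype, but the pressure range, voltage range, and the simple linear mapping between them are made-up placeholders – the actual actuators shape force through both voltage and charging time, as the paper describes.

```python
import numpy as np

# Illustrative constants -- the real calibration in the paper differs.
GRID = 8                      # 8 x 8 sensor / actuator arrays
V_MIN, V_MAX = 0.0, 3000.0    # hypothetical actuator drive range (volts)
P_MAX = 50.0                  # hypothetical full-scale pressure (kPa)

def read_pressure_frame() -> np.ndarray:
    """Stand-in for the ZnO TFT array readout: one 8x8 pressure map."""
    return np.random.uniform(0.0, P_MAX, size=(GRID, GRID))

def pressure_to_voltage(frame: np.ndarray) -> np.ndarray:
    """Map each cell's pressure to an actuator drive voltage.

    A simple linear mapping is assumed here purely for illustration.
    """
    normalized = np.clip(frame / P_MAX, 0.0, 1.0)
    return V_MIN + normalized * (V_MAX - V_MIN)

frame = read_pressure_frame()          # "record" a touch
drive = pressure_to_voltage(frame)     # "play it back" on the display
print(drive.round(1))
```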

“One of the critical challenges in developing touch systems is that the sensation is not one thing. It can involve the feeling of physical contact, force or pressure, hot and cold, texture and deformation, moisture or dryness, and pain or itching. It makes it very difficult to fully record and reproduce the sense of touch,” the researchers write in the paper documenting their work, published in the open-access journal Scientific Reports.

In addition to simply playing back touch, the touch data can be stored in memory and replayed at a later time, or sent to other users. “It is also possible to change the feeling of touch, or even produce synthesized touch by varying temporal or spatial resolutions,” said Deli Wang, a professor of Electrical and Computer Engineering (ECE) in UC San Diego’s Jacobs School of Engineering and senior author of the study. “It could create experiences that do not exist in nature.”
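To illustrate what “varying temporal or spatial resolutions” might mean in practice, here’s a toy sketch of resampling a stored touch clip. The clip data, frame rate, and resampling scheme are all invented for illustration – this is not the team’s actual method.

```python
import numpy as np

# A recorded touch clip: T frames of an 8x8 pressure map (fake data).
clip = np.random.rand(30, 8, 8)   # e.g. 30 frames captured at 30 fps

def stretch_time(clip: np.ndarray, factor: float) -> np.ndarray:
    """Replay slower (factor > 1) or faster (factor < 1) by resampling frames."""
    n_out = int(len(clip) * factor)
    idx = np.linspace(0, len(clip) - 1, n_out).round().astype(int)
    return clip[idx]

def upscale_space(frame: np.ndarray, k: int) -> np.ndarray:
    """Nearest-neighbour upsampling of one 8x8 frame to (8k x 8k)."""
    return np.kron(frame, np.ones((k, k)))

slow_motion = stretch_time(clip, 2.0)     # the same touch at half speed
finer_grid = upscale_space(clip[0], 2)    # one frame spread over a 16x16 grid
print(slow_motion.shape, finer_grid.shape)
```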

The technology is still in its infancy, so it’s better to view the scientists’ work as a proof of concept rather than a complete solution. It needs to improve drastically before it can reproduce all the intricate parameters that make up the sense of touch, yet this work signals that we’re on our way there. What about smell and taste? Well, that should be really interesting to see. [READ: Music for the nose: the olfactory organ]

  • Siarhei Vishniakou et al., Tactile Feedback Display with Spatial and Temporal Resolutions, Scientific Reports, 2013, DOI: 10.1038/srep02521 (open access)

Real life ‘holodeck’ in 10 years? Very possible, Tim Huckaby says

Holodeck aboard the USS Enterprise, as depicted in Star Trek

In his recent keynote at the 2013 Consumer Electronics Show (CES) in Las Vegas, Tim Huckaby dazzled the audience with his predictions for the future of user interfaces and technology. His presentation was initially structured as a showcase of possible developments over the next five years, but Huckaby didn’t stop there and also talked a bit about how technology will evolve even further.

Huckaby – founder and chairman of California-based InterKnowlogy, as well as the current chief executive officer of Actus Interactive Software – believes we’re on the verge of entering an era where what we today call science fiction merges with reality. Some of his most appealing examples of how we will interact with technology in the future, based on decades of work in emerging technology, are: interfaces where doctors manipulate molecules in three-dimensional (3-D) space, augmented music players that tune into your thoughts, and retailers that deliver coupons in real time based on where your gaze falls across store shelves.

Imagine a world in retail where my wife has opted in at Nordstrom’s, or Macy’s, or something like that to be tracked through the store… We can see what you’re looking at, and we can push a coupon to you. ‘Hey, Kelly, you were in the Seattle Nordstrom’s, and you looked at these cute shoes, but you didn’t buy them. Now you’re in the Las Vegas Nordstrom’s. You’re looking at the exact same shoes. How about 40 percent off if you buy them right now?’ That’s the beauty of retail.

Imagining the future

I don’t know what to think about the above example; it might spell maddening spam trouble – still, it does sound interesting. In the second part of his presentation, however, Huckaby really entered the realm of SciFi – the realm of highly possible, scientifically grounded SciFi, that is. For instance, he talks about a “holodeck” (as in Star Trek) in which holographic images are displayed; a legitimate neural interface offering a direct pathway between the brain and external devices; and virtual objects that extend into practically every facet of life and behave much as they would in the natural world.

In the past few years, incredible progress has been made in brain/external device interfaces. For instance, we reported a while ago how a paralyzed woman was able to control a robotic hand with the power of thought, deploying it for basic tasks like gripping, pulling, pushing, releasing, feeding and so on. Princeton scientists were able to “read the minds” of participants after developing a technique that analyzes fMRI scans and correlates them with a known database of brain activity patterns. Even more interesting, University of California researchers were able to read the minds of participants and reconstruct their train of thought as video!

In the video below you can watch a 2010 TED talk demonstrating Emotiv’s brainwave-reading machine.

Do these ideas seem far-fetched to you? Think about it for a moment. You’re probably viewing this article on a handheld device more powerful than all the world’s supercomputers combined just 25 years ago. Remember how weird Bluetooth devices looked just ten years ago? What about touch screen interfaces?

via Smart Planet

Thought the touchscreen was innovative? Get ready for a complete user interface revamp

Leap Motion

Touchscreen technology has been in use for many years, but when the first iPhone came out some six years ago, it totally changed front-end design and user interfaces, because it brought the technology to the people, the common folk. You didn’t have to be a scientist to own or use a touchscreen device – it’s dead simple and fun. Just a few years later, everything has a touch interface, and I’m not exclusively talking about smartphones or tablets here; even washing machines and electric shavers have joined in. Touch interfaces have definitely raised the bar for interactivity. Have they really revolutionized UI design, though, turned it upside down? Well, absolutely not – not if you compare them with what’s just around the corner: the Leap Motion system.

Imagine a device that costs only $70, plugs in via USB, and turns your notebook, tablet or TV into a virtual playground with six degrees of freedom. Minority Report? No, this is better! If you haven’t watched the company’s presentation demos, do it now and prepare to be amazed. This might just be the biggest thing in user interfaces since the mouse!

Sure, you might object that this is extremely similar to the Kinect, but believe me, after scouring hundreds of blogs for more information on this piece of gold, I can tell you the Leap is a lot more powerful. It’s 200 times more accurate than the Kinect, to the point that it can distinguish all ten of your fingers individually and track your movements to 1/100th of a millimeter.

The device’s working principle is fairly simple in nature, if you choose to ignore the complex, yet beautiful development work that went into the back-end. A number of camera sensors map out a workspace – a 3D space in which you operate as you normally would, with almost none of the Kinect’s angle and distance restrictions. For the current prototype, the workspace is about three cubic feet, but the only restriction here is the sensors – want a bigger workspace? Just choose bigger sensors.
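As a rough illustration of what such a workspace mapping involves, here’s a toy Python sketch that projects a tracked fingertip position from a roughly three-cubic-foot box onto screen coordinates. The workspace bounds and the whole mapping are guesses on my part – the actual SDK hasn’t been released yet.

```python
# Hypothetical bounds of a ~3 cubic foot interaction box, in millimeters.
WORKSPACE = {
    "x": (-220.0, 220.0),
    "y": (80.0, 520.0),    # height above the device
    "z": (-220.0, 220.0),
}
SCREEN_W, SCREEN_H = 1920, 1080

def to_screen(x: float, y: float, z: float) -> tuple:
    """Project a 3D fingertip position to pixel coordinates (z ignored here)."""
    x0, x1 = WORKSPACE["x"]
    y0, y1 = WORKSPACE["y"]
    u = (x - x0) / (x1 - x0)          # 0..1 across the box, left to right
    v = 1.0 - (y - y0) / (y1 - y0)    # invert: higher finger = higher on screen
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return int(u * (SCREEN_W - 1)), int(v * (SCREEN_H - 1))

# A fingertip centred in the box should land mid-screen.
print(to_screen(0.0, 300.0, 0.0))
```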

UI Revolution

Yes, you can become a Fruit Ninja master with this incredible motion sensing system, but trust me, this is meant for something a lot deeper. The possibilities are virtually limitless. Imagine browsing websites or organizing your files in the most natural of manners. You don’t need to learn to place two fingers apart and drag to zoom and such. The system ensures that everything you’d want to do naturally in the real world can be transferred virtually. If you want to draw, you just use your finger or pick up a pencil and start drawing. Want to pick up a ball in a video game? Just mimic the grabbing action. Everything can be manipulated. Imagine a special app based on the Leap for remote surgery. Amazing!
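To give a flavor of how a “grab” could be recognized from raw tracking data, here’s a toy detector that checks whether all fingertips have curled in toward the palm. The coordinates and the 60 mm threshold are made up for illustration – a real implementation would be far more robust.

```python
import math

GRAB_RADIUS_MM = 60.0   # invented threshold: fingertips this close = a fist

def is_grabbing(palm, fingertips):
    """True when every fingertip is within GRAB_RADIUS_MM of the palm."""
    return all(math.dist(palm, tip) <= GRAB_RADIUS_MM for tip in fingertips)

# Fake per-frame data a tracker might supply, in mm relative to the sensor.
palm = (0, 0, 0)
open_hand = [(80, 0, 0), (70, 40, 0), (60, 80, 0), (50, 100, 0), (90, -30, 0)]
fist      = [(30, 10, 0), (25, 20, 0), (20, 30, 0), (15, 35, 0), (35, -10, 0)]

print(is_grabbing(palm, open_hand))  # False -- fingers extended
print(is_grabbing(palm, fist))       # True  -- fingers curled in
```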

Over the past few years, thousands of developers have done their best, and some have released incredibly useful and creative apps to the public via Apple’s App Store or the Android Market. Leap Motion officials have stated that their system’s SDK will be able to do much more, however, and the possibilities appear to be limited only by your imagination.

The Leap is slated for release sometime between December and February. So, what do you guys think?