Tag Archives: touch

New research finds the neurons that make mice itchy

New research is looking into how our bodies sense and transmit itchiness to the brain.

Light touches play an important role in our daily lives. Between cuddling, picking up fragile objects, and performing tasks that require precision, we use the sensation to guide many of our activities. It’s also an essential part of the body’s defense system, telling us, among other things, if we’re covered in biting insects such as ticks or mosquitoes — via that oh-so-pleasant sensation of itchiness.

Creepy crawlies

“The takeaway is that this mechanical itch sensation is distinct from other forms of touch and it has this specialized pathway within the spinal cord,” says Salk Institute Professor Martyn Goulding, senior author of the new study.

The team looked at how neurons in the spinal cord carry these itchy signals to the brain. They hope that the findings will help lead to new drugs to treat chronic itch, which occurs in such conditions as eczema, diabetes, and even some types of cancer.

Goulding and his colleagues had previously found a set of inhibitory neurons in the spinal cord that keep the itchiness pathway locked down most of the time. Inhibitory neurons act as brakes on neural circuits, dampening their activity. Without these neurons — which produce the neurotransmitter neuropeptide Y (NPY) — the pathway is constantly active, causing chronic itching.

What the team wanted to find out next was how the signal encoding this sensation is transmitted to the brain, making us feel the itch. One of the team’s hypotheses was that when NPY inhibitory neurons are missing, the nerve bundles in the spinal cord that transmit light touch get stuck on the “on” setting — which creates a self-amplifying loop. The team identified a population of such (excitatory) neurons in the spinal cord that express the receptor for NPY, the so-called Y1 spinal neurons.

To test if these were indeed behind the self-amplifying loop of itchiness, the team selectively removed the NPY “brake” and Y1 “accelerator” neurons in mice to see the effects.

Without Y1 neurons, they report, the mice didn’t scratch, not even in response to light-touch stimuli that normally make them scratch. When the team gave them drugs to activate the Y1 neurons, the mice scratched spontaneously even in the absence of any touch stimuli. The team was then able to link NPY neurotransmitter levels to Y1 neuron excitability — showing that NPY controls our sensitivity to light touch. The findings are also supported by other research which found that people with psoriasis have lower than average levels of NPY.
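To make that brake-and-accelerator logic concrete, here is a toy rate model in Python that reproduces the four outcomes above. It is only a conceptual sketch: the variable names, the linear interaction, and the threshold are made up for illustration and are not the study’s actual model.

```python
def y1_activity(touch_drive, npy_tone, gain=1.0, drug_drive=0.0):
    """Y1 'accelerator' output: touch excitation plus any direct (drug)
    activation, minus NPY 'brake' inhibition. Rates can't go below zero."""
    return max(0.0, gain * touch_drive + drug_drive - npy_tone)

def scratches(touch_drive, npy_tone, gain=1.0, drug_drive=0.0, threshold=0.5):
    """The animal scratches when Y1 output crosses a threshold."""
    return y1_activity(touch_drive, npy_tone, gain, drug_drive) > threshold

# Normal mouse: NPY inhibition gates light touch out of the itch pathway.
print(scratches(touch_drive=1.0, npy_tone=0.8))                  # False
# NPY 'brake' removed: the same light touch now drives scratching.
print(scratches(touch_drive=1.0, npy_tone=0.0))                  # True
# Y1 'accelerator' removed (gain = 0): no scratching at all.
print(scratches(touch_drive=1.0, npy_tone=0.0, gain=0.0))        # False
# Y1 activated by drug: scratching even with no touch stimulus.
print(scratches(touch_drive=0.0, npy_tone=0.8, drug_drive=2.0))  # True
```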

While the study shows how itchy signals go through the spinal cord, more research is needed to understand the full pathway. There are other neurons that likely mediate its transmission and final response in the brain, the team explains.

“By working out mechanisms by which mechanical itch is signaled under normal circumstances, we might then be able to address what happens in chronic itch,” says David Acton, a postdoctoral fellow in the Goulding lab and the study’s first author.

The paper “Spinal Neuropeptide Y1 Receptor-Expressing Neurons Form an Essential Excitatory Pathway for Mechanical Itch” has been published in the journal Cell Reports.


Scientists develop prosthetic hand that enables users to feel touch

A high-tech prosthetic hand allows users to experience touch. Credit: University of Utah.


An electrical accident 17 years ago claimed Keven Walgamott’s hand. Now, researchers at the University of Utah have fitted the man with an innovative prosthetic arm whose fingers not only move with his thoughts but are also capable of relaying sensations. Essentially, this is a prosthetic hand that feels. It’s so sensitive that Walgamott was able to hold an egg between his fingers without squeezing too hard and breaking it.

The prosthetic hand that feels

The technology was developed by a team led by biomedical engineering associate professor Gregory Clark. The backbone of the prototype is the Utah Slanted Electrode Array (USEA), which is an interface between the prosthetic hand and the patient’s remaining sensory and motor nerves in the arm.

USEA consists of hundreds of electrodes that are surgically implanted next to the nerve fibers. The electrodes pick up the ‘chatter’ of nearby nerve fibers, forming a connection between the prosthesis and the nervous system.
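The article doesn’t describe the decoding algorithms, but conceptually an interface like this has to turn electrode firing rates into movement commands. Below is a minimal sketch of one common approach, a linear decoder fit by least squares; the array sizes, the synthetic data, and the linear model are all assumptions for illustration, not the Utah team’s actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

n_electrodes, n_fingers, n_samples = 96, 5, 500
W_true = rng.normal(size=(n_electrodes, n_fingers))       # unknown "ground truth"
rates = rng.poisson(5.0, size=(n_samples, n_electrodes))  # spike counts per time bin
intents = rates @ W_true + rng.normal(scale=0.1, size=(n_samples, n_fingers))

# Fit decoder weights offline by least squares: rates @ W ~= finger intents.
W, *_ = np.linalg.lstsq(rates, intents, rcond=None)

# At run time, each new window of spike counts becomes a finger command.
new_rates = rng.poisson(5.0, size=(1, n_electrodes))
finger_commands = new_rates @ W
print(finger_commands.shape)  # (1, 5): one command per finger
```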

The prosthetic — called “LUKE” after the prosthetic Luke Skywalker wore in Star Wars — was fitted to Walgamott in 2017. Since then, he has been training closely with the researchers at the University of Utah to perform extremely delicate tasks that would have otherwise been impossible with metal hook or claw prosthetics.

“It almost put me to tears,” Walgamott said. “It was really amazing. I never thought I would be able to feel in that hand again.”

Scientists have been working on the LUKE arm for more than 15 years. It’s mostly made of metal motors and parts which control finely articulated fingers, along with an external battery that’s wired to a computer. Sensors covering the hand send signals to the nerves via the microelectrode array, mimicking the sensations you’d feel in your own hand when picking something up.

Star Wars-inspired

One huge breakthrough in developing LUKE’s touch involved understanding and recreating how the brain interprets the first moment of contact with an object.

“Just providing sensation is a big deal, but the way you send that information is also critically important, and if you make it more biologically realistic, the brain will understand it better and the performance of this sensation will also be better,” said Clark.

Although the quality of touch that Walgamott can feel with his new prosthesis isn’t nearly as sensitive as a real hand, this is still a huge leap from nothing at all. With it, Walgamott can distinguish between touching something soft or hard, the kind of sensitivity that allows him to live a fuller life. For instance, the researchers claim that the man is now able to perform complex movements such as picking grapes or stuffing a pillow into its case.

For Walgamott, these training sessions have been incredibly emotional.

“One of the first things he wanted to do was put on his wedding ring. That’s hard to do with one hand,” says Clark. “It was very moving.”

Next, Clark and colleagues plan to improve the design of the prosthetic to make a mobile version. Right now, it can only be used in the lab where it has to be hooked up with all sorts of bulky machinery.

Clark hopes that in 2020 or 2021, three participants will be able to take their arms home, as long as they receive FDA approval.

It’ll take years, though, before such devices are commercially available. Nevertheless, it’s incredibly inspiring to see technology truly in service of the people.


Why do pets like pats?

Why do we, too, like pats for that matter?


Think it’ll turn into a handsome prince?
Image credits Juda / Pixabay.

If you happen to have your own little, warm ball of fur back home, you know how much they love to be petted. But why do they like it, exactly? And do we humans like it, too? Let’s find out.

Skin-deep

Dogs love belly rubs. Cats break into a purr when you scratch that one special place between their ears. Hugs give us comfort and pleasure. If you hit your hand on something, a quick rub will reduce the pain.

All of these are grounded in the sense of touch, and they show what a massive role it plays in our emotional state. While not all touches are pleasurable, all mammals seem to agree that a long, light stroking motion feels good.

This particular type of motion stimulates a set of neurons known as MRGPRB4+, according to a study published in Nature Neuroscience in 2007. The authors worked with genetically-engineered mice whose MRGPRB4+ neurons were modified to light up when activated (optogenetics). A pet-like stroking pattern of touch — and only this pattern of touch — at temperatures around that of human skin activated the neurons, the team found, inducing a pleasant sensation in the animal.

These neurons are connected to hair follicles in the skin and are relatively widely spaced. Their layout is what makes them respond only to long stroking motions, and not to more localized ones like pinching or poking.


Image credits Linnaea Mallette.

We also have these neurons built into the follicles of hair-covered portions of our skin. This suggests that MRGPRB4+ neurons respond to touch on the skin itself, not to motions transmitted through strands of hair. This is also supported by the fact that an individual can experience a pleasant sensation from petting, hugging, or stroking even after experiencing hair loss or shaving; if the MRGPRB4+ neurons were tied to hair strands instead of follicles, this wouldn’t be the case.

“The researchers suspect similar sensory neurons with comparable properties exist in humans and most furry mammals,” explained David Anderson, one of the study’s co-authors.

“Since the sensation is connected to hair follicles, animals with many of them, such as cats and dogs, likely feel waves of pleasure when being petted. The neurons that detect stroking are probably wired into higher brain circuits that produce a reward or pleasure.”

To validate these findings, the team further modified some mice so that the same neurons could be activated biochemically, via a drug injection. When given the choice between two chambers, a control one where nothing happened and one where the drug-induced touch sensation occurred, the mice opted for the latter. This implied that the animals actually found the sensations caused by MRGPRB4+ neuron activation to be pleasurable. The mice also showed fewer signs of stress after receiving their chemical pat.

So, to recap, furry, hairy animals (i.e. mammals) enjoy the sensation of being petted. The sensation is mediated by neurons connected to hair follicles in the skin and is only triggered by deliberate, slow, gentle, and relatively long strokes on the skin or fur. But we’re still missing a why — why did mammals evolve to experience pleasure from these patterns of touch?

Making buddies


“So how are you how’s the kids?”
Image credits Anthony / Pixabay.

For mammals, especially social ones, touch is a great way to make friends and strengthen bonds. Our wild cousins groom each other to remove harmful parasites from their fur, since they can’t do it by themselves. But past research has shown that they engage in this behavior far more than necessary from a purely hygienic standpoint. So, while grooming might have a very practical, even critical purpose, primates also seem to simply get a kick out of it and do it for fun or to socialize. It’s how they hang out.

We humans aren’t typically big on public displays of grooming, but we also employ touch socially. Hugs, handshakes, and taps on the shoulder are small gestures that can go a long way in strengthening familial or social bonds.

The mammal enjoyment of pats probably started out as a practical ritual — for example, as grooming — and our physiology later evolved to encourage the activity with positive sensations. Such behavior likely represented an evolutionary advantage as it promotes health, hygiene, bonding, and trust among the group, thereby increasing the survival chances of all its members. Alternatively, it is possible that this enjoyment of pats helps baby mammals keep warm by balling up together with their parents and siblings, thus conferring a selective advantage at a young age.

Regardless of why it happens, the end result is extremely effective at promoting bonding, social interaction, and good moods. Activation of the MRGPRB4+ neurons releases endorphins and oxytocin into the brain (these help with pain relief, relaxation, and bonding) and may lead to temporarily reduced levels of cortisol (a stress hormone). This chemical cocktail puts us, or our pets, at ease, nips aggression in the bud, and induces a state of pleasure.

In your brain

One paper published in NeuroImage in 2016 looked into the patterns of “brain activation during 40min of pleasant touch” — which sounds quite enjoyable. The authors worked with 25 participants who “were stroked for 40min with a soft brush while they were scanned with functional Magnetic Resonance Imaging [fMRI], and rated the perceived pleasantness of the brush stroking.”

What they found was that stroking initially activates neurons in the somatosensory cortex heavily, although the response dwindles in intensity over time — likely due to stimulus habituation. Stimulus habituation is what makes you less sensitive to a particular smell after being exposed to it for a while, and why you eventually stop feeling the chair under you or the smartphone in your right pocket.

At the same time, activity levels in the orbitofrontal gyrus (OFC, also known as the orbitofrontal cortex) and the putamen increase, stabilizing at about the 20-minute mark. Certain structures of the insular cortex (the posterior insula) also see greater activity during this time. The team believes this increase in cerebral activity comes down to the subjective pleasure each participant was feeling — pleasure is how your brain rewards you for doing something.

The workings of the orbitofrontal cortex have been linked to depression in humans. In particular, reports one study published in Brain in 2016, subjects with depression showed weaker neural connections between the medial (middle) OFC and the hippocampus, which is associated with memory. They also showed stronger neural connections between the lateral OFC and other areas of the brain. The study worked with 421 patients with major depressive disorder and 488 control subjects.

The study’s authors explain that the medial OFC activates when processing or ‘administering’ rewards in the form of pleasure. It’s not yet understood exactly what those weaker connections mean, but it does suggest that people with depression may find it more difficult to access and recall happy or positive memories. At the same time, the lateral OFC — which enjoys stronger connections with other brain areas — is involved in processing or administering the non-rewards: science-speak for ‘punishments’.

To tie it all into a neat little bow, one paper published in Current Biology last year reported that the “lateral OFC is a promising new stimulation target for treatment of mood disorders” such as depression. The team worked with 25 subjects, using electrodes to stimulate various areas of their brains while monitoring and recording their (self-reported) mood via a daily questionnaire.

Patting, stroking, massages — they activate the neurons in the OFC, which is exactly what the team achieved using their direct stimulation techniques. A literal gentle touch, then, may be just what you need when you’re struggling with depression.

And hey, if nobody’s around to pet you, grab a brush, clear out 20 minutes of your schedule, and go hack your OFC.


Human touch can feel molecule-thin differences, according to new study

The human sense of touch can pick up on differences as minute as a single layer of molecules, according to new research.


Image credits Pedro Figueras.

Our sense of touch comes in very handy. Through it, we can easily discriminate between a wide range of surfaces, from wood, paper, and metal to glass and plastics, chiefly due to differences in texture and because every material draws heat from our fingers at a different rate. So we understand the ‘how’, but until now we didn’t know how finely tuned our tactile sense is, i.e. what the smallest difference it can pick up on is. Such knowledge is crucial if we are to create life-like prosthetics that can accurately recreate our sense of touch, in the development of virtual and augmented realities, and for many other advanced technologies.

New research from the University of California San Diego says that our sense of touch is, in fact, so refined it can pick up on differences of a single layer of molecules.

A touching subject

Modern technology such as PCs, game consoles, smartphones, and TVs lets us experience the world more freely and fully than ever before. We can hear and watch events unfolding on the other side of the planet — even on the surface of other planets — but these devices don’t allow us to feel what’s happening. Mixing in that ingredient “is a driving force behind this work,” says San Diego nanoengineering Ph.D. student and paper co-author Cody Carpenter.

“Reproducing realistic tactile sensations is difficult because we don’t yet fully understand the basic ways in which materials interact with the sense of touch,” adds Darren Lipomi, a professor of nanoengineering at UC San Diego who led the research efforts.

For their research, the team asked participants to try and distinguish between various unassuming silicon wafers only by dragging or tapping a finger across their surface. The wafers were almost identical, differing only in their single, topmost layer of molecules.

So how could they pick up on such minute differences? The team was placing their hopes on stick-slip friction, the jerking motion that occurs as two objects at rest start sliding against one another. The phenomenon draws on the fact that, generally speaking, the static friction coefficient between two bodies is greater than their dynamic friction coefficient — it takes more force to start sliding two objects against each other than to keep them sliding. This difference means that immediately after movement starts and the dynamic coefficient takes over, there’s a short but powerful burst in the bodies’ relative velocity, hence the jerking motion.

The sound you make when running a wet finger along the rim of a wine glass is generated by stick-slip-induced vibrations in the glass. That’s also why an ungreased door hinge will squeak, and why trains make that infernal racket when stopping. It’s all stick-slip-induced vibrations dissipating as sound.
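For the curious, stick-slip is easy to reproduce numerically. The sketch below simulates a block dragged across a surface through a spring whose free end moves at constant speed: the spring force builds until it overcomes static friction, the block slips, stops, and sticks again. All parameter values are arbitrary illustrations, not measurements from the study.

```python
# Minimal stick-slip simulation (illustrative parameters only).
mu_s, mu_k = 0.6, 0.4          # static coefficient > kinetic coefficient
m, g, k = 0.1, 9.81, 200.0     # block mass (kg), gravity (m/s^2), spring stiffness (N/m)
v_drive, dt = 0.02, 1e-4       # driver speed (m/s), timestep (s)

x = v = x_drive = t = 0.0
sticking = True
for _ in range(20000):         # simulate 2 seconds
    t += dt
    x_drive += v_drive * dt
    f_spring = k * (x_drive - x)
    if sticking and abs(f_spring) > mu_s * m * g:
        sticking = False       # static friction overcome: a sudden slip begins
        print(f"slip starts at t = {t:.3f} s (spring force {f_spring:.3f} N)")
    if not sticking:
        f_friction = -mu_k * m * g * (1.0 if v >= 0 else -1.0)
        v += (f_spring + f_friction) / m * dt
        x += v * dt
        if v <= 0.0:           # block has stopped: it sticks again
            v = 0.0
            sticking = True
```

Run it and you’ll see periodic bursts of motion, the same jerky rhythm your fingertip rides when it drags across a surface.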

Hands-on approach


Image credits Simon Wijers.

The team ran several trials to see if their theory would hold. During the first, participants were presented with two wafers. One was covered in a single oxidized layer rich in oxygen atoms, the other with a Teflon-like layer of fluorine and carbon atoms. Both wafers looked identical so participants couldn’t tell them apart by appearance.

In another test, 15 subjects were presented with three surfaces and had to identify the one surface that differed from the other two. Subjects correctly identified the differences 71% of the time during this trial.

Finally, subjects were given three strips of silicon wafer, each patterned with a different sequence of 8 patches of oxidized and Teflon-like surfaces. Each strip thus encoded a binary sequence (1s and 0s), with each pattern corresponding to a letter in the ASCII alphabet. Participants were asked to read these sequences, using their fingers to tell which patches were oxidized and which were covered in the Teflon-like material. During this trial, 10 out of 11 subjects correctly decoded the word (which was “Lab”, spelled with upper- and lowercase letters) more than 50% of the time. Subjects spent an average of 4.5 minutes decoding each letter.
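To see how 8 patches can spell a letter, here is a minimal decoding sketch. The assignment of oxidized patches to 1-bits (and Teflon-like patches to 0-bits) is an assumption chosen for illustration, so that the example spells “Lab”.

```python
def decode_strip(patches: str) -> str:
    """patches: 8 characters, 'O' for oxidized (bit 1), 'T' for Teflon-like (bit 0)."""
    bits = "".join("1" if p == "O" else "0" for p in patches)
    return chr(int(bits, 2))  # interpret the 8 bits as one ASCII code

# 'L' is 0b01001100, 'a' is 0b01100001, 'b' is 0b01100010 in ASCII.
word = [decode_strip(s) for s in ("TOTTOOTT", "TOOTTTTO", "TOOTTTOT")]
print("".join(word))  # -> "Lab"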

“This is the greatest tactile sensitivity that has ever been shown in humans,” Lipomi explains.

From the data recorded during the trials, the team determined that materials should be distinguishable by how fast a finger drags and how much force it applies to the surface. They constructed a mock finger out of organic polymer, attached it to a force sensor, and ran it across the surfaces used during their study at different combinations of force and speed. Further processing of the data revealed that certain combinations of these two factors lend themselves well to distinguishing materials, while others create too much noise (chaotic data) to be used in such applications.
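The sketch below illustrates the kind of sweep described above: “drag” a sensor across both surfaces at each force–speed combination and score how separable the resulting friction traces are. The friction levels, noise model, and separability score are invented stand-ins for illustration; they are not the study’s actual data or analysis.

```python
import itertools
import numpy as np

forces = [0.1, 0.3, 0.5, 1.0]   # normal force, N (hypothetical values)
speeds = [5, 10, 20, 40]        # swipe speed, mm/s (hypothetical values)

def friction_trace(surface: str, force: float, speed: float) -> np.ndarray:
    """Stand-in for a measured friction-force time series."""
    rng = np.random.default_rng(hash((surface, force, speed)) % 2**32)
    base = 0.4 if surface == "oxidized" else 0.3   # assumed friction levels
    return base * force + rng.normal(scale=0.05 * speed / 10, size=200)

for force, speed in itertools.product(forces, speeds):
    a = friction_trace("oxidized", force, speed)
    b = friction_trace("teflon", force, speed)
    # Separability: mean difference relative to pooled spread (a d'-like score).
    score = abs(a.mean() - b.mean()) / np.sqrt((a.var() + b.var()) / 2)
    print(f"force={force:.1f} N, speed={speed:2d} mm/s -> separability {score:.2f}")
```

In this toy version, high speeds inflate the noise and drown out the difference, while firmer, slower swipes separate the two surfaces cleanly, mirroring the team’s point that only some force–speed combinations are informative.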

“Our results reveal a remarkable human ability to quickly home in on the right combinations of forces and swiping velocities required to feel the difference between these surfaces. They don’t need to reconstruct an entire matrix of data points one by one as we did in our experiments,” Lipomi said.

“It’s also interesting that the mock finger device, which doesn’t have anything resembling the hundreds of nerves in our skin, has just one force sensor and is still able to get the information needed to feel the difference in these surfaces,” he adds. “This tells us it’s not just the mechanoreceptors in the skin, but receptors in the ligaments, knuckles, wrist, elbow and shoulder that could be enabling humans to sense minute differences using touch.”

Starting from these results, researchers and engineers could develop artificial-skin systems that feel the world around us very much as our own biological skin does. Alternatively, the paper could form the foundation for systems that can recreate the feel of any material, which would level up virtual reality systems dramatically.

The paper “Human ability to discriminate surface chemistry by touch” has been published in the journal Materials Horizons.


Human-computer interface relays touch out of thin air

(c) Bristol University, UK


Using ultrasound radiation, researchers at the University of Bristol (UK) have devised a computer interface that allows users to interact with a digital screen without touching it. Sure, the Kinect or Leap Motion do this already; the catch is that this system also provides haptic (touch) feedback. So, whenever a user traces a motion in front of the system, not only does the system react, it also relays feedback that the user senses as touch. The device was unveiled this week at the ACM Symposium on User Interface Software and Technology in Scotland.

Dubbed UltraHaptics, the system’s main advantage, the researchers claim, is that it allows the user to “feel” what is on the screen.

“UltraHaptics uses the principle of acoustic radiation force where a phased array of ultrasonic transducers is used to exert forces on a target in mid-air,” co-developer Tom Carter explained. “Haptic sensations are projected through a screen and directly onto the user’s hands.”

The system works by means of an ultrasound transducer array positioned beneath an acoustically transparent display, which doesn’t interfere with the haptic interaction. The multiple transducers collectively emit very high-frequency sound waves. When all of the sound waves meet at the same location at the same time, they create sensations on the skin. By creating multiple simultaneous feedback points, and giving them individual tactile properties, users can receive localized feedback associated with their actions. A Leap Motion device is used to track hand movements.
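The core trick of a phased array is simple to express in code: fire each transducer early by its travel time to a chosen focal point, so every wavefront arrives there in phase and the radiation forces add up. The sketch below is illustrative only; the grid geometry and the 40 kHz frequency are typical values for airborne ultrasound, not UltraHaptics’ published specifications.

```python
import numpy as np

c = 343.0            # speed of sound in air (m/s)
f = 40_000.0         # typical airborne ultrasonic transducer frequency (Hz)

# 16 x 16 grid of transducers, 1 cm pitch, lying in the z = 0 plane.
pitch = 0.01
coords = (np.arange(16) - 7.5) * pitch
tx, ty = np.meshgrid(coords, coords)
tz = np.zeros_like(tx)

focus = np.array([0.0, 0.0, 0.20])   # focal point 20 cm above the array

# Distance from each element to the focus determines its firing delay:
# farther elements fire first, so all wavefronts coincide at the focus.
dist = np.sqrt((tx - focus[0])**2 + (ty - focus[1])**2 + (tz - focus[2])**2)
delays = (dist.max() - dist) / c                  # seconds each element waits
phases = (2 * np.pi * f * delays) % (2 * np.pi)   # equivalent phase offsets
print(phases.shape)  # (16, 16): one phase offset per transducer
```

Steering the feedback point around the user’s hand is then just a matter of recomputing these delays for a new focal position.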

Finally, the research team explored three new areas of interaction possibilities that UltraHaptics can provide: mid-air gestures, tactile information layers and visually restricted displays, and created an application for each.

A video demonstration of the UltraHaptic system can be viewed below.

Tom Carter, PhD student in the Department of Computer Science’s BIG research group, said: “Current systems with integrated interactive surfaces allow users to walk-up and use them with bare hands. Our goal was to integrate haptic feedback into these systems without sacrificing their simplicity and accessibility.

“To achieve this, we have designed a system with an ultrasound transducer array positioned beneath an acoustically transparent display. This arrangement allows the projection of focused ultrasound through the interactive surface and directly onto the users’ bare hands. By creating multiple simultaneous feedback points, and giving them individual tactile properties, users can receive localised feedback associated to their actions.”

UltraHaptics: Multi-Point Mid-Air Haptic Feedback for Touch Surfaces, Thomas Carter, Sue Ann Seah, Benjamin Long, Bruce Drinkwater, Sriram Subramanian, UIST 2013, 8-11 October, St Andrews, UK.


Next level user-interface tech: recording and rendering human touch

Prototype tactile recording and display system for real-time reproduction of touch (credit: UCSD)


Since touchscreen interfaces were introduced at mass scale, the way people interact with technology has arguably been revolutionized. Still, there is much more to be explored in how the sense of touch can be manipulated to enrich user interaction with tech. Recording and playing back information pertaining to the sense of sound (audio files) or sight (photo and video files) has been successfully incorporated in technology for more than a century now. The other senses, however, seem to have been bypassed by the digital revolution, in part because they’re so much more difficult to reproduce. This might change in the future, ultimately leading to user interfaces that offer a complete, real-life experience that stimulates all human senses.

Recent research by scientists at the University of California, San Diego explores the sense of touch in user interfaces. The researchers devised a set-up comprised of sensors and sensor arrays capable of electronically recording and playing back human touch. They envision far-reaching implications for health and medicine, education, social networking, e-commerce, robotics, gaming, and military applications, among others.

In their recently demonstrated prototype, an 8 × 8 active-matrix pressure sensor array made of transparent zinc-oxide (ZnO) thin-film transistors (TFTs) records the contact and pressure produced by the fingertips of a user. The electrical signal is then sent to a PC, which processes the data and passes it to a tactile feedback display: an 8 × 8 array of diaphragm actuators made of an acrylic-based dielectric elastomer with the structure of an interpenetrating polymer network (IPN). The polymer actuators play back the touch recorded by the ZnO TFTs, as they can dynamically shape the force and level of displacement by adjusting both the voltage and charging time.
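A minimal sketch of that record-and-playback loop is shown below: each 8 × 8 frame of pressure readings is mapped to an 8 × 8 frame of actuator drive voltages. The calibration constants and the simple linear mapping are assumptions for illustration; the actual prototype shapes its output by tuning both voltage and charging time.

```python
import numpy as np

V_MIN, V_MAX = 0.0, 3000.0   # hypothetical actuator drive range (V)
P_MAX = 50.0                 # hypothetical full-scale pressure (kPa)

def frame_to_voltages(pressure_frame: np.ndarray) -> np.ndarray:
    """Map one 8x8 pressure frame (kPa) to actuator voltages, clipped to range."""
    scaled = np.clip(pressure_frame / P_MAX, 0.0, 1.0)
    return V_MIN + scaled * (V_MAX - V_MIN)

# A recorded touch is just a sequence of frames; it can be stored in memory,
# replayed later, or streamed to another user's tactile display.
recording = [np.random.default_rng(i).uniform(0, P_MAX, (8, 8)) for i in range(30)]
playback = [frame_to_voltages(frame) for frame in recording]
print(playback[0].shape)  # (8, 8): one drive voltage per actuator
```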

“One of the critical challenges in developing touch systems is that the sensation is not one thing. It can involve the feeling of physical contact, force or pressure, hot and cold, texture and deformation, moisture or dryness, and pain or itching. It makes it very difficult to fully record and reproduce the sense of touch,” the researchers write in the paper documenting their work, published in the journal Scientific Reports.

In addition to simply playing back touch, the touch data can be stored in memory and replayed at a later time or sent to other users. “It is also possible to change the feeling of touch, or even produce synthesized touch by varying temporal or spatial resolutions,” said Deli Wang, a professor of Electrical and Computer Engineering (ECE) in UC San Diego’s Jacobs School of Engineering and senior author of the paper. “It could create experiences that do not exist in nature.”

The technology is still in its incipient form, so it’s better to view the scientists’ work as a proof of concept instead of a complete solution. The technology needs to be drastically improved to reproduce all the intricate parameters that influence the sense of touch, yet this work signals that we’re on our way there. What about smell and taste? Well, that should be really interesting to see. [READ: Music for the nose: the olfactory organ]

  • Siarhei Vishniakou et al., Tactile Feedback Display with Spatial and Temporal Resolutions, Scientific Reports, 2013, DOI: 10.1038/srep02521 (open access)

Thought the touchscreen was innovative? Get ready for a complete user interface revamp


Touchscreen technology has been in use for many years, but when the first iPhone came out some six years ago, it totally changed front-end design and user interfaces, because it brought the technology to the people, the common folk. You didn’t have to be a scientist to own or use a touchscreen device – it’s dead simple and fun. Just a few years later, everything is touch interface, and I’m not exclusively talking about smartphones or tablets here; even washing machines and electric shavers have been included. Touch interfaces have definitely raised the bar for interactivity. Have they really revolutionized UI design, turning it upside down, however? Well, absolutely not, if you compare them with what’s just around the corner – the Leap Motion system.

Imagine a device which only costs $70, can be plugged in via USB, and turns your notebook, tablet, or TV into a virtual playground with six degrees of freedom. Minority Report? No, this is better! If you haven’t watched the company’s presentation demos, do it now and prepare to be amazed. This might just be the biggest thing for user interfaces since the mouse!

Sure, you might cry that this is extremely similar to the Kinect, but believe me, after scouring hundreds of blogs for more information on this piece of gold, the Leap is a lot more powerful. It’s 200 times more accurate than the Kinect, to the point that it can distinguish all ten of your fingers individually and track your movements to 1/100th of a millimeter.

The device’s working principle is fairly simple in nature, if you choose to ignore the complex yet beautiful development work that went into the back-end. A number of camera sensors map out a workspace – a 3D space in which you operate as you normally would, with almost none of the Kinect’s angle and distance restrictions. For the current prototype, the workspace is about three cubic feet, but the only restriction here is the sensors – want a bigger workspace? Just choose bigger sensors.

UI Revolution

Yes, you can become a FruitNinja master with this incredible motion sensor system, but trust me, this is meant for something a lot deeper. The possibilities at stake are virtually limitless. Imagine browsing websites or organizing your files in the most natural of manners. You don’t need to learn to place two fingers apart, then drag to zoom, and such. The system ensures that everything you’d want to do naturally in the real world can be transferred virtually. If you want to draw, you just use your finger or pull out a pencil and start drawing. Want to pick up a ball in a video game? Just mimic the grabbing action. Everything can be manipulated. Imagine a special app based on the Leap for remote surgery. Amazing!

Over the past few years, thousands of developers have tried their best, and some have released incredibly useful and creative apps to the public via Apple’s App Store or the Android Market. Leap Motion officials stated that their system’s SDK will be able to do much more, however, and the possibilities appear to be limited only by your imagination.

The Leap is slated for release sometime between December and February. So, what do you guys think?

Woman can literally feel sound after stroke

After she suffered a stroke, a 36-year-old professor started to feel sounds. In the beginning, she didn’t know what was happening when a radio announcer’s voice made her tingle, or when she became physically uncomfortable during a flight.

Neuroscientists at the City College of New York and the Graduate Center of the City University of New York believe they understand what happened in this particular case of synesthesia, after enhanced brain scans showed that new links had grown between the brain’s auditory region, which processes sound, and its somatosensory region, which handles touch.

“The auditory area of her brain started taking over the somatosensory area,” says Tony Ro, one of the researchers.

Based on this peculiar case, the scientists who researched the woman’s condition presented a paper at a recent meeting of the Acoustical Society of America, where they stated that a deep connection between hearing and touch is nested inside each of us. Their theory is built around vibrations and how they trigger certain nerves in both the touch and hearing sensory systems of the human body. A vibrating phone will be felt by the skin, while a ringing phone will create sound waves that vibrate the eardrum.

Bearing this in mind, researchers have shown that hearing a sound can boost touch sensitivity, which Ro calls the mosquito effect. The name comes from an obvious example in which the bug’s buzz makes our skin prickle, meaning that you’ll actually be able to sense the mosquito touching your skin better, according to a 2009 paper he published in Experimental Brain Research.

Further MRI scans of people’s brains have shown that the auditory region can activate during a touch, and some speculate that chunks of the brain specialized in understanding frequency may play a role in crossing the wires. How the two senses come together in the end still puzzles scientists.

Still, imagine being able to feel all sorts of things based on the vibrations of the various kinds of music you listen to. How would listening to Lady Gaga differ from Led Zeppelin, Nat King Cole, or classical music?

Touch and sight – more connected than previously thought

What you see may be very much related to what you’ve just felt. Even though we were taught at school that each sense is processed in a separate area of the brain, it seems that this theory may be wrong and that there is a lot more to understand about the way the human brain works.

As an example, a light ripple of pins moving up the fingertip tricked the subjects of a study into perceiving lines on a screen as moving down, and the other way around. So, there is something going on.

Some recent studies have shown that the way our brain works – in this case, processes the most important senses – is far more complex than was thought for decades. First, it was discovered that hearing and seeing are related to each other, but now it seems that this is far from being the end of the story.

To take the theory even further, scientists used a trick of perception called the aftereffect, a phenomenon that occurs, for example, when watching a waterfall. Staring at it for some time will eventually make one perceive the stationary rocks as moving up. This happens because the neurons in charge of ‘down’ motion get tired while the ‘up’ ones are still fresh and create this impression.

But what interested the researchers was the aftereffect caused by the sense of touch. A small, stamp-sized gadget made of 60 pins arranged in rows was used throughout the study, as participants were asked to rest a finger on top of it. Some of the rows were raised at different times, creating a gentle prodding motion directed away from or toward the person for ten seconds. Then, the subjects were asked to look at a computer screen on which they could see common patterns of white and black horizontal lines. The lines were constantly moving and switching places, but their overall movement could be characterized as upward or downward.

A simple task, isn’t it? Well, not when the brain has a lot of stimuli to cope with. This is why the subjects who had felt the pins on the little device moving up perceived the lines on the screen as going down, and the other way around. This is the visual aftereffect. A touch aftereffect can also be induced, as watching lines going up on a screen made the participants feel the pins as going down.

It now seems quite clear that sight and touch are connected to a larger extent than was expected. The next step is to find where exactly in the brain the connection is created.

source: Body & Brain