

Part of the brain subjectively encodes information related to human emotions

Primates are among the few mammals with a dedicated neural system for processing faces, one that demands considerable processing power and energy. While some people resemble each other, no two humans have exactly the same face (not even identical twins). Depending on how much time they spend in public, some people see, and thus analyze, thousands of faces each day. Recognizing emotions is an even more complicated process, one that challenges even the brain.


Photo: Ralph Adolphs / Caltech. Arrows indicate responses from single neurons.

Researchers at Caltech, Cedars-Sinai Medical Center, and Huntington Memorial Hospital in Pasadena recorded brain activity in the amygdala, the region of the brain responsible for encoding information related to emotional reactions. Their findings suggest that some brain cells register emotional face patterns subjectively (i.e. according to the viewer’s own judgment), rather than through an entirely objective readout of the emotion actually displayed. This is the first time neurons in the amygdala were shown to encode the subjective judgment of emotions shown in face stimuli, rather than simply their stimulus features.

[RELATED] Remembering faces is influenced by genetics


Participants in the study were shown images of partially obscured faces and asked to decide which emotion they were showing. They did not see the red circles shown here to outline the areas where the faces appeared. Photo: Ralph Adolphs / Caltech

The researchers recorded from over 200 single neurons in the amygdalae of 7 epilepsy patients who had surgically implanted depth electrodes. The patients’ brain activity was recorded while they were shown images of partially obscured faces showing either happiness or fear, and each participant was asked to judge which of the two emotions was shown. Here’s what the authors report:

“During trials where subjects responded correctly, we found neurons that distinguished fear vs. happy emotions as expressed by the displayed faces. During incorrect trials, these neurons indicated the patients’ subjective judgment. Additional analysis revealed that, on average, all neuronal responses were modulated most by increases or decreases in response to happy faces, and driven predominantly by judgments about the eye region of the face stimuli,” from the abstract of the paper published in the Proceedings of the National Academy of Sciences.

What this means is that the amygdala doesn’t necessarily respond to what’s actually there in the world, but to what SEEMS to be there, after it passes an internal filter. Things become more interesting when you consider that the amygdala is linked with a number of psychiatric conditions, such as depression and autism. Many of these afflictions might be due to a skewed perception of the patient’s surroundings. That doesn’t mean the amygdala alone is responsible for all of this.

“Of course, the amygdala doesn’t accomplish anything by itself. What we need to know next is what happens elsewhere in the brain, so we need to record not only from the amygdala, but also from other brain regions with which the amygdala is connected,” says Shuo Wang, a postdoctoral fellow at Caltech and first author of the paper.


Recognizing faces may result from a specific brain mechanism

Over a lifetime, a person sees hundreds of thousands of faces. Of course, you won’t remember all of them; in fact, you might find yourself embarrassed when you fail to recognize a distant relative at a family gathering. Things like this happen, but you shouldn’t be upset with your brain. In fact, it does a pretty fantastic job considering it needs to process distinct and subtle cues every time you see a face. How and where does this process take place in the brain, however? A recent study from Harvard psychologists suggests that face processing is a standalone mechanism, separate from object processing.

Some scientists argue that face processing relies on the same brain mechanisms used in other areas of visual expertise – the same ones we use to recognize a cup or various animals. To test this hypothesis, Harvard and Dartmouth researchers ran tests on patients suffering from prosopagnosia, also known as “face blindness”.

Seeing, but not seeing


Constantin Rezlescu, a postdoctoral fellow in psychology is one of the people involved in the face brain mechanics study at Harvard. (c) Jon Chase/Harvard Staff Photographer

People with this condition can see the individual features – the eyes, the nose, the mouth – but they cannot put them together into a recognizable whole, and they do not recognize gestures or emotions. Until recently, it was thought that very few people suffer from prosopagnosia. The condition has traditionally been studied in individuals who acquire the disorder following neurological damage (typically from stroke or head injury), and a handful of case studies were reported in the literature in the 20th century. However, it has recently become clear that many more people suffer from prosopagnosia without experiencing neurological damage.

[ALSO READ] Face recognition gene found

 

This form of the disorder is commonly referred to as “developmental” or “congenital” prosopagnosia, and these individuals simply fail to develop normal face processing abilities despite normal intellectual and perceptual functions. Developmental prosopagnosics have suffered from the face recognition impairment for most of their lives, perhaps since birth. Recent evidence suggests there may be a genetic contribution to developmental prosopagnosia, and several case studies report at least one first-degree relative who also suffers from the face recognition impairment.

The greebles

Clearly, prosopagnosia patients are unable to recognize faces, but the present research sought to find out something different. The researchers asked the volunteers to distinguish between 20 computer-generated objects designed to engage the brain in the ways faces do.

Commonly used in psychological research, these objects are called greebles. They can be grouped into families based on their body types and share a limited number of common features – in a sense they’re all alike, yet distinct if you catch the subtle markings. So, to sort the greebles, the participants had to recognize these subtle differences in much the same way a person normally distinguishes specific facial features.

“What we wanted to do was to test a key prediction of the ‘expertise’ hypothesis,” said Constantin Rezlescu, a postdoctoral fellow in psychology and the study’s first author. “The expertise hypothesis predicts that when there is impairment in facial processing, you should also see impairment in processing other objects of expertise, because if the mechanisms are the same, any damage should affect both faces and other objects. Our findings, however, show a clear dissociation between participants’ ability to recognize faces and their ability to recognize other objects.”

A place for faces

Even people who don’t suffer from prosopagnosia have a tough time distinguishing greebles. It’s quite easy to sort the various objects into their respective families, but distinguishing between individual greebles is much harder. The prosopagnosia participants scored the same at distinguishing greebles as the control group, which had no problem distinguishing faces. The participants, however, scored much lower than the controls when asked to recognize faces.

“What we found is that prediction — which is a fundamental prediction of the expertise hypothesis — does not hold,” Rezlescu said. “That provides indirect evidence that there may be some specific mechanism for processing faces, although it doesn’t prove it directly. Our conclusion is that the expertise hypothesis, at least that relying on greeble studies, is false.”

In other words, the question of whether face recognition is an independent brain mechanism or is coupled with other functions remains open, yet the present findings suggest that prosopagnosia is the result of damage to a brain mechanism devoted specifically to face processing.

“In the real world, you may have experience for 10 years or more with objects that you become an expert on,” he said, adding: “But it is important to note that a great deal of the evidence that was claimed to support the expertise hypothesis comes from studies involving greebles, and what we found is that cannot be true.”

The results were reported in a paper published in the journal Proceedings of the National Academy of Sciences.

Emotion detectors could make driving safer

Technology and feelings

Technology has gotten pretty good at understanding how we feel, being essentially able to read the seven universal emotions on a person’s face: fear, anger, joy, sadness, disgust, surprise, or suspicion. This has become useful in medicine and psychology, marketing, police investigations, and more recently… driving safety.

EPFL researchers, in collaboration with PSA Peugeot Citroën, have developed an on-board emotion detector based on the analysis of facial expressions. They’ve tested the prototype and reported extremely promising results.

It’s not easy to measure drivers’ emotions in a car, especially in a non-invasive way. To work around this problem, the scientists let the driver’s face do the work for them: researchers in EPFL’s Signal Processing 5 Laboratory (LTS5) teamed up with Peugeot and adapted a facial detection device for use in a car, using a totally non-invasive infrared camera placed behind the steering wheel.

Detecting irritation

The main emotion you want to detect in drivers is irritation; it’s this emotion that makes drivers more reckless and pushes them toward hasty, unsafe decisions. The problem is that everyone expresses it differently – small gestures, a nervous tic, a slight clenching of the jaw, even an apparently impassive face. To simplify the task at this stage of the project, Hua Gao and Anil Yüce, who spearheaded the research, chose to track only two expressions indicative of the driver’s state: anger and disgust.

To do this, they developed a two-stage test. First, the system was taught to identify the two emotions using a series of photos of subjects expressing them. Then the same exercise was carried out using videos – technology that has already been used in several other areas, so nothing revolutionary here. The images were captured both in an office setting and in real-life situations, in a car made available for the project.
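EPFL has not published its model, but the two-stage idea – learn expressions from still photos, then apply the classifier frame by frame to video – can be sketched with toy data. Everything below (the synthetic features, the nearest-centroid classifier, the majority vote over frames) is an illustrative assumption, not the actual LTS5 system.

```python
# Hypothetical sketch of a two-stage expression classifier: train on
# still images first, then classify video frame by frame and vote.
# Feature vectors here are synthetic stand-ins for facial landmarks.
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: "photos" -- one feature vector per still image.
# Class 0 = anger, class 1 = disgust (two synthetic clusters).
photos = np.vstack([rng.normal(0.0, 1.0, (100, 8)),
                    rng.normal(3.0, 1.0, (100, 8))])
labels = np.array([0] * 100 + [1] * 100)

# A minimal nearest-centroid classifier stands in for the real model.
centroids = np.array([photos[labels == c].mean(axis=0) for c in (0, 1)])

def classify(x):
    """Assign a feature vector to the nearest class centroid."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

# Stage 2: "video" -- classify each frame, then vote over the clip.
clip = rng.normal(3.0, 1.0, (30, 8))          # 30 frames of "disgust"
frame_preds = [classify(f) for f in clip]
clip_pred = max(set(frame_preds), key=frame_preds.count)
print(clip_pred)  # expected: 1 (disgust)
```

The frame-level vote is one simple way to stabilize noisy per-frame predictions; a real system would likely use a stronger classifier and temporal smoothing.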

Overall, the system performed well, detecting irritation in the majority of cases. When it failed, it did so because of the great variety of human expressions – something that can be improved over time. Further research aims to make the system work in real time.

This test also worked in conjunction with another project, which aims to measure how tired drivers are by tracking the percentage of eyelid closure. The LTS5 is also working on detecting other states in drivers’ faces, such as distraction, and on lip reading for use in voice recognition. The researchers haven’t yet announced what they plan to do once they get better at detecting these expressions, but it’s pretty clear that estimating how annoyed and tired a driver is can work wonders for driving safety.
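The eyelid-closure measure mentioned above is usually reported as the fraction of time the eyes are nearly shut over a window of frames (commonly called PERCLOS). Here is a minimal sketch; the 0.2 "closed" threshold and the per-frame openness values are assumptions for illustration, not data from the EPFL project.

```python
# Hedged sketch of an eyelid-closure drowsiness measure (PERCLOS-style):
# the fraction of frames, over a window, where the eyes count as closed.
def perclos(openness, closed_below=0.2):
    """Fraction of frames where eye openness falls below a threshold.

    openness     -- per-frame eye-openness estimates in [0, 1]
    closed_below -- openness level treated as 'eyes closed' (assumed)
    """
    closed = sum(1 for o in openness if o < closed_below)
    return closed / len(openness)

# An alert driver blinks briefly; a drowsy one keeps eyes near-closed.
alert  = [0.9, 0.85, 0.1, 0.9, 0.95, 0.9, 0.88, 0.92, 0.9, 0.87]
drowsy = [0.3, 0.15, 0.1, 0.05, 0.4, 0.1, 0.12, 0.6, 0.08, 0.1]

print(perclos(alert))   # 0.1 -- one blink frame in ten
print(perclos(drowsy))  # 0.7 -- eyes closed most of the time
```

A real system would estimate openness from the infrared camera feed and slide this window continuously, alerting when the ratio stays high.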


When interacting with other people, we first notice race and gender

Harvard researchers have found a brain region in which patterns of neural activity change when people look at black and white faces, and at male and female faces. (c) Harvard University


What’s the first thing you notice when you look at a person? Is it the shoes? The eyes? The nose? The mouth? It’s one thing to consciously notice a feature and another to passively acquire data, something the brain does constantly. Harvard researchers have found that the first things the brain registers when interacting with other people are race and gender.

“We found that a brain region called the fusiform face area, or the FFA for short, seems to play a key role in differentiating faces along these two dimensions,” said Contreras, the study’s first author, who earned his doctorate in psychology. “When we studied the patterns of activation in this region, we found they were different for black and white faces, and for female and male faces.”

To measure how the brain acquires and processes these features, the researchers turned to functional magnetic resonance imaging, or fMRI, a technique that allows researchers to monitor changes in blood flow in the brain in real time. Participants entered the scanner and were shown various portraits on a computer screen. For some faces they had to quickly identify whether the face was male or female, and for others whether it was black or white.

“We take images every few seconds,” Contreras explained. “Using statistical analysis, we can identify patterns of neural activity that correspond to different social categories. We could then look for differences in those patterns between the faces of blacks and whites, and between the faces of men and women.

“We also found evidence that, when we asked participants to pay attention only to the sex of a person, this region was still recognizing race. When we told them to pay attention to race, the FFA was still recognizing sex, so it appears as though this region is constantly categorizing faces by sex and race.”
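The analysis Contreras describes – finding voxel-activity patterns that differ between social categories – can be illustrated in miniature with a split-half correlation test. The voxel count, noise level, and "patterns" below are all synthetic assumptions; real fMRI decoding involves far more preprocessing and statistical care.

```python
# Toy version of fMRI pattern analysis: simulate voxel responses in a
# region, then check that activity patterns for two face categories can
# be told apart by comparing within- vs between-category correlations.
import numpy as np

rng = np.random.default_rng(1)
n_voxels = 50

# Each category has a stable underlying voxel pattern (synthetic).
pattern_a = rng.normal(size=n_voxels)   # e.g. responses to male faces
pattern_b = rng.normal(size=n_voxels)   # e.g. responses to female faces

def measure(pattern):
    """One noisy 'scan' of a pattern (measurement noise added)."""
    return pattern + rng.normal(scale=0.5, size=n_voxels)

def corr(x, y):
    """Pearson correlation between two voxel patterns."""
    return float(np.corrcoef(x, y)[0, 1])

# Two scans of the same category should correlate more strongly with
# each other than scans of different categories do.
within  = corr(measure(pattern_a), measure(pattern_a))
between = corr(measure(pattern_a), measure(pattern_b))
print(within > between)  # expected: True
```

If the within-category correlation reliably exceeds the between-category one, the region's pattern carries information about the category, which is the logic behind the reported result.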

So the brain tracks both of these key aspects at the same time. This suggests that these traits are extremely important, and as always with such questions, the answer likely lies in evolutionary biology. One reason might be that it can be important to know the sex and race of other people, especially in contexts in which those differences should change the way you interact with them.

“Sex and race can be important things to know about another person, so it would make sense that as soon as you see another person, you need to figure out the social categories to which they belong,” Contreras said.

“What’s interesting is that the FFA is also believed to be involved in some aspects of processing identity,” he added. “Obviously, characteristics that are inextricably linked to you, like your race and your sex, are part of identity. Other scientists have shown that we perceive identity by perceiving the sex and race of faces, and what we’re showing here is a sort of neural correlate of that. If this region is responsible for identity processing, it might make sense that it’s also responsible for recognizing race and sex differences.”

Findings were reported in the journal PLOS ONE.

[NOW READ] Eye colour and shape influence trustworthiness

Human face perception not limited only to humans

In the theoretical and absurd case of a pigeon going on a police identification line, do you think it would use the same processes you’d use? If your answer is “yes”, you may seem a bit strange… but you’re right. A study published by researchers from the University of Iowa found that pigeons recognize a human face and its expression pretty much the same way a human does.

This raises the rather interesting question of how they were actually able to do this. The pigeons were shown photographs of human faces that varied in identity as well as emotional expression. In the first experiment, the pigeons – chosen because they have very good eyesight and are extremely distant relatives of humans – were studied to see how they perceive similarities among faces with the same identity and emotion.

In the second, key experiment, the birds’ task was a little harder: they had to categorize the pictures by identity or expression, and the researchers were quite surprised to find that it was easier for the pigeons to ignore identity when they recognized face emotion, according to Ed Wasserman, Stuit Professor of Experimental Psychology, and graduate student Fabian Soto, both of the UI College of Liberal Arts and Sciences Department of Psychology. Basically, it’s easier for them to identify a facial expression than a face.

“This asymmetry has been found many times in experiments with people and it has always been interpreted as the result of the unique organization of the human face processing system,” Soto said. “We have provided the first evidence suggesting that this effect can arise from perceptual processes present in other vertebrates. The point of the project is not that pigeons perceive faces just as we do or that people do not have specialized processes for face perception. Rather, the point is that both specialized and general processes are likely to be involved in people’s recognition of faces and that the contributions of each should be carefully determined empirically,” he added.

This may force scientists to rethink the current understanding of how unique human cognitive processes are, and how these processes might interact with other, more general ones when it comes to difficult tasks such as face recognition.

Why some people never forget a face while others can’t even remember who you are

Have you ever felt insignificant because someone didn’t remember your face (not to mention your name)? Have you ever freaked out because someone else remembered the exact time and date of your first meeting… three years earlier? Well, it’s not that you’re too unimpressive to be recognized later, and the people around you aren’t weirdos either. It’s just that some people can remember faces extremely well and others… can’t.

This skill varies from one individual to another: some are very good at it, while others can hardly remember the people they saw at a party the day before. About 2% of the population even seems to have “face blindness”, a condition called prosopagnosia. On the other hand, some have turned this into a real superpower, literally never forgetting a face they focus on.

To better assess this ability, the scientists gave test subjects standardized face recognition tests. The results suggest there is a lot more to it than simply having or not having this “talent”: what varies is the perception itself, as well as the spectrum of abilities one possesses.

Super-recognizers are able to recognize people far more often than they are recognized, so they have to pretend not to know who someone is in order to avoid awkward moments. These people’s stories can be truly amazing: they can recognize someone who had been shopping at the same supermarket on the same day three months earlier, even if they did not interact at all. One woman who took part in the study even recognized, on the street, a woman who had served her as a waitress in a restaurant in another town five years before the study. The woman confirmed that she had in fact worked there.

Super-recognizers have proved able to use their ability even when the person they had seen changed hair color or aged. The research is highly relevant for eyewitness testimony, and for hiring in jobs such as security or checking identification.

The evolution of our society has made this particular skill extremely necessary. In the past, a much smaller number of people interacted with each other; now, in our huge cities and communities, it is almost impossible to avoid unpleasant situations if you cannot remember people’s faces.