Each one of us falls into one of three information-seeking ‘personalities’

Knowing what people want to know, and why, can go a long way towards designing public information campaigns. However, it’s easier said than done. New research sheds some light on the topic, reporting on the criteria people rely on when deciding whether or not to seek out information on a subject.

Image via Pixabay.

According to the findings, at least in matters regarding their health, finances, and personal traits, people generally rely on one of three criteria: the emotional reaction they expect to have when presented with that information, how useful they consider said information will be to them, and whether or not it pertains to something they think about often. The team says each person falls into one of these three “information-seeking types”, and that people don’t tend to switch between them over time.

Knowing, why?

“Vast amounts of information are now available to individuals. This includes everything from information about your genetic make-up to information about social issues and the economy. We wanted to find out: how do people decide what they want to know?” says Professor Tali Sharot of University College London (UCL) Psychology & Language Sciences, co-lead author of the study. “And why do some people actively seek out information, for example about COVID vaccines, financial inequality and climate change, and others don’t?”

“The information people decide to expose themselves to has important consequences for their health, finance and relationships. By better understanding why people choose to get informed, we could develop ways to convince people to educate themselves.”

The study pools together data the researchers obtained over the course of five experiments with 543 research participants.

In one of the experiments, participants were asked to rate how much they would like to know about a certain topic related to their health — for example, whether they had a gene that put them at risk of developing Alzheimer’s, or one that strengthened their immune system. Another experiment followed the same pattern but substituted financial information (for example, what income percentile they fall into) for the health questions. A third asked them to rate how much they would like to know how their family and friends rated them on personal traits such as intelligence or laziness.

Later on, they were asked how useful they thought the information would be, how they expected to feel upon receiving the info, and how often they thought about the subject matter of each experiment.

Based on their responses during these five experiments, the team explains that people tend to seek out information based predominantly on one of three factors: expected utility, anticipated emotional impact, and relevance to the things they think about often. They add that this three-factor model predicted participants’ choices to seek or refuse information more accurately than a range of other models they tested.
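To make the idea concrete, here is a minimal sketch (not the authors' code) of how a three-factor model like this could predict seek-or-refuse decisions. The feature names and toy data below are hypothetical illustrations of the factors described above.

```python
# Hypothetical illustration: predicting whether a person chooses to
# receive a piece of information from three self-reported ratings.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [expected usefulness, anticipated feeling, how often the
# topic is thought about], all on arbitrary illustrative scales.
X = np.array([
    [0.9,  0.7, 0.8],   # useful, pleasant, frequently on their mind
    [0.2, -0.6, 0.1],   # not useful, unpleasant, rarely thought about
    [0.7, -0.2, 0.9],
    [0.1,  0.3, 0.2],
])
y = np.array([1, 0, 1, 0])  # 1 = chose to receive the information

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[0.8, 0.5, 0.7]]))  # [P(refuse), P(seek)]
```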

Some of the participants also repeated this series of experiments several times, at intervals of a few months. Based on their responses over time, the team explains that people routinely prioritize one of the three motives over the others, and stick with that motive over time and across topics. This, they argue, suggests that our motivators in this regard are ‘trait-like’.

These traits have a direct impact on our lives. The most obvious is that they draw us towards, or away from, certain topics and pieces of data. But they also have a bearing on our wellbeing. In two of the five experiments, participants were also asked to fill in a questionnaire that estimated their general mental health. Participants who sought out information about traits they thought about often showed more signs of positive mental health, the team explains.

“By understanding people’s motivations to seek information, policy makers may be able to increase the likelihood that people will engage with and benefit from vital information. For example, if policy makers highlight the potential usefulness of their message and the positive feelings that it may elicit, they may improve the effectiveness of their message,” says PhD student Christopher Kelly of UCL Psychology & Language Sciences, co-lead author of the study.

“The research can also help policy makers decide whether information, for instance on food labels, needs to be disclosed, by describing how to fully assess the impact of information on welfare. At the moment policy-makers overlook the impact of information on people’s emotions or ability to understand the world around them, and focus only on whether information can guide decisions.”

The paper “Individual differences in information-seeking” has been published in the journal Nature Communications.

Our daily commute has a direct impact on our productivity and job satisfaction

Our daily commute can reveal a lot about our productivity at work, according to new research.

Image via Pixabay.

New research at Dartmouth College showcases the impact our commute can have on our workday. The findings show how certain behavioral and physiological patterns we exhibit while commuting can be used to accurately predict job performance and employee satisfaction levels throughout the day.

The results are based on a year-long monitoring period of commuting workers prior to the outbreak of the COVID-19 pandemic.

Start of the day

“Your commute predicts your day,” said Andrew Campbell, the Albert Bradley 1915 Third Century Professor of computer science at Dartmouth, lead researcher and co-author of the study. “This research demonstrates that mobile sensing is capable of identifying how travel to and from the office affects individual workers.”

Data for the study was recorded through the smartphones and fitness trackers of 275 workers over a one-year monitoring period; participants were provided with a Garmin vivoSmart 3 activity tracker and a smartphone-based sensing app. The participants’ states were also recorded for 30 minutes before and after commuting. Most of these individuals (around 95%) drove to and from work, the team reports.

These devices were used to record a range of factors including physical activity, phone usage, heart rate, and stress levels. This body of data could be used to accurately predict workers’ productivity and satisfaction, the authors explain. The research could also help us improve workers’ quality of life and help them be more productive.

“We were able to build machine learning models to accurately predict job performance,” said Subigya Nepal, a PhD student at Dartmouth and lead author of the paper. “The key was being able to objectively assess commuting stress along with the physiological reaction to the commuting experience.”

Each worker’s day was assessed using ‘counterproductive work behavior’ and ‘organizational citizenship behavior’, two recognized criteria of job performance. The former is behavior that harms an organization’s overall efficiency, while the latter is beneficial. The baselines for each of these behaviors were set through regular, self-reported questionnaires filled in by participants.
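As a rough sketch of the approach described above (the paper's actual features and pipeline are not reproduced here), a classifier could be trained on commute-time sensing features to predict a performance label derived from those questionnaires. Everything below, including the feature names, is a synthetic illustration.

```python
# Synthetic illustration of predicting a job-performance label from
# commute sensing features; not the study's actual data or pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 275  # same participant count as the study; the data here is random

# Hypothetical features per worker: [arrival-time variability,
# mean commute heart rate, phone use during commute, pre/post stress]
X = rng.normal(size=(n, 4))
# 1 = "high performer", here tied to consistent arrivals and low stress
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n) < 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # held-out accuracy
```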

“Compared to low performers, high performers display greater consistency in the time they arrive and leave work,” said Pino Audia, a professor of Management and Organizations at the Tuck School of Business, a senior scientist on the study team, and a co-author of the study. “This dramatically reduces the negative impacts of commuting variability and suggests that the secret to high performance may lie in sticking to better routines.”

Apart from this, high performers tended to show more physiological markers of physical fitness and stress resilience. Low performers showed higher levels of stress before, during, and after their commutes, and tended to use their phones more while commuting.

This aligns well with previous research on the topic, the team explains. Such research found that stress, anxiety, and frustration felt by individuals during their commute can reduce their efficiency at work, increase levels of counterproductive work behavior, and lower their engagement with organizational citizenship behavior. However, the current study is the first to link commuting data directly with workplace performance.

“The insights from this proof-of-concept study demonstrate that this is an important area of research for the future of work,” said Campbell, co-director of Dartmouth’s DartNets Lab.

The study also found that the small percentage of participants who engaged in active commuting — such as walking to work — tended to be more productive during the day. Additionally, people tended to spend more time commuting back home than they did getting to work in the morning.

In the future, the team hopes that their findings can be used as a basis for new technology aimed at detecting and lowering commuter stress. Such interventions could include an app that offers suggestions for short stops, music, or podcasts aimed at improving a commuter’s emotional state.

The paper “Predicting Job Performance Using Mobile Sensing” has been published in the journal IEEE Pervasive Computing.

Ancient Aboriginal memory technique may be better than the ‘memory palace’ technique

In a new study, Australian researchers compared a memorization technique from Aboriginal culture to the established technique of the ‘memory palace’. Both fared better than conventional learning, but the former came out on top.

The Aboriginal memorization method involves visualization combined with storytelling elements.

There was a time when humans didn’t have access to written documents — let alone things like the internet. Everything was passed down orally, from person to person. It paid to have a good memory, and different cultures developed different methods of memorizing things.

Perhaps the best-known technique — and one that’s still used successfully today — is the method of loci or, as most people call it, the memory palace. The method hinges on the brain’s inclination to remember images better than other types of information. You take a place you know well (a house, a city, or a palace if you’re so inclined) and you “place” the facts you want to memorize throughout that layout. The layout then becomes the structure for the facts, and you recall them by revisiting it. Ancient Greeks and Romans used and described this method, and it’s still employed to this day by competitors in memory championships.

The Aboriginal culture in Australia developed a different approach. The culture relies heavily on oral storytelling: knowledge about navigation, tool use, and even political relationships was stored orally. The Aboriginal method of memorization also attaches facts to a landscape, but adds stories that describe the facts and their placement to further facilitate recall.

In the study, researchers compared the two methods by training medical students at a rural university to use one or the other. The study was led by Dr. David Reser of the Monash University School of Rural Health and Dr. Tyson Yunkaporta of Deakin University’s NIKERI Institute.

“Because one of the main stressors for medical students is the amount of information they have to rote learn, we decided to see if we can teach them alternate, and better, ways to memorize data,” Dr. Reser said.

The 76 medical students who participated in the study were split into three groups, each receiving 30 minutes of training: one was trained in the memory palace technique, another in the Aboriginal techniques, and a control group watched a regular video. The students were then asked to memorize a list of 20 common butterfly names and were quizzed on it 10 and 30 minutes later.

The students who used the Aboriginal techniques (locations + narrative) were 2.8 times more likely to remember the entire list. The memory palace group also showed a marked improvement (2.1 times), while the control group improved by about 50% (1.5 times). The Aboriginal technique group also reported more satisfaction, finding the process more enjoyable.

“Student responses to learning the Australian Aboriginal memory technique in the context of biomedical science education were overwhelmingly favorable, and students found both the training and the technique enjoyable, interesting, and more useful than rote memorization,” the authors explain.

Although it’s a small-scale study, it seems to confirm that alternative memorization techniques really can work. Furthermore, it suggests that approaches that are often overlooked (like the Aboriginal techniques) could be even more effective than established methods such as the method of loci.

“In particular the Australian Aboriginal method seems better suited to teaching in a single, relatively short instruction period,” the study also explains.

Even as we have access to an unprecedented wealth of information at our fingertips, there are still plenty of instances where being able to memorize facts on the go is very useful — med school is just one of them.

Researchers hope that once things return to a post-pandemic normal, they can incorporate such techniques into the students’ curriculum.

“This year we hope to offer this to students as a way to not only facilitate their learning but to reduce the stress associated with a course that requires a lot of rote learning,” Reser says.

The study was published in PLOS ONE.

Boys who play video games seem to have lower depression risk — but not girls

Boys who regularly play video games at age 11 are less likely to display depression symptoms when they’re 14. But this doesn’t seem to be the case for girls. Taken together, the findings suggest that screen time can have both positive and negative effects on mental health, and the relationship isn’t always straightforward.

Image in public domain.

Screens

If there’s one thing that has changed drastically in the past two decades, it’s computers. They used to be incredibly big, bulky, and not that capable — the very opposite of today’s machines. The smartphone in your pocket is millions of times more powerful than the equipment that sent people to the moon, and year after year, computers just keep getting more powerful.

As a result, screens have become almost ubiquitous in our society. You have the small screen you carry in your pocket, the big screen you work on, the even bigger screen you watch movies on, sometimes even screens on household appliances. Screens are everywhere, and we’re not really sure if that’s a good thing — especially when it comes to kids.

Ever since computers became mainstream, researchers have voiced concerns about screens, ranging from vision problems to mental health issues. But screens allow us to do very different things, with varying effects, and we should consider this instead of drawing blanket conclusions, researchers say.

“Screens allow us to engage in a wide range of activities. Guidelines and recommendations about screen time should be based on our understanding of how these different activities might influence mental health and whether that influence is meaningful,” says Aaron Kandola, lead author of the new study.

At first, the general idea seemed to be that video games can have a negative effect on mental health, making children more aggressive and worsening their mental health. But a flurry of recent studies paints a very different picture, showing not only that much of this damage was overstated, but that in many instances, casual video gaming can actually improve the mental health of children.

In the new study, the results are a mixed bag. A research team from UCL, the Karolinska Institutet (Sweden), and the Baker Heart and Diabetes Institute (Australia) reviewed data from 11,341 adolescents who are part of the Millennium Cohort Study, a nationally representative sample of young people who have been involved in research since they were born in the UK in 2000-2002. The teens were asked at age 11 how much time they spent on social media, video games, and other internet activities. Then, at age 14, they were asked about any depression symptoms.

After accounting for other factors that may affect the results (such as socioeconomic status, physical activity, or reports of bullying), the researchers looked at how depression symptoms were linked with screen habits. At age 14, boys who had played video games most days had 24% fewer depressive symptoms than boys who had played video games less than once a month. This effect, however, was not observed in girls. Although it’s not clear why, the researchers link it with different screen use patterns between boys and girls.

“While we cannot confirm whether playing video games actually improves mental health, it didn’t appear harmful in our study and may have some benefits. Particularly during the pandemic, video games have been an important social platform for young people,” adds Kandola, who is a PhD student at UCL Psychiatry.

Sitting down and social media cause problems

The researchers note that the positive effect on boys was only significant among those with low physical activity. We all know (or should know) that sitting down for prolonged periods is really bad for your health, but it’s important to know that sitting down can affect your mind as well as your body. Kandola’s previous research has shown that sedentary behavior seems to increase the risk of depression and anxiety in adolescents. So it could very well be that sitting down (and not screen time itself) is causing harmful effects.

“We need to reduce how much time children – and adults – spend sitting down, for their physical and mental health, but that doesn’t mean that screen use is inherently harmful.”

Social media also plays a role, and for girls, it seems to be particularly important. The researchers found that girls (but not boys) who used social media on most days at age 11 had 13% more depressive symptoms when they were 14. The same association was not found for more moderate use of social media. This fits with previous studies indicating that intense social media usage can increase feelings of loneliness and alienation.

The study only shows an association, not a cause-effect relationship. But it seems to suggest that not all screen time is equal, and video games can have a positive component. Researchers say that video games could support mental health, especially those that feature problem-solving, social, cooperative, and engaging elements. At any rate, reducing the amount of sedentary time seems to be a much healthier intervention than reducing screen time.

Senior author Dr. Mats Hallgren from the Karolinska Institutet has conducted other studies in adults, finding that active screen time (when you’re doing something like playing a game) seems to have a different effect on depression than passive screen time (watching something).

“The relationship between screen time and mental health is complex, and we still need more research to help understand it. Any initiatives to reduce young people’s screen time should be targeted and nuanced. Our research points to possible benefits of screen time; however, we should still encourage young people to be physically active and to break up extended periods of sitting with light physical activity,” says Hallgren.

The study was published in Psychological Medicine.

Training our playfulness can improve life satisfaction, help with depression

Playfulness is an acquired skill, a new paper reports.

Image via Piqsels.

People can become more playful by engaging in simple exercises, a new paper reports. The team found that greater levels of playfulness are also associated with higher life satisfaction. One week’s worth of playfulness exercises was enough for the researchers and participants to notice the effects.

The player of games

“Particularly playful people have a hard time dealing with boredom. They manage to turn almost any everyday situation into an entertaining or personally engaging experience,” explains Professor René Proyer, a psychologist at Martin Luther University Halle-Wittenberg (MLU).

If you know someone who gets bored easily but squeals in delight at word games or mental puzzles, or someone who’s particularly curious, chances are that person ranks high in playfulness. Far from making someone silly, irresponsible, or undependable, the trait has its upsides: the team cites past research from MLU finding that playful adults have an eye for detail, can easily understand and adopt new ways of looking at an issue, and have a knack for making even monotonous tasks enjoyable and interesting for themselves.

In other words, it’s a pretty good skill to have. And it’s one you can train.

Working with researchers from the University of Zurich in Switzerland and Pennsylvania State University (USA), the team recruited 533 participants, who were randomly assigned to one of three experimental groups or a control (placebo) group. Each day for a week, participants performed a 15-minute exercise before going to bed; each was assigned one of three exercises (or the placebo task) and was not aware that other groups were receiving different ones.

These were “think about three playful things that happened during the day”, to consider how they used playfulness “in a different way than they are used to (e.g. doing something playful at the workplace) [during the day]”, or to “reflect on playful experiences they have had over the day” either as observers or as actors. Those in the control group were asked to “write about their early memories from their childhood” for 15 minutes before going to bed.

All participants filled out a questionnaire before the experiment, at its conclusion, and on the second, fourth, and twelfth week after the intervention. The questions were designed to measure various personality traits.

“Our assumption was that the exercises would lead people to consciously focus their attention on playfulness and use it more often. This could result in positive emotions, which in turn would affect the person’s well-being,” explains Kay Brauer, a researcher in Proyer’s group at MLU and co-author of the study.

The team reports that the exercises “increased expressions in all facets of playfulness, had short‐term effects on well‐being, and ameliorated depression” for participants in the experimental groups: their questionnaire answers showed an increase in playfulness, along with a temporary increase in reported well-being.

The findings warrant further research into how training to be more playful can help us in our personal, romantic, and professional lives.

“Our study is the first intervention study on adults to show that playfulness can be induced and that this has positive effects for them,” says Proyer. “I believe that we can use this knowledge in everyday life to improve various aspects.”

“This does not mean that every company needs table tennis tables or a playground slide. However, one idea would be to allow employees to consciously integrate playfulness into their everyday work and, as a supervisor, to set an example for this kind of behavior.”

The paper “Can Playfulness be Stimulated? A Randomised Placebo‐Controlled Online Playfulness Intervention Study on Effects on Trait Playfulness, Well‐Being, and Depression” has been published in the journal Applied Psychology: Health and Well-Being.

The Dunning-Kruger effect, or why the ignorant think they’re experts

“The fool doth think he is wise, but the wise man knows himself to be a fool,” wrote Shakespeare in As You Like It. Little did he know that this line would perfectly encapsulate the spirit of the Dunning-Kruger effect.

A tongue-in-cheek graph showcasing the Dunning-Kruger effect.
Image via Wikimedia.

The Dunning-Kruger effect is a cognitive bias first highlighted in the literature by David Dunning and Justin Kruger in their (now-famous) 1999 study Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments.

The study was born from the shenanigans of one McArthur Wheeler who, in the broad daylight of a sunny April 19, 1995, decided to rob two savings banks in Pittsburgh. Wheeler carried a gun, but no mask. Surveillance cameras recorded him in the act, and the police put his picture on the local news, receiving multiple tips almost immediately.

When they went to perform the arrest, Mr. Wheeler was visibly confused.

“But I wore the juice,” he managed, before officers carried him away.

There’s no such thing as ‘foolproof’

At one point in his life, Mr. Wheeler learned that lemon juice can be used as an ‘invisible ink’: write something down on a piece of paper using lemon juice and you won’t see a thing — until you heat the paper up, making the scribblings visible. So, naturally, he covered his face in it and went to rob a bank, confident that his identity would remain hidden from the cameras as long as he didn’t come close to any sources of heat.

Still, credit where credit is due: Mr. Wheeler didn’t go in on blind faith. He actually tested his theory by taking a selfie with a Polaroid camera (there’s a budding scientist in all of us). For one reason or another — maybe the film was defective, we don’t know exactly why — the camera returned a blank image.

The news made the rounds, everybody had a good chuckle, and Mr. Wheeler was wheeled off to jail. The police concluded that he wasn’t crazy or on drugs, he actually believed his plan would work. “During his interaction with the police, he was incredulous on how his ignorance had failed him,” wrote Anupum Pant for Awesci.

David Dunning was working as a psychologist at Cornell University at the time, and the bizarre story caught his eye. Enlisting the help of Justin Kruger, one of his graduate students, he set out to understand how Mr. Wheeler could be so confident in a plan that was plainly stupid. The theory they developed is that almost all of us view our abilities in certain areas as above average and that most are likely to assess our skills as being much better than they objectively are — an “illusion of confidence” that underpins the Dunning-Kruger effect.

We’re all clueless

Between how you see yourself and how you really are.
Image via Pxfuel.

“If you’re incompetent, you can’t know you’re incompetent,” Dunning wrote in Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself.

“The skills you need to produce a right answer are exactly the skills you need to recognize what a right answer is.”

In the 1999 study (the first they carried out on the topic), the duo asked undergrads at Cornell a series of questions about grammar, logic, and humor (these were used to gauge the students’ actual skills) and then asked each to estimate the overall score they would achieve, and how that related to the scores of the other participants. The lowest-ranking students, they found, consistently and substantially overestimated their own ability. Students in the bottom quartile (lowest 25% by score) thought that they out-performed two-thirds of the other students on average (i.e. that they ranked in the top 33% by score).

A follow-up study the authors carried out at a gun range showed similar results. Dunning and Kruger used a similar methodology, asking hobbyists questions about gun safety and then asking them to estimate how well they had performed on the quiz. Those who answered the fewest questions correctly also wildly overestimated their mastery of firearm knowledge.

This isn’t specific to technical skills, but plagues all walks of human existence equally. One study found that 80% of drivers rate themselves as above average, which is statistically implausible: at most half of drivers can sit above the median. We tend to gauge our own relative popularity the same way.

It isn’t limited to people with low or nonexistent skills in a certain matter, either — it works on pretty much all of us. In their first study, Dunning and Kruger also found that students who scored in the top quartile (25%) routinely underestimated their own competence.

A fuller definition of the Dunning-Kruger effect would be that it represents a bias in estimating our own ability that stems from our limited perspective. When we have a poor or nonexistent grasp on a topic, we literally know too little of it to understand how little we know. Those who do possess the knowledge or skills, however, have a much better idea of where they sit. But they also think that if a task is clear and simple to them, it must be so for everyone else as well.

A person in the first group and one in the second group are equally liable to use their own experience and background as the baseline, simply taking it for granted that everyone is near that baseline. They both partake in the “illusion of confidence” — for one, that confidence is in themselves; for the other, in everyone else.

But perhaps not equally clueless

Image via Pxhere.

To err is human. But to confidently persist in erring is hilarious.

Dunning and Kruger did seem to find a way out of the effect they helped document. While we all seem to be equally likely to delude ourselves, there is one key difference between those who are confident yet unable and those able yet lacking in confidence — how we deal with and integrate feedback into our behavior.

Mr. Wheeler did try to check his theory. Yet he looked at the blank Polaroid he had just shot — a pretty big giveaway that something hadn’t worked out properly — and saw no cause for concern; the only explanation he accepted was that his plan worked. Later, he received feedback from the police, but this in no way, shape, or form managed to diminish his certainty; he was “incredulous on how his ignorance had failed him” even when he had absolute confirmation (being in jail) that it did fail him.

During their research, Dunning and Kruger found that good students would better predict their performance on future exams when given accurate feedback about their current score and their relative ranking in the class. The poorest-performing students would not change their predictions even after clear and repeated feedback that they were performing badly. They simply insisted that their assumptions were correct.

Jokes aside, the Dunning-Kruger effect isn’t a failing on our part; it’s simply a product of our subjective understanding of the world. If anything, it serves as a caution against assuming we’re always right and highlights the importance of keeping an open mind and a critical view of our own ability.

But if you’re afraid that you might be incompetent, you could check by seeing how feedback affects your view of your own work, knowledge, and skills, and how those relate to the people around you. If you truly are incompetent, you won’t change your mind, and this process will basically be a waste of time. But fret not — someone will tell you you’re incompetent.

And you won’t believe them.

Music can be used to estimate political ideology to an “accuracy of 70%”, researchers say

Do you like Pharrell’s “Happy”? Then you’re probably a conservative.

If you’ve ever tried to argue with a stranger on the Internet about politics (or with your family at Thanksgiving dinner), you’re well aware that it’s a recipe for disaster: political ideology is often so deeply rooted that it feels hard-wired into our DNA. Political ideology strongly influences our views on things like economics and social policies, but could it also have far-reaching influences on things we aren’t even aware of? The Fox Lab at New York University believes the answer is yes.

Their theory?

“Ideology fundamentally alters how we perceive a neutral stimulus, such as music,” said Caroline Myers, who presented her research at the 2018 Society for Neuroscience Meeting.

To examine the influence of political ideology on musical preference, the researchers had participants self-report their political ideology as liberal, conservative, or center, then listen to clips from 192 songs. For each clip, participants rated how familiar they were with the song and how much they liked or disliked it. The songs included the top two songs from the Billboard Top 40 for each year, iconic songs across certain genres, and a selection of more obscure music. Participants additionally ranked how often they believed they listened to certain genres of music — which led to some surprising findings.

For example, 60% of individuals who identified as liberals said that they listen to R&B music, and yet they weren’t any more familiar with these songs than any other group — and they actually liked R&B songs less than their conservative counterparts. Liberals also stated they listen to jazz but were not any more familiar with jazz music than the other groups.

They also looked at individual song preference across the various ideologies. Some songs did not showcase any major differences, with classical music being the least divisive of all the genres. The most polarizing song, however, was “Happy” by Pharrell Williams: conservatives love it, while liberals hate it. And there’s actually evidence of this in the real world — just two weeks ago, Pharrell sent President Donald Trump a cease-and-desist letter for using the song at one of his rallies.

While we can use this information to create a kick-ass playlist for our like-minded friends, is there any evidence that we can guess an individual’s political ideology purely based on musical taste? Surprisingly, the answer is yes.

“We were able to estimate individuals’ ideological leanings to an accuracy of 70%,” said Myers.
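The team hasn't published its classifier, but as an illustration of the general idea, a model could map each participant's song ratings to their self-reported ideology. The sketch below uses random stand-in data, so all names and numbers are purely hypothetical.

```python
# Purely hypothetical sketch: classifying self-reported ideology from
# ratings of 192 song clips. Random data stands in for the real survey.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_people, n_songs = 300, 192
X = rng.integers(1, 8, size=(n_people, n_songs))  # 1-7 liking ratings
y = rng.integers(0, 3, size=n_people)  # 0=liberal, 1=center, 2=conservative

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"accuracy: {clf.score(X_te, y_te):.2f}")  # ~chance on random data
```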

Myers is currently working on addressing the limitations of her study, such as the limited number of conservative participants due to heavy on-campus recruiting. However, the results are still striking, and quite concerning from a personal data standpoint. It goes to show that, even if we’re not actively posting personal details on social media, companies may still have other means to gain insight into our personal preferences — and we might not even be aware of it.

The genders kill differently — and one paper proposes it’s because of our ancient roles

The ways serial killers pick their victims may be more deeply entwined with our sexes’ evolutionary history than you’d assume, new research reports.

Knife.

Image credits Pixabay.

Male serial killers tend to treat their victims, who are often strangers, like prey — they “hunt” them, often stalking them before striking. Female serial killers, in contrast, tend to “gather” victims, targeting people they already know, often for financial gain. These are the findings of a new paper which looked into how our evolutionary history plays a part in shaping each gender’s more sinister pursuits and which, the team hopes, can help us catch those that dabble in them.

The wolf changes his coat, not his habits

“If a murder has been committed without a known suspect, you can sometimes use details of the crime to form a profile of what the perpetrator might look like,” said Marissa Harrison, associate professor of psychology at Penn State Harrisburg. “So if you know that men are more likely to commit a crime in a certain way and women are more likely to do it another, hopefully it can help investigators go down the correct path.”

While the public’s — and our editors’ — imaginations are rapt with serial killers, Harrison says that very little actual research has been dedicated to understanding them, likely because they’re (thankfully) quite rare. However, while working on a previous study, Harrison noticed a difference between the modus operandi of male and female serial killers — a difference that she was interested in exploring.

Together with her team, Harrison combed through reputable, reliable news sources like the Associated Press, Reuters, TV networks, and national and local newspapers for data about serial murders in the U.S. In the end, they compiled data on 55 female and 55 male serial killers and set about defining how their approach differed — and understanding why.

Male serial killers, the team found, were almost six times as likely to kill a stranger as their female counterparts. Conversely, female serial killers were nearly twice as likely to kill a person they already knew. Roughly 65.4% of male serial killers stalked their victims, while only 3.6% of female serial killers engaged in this behavior. Harrison believes these differences arise from the roles men and women tended to fulfill in ancient societies, which left their mark on each gender’s psyche.

“Historically, men hunted animals as prey and women gathered nearby resources, like grains and plants, for food,” she says. “As an evolutionary psychologist, I wondered if something left over from these old roles could be affecting how male and female serial killers choose their victims.”

“In our sample, there were two female serial killers who engaged in stalking-like behavior during their crimes. Interestingly, reports indicate that men were also involved in those crimes.”

Surprisingly, there also seems to be a difference in how we perceive each gender of serial killer. The team found that the media and public at large were pretty consistent in the patterns of nicknames they gave to male and female killers.

“Women were more likely to be given nicknames that denote their gender — like Jolly Jane or Tiger Woman,” Harrison adds. “Men were more likely to be given nicknames that suggest the brutality of their crimes, like the Kansas City Slasher.”

Harrison hopes that the findings can help investigators profile killers faster and with more accuracy in the future. She would also like to see the results applied to creating better prevention and treatment programs for violent offenders. However, it is very important for everyone to understand that, while evolutionary psychology may help explain the differences between male and female serial killers, these rules are not set in stone — and must not, in any way, be interpreted as saying that any one person is born to commit crimes.

“Evolution doesn’t mean you’re predetermined to do certain things or act a certain way,” Harrison said. “It means that it may be possible to make predictions about behavior based on our evolutionary past. In this case, I do believe that these behaviors are reminiscent of sex-specific behaviors or assignments in the ancestral environment.”

“And perhaps we can understand this better through an evolutionary lens.”

The paper “Sex differences in serial killers” has been published in the journal Evolutionary Behavioral Sciences.

If you think cats are antisocial, it’s probably you, new study concludes

Despite being intertwined with humans for millennia and enjoying immense internet popularity, the personality of cats remains surprisingly understudied. A new research paper concludes that if you feel cats are antisocial, it’s probably you.

It’s not Fluffy’s fault — it’s probably yours. Isn’t that right, Fluffy?

We all know how cats can be… demanding, independent, but also caring and attached. In fact, if you talk to different people about the personality of cats, you might get completely opposite answers. In a new paper, Kristyn Vitale, a postdoctoral scholar in animal behavior, writes that cats are “facultatively social animals” — they have flexible personalities and can be more or less friendly, depending on the situation. But how do they decide whether or not to be friendly and attentive? In the new study, Vitale suggests that cats are surprisingly attuned to our own personality — and to how much attention we pay them.

The study featured two experiments, both designed to gauge the cats’ sociability. In the first one, 46 cats (half at a shelter, half in their own homes) were placed in a room with a complete stranger who sat on the floor. For the first two minutes, the person ignored the cat; for the next two minutes, they called the cat by its name and petted it when it approached.

The second experiment involved only the pet cats, who went through the same procedure with their own owners. Both experiments revealed the same thing: cats spent much more time near the human when that person showered them with attention.

“Relatively little scientific research has been conducted on cat-human social behavior,” researchers write in the study. “Human attentional state and cat population influenced cat sociability behaviors,” they continue, adding that regardless of whether the cats knew the person or not, they were still more social when paid more attention.

This study indicates two things. First, cats are kind of like us: sure, some are friendlier while others tend to keep to themselves, but if you pay attention to them, they enjoy it and become friendlier. Second, since cats are territorial animals, there were big differences between how they behaved in their own home and in an unfamiliar place.

Cats don’t seem to be overly independent, researchers conclude:

“Although we have found that indeed a wide range of individual variation exists in domestic cats, we have not necessarily seen this bias toward independence.”

Some cats are definitely friendlier than others — and exactly why this happens is not clear, though it’s most likely a complex mixture of genes and previous experiences — but for cat owners, the takeaway is simple: if you want your cat to pay more attention to you, try paying more attention to it.

The study “The quality of being sociable: The influence of human attentional state, population, and human familiarity on domestic cat sociability” has been published in Behavioural Processes.

Materialism is on the rise — here’s how to avoid raising a materialistic child

Materialism is on the rise, and it’s linked to significant mental health problems, particularly in children. In a new study, researchers describe a way to curb kids’ materialistic tendencies: fostering their gratitude.

There’s a philosophical discussion to be had about whether materialism is an inherent problem or not but, practically speaking, it has been linked to a variety of mental health issues, including anxiety and depression, while side-effects such as selfish attitudes and behaviors essentially come by default. Needless to say, that’s not how you want your kid to end up — but there’s some good news. Researchers at the University of Illinois at Chicago have published a new study documenting parental tactics that can curb kids’ materialistic tendencies.

“Our findings show that it is possible to reduce materialism among young consumers, as well as one of its most common negative consequences (nongenerosity) using a simple strategy — fostering gratitude for the things and people in their lives,” writes researcher Lan Nguyen Chaplin, associate professor of marketing at the University of Illinois at Chicago and coauthor of the study.

After studying a nationwide sample of more than 900 adolescents aged 11 to 17, Chaplin’s team found that, unexpectedly, there’s a silver bullet when it comes to defeating materialism: fostering gratitude.

The teens were asked to fill out two short questionnaires. The first was a measure of materialism, assessing the value they placed on money and other material goods, while the second one was a measure of gratitude, assessing how thankful the teens are for the people and possessions in their life.

All adolescents were randomly assigned to keep a daily journal for two weeks, but one group was asked to record a gratitude journal (writing down who and what they were thankful for each day), whereas the second group (the control group) was asked to record their daily activities. After two weeks, the journals were collected. Participants were asked to fill out the same questionnaires and were also given ten $1 bills for participating, which they could keep for themselves or donate to charity.

The researchers note that participants who kept the gratitude journal exhibited lower materialism scores, and were also more likely to donate some of their $10 to charity.

“The results of this survey study indicate that higher levels of gratitude are associated with lower levels of materialism in adolescents across a wide range of demographic groups,” Chaplin added.

The team concludes by calling for further research to extend and enrich our understanding of how gratitude can benefit the development of positive values among children and adolescents.

The study was published in The Journal of Positive Psychology.

People can handle the truth — more than you think

A new study could have significant implications for interpersonal relationships and suggests that we should be blunt more often — even when we think we shouldn’t.

We’ve all been there at some point: a softening of the truth, avoiding an unpleasant discussion, or simply not telling your friend that he’s bad at karaoke — we do it to protect people’s feelings and to avoid awkward social interactions. But a new study explored the benefits and downsides of honesty in everyday relationships, finding that oftentimes, people can afford to be more honest than they think they can.

Chicago Booth Assistant Professor Emma Levine and Carnegie Mellon University’s Taya Cohen say that people overestimate the damage that direct honesty does, at least in the long run.

“We’re often reluctant to have completely honest conversations with others,” says Levine. “We think offering critical feedback or opening up about our secrets will be uncomfortable for both us and the people with whom we are talking.”

Obviously, honest conversations are not always pleasant, but is that cost worth it? Levine and Cohen carried out three experiments: in the first, participants were instructed to be completely honest with everyone in their lives for three days. In the second, participants had to be perfectly honest with a close relational partner, answering personal and potentially difficult discussion questions. In the third experiment, the roles were switched, and participants had to honestly share negative feedback with a close relational partner. For the purpose of this study, honesty was defined as “speaking in accordance with one’s own beliefs, thoughts and feelings.”

In all experiments, participants reported that things panned out much better than they expected. Both when giving and receiving feedback, the expectation was worse than the reality turned out to be. Furthermore, in the long run, it seems that people appreciate those who are honest, even if the initial result is on the negative side.

The results suggest that individuals tend to misunderstand the consequences of increased honesty: in other words, we fear that if we’re honest, people can’t take it, and that will lead to all sorts of problems. But according to this study, we overestimate this negativity.

Simply put, researchers say, being honest (even when critical) might not be as bad as we tend to think.

Of course, this is still just a preliminary study. Many aspects of the process weren’t controlled — for instance, how you say something can be just as important, and who you say it to (and in what context) can also matter.

Our society can definitely use a bit more honesty and who knows — we may even get rewarded for speaking our mind.

The study has been published in the Journal of Experimental Psychology.

Angry people are more likely to overestimate their intelligence — but that’s not the whole story

People who are quick to lose their temper are also more likely to overestimate their intelligence, a new study reports.

Anger and optimism

Not all negative emotions are created equal. Feelings of anxiety and depression are typically associated with a more negative outlook on life, but anger, one of the study authors explains, is more closely linked to optimism: people who are quick to anger are just as optimistic as people who are generally happy.

“In a recent project, I examined the relationship between anger and various cognitive functions. I noticed from the literature review that anger differs significantly from other negative emotions, such as sadness, anxiety or depression. Anger is more approach-oriented and associated with optimistic risk perception and generally optimistic bias,” said study author Marcin Zajenkowski of the University of Warsaw.

Zajenkowski wondered whether anger could influence how people perceive their own intelligence. So he carried out two studies with a total of 528 undergraduate students, assessing their anger, their intelligence, and their self-perceived intelligence. Participants undertook an array of two to four fluid intelligence tests (focusing on the ability to solve new problems, use logic in new situations, and identify patterns instead of relying on previously learned knowledge).

Researchers also evaluated the neuroticism and narcissism of the participants, looking for any associations and patterns.

The research revealed that anger was associated with an overestimation of one’s intelligence, though it was unrelated to one’s actual level of intelligence. In other words, if you lose your temper quickly, that doesn’t say anything about your intelligence — but it might say something about your self-perceived intelligence.

Interestingly, neuroticism, which was positively correlated with anger, tends to negatively correlate with self-assessed intelligence — so neuroticism acts as a suppressor for overestimating one’s intelligence.

However, this doesn’t really tell the whole story, due to a problem that’s all too familiar in psychology.

The WEIRD problem

WEIRD stands for Western, Educated, Industrialized, Rich, and Democratic (as in living in a democracy).

Psychology studies overwhelmingly rely on WEIRD participants, typically undergrads — 67% of American psychology studies use college students, for example — and this is a problem because undergrads aren’t really representative of the whole population.

It’s easy to understand why researchers do this: gathering a large enough sample is complicated, and studies don’t typically receive that much funding. Undergrads are on campus (so they’re easily available), they often enroll for little or no money, and they can be quite homogeneous as a group — which allows scientists to detect small differences.

So while the study has been peer-reviewed and highlights an intriguing association, it also comes with a significant caveat: it addresses a very particular subset of the population, which may not be representative of the broader picture.

Journal Reference: Marcin Zajenkowski and Gilles Gignac. “Why do angry people overestimate their intelligence? Neuroticism as a suppressor of the association between Trait-Anger and subjectively assessed intelligence.” https://doi.org/10.1016/j.intell.2018.07.003

Egotists’ brains just don’t care about the future, affecting their choices in life

The brains of altruists simply don’t function the same as those of egotistical individuals, a new paper reports — and it can influence our choices in profound ways.

London metro.

Image credits Mattbuck / Wikimedia.

Some people do actually worry about future consequences — for example, those of climate change — while others can’t be bothered with something if it doesn’t impact their well-being directly. Wanting to know if these differences arise from the brain, a team of researchers at the University of Geneva (UNIGE), Switzerland, took an MRI (magnetic resonance imaging) machine and peered into the brains of volunteers.

Their results suggest that “egotistical” individuals don’t use the area of the brain that allows us to imagine and insert ourselves into the distant future. Those deemed “altruistic,” on the other hand, registered heavy activity in the same area. These findings, the team reports, could be used to improve people’s ability to project themselves into the future and engage more of the public with issues such as climate change.

Too Long; Didn’t Relate

One of the most fundamental criteria from which individual concerns arise, the team notes, is whether somebody prioritizes their personal well-being or puts everybody on an equal footing. In order to encourage as many people as possible to engage in sustainable behavior, it is therefore necessary for them to feel that the consequences of climate change affect them as well. People who are more self-centered don’t give the issue much thought, believing the consequences and potential hardships are far removed from them, both in time and space.

But why is it that some people register climate change as a pressing concern, while others can’t even be bothered with it?

“We wondered what magnetic resonance imaging (MRI) could teach us about how the brain processes information about the future impact of climate change, and how this mechanism differs depending on the self-centeredness of the individual,” says Tobias Brosch, professor in the Psychology Section at UNIGE’s Faculty of Psychology and Educational Sciences (FPSE), and lead author of the paper.

The team started from the IPCC’s (Intergovernmental Panel on Climate Change) 2014 synthesis report, identifying predictions about the outcomes of a changing climate — for example, a reduction in drinking water supplies, economic collapse, social upheaval, or increased political instability and violence. Then, they assigned a year sometime in the future to each of these effects. These dates weren’t meant to reflect when the effects will actually make themselves felt but, rather, to see how subjects react to perceived threats in the near and (most importantly) far future.

Participants were then asked to complete a standardized questionnaire meant to measure their value hierarchies, giving the researchers an estimate of each individual’s selfish or altruistic tendencies. After that, the participants’ brains were scanned with an MRI machine. Finally, they were shown the dated consequences of the events, and were asked to answer two questions on a 1-8 scale for each consequence: “Is it serious?” and “Are you afraid?”.

Figure 1.

Individual concern judgments for events occurring in the near future (2025-2035) and the far future (2075-2085), respectively. “Egotistical” participants (dashed line) showed significantly less concern for events in the far future than those in the near future. “Altruistic” participants (solid line) appear equally concerned with regards to events occurring in the near and far future.
Image credits Tobias Brosch et al., 2018, CA&B Neuro.

“The first result we obtained was that for people with egotistical tendencies, the near future is much more worrying than the distant future, which will only come about after they are dead. In altruistic people, this difference disappears, since they see the seriousness as being the same,” explains Brosch.

Later, the team focused their investigations on the activity of the ventromedial prefrontal cortex (vmPFC). This area of the brain, located directly above the eyes, comes into use when we think about the future or try to visualize it and our role in it. In altruistic people, the researchers report, “this cerebral zone is activated more forcefully when the subject is confronted with the consequences of a distant future as compared to the near future”. Egotistical people saw no increase in activity when confronted with consequences in the near or distant future.

Figure 2.

A) Activation in anterior vmPFC when considering events in the far future (2075-2085) relative to events in the near future (2025-2035) based on the exploratory whole-brain analysis. B) Parameter estimates for the region of interest in anterior vmPFC for the near (dashed line) and far (solid line) future scenarios. “Altruistic” participants (+1 SD) showed increased responses when considering consequences in the far future relative to consequences in the near future, while “egocentric” participants (−1 SD) did not show this increase.
Image credits Tobias Brosch et al., 2018, CA&B Neuro.

Since the vmPFC is used to project ourselves into the distant future, the absence of any heightened activity in self-centered people suggests that their brains don’t put them in the shoes of someone living in the future; as such, they simply can’t relate to, or feel concerned about, what will happen after their death. If so, they would have virtually no incentive to adopt sustainable behaviors, as they would register only the personal cost of such a choice, not its collective benefits.

The findings — which the authors note can be applied to all areas of human choice, not just those regarding climate change — show how important it is in a society for individuals to be able to think about the distant future and adapt their behavior to future needs and constraints.

“In our everyday life, we are frequently confronted with situations in which we need to choose between following our egoistic impulses and taking into account the needs of others. Do I spend my money on yet another treat for myself, or do I give it to the beggar sitting on the street corner? Do I buy a powerful SUV car, which is a lot of fun but also quite the polluter, or rather do I invest in an electric vehicle, which is maybe not as much fun, but helps to preserve the environment for future generations? Whether the consequences of our choices for ourselves and others are visible immediately or will only materialize in the future, we need to integrate them into our considerations when deciding,” the paper’s abstract reads.

“We could imagine a psychological training that would work on this brain area using projection exercises,” suggests Brosch. “In particular, we could use virtual reality, which would make tomorrow’s world visible to everyone, bringing human beings closer to the consequences of their actions.”

The paper “Not my future? Core values and the neural representation of future events” has been published in the journal Cognitive, Affective, & Behavioral Neuroscience.

Video games might make you more sexist, but not as much as religion

The way women are portrayed in many video games — attractive, scantily clad, performing limited roles — sends a powerful message to gamers, making them more susceptible to sexism.

Playing video games might make teens a bit more sexist. Image credits: R Pollard.

Douglas Gentile, a professor of psychology at Iowa State University, and his colleagues followed some 13,000 adolescents aged 11 to 19, who spent, on average, approximately three hours a day watching TV and nearly two hours playing video games. They found a very small but significant connection between video games and sexism.

However, it’s not as if video games are single-handedly ruining the pristine minds of teenagers; “traditional values” do much more harm in this case. The researchers didn’t only look at video games, they also studied the impact of television and religion, finding that religion was three times more likely to make teens sexist.

“Many different aspects of life can influence sexist attitudes. It was surprising to find a small but significant link between game play and sexism. Video games are not intended to teach sexist views, but most people don’t realize how attitudes can shift with practice,” Gentile said. “Nonetheless, much of our learning is not conscious and we pick up on subtle cues without realizing it.”

To measure sexist attitudes, the researchers asked participants how much they agreed or disagreed with the following statement:

  • “A woman is made mainly for making and raising children.”

GTA is a game where women are particularly sexualized.

Participants who spent more time playing video games were more likely to agree, and religious participants were even more likely to do so. The fact that religion has a much stronger link with sexism is really worrying, though it was not the central focus of this study. Another interesting finding was that sexism was also associated with lower socioeconomic status among the teenagers.
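For illustration only, here’s a sketch of the kind of multivariate analysis such findings imply: regressing agreement with the statement on gaming time, TV time, religiosity, and socioeconomic status. The data and effect sizes are simulated assumptions, chosen only to mirror the pattern described above.

```python
# Simulated illustration: agreement with the sexist statement regressed on
# gaming hours, TV hours, religiosity, and socioeconomic status (SES).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 13000
gaming_hours = rng.gamma(2.0, 1.0, n)   # hours/day playing video games
tv_hours = rng.gamma(3.0, 1.0, n)       # hours/day watching TV
religiosity = rng.uniform(0, 10, n)     # self-reported religiosity
ses = rng.normal(0, 1, n)               # standardized socioeconomic status

# Invented effects mirroring the article: religiosity about three times the
# gaming effect, and lower SES associated with more agreement.
agreement = (0.05 * gaming_hours + 0.03 * tv_hours + 0.15 * religiosity
             - 0.10 * ses + rng.normal(0, 1, n))

X = sm.add_constant(np.column_stack([gaming_hours, tv_hours, religiosity, ses]))
result = sm.OLS(agreement, X).fit()
print(result.summary(xname=["const", "gaming", "tv", "religiosity", "ses"]))
```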

Repeated exposure to media also changes our perceptions, and there’s a lot to be improved in how women are represented on television as well. Basically, it’s not that video games are inherently sexist; rather, they are another type of media in which women are misrepresented. This is in line with the so-called cultivation theory, which states that the more people watch TV, the more likely they are to believe that the reality presented on TV reflects real life. A similar mechanism could apply to games, especially role-playing games where players pick a character and act out his or her decisions.

“If you repeatedly ‘practice’ various decisions and choices in games, this practice can influence your attitudes and behaviors outside of the gaming world,” Gentile said.

These findings go against those of a previous study, conducted in Germany, which in 2015 found no connection between video games and sexism. The fact that the new study was carried out in France while the 2015 one was carried out in Germany, and that the two came to different conclusions, might indicate that culture (and, of course, religion) also plays an important role. However, Gentile says the results are applicable across cultures because this study focused on learned behavior, not inherited traits; how we learn and adapt to cues is independent of culture, he argues.

Journal Reference: Laurent Bègue, Elisa Sarda, Douglas A. Gentile, Clementine Bry and Sebastian Roche — Video Games Exposure and Sexism in a Representative Sample of Adolescents. doi: 10.3389/fpsyg.2017.00466

Personality traits are “contagious among children” — especially good ones

When preschoolers play together, they tend to borrow each other’s personality traits, and they pick up good traits more readily than bad ones.

Image credits: isakarakus / Pixabay

Our understanding of personality is still a work in progress. There’s an ongoing debate about how much of our personality is owed to genes and how much to environmental factors. Now, a new study might lend some weight to the latter. Researchers from the psychology department at Michigan State University studied two preschool classes for a whole school year, reporting that the children’s social networks and personality traits became more similar over time.

“Our finding, that personality traits are ‘contagious’ among children, flies in the face of common assumptions that personality is ingrained and can’t be changed,” said Jennifer Watling Neal, associate professor of psychology and co-investigator on the study. “This is important because some personality traits can help children succeed in life, while others can hold them back.”

There’s a bit of good news in it, too. Children whose play partners were hard-working and extroverted took on these traits, while children whose play partners were overanxious and easily frustrated tended to resist picking those up. This is the first study to show that children influence each other’s personalities, and it’s also the first to show that positive traits catch on more easily than negative ones. This might hint at an innate tendency to develop positive traits, but that’s just speculation at this point.

Emily Durbin, study co-investigator and associate professor of psychology, said kids are having a bigger effect on each other than people may realize. She believes that the role of the children’s peers in education has been significantly underestimated.

“Parents spend a lot of their time trying to teach their child to be patient, to be a good listener, not to be impulsive,” Durbin said in a press release. “But this wasn’t their parents or their teachers affecting them — it was their friends. It turns out that 3- and 4-year-olds are being change agents.”

Journal Reference: Neal, Jennifer Watling; Durbin, C. Emily; Gornik, Allison E.; Lo, Sharon L. — Codevelopment of Preschoolers’ Temperament Traits and Social Play Networks Over an Entire School Year. http://dx.doi.org/10.1037/pspp0000135

Optimistic women live longer, new study finds

An encouraging study conducted by Harvard researchers found that having an optimistic outlook on life may help people live longer.

Credit: Estitxu Carton.

The study only included women, but the results should be relevant for everyone. Over a period of eight years, the researchers found that women who were generally optimistic had a significantly reduced risk of dying from several major causes of death, including cancer, heart disease, stroke, respiratory disease, and infection. Some of this can be attributed to optimistic people generally having healthier lifestyles, but it’s also possible that higher optimism directly impacts our biological systems.

“While most medical and public health efforts today focus on reducing risk factors for diseases, evidence has been mounting that enhancing psychological resilience may also make a difference,” said Eric Kim, research fellow in the Department of Social and Behavioral Sciences and co-lead author of the study. “Our new findings suggest that we should make efforts to boost optimism, which has been shown to be associated with healthier behaviors and healthier ways of coping with life challenges.”

Out of the 70,000 women enrolled in the study, the most optimistic 25% had a nearly 30 percent lower risk of dying from any of the diseases analyzed, compared to the least optimistic 25%. The most noticeable difference was in cases of infection: a striking 52 percent lower risk of dying.
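As a back-of-the-envelope check, the quoted effect sizes translate into relative risks like so; the death counts below are invented to match the reported percentages, not taken from the study.

```python
# Relative risk across optimism quartiles, using invented death counts
# chosen only to reproduce the article's reported reductions.
def relative_risk(deaths_a, n_a, deaths_b, n_b):
    """Risk of death in group A relative to group B."""
    return (deaths_a / n_a) / (deaths_b / n_b)

n_quartile = 70000 // 4  # ~17,500 women per optimism quartile

# Hypothetical counts mirroring the ~30% reduction across all analyzed causes:
rr_all = relative_risk(350, n_quartile, 500, n_quartile)
print(f"All analyzed causes: RR = {rr_all:.2f}")  # 0.70, i.e. ~30% lower risk

# And the ~52% reduction reported for infection:
rr_infection = relative_risk(24, n_quartile, 50, n_quartile)
print(f"Infection: RR = {rr_infection:.2f}")      # 0.48, i.e. ~52% lower risk
```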

The researchers only analyzed the correlation and didn’t attempt to determine its cause. There may be a layer of biological resistance brought on by a positive outlook, but that is speculation at this point. Still, as far as I could find, this is the first study to correlate optimism with the risk of the most common major diseases. It could help doctors find better approaches to treating these diseases, and it might indicate that psychological support is also key in some therapies.

“Previous studies have shown that optimism can be altered with relatively uncomplicated and low-cost interventions — even something as simple as having people write down and think about the best possible outcomes for various areas of their lives, such as careers or friendships,” said postdoctoral research fellow Kaitlin Hagan, co-lead author of the study. “Encouraging use of these interventions could be an innovative way to enhance health in the future.”

Journal Reference: Optimism and Cause-Specific Mortality: A Prospective Cohort Study.

Most psychology studies can’t be replicated – and this is a huge problem

Academic journals often publish intriguing and challenging psychological studies, but according to a massive new review, we should take those studies with a big grain of salt. A four-year project by 270 researchers attempted to replicate 100 experiments published in three of the most prestigious psychology journals; only 36 produced similar results.
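For context, a replication rate of 36 out of 100 carries statistical uncertainty of its own. A quick sketch with a normal-approximation binomial confidence interval shows the plausible range:

```python
# Normal-approximation 95% confidence interval around the replication rate.
from math import sqrt

replicated, attempted = 36, 100
p = replicated / attempted
se = sqrt(p * (1 - p) / attempted)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"Replication rate: {p:.0%} (95% CI roughly {lo:.0%} to {hi:.0%})")
# -> Replication rate: 36% (95% CI roughly 27% to 45%)
```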

The social sciences have taken quite a “beating” in recent years: one of the field’s most prolific authors was caught fabricating data, leading to more than 50 retracted papers, most of them from top-level journals. Then, one of the rising stars of social science manipulated data on how homosexuals are regarded, and faking scientific data seems to be done on an industrial scale in China. These events, along with many others, led to the creation of The Reproducibility Project: Psychology, led by Brian Nosek of the University of Virginia.

Most studies didn’t hold up.

“Less than half — even lower than I thought,” said Dr. John Ioannidis, a director of Stanford University’s Meta-Research Innovation Center, who once estimated that about half of published results across medicine were inflated or wrong. Dr. Ioannidis said the problem was hardly confined to psychology and could be worse in other fields, including cell biology, economics, neuroscience, clinical medicine, and animal research.

Reproducing a study’s results lies at the very core of science. After all, if a study doesn’t produce reproducible results, what value does it have? What confidence can you have in its data? Well, things often aren’t as straightforward as that; with human participants, it is often impossible to replicate the original conditions exactly. Generally speaking, though, when a study and a verification study don’t add up, there are three options: study A was wrong, study B was wrong, or subtle differences in how the two were conducted affected the outcome. Just think about it: even today, almost 100 years later, physicists are still testing Albert Einstein’s theory of relativity, even though it’s the backbone of modern physics. A failed replication doesn’t necessarily mean the science is wrong.

Cody Christopherson of Southern Oregon University comments:

“This project is not evidence that anything is broken. Rather, it’s an example of science doing what science does,” he says. “It’s impossible to be wrong in a final sense in science. You have to be temporarily wrong, perhaps many times, before you are ever right.”

Psychology is a field in which studies are especially difficult to reproduce, and this raises a major problem. To succeed in academia, you need to publish new work, work that no one has ever tried before, even though everyone would often be better off if you double-checked something that has already been done.

“To get hired and promoted in academia, you must publish original research, so direct replications are rarer. I hope going forward that the universities and funding agencies responsible for incentivizing this research—and the media outlets covering them—will realize that they’ve been part of the problem, and that devaluing replication in this way has created a less stable literature than we’d like,” Christopherson added.

In other words, there are two things we should be doing. First, we should take these studies with a grain of salt and wait for results to be confirmed and double-checked by others; second, we should encourage others to do that checking. At the moment, we’re not only handing out perverse incentives for some research, we’re also removing valid incentives for solid, valuable verification work. Journals carry a big part of the blame as well: they prioritize positive results and almost completely ignore negative ones.

“We see this as a call to action, both to the research community to do more replication, and to funders and journals to address the dysfunctional incentives,” said Brian Nosek, a psychology professor at the University of Virginia and executive director of the Center for Open Science, the nonprofit data-sharing service that coordinated the project, which was funded in part with $250,000 from the Laura and John Arnold Foundation.


Be sarcastic! It’s good for you, scientists find

Using and understanding the intricacies of sarcasm is a fine art; one does not simply “become” sarcastic. You must dive into it, let it embrace you; you must become sarcasm. Jokes aside, sarcasm is a strange thing: we don’t know exactly how or why it appeared. The best theory seems to be that it developed as a cognitive and emotional tool that adolescents use to test the borders of politeness and truth in conversation, but that’s not a fully satisfying explanation. Sarcasm can be read as a sign of intelligence or as rudeness; it can be used to mock somebody or to voice a contrary opinion in a roundabout way. But perhaps most importantly, as scientists have recently found, sarcasm is good for you.

Bill Murray’s sarcastic humor has made him a favorite of many – yours truly included. Image via Memecrunch.

Many practitioners will agree that sarcasm often feels like mental gymnastics. Not everyone can do it, and even understanding it often requires a thorough capacity for comprehension and an ability to think in non-intuitive ways. Until now, though, the science to back that up was lacking. New research by Francesca Gino of Harvard Business School, Adam Galinsky, the Vikram S. Pandit Professor of Business at Columbia Business School, and Li Huang of INSEAD, the European business school, found that sarcasm activates, and is facilitated by, abstraction, which in turn promotes creative thinking. In other words, yes, sarcasm is like mental gymnastics.

“Not only did we demonstrate the causal effect of expressing sarcasm on creativity and explore the relational cost sarcasm expressers and recipients have to endure, we also demonstrated, for the first time, the cognitive benefit sarcasm recipients could reap. Additionally, for the first time, our research proposed and has shown that to minimize the relational cost while still benefiting creatively, sarcasm is better used between people who have a trusting relationship,” said Gino.

Differentiating between the literal and the sarcastic sense is what prompts the mental exercise.

“To create or decode sarcasm, both the expressers and recipients of sarcasm need to overcome the contradiction (i.e., psychological distance) between the literal and actual meanings of the sarcastic expressions. This is a process that activates and is facilitated by abstraction, which in turn promotes creative thinking,” said Gino.

In a series of studies, participants were asked to label different statements as sincere, sarcastic, or neutral. They then completed a conversation task which, again, involved giving sincere, sarcastic, or neutral replies.
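As a rough illustration of how such condition comparisons are typically analyzed, here’s a sketch using simulated creativity scores and a one-way ANOVA. The scores, group sizes, and effect size are assumptions, not the study’s data.

```python
# Simulated comparison of creativity-task scores across the three conditions.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(2)
sarcastic = rng.normal(6.0, 1.5, 50)  # invented scores: sarcasm condition higher
sincere = rng.normal(5.0, 1.5, 50)
neutral = rng.normal(5.0, 1.5, 50)

f_stat, p_value = f_oneway(sarcastic, sincere, neutral)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```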

“Those in the sarcasm conditions subsequently performed better on creativity tasks than those in the sincere conditions or the control condition. This suggests that sarcasm has the potential to catalyze creativity in everyone,” said Galinsky via email. “That being said, although not the focus of our research, it is possible that naturally creative people are also more likely to use sarcasm, making it an outcome instead of [a] cause in this relationship.”

This raises an interesting question: is sarcasm really good for you? Directly, it is. But indirectly, well, not everybody appreciates sarcasm. It can cause a lot of confusion and misunderstanding or, at the very worst, bruise an ego or two. It’s a powerful weapon, so use it carefully.

“While most previous research seems to suggest that sarcasm is detrimental to effective communication because it is perceived to be more contemptuous than sincerity, we found that, unlike sarcasm between parties who distrust each other, sarcasm between individuals who share a trusting relationship does not generate more contempt than sincerity,” said Galinsky.

Journal Reference: Li Huang, Francesca Gino, Adam D. Galinsky. The highest form of intelligence: Sarcasm increases creativity for both expressers and recipients. doi:10.1016/j.obhdp.2015.07.001

High heels really do have power over men, study shows

Marilyn Monroe once said that if you give a woman the right shoes, she can conquer the world; that may be a bit of a stretch, but a new study published in a Springer journal has shown that if a woman wants attention or help from a man, high heels definitely go a long way.

Women wearing high heels are more likely to be noticed and helped by men, a new study shows. Image via Moda Eyes.

Previous research has already shown that our physical features, as well as the color and shape of our clothes, affect what others think of us. However, even though high heels are clearly associated with sexiness and fashion, only one previous study had looked at how they affect men’s behavior. Now, Nicolas Guéguen of the Université de Bretagne-Sud in France has shown that high heels really do have power over men, making them more likely to notice and help women.

He set out to test this with a set of relatively simple field experiments. In the first, women wearing flat shoes and subsequently high heels asked people to complete a survey; men generally complied more readily when the woman was wearing high heels. The second experiment was even more striking: a woman who drops a glove on the street while wearing heels is almost 50 percent more likely to have a man pick it up for her than if she’s wearing flats. Guéguen also found that women in high heels in bars were much more likely to be approached by men than women wearing flats.

“Though it’s a relatively small cross-section, this study is very significant since the results are clear and consistent,” said Paris-based sociologist Jean-Claude Kaufmann, who was not involved in the study. “In a relation of seduction, men are very attracted by a woman in heels as she looks taller, more sexually confident, sure of herself, with a lengthened silhouette.”

However, the findings also showed that heel height had a strange, uneven impact on men’s willingness to help. Female volunteers wearing 0.5 cm (0.2 in), 5 cm (2 in), or 9 cm (3.5 in) heels received the same attention from men when it came to the dropped glove, but the survey results were very different. The scenario involved a woman asking passers-by: “Excuse me, sir. We are currently conducting a survey on gender equality. Would you agree to answer our questionnaire?” Flat heels got a 46.7% response rate, medium heels 63%, and the highest heels a whopping 83%. It’s still not clear what role heel height plays.
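To see whether response rates like these differ beyond chance, one would typically run a chi-square test of independence. Here’s a sketch; the per-condition sample size is an assumption, since the paper’s exact counts aren’t given here.

```python
# Chi-square test on heel height vs. survey compliance, using the reported
# rates and an assumed (invented) number of men approached per condition.
from scipy.stats import chi2_contingency

n_per_group = 30  # assumed sample size per heel-height condition
rates = {"flat (0.5 cm)": 0.467, "medium (5 cm)": 0.63, "high (9 cm)": 0.83}

# Build a contingency table of [agreed, refused] counts per condition.
table = [[round(r * n_per_group), n_per_group - round(r * n_per_group)]
         for r in rates.values()]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```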

Guéguen suspects that because sexy female models so often wear such shoes in the media, men have come to associate high heels with sexual intent.

High-heeled footwear was used extensively in the Ottoman Empire and Persia for horseback riding, but it only entered Europe in the 1500s, where it was popularized in the 19th and 20th centuries largely in an erotic context. Nowadays, high heels are often associated with nightclubs and fashion shows. From a medical point of view, high heels pose plenty of problems, causing back pain and increasing the risk of ankle injuries; prolonged use can also shorten the calf tendons.

Journal Reference: Guéguen, N. (2014). High Heels Increase Women’s Attractiveness. Archives of Sexual Behavior. DOI 10.1007/s10508-014-0422-z

Gifted children rarely achieve their potential, 30-year study shows

Gifted children are supposed to become tomorrow’s leaders, scientists, and innovators, but the exceptionally smart are often invisible in the classroom, poorly served by the standard curriculum, and given little motivation by society to achieve their full potential.

This conclusion comes from the longest-running study of exceptional children: a 30-year project conducted by researchers at Vanderbilt University’s Peabody College of education and human development. David Lubinski, professor of psychology and human development, tracked 320 gifted children from age 13 until age 38, logging their accomplishments in academia, business, culture, health care, science, and technology.

The results were recently published in a paper in Psychological Science.

“Gifted children are a precious human-capital resource,” said Lubinski, who has spent four decades studying talented individuals to correlate exceptional early SAT scores with achievement later in life. “This population represents future creators of modern culture and leaders in business, health care, law, the professoriate and STEM (science, technology, engineering, and mathematics). Our study provides new insight into the potential of these children.”

The researchers selected children who scored in the top 0.01% on above-level tests, namely SAT verbal or math scores achieved at age 13 or younger. In total, 320 children were selected; of them, 203 went on to earn master’s degrees or above, and 142 also earned doctoral degrees (that’s 44%, compared to about 2% of the general population). Most of them fared very well, becoming senior leaders at Fortune 500 companies, prolific software engineers, physicians, attorneys, and leaders in public policy.
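The headline percentages check out with simple arithmetic:

```python
# Verifying the cohort's headline numbers.
cohort = 320
masters_or_above = 203
doctorates = 142

print(f"Master's or above: {masters_or_above / cohort:.0%}")  # ~63%
print(f"Doctorates:        {doctorates / cohort:.0%}")        # ~44%
# Against the ~2% doctoral rate in the general population:
print(f"Doctoral rate vs. general population: ~{doctorates / cohort / 0.02:.0f}x")
```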

However, despite these undeniable achievements, the researchers concluded that the participants hadn’t truly reached their full potential. Typical school settings are unable to accommodate their fast development, with teachers turning away from them toward students who struggle with the curriculum. This pattern continues throughout formal education, and further social roadblocks follow, leading to an accumulation of frustration and a sense of underachievement in most, if not all, exceptionally gifted children.

“There’s this idea that gifted students don’t really need any help,” said study co-author Harrison Kell. “This study shows that’s not the case. These people with very high IQs—what some have called the ‘scary smart’—will do well in regular classrooms, but they still won’t meet their full potential unless they’re given access to accelerated coursework, AP classes and educational programs that place talented students with their intellectual peers like Peabody’s Programs for Talented Youth.”

The main idea is that most highly talented children do very well in life, but not as well as they could if the system were prepared to handle and encourage them instead of placing roadblocks in their way.

“Ability, motivation and opportunity all play roles in developing exceptional human capacity and providing the support needed to cultivate it throughout life,” Lubinski concludes.

Via Vanderbilt University.