Tag Archives: Research

The food industry is skewing research, but we’re onto them now

The food industry could be actively working against public health by influencing the results of studies in their favor.

Image credits Stefan Divily.

New research reports that around 13.4% of the nutrition studies it analyzed disclosed ties to the food industry. Studies in which the industry was involved were more likely to produce results favorable to its interests, the team adds, raising questions about the validity of these findings.

Hamburger.

“This study found that the food industry is commonly involved in published research from leading nutrition journals. Where the food industry is involved, research findings are nearly six times more likely to be favourable to their interests than when there is no food industry involvement,” the authors note.

It’s not uncommon for industry to become involved with research — after all, they have a direct stake in furthering knowledge in their field of activity. This can range from offering funding to assigning employees to research teams for support or active research.

The current paper shows that, at least in the food industry, such activities are actively skewing and biasing nutrition research. It is possible, the team reports, that this puts public health at risk, as corporate interests can start dictating what findings see the light of day, where, and in what form. Such findings are worrying, since corporations are notorious for putting profits above everything else, including truth and the common good.

In order to get a better idea of just how extensive the influence of industry is in food-related research, the team — led by Gary Sacks of Deakin University in Melbourne, Australia — analyzed all papers published in the top 10 peer-reviewed academic journals related to diet or nutrition. They looked at which had ties to the industry, such as funding from food companies or affiliated organizations, and then at whether the findings favored industry interests.

Roughly 13.4% of the articles had some level of industry involvement, with some journals bearing more of the blame than others. The authors explain that studies with industry involvement were over five times more likely to favor industry interests compared to a random sample of studies without involvement (55.6% vs 9.7% for the latter).
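The "over five times" figure can be sanity-checked directly from the two favorable-result rates reported above; here's a quick sketch (the two percentages are the study's, the calculation is ours):

```python
# The "five times more likely" claim follows directly from the two
# favorable-result rates reported in the study.
industry_rate = 0.556  # favorable results with industry involvement
control_rate = 0.097   # favorable results in the random non-industry sample

risk_ratio = industry_rate / control_rate
print(f"Relative likelihood: {risk_ratio:.1f}x")  # ~5.7x
```

The ratio works out to roughly 5.7, matching the authors' own "nearly six times" phrasing quoted earlier.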

Such figures offer a pretty big warning sign that industry involvement could promote research bias or help push an agenda at the expense of quality science (such as the neglect of topics that are important for public health but go against industrial interests). The authors suggest several mechanisms that could be employed to preserve the quality of nutrition research.

The paper “The characteristics and extent of food industry involvement in peer-reviewed research articles from 10 leading nutrition-related journals in 2018” has been published in the journal PLOS One.

Scientists Find That Social Distancing Reduces COVID-19’s Infection Rate by Approximately 1% per Day

All states in the U.S. initiated social distancing measures between March 10 and March 25, 2020. Based on modeling studies, researchers predicted that this type of intervention would prevent a rapid, overwhelming epidemic. Governments also enacted physical distancing measures in prior pandemics, including the 1918 influenza pandemic, with moderate success.

Prior to the coronavirus pandemic, there wasn’t much evidence about the net benefits of imposing statewide social distancing measures to reduce the transmission of viral infections. Because of this, a team of researchers from the United States, South Africa, and the United Kingdom set out to study the question. They wanted to know what the COVID-19 case growth rate was before and after social distancing measures were enacted, and what the public health impacts of government-mandated non-pharmacological interventions were between the time they started and the time they ended.

Data Collection

A search of government websites and third-party sources identified nationwide social distancing measures implemented between January 21 and May 1, 2020. These sources included the New York Times COVID-19 database, from which the team obtained daily state-specific reported COVID-19 cases and deaths. The measures included cancellations of public events, restrictions on internal movement, and closures of schools, workplaces, and state borders. Researchers also searched for state orders to shelter in place, also referred to as lockdowns, and categorized them as restrictions on internal movement.

What researchers found was:

  • Beginning 4 days after social distancing, the case growth rate declined by ~1% per day.
  • Beginning 7 days after social distancing, the mortality growth rate decreased by 2% per day.

However, they did not observe a major difference in the average daily case growth rate before versus after statewide restrictions on internal movement were enacted, though disentangling the effects of measures enacted so close together did prove difficult.

The authors’ findings imply that social distancing reduced the total number of COVID-19 cases substantially every week:

  • 1st week: by approximately 1,600
  • 2nd week: by approximately 55,000
  • 3rd week: by approximately 600,000
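These rapidly escalating weekly figures make sense once you see how a small daily reduction in the growth rate compounds. The sketch below is purely illustrative: the starting caseload and the 25% baseline growth rate are assumptions, not figures from the paper; only the ~1-percentage-point-per-day decline beginning on day 4 mirrors the study's finding.

```python
# Illustrative only: compare cumulative cases under a constant daily growth
# rate versus one that falls by ~1 percentage point per day starting 4 days
# after social distancing. Starting values are assumed, not from the paper.
daily_new_b = daily_new_d = 1000.0  # assumed new cases on day 0
baseline_growth = 0.25              # assumed 25% daily case growth

total_b = total_d = 0.0
for day in range(1, 22):  # three weeks
    reduced = baseline_growth - 0.01 * max(0, day - 4)
    daily_new_b *= 1 + baseline_growth       # no intervention
    daily_new_d *= 1 + max(reduced, 0.0)     # with social distancing
    total_b += daily_new_b
    total_d += daily_new_d

print(f"Cases averted over three weeks: {total_b - total_d:,.0f}")
```

The gap between the two scenarios is small in week one and enormous by week three, which is the same qualitative pattern the authors report.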

Music can be used to estimate political ideology to an “accuracy of 70%”, researchers say

Do you like Pharrell’s “Happy”? Then you’re probably a conservative.

If you’ve ever tried to argue with a stranger on the Internet about politics (or with your family at Thanksgiving dinner), you’re well aware that it’s a recipe for disaster: political ideology is often so deeply rooted that it feels hard-wired into our DNA. Political ideology strongly influences our views on things like economics and social policies, but could it also have far-reaching influences on things we aren’t even aware of? The Fox Lab at New York University believes the answer is yes.

Their theory?

“Ideology fundamentally alters how we perceive a neutral stimulus, such as music,” said Caroline Myers, who presented her research at the 2018 Society for Neuroscience Meeting.

To examine the influence of political ideology on musical preference, the researchers had participants self-report their political ideology as liberal, conservative, or center, and then listen to clips from 192 songs. For each song clip, participants rated how familiar they were with the song and how much they liked or disliked it. These songs included the top 2 songs from each year’s Billboard Top 40, iconic songs across certain genres, and a selection of more obscure music. Participants additionally ranked how often they believed they listened to certain genres of music — which led to some surprising findings.

For example, 60% of individuals who identified as liberals said that they listen to R&B music, and yet they weren’t any more familiar with these songs than any other group — and they actually liked R&B songs less than their conservative counterparts. Liberals also stated they listen to jazz but were not any more familiar with jazz music than the other groups.

They also looked at individual song preference across the various ideologies. Some did not showcase any major differences, with classical music being the least divisive of all the musical genres. The most polarizing song, however, was “Happy” by Pharrell Williams. Conservatives love it, while liberals hate it. And there’s actually evidence of this in the real world — just two weeks ago, Pharrell issued President Donald Trump a cease and desist order for using the song at one of his rallies.

While we can use this information to create a kick-ass playlist for our like-minded friends, is there any evidence that we can guess an individual’s political ideology purely based on musical taste? Surprisingly, the answer is yes.

“We were able to estimate individuals’ ideological leanings to an accuracy of 70%,” said Myers.

Myers is currently working on addressing the limitations of her study, such as the limited number of conservative participants that resulted from heavy on-campus recruiting. However, the results are still striking, and quite concerning, from a personal data standpoint. It goes to show that, even if we’re not actively posting personal details on social media, companies may still have other means to gain insight into our personal preferences – and we might not even be aware of it.

Why do people self-harm? New study offers surprising answers

If you’ve seen HBO’s newest miniseries, Sharp Objects, you’re well familiar with what doctors call NSSI: non-suicidal self-injury. NSSI is a serious mental health condition, but despite years of research, we’re still not quite sure why individuals engage in this type of behavior. A new study performed at St. Edward’s University in Austin, Texas, sought the answer to this question by drawing on existing theories in the literature.


Previous studies have shown that individuals who exhibit NSSI have low levels of β-endorphin, which is produced to mediate stress-induced analgesia (the inability to feel pain) — and high ratings of clinical dissociation, which is a feeling of disconnection with oneself and one’s surroundings. The researchers hypothesized that NSSI individuals are attempting to restore these imbalances using self-harm. To test their hypothesis, researchers recruited participants from the university. Using saliva samples and surveys, they assessed β-endorphin levels and psychological state before and after a procedure called the cold-pressor test.

[panel style=”panel-info” title=”Cold-Pressor Test” footer=””]During the cold-pressor test, an individual immerses his or her hand in a bucket of ice water. Researchers then note how long it takes the individual to feel pain (their pain threshold) and how long until the pain is unbearable (pain tolerance), at which point the test ends.
[/panel]

They discovered that non-suicidal self-injurers have lower levels of arousal than people without these tendencies (the control group). After the pain challenge, their arousal levels matched the baseline of the control group — in other words, experiencing pain was able to correct their low levels of arousal. The pain challenge also decreased symptoms of dissociation. However, these changes weren’t exclusive to the NSSI group: the control group also experienced an increase in arousal and a decrease in dissociative symptoms after the cold-pressor test.

Next, the researchers sorted the NSSI group by symptom severity. They found that the more severe the individual’s NSSI symptoms, the stronger their dissociative symptoms were. However, only the most severe cases experienced a reduction in these symptoms after the pain challenge. Another interesting finding is that the NSSI individuals with moderate symptom severity actually had higher levels of β-endorphins (both before and after the pain challenge). This wasn’t seen in those with low or high symptom severity.

However, perhaps the most surprising part of the study was the high percentage of NSSI participants. 

“The literature states that there’s a 5% prevalence of NSSI in the general population, and we found this in 17 out of 65 participants, which is way above what we would expect, even when taking into consideration that university students tend to have a higher NSSI rate than the general population,” said Haley Rhodes, who presented the research at the 2018 Society for Neuroscience Meeting.
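Putting the quoted numbers side by side makes the surprise concrete (the 17, 65, and 5% figures are from the quote above; the arithmetic is ours):

```python
# Sample prevalence of NSSI in the study versus the literature's ~5% figure.
nssi_participants, total_participants = 17, 65
sample_prevalence = nssi_participants / total_participants
print(f"{sample_prevalence:.0%} vs. the ~5% reported in the literature")
```

That works out to roughly 26%, about five times the prevalence the literature would predict.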

Rhodes admits that a bigger sample size is necessary before we can draw full conclusions from the data, but it’s intriguing that there seems to be a minor psychological benefit to the pain — though it most definitely doesn’t warrant any self-harming practices.

Understanding the imbalances in individuals who engage in NSSI might help us find a way to meet their psychological needs, and allow them to get the same benefits without resorting to self-injury.

Harbour seal.

The role of art in research with science illustrator Sarah Gluschitz

What is the common trait all scientists share? And what role does art play in research? Today we talk with scientific illustrator Sarah Gluschitz to find out.

Sarah Gluschitz.

Sarah alongside some very impressive anatomy illustrations.

One of the goals we set here at ZME Science is, understandably, to promote science in society. You gals and guys make that job quite easy for us; you’re curious, thirsty for knowledge and hungry for the latest, juiciest morsel of research. Driving people to make the next step, that of getting personally involved in science, however, proves to be a more elusive goal.

Most people with whom I’ve discussed this conundrum confess that they want to bring their own contribution — but that they feel outclassed. They say they don’t have something that will make a difference when brought to the research table. They feel they don’t have Einstein’s prowess in physics or Darwin’s eye for evolution, and that that bars them from pursuing science.

But guys — very few people do. Einsteins, Curies, Darwins, and Newtons stand out in history because (you won’t believe this) they were outstanding. Most scientists you’ll talk to look up to them as leviathans, and many suffer from impostor syndrome as a result. However, the small, incremental advances regular scientists put in every day are what advance scientific knowledge. The leaps that people such as Curie or Newton made were built upon these efforts — and the overall contribution these towering figures made to science is relatively small compared to that of everyone else combined.

So don’t sweat it; science would be very happy to have your brain on the team.

Harbour seal.

The other reason people offer up is competence in ‘hard’ topics: math, physics, computer languages, and so on. I can completely empathize with this. I’m trained as an engineer, and I loved every day at Uni — except those that involved math. Which was basically every day. I’m not good at calculus, I couldn’t do it to save my life, and this haunted me throughout my four years of university.

The current, wiser me wants you to know that it’s totally fine. You don’t have to be great at everything to be a researcher. It helps, sure, but it’s not a prerequisite. I figured out I was quite good at geometry while my colleagues weren’t; so they would handle calculus on group projects and I’d crunch the shapes. It worked out well.

Still, I was at a loss as to what to say to the friends, strangers, or readers who broached the subject with me. I knew the tools but not the engine of scientific pursuit, its trappings but not its source — so I didn’t have any wisdom to share.

I found my answer at this year’s European Science Open Forum in Toulouse in the shape of one Sarah Gluschitz. In a room of researchers holding talks and journalists holding recorders, Sarah was drawing. Not aimlessly — contours merged with keywords on topics being discussed. This fresh (if unorthodox) approach caught my eye and sparked a conversation that served me with a heaping of realization: The only thing you really need to be an asset to science is curiosity.

Sarah is about as far from the traditional image of a scientist as you can get, and yet she was there. With a background in arts, she was discussing science and science journalism. Her story helped me make better sense of my work and what I can bring to science.

Today, I’m sharing her story with you.

With Sarah’s permission, we’ll also get to enjoy some samples of her work.

Human Dissection.

Tell me a little about yourself. A short bio of sorts. Something to help our audience get to know you better.

My name is Sarah Gluschitz. I am a Scientific Illustrator and Artist based in the Netherlands. As long as I can remember I have been fascinated with a world hidden in plain sight. A world underneath our skin, one only visible to a small group of people. As a scientific Illustrator, I am fortunate to now be part of that world and to help to translate it for others.

After attending the Royal Academy of Art in The Hague, graduating with a Bachelor in Interactive/Media/Design, I continued my studies at the ZUYD University of Applied Science and Maastricht University in Maastricht. There I graduated cum laude in Scientific Illustration with my master’s thesis “Corpse in the copse”, which focuses on the taphonomy [the study of how organisms fossilize] of the human skeleton in 2D and 3D for archaeological applications. This combines my passions for Archaeology, Forensics, Human Anatomy, and Illustration.

What was your first passion? Did you start with science, or with art? What made you mix the two together?

Work in progress.

Illustration in progress of the circulatory and respiratory system of the spiny dogfish (Squalus acanthias).

I have always been artistically inclined as well as having a great interest in getting to the bottom of things. During high school in Germany, it was mandatory to choose two subjects as the main focus point of your studies. I chose Arts and Biology which early on seemed to me like a natural combination of things.

I didn’t know about Scientific Illustration as a field just yet. Searching for a way to use my Arts degree on another level and satisfy my thirst for knowledge, I came across Scientific Illustration. Once I came into contact with it for the first time, it was as clear as day to me that this is where I needed to be.

Do you regret your choice of career? Why or why not?

I absolutely adore my line of work. It gives me the possibility to express myself artistically, while also being in contact with science and the source of the knowledge I love so much.

It is multi-layered and diverse as I am not confined to any specific field, but get the possibility to dive into a large variety of topics. This summer, for example, I have illustrated Drones for the ENAC, the French National Civil Aviation School in Toulouse, while only a few months before that I was a research assistant at a forensic decomposition facility in the US.

My favorite thing about my job is being able to translate the researcher’s knowledge to a new audience and making both ends enthusiastic about the topic. I strongly believe that a visual language is one of the universal mediums we can use to reach an audience free of spoken language barriers.

Tell me something you love and something you hate about your line of work.

There is nothing I would like to change about my line of work. I would like to be part of bringing it into the spotlight and showing people the amazing world of Scientific Illustration.

Many people imagine ‘science’ as being strictly something you do with beakers in a lab. But art has a very important part to play in science and the process of gathering knowledge. Illustrations such as yours have graced the pages of encyclopedias for decades, even centuries.

What advice would you give to someone who has an artistic inclination and a passion for science, but feels like they lack ‘hard’ skills like maths, physics, so on?

I believe there is a niche for every one of us, that fits our interest and skillset. Being artistically inclined doesn’t necessarily have to lead to a career in arts, nor should a lack of hard skills hinder us from being in touch with science. Scientific Illustration is one way of combining both, so is journalism.


I believe that a symbiotic relationship between science and art will elevate research and help it outgrow the confinements of the scientific community. The community is often separated, through its own language, from the general public. Utilizing a visual language breaks down the walls and helps create spaces for open communication. Everyone who feels like neither arts nor science is a perfect fit should start exploring at what point they could come in and be the link between the two worlds they love.

Once you find that intersection, how to proceed will come naturally.

What is your view on the relationship between art and science? Should they be more separated, or should they play together more often?

Like many things, art and science would be even greater combined. Both fields have their own mentality, research methods and ways of thinking. This sometimes leads to places where each is stuck. A fresh view, that might seem unorthodox at first, can open up new pathways for each field to continue growing.

Both science and art are very complementary to each other.

Can art help us make better science? What about the other way around?

Within the Arts, the field of ArtScience is continuously growing and welcomed with open arms, while within the Science community there still is some resistance towards having Arts distort the nature of Science; the accuracy and the objectiveness.

In Scientific Illustration, we make sure that accuracy and objectiveness are guarded, while still offering a different perspective and approach to a problem, such as the reconstruction of archaeological findings. Reconstruction of archaeological finds has several layers and goals, one being to understand a past society. But can you truly understand the creation of an art object from the past without engaging artists, with their unique way of thinking, in the process of reconstruction?

All image credits to Sarah Gluschitz. You can see more of her work on her Instagram page or her website.

Laboratory.

It’s getting harder and harder to come up with new ideas in science, paper reports

The well of new ideas might be drying up — or at least getting deeper.

Laboratory.

Image credits Michal Jarmoluk.

A paper penned by researchers from MIT Sloan and Stanford University raises a worrying possibility. New ideas, they write, are getting harder and harder to come by.

Now, they’re not talking about ideas pertaining to cool new places to hang out or something like that. The paper limits itself strictly to new ‘ideas’ in the context of scientific research. And, according to their findings, research productivity is falling rapidly across the board.

Uphill road

The team argues that this drop in research productivity comes down to the fact that scientists need to put in more and more effort just to maintain the same pace — even a slightly slower pace in some fields — of idea generation as a few decades ago. In other words, each new addition to the body of scientific knowledge takes more and more work.

The authors cite Moore’s Law — that the number of transistors that can be packed into a computer processor doubles every two years — as a prime example of this effect. This doubling corresponds to a growth rate in transistor density of about 35% per year, but it takes more in-depth research each year to reach this goal.
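If the ~35% figure seems to sit oddly next to "doubling every two years", a quick bit of arithmetic reconciles the two: it is the continuous (compounded) annual growth rate that doubles density over two years.

```python
import math

# Doubling transistor density every two years corresponds to a
# continuous annual growth rate of ln(2)/2.
annual_rate = math.log(2) / 2
print(f"{annual_rate:.1%}")  # ~34.7%, i.e. the ~35% per year cited above
```

Two years of growth at that continuous rate gives exp(2 × ln(2)/2) = 2, exactly one doubling.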

Productivity graph.

The team defined research productivity as the ratio of idea output, measured as total factor productivity (TFP) growth, to research effort.
Image credits Nicholas Bloom et al., 2018, NBER.

“Many commentators note that Moore’s Law is not a law of nature, but instead results from intense research effort: Doubling the transistor density is often viewed as a goal or target for research programs,” they write.

“The constant exponential growth implied by Moore’s Law has been achieved only by a massive increase in the amount of resources devoted to pushing the frontier forward.”

Research efforts into semiconductor technology have intensified 18-fold since the 1970s, they report. Research productivity, however, has fallen by the same factor over this time period — the two evening each other out. This means that it’s about 18 times harder today to push Moore’s Law to its next ‘level’ than it was half a century ago.
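This "evening out" falls straight out of the team's definition of research productivity as idea output divided by research effort; a minimal sketch using the paper's 18-fold figures:

```python
# Productivity = idea output / research effort, so:
# output = productivity * effort. An 18-fold rise in effort combined with
# an 18-fold fall in productivity leaves idea output roughly unchanged.
effort_ratio = 18.0            # research effort today vs. the 1970s
productivity_ratio = 1 / 18.0  # productivity today vs. the 1970s
output_ratio = effort_ratio * productivity_ratio
print(f"Idea output ratio: {output_ratio:.2f}")
```

The output ratio is ~1: the same steady pace of progress, at 18 times the cost.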

It’s not only the computer sciences that are affected; agricultural output follows the same trend. Per-acre yields of corn, soybeans, wheat, and cotton grew about 1.5 percent on average every five years between 1960 and 2015, according to the paper, but the number of researchers trying to boost these yields has risen by a factor of between 3 and 25, depending on the crop. “Yield growth is relatively stable or even declining,” the team concludes, “while the effective research that has driven this yield growth has risen tremendously.”

Yield graphs.

The blue line denotes the annual growth rate of yield for each crop per year. The solid green line is based on R&D targeting seed efficiency only; the dashed line additionally includes research on crop protection.
Image credits Nicholas Bloom et al., 2018, NBER.

In the pharmaceutical industry, research efforts rose by 6% per year since the early 1970s, while productivity (measured in how many new drugs were approved by the Food and Drug Administration) fell by 3.5% per year. When comparing the years of life saved by cancer research per 100 people since the 1970s to the number of medical studies published over the same period, the team found that productivity declined by a factor of 1.2 for all work, and a factor of 4.8 when looking only at clinical trials.

Overall, in the broader economy, the authors report that it takes about 15 times as many researchers today, compared to 30 years ago, for a company to maintain the same rate of revenue growth.

“Just to sustain the constant growth in GDP per person, the U.S. must double the amount of research effort put into searching for a new idea every 13 years to offset the increased difficulty in finding new ideas,” the paper reads.

So what gives?

John Van Reenen, an MIT Sloan professor of applied economics and co-author of the paper, thinks one factor that could explain this trend is that researchers simply need more time to reach the level of education they need in order to start producing new ideas.

“As the total amount of knowledge becomes larger and larger and larger, it becomes increasingly difficult to get to the frontier of that knowledge,” Van Reenen said. “It was much easier a couple thousand years ago.”

We handle this increase in knowledge by focusing our education on a narrow domain — think of how your education became progressively more specialized as you moved from school to high-school, university, then to a master’s degree or even a Ph.D. However, this breeds its own set of issues. Innovation often requires people of various different specializations working together, and “it’s very complicated to get all of these people and ideas together,” Van Reenen said. “That, itself, could be a reason why things start slowing down.”

Not all is lost, however. The team also reports that the productivity of research efforts targeting cancer actually rose from 1975 to the mid-1980s, which would “suggest that it may get easier to find new ideas at first before getting harder, at least in some areas.”

Van Reenen also says that we’re a long way off from any sort of hard limit on technological growth. Population growth, the increasing ease of communication, and globalization also offer a lot of opportunities for new ideas to emerge. Just as long as “we keep increasing the amount of resources we put into research,” he explains “we’ll keep [generating new ideas].”

The paper “Are Ideas Getting Harder to Find?” has been published (link to pre-print version) as a working paper by the National Bureau of Economic Research.

Recent studies show how coffee is good for your health

Steaming hot, iced, blended, black, creamy. Coffee! It comes in many forms, and it’s part of my daily routine. It’s part of many others’, too. Last week, several established publications’ websites ran coffee-related articles touting the beverage’s health benefits. Scientists have remarked on the drink’s healthful qualities in the past; the idea that coffee is good for you is not a new one.

The Relationship with Diabetes

The delightful drink seems to help ward off type 2 diabetes. Sex hormone-binding globulin, or SHBG for short, is a protein that binds the sex hormones testosterone and estrogen in the human body. It is also considered to play a key role in the development of this specific type of diabetes.

It has been observed that drinking coffee increases plasma levels of SHBG. A few years ago, a study showed that women who drank a minimum of four cups each day were slightly less likely to develop diabetes than those who didn’t drink it at all.

Help in Other Areas

The Best Way to Start the Day Right. Source: Pixabay.

Coffee, primarily the caffeinated kind, has been linked to preventing as well as alleviating Parkinson’s disease. Caffeine consumption has been found to significantly decrease the risk of developing Parkinson’s. In fact, it may even aid with basic movement in individuals afflicted with the disease.

It also provides some benefits for those concerned about their heart. Small daily doses can help prevent heart failure. In one study, people drinking four European cups of coffee per day saw their risk of heart failure reduced by 11%.

Newer studies show that regular intake of a relatively small amount of coffee can bring down the chances of premature death by 10%. Additional benefits could include preventing cirrhosis, decreasing the likelihood of multiple sclerosis (MS), and preventing the onset of colon cancer. However, more tests are needed to be certain these benefits actually come from coffee. Coffee is also one of the very best sources of antioxidants, which protect the human body against destructive molecules called free radicals. This is good, since free radicals are believed by many scientists to bring about cancer, blood vessel disease, and other serious ailments.

The Biggie: Coffee and Liver Health

From Pot to Cup. Source: Pixabay.

Perhaps coffee’s biggest claim to fame is its association with liver health. Marc Gunter, head of a recent large-scale European study noted by National Geographic, has stated that coffee drinking is linked to good health in the liver and circulatory system. He also says it can account for lower inflammation levels in those who drink it compared to those who don’t.

The study’s findings supply the strongest evidence to date for the healthful qualities of coffee. Gunter has told the scientific community and the public that he plans to examine the beverage’s chemical compounds in an attempt to learn what makes it healthful.

We have actually known for several years that coffee can aid in liver conditions. For instance, it was found that consuming three cups of coffee on a daily basis reduced the chances of getting liver cancer by 50%! Decaf also decreases the levels of enzymes found in the liver, so caffeine is not always the prime healthy component of coffee. Drinking the beverage frequently has also been associated with a decreased risk of primary sclerosing cholangitis (PSC), a rare disease affecting the liver’s bile ducts.

As we’ve seen, coffee has quite a few benefits when drunk regularly and moderately. The important thing to recognize now is that many more specific studies need to be done on coffee itself and how it relates to treating various illnesses.

Women in STEM.

More gender equal countries have fewer women in STEM, paradoxically

Puzzlingly, women in countries with greater gender equality are less likely to take degrees in science, technology, engineering, and mathematics (STEM). New research delves into this ‘gender equality paradox’.

Women in STEM.

Image credits Eryk Salvaggio.

You’d expect countries which make it harder for women to carve their own path in life to have fewer women involved in STEM fields — however, that is not the case. It’s actually quite the opposite: countries like Algeria or Albania have a greater percentage of women among their STEM graduates (and a greater share of their total female population graduating in STEM) than Finland, Norway, or Sweden.

The STEM of the issue

Researchers from Leeds Beckett University in the UK and the University of Missouri in the USA wondered what’s up and set out to investigate. Their working hypothesis was that this divide stems from the poorer quality of life in countries with lower equality, which often have little welfare support, making STEM careers (which generally offer better-paid jobs) more attractive to the women who live there. The team also looked at what factors motivate boys and girls to choose STEM subjects, including overall ability, whether or not science subjects were a personal academic strength, and personal interest or sheer enjoyment of the topic.

The data used in the study was drawn from 475,000 teenagers across 67 countries and regions. Boys and girls had overall similar achievement levels in STEM fields; however, science was more likely to be the best subject for boys. Girls, even in cases where their ability and achievements in science were comparable to or greater than those of boys, were more likely to be better overall at reading comprehension, which is more closely tied to non-STEM subjects. Girls, overall, also tended not to be as interested in science subjects as boys. The authors note that these differences were near-universal across all the countries and regions in their analysis.

So on the one hand, girls generally tend not to care about science as much as boys do; on the other, they’re also, generally speaking, likely to be better than boys at non-STEM-related skills. According to first author Gijsbert Stoet from LBU, this already explains some of the gender disparity we see in STEM participation.

“The further you get in secondary and then higher education, the more subjects you need to drop until you end with just one. We are inclined to choose what we are best at and also enjoy. This makes sense and matches common school advice.”

“So, even though girls can match boys in terms of how well they do at science and mathematics in school, if those aren’t their best subjects and they are less interested in them, then they’re likely to choose to study something else.”

And it makes sense; with limited resources (both financial and time-wise) to invest in education, we all want to go for something we both like and are good at. According to these findings, girls by and large seem to be naturally better at non-STEM-related tasks. I’m not saying they’re not good at STEM-related skills, and the authors aren’t either — it’s just that they’re even better at something else.

Where gender equality comes in

Bathroom sign.

Image via provera250.

That explanation, however, only tells part of the story. STEM fields, after all, tend to be the better-paying ones, and that’s certainly a powerful motivator when deciding on a career path. So, based on the criteria I’ve listed above, the team looked at how many girls could be expected to study in STEM fields. They took the number of girls in each country who had the necessary ability in STEM and for whom it was also their best subject, and compared it to the number of women actually graduating in STEM.

All things considered, they report that every country had a disparity between those two figures; however, the more gender-equal countries had the widest gaps. In the UK, for example, 29% of STEM graduates are female, whereas 48% of girls might be expected to take those subjects based on science ability alone, and 39% could be expected to do so once both ability and interest are factored in.
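
The gap the researchers measure boils down to simple arithmetic: the share of girls who could be expected to pick STEM (by ability alone, or by ability plus interest) minus the share who actually graduate in it. A minimal sketch of that comparison using the UK figures above (the variable names are mine, not the paper’s):

```python
# Expected vs. actual female STEM participation, UK figures from the study.
expected_by_ability = 0.48    # girls for whom science is their best subject
expected_by_interest = 0.39   # ...who are also interested in the subject
actual_graduates = 0.29       # actual share of female STEM graduates

# The "lost girls": those who fit the STEM profile but graduate elsewhere.
gap_ability = expected_by_ability - actual_graduates
gap_interest = expected_by_interest - actual_graduates

print(f"Gap vs. ability alone:      {gap_ability:.0%}")   # 19%
print(f"Gap vs. ability + interest: {gap_interest:.0%}")  # 10%
```

Even by the stricter ability-plus-interest yardstick, roughly one in ten British girls who fit the STEM profile ends up graduating in something else.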

“Although countries with greater gender equality tend to be those where women are actively encouraged to participate in STEM, they lose more girls from an academic STEM track who might otherwise choose it, based on their personal academic strengths,” says co-author Professor David Geary, UoM.

“Broader economic factors appear to contribute to the higher participation of women in STEM in countries with low gender equality and the lower participation in gender-equal countries.”

Using the UNESCO overall life satisfaction (OLS) figures as a stand-in for economic opportunity and hardship, the researchers found that in more gender-equal countries, overall life satisfaction was higher. The team reports that STEM careers are generally more secure and better paid than the alternatives. However, in countries where any choice of career feels relatively safe (i.e. richer countries, which tend to be more gender-equal), women may put more emphasis on non-economic factors, such as personal preference, over economic factors, such as pay. Sex differences in academic strengths and interests would thus factor much more into women’s college and career choices in a more gender-equal country, Geary adds.

The findings could help guide efforts of getting more women into STEM, where their presence has remained broadly stable for decades despite efforts to increase participation.

“It’s important to take into account that girls are choosing not to study STEM for what they feel are valid reasons, so campaigns that target all girls may be a waste of energy and resources,” adds Professor Stoet.

“If governments want to increase women’s participation in STEM, a more effective strategy might be to target the girls who are clearly being ‘lost’ from the STEM pathway: those for whom science and maths are their best subjects and who enjoy it but still don’t choose it. If we can understand their motivations, then interventions can be designed to help them change their minds.”

The paper “The Gender-Equality Paradox in Science, Technology, Engineering, and Mathematics Education” has been published in the journal Psychological Science.


US Senate says White House’s proposed DOE budget cuts are “short-sighted,” increases funding instead

Senate budget makers shut down the DOE research budget cuts proposed by the White House in May and left no room for interpretation as to why.

“The Committee definitively rejects this short-sighted proposal, and instead increases investment in this transformational program and directs the Department to continue to spend funds provided on research and development and program direction,” the Senate appropriations committee wrote in a report penned alongside a bill funding the Department of Energy.

Origami dollar.

Money well spent.
Image credits lukaswafl / Pixabay

We all know by now that the current U.S. administration has a bone to pick with certain fields of scientific pursuit. Back in May, that grudge materialized as some very drastic cuts to several of the Department of Energy’s (DOE’s) basic and applied research programs. Among the White House’s list of undesirables for the new fiscal year (beginning October 1), one could find the Advanced Research Projects Agency-Energy (ARPA-E), an 8-year-old agency that works to turn the most promising ideas from basic research into workable energy technologies. ARPA-E was earmarked for complete shutdown under the new budget proposal.

That proposal went through the House of Representatives, but Senate appropriators shot it down as soon as they saw it. “The Committee definitively rejects this short-sighted proposal,” they write in their report, and would actually increase ARPA-E’s budget by 8% (to US$330 million). The report also explicitly forbids the DOE from using monetary constraints to shut down the program.

Under the White House proposal, the DOE’s Office of Science would also have seen its biological and environmental research (BER) budget gutted by 43% (down to US$349 million). Here too, the appropriations committee “rejects the short-sighted reductions proposed in the budget request” and would instead see BER funding increased by 3% (to US$630 million).
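
As a quick sanity check (my arithmetic, not the committee’s), the two percentages quoted above should point back to the same current BER baseline — and they do:

```python
# Cross-check the BER figures: a 43% cut landing at US$349M and a 3% raise
# landing at US$630M should both imply the same current baseline budget.
proposed_cut = 349e6         # White House proposal: 43% below baseline
senate_increase = 630e6      # Senate proposal: 3% above baseline

baseline_from_cut = proposed_cut / (1 - 0.43)
baseline_from_increase = senate_increase / (1 + 0.03)

print(f"{baseline_from_cut / 1e6:.0f}M vs {baseline_from_increase / 1e6:.0f}M")
# both work out to roughly US$612 million, so the quoted figures are consistent
```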

Some areas, however, will see cuts. The DOE Office of Energy Efficiency & Renewable Energy’s applied research budget will see a 7% cut compared to last year, to US$1.937 billion. Still, that’s far more than the US$636 million the White House proposed. The Senate appropriations subcommittee on energy and water development would also cut fusion energy R&D by 39% (to US$232 million) and discontinue the US’ involvement in ITER, the international fusion project currently under construction in France.

Still, it’s good to see that the US will not readily surrender its long-term pursuit of science (especially pertaining to environmental sciences, which are now more desperately needed than ever) to appease a passing administration.

You can read the full report on the Senate’s appropriation committee’s website.


German researchers release open-source tomato and wheat seeds to boost research

Breeders from Göttingen University and the Dottenfelderhof agricultural school in Bad Vilbel, Germany, have released new varieties of tomato and wheat seeds. The catch? They’re free for anyone to use, forever, as long as the products of that work also remain free to use. In essence, these are open-source seeds.

Lotus seeds.

Image credits Nam Nguyen.

I think we’ve all, at one point or another, had to butt heads with the sprawling world of intellectual property and copyright licensing. That being said, I don’t think many of us imagined that licensing is a problem farmers and plant breeders also have to face — but they do. Feeling that this practice has gone beyond doing good and is actually stifling progress (both scientifically, and morally in areas where food insecurity is still high), German scientists have created new varieties of tomato and wheat plants whose seeds are now freely available for use under an open-source license.

The move follows similar initiatives to share plant material in India and the United States, but it’s the first to actually extend the legal framework to all future descendants of the varieties.

So why would you make seeds open source? Well, the idea is that scientists and breeders can experiment with these seeds to improve the varieties or create new ones altogether without having to worry about the legal department suing them back into the stone age. And in that respect, it’s a gift that keeps on giving. According to Johannes Kotschi, an agricultural scientist who helped write the license last year, the license “says that you can use the seed in multiple ways but you are not allowed to put a plant variety protection or patent on this seed and all the successive developments of this seed.” Kotschi manages OpenSourceSeeds for the nonprofit Agrecol in Marburg, Germany, which announced the tomato and wheat licensing in Berlin back in April.

The open source seeds have had a very positive reception. Since their announcement, other universities, nonprofits, as well as organic breeders have expressed an interest in releasing open-source licenses for their hop, potato, and tomato varieties, and Kotschi’s tomato seeds have been in great demand.

Why open-source

For most of human history, seeds have, obviously, been open-source. With no system in place to enforce copyright claims or penalize infringement, farmers could use and improve any variety of plant to suit their needs. This freedom allowed for the crops we know today — those with ample yields, drought- and pest-resistance, better taste, shorter growing times, and so on. And sometimes, breeders just got lucky.

But in the 1930s, the United States began applying patent law to plants, and soon everyone was doing it — farmers and breeders couldn’t claim a variety as their own, and even risked legal action for working with a claimed crop. The problem deepened further as a slew of additional measures, including patents and a special intellectual property system for crops called “plant variety protection,” made it into legislation. As companies merge, these patents and plant intellectual property rights become concentrated in an ever-smaller number of legal entities.

Some progress was made on plant variety protection, with international agreements allowing an exception from the intellectual property rules for research and breeding. But there’s no such system in place for patents, and scientists aren’t allowed to use patented plants for breeding or research purposes.

The German open-source seeds solve these problems by allowing anyone to use the varieties as long as any derivatives (offspring) remain in the public domain. However, there is some concern that a complete shift to an open-source system would harm innovation, as commercial breeders (who are the main source of new varieties) and universities wouldn’t be able to collect royalties from their work. As with most areas of life, balance is key to solving the issue.

For now, governments will likely keep an eye on how the seeds impact existing systems.

Public is skeptical of all research tied to a company, new study shows

A new study has revealed that at least when it comes to health risks or medicine, most people don’t believe studies associated with an industrial partner, even one with a good reputation.

No one really loves corporations, however, they do play a vital role in society — and in science. But at what cost? Image credits: takomabibelot.

In the past couple of years, we’ve seen a disturbing trend of anti-intellectualism. People don’t believe the experts, they don’t trust science, and they often take their news and information from clickbait Facebook posts or articles. Science isn’t really quick to react and scientists rarely aim to grab your attention with catchy headlines, so this problem is likely going to stick with us for a long time. However, if there is something scientists are good at, it’s figuring stuff out — and they recently showed that one of the mechanisms that erodes trust in science is partnership with industry.

It doesn’t take a genius to realize that most people dislike big companies, but the effect through which this dislike carries onto science is still not properly explored. Many health studies have a corporate partner or involve some kind of drug or treatment method developed by a corporation; how impactful are these associations?

“People have a hard time seeing research related to health risks as legitimate if done with a corporate partner,” said John Besley, lead author and an associate professor who studies the public’s perception of science. “This initial study was meant to understand the scope of the problem. Our long-term goal though is to develop a set of principles so that quality research that’s tied to a company will be better perceived by the public.”

In Besley’s study, participants were randomly assigned to evaluate one of 15 scenarios involving various partnerships between scientists from a university, a government agency, a non-governmental organization, and a large food company. Basically, participants were presented with the same study on genetically modified foods and trans fats, but with different partnerships attributed to its authors.

The results clearly showed that people tended to dislike and distrust the science when the food company was involved. In fact, 77 percent of participants had something negative to say about this association and questioned the quality of the results. By contrast, only 28 percent of participants said something negative when a corporate partner wasn’t present. Additional partners, even reliable ones such as the Centers for Disease Control and Prevention, didn’t change these figures significantly.

What this tells us is pretty simple: even if you do some quality science, there’s a good chance people won’t believe you because you got money from a company. This is understandable to some extent and you’d be tempted to say — “OK, scientists simply shouldn’t partner up with corporations” and that’s that. But then… where are you supposed to get funding money from? In the US, the funding leash is getting shorter and shorter, and there’s virtually no branch of science which isn’t getting significant funding from industry. Much of the science happening today is also trans-disciplinary and benefits from multiple actors involved. The study explains:

“University scientists conducting research on topics of potential health concern often want to partner with a range of actors, including government entities, non-governmental organizations, and private enterprises. Such partnerships can provide access to needed resources, including funding. However, those who observe the results of such partnerships may judge those results based on who is involved.”

So you’re stuck between a rock and a hard place — either risk the public not believing in your research or just never get the money you need in the first place. It’s a challenging time to be a researcher.

“Ultimately, the hope is to find some way to ensure quality research isn’t rejected just because of who is involved,” Besley said. “But for now, it looks like it may take a lot of work by scientists who want to use corporate resources for their studies to convince others that such ties aren’t affecting the quality of their research.”

Journal Reference: John C. Besley , Aaron M. McCright, Nagwan R. Zahry, Kevin C. Elliott, Norbert E. Kaminski, Joseph D. Martin — Perceived conflict of interest in health science partnerships. https://doi.org/10.1371/journal.pone.0175643


This year’s 22 NASA Innovative Advanced Concepts are out of this world

Yesterday NASA released the list of its 2017 Innovative Advanced Concepts (NIAC) projects, showing what we should expect for the future of space exploration.

ISS.

Image credits Skeeze / Pixabay.

There is one program run by NASA whose mission can best be summed up as “think of something that would probably be possible in a sci-fi book set about 50 years in the future — now let’s make it happen.” It’s named the NASA Innovative Advanced Concepts, and it is the squared (maybe even cubed) crème de la crème of scientific pursuit in space.

The current program has been ongoing since 2011, although there was a previous, slightly-different-but-basically-the-same NIAC in operation from 1998 to 2007. During this time, NIAC has awarded millions of dollars for ideas that could have a long-term and far-reaching impact on space exploration by “creating breakthroughs, radically better or entirely new aerospace concepts.” The program works in phases of funding, with Phase I projects receiving an estimated US$125,000 over nine months to “support initial definition and analysis of their concepts.” If Phase I shows the idea is feasible and will benefit our forays into space, it becomes eligible for Phase II funding — worth as much as US$500,000 for two-year studies.

Yesterday, the agency released the latest list of approved Phase I and Phase II grants, and boy oh boy, we’re talking about some really heavy ideas here. Among them, there’s a proposal for an artificial gravity device intended for deep-space missions, swarms of softbots to pick apart asteroids, a self-powered Venus probe and, I kid you not — “Solar Surfing.” Any one of these paper titles would feel at home on the cover of the next big sci-fi book and could make a space buff’s glasses foggy with excitement.

So sit back and let your imagination fly to a time when these ideas will be zooming around in the skies above us.

[panel style=”panel-success” title=”Phase I projects:” footer=””]

  • A Synthetic Biology Architecture to Detoxify and Enrich Mars Soil for Agriculture  — Adam Arkin, University of California, Berkeley.
  • A Breakthrough Propulsion Architecture for Interstellar Precursor Missions — John Brophy, NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, California.
  • Evacuated Airship for Mars Missions — John-Paul Clarke, Georgia Institute of Technology in Atlanta.
  • Mach Effects for In Space Propulsion: Interstellar Mission — Heidi Fearn, Space Studies Institute in Mojave, California.
  • Pluto Hop, Skip, and Jump — Benjamin Goldman, Global Aerospace Corporation in Irwindale, California.
  • Turbolift — Jason Gruber, Innovative Medical Solutions Group in Tampa, Florida.
  • Phobos L1 Operational Tether Experiment — Kevin Kempton, NASA’s Langley Research Center in Hampton, Virginia.
  • Gradient Field Imploding Liner Fusion Propulsion System — Michael LaPointe, NASA’s Marshall Space Flight Center in Huntsville, Alabama.
  • Massively Expanded NEA Accessibility via Microwave-Sintered Aerobrakes — John Lewis, Deep Space Industries, Inc., in Moffett Field, California.
  • Dismantling Rubble Pile Asteroids with Area-of-Effect Soft-bots — Jay McMahon, University of Colorado, Boulder.
  • Continuous Electrode Inertial Electrostatic Confinement Fusion — Raymond Sedwick, University of Maryland, College Park.
  • Sutter: Breakthrough Telescope Innovation for Asteroid Survey Missions to Start a Gold Rush in Space — Joel Sercel, TransAstra in Lake View Terrace, California.
  • Direct Multipixel Imaging and Spectroscopy of an Exoplanet with a Solar Gravity Lens Mission — Slava Turyshev, JPL.
  • Solar Surfing — Robert Youngquist, NASA’s Kennedy Space Center in Florida.
  • A Direct Probe of Dark Energy Interactions with a Solar System Laboratory — Nan Yu, JPL.

[/panel]

“The 2017 NIAC Phase I competition has resulted in an excellent set of studies. All of the final candidates were outstanding,” said Jason Derleth, NIAC program executive.

“We look forward to seeing how each new study will expand how we explore the universe.”

[panel style=”panel-info” title=”Phase II projects:” footer=””]

  • Venus Interior Probe Using In-situ Power and Propulsion, Ratnakumar Bugga, JPL.
  • Remote Laser Evaporative Molecular Absorption Spectroscopy Sensor System, Gary Hughes, California Polytechnic State University in San Luis Obispo.
  • Brane Craft Phase II, Siegfried Janson, The Aerospace Corporation in El Segundo, California.
  • Stellar Echo Imaging of Exoplanets, Chris Mann, Nanohmics, Inc., Austin, Texas.
  • Automaton Rover for Extreme Environments, Jonathan Sauder, JPL.
  • Optical Mining of Asteroids, Moons, and Planets to Enable Sustainable Human Exploration and Space Industrialization, Joel Sercel, TransAstra Corp.
  • Fusion-Enabled Pluto Orbiter and Lander, Stephanie Thomas, Princeton Satellite Systems, Inc., in Plainsboro, New Jersey.

[/panel]

“Phase II studies can accomplish a great deal in their two years with NIAC. It is always wonderful to see how our Fellows plan to excel,” said Derleth.

“The 2017 NIAC Phase II studies are exciting, and it is wonderful to be able to welcome these innovators back in to the program. Hopefully, they will all go on to do what NIAC does best – change the possible.”

All the projects listed here have been evaluated for technical viability and innovativeness through a peer-review process. They are still in early development and most will require 10 or more years of work before being ready for field testing.

Dank science: Jerusalem University launches marijuana study center

The world’s first multidisciplinary center for marijuana has been opened at the Hebrew University in Jerusalem, with the purpose of exploring the plant’s therapeutic potential.

Image via Pixabay.

The medicinal properties of cannabis have been touted for a long time, and yet we know surprisingly little about what cannabis actually does inside the body. Up until a few decades ago, it was classified merely as a recreational drug, but recent research has shown that the hotly-debated plant might have a lot of potential — potential that is hard to explore fully due to laws and regulations. Getting all the required paperwork for cannabis studies is insanely difficult in most parts of the world, and obtaining funding for such a study is simply an ordeal. But marijuana research just got a big boost with the new (and first) center dedicated to it.

“There is so much interest in cannabis at the moment, but a lot remains unknown about its mechanism of action,” Dr. Joseph Tam, the director of the center, told JTA. “My belief is that our multidisciplinary center will lead global research and answer these questions.”

Tam’s team encompasses researchers working in agriculture, chemistry, drug delivery, pharmacology, and chemical development — all the areas connected to medicinal marijuana. They will analyze all aspects related to marijuana, from growing, to harvest, to drug development and biophysical interactions with the body. They will focus especially on cancer, pain, inflammation, immunity, metabolism, and stress management. For starters, there will be a total of 27 full-time researchers working on the project, but the center hopes to draw more specialists in time, including in the fields of nanotechnology, pain science, and brain science. They also have plans to collaborate with scientists and biotech companies around the world.

“We feel incredibly fortunate to team up with a vast number of scientists working together on this expanding field of medicine with the significant potential to discover new therapies based on cannabinoids,” Tam said.

It makes a lot of sense for this to happen in Israel, as the country has long been a pioneer in cannabis research. In 1964, Raphael Mechoulam basically kickstarted the field when he identified THC as the main psychoactive constituent of cannabis. He then went on to identify the endocannabinoid system through which cannabinoids act on the body and, at 86 years old, he is still an active researcher and, of course, a part of this new center.

The favorable political climate in Israel also helps. Unlike most countries, Israel significantly invests in this type of research and backed legislation to decriminalize recreational marijuana use. However, this won’t affect Tam, who claims to have never smoked a joint in his life.

PhD students are 2.5 times more likely to develop psychiatric disorders than highly educated general population

Science has confirmed what PhD students worldwide already knew.

Doctor of Philosophy (PhD) is the highest academic recognition you can get, but it doesn’t always come with the perks you might expect. Recognition — not so much. Money — way less than the industry. Mental satisfaction — well about that… A new study conducted by Belgian researchers found that PhD students are much more prone to developing mental disorders, even when compared to similarly educated groups.

In this day and age, being a PhD student is extremely stressful. I can’t imagine it being too mellow at any point in history, but nowadays, it’s probably worse than ever. You’re in a “publish or perish” environment, where you need to present deliverables constantly to a committee which might have different ideas from your own. You’re working on a research project with (almost always) limited-duration funding, but you’re never sure when you’ll cross the finish line. This, in turn, means you often feel like you’re not making any progress, or that you’re lacking a breakthrough moment. There’s also an overwhelming feeling that even if you do manage to submit and publish your work, it will largely be ignored or be completely insignificant.

The fact that you’re living at or even below the poverty line (you’re usually banned from outside work) and sometimes working insane hours doesn’t help. You’re also subjected to a professor (or a few professors) who, in theory, guide and support you — but the practice might be very different. In the book ‘A Survival Kit for Doctoral Students and Their Supervisors,’ the authors describe several issues in which the views of the PhD student and those of the committee or guiding professor can clash, with dramatic consequences for the student’s mental state. This new study seems not only to back that idea up, but also to put some figures on an extremely worrying trend. Here are its main findings:

  • One in two PhD students experiences psychological distress; one in three is at risk of a common psychiatric disorder.
  • The prevalence of mental health problems is higher in PhD students than in the highly educated general population, highly educated employees, and higher education students.
  • Work and organizational context are significant predictors of PhD students’ mental health.

They identify several factors that can negatively affect the student’s mental state: work-family interface, job demands and job control, the supervisor’s leadership style, team decision-making culture, and perception of a career outside academia. Speaking from personal experience, these feelings seem echoed by most (if not all) of the PhD student community. Here’s what one Reddit commenter, Gibbie99, said about his PhD:

“Neither cared about my mental health, or made much of any attempt to mentor. Instead we were left to grinding away in the lab every day with little interaction, beyond the mandated 6 month check-in with the committee/advisor.”

Sure, you could argue that this comment comes from a random person on a Reddit thread and doesn’t carry much weight, but if you have a look at PhD forums or engage with the community in any way, the odds are that what you find will shock you. Most people feel that their work doesn’t matter, or at the very least that their professor/committee doesn’t care about it. Basically, we’re taking some of the world’s brightest minds, overworking and underpaying them, and filling them with a sense of meaninglessness. It’s hard to imagine a world where they don’t suffer more from psychiatric conditions. However, the fact that work and organizational context can be used to predict mental health is significant — and it’s perhaps something universities and research institutes could pay more attention to.

Journal Reference: Katia Levecquea, Frederik Anseela, Alain De Beuckelaerd, Johan Van der Heydenf, Lydia Gislef — Work organization and mental health problems in PhD students. http://dx.doi.org/10.1016/j.respol.2017.02.008

 

People are more willing to accept embryonic stem cell research than politicians

As stem cell research continues to be a very divisive topic, a new study has revealed that the general public is much more willing to accept it than politicians.

Image in Public Domain.

The Swiss are very liberal with their referendums — they hold quite a few every year. This offers a unique and direct perspective on how voters think about a variety of topics, including (in this case) embryonic stem cell research. The researchers found that people were much more willing to accept the research than politicians were.

“By analysing the outcomes of a referendum on a liberal new bill regulating such research, we reveal an about 10 percentage point lower conditional probability of the bill being accepted by politicians than by voters,” the study reads.

The motivations of the two groups are also quite different. For politicians, it’s all about party politics, whereas ordinary citizens are swayed by other factors.

“Whereas the behaviour of politicians is driven almost entirely by party affiliation, citizen votes are driven not only by party attachment but also by church attendance.”

Old and new

If you’ve kept up with stem cell research, you’re probably wondering why we’re talking about embryonic stem cell research (taking stem cells from embryos) when the science world is moving toward different types of stem cells, most notably induced pluripotent stem cells, which can be generated directly from adult cells. But this study analyzed the 2004 referendum, when embryonic stem cells stirred heated debate, and its conclusions are more about social science than biology.

The study showed that citizens care whether scientists are trustworthy, act transparently, and serve the public interest. Even scientists themselves have asked that journal editors and funding agencies enforce high standards of ethics. However, politicians play a different ball game. For them, it’s all about affiliation and, although this study doesn’t address it, personal interests.

“According to our findings, in this environment, citizens are more likely than politicians to favour embryonic stem cell research, suggesting that social discussion may help bring about agreement on shared principles, professional norms, and procedural conditions related to stem cell research. Citizen involvement through direct democracy might thus provide a way to bridge polarization in the stem cell debate.”

But as we all know, direct democracy is tricky and referendums can backfire dramatically. The key here is an informed population. People care about science and want it held to a good standard, but they need to be accurately informed about the vote they are about to cast.

“Because of the high level of direct democracy in Switzerland, its citizens are generally well informed about upcoming referenda through intense public discourse and official booklets. These latter, which include the exact text of the legislative paragraphs to be modified or introduced into the law or constitution, provide objective information on the referendum issue.”

Journal Reference: David Stadelmann, Benno Torgler — Voting on Embryonic Stem Cell Research: Citizens More Supportive than Politicians. http://dx.doi.org/10.1371/journal.pone.0170656

 

British research vessel gets named “Boaty McBoatFace” following an online poll

The world has spoken and the vote has been cast: people want to name the new British Antarctic research ship “Boaty McBoatface”. Vox populi, vox Dei!

The boat that could become Boaty McBoatface.

It all started last month, when the Natural Environment Research Council (NERC) launched its Twitter #nameourship campaign to name its new $287 million research ship. The internet did what the internet does and came up with names such as “Usain Boat,” “Boatimus Prime,” “It’s bloody cold here,” “Ice Ice Baby,” and “Notthetitanic,” but one name shone bright above them all: Boaty McBoatface. It drew 124,109 votes, nearly four times as many as the runner-up, as clear a winner as any.

However, the final decision still lies in the hands of NERC and its chief executive, Duncan Wingham. They have been quite evasive about their decision, writing in a statement:

“NERC would like to thank everyone who has supported our campaign to name the UK’s next world-class polar research ship. NERC will now review all of the suggested names and the final decision for the name will be announced in due course.”

NERC has a long history of naming ships after explorers and naval officers, and that is probably what it was going for this time. We’ll just have to wait and see whether they’ll follow what the people want or go for a more classical approach. But at this point, you just have to follow the poll: it’s too awesome to ignore.

The top ten suggestions are:

  • Boaty McBoatface – 124,109 votes

  • Poppy-Mai – 34,371 votes

  • Henry Worsley – 15,231 votes

  • It’s bloody cold here – 10,679 votes

  • David Attenborough – 10,284 votes

  • Usain Boat – 8,710 votes

  • Boatimus Prime – 8,365 votes

  • Katherine Giles – 7,567 votes

  • Catalina de Aragon – 6,826 votes

  • I like big boats & I cannot lie – 6,452 votes

Your microbial cloud is your “signature”

New research focused on the personal microbial cloud. (Credit: Viputheshwar Sitaraman, of Draw Science)


Humans are walking ecosystems. Each of us carries around about 100 trillion microbes in and on our bodies, which make up our microbiome. The quality of this bacterial community has a lot to say about our health and well-being. The blend of microbes is also surprisingly unique, which says a lot about who we are as individuals.

New research published September 22 in PeerJ has found that people can be identified by the nature of the microbial cloud that they release into the air around them. We each have our own microbial “signature.”

A not-so-empty room

Scientists have already amassed plenty of information about the human microbiome and they already know that people disperse some of those bacteria to their environments. These microbes come primarily from dust, our clothing, and our bodies.

Two new experiments, conducted at the University of Oregon, investigated the individual nature of these bacterial clouds.

The first experiment was designed to test whether researchers could confirm the presence of a person based only on bacterial traces. The researchers asked study participants to sit alone in a sanitized chamber that was filled with filtered air. A second, unoccupied chamber was used as a sterile control.

Each participant was given a clean outfit to wear to reduce the number of particles coming from clothing. Participants also sat in a plastic chair that had been disinfected and were given a disinfected laptop to use for communication and for personal entertainment during the study.

The experiment involved three participants, each tested for a total of six hours. Air in the test chamber was compared to the air in the unoccupied chamber. Any particles that came from a participant were filtered out of the chamber air and genetically sequenced to identify the mix of bacteria. The analyses involved thousands of different bacteria types in over 300 air and dust samples.

The researchers were able to determine that a person was present in the chamber after two hours, based only on the presence of bacteria in the air samples. More strikingly, they found that they could distinguish one person from another by the unique combination of bacteria each participant released.

This result motivated a second, more precise experiment.

Eight new people were asked to sit alone in the chamber for two 90-minute sessions. Analyzing the bacteria in the air revealed several individual features of each participant, including whether the person was male or female.

“We expected that we would be able to detect the human microbiome in the air around a person,” said lead author James Meadow, a postdoctoral researcher with the Biology and the Built Environment Center at the University of Oregon at the time of the study. “But we were surprised to find that we could identify most of the occupants just by sampling their microbial cloud.”

A few bacterial groups, such as Streptococcus (commonly found in the mouth) and Propionibacterium and Corynebacterium (found on the skin), were primary indicators in the study. While these microbes are common in humans, it was the different combinations of these bacterial populations that distinguished one individual from another.
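The matching idea can be sketched in a few lines: represent each person’s cloud as a profile of relative abundances for these genera, then assign a new air sample to the known profile it most resembles. This is only an illustration of the concept — the genus names come from the study, but the abundance values, the `identify` helper, and the use of cosine similarity are invented for this sketch, not taken from the paper’s actual analysis.

```python
import math

# Hypothetical relative-abundance profiles (fractions of sequenced reads)
# for three genera named in the study; the numbers are invented.
GENERA = ["Streptococcus", "Propionibacterium", "Corynebacterium"]
known_clouds = {
    "person_A": [0.50, 0.30, 0.20],
    "person_B": [0.20, 0.50, 0.30],
    "person_C": [0.33, 0.33, 0.34],
}

def cosine(u, v):
    """Cosine similarity between two abundance profiles (1.0 = identical mix)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def identify(sample):
    """Return the known occupant whose profile best matches an air sample."""
    return max(known_clouds, key=lambda name: cosine(known_clouds[name], sample))

# A noisy air sample from person A's chamber session: the mix, not any
# single genus, is what points back to the right person.
print(identify([0.52, 0.28, 0.20]))  # person_A
```

Note that all three profiles contain the same genera; only the proportions differ, which mirrors the study’s finding that the combination, not the mere presence, of common microbes is what identifies a person.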

Subtle differences were also found in the microbial clouds. Some people, for example, gave off different amounts of bacteria to the air due to such personal habits as how much they scratched and how much they fidgeted.

Tracking individual biology

The demonstration that bacteria clouds can be traced to individuals could shed light on how infectious diseases spread in buildings.

The results could also have a forensic use. Bacterial residue in the air might, for example, be used to determine where a person has been, even after they’ve left a space. The contributions of other people – or even animals – in those spaces, however, could easily complicate any analysis, so forensic uses will likely require more research.

Many things are used to detect and identify us in modern life. Now we know that the bacteria on, in, and around each of us have a close connection to the places we occupy, can be detected even after we’re gone, and can be traced back to us.

Primary Source: New research finds that people emit their own personal microbial cloud


Can Hearing Aids Also Save Your Memory?


If you’ve ever lived with someone who’s hard of hearing, or have struggled with hearing difficulties yourself, you know it can be a hard condition to live with. Thankfully, modern technology has given us a wide variety of hearing implements that can let us salvage this precious physical sense.

But hearing loss can come with a number of related symptoms that are just as debilitating; those who live with moderate to severe hearing loss may also experience worsening memory and general mental confusion as their hearing slips away.

Scientists have believed for some time that these two conditions, while closely linked, would require different forms of treatment. But a recent study from Johns Hopkins has revealed that a familiar technology—hearing aids—may help to reverse the mental side-effects of hearing loss.

The impetus behind the study was the purported connection between hearing loss and an elevated risk of various cognitive problems. The study’s lead author, Jennifer Deal, has published previous work on this connection, but few studies have examined it in detail until now. She and her team are hopeful that their findings will determine once and for all whether hearing aids are capable of staving off or even reversing a trend of worsening cognitive function.

The answer, so far, is yes they can.

Deal hailed the team’s findings, saying: “This study is important because it focuses on a risk factor that is amenable to intervention in later life and could potentially postpone cognitive decline.”

Deal and her team worked with a total of 253 people for the study, with an average age of 77 years. All participants exhibited hearing loss described as “mild” to “moderate.” Each was given a series of memory, learning, and reasoning tests in the early 1990s. To truly test the long-term effects, the team waited until 2013 to administer a battery of follow-up tests.

And what did they find? It turns out that those participants who did not wear hearing aids in the intervening years posted the greatest drop in test scores. The ones who did wear hearing aids showed a slowing of their mental deterioration—a triumph both for this team of researchers and for modern hearing aid technology.

Most human beings know only too well that the gift of hearing is too easily squandered or damaged. Experiencing head trauma, listening to music too loud, getting clumsy on the Fourth of July, genetics, or merely growing older can all contribute to hearing loss later in life. But, while a person’s history and habits might give them some advanced warning that their hearing might one day be in danger, it can be tempting to ignore the problem—as well as the potential side-effects that come with it.

According to Dr. Alice Holmes, the study is important because it proves in a new way that hearing loss should not be ignored: “This shows we really should be treating hearing loss. I think treating the hearing loss can really improve someone’s overall quality of life and not only does it improve that quality of life but also others’ around him.”

So if you’re experiencing hearing loss, or know someone who is, now is the time to seek treatment. In addition to losing a part of your connection to the world around you, your mental health may suffer in some unexpected ways if you let your problem go untreated.

A Second Look at the Iceman – New discoveries motivate new analyses

Ötzi Reconstruction (© South Tyrol Museum of Archaeology – www.iceman.it)


Hikers discovered Ötzi the Iceman in the Ötztal Alps, on the border between Austria and Italy, in 1991. Forensic analysis showed that he died around 5,300 years ago, making his the oldest intact human body ever found. Ötzi had been preserved by glacier ice and was found with his tools, clothes, and weapons – a time capsule of Copper Age life. While years of study have enhanced our knowledge of his world, recent work with the Iceman has shown that he still holds a few mysteries for science.

Ötzi’s ailments

The Iceman’s body was found with a rich set of tattoos, something that had never been seen from this period before. Early studies revealed about 50 markings, made by rubbing soot into small skin incisions. The body had been darkened by centuries in the snow, however, so the tattoos had little contrast. Finding them visually was difficult and researchers couldn’t be sure that they had discovered all of them.

Initially, scientists believed that this body art was for decoration, but further study yielded a more intriguing theory: Investigators noted that the tattoos were placed over areas of physical wear such as the wrist, knees, and ankles, which meant that their purpose might have been pain relief. A medical study later pointed out that the tattoo locations matched known acupuncture meridians. If this theory were correct, the practice would predate the earliest recorded use of acupuncture in China by 2,000 years.

Although the idea was exciting, the evidence wasn’t definitive, and some researchers had their doubts. Therefore, scientists recently took another look at Ötzi to create a complete map of his tattoos. This time, they came armed with technology.

 

Looking deeper . . .

Ötzi Wrist Tattoo (© South Tyrol Museum of Archaeology/Eurac/Samadelli/Staschitz)

Scientists from the South Tyrol Museum of Archaeology and the Institute for Mummies and the Iceman (at the European Academy of Bozen) recently surveyed Ötzi using multispectral photographic tools.

Multispectral imaging records how materials reflect different wavelengths of light and combines those responses into a single, color-coded picture. Because materials differ in their reflectance properties, the method exposes and highlights fine contrasts that are hard to see by eye. It has been used by art restorers to find evidence of earlier paintings beneath a canvas, and by archaeologists to recover writing on charred papyri.

The camera used for the Iceman survey was fitted with filters that captured wavelengths from ultraviolet (300 nanometers) to infrared (1,000 nanometers). Human visual response, for comparison, spans about 400 – 700 nanometers.
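The core trick — stretch the contrast of each band independently, then map the bands onto the channels of one false-color image — can be illustrated with a toy sketch. Everything here is an assumption for illustration: the 4×4 “scans,” the reflectance values, and the choice of mapping IR/visible/UV to the red/green/blue channels are invented, not the survey team’s actual processing pipeline.

```python
import numpy as np

def stretch(band):
    """Linearly rescale one band to the 0-1 range (a simple contrast stretch)."""
    lo, hi = band.min(), band.max()
    if hi == lo:
        return np.zeros_like(band, dtype=float)
    return (band - lo) / (hi - lo)

def false_color(uv, vis, ir):
    """Map three spectral bands onto the RGB channels of one composite image.

    A mark that barely differs from skin in one band can stand out in
    another; stretching each band on its own maximizes that contrast
    before the bands are fused into a single color-coded picture.
    """
    return np.dstack([stretch(ir), stretch(vis), stretch(uv)])

# Toy 4x4 "scans": a tattoo-like mark in the center reflects almost like
# the surrounding skin in UV and visible light, but strongly absorbs IR.
uv  = np.full((4, 4), 0.80); uv[1:3, 1:3] = 0.78   # nearly invisible in UV
vis = np.full((4, 4), 0.60); vis[1:3, 1:3] = 0.55  # faint in visible light
ir  = np.full((4, 4), 0.50); ir[1:3, 1:3] = 0.10   # strong contrast in IR

composite = false_color(uv, vis, ir)
print(composite.shape)  # (4, 4, 3)
```

In the composite, the mark that was a 2% reflectance difference in UV becomes a full-range dark patch in the red channel, which is essentially why the survey could pick out soot tattoos on darkened skin.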

. . . yields new insights

After the survey, the tattoo count went up to 61, and the research team thinks that they’ve completed a full tally.

“It is an extraordinary result. Finally, we have been able to clarify many doubts on the existence of these tattoos,” said Marco Samadelli, of the South Tyrol Museum of Archaeology.

Ötzi Chest Tattoo (© South Tyrol Museum of Archaeology/Eurac/Samadelli/Staschitz)


The survey produced a surprise, however, when it revealed a new group of tattoos on the lower right side of Ötzi’s rib cage. Unlike the earlier tattoos, this group was in an area with no apparent physical ailment. So, did the markings represent medical treatments or not? Several possibilities still fit the pattern.

Ötzi might have suffered another condition that also resulted in chest pain, such as gallstones, colonic worms, or atherosclerosis (plaque in arteries). If this were the work of a Copper Age healer, then treating symptoms where they were reported would only be logical.

An unfinished story

Samadelli and his research team want to use the new mapping for fresh examinations into the purpose of the tattoos, including any relation to acupuncture points.

“Future comparative studies based on the known health problems of the Iceman as evidenced by radiological investigations and molecular studies, could help to find out whether the tattoos had a therapeutic, diagnostic or more symbolic background.”

The new imaging work has been published in the Journal of Cultural Heritage.

Analysis and interpretation of Ötzi’s tattoos will likely continue for a while, but at least the puzzle seems to have all its pieces now.

The Iceman has been teaching scientists about Copper Age life for almost 25 years, and new information is still coming to light.

Not bad for a chance discovery on a Tyrolean hike.


On Genetic Manipulation and the Government’s Role in Science


In an announcement that’s been a long time coming for science fiction fans, the White House has, for the first time, come out in support of a global moratorium on altering the human germline.

It’s a decision that has implications not just for this particular type of scientific inquiry, but also for the future of government involvement in science. With a number of scientific singularities marching ever closer, there are some fundamental questions that we need to start asking ourselves, and there’s never been a better time to do it.

An Ethical Quandary

The announcement comes after Chinese scientists announced that they had successfully modified the genetic code of human embryos. To be more precise, the researchers removed a mutated gene from the embryo using a process called CRISPR-Cas9. This is clearly a significant achievement, and one with far-reaching scientific potential. Even so, the news was tempered by warnings from fellow scientists and bioethicists from across the country about the possible ramifications of meddling with human DNA.

In answer, the White House voiced support for a temporary moratorium on the process.

A breakthrough like this has the potential to catch humanity totally off-guard. We’re still spinning our wheels in an attempt to reach a scientific consensus on the merits of genetically modified crops, for example, so the thought of altering a human being has understandably left more than a few people worried.

And why shouldn’t it? We’ve been watching the consequences of genetic engineering play out in science fiction stories for almost a hundred years already. From Brave New World to Star Trek, it’s clear that the issue is far from straightforward.

But let’s get back to this moratorium. It seems to be inspired by the fear of so-called “designer babies”—that is, the idea that we could alter the genetic code of a human embryo before birth, thereby tailoring our progeny according to our (probably superficial) desires.

The optimists, meanwhile, view this kind of manipulation as a way to proactively eliminate a number of heritable genetic disorders, but critics have always been quick to question whether the risks outweigh the potential benefits.

Following China’s triumph, the National Academy of Sciences and the National Academy of Medicine announced their intentions to meet this fall with a stable of other researchers and bioethicists. Their goal is to discuss at length the advantages, implications, and possible risks of gene-altering technologies.

The White House’s Office of Science and Technology Policy added its voice to the growing chorus of concern; in a statement on their website, spokesman John P. Holdren confirmed the administration’s stance that “altering the human germline for clinical purposes is a line that should not be crossed at this time.”

Nevertheless, the office also acknowledged that biotechnology has played a significant role in moving the human race forward. Their statement continued:

“The advances in health technology over the past century—vaccines, antibiotics, early disease diagnostics, and treatment for countless health conditions—have reduced infant mortality, extended life expectancy, and alleviated suffering for millions. But new technology also brings risks and ethical challenges that require careful consideration.”

The Role of Government in Science

I’m sure that this deliberate and measured approach will cause some to chafe at the characteristic slowness of the federal government, or bemoan another “government intrusion” into the lives and livelihoods of the American people. But for my part, I think this is the right move. Frankly, I’m pleased that the government is showing some initiative here and is aggressively getting ahead of the issue. It’s a rare day when any branch of the government takes action that’s both proactive and in line with common sense. Usually we can count on just one of these at a time.

But to address the question of whether or not the government should be dictating matters of scientific ethics, we must first address the question of whether or not the government should be funding scientific inquiry in the first place.

In a thoughtfully worded but ultimately flawed article for the Cato Institute, Terence Kealey argued in 1997 that the government should be taken out of the equation entirely. He points to industry magnates such as David Packard, John D. Rockefeller, and Howard Hughes, all of whom left significant fortunes to research foundations. His argument centers on the idea that, so long as private industry is enthusiastically and generously funding scientific research, we don’t need the government to contribute to the effort.

To that I say only this: shouldn’t we be throwing as much money as we can at the cause of technological progress? I, for one, am dismayed every time Congress cuts NASA’s funding, say, or passes yet another piece of legislation that denies either the legitimacy, or the power, of science.

Science is not a partisan issue, nor even a political one; it’s a human issue that benefits each of us in discrete and priceless ways.

Still, I’ll gladly admit that the private sector is doing some amazing things in the name of progress, often without a flow of money from federal sources. Elon Musk’s SpaceX and Tesla Motors are making great strides in space exploration and the automotive industry, and are sharing their innovations with the world by making their patents freely available to all. Meanwhile, Google is hastening the coming autonomous car revolution and even the long-stagnant home energy industry is making strides toward a bold and sustainable future, with companies from coast to coast bringing cleaner technologies like biodiesel and ethanol and solar to a wider audience than ever before.

But let’s get hypothetical. Where will this push for technological and scientific progress really take us? The answer may be found, albeit in its infancy, in Amsterdam. It’s long been understood that 3-D printing could represent the boldest step yet into an uncertain future: it has applications in medicine, manufacturing, agriculture, and a variety of other industries. And now a Netherlands-based company is putting together what might be one of the most important proofs of concept we’ve seen yet: a 3-D printed bridge built by autonomous “robot arms.”

This is significant for a number of reasons. First and foremost, it embodies a very modern fear: the further loss of jobs to automation. The calls for a universal basic income in certain parts of the world seem to herald this very future: a way to keep people financially intact even as the march of technology claims more and more decent-paying jobs.

This future will not be hypothetical for much longer; in the not-too-distant future, emerging technologies will eliminate just about every kind of physical job, and even some of the creative ones, too. What this means is that the world’s governments will not simply oversee or regulate science and technology, but in fact ensure that progress does not, paradoxically, prove to be the undoing of millions of families. Coming technologies will vastly outpace our already fertile imaginations and will fundamentally change life in this country and beyond. They will challenge the very idea of government in ways that no war, nor any other external force, ever has. They will force the State to become the first and last line of defense for when our reach finally exceeds our grasp: a role that no profit-based entity can (or should) be allowed to occupy.

We truly are living in unprecedented times, where not a week goes by where there’s not at least a whisper of some new technological or scientific breakthrough on the horizon. And just as a central government allows us access to shared benefits such as roads and national parks, so too can it ensure that each of us shares ownership in the fruits of our scientific labors, as it was always meant to be.

Image Credit: Thierry Ehrmann (via Flickr)