
Brain scans are saving convicted murderers from death row – but should they?

Over a decade ago, a brain-mapping technique known as a quantitative electroencephalogram (qEEG) was first used in a death penalty case, helping keep a convicted killer and serial child rapist off death row. It did so by persuading jurors that traumatic brain injury (TBI) had left him prone to impulsive violence.

In the years since, qEEG has remained in a strange stasis, inconsistently accepted in a small number of death penalty cases in the USA. In some trials, prosecutors fought it as junk science; in others, they raised no objections to the imaging, producing a body of case law built on sand. Still, this handful of test cases could signal a new era in which science helps bring the legal execution of humans to an end.

Quantifying criminal behavior to prevent it

As it stands, if science cannot quantify or explain an event or action, we are left with little more than conjecture. DNA evidence aside, isn’t that precisely what happens in a criminal court case? So why is it so hard to integrate verified neuroimaging into legal proceedings? Of course, one could make a solid argument that it would be easier to simply do away with barbaric death penalties and concentrate on stopping these awful crimes from occurring in the first place, but that is a different debate.

The problem is more complex than it seems. Neuroimaging could be used not just to exempt the mentally ill from the death penalty but also to explain horrendous crimes to the victims or their families. And just as crucial, could governments start implementing measures to prevent this type of criminal behavior using electrotherapy or counseling to ‘rectify’ abnormal brain patterns? This could lead down some very slippery slopes.

And it’s not just death row cases that are testing qEEG — nearly every injury lawsuit in the USA now includes a TBI claim. With magnetic resonance imaging (MRI) and computed tomography (CT) scans being generally expensive, lawyers are constantly seeking new ways to prove brain dysfunction. Readers should note that both of these neuroimaging techniques are viewed as more accurate than qEEG but can only provide a single, static image of the neurological condition – and thus no direct measurement of functional, ongoing brain activity.

In contrast, the cheaper and quicker qEEG purports to continuously monitor brain activity, helping diagnose many neurological conditions, and could one day flag those more inclined to violence — enabling early interventional therapy sessions and one-to-one help focused on preventing the problem.

But until society reaches that point, defense and human rights lawyers have been attempting to slowly phase out legal executions by using brain mapping to explain why their convicted clients may have committed these crimes — gradually shifting the focus from the consequences of mental illness and disorders to a deeper understanding of these conditions.

The sad case of Nikolas Cruz

But the questions surrounding this technology will soon be on trial again in the most high-profile death penalty case in decades: Florida v. Nikolas Cruz. On the afternoon of February 14, 2018, Cruz, then just 19 years old, opened fire on students and staff at Marjory Stoneman Douglas High School in Parkland. Now classed as the deadliest school shooting in the country’s history, the attack led the state to charge the former Stoneman Douglas student with the premeditated murder of 17 students and staff members and the attempted murder of a further 17 people.

With the sentencing expected in April 2022, Cruz’s defense lawyers have enlisted qEEG experts as part of their case to persuade jurors that brain defects should spare him the death penalty. The Broward State Attorney’s Office signaled in a court filing last month that it will challenge the technology and ask a judge to exclude the test results—not yet made public—from the case.

Cruz has already pleaded guilty to all charges; a jury will now decide whether to hand down the death penalty or life in prison.

According to a court document filed recently, Cruz’s defense team intends to ask the jury to consider mitigating factors. These include his tumultuous family life, a long history of mental health disorders, brain damage caused by his mother’s drug addiction, and claims that a trusted peer sexually abused him—all expected to be verified using qEEG.

After reading the flurry of news reports on the upcoming case, one can’t help but wonder why, even without the use of qEEG, someone with a record of mental health issues at only 19 years old should be on death row. And given that authorities and medical professionals were aware of Cruz’s problems, what failures in prevention led to him murdering 17 people? Have these even been addressed or corrected? Unlikely.

On a positive note, prosecutors in several US counties have not opposed brain mapping testimony in recent years. According to Dr. David Ross, CEO of NeuroPAs Global and a qEEG expert, the reason is that a growing body of scientific papers has validated the test’s reliability, helping the technique gain broader use in the diagnosis and treatment of cognitive disorders — even though courts are still debating its effectiveness. “It’s hard to argue it’s not a scientifically valid tool to explore brain function,” Ross stated in an interview with the Miami Herald.

What exactly is a quantitative electroencephalogram (qEEG)?

To explain what a qEEG is, you must first know what an electroencephalogram (EEG) does. An EEG records the electrical potential difference between pairs of electrodes placed on the scalp, providing the analog data that computerized qEEG analysis builds on. Multiple electrodes (generally more than 20) are connected in pairs to form various patterns called montages, resulting in a series of paired channels of EEG activity. The results appear as squiggly lines on paper — brain wave patterns that clinicians have used for decades to detect evidence of neurological problems.

More recently, trained professionals have computerized this data to create qEEG, translating raw EEG recordings with mathematical algorithms to analyze brainwave frequencies. Clinicians then compare this statistical analysis against a database of neurotypical brains to identify abnormal brain function — the kind that, in death row cases, is argued to underlie criminal behavior.
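As a rough illustration of the kind of computation involved, the sketch below estimates the power of a single EEG channel in the classic frequency bands and expresses each as a z-score against reference statistics. This is a minimal toy example: the band limits are the conventional ones, but the normative mean and standard deviation here are invented placeholders, whereas real qEEG systems use age-matched clinical databases and far more careful signal processing.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` within the [low, high) Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return power[mask].mean()

# Classic EEG frequency bands (Hz)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def qeeg_z_scores(signal, fs, norm_mean, norm_std):
    """Compare one channel's band powers against normative statistics.

    `norm_mean` and `norm_std` stand in for a normative database;
    clinical qEEG uses age-matched reference data, not these placeholders.
    """
    return {
        name: (band_power(signal, fs, lo, hi) - norm_mean[name]) / norm_std[name]
        for name, (lo, hi) in BANDS.items()
    }

# Synthetic demo: one second of a pure 10 Hz (alpha-band) oscillation
fs = 256
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 10 * t)
norm_mean = {b: 0.0 for b in BANDS}  # placeholder normative values
norm_std = {b: 1.0 for b in BANDS}
scores = qeeg_z_scores(signal, fs, norm_mean, norm_std)
```

Run on the synthetic signal, the alpha-band z-score dominates the others, which is the basic shape of a qEEG finding: a band whose power deviates sharply from the reference population.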

While this can be true, results can still go awry due to incorrect electrode placement, imaging artifacts, inadequate band filtering, drowsiness, comparisons against the wrong control database, and the choice of timeframes. Furthermore, processing can yield a large amount of clinically irrelevant data. These are some of the reasons the usefulness of qEEG remains controversial despite the volume of published research. However, many of these discrepancies can be avoided simply by having trained medical professionals operate the apparatus and interpret the data.

Just one case is disrupting the use of this novel technology

Yet, despite this easy correction, qEEG is not generally accepted by the relevant scientific community as a means of diagnosing traumatic brain injuries and is therefore inadmissible under Frye v. United States. Frye is an archaic case from 1923 concerning a polygraph test; the trial came a mere 17 years after Cajal and Golgi won a Nobel Prize for producing slides and hand-drawn pictures of neurons in the brain.

Experts could also argue that a lie detector test (measuring blood pressure, pulse, respiration, and skin conductivity) is far removed from a machine monitoring brain activity. Furthermore, when the Court of Appeals of the District of Columbia decided on this lawsuit, qEEG didn’t exist. 

Applying the Frye standard, courts throughout the country have excluded qEEG evidence in the context of alleged brain trauma. For example, the Florida Supreme Court has formally noted that, for the purposes of Frye, the relevant scientific community holds that “qEEG is not a reliable method for determining brain damage and is not widely accepted by those who diagnose a neurologic disease or brain damage.”

However, in a seminal paper covering the use of qEEG in cognitive disorders, the American Academy of Neurology (AAN) concluded overall that computer-assisted diagnosis using qEEG is an accurate, inexpensive, easy-to-handle tool and a valuable aid for diagnosing, evaluating, following up, and predicting response to therapy — despite the Academy’s opposition to the technology’s use in court. The paper also features other neurological associations validating the use of the technology.

The introduction of qEEG on death row was not that long ago

Only recently introduced, the technology was first deemed admissible in court during the death-penalty prosecution of Grady Nelson in 2010. Nelson stabbed his wife 61 times with a knife, then raped and stabbed her 11-year-old intellectually disabled daughter and her 9-year-old son. The woman died, while her children survived. Documents state that Nelson’s wife had found out he had been sexually abusing both children for many years and sought to keep them away from him.

Nelson’s defense argued that earlier brain damage had left him prone to impulsive behavior and violence. Prosecutors fought to strike the qEEG test from evidence, contending that the science was unproven and misused in this case.

“It was a lot of hocus pocus and bells and whistles, and it amounted to nothing,” the prosecutor on the case, Abbe Rifkin, stated. “When you look at the facts of the case, there was nothing impulsive about this murder.”

However, after hearing the testimony of Dr. Robert W. Thatcher, a multi-award-winning pioneer of qEEG analysis who appeared for the defense, Judge Hogan-Scola found that qEEG met the legal prerequisites for reliability. She based this on the Frye and Daubert standards, the two key tests governing the admissibility of scientific evidence.

She allowed jurors to hear the qEEG report and even permitted Thatcher to present a computer slide show of Nelson’s brain, with an explanation of the effects of frontal lobe damage, at the sentencing phase. He testified that Nelson exhibited “sharp waves” in this region, typically seen in people with epilepsy — explaining that while Nelson does not have epilepsy, he does have a history of at least three TBIs, which could account for the abnormality seen in the EEG.

Interpreting the data, Thatcher also told the court that the frontal lobes, located directly behind the forehead, regulate behavior. “When the frontal lobes are damaged, people have difficulty suppressing actions … and don’t understand the consequences of their actions,” Thatcher told ScienceInsider.

Jurors rejected the death penalty. Two jurors who agreed to be interviewed by a major national publication later categorically stated that the qEEG imaging and testimony influenced their decision.

“The moment this crime occurred, Grady had a broken brain,” his defense attorney, Terry Lenamon, said. “I think this is a huge step forward in explaining why people are broken—not excusing it. This is going to go a long way in mitigating death penalty sentences.”

On the other hand, Charles Epstein, a neurologist at Emory University in Atlanta who testified for the prosecution, says the qEEG data Thatcher presented reflected flawed statistical analysis, riddled with artifacts not naturally present in EEG imaging. Epstein adds that the sharp waves Thatcher reported may have been blips caused by the contraction of muscles in the head. “I treat people with head trauma all the time,” he says. “I never see this in people with head trauma.”

You can see Epstein’s point, as it’s unclear whether these brain injuries occurred before or after Nelson brutally raped a 7-year-old girl in 1991 — a crime for which he was granted probation, after which he trained as a social worker.

All of which invokes the following questions: firstly, do we need qEEG to tell us that this person’s behavior is abnormal, or that the legal system failed to protect children? And secondly, was the reaction of the authorities in the 1991 case appropriate, let alone preventative?

As mass shootings and other forms of extreme violence remain at relatively high levels in the United States — committed by ever-younger perpetrators, flagged as loners and fantasists by the state mental healthcare systems they disappear into — it’s evident that sturdier preventative programs need to be implemented by governments worldwide. The worst has already occurred: our children are unprotected against dangerous predators and unaided when affected by unstable and abusive environments, inappropriate social media, and TV.

A potential beacon of hope, qEEG is already beginning to highlight the country’s broken socio-legal systems and the amount of work it will take to fix them — humanizing a fractured court system that still disposes of the products of trauma and abuse as if they were nothing but waste, and forcing the authorities to answer for their failings. Any science that can do that can’t be a bad thing.

The fascinating science behind the first human HIV mRNA vaccine trial – what exactly does it entail?

In a moment described as a “potential first step forward” in protecting people against one of the world’s most devastating pandemics, Moderna, International AIDS Vaccine Initiative (IAVI), and the Bill and Melinda Gates Foundation have joined forces to begin a landmark trial — the first human trials of an HIV vaccine based on messenger ribonucleic acid (mRNA) technology. The collaboration between these organizations, a mixture of non-profits and a company, will bring plenty of experience and technology to the table, which is absolutely necessary when taking on this type of mammoth challenge.

The goal is more than worth it: helping the estimated 37.7 million people currently living with HIV (including 1.7 million children) and protecting those who will be exposed to the virus in the future. Sadly, around 16% of the infected population (6.1 million people) are unaware they are carriers.

Despite progress, HIV remains lethal. Disturbingly, in 2020, 680,000 people died of AIDS-related illnesses, despite inroads made in therapies to dampen the disease’s effects on the immune system. One of these, antiretroviral therapy (ART), has proven to be highly effective in preventing HIV transmission, clinical progression, and death. Still, even with the success of this lifelong therapy, the number of HIV-infected individuals continues to grow.

There is no cure for this disease. Therefore, the development of vaccines to either treat HIV or prevent the acquisition of the disease would be crucial in turning the tables on the virus.

However, it’s not easy to make an HIV vaccine: the virus mutates very quickly, creating multiple variants within the body and presenting too many targets for any single therapy. Moreover, the retrovirus integrates into the host’s genome a mere 72 hours after transmission, meaning that high levels of neutralizing antibodies must already be present at the time of transmission to prevent infection.

Because the virus is so tricky, researchers generally consider that a therapeutic vaccine (administered after infection) is unfeasible. Instead, researchers are concentrating on a preventative or ‘prophylactic’ mRNA vaccine similar to those used by Pfizer/BioNTech and Moderna to fight COVID-19.

What is the science behind the vaccine?

The groundwork research was made possible by the discovery of broadly neutralizing HIV-1 antibodies (bnAbs) in 1990. They are the most potent human antibodies ever identified and are extremely rare, only developing in some patients with chronic HIV after years of infection.

Significantly, bnAbs can neutralize not only the particular viral strain infecting a patient but also other variants of HIV – hence the term ‘broad’ in broadly neutralizing antibodies. They achieve this by using unusual extensions, not seen in other antibodies, to penetrate the HIV envelope glycoprotein (Env). Env, the virus’s outer shell, is formed from the cell membrane of the host cell it has invaded, making it extremely difficult to destroy; still, bnAbs can target vulnerable sites on this shell to neutralize the virus and eliminate infected cells.

Unfortunately, the antibodies do little to help chronic patients because there’s already too much virus in their systems; however, researchers theorize that if an HIV-free person could produce bnAbs, they might be protected from infection.

Last year, the same organizations tested a vaccine based on this idea in extensive animal tests and a small human trial that didn’t employ mRNA technology. It showed that specific immunogens—substances that can provoke an immune response—triggered the desired antibodies in dozens of people participating in the research. “This study demonstrates proof of principle for a new vaccine concept for HIV,” said Professor William Schief, Department of Immunology and Microbiology at Scripps Research, who worked on the previous trial.

bnAbs are the desired endgame of the potential HIV mRNA vaccine and the fundamental basis of its action. “The induction of bnAbs is widely considered to be a goal of HIV vaccination, and this is the first step in that process,” Moderna and IAVI said in a statement.

So how exactly does the mRNA vaccine work?

The experimental HIV vaccine delivers coded mRNA instructions for two HIV proteins into the host’s cells: the immunogens are Env and Gag, which make up roughly 50% of the total virus particle. As a result, this triggers an immune response allowing the body to create the necessary defenses—antibodies and numerous white blood cells such as B cells and T cells—which then protect against the actual infection.

Later, the participants will also receive a booster immunogen containing Gag and Env mRNA from two other HIV strains to broaden the immune response, hopefully inducing bnAbs.

Karie Youngdahl, a spokesperson for IAVI, clarified that the main aim of the vaccines is to stimulate “B cells that have the potential to produce bnAbs.” These then target the virus’s envelope—its outermost layer that protects its genetic material—to keep it from entering cells and infecting them.  

Pulling back, the team is adamant that the trial is still in the very early stages, with the volunteers possibly needing an unknown number of boosters.

“Further immunogens will be needed to guide the immune system on this path, but this prime-boost combination could be the first key element of an eventual HIV immunization regimen,” said Professor David Diemert, clinical director at George Washington University and a lead investigator in the trials.

What will happen in the Moderna HIV vaccine trial?

The Phase 1 trial enrolls 56 healthy, HIV-negative adults to evaluate the safety and immunogenicity of the vaccine candidates mRNA-1644 and mRNA-1644v2-Core. Moderna will explore how to deliver its proprietary eOD-GT8 60mer immunogen with mRNA technology and investigate how to use it to direct B cells to make proteins that elicit bnAbs, with the expert aid of its non-profit partners. To give an idea of the odds involved, readers should note that only about one in every 300,000 B cells in the human body can produce them.

Sensibly, the trial isn’t ‘blind,’ which means everyone who receives the vaccine will know what they’re getting at this early stage. That’s because the scientists aren’t trying to work out how well the vaccine works in this first phase lasting approximately ten months – they want to make sure it’s safe and capable of mounting the desired immune response.

And even with the hype around this trial, experts remain cautious. “Moderna are testing a complicated concept which starts the immune response against HIV,” Robin Shattock, an immunologist at Imperial College London, told the Independent. “It gets you to first base, but it’s not a home run. Essentially, we recognize that you need a series of vaccines to induce a response that gives you the breadth needed to neutralize HIV. The mRNA technology may be key to solving the HIV vaccine issue, but it’s going to be a multi-year process.”

And after this long period, if the vaccine is found to be safe and shows signs of producing an immune response, it will progress to more extensive real-world studies — and toward a possible solution to a virus that is still decimating whole communities.

Still, this hybrid collaboration offers hope that humans can be prioritized over financial gain in clinical trials — all the more important given that most HIV patients live in the developing world.

As IAVI president Mark Feinberg wrote in June at the 40th anniversary of the HIV epidemic: “The only real hope we have of ending the HIV/AIDS pandemic is through the deployment of an effective HIV vaccine, one that is achieved through the work of partners, advocates, and community members joining hands to do together what no one individual or group can do on its own.”

Whatever the outcome, money is not the driving force here, and with luck, we may see more trials based on this premise very soon.

Good news — study finds that people generally try to help one another out

Different motivators to do good don’t drown each other out, the team reports, adding that people generally want to help those around them.


The findings help cement our understanding of reciprocity and prosocial behavior in the complex societal contexts of today. It’s also a hopeful reminder in these strange and trying times that deep down, we all want to make life better for everyone.

Sharing is caring

We all have four broad categories of motivators for helping those around us: doing a kindness in return for someone who helped us out; doing something nice for someone we’ve seen helping a third person; doing good as a response to people in our social circles who might be impressed with or reward that behavior; and “paying it forward” — helping someone because somebody else has done something nice for us.

The team explains that these four motivators could be at odds with one another. For example, we could prioritize rewarding someone who helped us out before to the detriment of others who might need assistance more than that person. The interplay between these four motivators during our social interactions has not been studied, however.

But there are grounds for hope. The authors report that in their experiment, people overwhelmingly chose to be generous to others — even complete strangers, and even in situations where their motivators could create conflicts of interest.

“We wanted to do an exhaustive study to see what the effects of those motivations would be when combined — because they are combined in the real world, where people are making choices about how generous or kind to be with one another,” said David Melamed, lead author of the study and an associate professor of sociology at The Ohio State University.

The study included 700 participants and was designed to put them in a variety of situations where different motivators might compete. Participants took part in online interactions where they had to decide how much of a 10-point endowment they wanted to give other people. They were informed that these points would have a monetary value at the end of the study. This way, giving points away had a cost for the participants.

“[Prosocial behavior] means doing something for someone else at a cost to yourself,” Melamed said. “So one example would be paying for the person behind you’s order at the coffee shop. Or right now, wearing your mask in public. It’s a cost to you; it’s uncomfortable. But you contribute to the public good by wearing it and not spreading the virus.”

“In the real world, the conditions under which people are nice to each other are not isolated — people are embedded in their networks, and they’re going about their daily lives and coming into contact with things that will affect their decisions.”

Melamed says he expected to see the different motivators ‘crowd’ one another out. For example, a person focusing on giving back help they received might be less inclined towards the other motivators.

However, they found that “while [there is] some minor variation in how a given form of reciprocity might affect other forms,” people overwhelmingly showed an inclination towards helping others in all scenarios (each of which emphasized one type or combination of reciprocity types).

Melamed notes that from an evolutionary perspective, such behavior is very curious, as it decreases an individual’s fitness to boost that of others. That it is so deeply ingrained in our nature shows how important social relations were during our evolution — and the extent to which they helped shape our cultures and civilizations.

Studying our prosocial behavior can also help us better understand it in other species such as bees and ants.

The paper “The robustness of reciprocity: Experimental evidence that each form of reciprocity is robust to the presence of other forms of reciprocity” has been published in the journal Science Advances.

Meet the Internet’s unsung heroes: Wikipedia’s human collaborators

It would be impossible to imagine the world today without Wikipedia — a fact that students around the world can gratefully attest to.

Although editor bots often steal the limelight in conversations about this great resource, a lot of people have been putting in a lot of work to make Wikipedia what it is today.

Wikipedia actually has a pretty interesting page dedicated to tracking the most prolific authors on the site. All contributors are equal in the eyes of the site and its userbase, so this list isn’t about giving anyone bragging rights.

It serves to acknowledge the people putting considerable time and effort into creating this unique repository of knowledge that we all use daily (and mostly take for granted). In my eyes, they’re the unsung heroes, the ‘real MVPs’ of the internet, and a list commemorating their work is the least of what we should do for them.

But let’s get to know who they are so that we know who to be thankful to while scrambling to meet that paper deadline in the wee hours of the morning.

Steven Pruitt / Wiki user Ser Amantio di Nicolao


Steven Pruitt is an editor from Virginia, USA, with over three million edits and more than 35,000 written articles under his belt. Pruitt is hands down the most prolific human Wikipedia editor and publisher — not a bad accolade to hold.

Time magazine seems to agree, as it named Pruitt as one of the 25 most important influencers on the Internet in 2017.

Pruitt works as a contractor for U.S. Customs and Border Protection but also finds the time to edit, flesh out, and create material for the online encyclopedia.

“It started in 2001,” he explained in a Reddit AMA (ask me anything) thread in 2019.

“I matriculated college in 2002. I remember watching it climb in the Google search results, from the bottom of the first page to about two or three from the top. Honestly, I didn’t think it was going to take off…but it kept showing up, and one day I thought, ‘What the hell?’, and jumped in. I’m not sure I believed the ‘anyone can edit’ part of it until I became part of ‘everyone’.”

Pruitt is also one of the leading forces that helps shine a light on the achievements of women throughout history (my personal favorite is Hypatia), having written 212 new articles detailing the lives and achievements of influential women when the Time Magazine piece was published. He is also part of the Women in Red initiative, which is “focused on improving content systemic bias in the wiki movement”.

However, he doesn’t focus solely on this or any other topic. His primary sources of information, according to the AMA thread, are “books, mostly encyclopedias”, alongside material on the web or other sources “as long as they pass a small test :)”.

As to why he does it, it’s the oldest reason in the book — “it’s a hobby”.

“I have my moments, I think everyone does,” he said when asked whether he ever felt like he’s putting too much time and effort into Wikipedia. “But then I look back on some of the articles I’ve written […] and it feels good. That wonderful feeling of having made something useful. That’s what keeps me going, often as not.”

Pruitt adds that he has been approached with offers to write Wikipedia articles for pay by “a couple of people” and only said yes once because “I genuinely felt the subject deserved an article, and would pass the notability test”, but didn’t accept payment for it.

“I know it sounds cheesy, but I’ve come to believe that we, collectively, are changing the world and the way the world thinks about knowledge. That’s an amazing thing to think about, and it still blows my mind.”

It’s safe to say that without Pruitt, Wikipedia — and maybe the internet — wouldn’t be the same.

Justin Knapp / Wiki user Koavf


Knapp was the top contributor between April 18, 2012, and November 1, 2015, when Pruitt took the title. While he may no longer be the most prolific contributor to Wikipedia by sheer number of edits, he will forever remain the first to reach one million edits on the site. As of March 2020, he had performed over 2 million edits and doesn’t seem to be losing any steam.

We have to keep in mind the dedication and workload people such as Knapp and Pruitt take on for our collective benefit. By the time he reached 1 million edits in 2012, Knapp had submitted an average of 385 edits a day, every day, for seven years (starting in 2005).

To be fair, he does have a perk most of us don’t: a degree in philosophy (and one in political science, but it’s harder to make unemployment jokes with that one) from Indiana University – Purdue University Indianapolis.

“Being suddenly and involuntarily unemployed will do that to you,” he wrote on his personal page, according to The Telegraph.

His work didn’t go unnoticed in the community: Wikipedia co-founder Jimmy Wales congratulated him and declared that April 20 would be Justin Knapp Day. Knapp says he doesn’t have a fixed routine in regard to his editing and that his “go-to edits are small style and typo fixes”. Philosophy, politics, religion, history, and popular culture are some of the categories he works on most.

In his day-to-day life, Knapp has held several odd jobs, including pizza delivery, working at a grocery store, and operating a crisis hotline. He also owns a magnificent beard.

I find it quite infuriating that the work of people such as Knapp goes unrecognized by the vast majority of society, despite all the immense benefits it brings. His is a prime example of why careers aren’t a true reflection of an individual’s worth and merits. We all are liable to look down on someone for being “just a pizza delivery guy,” or “just a cashier lady,” with the unstated but implied belief that if they’d only work hard and educate themselves as we do — perhaps on Wikipedia — they would deserve the quality of life we enjoy.

But that pizza delivery guy and that girl working her fingers to the bone behind the counter might just be the person who wrote and corrected the Wiki page you used in your dissertation at school or high-stakes presentation at work. And they may be working “up to 16 [hours] a day” to allow anyone, anywhere, including you, access to the sum of human knowledge.

Wiki user BrownHairedGirl (BHG)

Grainne Ní Mháille (Grace O’Malley), the pirate queen of Ireland, whom BHG calls “one of her heroes”.
Image modified after Wikipedia.

The definition of an unsung hero, because she wants to “neither tell [her] life story or reveal all sorts of interesting details about [herself].”

All we know of the elusive BHG comes from her user page (linked above). She’s currently living in Connacht, Ireland, was “expelled from the University of Life”, her eye color is “working”, and she seems to be in a romantic triangle with Gráinne and Finn McCool, two figures from Irish mythology. Which, one would assume, makes her Mrs. McCool.

Beyond her sense of humor, BHG is also a major contributor to Wikipedia, having performed close to 1.8 million edits. She’s one of the scarier admins of the site, with close to 12,000 page deletions and 245 user bans under her belt in her 14 years and 4 months’ worth of work on the site. But she also has a nurturing side, having restored over 1,100 pages to the Wiki.

BHG spawned BHGbot in 2007, a bot that tags “talk pages of articles and categories [to] identify the articles as being within the scope of a particular WikiProject”, although we are not told whether she did so with Finn McCool or someone else. Tragically, BHGbot’s life was cut short in 2009.

Her work further includes quite a bit of heavy lifting behind the scenes, which has to do with streamlining the way Wikipedia’s automated systems handle category indexing. I won’t pretend to fully understand what that means, but it involves coding.

BHG mostly works “on Irish topics, especially politics,” she explains on her user page.

“I have also done a lot of Scottish politicians and judges, and on Westminster MPs from across the UK. Plus Irish and UK constituencies and by-elections. I created the page Families in the Oireachtas [the Irish legislature], built and developed a large chunk of the articles on UK constituencies, and built the [navigation boxes] which unify navigation between the constituencies in Ireland of 5 different parliaments and assemblies.”

She cites the British Newspaper Archive (BNA) as her “most useful source” of information for her work on Wikipedia, noting that “sadly, the BNA no longer offers free access to Wikipedia contributors” but that she plans to one day take out a subscription and “start writing articles again”.

Richard Farmborough / Wikiuser Rich Farmborough

A 53-year-old Richard Farmborough.
Image credits SKBalchal / imgur.

Another one of Wikipedia’s mysterious contributors (hard facts and mystery — this site has it all), Rich is closing in on 1.7 million edits since he joined in March of 2015 (according to his user page).

His user page is quite cryptic. It tells us that Richard is skilled in technical and general editing and that he likes “to welcome, facilitate, and enable new editors”.

I had to do some prodding around to find out more about the man behind the Wikiuser Rich, but from what I’ve found, he seems like a genuinely cool guy. Writing in a blog post for Wikimedia, Syed Muzammiluddin explains that Rich “developed a passion for English Wikipedia the moment he discovered the project as early as 2004,” especially given his previous experience and interest with bulletin boards. He considers himself to have been a “full-time Wikipedian” ever since March 2012 “with some gaps”.

According to the same source (it also has a photo of a younger Richard), he was born and brought up in Enfield (a London borough) and holds a degree in Mathematics. As far as careers go, he has worked as a professional in car insurance, in e-commerce, and in academia.

“The English Wikipedia should be considered a storehouse of resources,” he told Muzammiluddin. “Given the ubiquity of the language, anyone with even a passable command of English can make a valuable contribution to Wikipedias in other languages. Not just in articles, policies, and guidelines, but also in the wide reuse of templates—saving thousands of hours.”

British tabloid The Sun cites Rich as a “retired project manager Richard, from Stamford, Lincs”, so make of his bio what you will. However, they do have some interesting quotes detailing Richard’s experience with the encyclopedia and what drives him to contribute:

“When I first found Wikipedia I started jumping in and editing as I read, adding bits here and there. If I see something that needs doing, I will do it. It might mean writing a few sentences, but it could be as simple as fixing a typo,” he explains.

“It seems like a lot of time, but what else would I be doing? Watching videos of cats on YouTube? At least this is productive.”

He further explains that it is important to him “that knowledge is accessible to all,” and volunteers like him “are making that possible — one edit at a time.” His advice to everyone out there is to “be bold, be patient, and be kind”.

That being said, though, I am very partial to cat videos on YouTube.

Wikiuser BD2412

“This editor is a Grandmaster Editor First-Class and is entitled to display this
Mithril Editor Star with the Neutronium Superstar hologram” — this is the message that, in a bright yellow box alongside a picture of said Star, greets you upon accessing BD2412’s profile page.

This is the best picture I’ve been able to find of BD2412 — conveniently supplied by himself.

BD2412 joined Wikipedia in December 2005, making him one of the longest-serving Administrators of the site.

Among the few tidbits of information we get from his page is that BD2412 is a lawyer. Perhaps unsurprisingly, his profile lists “law” and “people in law” as some of his main areas of contribution. Wikipedia lists him as having made in excess of 1.5 million edits, and BD2412 states that he has edited about 14.25% of the articles on Wikipedia — no mean feat by any measure.

“If you have edited more than seven articles, there is probably an article that both you and I have edited,” he adds.
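
His claim holds up under a simple independence assumption (a sketch of my own, not his actual reasoning): if each article you’ve edited has an independent 14.25% chance of also bearing his edits, the odds of at least one overlap pass 50% well before you reach seven articles.

```python
# Probability of at least one shared article, assuming each of your
# n edited articles independently has a 14.25% chance of also having
# been edited by BD2412 (a simplifying assumption, not his math).
p = 0.1425
for n in range(1, 9):
    overlap = 1 - (1 - p) ** n
    print(n, round(overlap, 3))
# The probability crosses 50% at n = 5, and at n = 7 it is about 66%,
# consistent with "more than seven articles" making an overlap probable.
```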

BD is also “an administrator on English Wiktionary; and on English Wikisource; and an admin and bureaucrat on English Wikiquote,” and judging by the pins on his page, has published peer-reviewed articles in academic journals.

I also like these two quotes he has on his profile — one about what Wikipedia is for and how it should function, the other cautioning that the source of information you’re using can shape the data you find:

“Wikipedia is not an experiment in democracy: its primary method of finding consensus is discussion, not voting. That is, majority opinion does not necessarily rule in Wikipedia. Various votes are regularly conducted, but their numerical results are usually only one of several means of making a decision. The discussions that accompany the voting processes are crucial means of reaching consensus.”

“Be aware of Google bias when testing for importance or existence: bear in mind that Google will be biased in favor of modern subjects of interest to people from developed countries with Internet access, so it should be used with some judgment.”

These are just 5 of Wikipedia’s contributors — granted, they are the 5 most prolific ones as measured by the number of their edits and posts, but they’re by no means the only ones. They’re just 5 out of a page listing over 5000 people. And there’s a second page dedicated to yet another 5000 (presumably to keep the lists navigable).

It’s a testament to these people, their work, and the mind-boggling wealth of data that Wikipedia encapsulates, that with very tiny exceptions (i.e., one Reddit thread, a quote, and a picture), everything I’ve written here is available on the platform.

We tend to take it for granted. There’s absolutely no shred of doubt, at least to most of us in the West, that if an internet connection is available, we can access data on virtually anything; on everything. In 30 seconds we can use our smartphone to say “see, I was right” in an argument with our friends just by typing “wi” and hitting enter on Wikipedia when it comes up as the first suggestion.

For any of our ancestors, Wikipedia would be nothing short of a miracle. We know it’s not; it’s just a system constructed on electricity forced into silicon chips. But systems are only as powerful as the people who build them allow them to be. In that light, the work contributors such as these five here do, without asking for recognition, fame, praise, or fortune, often refusing it voluntarily, is nothing short of a modern miracle.

All images used in this post, unless otherwise stated, are sourced from Wikipedia.

Researchers obtain oldest-ever human genetic data from ancient tooth

Researchers at the University of Copenhagen have successfully recovered the oldest human genetic information to date: ancient protein sequences from an 800,000-year-old human fossil.

The Homo antecessor tooth used in the study.
Image credits Frido Welker et al., (2020), Nature.

The study gives us insight into humanity’s past going back much farther than previously considered possible. It could also help us get a better understanding of the different (now extinct) branches in the human family and how they related to one another, the team adds.

Old relatives

“Ancient protein analysis provides evidence for a close relationship between Homo antecessor, us (Homo sapiens), Neanderthals, and Denisovans. Our results support the idea that Homo antecessor was a sister group to the group containing Homo sapiens, Neanderthals, and Denisovans,” says Frido Welker, Postdoctoral Research Fellow at the Globe Institute, University of Copenhagen, and first author on the paper.

The ancient proteins were harvested from an 800,000-year-old tooth belonging to the species Homo antecessor. It was discovered by palaeoanthropologist José María Bermúdez de Castro and his team in 1994 at the Gran Dolina cave site, Spain, a part of the larger Sierra de Atapuerca archeological site.

Through the use of a technique called mass spectrometry, the team was able to sequence proteins from the enamel of the tooth, allowing them to read part of its genetic information and discover where this species fits in the human family tree.

We know that humans and chimpanzees split, genetically speaking, about 9-7 million years ago, but we don’t have a very clear picture of how many different human species there were at the time and how they related to one another. The fact that all these other human lineages are now extinct doesn’t help either, nor does the fact that genetic material tends to break down over time. Much of what we know on the topic so far is based on DNA analysis of samples no older than around 400,000 years, or direct observations of the shape and structure of early human fossils.

“Now, the analysis of ancient proteins with mass spectrometry, an approach commonly known as palaeoproteomics, allow us to overcome these limits,” says Enrico Cappellini, Associate Professor at the Globe Institute, University of Copenhagen, and leading author on the paper.

The findings suggest that Homo antecessor isn’t, in fact, the last common ancestor for us and the Neanderthals, but rather a closely related relative of that ancestor.

“I am happy that the protein study provides evidence that the Homo antecessor species may be closely related to the last common ancestor of Homo sapiens, Neanderthals, and Denisovans. The features shared by Homo antecessor with these hominins clearly appeared much earlier than previously thought. Homo antecessor would therefore be a basal species of the emerging humanity formed by Neanderthals, Denisovans, and modern humans,” says José María Bermúdez de Castro, Scientific Co-director of the excavations in Atapuerca and co-corresponding author on the paper.

The current research was only made possible by a ten-year-long collaboration between experts in fields ranging from paleoanthropology to biochemistry, proteomics, and population genomics, which allowed the team to retrieve and read proteins of such incredibly old age.

The paper “The dental proteome of Homo antecessor” has been published in the journal Nature.

Our long shadow: humanity places ‘intense’ pressure on 17,500 species of land vertebrates

Humanity has a massive impact on wildlife across the world, new research found.

Image via Pixabay.

A new study looking into the impact of human activities on wildlife reports that a staggering number of terrestrial vertebrate species are exposed to ‘intense’ human pressure, spelling trouble ahead for biodiversity and the integrity of wild ecosystems.

Wild no more

“Given the growing human influence on the planet, time and space are running out for biodiversity, and we need to prioritize actions against these intense human pressures,” says senior author James Watson of the Wildlife Conservation Society (WCS) and the University of Queensland.

“Using cumulative human pressure data, we can identify areas that are at higher risk and where conservation action is immediately needed to ensure wildlife has enough range to persist.”

Using the most comprehensive dataset on the human footprint to date — this maps the net impact of human activity on a given surface of land — a new study looked at the pressure humanity is exerting on 20,529 terrestrial vertebrate species. Roughly 85% of that number (17,517 species) have at least half of their range exposed to ‘intense’ human pressure, and 16% (3,328 species) are exposed throughout their entire range, the team found.
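
Those headline percentages follow directly from the species counts reported by the study; a quick check:

```python
# Reproducing the study's percentages from its raw species counts.
total_species = 20_529
half_range_exposed = 17_517   # at least half their range under intense pressure
full_range_exposed = 3_328    # entire range under intense pressure
print(round(100 * half_range_exposed / total_species, 1))  # prints 85.3
print(round(100 * full_range_exposed / total_species, 1))  # prints 16.2
```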

Terrestrial vertebrates with small ranges were impacted most heavily, they add. A further 2,478 species considered ‘least concern’ have large portions of their range overlapping with areas of intense human pressure, which may place them at risk of decline.

The Human Footprint dataset the team used takes into account the impact of human habitation (population density, dwelling density), accessways (roads, rail), land-use (urban areas, agriculture, forestry, mining, large dams), and electrical power infrastructure (utility corridors). These factors are known to put pressure on local wildlife and are driving the high extinction rates seen today.

While the findings are pretty grim, the team hopes they can be used to better assess which species are buckling under the human footprint, which would allow us to better conserve them and the habitats they live in. The data, they explain, can aid current assessments of progress against the 2020 Aichi Targets — especially Target 12, which deals with preventing extinctions, and Target 5, which deals with preventing loss of natural habitats.

“Our work shows that a large proportion of terrestrial vertebrates have nowhere to hide from human pressures ranging from pastureland and agriculture all the way to extreme urban conglomerates,” says Christopher O’Bryan of the University of Queensland, the study’s lead author.

The paper “Intense human pressure is widespread across terrestrial vertebrate ranges” has been published in the journal Global Ecology and Conservation.

Humans and Neanderthals diverged at least 800,000 years ago, new teeth study shows

The origin of humans and our closest cousins, the Neanderthals, has been hotly debated. Now, a new study suggests that the two lineages diverged much earlier than anticipated — and the key might lie in modern-looking teeth.

Hominin teeth. Image credits: Aida Gómez-Robles.

The Atapuerca Mountains in north-eastern Spain might not look like much. They feature gentle slopes and a rather dry landscape, interrupted from place to place by forests and the occasional river. But these mountains hold a karstic environment that is key to understanding how humans came to be, and what life was like for our early ancestors.

The most important site is a cave called Sima de los Huesos (Pit of Bones). Anthropologists have recovered over 5,500 human remains which are at least 350,000 years old. The remains belong to 28 individuals of Homo heidelbergensis, an archaic hominin that lived from approximately 700,000 years to 300,000 years ago. Researchers also found fossils of cave bears and some remarkable tools and structures developed by these ancient humans.

In 2016, nuclear DNA analysis showed that remains from some individuals belong to Neanderthals, also suggesting that the divergence between Neanderthals and Denisovans (more enigmatic cousins of ours) happened at least 430,000 years ago. Now, Aida Gómez-Robles, an anthropologist at University College London, believes the cave might also have some clues as to when Neanderthals split from humans.

Gómez-Robles studies what makes humans distinct from other primate species, despite our genetic similarities. She looks at teeth in particular, using dental variations to assess the evolutionary relationships of fossil hominins. She believes that the Neanderthal teeth found in the cave look more modern than they should, which offers two possibilities: either the teeth evolved unusually quickly (and there’s no reason to believe that is the case) or, as she believes, the teeth had more time to evolve. If the latter is true, then Neanderthals split from our own lineage earlier than expected: some 800,000 years ago.

“There are different factors that could potentially explain these results, including strong selection to change the teeth of these hominins or their isolation from other Neanderthals found in mainland Europe. However, the simplest explanation is that the divergence between Neanderthals and modern humans was older than 800,000 years. This would make the evolutionary rates of the early Neanderthals from Sima de los Huesos roughly comparable to those found in other species.”

Previously, DNA analyses have generally indicated that the lineages diverged around 300,000 to 500,000 years ago. This was very important for anthropological context since researchers took it as a temporal anchor for interpreting other findings. However, anatomical evidence (such as the teeth in the Pit of Bones) seems to contradict this timeline. All the evidence we have seems to suggest that dental shape has evolved at very similar rates across all hominin species, so there’s not much reason to believe Neanderthals would be an exception. It’s a strong argument and a plausible scenario.

“The Sima people’s teeth are very different from those that we would expect to find in their last common ancestral species with modern humans, suggesting that they evolved separately over a long period of time to develop such stark differences,” Gómez-Robles adds.

However, not everyone is convinced. Paleoanthropologist Rick Potts, director of the Smithsonian’s Human Origins Program, says that it’s an interesting find, but there’s not enough evidence to counterbalance the previous molecular and DNA results. It’s also unclear why only the teeth seem to be so heavily evolved — the rest of the skeleton is in accordance with the 300,000-500,000-year timeline.

There’s also another complication: hybridization. Neanderthals are known to have interbred with humans and Denisovans, and during this period populations were repeatedly separated from each other, adapted to particular environments, then reunited and began interbreeding again — we don’t really know what effect that might have had on tooth evolution.

Without additional evidence, the jury is still out regarding the early evolution of these hominin species, but this new study goes to show just how complex our evolutionary history is, and how difficult it is to uncover the intricacies that led to the development of humans and our cousins.

The study was published in Science Advances.

Potential new species of human found in cave in Philippines

The cave where the fossils which may belong to a new hominin species were found. Credit: CALLAO CAVE ARCHAEOLOGY PROJECT.

In a cave on a small island in the Philippines, scientists have found evidence of a new species of humans that lived at least 50,000 years ago. They called it Homo luzonensis, after the island of Luzon where the remains were found. These hominins were very short in stature, comparable to Homo floresiensis, nicknamed “hobbits”, which lived on the nearby Indonesian island called Flores. If the species is confirmed by DNA analysis, the findings will not only enrich the human family tree but also complicate the story of human migration and evolution in Asia.

Another one?

The fossils from the island of Luzon were excavated during three expeditions in 2007, 2011, and 2015. Inside the island’s Callao cave, researchers found seven teeth (five from the same individual), two finger bones, two toe bones, and an upper leg bone. All were dated as being at least 50,000 years old by radiocarbon decay analysis. These fossils were found alongside those of butchered animals, suggesting that the cave’s inhabitants were at least sophisticated enough to devise cutting tools and rafts to reach the island from the mainland.

Individually, the bones are very similar to other Homo species in terms of shape and size. However, taken together, they reveal a combination of features that no other hominin shared. Homo luzonensis‘ molars were very small, even smaller than the hobbits’. The premolars were relatively large, however, and had up to three roots rather than one — a feature shared by Homo erectus. The finger and toe bones were curved, suggesting tree-climbing ability that is more reminiscent of hominids living two million years ago in Africa.

Five fossil teeth from the same individual have unusual features that helped researchers determine that they might be dealing with a new species of human. Credit: CALLAO CAVE ARCHAEOLOGY PROJECT.

These findings suggest that the landscape occupied by our species was once quite crowded. We now know that Homo sapiens were contemporaries not only with their famous cousins, the Neanderthals, but also with Homo floresiensis, the Denisovans (a species that lived around a cave in the Altai Mountains of western Siberia), and now this fifth species, Homo luzonensis. This dramatically complicates the story of human migration into Asia, suggesting that several human lineages had already occupied East Asia by the time the first modern humans reached China as early as 80,000 years ago.

A Homo luzonensis toe bone, showing the longitudinal curve. Credit: CALLAO CAVE ARCHAEOLOGY PROJECT.

Individuals of Homo luzonensis typically weighed around 30 to 50 kilograms, stood 1 to 1.5 meters tall, and had brains around one-third the size of our own. Just like the hobbits on Flores, Homo luzonensis may have descended from Homo erectus populations that crossed the sea from mainland Asia to Luzon. The small body and unusual skeletal traits may have been adaptations driven by island dwarfing — a process whereby creatures confined to isolated habitats such as islands become smaller over time due to limited resources and ecology.

Digital reconstruction of homo floresiensis, nicknamed ‘the hobbit’. Credit: Wikimedia Commons.

It’s not clear yet if we’re dealing with a new species at all. The team of researchers, led by Florent Détroit of the Musée de l’Homme in Paris, was unable to extract DNA from the fossils. Until a proper DNA analysis confirms the distinct lineage, Homo luzonensis’ inclusion in the human family tree remains questionable. For instance, the fossils might belong to hybrids — the products of interbreeding between two or more earlier Homo species. Or perhaps Homo erectus populations that arrived at Luzon simply acquired some traits that made them more adapted to their environment, rather than speciating.

The findings are still incredibly exciting, nevertheless. It’s amazing to hear that our species lived at the same time as four other human lineages and perhaps interacted with them. What a sight that must have been to behold.

New research suggests humanity’s ancestors began walking upright earlier than believed

New research shows that one ancestor of modern humans was walking upright much more often than we believed.


The skull of the Ardipithecus ramidus specimen nicknamed “Ardi”.
Image credits T. Michael Keesey / Flickr.

One immediately-distinguishable feature of humans is the way we move about —  we’re unique among mammals in that we consistently walk upright. Since it’s such a distinguishing feature, anthropologists are very interested in finding out when our ancestors picked up the habit.

New research from the Case Western Reserve University School of Medicine suggests that at least one of our ancestors relied on bipedal walking much more than previously believed.

Early walker

“Our research shows that while Ardipithecus was a lousy biped, she was somewhat better than we thought before,” said Scott Simpson, a professor of anatomy at Case Western who led the study.

The findings come from an analysis Simpson’s team performed on fragments of a 4.5 million-year-old female Ardipithecus ramidus. This specimen was discovered in the Gona Project study area in the Afar Regional State of Ethiopia. Hip, ankle, and hallux (big toe) bones belonging to the ancient female showed that Ar. ramidus was far better adapted to bipedalism than previously thought, but still far from perfect.

Fossil evidence from this stage of humanity’s past is rare, so we have a pretty dim idea of what was going on at the time. As such, Simpson’s research — although seemingly a simple pursuit involving an ankle, of all things — actually goes a long way toward fleshing out our understanding of Ardipithecus locomotion, as well as the timing, context, and anatomical details of ancient upright walking.

Previous research has shown that Ardipithecus was capable of walking upright and climbing trees — but the fossils that research was based on lacked the anatomical specializations seen in the Gona fossil examined by Simpson. This suggests that the species saw a wide range of adaptations as they transitioned to modern, upright walking as seen in modern humans.

“The fact that [it] could both walk upright […] and scurry in trees marks it out as a pivotal transitional figure in our human lineage,” Simpson adds.

A freshly-found fossil A. ramidus talus.
Image credits Case Western Reserve University School of Medicine

The team says that certain adaptations in the specimen’s lower limbs are tell-tale signs of bipedality. Unlike monkeys and apes, for example, our big toes are parallel with the others. The team worked to reconstruct the foot’s range of motions by analyzing the area of the joints between the arch of the foot and the big toe. While joint cartilage no longer remains for the Ardipithecus fossil, the surface of the bone has a characteristic texture which shows that it had once been covered by cartilage.

Having all toes neatly parallel to one another allows the foot to function as a propulsive lever when walking. Ardipithecus had an offset grasping big toe useful for climbing in trees, but the team’s analysis showed that it was also used to help propel the individual forward when walking — in other words, it’s a mixed-use tool, indicative of a transition towards bipedalism. The team also reports that Ardipithecus’s knees were aligned directly above its ankles when it stood. This latter characteristic is also similar to what you’d see in a modern (bipedal) human, and stands in contrast to what you’d see in a (non-bipedal) chimp, for example, whose knees are “outside” the ankle, i.e., they are bow-legged, when they stand.

The paper “Ardipithecus ramidus postcrania from the Gona Project area, Afar Regional State, Ethiopia” has been published in the Journal of Human Evolution.


The ‘forager gene’ of humans and fruit flies works in practically the same way

An international team of researchers reports that a gene humans and fruit flies share has a similar effect on their behavior. The same gene is found in many species across the world, likely acting in a similar way.


Image via Pixabay.

This might seem ludicrous, but there was a time in which humans couldn’t go to the grocery store to get food. In those dark times, we had to forage our way into a meal. New research shows that one gene with significant impact on foraging behavior in fruit flies (Drosophila melanogaster) has a similar effect on our own foraging strategies.

Will search for food

The team, which includes researchers from Canada, the U.K., and the U.S., has found that a gene known as PRKG1 — which is present in a wide range of species — can dictate whether individuals are “assessors” or “locomotors” when foraging for food.

The team worked with a group of college volunteers, who were asked to play a video game on a tablet. The object of this game was to find as many berries (which were hidden among plants) as possible. Each participant could navigate the environment at will and click on individual berries to pick them up. After playing the game, each volunteer was asked to give a tissue sample for DNA testing.

Some volunteers took a perimeter-first approach, the team reports — these were the “assessors” — while others dove right into the thick of it — these were the “locomotors”. Next, the team looked at differences in the human equivalent of PRKG1 — a single-nucleotide polymorphism genotype called rs13499 — among these participants, and compared them with those seen in fruit flies.

Prior research has shown that one variant of the PRKG1 gene pushes flies towards the “assessor” pattern of behavior, while another makes them “locomotors”. Upon entering an area, assessors are more likely to tour its perimeter first, then move inward. Locomotors, in contrast, make a beeline for the first fruits they see.

If you’re thinking ‘hey, those behaviors seem pretty similar,’ you’re spot on. The team reports finding the same gene variants responsible for instigating locomotor or assessor behavior in fruit flies in their college participants, with the same effect in both species. They further note that the search paths taken by the human volunteers and by the “sitter” and “rover” fruit flies (as the two behavioral variants are known in flies) were nearly identical.

The findings suggest that this gene-induced preference in foraging patterns likely holds for other species as well. The team adds that their findings also suggest the patterns of behavior we employ when pursuing our goals can also be connected with these two gene variants.

The paper “Self-regulation and the foraging gene (PRKG1) in humans” has been published in the journal PNAS.


We’re watching a ‘horror story’ in which the casualty is Earth’s wilderness

The West is no longer wild — nor all other cardinal directions, a new paper reports.


Image via Pixabay.

The first comprehensive fine-scale map of the world’s remaining wild areas reveals that only 23% of the world can now be considered wilderness. The analysis included all terrestrial and marine environments, excluding Antarctica. Every other place on Earth has been directly affected by human activities.

Tamed with the stick

“These results are nothing short of a horror story for the planet’s last wild places,” says lead author James Watson of the Wildlife Conservation Society (WCS) and the University of Queensland. “The loss of wilderness must be treated in the same way we treat extinction.”

“There is no reversing once the first cut enters. The decision is forever.”

The findings raise particular concerns as wilderness areas play an increasingly important role in mitigating a host of human impacts on the planet, including species extinction and climate change.

The team defined wilderness areas as those that have escaped industrial-level activity. Local communities can live, hunt, fish, forage, or whatever else they like to do within these areas — as long as the community’s footprint is under that of an industrialized society. They’re the last untouched ecosystems around and are much more robust than their counterparts as a result. Several previous papers (here’s an example) have shown that intact ecosystems are much more effective in sequestering carbon than degraded ones — overall, they capture over two times as much of the element as degraded ones. In the oceans, intact habitats are the last regions that can still support viable populations of top predators such as tuna, marlins, or sharks.

If those two points aren’t enough to sway you, first of all, shame on you. Secondly, it’s not only an environmental issue. Wild areas are home to millions of indigenous people who have forged a very deep bond with their environments. Their culture, as well as their livelihoods, are deeply intertwined with the wilderness — losing it would effectively render many of the world’s most unique cultures extinct.

Bushman group.

Cultures such as this.
Image credits Aino Tuominen.

Not all is lost, however. The authors note that two upcoming gatherings of key decision makers will be crucial to preserving Earth’s wild areas. These are the 14th meeting of the Conference of the Parties to the Convention on Biological Diversity (CBD), held from November 17-29, and the United Nations Framework Convention on Climate Change, held between December 2-14 in Katowice, Poland.

Just 20 nations hold 94% of the world’s marine and terrestrial wilderness areas (excluding Antarctica and the High Seas), the team writes. Five of them (Russia, Canada, Australia, the United States, and Brazil) together hold 70%. The authors say these countries can make or break our efforts to secure the last wild vestiges of Earth’s ecosystems.

“Wilderness will only be secured globally if these nations take a leadership role,” says John Robinson, Executive Vice President for Global Conservation at WCS and a co-author of the paper.

“Right now, across the board, this type of leadership is missing. Already we have lost so much. We must grasp these opportunities to secure the wilderness before it disappears forever.”

The authors and their organizations urge participants at the meetings to include a mandated target for wilderness conservation. They recommend setting 100% conservation of all intact wild ecosystems as a bold but achievable goal. Formally documenting the carbon sequestration and storage capacities of wilderness areas, and enshrining them into policy recommendations, would also help governments include such ecosystems in their emission-reduction strategies, they add.

The paper “Protect the last of the wild” has been published in the journal Nature.

Why do people self-harm? New study offers surprising answers

If you’ve seen HBO’s newest miniseries, Sharp Objects, you’re well familiar with what doctors call NSSI: non-suicidal self-injury. NSSI is a serious mental health condition, but despite years of research, we’re still not quite sure why individuals engage in this type of behavior. A new study performed at St. Edward’s University in Austin, Texas, sought the answer to this question by drawing on existing theories in the literature.

Previous studies have shown that individuals who exhibit NSSI have low levels of β-endorphin, which is produced to mediate stress-induced analgesia (the inability to feel pain) — and high ratings of clinical dissociation, which is a feeling of disconnection with oneself and one’s surroundings. The researchers hypothesized that NSSI individuals are attempting to restore these imbalances using self-harm. To test their hypothesis, researchers recruited participants from the university. Using saliva samples and surveys, they assessed β-endorphin levels and psychological state before and after a procedure called the cold-pressor test.

[panel style=”panel-info” title=”Cold-Pressor Test” footer=””]During the cold-pressor test, an individual immerses his or her hand in a bucket of ice water. Researchers then note how long it takes the individual to feel pain (their pain threshold) and how long until the pain is unbearable (their pain tolerance), at which point the test ends.[/panel]

They discovered that non-suicidal self-injurers have lower levels of arousal than people without these tendencies (the control group). After the pain challenge, their arousal levels matched the baseline of the control group — in other words, experiencing pain was able to correct their low levels of arousal. The pain challenge also decreased symptoms of dissociation. However, these changes weren’t exclusive to the NSSI group: the control group also experienced an increase in arousal and a decrease in dissociative symptoms after the cold-pressor test.

Next, the researchers sorted the NSSI group by symptom severity. They found that the more severe the individual’s NSSI symptoms, the stronger their dissociative symptoms were. However, only the most severe cases experienced a reduction in these symptoms after the pain challenge. Another interesting finding is that the NSSI individuals with moderate symptom severity actually had higher levels of β-endorphins (both before and after the pain challenge). This wasn’t seen in those with low or high symptom severity.

However, perhaps the most surprising part of the study was the high percentage of NSSI participants. 

“The literature states that there’s a 5% prevalence of NSSI in the general population, and we found this in 17 out of 65 participants, which is way above what we would expect, even when taking into consideration that university students tend to have a higher NSSI rate than the general population,” said Haley Rhodes, who presented the research at the 2018 Society for Neuroscience Meeting.
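Rhodes’ point can be checked with a quick back-of-the-envelope calculation. This sketch (the function name is mine) computes the exact binomial tail probability: the odds of seeing at least 17 NSSI-positive participants in a sample of 65 if the true prevalence really were 5%.

```python
from math import comb

def binom_tail(n: int, k: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing at least
    k NSSI-positive participants if the true prevalence were p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n, observed = 65, 17          # sample size and NSSI-positive count from the study
expected_prevalence = 0.05    # literature value quoted by Rhodes

print(f"Observed rate: {observed / n:.0%}")   # prints "Observed rate: 26%"
print(f"P(at least {observed} by chance): {binom_tail(n, observed, expected_prevalence):.2e}")
```

The observed rate is roughly five times the literature figure, and the tail probability comes out vanishingly small — consistent with Rhodes’ remark that the sample is unusual even for a university population.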

Rhodes admits that a bigger sample size is necessary before we can draw firm conclusions from the data, but it’s intriguing that there seems to be a minor psychological benefit to the pain — though it most definitely doesn’t warrant any self-harming practices.

Understanding the imbalances in individuals that partake in NSSI might help us find a way to provide for their psychological needs, and allow them to get the same benefits without needing to resort to self-injury.




One broken gene made us very good runners

A genetic fluke two to three million years ago turned humans into the best endurance runners around.

Running man.

Image via Pixabay.

A new paper published by researchers from the University of California San Diego School of Medicine reports that our ancestors’ functional loss of one gene called CMAH dramatically shifted our species’ evolutionary path. The loss altered significant metabolic processes, with impacts on fertility rates and risk of developing cancer.

The same change may have also made humans one of the best long-distance runners on Earth, the team adds.

These genes were made for runnin’

Our ancestors were presumably quite busy two to three million years ago, transitioning from living in trees to living on the savannah. They were able to walk upright by this time, but they weren’t particularly good at it.

However, soon after this, some of our ancestors’ physiology started undergoing some striking changes. Most relevant are the shifts we see in their skeletons, resulting in long legs, big feet, and large gluteal muscles (butts) — all very good for walking around. These shifts were accompanied by the evolution of sweat glands with much the same layout and capacity as ours, which, according to the team, is quite expansive and much better at dissipating heat than that of other large mammals.

In other words, humanity received powerful legs and one of the most solid cooling systems in one fell swoop.

Our ancestors proceeded to use their new toys to hunt and eat anything they could bring down. They did so by adopting a hunting pattern unique among primates (and very rare among animals in general) known as persistence hunting: they would go out in the heat of the day, when other carnivores were resting, relying on their legs and sweat glands to chase prey until — exhausted and overheated — it couldn’t physically run away anymore.

We didn’t know much about the biological changes that underpinned this radical change, however. The first clues were uncovered around 20 years ago — when Ajit Varki, a physician-scientist at the University of California, San Diego (UCSD), and colleagues unearthed one of the first genetic differences between humans and chimps: a gene called CMP-Neu5Ac Hydroxylase (CMAH). Other species of primates also have this gene.

We, however, have a broken version of CMAH. Varki’s team calculated that this genetic change happened 2 million to 3 million years ago, based on the genetic differences among primates and other animals.

More recent research has shown that mouse models with a muscular dystrophy-like syndrome exhibit more acute symptoms when this gene is inactivated. This hinted to Varki that the faulty gene might be what led to the changes our ancestors experienced on the savannah.

“Since the mice were also more prone to muscle dystrophy, I had a hunch that there was a connection to the increased long distance running and endurance of Homo,” said Varki.

UCSD graduate student Jonathan Okerblom, the study’s first author, put the theory to the test. He built mouse running wheels, borrowed a mouse treadmill, and set mice with normal and broken versions of CMAH to the task.

“We evaluated the exercise capacity (of mice lacking the CMAH gene), and noted an increased performance during treadmill testing and after 15 days of voluntary wheel running,” Okerblom explained.

The two then consulted Ellen Breen, Ph.D., a research scientist in the division of physiology, part of the Department of Medicine in the UC San Diego School of Medicine. She examined the mice’s leg muscles before and after running different distances, some after 2 weeks and some after 1 month.

After training, mice with the human-like version of CMAH ran 12% faster and 20% longer than the other mice, the team reports. Breen adds that these mice displayed greater resistance to fatigue, increased mitochondrial respiration, and more hind-limb muscle mass, with more capillaries to increase blood and oxygen supply. Taken together, Varki says, the data suggest CMAH loss contributed to improved skeletal muscle capacity for oxygen utilization.

“And if the findings translate to humans, they may have provided early hominids with a selective advantage in their move from trees to becoming permanent hunter-gatherers on the open range.”

The most likely cause of this change was evolutionary pressures associated with an ancient pathogen, the team explains.

The version of the gene we carry results in the loss of a sialic acid called N-glycolylneuraminic acid (Neu5Gc) and an accumulation of its precursor, N-acetylneuraminic acid (Neu5Ac), which differs by only a single oxygen atom. Sialic acids serve as vital contact points for cell-to-cell interaction and for cellular interactions with the surrounding environment. This change likely led to enhanced innate immunity in early hominids, according to past research.

Sialic acids may also be a biomarker for cancer risk. The team has also reported that certain sialic acids are associated with an increased risk of type 2 diabetes, may contribute to the elevated cancer risk associated with red meat consumption, and can trigger inflammation.

“They are a double-edged sword,” said Varki. “The consequence of a single lost gene and a small molecular change that appears to have profoundly altered human biology and abilities going back to our origins.”

The paper “Human-like Cmah inactivation in mice increases running endurance and decreases muscle fatigability: implications for human evolution” has been published in the journal Proceedings of the Royal Society B.


The UK’s HPV-vaccine effort paid off: infections are down 86%

HPV vaccines work: new data from Public Health England show that the vaccine has led to a significant decline in the number of young women infected with the virus in the UK.

HPV vaccine.

HPV Vaccination in Sao Paulo, Brazil.
Image via Pan American Health Organization PAHO / Flickr.

Between 2010 and 2016, infections with human papillomavirus (HPV) strains 16 and 18 fell by 86% in women aged 16 to 21 who were eligible for the vaccine. HPV 16 and 18 are the two strains responsible for most cervical cancer cases associated with the virus. Overall, the HPV vaccination program introduced in 2008 led to a sharp decline in infections with five high-risk strains of HPV — which collectively cause 90% of cervical cancer cases — the new study reveals.

Safely inoculated

“The study shows the positive effects of HPV vaccinations,” said David Mesher, lead author of the study. “There have been some very positive results from the program.”

The study also identified a decline in several strains of HPV that are not covered by the vaccine — suggesting that it offers some cross-protection against strains it doesn’t directly target.

The study worked with a sample of 15,349 English women aged 16 to 24 who visited a doctor’s office for chlamydia screening between 2010 and 2016. Alongside the reduction in HPV infection rates, the team also reports a decline in diagnoses of genital warts among boys and girls aged 15 to 17 between 2009 and 2017. These are caused by certain HPV strains that the vaccine is meant to protect against.

HPV spreads from infected individuals through sexual contact. The US Centers for Disease Control and Prevention says that “almost every person who is sexually-active will get HPV at some time in their life if they don’t get the HPV vaccine”. For women, it can be especially dangerous, as HPV infections can develop into cervical cancer, the fourth most common cancer in women globally. “Virtually all cervical cancer cases (99%) are linked to genital infection with HPV which is the most common viral infection of the reproductive tract,” writes the World Health Organization (WHO).

There are over 100 known strains of HPV out there, and the vaccine doesn’t protect against all of them. But it does protect against the most dangerous ones. The vaccine is typically administered to girls aged 12 to 13, since that’s when it’s most effective. The UK has achieved an impressive 80% vaccination rate for women between the ages of 15 and 24. According to Mesher, the results show that the HPV vaccine “works better than anybody would have expected”.

Some 80 million people have been vaccinated against HPV globally, the team notes. While vaccination rates have increased over the last 10 years, levels remain low even in some of the world’s more affluent countries, such as the U.S., France, Denmark, and Japan. The UK’s efforts are inspiring other countries to establish vaccination drives.

The paper “The Impact of the National HPV Vaccination Program in England Using the Bivalent HPV Vaccine: Surveillance of Type-Specific HPV in Young Females, 2010–2016” has been published in The Journal of Infectious Diseases.


There are huge differences in how animals see the world — we’re among the crisp-eyed

Not all eyeballs are created equal.

Acuity Kitchen Photo.

Image credits E. Caves, N. Brandley, S. Johnsen, 2018, Trends in Ecology and Evolution.

If seeing is believing, humans probably believe a lot more than other animals, according to new research from Duke University. Our eyes perceive the world in much sharper detail than those of most other members of the animal kingdom, the results suggest.

To see or not to see

The researchers measured the visual sharpness of several species using a measure called ‘cycles per degree’. Basically, this quantifies how many pairs of parallel black and white lines an eye can distinguish within a single degree of vision. The human eye, the team writes, can resolve around 60 cycles per degree. Anything above 60 line pairs starts to look like a blurry grey to us.
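To make the cycles-per-degree figure concrete, here’s a minimal Python sketch (the function name is my own) that converts an acuity value into the width of the finest black-and-white line pair an eye could resolve at a given viewing distance:

```python
import math

def finest_cycle_width(cycles_per_degree: float, distance_m: float) -> float:
    """Width (in metres) of the smallest black-and-white line pair
    an eye with the given acuity can resolve at a given distance."""
    degrees_per_cycle = 1.0 / cycles_per_degree
    return distance_m * math.tan(math.radians(degrees_per_cycle))

# Acuity values mentioned in the article, evaluated at 1 m:
for name, cpd in [("human", 60), ("wedge-tailed eagle", 140), ("elephant", 10)]:
    width_mm = finest_cycle_width(cpd, 1.0) * 1000
    print(f"{name}: finest resolvable line pair ~ {width_mm:.2f} mm at 1 m")
```

At 60 cycles per degree, a line pair narrower than roughly 0.3 mm viewed from one meter away blurs into grey for a human; the eagle resolves stripes less than half that width, while the elephant needs them about six times wider.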

These measured visual acuity levels were then fed into software that transformed a reference image to give us a taste of how other animals see the world (the image above). Compared to most other organisms on the planet, our eyesight is actually crisp:

Eyesight sharpness.

Image credits E. Caves, N. Brandley, S. Johnsen, 2018, Trends in Ecology and Evolution.

The team writes that chimps and other primates see roughly as well as we do. That’s not very surprising, given that they’re our closest living relatives. A few species can boast higher visual acuity than us: the team notes that some birds of prey, such as the Australian wedge-tailed eagle with its 140 cycles per degree, can see over twice as much detail as we do. Given that they need to spot small prey from thousands of meters away, that’s to be expected. Apart from these, however, we humans seem to have quite good eyesight.

Fish and most birds, the team reports, can only distinguish about 30 cycles per degree. Elephants can only see a paltry 10 — which is actually the level at which a human is declared legally blind.

The team also explored the implications of their findings. It’s easy to assume that every living thing sees the world roughly the same way we do, but the results show there’s an incredible variation in visual acuity. They note the case of the cleaner shrimp, which “likely cannot resolve one another’s colour patterns, even from distances as close as 2 cm”. Then what’s the point of sporting bright colors and waving your antennae or your body around?

The team believes that this behavior isn’t meant to communicate anything to other cleaner shrimp — it’s meant to signal fish: “both [the shrimps’] colour patterns and antennae are visible to fish viewers of various acuities from a distance of at least 10 cm,” they write.

“Thus, these distinctive colour patterns and antennae-whipping behaviors likely serve as signals directed at clients, despite the inability of cleaner shrimp themselves to distinguish them.”

They make a similar point about butterflies. Based on the team’s results, these animals probably can’t even distinguish each other’s patterns. Birds, however, can.

“The point is that researchers who study animal interactions shouldn’t assume that different species perceive detail the same way we do,” Caves concludes.

While I do find the findings fascinating, it’s important to note that animals may actually see better than their visual acuity alone suggests. The team’s research only focused on how their eyes work, but ‘seeing’ is mostly handled by the brain. It may very well be that these relatively dim-sighted species have neural systems in place to improve the final images they perceive.

For now, we simply don’t know. Judging from the amount of data each species’ eyes record, however, it may be that we are some of the sharpest-eyed animals out there.

The paper “Visual Acuity and the Evolution of Signals” has been published in the journal Trends in Ecology and Evolution.


Three confirmed, six suspected deaths from emerging Nipah virus in India

Health officials in the state of Kerala, India, report that nine people have lost their lives in confirmed and suspected cases of the emerging Nipah virus.


Transmission electron micrograph (TEM) showing a number of Nipah virus virions isolated from a patient’s cerebrospinal fluid.
Image credits CDC / C. S. Goldsmith, P. E. Rollin.

Three victims have tested positive for the virus in the past two weeks. The results from the other six are expected later today. A further twenty-five people have been hospitalized with symptoms indicative of the same infection in Kozhikode, Kerala.

Nipah is one of the viruses on the list of the most dangerous viral threats, candidates for a major outbreak, published by the WHO — in fact, it was at the top of the list. It got there by virtue of two characteristics: Nipah can be transmitted to humans from animal hosts, and there is no current treatment against it. Nipah has a mortality rate of 70%.

Fruit bats are currently considered to be one of the most prolific carriers and spreaders of the virus. Local authorities reported finding mangoes bitten by bats in the home of three suspected Nipah victims. Furthermore, Kerala’s health secretary Rajeev Sadanandan told the BBC that a nurse who treated the patients had also died. However, doctors are yet to confirm if she had contracted the Nipah virus, The Indian Express adds.

“We have sent blood and body fluid samples of all suspected cases for confirmation to National Institute of Virology in Pune. So far, we got confirmation that three deaths were because of Nipah,” he said.

“We are now concentrating on precautions to prevent the spread of the disease since the treatment is limited to supportive care.”

The first time we had seen the Nipah virus (NiV) was during a 1999 outbreak of encephalitis and respiratory illness in Malaysia and Singapore. The outbreak centered around pig farmers and other people in close contact with pigs, suggesting the animals were helping spread the disease. More than a million animals were euthanized in a bid to limit the spread.

The outbreak reached nearly 300 confirmed human infections and 100 deaths. However, in subsequent NiV outbreaks, there were no intermediate hosts.

Nipah’s symptoms include fever, headache, drowsiness, respiratory illness, disorientation and mental confusion — and can progress to coma within 24-48 hours. The WHO recommends avoiding contact with sick pigs or bats in endemic areas, as well as not drinking raw date palm sap as precautions against infection.

Human-like walking evolved before the genus Homo, more than 3.6 million years ago

Hominins may have started walking like us much earlier than believed, new research shows.


Image credits Ajay Karat.

It may not be obvious, but we draw a lot of meaning from the way we walk — from the way we walk and other animals don’t, to be more precise.

At the start of each playthrough in the latest installment of Civilization, a narrator eases you into the game with the words “from man’s first upright steps”. We laugh when pets rise on their hind legs and share videos of it with our friends, chuckling all around at the animal that ‘thinks it’s human!’. That ubiquitous picture showing the evolution of man starts with an ape walking on all fours on the left and ends with a man walking on just two legs. And, well, to understand somebody, your best bet seems to be to walk a mile in his or her shoes.

In subtle ways, we see our bipedal walk as something that sets us apart from the rest of the animals on Earth. Something that’s intrinsic to what it means to be human.


As such, ever since we’ve realized that people had to evolve from apes, our collective imagination has given quite a lot of thought to the moment when our ancestors rose from ape-like postures to the upright gait we use today. Scientists have also been interested in finding out when it happened, as that would give us precious insight into the way our ancestors lived, hunted, and evolved.

Fossilized footprints discovered in Laetoli, Tanzania, suggest that the hallmark human-like, extended-leg bipedalism evolved substantially earlier than previously believed.

“Fossil footprints are truly the only direct evidence of walking in the past,” said David Raichlen, PhD, associate professor at the University of Arizona, one of two authors of the paper describing these findings.

“By 3.6 million years ago, our data suggest that if you can account for differences in size, hominins were walking in a way that is very similar to living humans. While there may have been some nuanced differences, in general, these hominins probably looked like us when they walked.”

Our species, Homo sapiens sapiens, is believed to have emerged some 200,000 to 300,000 years ago. Homo, our genus (extended family) as a whole, likely emerged some 2-2.5 million years ago. These are ‘humans’.

We refer to the wider set of ancestors that came before Homo as ‘hominins’, although there is still a lot of debate on what species should be included in that group and what the relationships between them are. This is partly due to the fact that species evolve gradually, in small increments, making it hard to distinguish clearly, partly due to a lackluster fossil record, and partly because evolutionary anthropologists seem decided not to agree on anything, ever.

Common wisdom up to now held that hominins took to two feet around 7 million years ago. Based on observations of how other primate species evolved, however, it was also largely held that these hominins likely walked in a crouched posture, with legs bent, for quite some time.

Walk a mile in this mud

Raichlen’s and his colleagues’ work focuses on reconstructing walking mechanics starting from fossilized footprints and skeletons of early human ancestors. Together with co-author Adam Gordon (University at Albany), he used a combination of morphological studies and experimental data to show that the Laetoli footprints point to a fully upright, human-like bipedal style of walking.

In one experiment, the duo asked eight volunteers to walk in either an upright or stooped posture, with bent knees and hips, on a mud surface — then compared the depth and shape of their footprints to the Laetoli ones. When they analyzed the impression made by the toe versus the heel, which reflects how the center of pressure moves along your foot as you take a step, they found the footprints at Laetoli were much more similar to the footprints made by modern humans walking upright than stooped.

Our ancestors used to walk with a similar upright gait 3.6 million years ago. Credit: David Raichlen, University of Arizona.


Raichlen believes this can be explained by simple economics. When walking upright, having your legs fully extended uses less energy than adopting a stooped, more ape-like position. This suggests that the switch to a more human-like gait likely had something to do with how our ancestors found food, and how far they had to travel to find it.

“The data suggest that by this time in our evolutionary history, selection for reduced energy expenditures during walking was strong,” said Raichlen.

“This work suggests that, by 3.6 million years ago, climate and habitat changes likely led to the need for ancestral hominins to walk longer distances during their daily foraging bouts. Selection may have acted at this time to improve energy economy during locomotion, generating the human-like mechanics we employ today.”

Raichlen, however, cautions that we still don’t know when hominins started adopting different gaits than other primates — we just know that they were walking upright 3.6 million years ago. Until we find the right footprints, that question will remain unanswered.

The paper “Using experimental biomechanics to reconstruct the evolution of hominin locomotor postures” has been presented at the American Association of Anatomists’ annual meeting during the 2018 Experimental Biology meeting.

We can’t grow new neurons in adulthood after all, new study says

Previous research has suggested neurogenesis — the birth of new neurons — was able to take place in the adult human brain, but a new controversial study published in the journal Nature seems to challenge this idea.

a. Toluidine-blue-counterstained semi-thin sections of the human Granule Cell Layer (GCL) from fetal to adult ages. Note that a discrete cellular layer does not form next to the GCL and the small dark cells characteristic of neural precursors are not present.

Scientists have been struggling to settle the matter of human neurogenesis for quite some time. The first study to challenge the old theory that humans did not have the ability to grow new neurons after birth was published in 1998, but scientists had been questioning this entrenched idea since the 1960s, when emerging techniques for labeling dividing cells revealed the birth of new neurons in rats. Another neurogenesis study was published in 2013, reinforcing the validity of the results from 1998.

Arturo Alvarez-Buylla, a neuroscientist at the University of California, San Francisco, and his team conducted a study to test the neurogenesis theory using immunohistochemistry — a process that applies various fluorescent antibodies on brain samples. The antibodies signal if young neurons as well as dividing cells are present. Researchers involved in this study were shocked by the findings.

“We went into the hippocampus expecting to see many young neurons,” says senior author Arturo Alvarez-Buylla. “We were surprised when we couldn’t find them.”

In the new study, scientists analyzed brain samples from 59 patients of various ages, ranging from fetal stages to the age of 77. The brain tissue samples came from people who had died, or from tissue extracted during unrelated brain surgery. Scientists found new neurons forming in prenatal and neonatal samples, but they did not find any sustainable evidence of neurogenesis in humans older than 13. The research also indicates that the rate of neurogenesis drops 23-fold between the ages of one and seven.

But some other uninvolved scientists say that the study left much room for error. The way the brain slices were handled, the deceased patients’ psychiatric history, or whether they had brain inflammation could all explain why the researchers failed to confirm earlier findings.

The 1998 study was performed on brains of dead cancer patients who had received injections of a chemical called bromodeoxyuridine while they were still alive. The imaging molecule — which was used as a cancer treatment — became integrated into the DNA of actively dividing cells. Fred Gage, a neuroscientist involved in the 1998 study, says that this new paper does not really measure neurogenesis.

“Neurogenesis is a process, not an event. They just took dead tissue and looked at it at that moment in time,” he adds.

Gage also thinks that the authors used overly restrictive criteria for counting neural progenitor cells, thus lowering the chances of seeing them in adult humans.

But some neuroscientists agree with the findings. “I feel vindicated,” Pasko Rakic, a longtime outspoken skeptic of neurogenesis in human adults, told Scientific American. He believes the lack of new neurons in adult primates and humans helps preserve complex neural circuits. If new neurons were constantly born throughout adulthood, they could interfere with precious preexisting circuits, causing chaos in the central nervous system.

“This paper not only shows very convincing evidence of a lack of neurogenesis in the adult human hippocampus but also shows that some of the evidence presented by other studies was not conclusive,” he says.

Dividing neural progenitors in the granule cell layer (GCL) are rare at 17 gestational weeks (orthogonal views, inset) but were abundant in the ganglionic eminence at the same age (data not shown). Dividing neural progenitors were absent in the GCL from 22 gestational weeks to 55 years.

Steven Goldman, a neurologist at the University of Rochester Medical Center and the University of Copenhagen, said, “It’s by far the best database that has ever been put together on cell turnover in the adult human hippocampus. The jury is still out about whether there are any new neurons being produced.” He added that if there is neurogenesis, “it’s just not at the levels that have been presumed by many.”

The debate still goes on. No one really seems to know the answer yet, but I think that’s a positive — the controversy will generate a new wave of research on the subject.


Man-made: we’ve domesticated our own species

They don’t make humans like they used to — quite literally. Living in social groups has led us to self-domesticate our species, new research finds.


Image credits hairymuseummatt / Wikimedia.

According to the hypothesis of human self-domestication, one of the forces that powered and steered our evolution was artificial selection in the cave, tribe, or hut. As we like to live with other people (and there’s no indication that this was ever different), prosocial behavior became valuable while antisocial behavior became increasingly shunned throughout our history — creating a selective evolutionary pressure for the former. New research, led by Professor Cedric Boeckx from the University of Barcelona, offers genetic evidence in favor of this hypothesis.

The goodest boy is you!

Self-domestication is, in broad strokes, pretty much like regular domestication. Outwardly, it leads to a change in anatomical features — for example, imagine how dogs have adorable droopy ears and rounder heads while wolves have scary pointed ears and more angular heads. It also leads to behavioral changes, most notably a reduction in aggressiveness. The key difference, however, is that self-domestication of a species is done internally, if you will, without input from other species.

It’s a process that several researchers believe helped shape modern humans, as well as other species, such as bonobos. Up to now, however, we lacked any genetic evidence to help prop this hypothesis up. Boeckx’s team worked with the genomes of our extinct (and wild) Neanderthal and Denisovan relatives to try and determine whether humans have, in fact, domesticated themselves.

The researchers first compiled a list of domestication-associated genes in humans, based on a comparison with the Neanderthals and Denisovans — wild but extinct human species. They then compared this list against the genomes of several domesticated animals and their wild-type counterparts, looking for overlapping genes associated with domestication, such as those linked to docile behavior or gracile facial features.

According to their paper, a “statistically significant” number of domestication-associated genes overlapped between modern humans and domesticated animals, but not their wild types. The team says these results strengthen the self-domestication hypothesis and help “shed light on […] our social instinct.”

“One reason that made scientists claim that humans are self-domesticated lied within our behavior: modern humans are docile and tolerant, like domesticated species, our cooperative abilities and pro-social behaviour are key features of our modern cognition,” says Boeckx.

“The second reason is that modern humans, when compared to Neanderthals, present a more gracile phenotype that resembles the one seen in domesticates when compared to their wild-type cousins,” added the expert.

[accordion style=”info”][accordion_item title=”What’s a phenotype?”]The phenotype represents an organism’s characteristics resulting from the interaction between its genes (genotype) and the environment. While the genotype dictates characteristics such as, say, eye color, skin color, or maximum height, your phenotype is your actual height (a product of nutrition and genetics), your actual skin color (a product of exposure to sunlight and genetics), and so on.

As a rule of thumb, the genotype is the digital blueprint and the phenotype is what actually came out on the production line.[/accordion_item][/accordion]


The team used other statistical measures, including control species, to make sure they weren’t picking up on a fluke. To rule out the possibility that these genes would randomly overlap between humans and domesticated animals by chance, they also compared the genomes with those of great apes.

“We found that chimpanzees, orangutans and gorillas do not show a significant overlap of genes under positive selection with domesticates. Therefore, it seems there is a ‘special’ intersection between humans and domesticated species, and we take this to be evidence for self-domestication,” Boeckx said.
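The kind of overlap test the team relies on can be illustrated with a standard hypergeometric calculation, a common way to score whether two gene sets share more members than chance would predict. Note this is only a sketch of the general technique, not the paper’s actual analysis, and every number below (total genes, set sizes, overlap) is made up for illustration:

```python
from math import comb

def overlap_pvalue(N, K, n, k):
    """Hypergeometric upper tail: the probability of drawing at least k
    'domestication' genes when n genes are sampled from a genome of N genes,
    of which K are domestication-associated."""
    return sum(
        comb(K, i) * comb(N - K, n - i)
        for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Hypothetical figures: a 20,000-gene genome, 700 domestication-associated
# genes in animals, 740 human candidate genes, 41 genes shared.
# Chance alone would predict about 740 * 700 / 20,000 ≈ 26 shared genes.
p = overlap_pvalue(20_000, 700, 740, 41)
print(f"p = {p:.3g}")
```

If the resulting p-value is small, the overlap is unlikely to be a coincidence — which is the logic behind calling the human–domesticate intersection “statistically significant” while the great-ape comparisons come out non-significant.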

There’s still a lot of work left if we’re to tease out the physical, mental, and behavioral characteristics that these genes impart onto us. However, some broad lines can already be drawn, the team believes: Boeckx himself suspects that self-domestication might explain why humans are so ridiculously cooperative, as well as our “special mode of cognition”.

The paper “Self-domestication in Homo sapiens: Insights from comparative genomics” has been published in the journal PLOS ONE.


Humans got a brain upgrade less than 200,000 years ago, and it made us what we are today

Humans have been around this savannah for quite some time now — measured in the millions of years — but ‘modern’ humans, the ones we’re used to today, have only been around for less than 200,000 years, new research suggests.

Brain comparisons.

The blue brain shows the globular shape of present-day human heads. In contrast, the red skull shape of a Neandertal, like the earliest Homo sapiens fossils, is elongated.
Image credits Simon Neubauer, Jean-Jacques Hublin, Philipp Gunz / MPI EVA Leipzig.

Novel research from the Max Planck Institute for Evolutionary Anthropology, Germany, reveals that modern humans have a distinct brain and skull architecture which likely fully developed around 40,000 years ago.

Brains for days

The team used 3-D scanning technology to examine the curves, shapes, and features of 20 different Homo sapiens fossils ranging from 300,000 to 10,000 years old. Using this data, they then constructed a timeline of how our skulls (and the brains therein) changed and evolved through history.

Among other traits, they note that our skulls are rounder than our ancestors’, and our facial bones smaller and more retracted into the skull. Such changes accommodate a bigger and rounder cerebellum (the ‘little brain’ at the back of the skull, which chiefly handles motor control and balance), a beefed-up parietal lobe (which plays a part in planning, attention, and spatial orientation), and a less robust facial structure, they say.

At the same time, our brains grew in overall size, with the skull’s side walls becoming more parallel and the frontal area growing higher to accommodate the increased volume. Finally, the occipital area, at the back of the head, traded its ‘overhanging’ shape for a more rounded structure.

The authors remark that changes in modern human brain shape can be tracked almost perfectly alongside the development of what we call modern behavior. Such behavior includes multi-component tool manufacture and use, bone carving, planning, but also a wider range of cognitive processes from self-awareness, language and funeral practices, to the use of pigments and the creation of art.

The team says that these changes in size and shape occurred “at some point after about 100,000 years ago and probably before 35,000 years ago.” It wasn’t an overnight affair, but likely an accumulation of slight changes over tens of thousands of years. Still, it would mean that our skulls and brains were well into their new shape around the same time that the “human revolution,” a rapid emergence of modern behaviors in humans, started.

“Our new data, therefore, suggest evolutionary changes to early brain development in a critical and vulnerable period for neural wiring and cognitive development,” said the paper’s lead author, Simon Neubauer.

However, the team notes that while our brain’s rounder, fuller shape is “related to our [modern] behavior,” it’s likely not its cause.

“It is important to note that it is probably not the globular brain shape itself that is advantageous for brain function,” he adds.

“The parietal lobe is an important hub in connecting different brain regions and involved in functions like orientation, attention, sensorimotor transformations underlying planning and visuospatial integration. The cerebellum is involved in motor-related functions like the coordination of movements and balance, but also in functions like working memory, language, social cognition and affective (emotional) processing.”

The paper “The evolution of modern human brain shape” has been published in the journal Science.