A landmark study found that only 1.5% to 7% of the human genome is uniquely (modern) human. The rest is shared with relatives such as Neanderthals and Denisovans.
However, the DNA that is unique to us is pretty important, as it’s related to brain development and function.
Researchers used DNA from fossils of our close relatives (Neanderthals and Denisovans) dating from around 40,000-50,000 years ago and compared it with the genomes of 279 modern people from around the world. They used a new computational method that allowed them to disentangle the similarities and differences between the genomes in greater detail.
Many people around the world (all non-African populations) still carry Neanderthal genes, a testament to past interbreeding between the two species. But the importance of this interbreeding may have been understated. The new study found that just 1.5% of the human genome is both uniquely human and shared among all people living today, and that up to 7% of the human genome is more closely related to that of other humans than to that of Neanderthals or Denisovans.
This doesn’t mean that we’re 93% Neanderthal. In fact, just 20% of Neanderthal DNA survives in modern humans, and non-African humans contain just around 1.5-2% Neanderthal DNA. But if you look at different people, they have bits of Neanderthal DNA in different places. So if you add all the parts where someone has Neanderthal DNA, that ends up covering most of the human genome, although it’s not the same for everyone. This 1.5% to 7% uniquely human DNA refers to human-specific tweaks to DNA that are not present in any other species and are strictly unique to Homo sapiens.
In addition, this doesn’t take into account the places where humans gained or lost DNA through other means such as duplication, which could have also played an important role in helping us evolve the way we are today.
What makes us human
The research team was surprised to see just how little DNA is ours and ours alone. But those small areas that make us unique may be crucial.
“We can tell those regions of the genome are highly enriched for genes that have to do with neural development and brain function,” University of California, Santa Cruz computational biologist Richard Green, a co-author of the paper, told AP.
The exact biological function of those bits of DNA remains a major puzzle to disentangle. Our cells are filled with “junk DNA”, which we don’t really use (or we just don’t understand how our bodies use it yet) — but we still seem to need it. We’re not even sure what the non-junk DNA bits do. Understanding the full instructions and roles that genes have is another massive challenge that’s not yet solved.
What this study seems to suggest is that interbreeding played a much bigger role in our evolutionary history than we thought. Previous archaeological studies also suggest this: humans interbred with Neanderthals, Denisovans, and at least one other mysterious species we haven’t discovered yet (but we carry its DNA). Researchers are finding more and more evidence that these interbreeding events weren’t necessarily isolated exceptions but could have happened multiple times and over a longer period than initially thought. It’s up for future studies to reconcile the archaeological and anthropological evidence with the genetic one.
The study also found that the human-specific mutations seemed to emerge in two distinct bursts: one around 600,000 years ago and another around 200,000 years ago. It’s not clear what triggered these bursts; it could have been an environmental challenge or some other event that is, at this point, unknown.
Researchers say that studying this 1.5-7% of our genome could help us better understand Neanderthals and other ancient populations, but it could also help us understand what truly makes us human. For instance, you could set up a laboratory dish experiment where you’d edit out the human-specific genes, revert them to their Neanderthal versions, and compare the molecular results of this change. It wouldn’t exactly be like bringing back a Neanderthal, but it could help us deduce how Neanderthals would have been different from modern humans — or, conversely, what makes humans stand out from our closest relatives.
The study “An ancestral recombination graph of human, Neanderthal, and Denisovan genomes” has been published in Science.
There has long been a general assumption that human beings are essentially selfish. We’re apparently ruthless, with strong impulses to compete against each other for resources and to accumulate power and possessions.
If we are kind to one another, it’s usually because we have ulterior motives. If we are good, it’s only because we have managed to control and transcend our innate selfishness and brutality.
This bleak view of human nature is closely associated with the science writer Richard Dawkins, whose book The Selfish Gene became popular because it fitted so well with (and helped to justify) the competitive and individualistic ethos of late 20th-century societies.
Like many others, Dawkins justifies his views with reference to the field of evolutionary psychology. Evolutionary psychology theorises that present-day human traits developed in prehistoric times, during what is termed the “environment of evolutionary adaptedness”.
This is usually seen as a period of intense competition, when life was a kind of Roman gladiatorial battle in which only the traits that gave people a survival advantage were selected and all others fell by the wayside. And because people’s survival depended on access to resources – think rivers, forests and animals – there was bound to be competition and conflict between rival groups, which led to the development of traits like racism and warfare.
This seems logical. But in fact the assumption it’s based on — that prehistoric life was a desperate struggle for survival — is false.
It’s important to remember that in the prehistoric era, the world was very sparsely populated. So it’s likely there was an abundance of resources for hunter-gatherer groups.
According to some estimates, around 15,000 years ago, the population of Europe was only 29,000, and the population of the whole world was less than half a million. With such small population densities, it seems unlikely that prehistoric hunter-gatherer groups had to compete against each other or had any need to develop ruthlessness and competitiveness, or to go to war.
There’s also significant evidence from contemporary hunter-gatherer groups who live in the same way as prehistoric humans. One of the striking things about such groups is their egalitarianism.
As the anthropologist Bruce Knauft has remarked, hunter-gatherers are characterised by “extreme political and sexual egalitarianism”. Individuals in such groups don’t accumulate their own property and possessions. They have a moral obligation to share everything. They also have methods of preserving egalitarianism by ensuring that status differences don’t arise.
The !Kung of southern Africa, for example, swap arrows before going hunting, and when an animal is killed, the credit does not go to the person who fired the arrow, but to the person the arrow belongs to. And if a person becomes too domineering or arrogant, the other members of the group ostracise them.
ǃKung woman making jewellery next to a child. Staehler / Wikimedia Commons, CC BY.
Typically in such groups, men have no authority over women. Women usually choose their own marriage partners, decide what work they want to do and work whenever they choose to. And if a marriage breaks down, they have custody rights over their children.
Many anthropologists agree that such egalitarian societies were normal until a few thousand years ago, when population growth led to the development of farming and a settled lifestyle.
Altruism and egalitarianism
In view of the above, there seems little reason to assume that traits such as racism, warfare and male domination should have been selected by evolution – as they would have been of little benefit to us. Individuals who behaved selfishly and ruthlessly would be less likely to survive, since they would have been ostracised from their groups.
It makes more sense then to see traits such as cooperation, egalitarianism, altruism and peacefulness as natural to human beings. These were the traits that have been prevalent in human life for tens of thousands of years. So presumably these traits are still strong in us now.
Of course, you might ask: if this is the case, why do present-day humans often behave so selfishly and ruthlessly? Why are these negative traits so normal in many cultures? Perhaps these traits should be seen as the result of environmental and psychological factors instead.
Research has shown repeatedly that when the natural habitats of primates are disrupted, they tend to become more violent and hierarchical. So it could well be that the same thing has happened to us, since we gave up the hunter-gatherer lifestyle.
In my book The Fall, I suggest that the end of the hunter-gatherer lifestyle and the advent of farming was connected to a psychological change that occurred in some groups of people. There was a new sense of individuality and separateness, which led to a new selfishness, and ultimately to hierarchical societies, patriarchy and warfare.
At any rate, these negative traits appear to have developed so recently that it doesn’t seem feasible to explain them in adaptive or evolutionary terms. This suggests that the “good” side of our nature is far more deep-rooted than the “evil” side.
Exactly why humans lost their fur is unclear, but our skinny exteriors set us apart from most of our mammalian cousins.
Still, shedding our fur had a dramatic effect on the evolution of our species. To understand why, let’s take a look at what body hair does and what benefits or drawbacks it brings to the table.
First off, what is it?
Each strand of hair is a filament made out of the protein keratin; when hair grows thickly across an animal’s body, we call it fur.
Hair is a hallmark of mammals, but this family doesn’t have a monopoly on it. Insects grow hairs too, although theirs are different in structure from our own. Bees, for example, use their hairs to keep warm, but also as sensory organs and to carry pollen. Other insects, such as the fruit fly Drosophila, mostly use them as sensors for tactile (touch) and olfactory (smell) input — a fly’s antennae work similarly to our noses, having a pore to let odors in and neurons at the base of the strands to sense them.
Some lichens, algae, plants, and a group of organisms called protists can grow trichomes, a rough equivalent of hair. Trichomes serve a wide range of functions including nutrition, the absorption and retention of water, and protection from radiation, insects, or larger herbivores.
And now, fur
Fur consists of an undercoat of finer hairs that helps trap heat, and an outer coat (the ‘guard’) that’s oily and keeps out water.
The most obvious use of fur is to help you keep warm. Animals that produce their own body heat, most notably mammals, use fur as an energy-saving mechanism. A coat of hair traps air around the animal’s body, which provides insulation. Going without one is like always keeping the window open during winter — you can probably keep the place warm, but your energy bill will skyrocket. Evolution doesn’t like paying bills.
Fur protects against damage from the elements or other threats. A hairless animal on a cold winter night could avoid hypothermia but still develop frostbite (because tissues can’t transfer heat fast enough to prevent freezing). In a pinch, fur can also ward off light scratches or bruises, and some animals have water-repellent coats.
The most obvious drawback of having fur is that it costs energy and nutrients to grow. Hairs are renewed constantly to keep them healthy and efficient, and when you’re covered with them, it adds up fast. A 40 kg sheep, for example, can produce up to 13.6 kg (30 pounds) of wool per year and eats roughly 1.1 kg of dry food per day (around 400 kg per year), according to Smiling Tree Farm. Its fleece thus weighs around 3.4% of its total annual food intake — another way to look at it is that for about two weeks out of the year, this hypothetical sheep eats only to grow its fur.
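The sheep arithmetic above checks out in a few lines. The figures (13.6 kg of wool per year, 1.1 kg of dry feed per day) come from the article itself; treat the result as a rough estimate, since the intake numbers are approximations.

```python
# Back-of-the-envelope check of the sheep example, using the
# article's own figures (approximate values).
wool_per_year_kg = 13.6
feed_per_day_kg = 1.1
feed_per_year_kg = feed_per_day_kg * 365   # ~401.5 kg of dry food

fleece_share = wool_per_year_kg / feed_per_year_kg
days_equivalent = fleece_share * 365       # days of eating "spent" on wool

print(f"fleece is {fleece_share:.1%} of annual intake")  # ~3.4%
print(f"that is roughly {days_equivalent:.0f} days of food per year")
```

Which lands on about 12 days, in line with the "two weeks" figure quoted above.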
Most other drawbacks of fur are dependent on context. Wet fur is a complete liability as it’s heavy and good at trapping water, which will chill you thoroughly. Dry fur is a good insulator but can also make you overheat in hot climates (most mammals apart from primates don’t sweat). Fur is a great home for parasites and creepy crawlies. Shed hairs can create a scent trail for predators to follow — even human hunters look for hairs in the brush when stalking prey.
The hair on our heads still acts as insulation and offers some degree of protection against solar radiation. A testament to its usefulness is that the follicles on our scalps (these are the ‘foundations’ from which hairs grow) have longer active growth periods than any other on our body. The strands of hair on our head can keep growing for years, whereas most others grow for weeks or a few months.
But not all hair grows to keep us warm. Our eyebrows are designed to keep sweat out of our eyes, and are thus a product of our lack of fur. We have hair inside our noses, meant to keep out dust and other particles. And tiny hair-like cells inside our ears allow us to keep our balance.
Hair is also meant to help us mate and signal various information to the group. Facial hair for men as well as body and pubic hair for both sexes show maturity. Guys tend to be the hairier of the sexes, and this is a product of testosterone. The most common theory as to why is that it helped keep ancient men warm on those long cold nights out hunting mammoths. But that wouldn’t explain why both sexes have a thin, almost invisible coat of body hair; from experience, I can also vouch that a hairy forearm won’t do much good in winter.
One proposed alternative is that it’s meant to help us feel parasites, and that women tended to favor men who had fewer parasites on them (sounds reasonable). This would have generated an evolutionary pressure for hairier men, as women essentially selected for this trait.
Finally, many species employ hairs as sensory organs. Whiskers are a prime example. They’re academically known as ‘vibrissae’ and come in two forms: the longer, thicker ‘macrovibrissae’ (which animals can typically move voluntarily) and the smaller, thinner ‘microvibrissae’ (which are typically immobile). Whiskers grow in groups on various parts of an animal’s body, most commonly on their snouts, and are used to sweep a wider area using touch. They vibrate when coming in contact with something, and blood vessels at their roots amplify this vibration for the animal to perceive.
If you’re a cat sticking your head into a dark mouse hole, having a good set of whiskers can help you find your meal. Spiders are another great example of hairs used as sensory organs. They wear their bones (a chitinous exoskeleton) on the outside. Hairs grow out of this skeleton and help transmit vibrations from the soil or web to the animal, acting like hearing or a long-range sense of touch. Note that while mammals grow their hair from keratin, arthropods grow theirs from chitin.
So why don’t we have fur?
You might be surprised to hear that humans and other primates have virtually the same density of hair follicles (and thus, hairs) over most of their body. The difference is that ours is ‘vellus hair’, so short and fine that it’s almost invisible. So it’s not that we lost our body hair; we just changed its type.
We don’t really know why this happened. We have several theories, though.
One of them is that, as our ancestors moved down from the trees, they also discovered seafood. Since wet fur isn’t effective, this could have favored individuals with less fur. However, this isn’t regarded as the likely reason, or at least not the main one.
A more widely-accepted theory is that our ancestors needed to better control their temperature as they switched from living in forests to living in the savannah. No fur meant they could sweat more, preventing heat exhaustion. This may have directly underpinned the success of our species by allowing us to outlast, and thus capture, prey.
Our largest departure from the primate family in regards to our skin comes from our sweat glands. Humans have up to ten times as many eccrine (sweat) glands as chimps or macaques. We also have very few apocrine glands, which produce an oily substance. Our primate cousins can have equal parts eccrine and apocrine glands, and other animals such as dogs have apocrine glands over most of their body, with eccrine ones only on the pads of their feet or other hairless areas.
The truth is probably somewhere in the middle, and our hairlessness was caused by several factors working together. Environmental pressures and a selective advantage during hunts started the process, and human ingenuity (which means fires and clothes to keep warm) kept it going up to today. However, the heavy presence of sweat glands on our skin suggests that thermoregulation (keeping our bodies’ heat just right) was a major advantage of our hairless outsides.
Keep in mind that there are still many unknowns regarding our hairlessness. But two moments in our evolutionary history could have started this transition.
The first was in our very ancient past, as our furry ancestors climbed down from the trees. Humans have never been too physically imposing, and our ancestors were probably similar in this regard. The theory goes that they focused their activity during the hottest parts of the day in order to avoid predators (who would be hiding from the sun and avoiding activity), which made it advantageous to sweat and made fur impractical. This likely took place around four to seven million years ago. Essentially, in this scenario, it was our efforts to avoid being eaten that lost us our fur.
The second possibility is that humanity needed to shed the hairs in order to be able to hunt. Again, our bodies are very tiny and fragile compared to most wildlife. We’re slower than most animals we’d like to hunt, have no fangs, no claws, and can’t roar. We had tools, maybe spears, to help, but they couldn’t make up all the difference.
However, what our ancestors (and call center operators) can tell you is that you can accomplish a lot if you just tire your competition out. Persistence hunting was one of the few ways our ancestors could acquire large quantities of meat apart from maybe scavenging (which is dangerous and not very lucrative).
Despite our many shortcomings, humans are the best persistence hunters on the planet (or at least among the top ones) simply because we can sweat and cool down even while running. Our ancestors figured out that they didn’t need to fight and stab the antelope; they could just scare it away and chase it, tire it out so that it couldn’t fight back. And then stab it — safely.
This scenario would be more recent — around two million years ago — as hominins like Homo erectus (one of the first hominins with essentially modern body proportions) started hunting. Bones with tool marks discovered at Homo erectus sites show that they were hunting and butchering large prey regularly. Their bone structures suggest they could walk and run much better than earlier hominins due to longer legs, a foot structure more adapted to walking upright, and larger butt muscles. In this scenario, it was our efforts to catch and eat other animals that cost us our fur.
We may never know for sure why our skin is pink and bare when all our relatives are furry. Personally, I find “because it helped our ancestors survive, somehow” to be a satisfying answer. What I do find more interesting to ponder, however, is whether the way our ancestors wanted to live made fur obsolete, or whether they lost their coats first and then made the most of it.
Wherever the answers may lie, I’m pretty happy to be a skinny ape; especially as I pick clumps of my cat’s fur from the floor.
Around 40,000 years ago, the northern hemisphere went through an abrupt and prolonged shift in climate that coincided with the extinction of many species. Both Homo sapiens and Neanderthals had to struggle under these conditions but it was only our species that survived, despite the fact that evidence suggests that Neanderthals were perhaps every bit as resourceful as us.
A new study suggests that it was, in fact, competition with humans that sealed Neanderthals’ fate.
Neanderthals had been in Europe and Asia for around 300,000 years, whereas humans ventured out of Africa into Eurasia through the Middle East around 60,000 years ago.
Before humans entered the picture, Neanderthals were dispersed all around Eurasia and had successfully lived through multiple shifts in climate that caused their food supply to dwindle. Nevertheless, they braved through such obstacles and challenges time and time again.
But despite having survived for hundreds of thousands of years, within a couple of thousand years of first contact with humans — which also resulted in interbreeding, as evidenced by the fraction of Neanderthal DNA that can still be found in our genomes to this day — the Neanderthals were toast.
For many years, scientists have been debating what the final days of our extinct cousins looked like and what exactly brought about their demise. The climate change factor is obviously important, but so is competition with humans. Which had more weight, though?
This vital question was at the forefront of a new study published by a team of researchers led by Axel Timmermann, director of the IBS Center for Climate Physics at Pusan National University in South Korea.
Timmermann and colleagues devised the first mathematical model that simulates the migration of both Neanderthals and Homo sapiens, as well as their interactions (competition, interbreeding, etc.) in a changing climate.
Their work is based on well-established climate models and takes into account a time-varying glacial landscape and shifts in temperature, rainfall, and vegetation patterns.
The model assumes that both hominins compete for the same food resources and allows a small fraction of their members to interbreed.
“This is the first time we can quantify the drivers of Neanderthal extinction,” said Timmermann. “In the computer model I can turn on and off different processes, such as abrupt climate change, interbreeding or competition.”
According to Timmermann, the only realistic scenario in which Neanderthals go extinct is when Homo sapiens had advantages over their cousins in terms of exploiting shared resources.
Although Neanderthals used tools, clothing, and engaged in cultural and ritualistic practices like humans, our species may have had a technological edge by employing superior hunting techniques and weapons. Homo sapiens may have also been more resilient to pathogens.
Of course, this isn’t the final word in this heated debate, but it does paint a convincing picture that humans were largely responsible for the sudden downfall of Neanderthals.
“Neanderthals lived in Eurasia for the last 300,000 years and experienced and adapted to abrupt climate shifts, that were even more dramatic than those that occurred during the time of Neanderthal disappearance. It is not a coincidence that Neanderthals vanished just at the time, when Homo sapiens started to spread into Europe,” says Timmermann. He adds: “The new computer model simulations show clearly that this event was the first major extinction caused by our own species.”
In the future, Timmermann and colleagues would like to refine their model by including megafauna and more realistic climate forcing.
Gorillas are quite territorial, a new study shows, but they seem to understand ‘ownership’ in much the same way humans do.
The study is the first to demonstrate that gorillas are territorial in nature, contrary to previous assumptions. At the same time, the findings suggest that these primates can recognise “ownership” of specific regions in a very human-like manner, and will attempt to avoid contact with other groups while travelling close to the centre of neighbouring ranges in order to avoid conflict.
Which seems like the polite thing to do!
My turf, your turf
“Gorillas don’t impose hard boundaries like chimpanzees. Instead, gorilla groups may have regions of priority or even exclusive use close to the centre of their home range, which could feasibly be defended by physical aggression,” says lead author Dr. Robin Morrison, who carried out the study during her PhD at the University of Cambridge.
“Our findings indicate that there is an understanding among gorillas of ‘ownership’ of areas and the location of neighbouring groups restricts their movement.”
Because their home ranges often overlap, and because they’re quite peaceful to other gorilla groups, gorillas have long been assumed to be non-territorial. This would make them markedly different from chimpanzees, who have no qualms about using extreme violence to protect their home turf.
The new study, however, suggests that gorillas are, in fact, territorial animals — but they also display quite nuanced behavior around the issue. The study focused on monitoring the movements of the western lowland gorillas (Gorilla gorilla gorilla) at the Odzala-Kokoua National Park in the Republic of Congo. These animals are notoriously difficult to track, so the team placed video cameras at 36 feeding “hotspots” across a 60-square-km area of the park to help them monitor eight different groups of gorillas.
The team reports that the movements of each group are strongly influenced by the location of their neighbours, being less likely to feed at a site visited by another group earlier that day. They would also try to steer clear of the centre of their neighbours’ home range.
“At the same time groups can overlap and even peacefully co-exist in other regions of their ranges. The flexible system of defending and sharing space implies the presence of a complex social structure in gorillas,” explains Dr Morrison.
“Almost all comparative research into human evolution compares us to chimpanzees, with the extreme territorial violence observed in chimpanzees used as evidence that their behaviour provides an evolutionary basis for warfare among humans,” says co-author Dr Jacob Dunn from the Anglia Ruskin University (ARU).
Dr. Dunn adds that the findings showcase our similarities with the wider primate family, not just with chimpanzees. Observing the way gorillas interact over territory — setting up small, central areas of dominance and wider liminal areas of tolerance of other groups — could help us better understand early human populations. Just like us, he explains, gorillas have the capacity to both violently defend a specific territory and to establish between-group ties that lead to wider social cooperation.
The paper “Western gorilla space use suggests territoriality” has been published in the journal Scientific Reports.
A new study reports that there are four broad categories for the motivations that drive human behavior: prominence, inclusiveness, negativity prevention, and tradition.
What do people want? That’s a question psychologists have been trying to answer for a long time now, albeit with little agreement on the results so far. In an attempt to put the subject to rest, a team led by researchers at the University of Wyoming (UW) Department of Psychology looked at goal-related words used by English speakers. They report that human goals can be attributed to one of four broad categories: “prominence,” “inclusiveness,” “negativity prevention” and “tradition.”
What makes us tick
“Few questions are more important in the field of psychology than ‘What do people want?,’ but no set of terms to define those goals has gained widespread acceptance,” says UW Associate Professor Ben Wilkowski, the paper’s first author.
“We decided the best way to address the issue was to examine the words that people use to describe their goals, and we hope our conclusions will help bring about an ultimate consensus.”
The team started with a list of more than 140,000 English nouns, which they whittled down to a set of 1,060 that they deemed most relevant to human goals. They then carried out a series of seven studies in which they quizzed participants on their commitment to pursue goals. After crunching all the data, the team reports that human motivation is built on four main components (when it’s not drugs):
Prominence: these goals revolve around power, money making ability, mastery over skills, perfection, and glory. All in all, these motivators underpin our pursuit of social status and our desire to earn respect, admiration, and the deference of others through our achievements.
Inclusiveness: this represents our drive to be open-minded, tolerant, and accepting of other people, opposing views, different lifestyles, and values. In short, goals in this category revolve around accepting people of all types.
Negativity prevention: while the other categories on this list push us towards a goal, negativity prevention is aimed at pushing away undesirable outcomes. It includes goals meant to avoid conflict, disagreement, isolation, or social discord. In short, it’s our desire to keep the peace in the group and avoid personal pain.
Tradition: such goals revolve around our desire to uphold long-standing institutions or features of the culture we belong to. Religious affiliation and zeal, attitudes towards family and nation, cultural customs, attitudes towards other social groups are in large part shaped by the culture that raised us, and we each feel the need to nurture and pass on these cultural institutions — to a lesser or greater extent.
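The team's actual method was factor analysis over large participant surveys, and none of that data appears here. But the general idea, that ratings of many goal words can collapse into a handful of categories, can be loosely illustrated with a toy clustering of invented "commitment" vectors (both the dimensions and the numbers below are made up):

```python
# Toy k-means clustering: invented goal-word ratings collapse into
# two groups. Purely illustrative; not the paper's factor analysis.
import random

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centers[j])))
            clusters[i].append(p)
        # Move each center to the mean of its cluster.
        centers = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

# Invented ratings: (status-seeking, harm-avoidance) per goal word.
ratings = [(0.9, 0.1), (0.8, 0.2), (0.85, 0.15),   # "prominence"-like
           (0.1, 0.9), (0.2, 0.8), (0.15, 0.85)]   # "prevention"-like
centers, clusters = kmeans(ratings, k=2)
print(sorted(len(c) for c in clusters))
```

The six invented goal words separate cleanly into two groups of three; the real study did the equivalent over 1,060 words and many rating dimensions, arriving at four categories instead of two.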
The more rebellious of you may have noticed that all these categories are externally-focused — the team did as well. Wilkowski says that the findings point to most of human motivation being “overwhelmingly social in nature,” adding that “the ‘need to belong’ and our ultra-social nature are reflected in all four categories.”
It has to be said, by this point, that the studies only addressed the English language as used within American culture. The team believes that their four categories apply to other industrialized cultures as well, but until that’s proven, they won’t say for sure.
“For example, ‘church’ would not serve as a good marker of tradition in non-Christian cultures; and ‘fatness’ would not serve as a good marker of negativity prevention in cultures where starvation is a larger concern than obesity,” they wrote.
“Nonetheless, we suggest that the deeper concepts underlying these four constructs are relevant to the human condition more generally — at least as experienced in large, industrialized cultures.”
The paper “Lexical derivation of the PINT taxonomy of goals: Prominence, inclusiveness, negativity prevention, and tradition” has been published in the Journal of Personality and Social Psychology.
Our species, Homo sapiens, gradually dispersed across the Asian continent during the Late Pleistocene (120,000-12,000 years ago). Most scholars have assumed a main southern migratory route along the coast of the Indian Ocean, with routes through Northern and Central Asia presumed to have been avoided due to their rough and inhospitable conditions.
In a new study, researchers found that these supposedly avoided routes are plausible after all. The authors argue that climate change may have made these regions easier to cross than they are today, opening up a crucial corridor for hominin dispersal.
Ancient lake landforms around Biger Nuur, Mongolia, which is evidence of larger lake sizes in the past. Credit: Nils Vanwezer.
Asia is a huge place, but humans managed to colonize it with remarkable efficiency. Most models of human dispersal assume that humans would have avoided passing through the Gobi Desert and the Altai Mountains, instead advancing through India and Southeast Asia or through Siberia, in the north.
However, archaeological findings from the past decade — such as the Denisova Cave and the Baishiya Karst Cave in China — have revealed that many areas of the globe that are today considered inhospitable might not have always been so in the past.
“Our previous work in Saudi Arabia, and work in the Thar Desert of India, has been key in highlighting that survey work in previously neglected regions can yield new insights into human routes and adaptations,” said Professor Michael Petraglia of the Max Planck Institute for the Science of Human History, a co-author of the new study.
The three routes from the “wet” simulations and the single route from the “dry” simulation are presented together in conjunction with palaeoclimatic extents (glaciers and palaeolakes). Credit: Nils Vanwezer and Hans Sell.
The team of researchers at the Max Planck Institute and the Institute of Vertebrate Paleontology and Paleoanthropology in Beijing, China, claims that climate change may have made northern corridors more accessible, allowing early human settlers to travel through.
The authors developed Geographic Information Systems (GIS) software that maps geographical features alongside ancient climate records (past lake extents, changing precipitation amounts, changing glacial extents in mountain regions) to determine how environmental barriers might open or close different migration corridors.
“We factored in climate records and geographical features into GIS models for glacials (periods during which the polar ice caps were at their greatest extent) and interstadials (periods during the retreat of these ice caps) to test whether the direction of past human movement would vary, based on the presence of these environmental barriers,” said Nils Vanwezer, a Ph.D. student at the Max Planck Institute for the Science of Human History and a joint lead-author of the study published in PLOS ONE.
“We found that while during ‘glacial’ conditions humans would indeed likely have been forced to travel via a northern arc through southern Siberia, during wetter conditions a number of alternative pathways would have been possible, including across a ‘green’ Gobi Desert,” he continues.
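The barrier logic behind such corridor models can be illustrated with a toy least-cost-path calculation. This is a simplified sketch, not the authors' actual GIS workflow: the terrain grid, the cost values, and the "wet makes the desert cheap" rule are all invented for illustration.

```python
import heapq

def least_cost_path(grid, start, goal):
    """Dijkstra over a 2D cost grid; returns the total cost of the cheapest path."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]  # entering a cell costs that cell's value
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Hypothetical traversal costs: higher = harder to cross.
MOUNTAIN, DESERT, STEPPE = 100, 50, 1
terrain = [
    [STEPPE,   MOUNTAIN, MOUNTAIN],
    [STEPPE,   DESERT,   STEPPE  ],
    [MOUNTAIN, MOUNTAIN, STEPPE  ],
]
dry = least_cost_path(terrain, (0, 0), (2, 2))
# "Green Gobi": under wetter conditions, desert cells become cheap to cross.
wet_terrain = [[STEPPE if c == DESERT else c for c in row] for row in terrain]
wet = least_cost_path(wet_terrain, (0, 0), (2, 2))
print(dry, wet)  # prints: 53 4 — the wet-phase route is far cheaper
```

The real models work the same way in spirit: climate phases change the cost surface, and the cheapest corridors shift accordingly.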
The authors emphasize that their model does not prove that Pleistocene humans took such pathways. Instead, the study offers plausible pathways for human migration into eastern Asia which subsequent studies can investigate in the field.
“These models will stimulate new survey and fieldwork in previously forgotten regions of northern and Central Asia,” says Professor Nicole Boivin, Director of the Department of Archaeology at the Max Planck Institute for the Science of Human History, and co-author of the study. “Our next task is to undertake this work, which we will be doing in the next few years with an aim to test these new potential models of human arrival in these parts of Asia.”
The cave where the fossils which may belong to a new hominin species were found. Credit: CALLAO CAVE ARCHAEOLOGY PROJECT.
In a cave on a small island in the Philippines, scientists have found evidence of a new species of humans that lived at least 50,000 years ago. They called it Homo luzonensis, after the island of Luzon where the remains were found. These hominins were very short in stature, comparable to Homo floresiensis, nicknamed “hobbits”, which lived on the nearby Indonesian island called Flores. If the species is confirmed by DNA analysis, the findings will not only enrich the human family tree but also complicate the story of human migration and evolution in Asia.
The fossils from the island of Luzon were excavated during three expeditions in 2007, 2011, and 2015. Inside the island’s Callao cave, researchers found seven teeth (five from the same individual), two finger bones, two toe bones, and an upper leg bone. All were dated as being at least 50,000 years old by radiocarbon decay analysis. These fossils were found alongside those of butchered animals, suggesting that the cave’s inhabitants were at least sophisticated enough to devise cutting tools and rafts to reach the island from the mainland.
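The decay arithmetic behind radiocarbon dating is straightforward, although the study's actual dating would involve calibration beyond this sketch. A minimal illustration follows; the 0.24% figure is chosen only to show why ~50,000 years sits near the method's practical limit.

```python
import math

T_HALF = 5730.0  # carbon-14 half-life in years

def radiocarbon_age(fraction_remaining):
    """Uncalibrated age from the fraction of the original C-14 still present."""
    return T_HALF / math.log(2) * math.log(1.0 / fraction_remaining)

# A sample retaining only ~0.24% of its original C-14 dates to
# roughly 50,000 years -- close to the limit of what the method
# can resolve above background contamination.
age = radiocarbon_age(0.0024)
print(round(age))  # ~49,900 years
```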
Individually, the bones are very similar to those of other Homo species in terms of shape and size. However, taken together, they reveal a combination of features that no other hominin shared. Homo luzonensis’ molars were very small, even smaller than the hobbits’. The premolars were relatively large, however, and had up to three roots rather than one — a feature shared with Homo erectus. The finger and toe bones were curved, suggesting a tree-climbing ability more reminiscent of hominins living two million years ago in Africa.
Five fossil teeth from the same individual have unusual features that helped researchers determine that they might be dealing with a new species of human. Credit: CALLAO CAVE ARCHAEOLOGY PROJECT.
These findings suggest that the landscape occupied by our species was once quite crowded. We now know that Homo sapiens were contemporaries not only with their famous cousins, the Neanderthals, but also with Homo floresiensis, the Denisovans (a species that lived around a cave in the Altai Mountains of western Siberia), and now this fifth species, Homo luzonensis. This dramatically complicates the story of human migration into Asia, suggesting that several human lineages had already occupied East Asia by the time the first modern humans reached China, as early as 80,000 years ago.
A Homo luzonensis toe bone, showing the longitudinal curve. Credit: CALLAO CAVE ARCHAEOLOGY PROJECT.
Homo luzonensis typically weighed around 30 to 50 kilograms, stood 1 to 1.5 meters tall, and had a brain around one-third the size of our own. Just like the hobbits on Flores, Homo luzonensis may have descended from Homo erectus populations that crossed the sea from mainland Asia to Luzon. The small body and unusual skeletal traits may be adaptations driven by island dwarfing — a process whereby creatures confined to isolated habitats such as islands become smaller over time due to limited resources.
Digital reconstruction of Homo floresiensis, nicknamed ‘the hobbit’. Credit: Wikimedia Commons.
It’s not clear yet if we’re dealing with a new species at all. The team of researchers, led by Florent Détroit of the Musée de l’Homme in Paris, was unable to extract DNA from the fossils. Until a proper DNA analysis confirms the distinct lineage, Homo luzonensis’ inclusion in the human family tree remains questionable. For instance, the fossils might belong to hybrids — the products of interbreeding between two or more earlier Homo species. Or perhaps Homo erectus populations that arrived at Luzon simply acquired some traits that made them more adapted to their environment, rather than speciating.
The findings are still incredibly exciting, nevertheless. It’s amazing to hear that our species lived at the same time as four other human lineages and perhaps interacted with them. What a sight that must have been to behold.
The ability to sense Earth’s magnetic field is essential for many animals, which use the field as a sort of heads-up display to help them navigate the globe. But do humans also have this capacity? It seems so, claim the researchers of a new study which found that at least some humans have a magnetic ‘sixth sense’, albeit a subconscious one.
The vestigial compass inside our heads
Hundreds of miles beneath our feet, churning molten metal drives the planet’s magnetic field like a huge electromagnet. The magnetic field serves to deflect most of the solar wind, whose charged particles would otherwise strip away the ozone layer that protects the Earth from harmful ultraviolet radiation. The magnetic field is also what causes a compass to point north. Interestingly, some creatures have an internal compass that enables them to “see” the magnetic field — an ability called magnetoreception.
In 2018, researchers found that European robins (Erithacus rubecula) and zebra finches (Taeniopygia guttata) both have a protein in their eyes’ retinas, aptly called CRY4, which specifically evolved to detect magnetic fields. Humans do not have this protein in their eyes, but other studies have suggested that some animals are capable of magnetoreception as a result of complex neurological processes. This intrigued Caltech geophysicist Joseph Kirschvink and neuroscientist Shin Shimojo, who embarked on a new study focused solely on brain waves.
For their study, 34 participants had their brain’s electrical activity recorded with electroencephalography (EEG) while they sat inside a custom-built chamber fitted with coils and wires. The researchers varied the current running through the wires, and with it the magnetic field inside the enclosure. The ‘Faraday cage’ also shielded the participants seated inside from any external magnetic fields.
Illustration of the Faraday Cage used in the present experiment. Credit: Bickel.
During tests, each lasting an hour, the magnetic fields were rotated repeatedly while the participants sat inside the chamber in total darkness. When the magnetic field in the chamber shifted, the participants reported no subjective experience of the change. The EEG data, on the other hand, showed that certain magnetic field rotations produced a strong and reproducible brain response. One of the EEG patterns recorded during the magnetic field shifts, called alpha-ERD (alpha-band event-related desynchronization), typically shows up when an individual detects and processes a sensory stimulus.
“The brains were ‘concerned’ with the unexpected change in the magnetic field direction, and this triggered the alpha-wave reduction. That we saw such alpha-ERD patterns in response to simple magnetic rotations is powerful evidence for human magnetoreception,” the authors wrote.
“Our participants’ brains only responded when the vertical component of the field was pointing downwards at about 60 degrees (while horizontally rotating), as it does naturally here in Pasadena, California. They did not respond to unnatural directions of the magnetic field – such as when it pointed upwards.”
“We suggest the response is tuned to natural stimuli, reflecting a biological mechanism that has been shaped by natural selection.”
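As a rough illustration of what an alpha-ERD measurement captures, here is a toy sketch on synthetic data. This is not the study's EEG pipeline: the sampling rate, amplitudes, and noise level are invented, and real analyses use more careful spectral estimation.

```python
import numpy as np

fs = 250  # assumed sampling rate in Hz

def alpha_power(signal):
    """Mean spectral power in the 8-12 Hz alpha band."""
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= 8) & (freqs <= 12)
    return psd[band].mean()

rng = np.random.default_rng(0)
t = np.arange(2 * fs) / fs  # two-second epochs
# Pre-stimulus epoch: a strong 10 Hz alpha rhythm plus noise.
pre = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
# Post-stimulus epoch: alpha amplitude halved, modeling desynchronization.
post = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

# ERD is reported as the relative drop in alpha-band power.
erd = 100 * (alpha_power(pre) - alpha_power(post)) / alpha_power(pre)
print(f"alpha-ERD: {erd:.0f}% power drop")  # a substantial (~75%) drop
```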
For decades, researchers have been testing humans’ ability to detect magnetic fields, with conflicting results. The authors of the new study highlight how birds stop orienting by the geomagnetic field if its strength differs by more than 25% from what they’re used to. The same tendency might exist in humans, which may explain why previous efforts to detect magnetoreception came up empty-handed — by cranking up the magnetic field so that subjects could clearly detect it, researchers only ensured that the subjects’ brains ignored it.
The authors believe that this subconscious ability in humans, as in other species, is due to magnetoreceptor cells which contain the ferromagnetic mineral magnetite. In the future, the researchers plan on studying the biophysics of the process in greater detail. They would also like to bring magnetoreception into conscious awareness.
I mean, who wouldn’t domesticate this guy?! Image via Pixabay.
By the third and second millennia BC, humans in today’s Spain often included animals in their tombs. The practice left us evidence of fox domestication by this time.
If you ever wanted a fox for a pet (be honest, we all do), you’ll be really envious of the Iberian peoples of the Early- to Middle-Bronze Age. Four foxes and a large number of dogs found at the Can Roqueta (Barcelona) and Minferri (Lleida) sites showcase their widespread practice of burying people alongside domestic animals. The findings also give us a glimpse into how these people and their animals lived, as well as their close relationships.
Foxy fur babies
Last week we saw how stone-age communities in roughly the same area of modern Spain included dogs in their funeral practices. Today, let’s take a look at how these practices evolved over time.
“We discovered that in some cases the dogs received a special kind of food. We believe this is linked to their function as working dogs. Besides, one of the foxes shows signs of having already been a domestic animal in those times,” says Aurora Grandal-d’Anglade, first author of the study.
Human remains found at these sites were buried in large silos along with dogs and a few foxes, the team reports. Carbon and nitrogen isotope analysis (performed on bone collagen), in addition to several other methods, allowed the researchers to piece together the diet of both the animals and their owners. The team looked at 37 dogs, 19 domestic ungulates, and 64 humans.
Map of the study area showing Can Roqueta, Minferri and other sites cited in the text: (1) Bòbila Madurell, (2) Can Gambús, (3) Pinetons, (4) Mas d’en Boixos, and (5) Cantorella. Image credits Grandal-d’Anglade et al., 2019, Arch. and Anth. Sciences.
The dogs tended to have comparable diets to that of the humans. The foxes had a more varied menu: in some cases, it closely resembled the dogs’, while others ate pretty much what wild animals with little human contact would eat.
Such diets suggest that the animals were already domesticated and relied on humans for food. Further evidence of the close ties these handlers formed with their pets comes from the remains of a fox retrieved at Can Roqueta.
“The case of the Can Roqueta fox is very special, because it is an old animal, with a broken leg. The fracture is still in its healing process, and shows signs of having been immobilized (cured) by humans. The feeding of this animal is very unusual, as it is more akin to a puppy dog’s. We interpret it as a domestic animal that lived for a long time with humans,” Grandal explains.
Some larger dogs — in particular those found at Can Roqueta — seem to have been fed a cereal-rich mix, as was at least one fox involved in the study. The team also reports finding signs of spinal column disorders in these specimens, suggesting they were used as pack animals. Their diet, then, directly reflected their role in the community — it’s not easy being a pack animal, and a high-carbohydrate diet gave them the calories needed to perform the task.
“It may seem strange that dogs were basically fed with cereals, but this was already recommended by the first-century Hispano-Roman agronomist Lucius Junius Moderatus Columella, in his work De re rustica,” says Silvia Albizuri Canadell, an archaeozoologist at the University of Barcelona and co-author of the study.
Unsurprisingly, other animals such as cows, sheep, or goats found in the graves had a herbivorous diet. Their role was likely to provide humans with food (milk, meat) or materials such as leather or wool — not labor. Factor in that the horse wasn’t known in these societies until much later, and the role of dogs as pack animals becomes more understandable. Dogs also served as an integral part of their communities’ economic pursuits by guiding herds and offering protection from wild animals. They likely obtained animal proteins from human leftovers.
In general, the team adds, both humans and dogs likely ate mostly plant matter with some (but not a lot of) animal protein — “not necessarily much meat; they could be, for example, derived from milk,” according to Grandal. The men of these communities do stand out as incorporating more meat into their diets compared to women and children. Dogs’ diets were more similar to those of women and children, the team also found, suggesting that they were “more linked to […] domestic environments.”
“The characteristics of dogs include their great intelligence, easy trainability and, undoubtedly, their defensive behaviour. As if that were not enough, this animal was used until the nineteenth century AD in North America, Canada and Europe for light transport on its back and for dragging carts and sleds. It also functioned as a pack animal on the Peninsula during the Bronze Age,” says co-author Albizuri Canadell.
Some archaeological specimens from North America also show bone disorders that stem from the pulling of ‘travois’ (a type of sledge). Similar pathologies have also been recently identified in the vertebrae of Siberian Palaeolithic dogs.
All in all, the findings illustrate the role dogs played as transport animals in the first migrations and human movements through glacial Europe. These animals likely played a fundamental and much more important role in their communities than believed until recently, the team writes.
Animals may have also served as a type of status symbol. The team found significant variation in the funeral treatment of different members of the studied communities. In one case, the team found “the body of an old man with the remains of a whole cow and the legs of up to seven goats,” while a young woman was buried with “the offering of a whole goat, two foxes, and a bovine horn.” Yet another individual uncovered in a different funeral complex was laid to rest with the whole bodies of two bovines and two dogs.
“We still don’t know why only a few people would have had the right or privilege to be buried with this type of offering, unlike what happens with the vast majority of burials,” explains co-author Ariadna Nieto Espinet.
“[…] these could be an indicator of the wealth of the deceased individual or of his clan or family,” she argues. “It seems that species such as bovines and dogs, two of the most recurring animals in funeral offerings, are those that might have played a fundamental role in the economy and work as well as in the symbolic world, becoming elements of ostentation, prestige and protection”.
The paper “Dogs and foxes in Early-Middle Bronze Age funerary structures in the northeast of the Iberian Peninsula: human control of canid diet at the sites of Can Roqueta (Barcelona) and Minferri (Lleida)” has been published in the journal Archaeological and Anthropological Sciences.
An international research effort has found that Neanderthals were predominantly meat-eaters. The findings come from isotope analysis performed on Neanderthal remains recovered in France.
Our understanding of the Neanderthals has changed profoundly over time. At first, we simply assumed they were brutish, more ape than human. Among other characteristics, the prevailing theory was that their diets were primarily vegetarian — big apes are largely vegetarian, this line of thinking went, so Neanderthals must have been the same, right?
We’ve come a long way since then. Archeological evidence revealed that far from being simple-minded and lacking in general skills and finesse, these ancient humans were quite capable. They enjoyed beauty for beauty’s sake, they developed refined tools, established cultural and spiritual practices, and — as they managed to woo our ancestors into bed/the cave — some were probably quite dashing, as well.
The new study fleshes out our understanding of what Neanderthals liked to dine on. The team analyzed proteins from preserved collagen in Neanderthal bones found at two dig sites in France: the remains of a one-year-old baby found at Grotte du Renne, and a tooth from Les Cottés. The results show that Neanderthals were neither vegetarian nor simply content with scavenging meat from the kills of other beasts. In fact, they probably killed said beasts and ate them.
The team reports that the ratios of nitrogen-15 to nitrogen-14 isotopes in the collagen samples are similar to what we’d see today in major meat eaters — wolves or lions, for example. The findings, the team explains, add to the body of evidence pointing to the Neanderthals being predominantly meat eaters.
Nitrogen ratio analysis is a widely-used tool for diet reconstruction in ancient species. Nitrogen is a reliable indicator of an organism’s position in a food chain, as organisms obtain it solely through diet. Higher N-15 to N-14 ratios are indicative of carnivores, which concentrate nitrogen from lower trophic levels through their diet. The ratio the team found in the Neanderthal collagen is slightly higher than that found in carnivore remains at Neanderthal sites, which the team takes as evidence of the Neanderthals’ high position in their local food webs.
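The δ15N notation used in such analyses expresses a sample's 15N/14N ratio relative to atmospheric nitrogen, in parts per thousand (per mil). A minimal sketch of the arithmetic follows; the herbivore baseline and trophic shift values are illustrative round numbers, not the study's measurements.

```python
R_AIR = 0.003676  # 15N/14N ratio of atmospheric N2, the reference standard

def delta_15N(r_sample):
    """delta-15N in per mil, relative to atmospheric nitrogen."""
    return (r_sample / R_AIR - 1) * 1000

# Each step up the food chain enriches collagen by roughly 3-5 per mil,
# so a predator's delta-15N sits a few per mil above that of its prey.
herbivore = 5.0      # illustrative per-mil value for a grazing herbivore
trophic_shift = 4.0  # illustrative enrichment per trophic level
carnivore = herbivore + trophic_shift
print(carnivore)  # 9.0 per mil: the range where top carnivores plot
```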
There’s also a growing body of indirect evidence supporting this view, the authors note. Previous discoveries of spears found alongside their remains, as well as evidence of butchered animal bodies, suggest that they were quite adept at hunting and processing game. Neanderthals also likely had a bulkier, thicker thorax than modern humans (that’s us). This constitution allowed for larger kidneys and livers compared to our own, a feature common among animals whose diets are heavy in animal protein.
They note that another possibility is that the high ratios were due to a diet heavy in mammoth meat, putrefying meat (I hope it was the mammoth), or fish. To distinguish between these scenarios, the team used a novel technique called compound-specific isotope analysis (CSIA) to separately analyze each amino acid found in the collagen. The exact isotope composition of amino acids is heavily influenced by diet.
“Using this technique, we discovered that the Neandertal of Les Cottés had a purely terrestrial carnivore diet: she was not a late weaned child or a regular fish eater [fish was not readily accessible at either site], and her people seem to have mostly hunted reindeers and horses”, says Klervia Jaouen, a researcher at the Max Planck Institute for Evolutionary Anthropology and first author of the study.
“We also confirmed that the Grotte du Renne Neandertal was a breastfeeding baby whose mother was a meat eater”.
Another finding was that Neanderthal diets — primarily meat — were likely very stable over time, even after they had started to refine their tool-making techniques (possibly as a consequence of interacting with modern humans).
Taken as a whole, the study explains, these tidbits support the view that meat, particularly that obtained from herbivorous animals, was the main constituent of the Neanderthal diet. Small game was likely predominant on the menu, given that bones of fawns and other similarly-sized animals have been found at numerous Neanderthal dig sites and that smaller game is more readily killed with spears — but, as this study reveals, local food resources likely altered what Neanderthals ate in various areas.
The paper “Exceptionally high δ15N values in collagen single amino acids confirm Neandertals as high-trophic level carnivores” has been published in the journal PNAS.
Zebra finches seem to clump similar hues together and perceive them as single colors, new research suggests. This approach is similar to how the human mind processes color and sheds light on the biological root of color perception.
Zebra finches break the color spectrum into discrete colors — much like we do. Image credits Ryan Huang / TerraCommunications LLC.
For zebra finch (Taeniopygia guttata) males, wooing is all about what colors you’re wearing. These gents sport various hues on their beaks, ranging from light orange to dark red, which they use to attract mates. But all their chromatic efforts might be in vain, new research suggests, as the females may simply not perceive subtle nuances.
Red is red
For the study, the researchers worked with 26 zebra finch females and a handful of paper discs the size of a small coin. Some of these discs were painted in a solid color, while others were two-toned. The birds were taught that flipping over a two-toned disc earns them a reward in the form of a millet seed hidden beneath it. Solid-colored discs, meanwhile, didn’t return any tasty treats.
What the team wanted to determine through this experiment was how well the zebra finches could perceive ranges of hues. A bird pecking at a certain disc before the others indicated that it perceived the disc as two-toned, i.e. that it could tell the two hues apart. To see exactly how well the birds could distinguish different hues, some trials involved discs painted in color pairs that were far apart on the color spectrum (violet and yellow, for instance), while others used colors that were more similar (red and orange, for example).
Perhaps unsurprisingly, the females found it a breeze to perceive pairings of dissimilar colors. However, they didn’t fare nearly as well when trying to discern pairings of the hues in between these colors. The findings suggest a threshold effect at work — a sharp perceptual boundary near the orange-red transition.
The birds were also much better at spotting a two-toned disc if it bore colors from opposite sides of the boundary (red and orange, for example) than pairs from the same side (two shades of the same color). This effect persisted even when the pairs were all equally far apart on the color spectrum, the team notes. This suggests that, while the finches have no problem perceiving different colors side by side, they do have difficulty telling apart hues of the same color on the discs.
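The threshold effect described above can be captured in a toy model of a purely categorical perceiver. The boundary position and the hue values on the 0-1 axis are invented for illustration; real perception is graded rather than all-or-nothing.

```python
BOUNDARY = 0.5  # hypothetical position of the orange-red boundary on a 0-1 hue axis

def category(hue):
    """Assign a hue to one of two discrete color categories."""
    return "orange" if hue < BOUNDARY else "red"

def discriminates(h1, h2):
    """A purely categorical perceiver tells two hues apart only when
    they fall on opposite sides of the category boundary."""
    return category(h1) != category(h2)

# Two pairs with identical spectral separation (0.2 on our toy axis):
cross_pair = (0.45, 0.65)   # straddles the boundary -> discriminated
within_pair = (0.55, 0.75)  # both on the "red" side -> lumped together
print(discriminates(*cross_pair), discriminates(*within_pair))  # prints: True False
```

Equal physical distance, unequal perceptual distance: that asymmetry is the signature of categorical perception the experiment detected.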
First off, the findings help us gain a better understanding of how zebra finches handle romance. Previous research has shown that red-beaked males have more success with the ladies, likely because the color denotes good health. While this present study doesn’t show whether the females prefer one color over another, it does help us understand what females perceive when looking at potential mates.
The findings indicate that the birds lump all hues of red on one side of a certain threshold as being ‘red’. Because of this, the females likely aren’t very picky.
I give thee the color wheel. Image credits László Németh.
“What we’re showing is: he’s either red enough or not,” said senior author Stephen Nowicki, a biology professor at Duke University.
It also helps us gain a better understanding of our own vision. The process of lumping similar hues together and perceiving them as a single color, known as categorical color perception, is something that our brain does as well. It’s not yet clear whether we share the same orange-red threshold with zebra finches, but the fact that we both exhibit categorical color perception suggests that the process has deep biological roots. Color, then, might not be just a construct of human language and culture, but may also stem from biological hardwiring.
Still, it likely doesn’t happen in the eye, the team writes — categorical color perception, even in zebra finches, is probably a product of their minds.
We don’t ‘see’ the light that hits our retina; what we see is the image our brain constructs from that data. This type of color perception, then, could be a way for the brain to help reduce ambiguity and noise from the environment — a way for our lumps of gray matter to help keep images simple so we don’t get overwhelmed.
“We’re taking in this barrage of information, and our brain is creating a reality that is not real,” said senior author Stephen Nowicki.
“Categorical perception — what we show in zebra finches — is perhaps one strategy the brain has for reducing this ambiguity,” adds Duke postdoctoral associate and paper co-author Eleanor Caves. “Categories make it less crucial that you precisely interpret a stimulus; rather, you just need to interpret the category that it’s in.”
The paper “Categorical Perception of Colour Signals in a Songbird” has been published in the journal Nature.
If you like having nails instead of claws, give a shout-out to society.
Image credits Daniel Nebreda.
Unlike other mammals, we humans and our primate cousins sport nails instead of claws. However, this wasn’t always the case — new fossil evidence shows that ancient primates had specialized grooming claws as well as nails. The findings showcase how primate social structure helped shape claw and nail evolution, the team writes, and overturn our assumption that the earliest primates had nails on all their fingers.
“We had just assumed nails all evolved once from a common ancestor, and in fact, it’s much more complicated than that,” said Jonathan Bloch, study co-author and curator of vertebrate paleontology at the Florida Museum of Natural History.
Grooming goes beyond just looking good. The thick body hair of primates is an ideal habitat for ticks, lice, and a whole host of other creepy crawlies which are both annoying and potential health hazards. As such, the ability to remove these pests formed an evolutionary advantage — and they evolved specialized grooming claws for the purpose. Many primates today retain such claws. Lemurs (subfamily Lemuroidea), lorises (subfamily Lorinae), and galagoes (family Galagidae) have grooming claws on their second toe, while tarsiers (family Tarsiidae) boast them on their second and third toe.
Until now, we’ve believed that grooming claws developed independently in several primate lineages, including those alive today. However, new fossil evidence suggests that such claws are, rather, an ancestral feature — they date back at least 56 million years, to the oldest-known primates.
Back in 2013, the study’s lead author Doug Boyer found several curious primate fossils at the University of California Museum of Paleontology. These fossils — distal phalanges, the bones that make up the tips of fingers and toes — were hidden in sediment samples collected in Wyoming several decades earlier; as often happens, however, they were left waiting in a drawer in the archives. Based on the shape of these fossils, Boyer suspected that their owners sported grooming claws — in general, distal phalanges topped with a claw are more narrow and tapered, while those supporting a nail are flat and wide.
Lemurs, lorises, and galagoes have nails on most digits and grooming claws on their second toes, as seen on the feet of two greater slow lorises, Nycticebus coucang, in the Florida Museum mammals collection. Image credits Kristen Grace / Florida Museum.
Bloch’s work involved material recovered from Bighorn Basin, Wyoming. He discovered what initially looked like a “strange, narrow nail” bone, but on later comparison with modern specimens “it looked just like a tarsier grooming claw,” he recounts. Although smaller than a grain of rice, the bone matched the proportions of grooming claws of Teilhardina brandti, a mouse-sized, tree-dwelling primate.
Claw me, claw thee
These were the first hints that the fingers of early primates had grooming claws. To get to the bottom of things, the duo went out to Omomys Quarry, Wyoming, a site once inhabited by an early primate family, Omomys. Here, they found omomyoid grooming claws at three sites spanning 10 million years in the fossil record. The fossils proved beyond a doubt that early primates sported grooming claws.
Why, then, don’t we have some as well?
“The loss of grooming claws is probably a reflection of more complex social networks and increased social grooming,” said Boyer, an associate professor in the department of evolutionary anthropology at Duke University.
“You’re less reliant on yourself.”
This hypothesis could also explain why some species of (more) solitary primates, such as the titi (subfamily Callicebinae) or owl monkeys (family Aotidae) have re-evolved a grooming claw.
But why develop nails in the first place? The team believes it came down to shifts in how primates got around. As climbing, leaping, and grasping took center stage, claws simply became impractical — whereas nails wouldn’t snag or get in the way of anything.
Furthermore, the claws provide new insight into the lives of ancient primates, the team notes, many of which are known only from fossil teeth. Even these tiny claws can offer clues about how our ancestors moved about, their daily behavior, and their social structures.
“We see a bit of ourselves in the hands and feet of living primates,” Bloch said. “How they got this way is a profoundly important part of our evolutionary story.”
The paper “Oldest evidence for grooming claws in euprimates” has been published in the Journal of Human Evolution.
Over 10,000 years ago, humans in Borneo feasted on dried meats and palm plants, new research suggests.
Two human jaws from the Niah Caves in Borneo were originally discovered in 1958 but only just revealed. Top jaw is 30,000 years old, bottom jaw 11,000 years old; left image is Niah Caves archaeological site where they were both found. Image credits: Darren Curnoe.
While researchers have a fairly good idea about the diet of Late Pleistocene hunter-gatherers, not much is known about populations in South-East Asia, since very few remains have ever been found. But the Niah Caves in Borneo might change all that: researchers have made several promising findings in the area, shedding much-needed light on the island’s inhabitants.
The cave isn’t a new discovery; in fact, it’s been studied for decades. But new technologies are enabling researchers to see these findings in a new light. In a new study, Darren Curnoe from the University of New South Wales, Australia, along with colleagues, has re-analyzed three human mandibles that were previously excavated from the West Mouth of the Niah Cave in 1957.
The mandibles were dated with Uranium-series techniques to 30,000, 11,000, and 10,000 years old respectively. The shape and characteristics of the jawbone can provide important clues about what its owner usually ate. For instance, the older mandible was smaller but more robust than the other two, indicating that it was subjected to a lot of strain, probably due to chewing a lot of tough foods — the most likely candidates are dried meats or palm plants, a diet that has previously been identified in the Niah Caves.
Niah Caves – Malaysian Borneo. Entrance to the Great Cave. Image via Wikipedia.
It likely wasn’t an easy life for any of these people. Living close to a rainforest was challenging, and resources were scarce. These people likely struggled to make a living, and raw plants and dried meats probably made up much of their diets.
“These early modern humans were seemingly adapted to a difficult life in the tropical rainforests with their very small bodies and ruggedly build jaws from chewing really tough foods,” says Curnoe. “They tell us a lot about the challenges faced by the earliest people living in island Southeast Asia.”
It’s unclear to what extent they were able to adapt to the changing environment.
Journal Reference: Curnoe D, Datan I, Zhao J-x, Leh Moi Ung C, Aubert M, Sauffi MS, et al. (2018) Rare Late Pleistocene-early Holocene human mandibles from the Niah Caves (Sarawak, Borneo). PLoS ONE 13(6): e0196633. https://doi.org/10.1371/journal.pone.0196633
About 60% of all mammals on Earth are livestock. Credit: Pixabay.
With our fragile appearance, humans don’t look like much at first glance. Our impact on the planet, however, is unrivaled and, unfortunately, mostly negative. Even the climate is changing in response to human activity — that’s how consequential our actions are. Our infrastructure works have carved gaping holes through mountains and even beneath the ocean. And, according to a new study, we’ve culled most of the planet’s wild animals and plants, replacing them with our livestock and crops.
The world’s 7.6 billion people represent only 0.01% of all living things by mass, according to researchers at the Weizmann Institute of Science in Israel. However, our impact on nature is disproportionately huge. After Ron Milo and colleagues estimated all of the different components of biomass, they calculated that humans have caused the loss of 83% of all wild mammals and half of all plants.
For three years, Milo and his team had been combing through the scientific literature in an attempt to estimate the total mass of all life forms on Earth. To simplify things, the researchers looked at carbon content, ignoring other aspects — like water content — that might have made comparisons between various life forms difficult. The tally shows that the planet’s total biomass weighs 550 gigatonnes of carbon. Considering their amount of carbon, Earth’s life forms can be ranked as follows:
plants, 82%;
bacteria, 13%;
all other creatures, from marine life to insects, 5%.
By carbon content, fungi (12 Gt C) are about six times more abundant than all animal life on the planet (2 Gt C), whereas the biomass of humans represents just a tiny fraction of that (0.06 Gt C). However, humanity’s footprint grows to gargantuan proportions once you factor in our food: livestock. The researchers estimate that 70% of all birds on the planet are farmed poultry, with just 30% being wild. For mammals, the picture is even grimmer: 60% of all mammals on Earth are livestock, mostly cattle and pigs, 36% are humans, and a mere 4% are wild mammals.
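The carbon-based bookkeeping behind these figures is simple arithmetic. Here's a quick sketch using the gigatonne estimates quoted above; note that the 450 Gt C figure for plants and the 70 Gt C figure for bacteria come from the published study and are assumptions here, not numbers stated in this article:

```python
# Back-of-the-envelope check of the biomass figures, in gigatonnes
# of carbon (Gt C). The 550 Gt C total, fungi, animal, and human
# values appear in the article; plants and bacteria are taken from
# the published study and are assumptions here.
TOTAL_GT_C = 550

biomass_gt_c = {
    "plants": 450,
    "bacteria": 70,
    "fungi": 12,
    "animals": 2,  # all animal life, from marine creatures to insects
}
humans_gt_c = 0.06  # humanity's slice of the animal total

for group, mass in biomass_gt_c.items():
    print(f"{group}: {mass / TOTAL_GT_C:.1%} of all biomass")

# Fungi outweigh all animal life roughly six-fold:
print(f"fungi vs. animals: {biomass_gt_c['fungi'] / biomass_gt_c['animals']:.0f}x")
# Humans account for only a few percent of even the animal slice:
print(f"humans vs. animals: {humans_gt_c / biomass_gt_c['animals']:.0%}")
```

Dividing each group's carbon mass by the 550 Gt C total reproduces the rankings above: plants dominate at roughly 82%, bacteria follow at about 13%, and everything else shares the remainder.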
Five-sixths of wild land animals have been lost since the industrial revolution began, over a century and a half ago. Meanwhile, in the oceans, three centuries of whaling and aggressive fishing have reduced marine mammals to a fifth of what they were.
“It is pretty staggering,” said Milo. “In wildlife films, we see flocks of birds, of every kind, in vast amounts, and then when we did the analysis we found there are [far] more domesticated birds.”
The destruction of wild habitat for farming, logging, and development has resulted in the start of what many scientists consider the sixth mass extinction of life in the Earth’s four-billion-year history. About half of Earth’s animals are thought to have been lost in the last 50 years.
The results suggest that there has never before been a species that comes close to Homo sapiens, one that has caused the destruction of so many other species and individuals. In fact, our propensity to alter the environment and replace wildlife has some scientists claiming that we’re actually living in a new geological epoch called the Anthropocene.
“It is definitely striking, our disproportionate place on Earth,” said Milo. “When I do a puzzle with my daughters, there is usually an elephant next to a giraffe next to a rhino. But if I was trying to give them a more realistic sense of the world, it would be a cow next to a cow next to a cow and then a chicken.”
The study, published in the Proceedings of the National Academy of Sciences, goes to show that, far from being puny mortals, we humans have become a driving force capable of destroying hundreds of species, whether deliberately or simply through inattention. Perhaps, one day, we’ll also learn to wield this immense power with matching responsibility.
In 2010, scientists announced the discovery of an extinct species of Ice Age humans called Denisovans, known only from bits of DNA taken from a sliver of bone in the Denisova Cave in Siberia. Recent research suggests that our Homo sapiens ancestors were intimately in contact with Denisovans. According to a new paper published by researchers at the University of Washington in Seattle, there were at least two distinct episodes of Denisovan genetic intermixing between the two species.
Two waves of Denisovan ancestry have shaped present-day humans. Credit: Browning et al./Cell.
It was first shown that humans interbred with Neanderthals some 50,000 to 100,000 years ago. Later, a 2016 study found that Oceanic individuals hold substantial amounts of not only Neanderthal, but also Denisovan DNA. For instance, the inhabitants of Melanesia, a subregion of Oceania, have between 4% and 6% Denisovan DNA. This fact itself is intriguing because we’re talking about isolated populations on relatively inaccessible islands, thousands of miles away from the Altai Mountains in Siberia.
Sharon Browning, a research professor of biostatistics at the University of Washington School of Public Health, along with colleagues, studied 5,600 whole-genome sequences from individuals from Europe, Asia, America, and Oceania, then compared them to the Denisovan genome.
The analysis revealed that the genomes of the two groups of modern humans with Denisovan ancestry are uniquely different, suggesting there were two separate episodes of Denisovan admixture. Specifically, the analysis showed that modern Papuan individuals carry approximately 5% Denisovan ancestry, while East Asians carry about 0.2% Denisovan DNA. It’s not yet clear what effects this Denisovan ancestry might have on either population, Browning told me.
“The major challenge was in developing a statistical method for detecting segments of archaic introgression in modern human genomes that would be sensitive (able to find such segments), specific (not yielding a lot of false positive results) and computationally efficient for analysis of thousands of modern human genomes. We spent a lot of time working on our method, testing it on simulated and real data, to address these challenges,” Browning told ZME Science.
Scientists were already aware that Papuans had significant amounts of Denisovan ancestry and that East Asians also bore signs of this admixture, but to a lesser degree. However, the assumption was that the East Asian Denisovan ancestry had been inherited through admixture with an Oceanic population. The new work shows that this was not the case. Instead, East Asian populations must have interbred with Denisovans in a separate event, judging from the presence of a second set of Denisovan ancestry that could not be found in South Asians and Papuans. “This result was unexpected,” Browning said.
“When we compared pieces of DNA from the Papuans against the Denisovan genome, many sequences were similar enough to declare a match, but some of the DNA sequences in the East Asians, notably Han Chinese, Chinese Dai, and Japanese, were a much closer match with the Denisovan,” she said in a statement.
Browning thinks it’s possible that the ancestors of today’s Oceanians admixed with a southern group of Denisovans while the ancestors of East Asians admixed with a northern group. Perhaps upcoming studies of other Asian populations, as well as others throughout the world like Native Americans and Africans, might shed valuable new clues.
“We plan to apply our methodology to further worldwide populations, and see if we can find traces of introgression from archaic humans other than Neanderthals and Denisovans,” Browning told me, adding that “Our work helps to further reveal the complexity of human demographic history.”
Scientific reference: S. R. Browning et al., “Analysis of Human Sequence Data Reveals Two Pulses of Archaic Denisovan Admixture,” Cell (2018).
Even in small Paleolithic communities that lived 34,000 years ago, our early ancestors seem to have been aware of the dangers of inbreeding. Anthropologists report finding evidence of complex social structures at a site in Sunghir, Russia, which suggests people took precautions against inbreeding.
One of the burials from Sunghir, in Russia. Credit: University of Cambridge, UK.
The Upper Paleolithic burial site contains the complete remains of an adult male, the symbolically incomplete remains of another male, as well as those of two younger individuals. All of them lived at this site at the same time. Unusually for finds from this period, all four were buried together.
When a team of scientists from the University of Cambridge and the University of Copenhagen analyzed the genomes of these individuals, they were surprised to find that they were not closely related. At most, one of the adults was related to the boys at the level of a great-great-grandfather.
The researchers speculate that artifacts found at this location, which includes pieces of jewelry, may have been used in ceremonies and rituals that celebrated the exchange of mates between groups. Perhaps such exchanges foreshadowed modern marriage ceremonies.
In addition to the evidence that modern humans formed close-knit communities more than 30,000 years ago, this evidence also indicates that they deliberately sought mates beyond their immediate family. To avoid inbreeding, communities were likely connected to a wider network.
“What this means is that even people in the Upper Paleolithic, who were living in tiny groups, understood the importance of avoiding inbreeding,” Professor Eske Willerslev, fellow at St. John’s College, Cambridge said in a statement.
“The data that we have suggest that it was being purposely avoided. This means that they must have developed a system for this purpose. If small hunter–gatherer bands were mixing at random, we would see much greater evidence of inbreeding than we have here.”
Previously, a study published in Nature found evidence of heavy inbreeding among Neanderthals, judging from a 50,000-year-old toe bone. Other studies also seem to indicate that Eurasian hominids were much more inbred and less genetically diverse than modern humans. For thousands of years, the Neanderthal population size remained small, and mating among close relatives was likely very common.
The Neanderthal genome included harmful mutations that made these hominids around 40% less reproductively fit than modern humans. They had the last laugh, though — non-African humans carry the burden of their inbreeding to this day. Some 2% of our DNA is Neanderthal, and due to these interbreeding events with our distant cousins, harmful gene variants continue to reduce the reproductive fitness of some populations today.
The genomic analysis of the Sunghir remains partly explains why anatomically modern humans were more successful than Neanderthals, who went extinct some 40,000 years ago.
“Most non-human primate societies are organised around single-sex kin where one of the sexes remains resident and the other migrates to another group, minimising inbreeding” says Professor Marta Mirazón Lahr, from the Leverhulme Centre for Human Evolutionary Studies at the University of Cambridge. “At some point, early human societies changed their mating system into one in which a large number of the individuals that form small hunter-gatherer units are non-kin. The results from Sunghir show that Upper Palaeolithic human groups could use sophisticated cultural systems to sustain very small group sizes by embedding them in a wide social network of other groups.”
Scientific reference: M. Sikora et al., “Ancient genomes show social and reproductive behavior of early Upper Paleolithic foragers,” Science (2017).
With their heavy brows and brutish appearance, Neanderthals (Homo neanderthalensis) seem to have been inferior to humans. But we all know looks can be deceiving. After all, it’s prejudices like these that can give birth to racism. Moreover, some people have even tried to use the scientific method to justify their racism.
Concerning our very close cousins, it seems that with each passing day more of our assumptions, some rooted in inter-species prejudice, turn out to be wrong. One new study shows, for instance, that Neanderthals were distilling tar to fashion tools some 200,000 years ago — a time when it’s questionable whether Homo sapiens had even appeared.
Could a brute make these?
(A) The larger of the two tar lumps found at Königsaue compared with (B) the maximum yield of tar produced with the raised structure method (RS 7). Credit: Scientific Reports.
Scientists learned of this after discovering tar lumps in Italy, Germany, and at other sites around Europe that were far older than the earliest signs of humans in Western Europe. The most likely explanation is that these were made by Neanderthals.
Tar was a huge technological breakthrough that enabled humans, and evidently Neanderthals, to make superior tools. With this adhesive at hand, they could assemble axes, hammers, or other implements out of multiple parts.
There were some loose ends, though. We know that humans made tar in ceramic vessels, in which birch bark was heated to around 350°C. The earliest archaeological evidence we have of ceramics, however, is only 20,000 years old.
The team of archaeologists at Leiden University in the Netherlands, led by Paul Kozowyk, investigated the lead by testing three scenarios for making tar from birch bark. The researchers were careful to use only technology that, from what we know, was available to Neanderthals. Ars Technica‘s Annalee Newitz describes the methods tested by the team:
“In the “ash mound” method, a roll of birch bark is heated under a pile of ash and embers. Tar is extruded into a birch bowl. In the “pit roll” method, a tube of birch bark is inserted into a narrow pit, and fire is lit on top. Tar drips from the roll onto a rock at the bottom of the pit. And finally, in the “raised structure” method, a birch bowl is placed in a shallow pit, under a screen woven from green willow wood. A roll of birch sits atop the screen and is then buried under dirt. Fire is lit on top of the dirt, slow-cooking the birch bark.”
Credit: Scientific Reports
All three methods yielded a couple of grams of tar, which is consistent with the findings from the archaeological sites, though they varied in complexity. The raised structure method required the most firewood, while the pit roll technique was simpler and required fewer resources but yielded little tar. Remarkably, the team was also able to make tar at temperatures below 200°C. This showed that Neanderthals needed neither ceramics nor the technology to maintain a constant temperature in order to make tar.
It’s very likely that Neanderthals discovered tar-making by accident. The researchers speculate in Scientific Reports that Neanderthals may have noticed tar dripping from bark thrown on the fire and, impressed by its properties, later attempted to manufacture their own.
“In this way, they could develop the technology from producing small traces of tar on partially burned bark to techniques capable of manufacturing quantities of tar equal to those found in the Middle Palaeolithic archaeological record,” the authors concluded.
This is just the latest in a string of archaeological findings that demonstrate the intellectual prowess and social nature of Neanderthals. At the Croatian site of Krapina, anthropologists found a beautiful set of eagle talons that bore cut marks and had been fashioned into a piece of jewelry. Neanderthals practiced cave painting and lit fires well before humans. The first instance of prehistoric dentistry, from about 130,000 years ago, may also have been Neanderthal work. They even used manganese dioxide, today commonly found in batteries, to light fires some 50,000 years ago.
We can’t be sure that Neanderthals were the first to essentially invent glue. Maybe early humans in Africa independently arrived at the same discovery, but there’s no evidence yet to back this up. In the meantime, no one can take this remarkable achievement away from Neanderthals, who were far more sophisticated than many care to give them credit for.
An almost perfectly preserved, 13-million-year-old primate skull comes to fuel the debate around humanity’s cradle of birth.
Image credits Fred Spoor / AFP.
Anthropologists have been dying to get their hands on fossil evidence of the human-ape evolutionary split ever since we figured out that it must’ve happened. Three years ago, on a dusty trail in Kenya, Providence might have delivered them just one such prize. The catch, however, is that the fossilized skull is about the size of a baseball and comes from an infant individual. So it’s a bit of a mixed blessing, as it can help us piece together a rough idea of what the common ancestor of humans and apes looked like — but does little to settle other debates.
A Lucky Find
The skull is surprisingly well preserved for its age and was found in the Turkana Basin, northern Kenya, some 3 years ago. The team, led by primate paleontologist Isaiah Nengo of De Anza College in Cupertino, California, was working with Kenyan fossil hunter John Ekusi on excavations close to Lake Turkana. It had been a decidedly average day until Ekusi walked back to the jeep to light a cigarette — and found himself in surprising company.
“There was this skull just sticking out of the ground,” Nengo recalls. “It was incredible because we had been going up and down that path for weeks and never noticed it.”
It obviously once belonged to a primate, and the researchers sent it to the Noble Gas Laboratory at Rutgers University in New Brunswick, New Jersey, for argon isotope dating, revealing that it was about 13 million years old. Turkana Basin was a lush rainforest during that time, an ideal habitat for apes and other primates.
The skull resembled that of a modern gibbon, Nengo says, but tooth shape and dental patterns tie it more closely to another genus of Miocene primates found in Kenya, Nyanzapithecus. The skull’s molars, however, are much larger than those of known nyanzapithecines, suggesting a new species altogether. The researchers named it N. alesi, after the Turkana word for “ancestor.”
Image credits Christopher Kiarie / AFP.
It was then sent to the European Synchrotron Radiation Facility in Grenoble, France, for extremely high-detail X-ray imaging study. This allowed the team to count growth lines in the skull’s (still unerupted) adult teeth. These indicated that the animal was about 485 days (or 1 year and 4 months) old when it died. The imaging also showed bony ear tubes embedded in the skull, which likely acted as a balance organ.
And those tiny tubes have a big implication. Whether Nyanzapithecus belonged to the ape or the monkey line has been a hotly debated topic, but these tubes, along with the shape and size of the teeth, solidly mark N. alesi — and by extension all nyanzapithecines — as apes, the team reports. Even more, the tubes signal a direct link to the ape line from which humans and modern apes originate.
That’s a finding that could put a long-lasting debate of humanity’s birthplace to bed. Finding a common man-ape ancestor in Africa tips the scales heavily in favor of this continent. But the debate is far from settled, because what the tiny skull giveth it also taketh away.
The problem is that most headway in anthropology research is based on comparative analysis. That’s a fancy way of saying that anthropologists spend a lot of time comparing similar fossils to create an evolutionary roadmap. It works really well if you have fossils to compare — but that’s not the case here. No other infant Miocene ape skull has been found apart from this one. So although it could offer a link between modern human and ape ancestry, it leaves too much wiggle room. For example, we can’t meaningfully compare it to the recently found Graecopithecus, a similar early hominid which seems to hail from Europe. So while supporters of the out-of-Africa theory can point to N. alesi, their counterparts can rally around Graecopithecus — and it’s a stalemate again.
We simply need more fossil evidence to pinpoint humanity’s place of birth beyond a doubt. We may never be able to do it, considering how unlikely it is for bones to successfully fossilize. But considering the team found their skull literally sticking half-out from the ground, I’d say we have a fair bit of luck on our side.
The paper “New infant cranium from the African Miocene sheds light on ape evolution” has been published in the journal Nature.
A “ghost” species of ancient humans may have left its mark in the saliva of Sub-Saharan populations today.
Disclaimer: not an actual representation of what a human looks like. Image credits dife88 / Pixabay.
We know that our ancestors didn’t shy away from mingling with other species of humans. As they migrated to Europe and Asia, they swapped genes with other hominins — primarily Neanderthals and Denisovans. But new genetic evidence shows that ancient Africans also had their fun with a so-far-unknown species of early hominin.
It’s all about mucin
A research team discovered the strange gene variant while working to understand the role and origin of MUC7, a mucus-like protein that lends saliva its sticky consistency and helps it bind to microbes — which is believed to help the body evict unwanted guests, such as bacteria.
Part of this process required the team to chart how MUC7 evolved over time in different regions, which they did by analyzing genetic material from over 2,500 people all over the world. To their surprise, they found that one group of genomes, all harvested in Sub-Saharan Africa, carried a very different version of the gene encoding MUC7. It’s so different, in fact, that the Neanderthal and Denisovan MUC7 genes are more similar to those of other modern humans than any of them are to this new variant.
“Based on our analysis, the most plausible explanation for this extreme variation is archaic introgression — the introduction of genetic material from a ‘ghost’ species of ancient hominins,” says Omer Gokcumen, PhD, an assistant professor of biological sciences in the University at Buffalo College of Arts and Sciences.
“This unknown human relative could be a species that has been discovered, such as a subspecies of Homo erectus, or an undiscovered hominin. We call it a ‘ghost’ species because we don’t have the fossils.”
Considering a baseline value of how fast human genes mutate over the generations, the team estimates that the interbreeding happened sometime around 150,000 years ago and that the two groups first grew apart around 1.5 to 2 million years ago.
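The dating behind these estimates rests on standard molecular-clock logic: sequence differences accumulate at a roughly constant rate, so the fraction of differing sites divided by twice the mutation rate gives the time since two lineages split. A minimal sketch; both input values below are illustrative assumptions for demonstration, not the study's actual parameters:

```python
# Toy molecular-clock estimate. Both inputs are illustrative
# assumptions, not parameters from the MUC7 study.
mutation_rate = 0.5e-9  # substitutions per site per year (assumed)
divergence = 0.002      # fraction of sites differing between lineages (assumed)

# Differences accumulate along both diverging branches, hence the factor of 2:
split_time_years = divergence / (2 * mutation_rate)
print(f"Estimated split: {split_time_years / 1e6:.1f} million years ago")
```

With these made-up inputs the estimate lands at about 2 million years, in the same ballpark as the 1.5 to 2 million years the team reports; the real analysis is far more involved, but the scaling of split time with divergence is the same.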
The team further reports that MUC7 seems to influence the makeup of our oral flora — the bacteria that live inside our mouths. MUC7 comes in two flavors: in some people, it’s encoded by five genes; in others, by six. Based on biological samples harvested from 130 different people, the team found that the two versions of the MUC7-encoding gene were each strongly associated with a different oral flora makeup.
“From what we know of MUC7, it makes sense that people with different versions of the MUC7 gene could have different oral microbiomes,” Ruhl says. “The MUC7 protein is thought to enhance the ability of saliva to bind to microbes, an important task that may help prevent disease by clearing unwanted bacteria or other pathogens from the mouth.”
But the biggest find here is that there may be a lost family of hominids just waiting to be discovered. And, yet again, that our ancestors were really horny.
“It seems that interbreeding between different early hominin species is not the exception — it’s the norm,” Ruhl concludes.
The paper “Archaic hominin introgression in Africa contributes to functional salivary MUC7 genetic variation” has been published in the journal Molecular Biology and Evolution.