Category Archives: Anthropology

Orangutans instinctively make and use basic stone tools

Loui (the juvenile male orangutan) using the core as an active element to vertically strike the concrete floor of the testing room during the Flake Trading condition of Experiment 2.

Orangutans are a crafty bunch. They use a variety of tools in the wild and even make complex choices about those tools. So a team of researchers led by Alba Motes-Rodrigo at the University of Tübingen in Germany set out to test their stone tool-making ability, running experiments on two orangutans at Kristiansand Zoo in Norway.

“We wanted to investigate what stone-related behaviors might have served as stepping stones for the development of lithic technologies in our lineage. Extant apes (and monkeys) can be used as living models to build hypotheses in this regard,” Motes-Rodrigo tells ZME Science.

“We decided to test orangutans because despite being proficient tool users and using a variety of raw materials as tools, they do not use stone tools in the wild. This absence of stone tool use behaviors in the wild orangutan repertoire supports the naivety of our study subjects before the start of the experiments. This naivety allowed us to investigate the learning process of stone-related skills from the beginning, excluding previous knowledge of the tasks.”

Each orangutan was provided with a concrete hammer, a specially prepared blunt stone core, and two baited puzzle boxes. To open the boxes, the orangutans had to cut through a rope or a silicone skin — and if they could do it, they got a treat.

Initially, both orangutans started hitting the hammer against the walls and floor of their enclosure, rather than striking the stone core directly. In the second experiment, they were also given a human-made sharp flint flake, which one orangutan used to cut the silicone skin, solving the puzzle.

It’s the first time cutting behavior has been observed in untrained, unenculturated orangutans. In a subsequent experiment, researchers demonstrated to three female orangutans at another zoo (Twycross Zoo, in the UK) how to strike the core to detach a flake. After the demonstration, one female went on to use the hammer to hit the core as shown.

This suggests that two major prerequisites for creating stone tools (striking with stone hammers and recognizing that sharp edges can cut) may have existed in our common lineage with orangutans 13 million years ago. However, this is merely speculation at this point and we need more evidence before we can truly say whether this was the case or not.

“Our results have added a new piece to the puzzle of the technological origins of our species showing that an ape species that does not use stone tools in the wild and that diverged from our lineage 13 million years ago, spontaneously engages in stone-related behaviours crucial for stone tool making (lithic percussion) as well as has the ability to recognise and use sharp stones as cutting tools.”

“The lithic percussive behaviours that we observed seem to be relatively common among primates, with species such as macaques, capuchins and chimpanzees also expressing them in the wild and in some studies in captivity. The use of a sharp stone as a cutting tool had never been reported before in an untrained ape, but given that we only have one observation of this behaviour it would be premature to draw strong conclusions about its evolutionary history.”

Sharp-edged bits detached by the orangutan in the second experiment. Image credits: Motes-Rodrigo et al (2022).

The orangutans’ tool-making is remarkable, but they haven’t entered the Stone Age just yet, Motes-Rodrigo tells ZME Science. Essentially, their tools are not complex enough, and the behavior hasn’t been observed in a natural environment. They may be capable of it, but until we see it in the wild, we can’t place them in the Stone Age.

“Even the most primitive human stone tools were far more advanced than what we have seen in orangutans and reflect advanced spatial and cognitive skills. In addition, these behaviors have only been observed in captivity under experimental conditions. Perhaps if in future we would make similar observations in the wild, we could make such claims, but at the moment we can’t.”

Journal Reference: Motes-Rodrigo A, McPherron SP, Archer W, Hernandez-Aguilar RA, Tennie C (2022) Experimental investigation of orangutans’ lithic percussive and sharp stone tool behaviours. PLoS ONE 17(2): e0263343.

Eating meat may not have been a decisive trigger in human evolution

For decades, the theory that eating meat enabled our ancestors to develop their brains and bodies was prevalent. But the theory may not stand up to scrutiny.

Evidence of carnivory and butchery can be found in bones, showing cut marks from early tools. Image credits: Briana Pobiner/George Washington University.

Many quintessential human traits are associated with Homo erectus, a species of archaic human that emerged some 2 million years ago. H. erectus was the first human ancestor to spread throughout Eurasia, ranging from today’s Spain to Indonesia. If you’d like to pinpoint an ancestor to humanity, H. erectus would be as good a guess as any.

We don’t have much fossil evidence about H. erectus, but from what little we have discovered, it seems that meat-eating increased once H. erectus entered the stage. Many anthropologists interpreted this as direct causation, proposing that consuming meat may have provided these archaic humans with the energy required to develop bigger, more potent brains. Meat eating would also help explain why the stomachs of these ancestors became smaller (carnivores tend to have shorter digestive tracts than herbivores). The theory was first published in 1995 and has remained popular since.

But the theory may be influenced by sampling bias. Simply put, researchers say, we’ve been looking too much into some pieces of evidence and not enough into others. Dr. W. A. Barr, the study’s lead author, explains:

“Generations of paleoanthropologists have gone to famously well-preserved sites in places like Olduvai Gorge looking for, and finding, breath-taking direct evidence of early humans eating meat, furthering the viewpoint that there was an explosion of meat eating after two million years ago.”

“However, when you quantitatively synthesize the data from numerous sites across eastern Africa to test this hypothesis, as we did here, the ‘meat made us human’ evolutionary narrative starts to unravel.”

The Olduvai Gorge or Oldupai Gorge in Tanzania is one of the most important paleoanthropological localities in the world. Image via Wiki Commons.

In the new study, researchers analyzed 59 sites spread across 9 areas of east Africa, dating from 2.6 to 1.2 million years ago. They were expecting to find signs of increasing meat consumption across this period. They used several metrics to quantify meat-eating: the number of sites that preserved signs of butchering, the total count of animal bones across sites, and the number of distinct stratigraphic layers in which evidence of meat-eating had been found.

In principle, their method was simple: let’s see if, for instance, the percentage of bones bearing butchering marks increases after the emergence of H. erectus. This turned out not to be the case: instead, sites that had more bones also had more butchered bones — but the percentage was stable over time. In addition, the researchers say, sites with fewer bones have been less intensively investigated.

1.5 million year old fossil bones with cut marks from Koobi Fora, Kenya. Image credits: Briana Pobiner.

This doesn’t necessarily rule out the theory that meat was essential in our human development, but it does show that we need much more evidence if we want to support it. It also shows just how easy it is to find evidence to support your idea when you’re looking for it. Several studies have noted the number of animal bones butchered by H. erectus, but they did not compare this count to the total number of bones found. Dr. Briana Pobiner, one of the study’s co-authors, says:

“This study changes our understanding of what the zooarchaeological record tells us about the earliest prehistoric meat-eating. It also shows how important it is that we continue to ask big questions about our evolution, while we also continue to uncover and analyze new evidence about our past.”

Researchers also explained that we’re seeing an incomplete picture because when it comes to fossils, we’re at the mercy of nature. For instance, older layers (from right before H. erectus) are less likely to form useful fossils, so we have a poorer understanding of what was going on before this time.

If the extra nutrients humans likely needed didn’t come from meat, they could have come from better tools or cooking techniques — but these theories also need more evidence, the researchers conclude.

The study was published in PNAS.

Chimpanzees pass down what they’ve learned, much like humans

Human culture and society are based on the idea of learning new things and teaching new generations how to do those things. But this approach, called cumulative culture, may not be unique to humans. According to a new study, chimps do the same thing.

Chimps cracking nuts. Image credits: Koops et al (2022).

Chimps don’t automatically know what to do when they come across nuts and stones. That simple bit of information may not seem like much on its own, but it actually says a lot about how they develop and pass knowledge.

Some groups of chimpanzees in Guinea have figured out that they can use tools to crack nuts; others have not. Researchers wanted to see whether other chimps can individually figure out how nut-cracking works, or if this knowledge is passed on in the group that figured it out. If this were indeed the case, it would mean that passing information is embedded into chimpanzee culture, much like it is embedded in human culture.

To get to the bottom of this, a team led by primatologist Kathelijne Koops from Zurich University set up an experiment in which a chimpanzee group living just 6 km away from a tool-using group was given everything needed to crack open nuts — the researchers even provided palm nuts.

Initially, the chimps were excited by the stone tools. But they didn’t figure out how to crack the nuts, and over a few months, they gradually lost interest. The researchers then added a palm fruit to the experimental setup, to familiarize the chimps with the food source. They even cracked open some nuts and placed them on top of the stone tools, to give them a hint, and offered some easier-to-crack types of nuts.

But regardless of what they did, they couldn’t get the chimps to crack open the nuts without being shown how to do it.

Image credits: Koops et al (2022).

“None of the Seringbara chimpanzees cracked nuts, nor attempted to do so. Hence, stimulus/local enhancement (nuts and stones) or end-state emulation (cracked-open nuts) did not elicit a nut-cracking (re-)innovation in these wild chimpanzees,” the researchers write in the study. “In sum, nut cracking was not independently (re-)innovated by wild Western chimpanzees in field experiments.”

This strongly suggests that cracking nuts is a behavior chimps teach within their group, much like humans do. It’s a form of social learning that allowed human culture to develop progressively more complex tools and technologies. This new finding may force us to rethink how unique human culture really is.

“Our findings suggest that chimpanzees acquire cultural behaviors more like humans and do not simply invent a complex tool use behavior like nut cracking on their own,” says Koops. “Our findings on wild chimpanzees, our closest living relatives, help to shed light on what it is (and isn’t!) that makes human culture unique. Specifically, they suggest greater continuity between chimpanzee and human cultural evolution than is normally assumed and that the human capacity for cumulative culture may have a shared evolutionary origin with chimpanzees.”

Previous experiments have suggested that captive primates can start using tools without being taught, but some researchers suspected this may be because the animals had observed humans using tools and learned the behavior from them. The new experiment lends weight to that suspicion.

If humans and chimpanzees both exhibit cumulative culture, and since the two species are so closely related biologically, it’s plausible that cumulative culture was also a trait of our common ancestor with chimpanzees.

“Our findings suggest greater continuity between chimpanzee and human cultural evolution than is normally assumed,” the researchers conclude.

The study was published in Nature Human Behaviour.

Who invented school?

School is an institution that is hated (especially during exams) by millions of kids around the world — and, at the same time, remembered by billions of adults as the ‘good old days’. For all its good and bad, society as we know it couldn’t exist without schools. And we’re not just talking about the building: we mean the entire system and environment that allows us to pass knowledge to younger generations and prepare them for what’s to come in the real world (at least in theory). But who actually invented school?

Image credits: Max Fischer/pexels

From old schools to the modern schooling system

Ironically enough, for all the information you can find in schools, no textbook mentions exactly when and how the idea of a school originated. This is mostly because it depends on how exactly you define a school. In ancient Greece, education was somewhat democratized, and education in a gymnasium school was considered essential for participation in Greek culture — but it was reserved for boys (and often, not all boys). In ancient Rome, rich children were tutored by private professors. Neither of these is a school in the sense we use today: public, formal education that is compulsory and open to all. Still, you could argue that school in some sense dates from ancient times, and that the organized practice of teaching children goes back thousands of years.

Compulsory education was also not an unheard-of concept in ancient times — though it was mostly compulsory for those tied to royal, religious, or military organizations. In fact, Plato’s landmark The Republic, written more than 2,300 years ago, argues in favor of compulsory education — though women and slaves were not truly a part of Greek society.

Much information about schooling is also lost to the shroud of time. For instance, there is some indirect evidence about schools in China existing at least 3,000 years ago, but this comes from “oracle bones” where parents would try to divine whether it was auspicious for their children to go to ‘school’ — and there’s little information about what these schools were like.

It’s not just the Chinese, Greeks, and Romans. The Hindus, for instance, had developed their own schooling system in the form of gurukuls. In 425 AD, the Byzantine Empire established the world’s first known primary education system, dedicated to educating soldiers enrolled in the Byzantine army so that everyone in the army could read and understand war manuals. Different parts of the world developed different types of education — some more efficient than others.

In Western Europe (and England, in particular), the church became involved in public education early on, and a significant number of church schools were founded in the Early Middle Ages. The oldest school still in operation — and continuously operating since its founding — is The King’s School in Canterbury, which dates from the year 597. Several other schools still in operation were founded in the 6th century, though again, you could argue whether they were true schools, as they were only open to boys.

Albert Bettannier’s 1887 painting that depicts the scene of an old European school. Image credits: Deutsches Historisches Museum Berlin/Wikimedia Commons

Furthermore, compared to modern schools, education in these institutions focused mostly on religious teachings, language, and practical skills. Many operated in a single room, with no set standards or curriculum. As humanity progressed, people began to realize the need for an organized system to educate future generations.

For more than ten centuries, schools maintained the same general profile, focused mostly on a niche set of skills and religious training. In the 9th century, the first university was founded in Fez, Morocco — though it, too, began as a mosque centered on religious teachings. The oldest university still in operation, the University of Bologna in Italy, was founded in 1088. It hired scholars from the city’s pre-existing educational facilities and gave lectures in informal schools called scholae, teaching liberal arts, notarial law, civil law, and scrivenery (official writing) in addition to religion.

However, the university is not necessarily the same as a school — it wasn’t a public “for all” education system, but rather a “school” for the intellectual elite. For schools to truly emerge as we know them today, we have to fast forward a few more centuries.

Compulsory, free education for all

In 1592, the German duchy of Palatinate-Zweibrücken became the first territory in the world with compulsory education for girls and boys — a remarkable and often-ignored achievement in the history of education. The duchy was followed in 1598 by Strasbourg, then a free city of the Holy Roman Empire and now part of France. Similar attempts emerged a few decades later in Scotland, although this compulsory education was subject to political and social turmoil.

In the United States — or rather, in the colonies that would later become the United States — three legislative acts enacted in the Massachusetts Bay Colony in 1642, 1647, and 1648 required every town of more than 50 families to hire a teacher, and every town of more than 100 families to establish a school.

Prussia, a prominent German state, implemented a compulsory education system in 1763 by royal decree. The Prussian General School Regulation required all young citizens, girls and boys, to be educated from age 5 to age 13–14, and to be provided with a basic education in religion, singing, reading, and writing based on a regulated, state-provided curriculum of textbooks. To make a living, the teachers (often former soldiers) cultivated silkworms on the side. In nearby Austria, Empress Maria Theresa introduced mandatory primary education in 1774 — and mandatory, systemized education was starting to take shape in Europe. Schools, as we know them today, were becoming a thing.

Meanwhile, the US was having its own educational revolution.

In 1837, the lawyer and educator Horace Mann became Secretary of the newly formed Massachusetts Board of Education. Mann was a supporter of public schooling, believing that political stability and social harmony could not be achieved without a well-educated population. So he put forward the idea of a universal public education system for teaching American kids: a system with a set curriculum, taught to students in an organized manner by well-trained subject experts.

Without undervaluing any other human agency, it may be safely affirmed that the Common School…may become the most effective and benignant of all forces of civilization.

Horace Mann, Father of the Common School Movement

Mann implemented his “normal school” system in Massachusetts, and other US states later adopted the education reforms he envisioned. He also managed to convince his colleagues and other modernizers to support his idea of government-funded primary education for all.

Due to his efforts, in 1852 Massachusetts became the first American state with a mandatory education law. School attendance and elementary education were made compulsory in various states (by 1917, mandatory education laws had been enacted in every US state), teacher training programs were launched, and new public schools were opened in rural areas.

At a time when women were not even allowed to attend school in many parts of the world, Mann advocated the appointment of women as teachers in public schools. Instead of offering religious instruction, Mann’s normal schools aimed to teach reading, writing, grammar, arithmetic, geography, and history. He believed that school education should not incorporate sectarian instruction — and for this very reason, some religious leaders and schoolmasters criticized him for promoting non-sectarian education.

The innovative ideas and reforms introduced by Mann in the 1800s became the foundation of our modern school system. For his valuable contributions to the field of education, historians sometimes credit him as the inventor of the modern school system.

However, as we’ve seen, the history of schools is intricate, complex, and very rich. There is no one “inventor” of school — the process of arriving at the school systems we have today (imperfect as they may be) took thousands of years of progress, which was not always straightforward.

Shocking facts about school education

Now that we’ve looked a bit at the history of the school, let’s see how things are today — and why there’s still plenty of work to be done in schools around the world.

Image credits: Pixabay/pexels
  • A study conducted by the Institute of Education in the UK suggests that the quality of primary education is more crucial for an individual’s academic progress, social behavior, and intellectual development than factors such as family income, background, and gender. Another study highlights that students who receive a good elementary education and have a positive attitude about the significance of their performance in primary and middle school are more likely to earn well and live a better life later on.
  • A UNESCO report reveals that school education is compulsory up to nine years of age in 155 countries — but unfortunately, more than 250 million children around the world are still unable to attend school.
  • According to the International Labour Organization (ILO), due to poverty and a lack of educational opportunities, 160 million kids are forced into work across the globe, and about 80 million of them work in unhealthy environments. Thousands of these children are physically and sexually abused, tortured, or even trained to work for drug mafias, criminal groups, and terrorist organizations. Some studies show that child labor is also associated with school dropout in less developed countries: due to poor financial conditions, many young people start prioritizing economic activities and lose interest in costly education opportunities. An easily accessible, high-quality school education model that allows children from poor families to pursue education without compromising their financial security could therefore play an important role in eliminating child labor.
  • The African nation of South Sudan has the lowest literacy rate in the world. Only 8% of females in the country are literate, and overall only 27% of its adult population is literate. 98% of the schools that offer elementary education in South Sudan do not have an electric power supply, and only one-third of such schools have access to safe drinking water.
  • City Montessori School (CMS), located in Lucknow, India, is hailed as the largest school in the world. The CMS campus houses 1,050 classrooms in which more than 50,000 students attend classes every day.

For Horace Mann, schools were a means to produce good citizens, uphold democratic values, and ensure the well-being of society. Though not all schools achieve these goals, the power of school education is well captured by what the famous French poet Victor Hugo once said: “He who opens a school door, closes a prison.”

Archaeologists discover stunning, ancient gold trove in Cyprus

In 2018, archaeologists with the New Swedish Cyprus Expedition struck gold — quite literally. They discovered two Bronze Age tombs, both underground chambers, in the ancient city of Hala Sultan Tekke. Hala Sultan Tekke is a mosque and tekke complex in the city of Larnaca, Cyprus, built on one of the largest Bronze Age archaeological sites. Based on these new findings, the site may be even more important than previously thought.

Some jewelry pieces found in the tombs resemble designs worn by Queen Nefertiti. Image credits: Peter Fischer, Teresa Bürge.

The excavations were carried out by researchers from the University of Gothenburg in Sweden as part of the “New Swedish Expedition”, which started in 2010. The team discovered burial chambers that must have belonged to a family (or families) of great wealth. Overall, the researchers found the remains of 150 people and over 500 funerary goods, including many pieces of gold and jewelry. The remains were placed one over the other, suggesting that the burial chamber was used for multiple generations. Most likely, it was the mausoleum of the city’s rulers.

“The finds indicate that these are family tombs for the ruling elite in the city,” excavation leader Peter Fischer, professor emeritus of historical studies at the University of Gothenburg in Sweden, said in a statement.

“For example, we found the skeleton of a 5-year-old with a gold necklace, gold earrings and a gold tiara. This was probably a child of a powerful and wealthy family.”

A gold necklace found at the site. Image credits: Peter Fischer, Teresa Bürge.

It’s pretty obvious that the mausoleum held significant importance for the family. It wasn’t just a simple burial chamber; it also served a ceremonial role. Testament to this is a particular artifact uncovered inside.

“We also found a ceramic bull,” Fischer said. “The body of this hollow bull has two openings: one on the back to fill it with a liquid, likely wine, and one at the nose to drink from. Apparently, they had feasts in the chamber to honor their dead.”

As if the jewelry pieces weren’t remarkable enough, upon closer analysis, archaeologists found that they belong to different cultures. For instance, there is a blue lapis lazuli gemstone from Afghanistan, a red carnelian gemstone from India, and amber from around the Baltic Sea — valuables from the trade partners of the kingdom at the time. Another notable find is a cylinder-shaped seal made of a mineral called hematite and inscribed in cuneiform, the written language of ancient Mesopotamia. The cuneiform text mentions three names: two historical kings (father and son) from the 18th century BC, as well as Amurru, a god worshipped in the Akkadian and Sumerian kingdoms. “We are currently trying to determine why the seal ended up in Cyprus more than 1000 kilometres from where it was made,” the researchers said in a statement.

For historians, the ceramics discovered at the site are almost as important as the jewels themselves, because they offer valuable cultural information.

“The way that the ceramics changed in appearance and material over time allows us to date them and study the connections these people had with the surrounding world. What fascinates me most is the wide-ranging network of contacts they had 3,400 years ago,” Fischer explains.

A large ceramic pot featuring Grecian war chariots. Image credits: Peter Fischer, Teresa Bürge.

All of the objects from the excavation that have been processed and studied are stored in museums in Nicosia and Larnaca in Cyprus.

Now, the next step for researchers is to carry out genetic analysis on the remains discovered there and piece together as much as possible about this dynasty.

“This will reveal how the different individuals are related with each other and if there are immigrants from other cultures, which isn’t unlikely considering the vast trade networks,” says Peter Fischer.

Neanderthals were the first to artificially transform the world, turning a forest into grassland 125,000 years ago

Credit: Pixabay.

Many scientists believe we’ve now crossed into a new geological epoch known as the Anthropocene, in recognition of the fact that, despite their short time on Earth, humans have fundamentally altered the physical, chemical, and biological makeup of the planet. Agriculture, urbanization, deforestation, and pollution have all caused extraordinary changes on Earth. But, perhaps ironically, it may have all started with a different, extinct species of humans.

The earliest evidence of ecosystem change at the hands of hunter-gatherers has been pinpointed at a lignite quarry near Halle in Germany, where researchers found that Neanderthal activity 125,000 years ago transformed closed forests into open grasslands. The deforestation seems to have been achieved mostly through fire.

“Archeologists have long been asking questions about the character and temporal depth of human intervention in our planet’s ecosystems. We are increasingly seeing very early, generally weak signs of this,” says Wil Roebroeks, an archeology professor at Leiden University in the Netherlands.

Roebroeks and colleagues have analyzed evidence collected over the decades at the Neumark-Nord quarry, including hundreds of slaughtered animals, numerous stone tools, and charcoal remains. Some 130,000 years ago, the region experienced a prosperous warm spell that promoted the growth of thick deciduous forests stretching from the Netherlands to Poland, which were inhabited by deer and cattle, but also elephants, lions, and hyenas.

These forest lands attracted communities of Neanderthal hunter-gatherers, who rapidly moved in, especially into areas with lakes. They effectively competed with other carnivores and occupied their own ecological niche until the region was occupied by advancing ice 115,000 years ago.

Compared to forested regions where Neanderthals didn’t live, the scientists found that the Neanderthal inhabited regions experienced a significant decrease in tree cover. Instead of dense forests, the Neanderthal habitat was much lighter and open. There are also signs that these ancient people settled at least semi-permanently in the region, which is unusual in itself since Neanderthals are thought of as highly mobile groups. Perhaps the open landscape, which attracted plenty of game and offered reasonable shelter, was attractive enough to keep some Neanderthal groups more or less settled in one place.

Stone tools found at the Neumark-Nord site in Germany. Credit: Eduard Pop/Leiden University.

However, there’s a chicken-or-egg problem. While it’s tempting to look at the charcoal data and conclude that Neanderthal activity burned the local vegetation, the Neanderthals could also have moved into advantageous open areas after wildfires did all the hard work for them.

Whether or not the Neanderthals initiated the deforestation, one thing is clear: they kept these areas open, and they did so for at least 2,000 years. At similar neighboring lakes with no Neanderthal activity — no hunting, wood collecting, toolmaking, or shelter building — the dense forest vegetation remained largely intact.

There’s evidence that modern humans altered the landscape in much the same way, but those practices date back only 50,000 years. In contrast, the new findings point to much earlier artificial ecosystem changes, at the hands of Neanderthals.

The ability of humans to alter nature is obvious today when our cities stretch over hundreds of square miles and carbon emissions from our activities have grown to such copious amounts that we’ve come to change the climate. The origin of this long process of changing the planet to suit our needs is typically considered the advent of agriculture, which appeared about 10,000 years ago. But recent research, such as the present study, increasingly suggests environmental alteration by hominins started much earlier, albeit at a smaller scale. Neumark-Nord is, perhaps, the earliest example of such interventions.

“It also adds something to the behavioral spectrum of early hunter-gatherers. They weren’t simply ‘primal hippies’ who roamed the landscape picking fruit here and hunting animals there. They helped shape their landscape,” says Roebroeks.

The findings appeared in the journal Science Advances.

Three million years ago, an unknown hominin species was walking on two legs

In 1970, five consecutive footprints were discovered at a site in Tanzania called Laetoli. “The Laetoli Footprints,” as they became known, were dated to around 3.6 million years ago.

The footprints are notable because they suggest that one hominin species was walking upright, on two legs, 3.6 million years ago. But there were other footprints at the site as well. Because of their similarity to bear prints, these footprints fell into obscurity, but a new study suggests that instead, they belong to another, unidentified species of hominin — and this hominin also walked on two legs.

Three-dimensional scans of experimental footprints and a Laetoli footprint. Contours are 1 mm. Image credits: Raichlen et al (not from this recent study).

In 2019, Ellison McNutt and colleagues went to re-excavate this second set of footprints and have a better look at them. They weren’t expecting much; they even feared that erosion or other natural phenomena may have worn the footprints away. As McNutt puts it, they had “meager” hopes of rediscovering the prints. But they got a bit lucky: thanks to careful notes left by Mary Leakey, a paleontologist who had worked at the site, they were able to pinpoint the exact location and start digging — finding the footprints in better shape than they were four decades earlier.

“Our field team excavated the area and found that instead of eroding away, four decades of seasonal rains had instead washed sediment over the prints and had beautifully preserved them. We carefully excavated the footprints, revealing more detail than ever before, and we recorded the footprints with high-resolution 3D laser scanners, unavailable to researchers in the 1970s. While we were unable to find more than the original five footprints, we plan to return to the site and search for more footprints of this new hominin,” McNutt told ZME Science.

They compared the newly discovered footprints with those of chimps and bears and found, beyond a doubt, that these footprints belong to a hominin; but which one? They don’t appear to be made by the same one that formed the original set of footprints, nor any other hominin we know. To make matters even more intriguing, this doesn’t seem to be a transitionally bipedal hominin — someone who was just learning to walk on two legs — this hominin had already transitioned to bipedalism.

Model of Laetoli Site A using photogrammetry showing five hominin footprints (a); and corresponding contour map of the site at Laetoli, Tanzania, generated from a 3D surface scan (b); map showing Laetoli, which is located within the Ngorongoro Conservation Area in northern Tanzania, south of Olduvai Gorge (c); topographical maps of A2 footprint (d) and A3 footprint (e). Images (a) and (b) by Austin C. Hill and Catherine Miller. Image (c): Illustration using GoogleMaps by Ellison McNutt. Images (d) and (e) by Stephen Gaughan and James Adams.

“We are quite certain that these are hominin prints. The prints are bipedal, with no evidence that this individual was transitioning between quadrupedalism and bipedalism (that is, there are no imprints of forefeet),” the researcher added.

“The prints also have size proportions between the big toe and the second digit that are consistent with hominins and their relatives (i.e., relatively large big toes), as opposed to the smaller big toes found in bears. Additionally, these prints display an instance of cross-stepping (i.e., where one foot passes completely across the midline in front of the other foot). Humans have the ability to cross-step due to the anatomy of our hips and knees, which allows us to retain our balance, whereas quadrupedal animals (like bears or chimpanzees) cannot.”

Ellison McNutt gathering data on how bears walk.

The hominin that left the footprints had an unusual gait, walking in a cross-stepping manner, with each foot crossing over the body’s midline to touch down in front of the other foot. But other than this, we don’t really know all that much about who made the footprints, McNutt explained. It’s likely that the individual was a juvenile, and that it belonged to a different species than Australopithecus afarensis (known from the famous “Lucy” skeleton), which left the original set of footprints.

This suggests that there was a much greater diversity of hominins than we’re currently aware of and there may be several species still waiting to be discovered — the site of Laetoli could be instrumental in helping us understand this diversity. The fact that not one, but at least two hominin species were bipedal at the time is striking, McNutt concludes.

“The bipedal trackways found there represent the oldest unequivocal evidence of bipedal locomotion in the human fossil record, a way of moving through the world that is one of the fundamentals of the human species.”

The study describing the footprints has been published in Nature.

We’re evolving right now: scientists see how our genome is changing in recent history

A new study from Europe has identified 755 traits that have changed in the past 2,000-3,000 years of human evolution. These traits are linked with things like pigmentation, nutritional intake, and several common diseases or disorders.

We sometimes tend to think of humans as the pinnacle of evolution, the tip of the biological pyramid. Not only does that show just how self-centered we humans can be, but it’s not really correct either. Even if it were to be the case, it’s not like evolution has stopped — it’s happening right as you’re reading this.

Natural selection (the process through which individuals better adapted to an environment are more likely to reproduce) isn’t just happening in the animal world, it’s happening for humans too. Granted, the pressures that drive this can be quite different, but the process is taking place nonetheless — and it’s been happening since the dawn of human history.

Understanding the patterns behind our past and present evolution isn’t just a scientific curiosity, it could have important applications in the field of medicine and human biology.

“The genetic architecture of present-day humans is shaped by selection pressures in our history. Understanding the patterns of natural selection in humans can provide valuable insights into the mechanisms of biological processes, the origin of human psychological characteristics and key anthropological events,” the researchers write in the new study.

The shells of individuals within a bivalve mollusk species showing diverse coloration and patterning in their phenotypes. Image credits: Debivort.

The team, led by Weichen Song from Shanghai Jiao Tong University School of Medicine, sequenced modern human genetic data from the UK Biobank, along with ancient human DNA from across Europe. They analyzed 870 polygenic traits (traits whose phenotype, the set of observable characteristics of an organism, is influenced by more than one gene), comparing differences between the ancient and the modern genetic groups.

They found that roughly 87% of these traits (755 of 870) underwent significant change in the past 2,000-3,000 years. Some of these changes were linked to pigmentation, body size, and nutritional intake.

“One of our most interesting results was the finding that pigmentation, body measurement, and dietary traits were continuously under intense selection pressure across various human development timescales,” the study also reads.

However, the researchers caution that their findings are limited exclusively to European data, and that it’s not clear whether the associations between genetic variants and phenotypes reflect a cause-and-effect relationship.

“In sum, we provide an overview of signals of selection on human polygenic traits and their characteristics across human evolution, based on a European subset of human genetic diversity. These findings could serve as a foundation for further populational and medical genetic studies,” the researchers write.

Nevertheless, this could serve as a foundation for larger, wider studies, aiding future research into human genetics and evolution.

The study “A selection pressure landscape for 870 human polygenic traits” was published in Nature Communications.

A different way of looking at the sky — Brazilian ethnoastronomy and its unique constellations

We often regard the invention of astronomy from a Greek perspective — after all, most of the official constellations and planets are named after Greek mythology. The names are connected to epic stories that permeated ancient people’s imagination, making it easier to pass the information on to younger generations. However, astronomy was not exclusive to Western philosophy — other peoples used astronomy in their lives as well, and they had their own, different systems.

Archeoastronomy focuses on the way ancient civilizations used astronomy, whether for religious purposes or for scientific observations; it is from archeoastronomy that we know Mesoamerican cultures used their architecture as a way of measuring time. The legacy of such knowledge among living cultures is studied by ethnoastronomy.

The Southern Cross, Milky Way, and Carina Nebula, viewed from Kenya. Credit: Babak Tafreshi/National Geographic Society, via Corbis.

We know a few examples of different astronomical classifications. Aboriginal Australian cultures have a constellation called the Emu, a large flightless Australian bird, between the Southern Cross and Scorpius. Similarly, in the African Tswana and Venda traditions, the Southern Cross is a group of giraffes.

Brazilian indigenous groups also have their own astronomical systems. Most of what we now know can be traced to the period when Europeans began interacting with these indigenous peoples, as well as to the more careful observations of explorers who visited the Americas as part of their academic work. Most of these outsiders saw indigenous culture as inferior and limited, and shaped their accounts to fit a Eurocentric view:

“With the true God, who created heaven and earth, they don’t care. They believe, with a long tradition, that heaven and earth have always existed. In fact, they know nothing about the beginning of the world, they just narrate that there was once a vastness of water in which all their ancestors drowned. Only a few there escaped on a boat and others on tall trees. I think it must have been the flood.” (Hans Staden between 1547 and 1548)

However, although Europeans tried to discard indigenous knowledge, an important part of it survives to this day.

Tupis, Tupinambá, Guarani

Tupi is the term used to describe both a people and a family of 41 native languages spoken across Brazil, Argentina, Peru, Bolivia, Paraguay, and Uruguay. The Tupinambá, one of the Tupi ethnic groups, lived in the Atlantic Forest in Brazil and spoke a Tupi language — which is why researchers chose to call the people Tupinambá and their language Tupi.

The Guarani, who live in the same countries listed above, are distinguished from the Tupinambá because they don’t speak any of those 41 languages; they speak the Guarani language. Historians believe the Guarani descended from the Tupinambá, with a series of migrations changing their language over time.

Percentage of Indigenous population with national population by country in Latin America and the Caribbean (end 1990s-beginning 2000s). Credits: Raul A Montenegro and Carolyn Stephens.

In Brazil alone, there are 220 known indigenous ethnicities. The most populous group is the Guarani, with approximately 46,000 people. Anthropologists estimate that there are at least 185 isolated groups across Venezuela, Brazil, Peru, Bolivia, and Ecuador. Many of these have their own way of looking at the constellations.


Despite the many disturbing events that took place when Europeans started colonizing Brazil, records of local astronomy still exist, and they’re a good source of information for researchers. These written records provide the ‘Rosetta stone’ that enables astronomers to translate the constellations named by the natives into the stars as we know them today.

The Tupinambá used to mark time according to the phases of the Moon, proof that astronomy was part of their daily lives. Europeans learned this when they asked the native Brazilians they met about their age: the replies were surprisingly large numbers, and the foreigners soon connected this to a different system of time units.

Tupinambás understood the tides and their connection to the lunar phases, even without a gravitational theory. In 1612, the Franciscan missionary Claude d’Abbeville wrote that “the Tupinambá attribute the ebb and flow of the sea to the Moon and distinguish very well the two high tides that occur at the full moon and the new moon or a few days later”. It was only in 1632 that Galileo Galilei wrote, in his book ‘Dialogue Concerning the Two Chief World Systems’, that the mechanism causing the tides was Earth’s rotation and translation — incorrectly, as it turned out. It took 75 years from d’Abbeville’s account until Isaac Newton gave the correct explanation, with science still lagging far behind the Tupinambás.

The seasons

Thanks to Professor Germano Afonso’s work, we have learned more about Tupi astronomy in recent years. Afonso spent months among the Tupi, collecting all the information he could. He discovered that the celestial bodies the Tupi commonly used as a calendar were the Moon, the Sun, the Pleiades, the galactic center, the region of Orion and Scorpius, and the Southern Cross. Their gnomon, a solar clock called the Cuaracyraangaba, is a vertical stone pointing at the zenith, similar to those of many other cultures around the globe. According to local myth, the god Nhanderu created four other main gods who helped create the world. Nhanderu represents the zenith and the four gods are the cardinal points.

Indigenous solar observatory at the Mato Grosso do Sul State University.

For the Tupi, there are only two seasons: the new weather (spring and summer) and the old weather (autumn and winter). This makes climatic sense for a good part of the Brazilian territory; the four-season system works better for mid-latitudes. Thanks to the gnomon, they knew the day on which each season started based on the direction of the rising Sun: in the Southern Hemisphere, the Sun rises and sets closer to the south in summer and closer to the north in winter.
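The seasonal swing in sunrise direction that a gnomon exploits follows from standard solar geometry. As a rough sketch (not from the article — the function names, the simplified declination formula, and the 20°S example latitude are illustrative assumptions), the direction of sunrise can be approximated like this:

```python
import math

def solar_declination(day_of_year):
    """Approximate solar declination in degrees for a given day of the year
    (simple cosine model; good to about a degree)."""
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

def sunrise_azimuth(latitude_deg, day_of_year):
    """Sunrise azimuth in degrees east of true north, using the spherical
    approximation cos(A) = sin(declination) / cos(latitude)."""
    decl = math.radians(solar_declination(day_of_year))
    lat = math.radians(latitude_deg)
    return math.degrees(math.acos(math.sin(decl) / math.cos(lat)))

# For a hypothetical site at 20 degrees south (roughly central Brazil):
# around the December solstice (day ~355) the azimuth exceeds 90 degrees,
# i.e. the Sun rises south of due east (southern summer); around the June
# solstice (day ~172) it is below 90 degrees, i.e. north of due east.
```

A fixed vertical stone makes this drift visible as a changing shadow direction at sunrise, which is all a calendar needs.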


Unlike the zodiac constellations, the indigenous constellations were not only patterns of stars but also the light and dark marks of the Milky Way. Nhanderu is the best example of a dark constellation: it is the dark region near Cygnus, a northern constellation in the plane of the Milky Way whose name derives from the Greek word for swan. Both the Large and Small Magellanic Clouds (dwarf galaxy companions to the Milky Way) are constellations as well, named after South American animals: the Tapir’s fountain and the Skunk Pig’s fountain, respectively.

The Large and Small Magellanic Clouds over Paranal Observatory in Chile. Image via the European Southern Observatory.

The Pleiades were another seasonal tool to mark the year: the Tupi knew the cluster would appear in the wet season and disappear in the drier one.

The Rhea constellation. Credits: Almanaque Brasil.

The beginning of summer, and with it the start of the rainy season in the north of Brazil, is marked by the constellation of the Old Man. It is the image of a disabled man formed from some of the stars of Orion and Taurus. The head is the Hyades star cluster, with the Pleiades above it; Orion’s belt lies in the left leg, while a shorter leg ends at Betelgeuse. He also holds a stick in his right hand to help him stand. In Tupi mythology, the Old Man lost his leg when he was murdered by his wife, who was younger and interested in his younger brother. The gods felt sorry for him and took him to the sky in the form of a constellation.

Old Man constellation: The Old Man, in more modern vernacular, may be composed of the Hyades star cluster as his head and the belt of Orion as part of one leg. Tupi folklore relates that the other leg was cut off by his unhappy wife, causing it to end at the orange star now known as Betelgeuse. The Pleiades star cluster, on the far left, can be interpreted as a head feather. In the featured image, the hobbled Old Man is mirrored by a person posing in the foreground. Folklore of the night sky is important for many reasons, including that it records cultural heritage and documents the universality of human intelligence and imagination. Image Credit & Copyright: Rodrigo Guerra.

It’s evident that looking towards the sky is part of human nature. For Europeans, Native Americans, Aborigines, Africans, and many other cultures around the world, this was clearly a common pursuit of knowledge. The differences lie in the myths and shapes attached to these observations, but the guiding principles were the same. For millennia, the sky was the best calendar we had, and it was a way to prepare for the weather ahead. Perhaps we should draw on a few different mythologies when naming newly observed planets and stars.

Neanderthals likely spoke and understood language like humans

With each new study, scientists’ perception of Neanderthals has shifted away from that of mindless brutes toward that of highly complex hominids — and a new study is cementing the notion that our extinct cousins were very human-like. One central question in human evolution is whether spoken language was employed by other species in the Homo lineage. A study published today confirms that Neanderthals were indeed linguistically capable.

“This is one of the most important studies I have been involved in during my career”, says Rolf Quam, an anthropology professor at Binghamton University and co-author of the new study. “The results are solid and clearly show the Neandertals had the capacity to perceive and produce human speech. This is one of the very few current, ongoing research lines relying on fossil evidence to study the evolution of language, a notoriously tricky subject in anthropology.”

The Atapuerca Mountains in north-eastern Spain may not look like much. They feature gentle slopes and a rather dry landscape, interrupted from time to time by forests and the occasional river. But these mountains hold a karstic environment that is key to understanding how humans came to be, and what life was like for our early ancestors.

The most important site is a cave called Sima de los Huesos (Pit of Bones). Anthropologists have recovered over 5,500 human remains which are at least 350,000 years old from this site. The remains belong to 28 individuals of Homo heidelbergensis, an archaic hominin that lived from approximately 700,000 years to 300,000 years ago. Scientists believe that H. heidelbergensis is the ancestor of Homo neanderthalensis.

For their study, Quam along with colleagues at the Universidad Complutense de Madrid performed high resolution CT scans of Atapuerca fossils in order to produce virtual 3D models of the ear structure. The scientists generated models for Homo sapiens and Neanderthals, as well as for the ancestors of the Neanderthals.

The ear models were then fed into software that estimates hearing abilities, based on the structure of the ear, up to 5 kHz — a range that covers most of the frequencies of modern human speech sounds.

The researchers found that, compared with the Atapuerca fossils, Neanderthals had slightly better hearing in the 4-5 kHz range, closely resembling modern humans.

The study also assessed the frequency range of maximum sensitivity, also known as the occupied bandwidth, for each species. The wider this bandwidth, the easier it is to distinguish complex sounds and to deliver a clear message in the shortest amount of time.
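To make the idea of occupied bandwidth concrete: one simple, threshold-based way to define it is the width of the frequency region where sensitivity stays near its peak. This is an illustrative sketch with hypothetical numbers, not the study’s actual (more involved) method:

```python
def occupied_bandwidth(freqs_khz, sensitivity, fraction=0.9):
    """Width (kHz) of the frequency region where sensitivity stays within
    `fraction` of its peak value -- a simple stand-in for the 'occupied
    bandwidth' of a hearing-sensitivity curve."""
    peak = max(sensitivity)
    inside = [f for f, s in zip(freqs_khz, sensitivity) if s >= fraction * peak]
    return max(inside) - min(inside) if inside else 0.0

# Hypothetical curves sampled at 1-5 kHz: a curve that holds near-peak
# sensitivity over a wider frequency span yields a larger bandwidth.
narrow = occupied_bandwidth([1, 2, 3, 4, 5], [0.3, 0.90, 1.0, 0.80, 0.2])
wide = occupied_bandwidth([1, 2, 3, 4, 5], [0.3, 0.95, 1.0, 0.95, 0.2])
```

Under this definition, the “wide” curve scores a larger bandwidth — the property the researchers associate with distinguishing complex sounds more easily.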

Once again, compared to their Atapuerca ancestors, Neanderthals showed a wider bandwidth resembling modern humans.

“This really is the key,” says Mercedes Conde-Valverde, professor at the Universidad de Alcalá in Spain and lead author of the study. “The presence of similar hearing abilities, particularly the bandwidth, demonstrates that the Neandertals possessed a communication system that was as complex and efficient as modern human speech.”

This raises the question: what did a Neanderthal language sound like? According to the researchers, one of the most intriguing findings of the study is that Neanderthal speech likely included an increased use of consonants.

“Most previous studies of Neandertal speech capacities focused on their ability to produce the main vowels in English spoken language. However, we feel this emphasis is misplaced, since the use of consonants is a way to include more information in the vocal signal and it also separates human speech and language from the communication patterns in nearly all other primates. The fact that our study picked up on this is a really interesting aspect of the research and is a novel suggestion regarding the linguistic capacities in our fossil ancestors,” Quam said.

These documented improvements in auditory capacity in Neandertals mirror increasing complexity in stone tool technology, the domestication of fire, and possible symbolic practices. We know, for instance, that Neanderthals also painted, fashioned jewelry, and employed abstract thinking, in which symbols or images are used to represent objects, persons, and events that are not present.

As such, the study suggests that increasingly complex behaviors coevolved with increasing efficiency in oral communication. More insights may be gleaned once the researchers extend this investigation to other species of Homo.

“These results are particularly gratifying,” said Ignacio Martinez from Universidad de Alcalá in Spain. “We believe, after more than a century of research into this question, that we have provided a conclusive answer to the question of Neandertal speech capacities.”

Would we still see ourselves as ‘human’ if other hominin species hadn’t gone extinct?

Would we see Neanderthals (right) as human if they were around today? wikipedia, CC BY-SA

In our mythologies, there’s often a singular moment when we became “human”. Eve plucked the fruit of the tree of knowledge and gained awareness of good and evil. Prometheus created men from clay and gave them fire. But in the modern origin story, evolution, there’s no defining moment of creation. Instead, humans emerged gradually, generation by generation, from earlier species.

As with any other complex adaptation – a bird’s wing, a whale’s fluke, our own fingers – our humanity evolved step by step, over millions of years. Mutations appeared in our DNA, spread through the population, and our ancestors slowly became something more like us and, finally, we appeared.

Strange apes, but still apes

People are animals, but we’re unlike other animals. We have complex languages that let us articulate and communicate ideas. We’re creative: we make art, music, tools. Our imaginations let us think up worlds that once existed, dream up worlds that might yet exist, and reorder the external world according to those thoughts. Our social lives are complex networks of families, friends and tribes, linked by a sense of responsibility towards each other. We also have awareness of ourselves and our universe: sentience, sapience, consciousness, whatever you call it.

And yet the distinction between ourselves and other animals is, arguably, artificial. Animals are more like humans than we might think – or like to think. Almost all behaviours we once considered unique to ourselves are seen in animals, even if they’re less well developed.

That’s especially true of the great apes. Chimps, for example, have simple gestural and verbal communication. They make crude tools, even weapons, and different groups have different suites of tools – distinct cultures. Chimps also have complex social lives and cooperate with each other.

As Darwin noted in Descent of Man, almost everything odd about Homo sapiens – emotion, cognition, language, tools, society – exists, in some primitive form, in other animals. We’re different, but less different than we think.

And in the past, some species were far more like us than other apes – Ardipithecus, Australopithecus, Homo erectus and Neanderthals. Homo sapiens is the only survivor of a once diverse group of humans and human-like apes, the hominins, which includes around 20 known species and probably dozens of unknown species.

The extinction of those other hominins wiped out all the species that were intermediate between ourselves and other apes, creating the impression that some vast, unbridgeable gulf separates us from the rest of life on Earth. But the division would be far less clear if those species still existed. What looks like a bright, sharp dividing line is really an artefact of extinction.

The discovery of these extinct species now blurs that line again and shows how the distance between us and other animals was crossed – gradually, over millennia.

The evolution of humanity

Our lineage probably split from the chimpanzees around 6 million years ago. These first hominins, members of the human line, would barely have seemed human, however. For the first few million years, hominin evolution was slow.

The first big change was walking upright, which let hominins move away from forests into more open grassland and bush. But while they walked like us, nothing else suggests the first hominins were any more human than chimps or gorillas. Ardipithecus, the earliest well-known hominin, had a brain that was slightly smaller than a chimp’s, and there’s no evidence they used tools.

In the next million years, Australopithecus appeared. Australopithecus had a slightly larger brain – larger than a chimp’s, still smaller than a gorilla’s. It made slightly more sophisticated tools than chimps, using sharp stones to butcher animals.

Core from which sharp flakes have been struck off, likely by H. habilis. Olduvai Gorge, Tanzania. Nick Longrich, Author provided

Then came Homo habilis. For the first time, hominin brain size exceeded that of other apes. Tools – stone flakes, hammer stones, “choppers” – became much more complex. After that, around 2 million years ago, human evolution accelerated, for reasons we’re yet to understand.

Big brains

At this point, Homo erectus appeared. Erectus was taller, more like us in stature, and had large brains – several times bigger than a chimp’s brain, and up to two-thirds the size of ours. They made sophisticated tools, such as stone handaxes. This was a major technological advance. Handaxes needed skill and planning to create, and you probably had to be taught how to make one. It may have been a meta-tool – used to fashion other tools, such as spears and digging sticks.

Handaxes made by Homo erectus, from Lake Natron, Tanzania. Nick Longrich, Author provided

Like us, Homo erectus had small teeth. That suggests a shift from plant-based diets to eating more meat, probably from hunting.

It’s here that our evolution seems to accelerate. The big-brained Erectus soon gave rise to even larger-brained species. These highly intelligent hominins spread through Africa and Eurasia, evolving into Neanderthals, Denisovans, Homo rhodesiensis and archaic Homo sapiens. Technology became far more advanced – stone-tipped spears and firemaking appeared. Objects with no clear functionality, such as jewellery and art, also showed up over the past half-million years.

Some of these species were startlingly like us in their skeletons, and their DNA.

Homo neanderthalensis, the Neanderthals, had brains approaching ours in size, and evolved even larger brains over time until the last Neanderthals had cranial capacities comparable to a modern human’s. They might have thought of themselves, even spoken of themselves, as human.

The Neanderthal archaeological record documents uniquely human behaviour, suggesting a mind resembling ours. Neanderthals were skilled, versatile hunters, exploiting everything from rabbits to rhinoceroses and woolly mammoths. They made sophisticated tools, such as throwing spears tipped with stone points. They fashioned jewellery from shells, animal teeth and eagle talons, and made cave art. And Neanderthal ears were, like ours, adapted to hear the subtleties of speech. We know they buried their dead, and probably mourned them.

There’s so much about Neanderthals we don’t know, and never will. But if they were so like us in their skeletons and their behaviour, it’s reasonable to guess they may have been like us in other ways that don’t leave a record – that they sang and danced, that they feared spirits and worshipped gods, that they wondered at the stars, told stories, laughed with friends, and loved their children. To the extent Neanderthals were like us, they must have been capable of acts of great kindness and empathy, but also cruelty, violence and deceit.

Far less is known about other species, like Denisovans, Homo rhodesiensis, and extinct sapiens, but it’s reasonable to guess from their large brains and human-looking skulls that they were also very much like us.

Love and war

I admit this sounds speculative, but for one detail. The DNA of Neanderthals, Denisovans and other hominins is found in us. We met them, and we had children together. That says a lot about how human they were.

It’s not impossible that Homo sapiens took Neanderthal women captive, or vice versa. But for Neanderthal genes to enter our populations, we had to not only mate but successfully raise children, who grew up to raise children of their own. That’s more likely to happen if these pairings resulted from voluntary intermarriage. Mixing of genes also required their hybrid descendants to become accepted into their groups – to be treated as fully human.

These arguments hold not only for the Neanderthals, I’d argue, but for other species we interbred with, including Denisovans, and unknown hominins in Africa. Which isn’t to say that encounters between our species were without prejudice, or entirely peaceful. We were probably responsible for the extinction of these species. But there must have been times we looked past our differences to find a shared humanity.

Finally, it’s telling that while we did replace these other hominins, this took time. The extinction of Neanderthals, Denisovans, and other species took hundreds of thousands of years. If Neanderthals and Denisovans were really just stupid, grunting brutes, lacking language or complex thought, it’s hard to see how they could have held modern humans off for as long as they did.

The human edge

Why, if they were so like us, did we replace them? It’s unclear, which suggests the difference was something that doesn’t leave clear marks in fossils or stone tools. Perhaps a spark of creativity – a way with words, a knack for tools, social skills – gave us an edge. Whatever the difference was, it was subtle, or it wouldn’t have taken us so long to win out.

While we don’t know exactly what these differences were, our distinctive skull shape may offer a clue. Neanderthals had elongated crania, with massive brow ridges. Humans have a bulbous skull, shaped like a soccer ball, and lack brow ridges. Curiously, the peculiar smooth, round head of adult Homo sapiens is seen in young Neanderthals – and even baby apes. Similarly, juvenilised skulls of wild animals are found in domesticated ones, like domestic dogs: an adult dog skull resembles the skull of a wolf pup. These similarities aren’t just superficial. Dogs are behaviourally like young wolves – less aggressive and more playful.

My suspicion, mostly a hunch, is that Homo sapiens’ edge might not necessarily be raw intelligence, but differences in attitude. Like dogs, we may retain juvenile behaviours, things like playfulness, openness to meeting new people, lower aggression, more creativity and curiosity. This in turn might have helped us make our societies larger, more complex, collaborative, open and innovative – which then outcompeted theirs.

But what is it?

Until now, I’ve dodged an important question, arguably the most important one. It’s all well and good to discuss how our humanity evolved – but what even is humanity? How can we study and recognise it, without defining it?

People tend to assume that there’s something that makes us fundamentally different from other animals. Most people, for example, would tend to think that it’s okay to sell, cook or eat a cow, but not to do the same to the butcher. This would be, well, inhuman. As a society, we tolerate displaying chimps and gorillas in cages but would be uncomfortable doing this to each other. Similarly, we can go to a shop and buy a puppy or a kitten, but not a baby.

The rules are different for us and them. Even die-hard animal-rights activists advocate animal rights for animals, not human rights. No one is proposing giving apes the right to vote or stand for office. We inherently see ourselves as occupying a different moral and spiritual plane. We might bury our dead pet, but we wouldn’t expect the dog’s ghost to haunt us, or to find the cat waiting in heaven.

And yet, it’s hard to find evidence for this kind of fundamental difference.

The word humanity implies taking care of and having compassion for each other, but that’s arguably a mammalian quality, not a human one. A mother cat cares for her kittens, and a dog loves his master, perhaps more than any human does. Killer whales and elephants form lifelong family bonds. Orcas grieve for their dead calves, and elephants have been seen visiting the remains of their dead companions. Emotional lives and relationships aren’t unique to us.

Perhaps it’s awareness that sets us apart. But dogs and cats certainly seem aware of us – they recognise us as individuals, as we recognise them. They understand us well enough to know how to get us to give them food, or let them out the door – or even when we’ve had a bad day, and need company. If that’s not awareness, what is?

We might point to our large brains as setting us apart, but does that make us human? Bottlenose dolphins have somewhat larger brains than we do. Elephant brains are three times the size of ours; orcas, four times; and sperm whales, five times. Brain size also varies in humans. Albert Einstein had a relatively small brain – smaller than the average Neanderthal, Denisovan, or Homo rhodesiensis – was he less human? Something other than brain size must make us human – or maybe there’s more going on in the minds of other animals, including extinct hominins, than we think.

We could define humanity in terms of higher cognitive abilities – art, maths, music, language. This creates a curious problem because humans vary in how well we do all these things. I’m less mathematically inclined than Stephen Hawking, less literary than Jane Austen, less inventive than Steve Jobs, less musical than Taylor Swift, less articulate than Martin Luther King. In these respects, am I less human than they are?

If we can’t even define it, how can we really say where it starts, and where it ends – or that we’re unique? Why do we insist on treating other species as inherently inferior, if we’re not exactly sure what makes us, us?

Neither are we necessarily the logical endpoint of human evolution. We were one of many hominin species, and yes, we won out. But it’s possible to imagine another evolutionary course, a different sequence of mutations and historical events leading to Neanderthal archaeologists studying our strange, bubble-like skulls, wondering just how human we were.

The nature of evolution means that living things don’t fit into neat categories. Species gradually change from one into another, and every individual in a species is slightly different – that makes evolutionary change possible. But that makes defining humanity hard.

We’re both unlike other animals due to natural selection, but like them because of shared ancestry; the same, yet different. And we humans are both like and unlike each other – united by common ancestry with other Homo sapiens, different due to evolution and the unique combination of genes we inherit from our families or even other species, such as Neanderthals and Denisovans.

It’s hard to classify living things in strict categories, because evolution constantly changes things, creating diverse species, and diversity within species.

And what diversity it is.

True, in some ways, our species isn’t that diverse. Homo sapiens shows less genetic diversity than your average bacterial strain; our bodies show less variation in shape than sponges, or roses, or oak trees. But in our behaviour, humanity is wildly diverse. We are hunters, farmers, mathematicians, soldiers, explorers, carpenters, criminals, artists. There are so many different ways of being human, so many different aspects to the human condition, and each of us has to define and discover what it means to be human. It is, ironically, this inability to define humanity that is one of our most human characteristics.

Nicholas R. Longrich, Senior Lecturer in Paleontology and Evolutionary Biology, University of Bath

This article is republished from The Conversation under a Creative Commons license. Read the original article.

3,000-year-old Mesopotamian tablets document the earliest known case of PTSD

Assyrian relief of a horseman from Nimrud, now in the British Museum. “Battle scene, Assyrian, about 728 BC. From Nimrud, Central Palace, re-used in South-West Palace.” Credit: British Museum.

Researchers studying ancient texts from Mesopotamia dating to 1300 BCE came across descriptions of symptoms that sound remarkably similar to post-traumatic stress disorder, or PTSD. As such, this may be the earliest depiction of PTSD in history.

The findings were reported in the journal Early Science and Medicine by Walid Khalid Abdul-Hamid of Queen Mary University of London and Jamie Hacker Hughes of the Veterans and Families Institute at Anglia Ruskin University. Speaking to BBC News, the researchers said that the Assyrian soldiers “described hearing and seeing ghosts talking to them, who would be the ghosts of people they’d killed in battle – and that’s exactly the experience of modern-day soldiers who’ve been involved in close hand-to-hand combat.”

Nothing new under the sun

According to the researchers, professional soldiers enlisted by the Assyrian Dynasty in Mesopotamia, present-day Iraq, between 1300 BCE and 609 BCE first went through a year-long boot camp, which also involved civil works like building roads, bridges, and other infrastructure for the kingdom. The soldiers were then sent to war for a year and, if they made it back in one piece, they were allowed to return to their families for a year before repeating the cycle.

But as the ancient texts analyzed by the researchers showed, although their bodies might have come back home intact, some of the soldiers’ minds were shattered.

PTSD has only fairly recently been formally described by psychiatrists, after studies of Vietnam war veterans. Previously, doctors simply dismissed PTSD symptoms in soldiers as ‘shell shock’ or ‘battle fatigue’.

“Ancient soldiers facing the risk of injury and death must have been just as terrified of hardened and sharpened swords, showers of sling-stones or iron-hardened tips of arrows and fire arrows. The risk of death and the witnessing of the death of fellow soldiers appears to have been a major source of psychological trauma,” the paper reads. “Moreover, the chance of death from injuries, which can nowadays be surgically treated, must have been much greater in those days. All these factors contributed to post-traumatic or other psychiatric stress disorders resulting from the experience on the ancient battlefield.” 

Until now, the oldest reference to PTSD-like symptoms came from ancient Greece, in texts by Herodotus describing the aftermath of the infamous Battle of Marathon in 490 BCE. Herodotus claimed that some Athenian warriors had hallucinations and suffered from spontaneous blindness following their close encounter with death on the battlefield. Achilles, hero of the Trojan War, is commonly held to be an ancient sufferer of PTSD as well. And in one potential account of PTSD, one chronicler described the crusaders coming home from the Third Crusade (1189-92), writing that though these men “survived unharmed … their hearts were pierced by swords of sorrows from different sorts of suffering”.

Although PTSD is challenging (and sometimes impossible) to diagnose from text alone, these accounts show that trauma and distress haunted veterans likely since humans first waged war on one another. 

When did people first start wearing clothes? 120,000-year-old bone tools found in Moroccan cave shed clues

Hides drying in the sun at Chouara Tannery in Fez, Morocco, a traditional process thousands of years old. Credit: Emily Yuko Hallett, 2009.

Humans can get pretty weird with their fashion, so much so that it’s easy to forget that clothing is, first and foremost, supposed to be practical and functional. Our bare skin is rather ill-equipped to handle extreme cold, which is why clothes are so important. Without them, humans could never have migrated out of the cozily warm African savanna and survived incomprehensibly long cold spells such as ice ages.

The first clothes humans wore were made from naturally available materials such as animal fur and hide, grass, leaves, bone, and shells. It’s not clear when we first started adorning our skin with clothing, but a new study that found 120,000-year-old clothing manufacturing tools in Contrebandiers Cave on Morocco’s Atlantic Coast suggests the practice is at least that old.

The Pleistocene wardrobe

Credit: Jacopo Niccolò Cerasoni.

Researchers led by Emily Hallett of the Max Planck Institute for the Science of Human History in Germany initially arrived at the cave to examine bone fossils in order to determine what Pleistocene humans in the area ate. What they found instead was far more interesting.

Clothes don’t fossilize, as they decompose and vanish in just a couple hundred years. But the tools used to fashion them are much sturdier. Inside the Moroccan cave, the researchers discovered dozens of tools ideal for scraping hides and pelts to make leather and fur.

“Our findings show that early humans were manufacturing bone tools that were used to prepare skins and furs, and that this behavior is likely part of a larger tradition with earlier examples that have not yet been found,” Hallett says.

Some of the tools included bovid ribs carved into a broad, round-ended shape that is ideal for scraping and removing tissues from leathers and pelts. These tools look remarkably similar to those that craftsmen still employ today to process hides.

In total, the scientists identified 62 different bone tools dated to 90,000 to 120,000 years ago, including a whale tooth that appears to have been used to flake stone. These tools were already specialized, but early humans must have used cruder tools to process natural materials when they first started making clothing, so the first clothes likely appeared much earlier. Previously, researchers sequenced the DNA of lice known to infest clothing and found they appeared about 170,000 years ago.

Alongside the bone tools, the scientists found the remains of sand foxes, golden jackals, and wildcats exhibiting cut marks in patterns resembling those left by skinning. For instance, incisions were found at each of the animal’s four paws, performed to allow the skin to be pulled in one piece from the paws to the head. Ancient cut marks around the animal’s mouth show how Pleistocene humans removed the skin of the head.

Carnivores were skinned for fur and bone tools were then used to prepare the furs into pelts. Credit: Jacopo Niccolò Cerasoni, 2021.

The marks left on the bones of these carnivorous animals were not those you’d expect to see from butchery, suggesting hunter-gatherers were only interested in obtaining their hides. On the other hand, other animal remains, including those of ancient cow-like bovids, showed clear signs that their meat was processed.

The timeline of these bone tools precedes the great migration of humans out of Africa — and it makes sense. Early humans required clothing if they were to survive the trek across frigid Eurasia.

As to what these clothes looked like, that’s a mystery. We can only speculate whether they were primarily practical, providing protection against the elements, or whether they also featured symbolic ornaments.

Hallett and colleagues want to replicate these tools and experimentally manufacture clothes from natural materials available to Pleistocene hunter-gatherers. While undoubtedly fun, the goal is to better understand the kind of time and labor required in this ancient process.

China’s ethnic cleansing could prevent 4.5 million Uyghur births by 2040. Researchers say this is genocide

Uyghur protest at UN Climate Summit in 2014. Credit: Wikimedia Commons.

China’s abusive treatment of its Uyghur minority in Xinjiang province is, by now, an open secret. Human rights groups estimate that up to one million Uyghur people have been detained over the past decade in what the Chinese state calls “re-education” camps. A new study now provides the most compelling evidence that China is actively seeking to control and reduce the population of Uyghurs and replace them with Han Chinese.

Dr. Adrian Zenz, a German anthropologist and one of the world’s foremost experts on the topic of Xinjiang internment camps, is the lead author of the new study, which found China is employing cruel population control policies and tactics, such as enforced birth control, forced displacements, and re-education camps. These measures could see between 2.6 and 4.5 million fewer Uyghurs being born by the year 2040. Zenz goes as far as stating that these ethnic cleansing policies could be classed as genocide under the 1948 U.N. Genocide Convention.

Who are the Uyghurs?

The Uyghur are Turkic-speaking people who live for the most part in northwestern China, in Xinjiang, which is officially known as the Xinjiang Uyghur Autonomous Region (XUAR).

The first mention of these Central Asian people in Chinese records dates from the 3rd century CE. In the 8th century, they even established their own kingdom along the Orkhon River, in what is now Mongolia.

Today, the Uyghur people, who are Sunni Muslims, number around ten million in Xinjiang, around half of the region's population. However, Uyghurs used to constitute a larger proportion of Xinjiang's population until large numbers of Han (ethnic Chinese) began moving into the autonomous region. This migration began in the 1950s and became especially pronounced after 1990. By the late 20th century, Han Chinese constituted two-fifths of Xinjiang's population.

Over time, tensions between the two ethnic groups grew, resulting in protests and culminating in an outbreak of violence in 2009, in which 200 people were killed and some 1,700 were injured. In response, Chinese authorities have cracked down on Uyghurs suspected of being dissidents and separatists.

However, human rights groups have accused China's government of using its security crackdown as an excuse to launch an ethnic cleansing campaign meant to turn Xinjiang into a Han-majority region. Up to one million Uyghurs are reportedly detained in "political training centers", which have been likened to the gruesome re-education camps from the bloody Mao Zedong era. China also installed an extensive state surveillance programme with cameras, checkpoints, and constant police patrols in Uyghur-dominated areas.

According to Human Rights Watch, people's behavior is monitored through a mobile app that tracks details such as how much electricity they use and how often they enter and leave through their front door.

Forced labor and mass sterilization, part of China's strategy to revamp Xinjiang's ethnic makeup

Satellite images showing the rapid development of detention camps in Xinjiang. Credit: Google.

Satellite imagery suggests that factories have been built within the grounds of the heavily fortified internment camps. Xinjiang produces about a fifth of the world's cotton, and human rights groups have accused China of using forced labor in the camps to produce much of it.

Many Western brands removed Xinjiang cotton from their supply chains in 2021. In response, China blocked the online shops and greatly hindered sales of H&M, Nike, Burberry, Adidas, Converse, and other brands that announced they would no longer source their cotton from Xinjiang.

In a new study published today in the journal Central Asian Survey, Zenz compiled the most important evidence to date concerning Uyghur abuse at the hands of Chinese authorities, concluding that China is on a campaign to depopulate Xinjiang of Uyghurs.

According to Zenz, Beijing is “attaching great importance to the problem of Xinjiang’s population structure and population security,” which it intends to 'optimize' with instructions on how to proceed coming from the very top of the central government.

After analyzing a trove of publicly available documents, the researcher documented a state-run scheme meant to forcibly uproot, assimilate, and reduce the population of the Uyghur people. These efforts have ramped up since 2017, resulting in mass internment for 'political re-education', as well as systematic birth control, mass sterilization, and forced displacement.

In a previous 2020 study, Zenz revealed that Xinjiang authorities are administering drugs and injections to Uyghur women in detention, implanting intrauterine contraceptive devices, and coercing women to accept surgical sterilization.

As a result, population growth rates fell by nearly 85% in the two largest Uyghur prefectures between 2015 and 2018. Meanwhile, the birth rate in Han-majority counties declined by only 20%.

Zenz estimates that the birth control measures could result in a potential loss to the Uyghur population of between 2.6 and 4.5 million by 2040.

“My study reveals the presence of a long-term strategy by Beijing to solve the Xinjiang “problem” through “optimization” of the ethnic population structure,” Zenz said in a statement. “The most realistic method to achieve this involves a drastic suppression of ethnic minority birth rates for the coming decades, resulting in a potential loss of several million lives. A smaller ethnic minority population will also be easier to police, control and assimilate.”

The 1948 U.N. Genocide Convention defines genocide as "intent to destroy, in whole or in part, a national, ethnical, racial or religious group," and Zenz says that this optimization campaign could fall under this definition.

Initially, China denied the existence of internment camps in the Xinjiang region. After the evidence was undeniable, Chinese authorities defended their existence as a necessary measure against terrorism and separatist violence. China has dismissed claims it is trying to reduce the Uyghur population through mass sterilizations as "baseless", and says allegations of forced labor are "completely fabricated".

“The most concerning aspect of this strategy is that ethnic minority citizens are framed as a “problem”. This language is akin to purported statements by Xinjiang officials that problem populations are like “weeds hidden among the crops” where the state will “need to spray chemicals to kill them all”. Such a framing of an entire ethnic group is highly concerning,” Zenz said.

People in the Philippines are the most Denisovan in the world

Genetic analysis has found clear traces that humans and Denisovans interbred in the past. The Philippine ethnic group known as the Ayta Magbukon has the highest level of Denisovan ancestry in the world.

The Negrito peoples comprise some 25 different ethnic groups scattered throughout South-East Asia, including the Philippines and the Andaman archipelago. They were once considered to be a single population, but the more researchers looked into it, the more they found that Negritos are actually very diverse.

In the new study, Maximilian Larena of Uppsala University and colleagues set out to establish the demographic history of the Philippines. Their project involved indigenous cultural communities, local universities, as well as official and non-governmental organizations from the area. With everyone working together, they were able to analyze 2.3 million genotypes from 118 ethnic groups in the Philippines — including the diverse Negrito populations.

The results were particularly intriguing for a population called the Ayta Magbukon, who still occupy vast swaths of their ancestral land and continue to coexist with the lowland population surrounding them. The Ayta Magbukon seem to possess the highest level of Denisovan ancestry in the world.

“We made this observation despite the fact that Philippine Negritos were recently admixed with East Asian-related groups—who carry little Denisovan ancestry, and which consequently diluted their levels of Denisovan ancestry,” said Larena. “If we account for and masked away the East Asian-related ancestry in Philippine Negritos, their Denisovan ancestry can be up to 46 percent greater than that of Australians and Papuans.”

This finding, along with the recent discovery of a small-bodied hominin called Homo luzonensis, suggests that multiple hominin species inhabited the Philippines prior to the arrival of modern humans — and these groups likely interbred multiple times.

The Denisovans are a mysterious group of hominins identified in 2010 based on mitochondrial DNA (mtDNA) extracted from a juvenile female finger bone found in the Siberian Denisova Cave. Although researchers have found only scant physical remains of the Denisovans, they’ve discovered traces of their DNA in modern populations. Apparently, this group in the Philippines has the highest percentage of Denisovan DNA in the world — at least that we’ve found so far.

“This admixture led to variable levels of Denisovan ancestry in the genomes of Philippine Negritos and Papuans,” co-author Mattias Jakobsson said. “In Island Southeast Asia, Philippine Negritos later admixed with East Asian migrants who possess little Denisovan ancestry, which subsequently diluted their archaic ancestry. Some groups, though, such as the Ayta Magbukon, minimally admixed with the more recent incoming migrants. For this reason, the Ayta Magbukon retained most of their inherited archaic tracts and were left with the highest level of Denisovan ancestry in the world.”

Researchers hope to sequence more genomes and better understand “how the inherited archaic tracts influenced our biology and how it contributed to our adaptation as a species,” Larena concludes.

Journal Reference: “Philippine Ayta possess the highest level of Denisovan ancestry in the world” 

Apes signal ‘hello’ and ‘farewell’ when starting and exiting social interactions

Two bonobos grooming each other at the San Diego Zoo. Credit: Pixabay.

Whether it’s a simple “hi!”, a head nod, or a bow, humans across cultures signal acknowledgment of another person when engaging in conversation, acts of cooperation, or simply being in their presence. Likewise, we also signal disengagement with a gesture or vocalization signifying farewell. This complex social behavior has important implications beyond mere politeness, and apes also seem to purposefully signal the start and end of interactions, according to a new study.

“Investigating how humans and other primates use communication and gaze to coordinate joint action with others is fascinating! It is exciting because it happens on-the-fly – a spontaneous coordination process that bears witness of our sense of joint commitment. Watching two friends having lunch together tells much about how they value each other and their commitment to each other. We thought that, by looking at how apes get into and out of natural interaction of play and grooming, we might find a similar external structure of joint action as in humans; a way by which joint commitment could be studied naturally,” Raphaela Heesen, a postdoctoral researcher at Durham University in the United Kingdom, told ZME Science.

In order to act together to fulfill a common goal, whether it’s building a new house in their community or launching a rover on Mars, two or more people must be jointly committed to acting as a body. This mutual sense of obligation is known as joint commitment, and philosophers and scientists consider it integral to human cooperation, society at large, and the historical success of our species in shaping the world.

But is joint commitment unique to humans? Perhaps not. At least some aspects of it, such as signaling engagement and disengagement, may be shared by apes.

“In humans, joint commitment is not just a product (a mental state) but also a process, or “interactional achievement”. What it means is that, in order for us two to even establish a feeling of mutual obligation (a product of joint commitment) we have to go through an interactional process that requires mutual coordination (in the form of mutual gaze or exchange of communicative signals, in our case language). We first need to establish joint commitment in an entry phase, then maintain it, and later agree to dissolve it in an exit phase. Entry and exit phases of a joint action can thus be used as markers for joint commitment; therefore, entry and exit phases in non-human animal species can be analyzed to investigate joint commitment. In a species that doesn’t communicate before getting into an interaction and while getting out of it, this could mean that there is probably no commitment involved,” Heesen said.

Heesen and colleagues recorded interactions among chimpanzee and bonobo groups in order to investigate whether these closely related species also shared joint commitment features. They got the idea after considering anecdotal evidence that this may be the case. For instance, when two bonobos were interrupted during their grooming, they later used gestures to resume the interaction with each other. Was this a singular event? That’s what the researchers set out to investigate — and it turns out that both chimps and bonobos signal greetings and farewells.

“Our most important finding was that chimpanzees and bonobos do very frequently mutually gaze at each other and communicate when entering and exiting from joint actions,” Heesen said.

To arrive at these conclusions, the researchers analyzed more than 1,200 interactions within five different groups of bonobos and chimpanzees housed in zoos. Bonobos exchanged entry signals and mutual gaze prior to playing 90% of the time and chimps 69% of the time. Exit signals were even more common, with 92% of bonobo and 86% of chimpanzee interactions featuring them.

These greeting and farewell signals included gestures like touching each other, holding hands, butting heads, or gazing at each other.

This video shows two chimpanzees entering a social grooming activity. Madingo (male) approaches Macourie (female) and both mutually gaze at each other (start of the entry phase). Macourie then uses a series of gestures, first attempting to grab Madingo, then touching his shoulder and back, and finally grab-pulling him at his hips. Macourie then starts grooming him on his shoulder once he is sitting in close proximity. The entry stops with the first grooming movements, upon which the main body of the interaction begins. Credit: Raphaela Heesen and Emilie Genty.

Furthermore, when engaging in entry and exit phases, the apes took into consideration familiarity and power dynamics. Bonobos who were familiar with each other tended to have shorter entry and exit phases, if they existed at all. That’s pretty similar to how humans engage with close friends since they are not afraid to come off as rude or impolite. The shared social history allows them to more rapidly cut to the chase.

“The second most important finding was that, in bonobos, the phases seemed more affected by social dimensions, particularly social bond strength, compared to chimpanzees. Intriguingly, the pattern mirrored what we find in humans and what some people define as “social etiquette” or “politeness”: when interacting with a good friend, you are less likely to put effort into communicating politely. In bonobos, a similar pattern is evident in the structure of the joint action phases. Bonobos produce fewer and shorter entry and exit phases when initiating or terminating a joint action with a closely bonded conspecific as compared to when initiating or terminating a joint action with a weakly bonded one; this pattern was not apparent in chimpanzees by contrast,” Heesen explained, adding that this doesn’t necessarily mean that apes have notions of “politeness” or “social etiquette”, or at least not in the way humans perceive them. “It could also be explained by the fact that apes care about themselves and want to avoid risks with unfamiliar partners,” she added.

Interestingly, the degree of familiarity and strength of social bonds did not seem to have an impact on the social entries and exits among chimps. This may be owed to the strict hierarchical nature of chimp communities, whereas bonobos tend to be more egalitarian.

There are still many unknowns concerning the origin and evolution of joint commitment, but this study marks a step further in unraveling this behavior that’s so central to human society. Next, the researchers plan to investigate joint commitment in other great apes, such as orangutans and gorillas, as well as more distantly related species like wolves or dolphins.

“I think generally there is much to explore from the way in which primates communicate when coordinating joint action. And one way this can be done is by comparing how different species get into and out of social interactions with peers. There may well be differences in the complexity with which some species do so; perhaps other, more distantly related species do not even communicate when exiting from a social encounter; we advocate more studies to investigate this process,” Heesen concluded.

The findings appeared in the journal iScience.

Just how “human” are we? At most, 7% of your DNA is uniquely human, study finds

A landmark study found that only 1.5% to 7% of the human genome contains uniquely (modern) human DNA. The rest is shared with relatives such as Neanderthals and Denisovans.

However, the DNA that is unique to us is pretty important, as it’s related to brain development and function.

Image in public domain.

Researchers used DNA from fossils of our close relatives (Neanderthals and Denisovans) dating from around 40,000-50,000 years ago and compared it with the genomes of 279 modern people from around the world. They used a new computational method that allowed them to disentangle the similarities and differences between genomes in greater detail.

Many people around the world (all non-African populations) still contain genes from Neanderthals, a testament to past interbreeding between the two species. But the importance of this interbreeding may have been understated. The new study found that just 1.5% of humans’ genome is both unique and shared among all people living now, and up to 7% of the human genome is more closely related to that of humans than to that of Neanderthals or Denisovans.

This doesn’t mean that we’re 93% Neanderthal. In fact, only about 20% of the Neanderthal genome survives in modern humans, and non-African humans carry just around 1.5-2% Neanderthal DNA. But different people carry their bits of Neanderthal DNA in different places, so if you add up all the regions where someone has Neanderthal DNA, the total ends up covering a large share of the human genome, although it’s not the same for everyone. The 1.5% to 7% that is uniquely human refers to human-specific tweaks to DNA that are not present in any other species.

In addition, this doesn’t take into account the places where humans gained or lost DNA through other means such as duplication, which could have also played an important role in helping us evolve the way we are today.

What makes us human

The research team was surprised to see just how little DNA is ours and ours alone. But those small areas that make us unique may be crucial.

“We can tell those regions of the genome are highly enriched for genes that have to do with neural development and brain function,” University of California, Santa Cruz computational biologist Richard Green, a co-author of the paper, told AP.

The exact biological function of those bits of DNA remains difficult to disentangle. Our cells are filled with “junk DNA”, which we don’t really use (or we just don’t understand how our bodies use it yet), and yet we still seem to need it. We’re not even sure what all the non-junk DNA does. Understanding the full instructions and roles that genes have is another massive challenge that’s not yet solved.

What this study seems to suggest is that interbreeding played a much bigger role in our evolutionary history than we thought. Previous archaeological studies also suggest this: humans interbred with Neanderthals, Denisovans, and at least one other mysterious species we haven’t discovered yet (but we carry its DNA). Researchers are finding more and more evidence that these interbreeding events weren’t necessarily isolated exceptions but could have happened multiple times and over a longer period than initially thought. It’s up for future studies to reconcile the archaeological and anthropological evidence with the genetic one.

The study also found that the human-specific mutations seemed to emerge in two distinct bursts: one around 600,000 years ago and another around 200,000 years ago. It’s not clear what triggered these bursts; it could have been an environmental challenge or some other event that is, at this point, unknown.

Researchers say that studying this 1.5-7% of our genome could help us better understand Neanderthals and other ancient populations, but it could also help us understand what truly makes us human. For instance, you could set up a laboratory dish experiment in which you edit the human-specific genes back to their Neanderthal versions and compare the molecular results of this change. It wouldn’t exactly be like bringing back a Neanderthal, but it could help us deduce how Neanderthals would have differed from modern humans and, conversely, what makes humans stand out from our closest relatives.

The study “An ancestral recombination graph of human, Neanderthal, and Denisovan genomes” has been published in Science.

Is the Easter Island population collapse just a myth? These scientists think so

Credit: Flickr, Arian Zwegers.

When the first European ships arrived at the remote shores of Easter Island in 1722, the native community there was but a shadow of its former glory. Due to centuries of unsustainable deforestation and farming practices, the islanders had depleted their natural resources, and the population had been slowly declining by the time they made contact with Europeans. For this reason, Easter Island is often used as a cautionary tale for the environmental calamities that threaten us due to climate change.

However, a group of researchers from Binghamton University, State University of New York begs to differ. Their research suggests that Easter Island natives never experienced a demographic collapse. In fact, their numbers were as high as ever when Europeans landed on the remote island’s shore. If anything, Easter Island should be used as an example of resilience rather than disastrous collapse, the researchers argued in a new study published in Nature Communications.

Easter Island: a story of resilience?

The island famous for its 70-ton carved statues was first settled around the year 1200 AD. When the first people arrived on the 63-square-mile patch of land, the place was covered with as many as 15 million trees.

But the settlers, who were slash-and-burn farmers, burned down most of the woods to open up space for crops and gather building material. Within a couple of generations, the island reached an unsustainable number of people, about 3,000 to 4,000 people at its height, and too few trees, according to one often-cited estimate presented in the best-selling book, Collapse, by author Jared Diamond.

Such conclusions are the result of demographic reconstructions based on archaeological evidence and estimations of environmental change during certain periods of the island’s history. For instance, scientists know that around the year 1500, more than two centuries before the natives made contact with European settlers for the first time, the island went through a climatic shift associated with the Southern Oscillation, which caused the climate to dry.

By counting burial sites and measuring ancient homes, it is possible to estimate a population’s size over its history. But a more established technique uses radiocarbon dating to track the extent of human activity at a given moment in time. If there is a lot of activity, then this is a sign of a larger population.

“One argument is that changes in the environment had a negative impact. People see that there was a drought and said, ‘Well, the drought caused these changes,'” said Carl Lipo, a professor of anthropology and environmental studies at Binghamton University, State University of New York. “There are changes. Their population changes and their environment changes; over time, the palm trees were lost and at the end, the climate got drier. But do those changes really explain what we’re seeing in the population data through the radiocarbon dating?”

Reconstructing Easter Island’s population

Lipo and fellow anthropologist Robert DiNapoli, the lead author of the new study, aren’t at all convinced by previous population assessments of Easter Island based on radiocarbon dating, which they argue are highly uncertain.

Using a statistical method called Approximate Bayesian Computation on radiocarbon and paleoenvironmental records, the researchers paint a different story of how the population varied in size relative to environmental variables such as instances of climate change.

Unlike traditional statistical methods that break down when computing population shifts as a function of environmental change, Approximate Bayesian Computation can model population change with less uncertainty, the researchers claim.
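As a rough illustration of how Approximate Bayesian Computation (in its simplest rejection-sampling form) works — a minimal sketch with a toy logistic growth model and made-up numbers, not the authors’ actual model or data:

```python
import random

def simulate_population(growth_rate, n0=100, carrying_capacity=4000, steps=500):
    """Toy logistic population growth; returns the final population size."""
    n = n0
    for _ in range(steps):
        n += growth_rate * n * (1 - n / carrying_capacity)
    return n

# Hypothetical "observed" population proxy (e.g. derived from radiocarbon dates)
observed = 3500
tolerance = 200
accepted = []

random.seed(1)
for _ in range(10_000):
    rate = random.uniform(0.0, 0.05)   # draw a candidate rate from a uniform prior
    if abs(simulate_population(rate) - observed) < tolerance:
        accepted.append(rate)           # keep rates whose simulation fits the data

# The accepted sample approximates the posterior distribution of the growth rate,
# without ever writing down a likelihood function.
print(len(accepted))
```

The appeal of the method is exactly this: when a likelihood is intractable (as with noisy radiocarbon records), you only need to be able to simulate the process and compare summary statistics.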

Their analysis suggests that, contrary to previous findings, the island actually experienced steady population growth from its initial settlement until European contact roughly 500 years later. After this initial point of contact, the population either plateaued or declined, according to the outcomes of four different models computed by the authors of the study.

To support their conclusions, DiNapoli and Lipo point to recent evidence showing that deforestation on the island was prolonged rather than sudden and that it didn’t result in catastrophic soil erosion since forests were replaced by vegetable gardens that contained stones whose minerals increased agricultural yield. During droughts, the natives could have relied on freshwater coastal seeps. New Moai statues continued to be erected on the island even after European contact.

“There’s a natural tendency to think that people in the past aren’t as smart as we are and that they somehow made all these mistakes, but it’s really the opposite,” Lipo said. “They produced offspring, and the success that created the present. Even though their technologies might be more simple than ours, there is so much to be learned about the context in which they were able to survive.”

The researchers argue that the idea of Easter Island’s collapse at the hands of unsustainable practices and climate change is a myth, and a deeply entrenched one at that: people are attached to the idea of humans destroying themselves, in light of the post-industrial destruction and anthropogenic climate change we’ve caused. But despite the very real threat of man-made climate change, we shouldn’t let present reality cloud our judgement of historical contexts. Instead, Easter Island could very well be seen as a story of resilience.

“Those resilience strategies were very successful, despite the fact that the climate got drier,” Lipo said. “They are a really good case for resiliency and sustainability.”

Of course, not everyone is convinced. There are still many scientists who believe that the Easter Island native practices led to the collapse of their civilization. We will likely have more to learn about this mysterious island and its people.

The findings appeared in the journal Nature Communications.

Is the ‘Dragon Man’ a new species of human? Here’s what we know so far

Artist’s impression of Dragon Man. Credit: Chuang Zhao.

Last week, paleontologists in China broke the news that they had identified a 146,000-year-old cranium that may belong to a distinct, previously unidentified species of humans. This tentative new species, known as Homo longi, or Dragon Man, has a mix of features shared by Neanderthals, Denisovans, and humans. If it is indeed a new species, scientists believe it may be the closest relative to modern humans, replacing the Neanderthals as our closest extinct kin.

The Dragon Man skull

From left to right are the skulls of Peking Man, Maba, Jinniushan, Dali, and Harbin. Credit: Kai Geng.

The skull was found near Harbin, a town in northeast China, in 1933 by bridge construction workers. Its potential importance was missed until 2018 when it reached the hands of a team of paleontologists led by Xijun Ni, a professor of primatology and paleoanthropology at the Chinese Academy of Sciences and Hebei GEO University.

Unlike most other hominin fossilized skulls, which are usually crushed and fragmented, the Harbin skull was discovered remarkably intact. Its only major flaw is that just one tooth, a left molar, remains attached to the upper jaw.

In a series of three papers, the researchers described the extraordinary skull, which could hold a brain comparable in size to modern humans. It features almost square eye sockets beneath a heavy brow ridge reminiscent of the Neanderthals but has a wide face with small, flat cheekbones that is typical of modern humans. The cranium, which scientists believed belonged to a 50-year-old male, also features a wide mouth and oversized teeth.

“The Harbin fossil is one of the most complete human cranial fossils in the world. This fossil preserved many morphological details that are critical for understanding the evolution of the Homo genus and the origin of Homo sapiens. While it shows typical archaic human features, the Harbin cranium presents a mosaic combination of primitive and derived characters setting itself apart from all the other previously named Homo species,” said Qiang Ji, a professor of paleontology at Hebei GEO University.

A new species of human? Not so fast

Artist impression of Dragon Man. Credit: The Innovation.

Like modern humans, Homo longi probably hunted mammals and birds, gathered wild fruits and vegetables, and perhaps even caught fish. Considering the Harbin individual was large in stature, as well as the location where it was found, the researchers believed that H. longi was well adapted to harsh environmental conditions.

Geochemical analyses showed that the Harbin man fossils are at least 146,000 years old, placing them well within the Middle Pleistocene, an era when humans were busy dispersing across the world. It is thus very likely that H. longi encountered Homo sapiens, as well as Denisovans and Neanderthals.

“We see multiple evolutionary lineages of Homo species and populations co-existing in Asia, Africa, and Europe during that time. So, if Homo sapiens indeed got to East Asia that early, they could have a chance to interact with H. longi, and since we don’t know when the Harbin group disappeared, there could have been later encounters as well,” says author Chris Stringer, a paleoanthropologist at the Natural History Museum in London.

When the researchers reconstructed the human tree of life to account for H. longi, they found that the tentative new species is even more closely related to us than Neanderthals and represents a sister species. This implies that Homo sapiens must have split from Neanderthals even further back in time, diverging from a common ancestor roughly 400,000 years earlier than scientists had previously thought.

“It is widely believed that the Neanderthal belongs to an extinct lineage that is the closest relative of our own species. However, our discovery suggests that the new lineage we identified that includes Homo longi is the actual sister group of H. sapiens,” says Professor Ni.

But is Homo longi truly a new species of human? It’s a bit too early to tell. The Harbin man may well be a Denisovan, an extinct species of archaic human that ranged across Asia during the Lower and Middle Paleolithic and whose fossil record is very scant. So far, the only fossils we have found of Denisovans include a finger bone, a few teeth and a skull fragment retrieved from Denisova Cave in Siberia, and a jawbone from Xiahe, northern China.

According to Ars Technica, when “Ni and colleagues did their statistical analysis, they pointed out that the Harbin skull fell into a group along with the 160,000-year-old Denisovan mandible from Xiahe. Given the great diversity of shapes and sizes that human skulls come in, it wouldn’t be that surprising for the Harbin skull to actually belong to the range of diversity for Denisovans.”

If scientists manage to extract DNA from the Harbin skull, they could then compare it to the genomes of Denisovans, Neanderthals, and modern humans, to which we have access. That would settle at least some of the debate.

In any event, the Harbin skull is hugely significant. If it turns out to be a distinct species, then the human tree of life just got enlarged with one member. If subsequent research shows it is from a Denisovan, then we’ll finally know what these rather mysterious cousins looked like. So a win/win for science.

A quarter of American adults may not want to ever become parents — and they’re quite happy about it

Credit: Pixabay.

Many parents swear that having children changed their life for the better. Some people, however, are not at all attracted to the parenting lifestyle and would like things to stay that way for the foreseeable future. A new study found that this group, known as ‘childfree individuals’, represents a quarter of all adults in Michigan, far more than previously believed.

According to the findings, these people aren’t any more or less happy than parents and individuals who wish to conceive in the future, contrary to popular belief. What’s more, there aren’t any significant personality differences between these groups that could be used to predict who may be a ‘parent type of person’.

Studies in the past tended to lump all types of nonparents into a single category, which scientists then compared to parents. This approach can produce flawed results if the intention is to compare people who are childless by their own volition to parents. Some people may be childless only because of their current circumstances: they may be unable to conceive due to fertility issues, or perhaps they want to improve their financial situation or career prospects first. Some simply haven’t yet found the right person with whom they’d feel confident and secure becoming a parent, but would nevertheless want a child in the future.

Childfree individuals aren’t like that at all. They don’t have children, nor do they wish to have children in the future, at least at this point in their lives.

In order to make a more reliable comparison, researchers at Michigan State University relied on a set of three key questions when surveying a representative sample of 1,000 adults from the state. These were:

  1. Do you have, or have you ever had, any biological or adopted children?
  2. Do you plan to have any biological or adopted children in the future?
  3. Do you wish you had or could have biological or adopted children?

Those who answered ‘yes’ to the first question skipped the two subsequent questions and were automatically categorized as parents. Those who answered ‘no’ to the first question but gave an affirmative answer to the second were classed as not-yet-parents. Finally, those who said ‘no’ to all three questions were categorized as childfree.

“This approach to identifying childfree individuals differs from prior research in two important ways. The second question allows us to distinguish individuals who expect to have children in the future (i.e. not-yet-parents) from those who do not (i.e. childless and childfree). Additionally, the third question classifies an individual as childfree or childless based solely on their lack of desire for biological or adopted children, regardless of their fertility status. In our analyses, parental status is treated as a categorical variable with childfree as the omitted or reference category,” the researchers wrote in their study published in the journal PLOS ONE.
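The branching logic of the three questions can be sketched as a small classification function (an illustrative sketch: the function name and category labels are ours, not the study’s):

```python
def classify(has_children, plans_children, wishes_children):
    """Categorize a survey respondent using the study's three yes/no questions.

    Question order mirrors the survey flow: parents are identified by the
    first question and skip the remaining two.
    """
    if has_children:
        return "parent"
    if plans_children:
        return "not-yet-parent"
    if wishes_children:
        return "childless"   # wants children but does not expect to have them
    return "childfree"       # 'no' to all three questions

print(classify(True, False, False))    # parent
print(classify(False, True, False))    # not-yet-parent
print(classify(False, False, True))    # childless
print(classify(False, False, False))   # childfree
```

The key design choice, as the quoted passage notes, is that the third question separates childless from childfree purely on desire, regardless of fertility status.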

After controlling for the demographics of the participants, the researchers could not find any differences in terms of life satisfaction and found limited differences in personality traits between childfree and parents or not-yet-parents.

“We also found that childfree individuals were more liberal than parents, and that people who aren’t child-free felt substantially less warm toward child-free individuals,” said Zachary Neal, lead author of the study and an associate professor in Michigan State University’s department of psychology.

The most surprising part of the study was that nearly one in four people in Michigan identified themselves as childfree, which is a pretty huge proportion when compared to previous estimates that relied on fertility to identify childfree individuals. Such previous assessments placed the childfree rate at only 2% to 9%.

Among the childfree, 35% are in a partnered relationship, which means that couples who have no interest in ever becoming parents are a sizable type of family. It is not clear whether the proportion of this type of childfree family has increased over time, or what impact it might have on the demographics of Michigan (or of the United States at large, if the findings hold countrywide), as these questions were not the focus of the study.