Tag Archives: age

How old is your dog? Open-science project is studying how dogs age, and you can join it

We’ve all heard the saying that one dog year is roughly equivalent to seven human years. But new research is working to find out more about how dogs progress through life — and, in turn, teach us about how we, ourselves, age.

Image via Pixabay.

It is true that dogs age faster than humans. However, according to the researchers behind the Dog Aging Project (DAP), founded in 2018, the details are a bit murky. Saying that one dog year is equivalent to seven human years is a very broad simplification; big dogs tend to age the fastest, around 10 times as fast as humans, while little breeds age more slowly, at about five times the human rate.
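As a rough back-of-the-envelope illustration of those multipliers, here is a minimal sketch; the weight cut-offs and the function itself are our own illustration for the sake of the example, not anything defined by the DAP:

# Rough illustration of the multipliers quoted above (~10x for big dogs,
# ~5x for small breeds, ~7x as the familiar average). The weight cut-offs
# below are assumptions for this example, not DAP definitions.
def rough_human_equivalent(dog_age_years, weight_kg):
    if weight_kg >= 25:
        rate = 10   # large breeds age fastest
    elif weight_kg <= 10:
        rate = 5    # small breeds age more slowly
    else:
        rate = 7    # the classic "seven dog years" rule of thumb
    return dog_age_years * rate

print(rough_human_equivalent(3, 30))   # a 3-year-old large dog: roughly 30 "human years"

It is exactly this kind of crude rule of thumb that the project hopes to replace with real, breed- and size-specific data.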

In other words, there is still much we don’t know about how man’s best friend grows old. Which is why the DAP was set up.

A dog’s life

“This is a very large, ambitious, wildly interdisciplinary project that has the potential to be a powerful resource for the broader scientific community,” said Joshua Akey, a professor in Princeton’s Lewis-Sigler Institute for Integrative Genomics and a member of the Dog Aging Project’s research team.

“Personally, I find this project exciting because I think it will improve dog, and ultimately, human health.”

The project is the largest undertaking to date that looks into canine aging and longevity. It currently involves tens of thousands of dogs of all breeds, sizes, and backgrounds, data from which goes into an open-source repository for veterinarians and scientists to use in the future. This wealth of data can be used to assess how well a particular dog is faring for its age, the researchers behind the DAP explain, and to help further our understanding of healthy aging in both dogs and humans.

It is set to run for at least 10 years in order to gather the data required. So far, over 32,000 dogs and their owners have joined the program, and recruitment is still ongoing. The owners of these dogs agreed to fill out annual surveys and take various measurements of their dogs to be used in the program. Some of them have also been asked to collect DNA material via cheek swabs for the researchers to sample. In addition, veterinarians associated with the program across the USA submit fur, blood, and other required samples from the dogs enrolled in the program (collectively known as the “DAP Pack”).

“We are sequencing the genomes of 10,000 dogs,” Akey said. “This will be one of the largest genetics data sets ever produced for dogs, and it will be a powerful resource not only to understand the role of genetics in aging, but also to answer more fundamental questions about the evolutionary history and domestication of dogs.”

The end goal of the program is to isolate specific biomarkers of aging in dogs. These should translate well to humans, the team explains. Dogs experience almost the same diseases and functional declines related to age as humans, veterinary care of dogs mirrors human healthcare in many ways, and dogs very often share living environments with humans. That last factor is very important as the environment is a main driver of aging and cannot be replicated in the lab.

Given that dogs share our environment, age similarly to us, but are much shorter-lived than humans, we have an exciting opportunity to identify factors that promote a healthy lifespan, and to find the signs of premature aging.

The oldest 300 dogs in the program will have their DNA sequenced as part of the ‘super-centenarian’ study. The team hopes to start this process in a few months. By that time, they will also open their entire anonymized dataset for researchers around the world to study.

If you live in the USA and would like to help, you and your doggo can enroll here.

The paper “An Open Science study of ageing in companion dogs” has been published in the journal Nature.

Genomic studies uncover the tale of the first Bronze Age civilizations in Europe

Although they were set apart in cultural customs, architectural preference, and art, the earliest bronze-using civilizations from Europe were quite similar from a genetic standpoint, a new paper reports.

Reenactors living as a bronze-age family. Image credits Hans Splinter.

The exact details of the Early Bronze Age civilizations across the world aren’t always clear — and the peoples living around the Aegean Sea are no exception to this. One theory regarding this period is that these groups — mainly the Minoan, Helladic, and Cycladic civilizations — were introduced to new technology and ideas by groups migrating from the east of the Aegean, with whom they intermingled.

However, new findings show that these groups were very similar genetically, which wouldn’t support the idea that an outside group was present and mixed extensively with the locals, at least during the Early Bronze Age (around 5,000 years ago). In turn, this would mean that the defining technologies and developments of this era, the ones that took us from the stone to the copper/bronze age, were developed in the Aegean Sea region largely independently of outside influences. That being said, the team does report finding genetic evidence of ‘relatively small-scale migration’ from the east of this area.

Domestically-developed, foreign influences

“Implementation of deep learning in demographic inference based on ancient samples allowed us to reconstruct ancestral relationships between ancient populations and reliably infer the amount and timing of massive migration events that marked the cultural transition from Neolithic to Bronze Age in Aegean,” says Olga Dolgova, a postdoctoral researcher in the Population Genomics Group at the Centre Nacional d’anàlisi Genòmica (CNAG-CRG), and a co-author of the paper.

The transition from the late stone age to the early bronze age was mediated (and made possible) by the development of ideas such as urban centers, the use of metal, an intensification in trade, and writing. History is rife with examples of people moving around and spreading ideas as they go, so the team set out to understand whether the Early Bronze Age in the Aegean area was made possible by such a movement of people and ideas.

To find out, they took samples from well-preserved skeletal remains at archaeological sites throughout this region. Six whole genomes were sequenced, four of them belonging to individuals from the three local culture groups during the Early Bronze Age, and two from the Helladic culture. Furthermore, full mitochondrial genomes were sequenced from 11 other individuals who lived during the Early Bronze Age.

This data was pooled together and used to perform demographic and statistical analyses in order to uncover the individual histories of the different population groups that inhabited this area at the time.

The findings seem to suggest that early developments were in large part made locally, most likely growing on top of the cultural background of local Neolithic groups, and weren’t owed to a massive influx of people from other areas.

By the Middle Bronze Age (4,000-4,600 years ago), however, individuals living in the northern Aegean area were quite different, genetically, from those of the Early Bronze Age. Half their lineage traced back to people from the Pontic-Caspian steppe, an area stretching north of the Black Sea from the Danube to the Ural River. By this point, they were already highly similar to modern Greeks, the team adds.

In essence, the findings suggest that immigration started playing an important role in shaping local genetics only after the peoples of the Aegean area had already transitioned from the stone age to the copper/early bronze age. These influxes nevertheless precede the earliest known forms of Greek; this would suggest that although immigration didn’t play a large part in shaping technology and know-how during the Early Bronze Age, it did play a central role in cultural matters as time went on, such as the emergence and evolution of Proto-Greek and the Indo-European languages originating in either Anatolia or the Pontic-Caspian Steppe region.

“Taking advantage [of the fact] that the number of samples and DNA quality we found is huge for this type of study, we have developed sophisticated machine learning tools to overcome challenges such as low depth of coverage, damage, and modern human contamination, opening the door for the application of artificial intelligence to paleogenomics data,” says Oscar Lao, Head of the Population Genomics Group at the CNAG-CRG, and a co-author of the paper.

The advent of the Bronze Age in the Aegean region was a pivotal event in European history, one whose legacy still shapes much of its economic, social, political, and philosophical traditions — and, by extension, the shape of the world we live in today.

Despite this, we know precious little about the peoples that made this transition, how they fared over time, or how much of their legacy still resides in the genomes of modern-day groups such as the Greeks. The team hopes that similar research can be carried out in Armenia and the Caucasus, two regions to the east of the Aegean. A better understanding of the peoples there could help clarify what was going on in the Aegean at the time, and with it the evolution of local technology, languages, customs, and genetic heritage.

The paper “The genomic history of the Aegean palatial civilizations” has been published in the journal Cell.

Stone-age humans mostly ate meat, then ran out of big animals

Stone Age humans used to dine mainly on meat, a new study reports. It was only as megafauna (the huge animals of yore, like mammoths) died off that vegetables increasingly made their way onto the menu.

Image credits Uwe Ruhrmann.

A new paper offers a fresh and interesting interpretation of how humanity made the trek from hunting to agriculture. According to the findings, ancient humans were primarily carnivores, with game meat making up an important part of their diet. But as the species they hunted died out, vegetables and plant matter made up a growing part of their diets. These extinctions likely also led to the domestication of plants and animals, as our ancestors needed to secure sources of food.

Traditional cuisine

“So far, attempts to reconstruct the diet of stone-age humans were mostly based on comparisons to 20th-century hunter-gatherer societies,” explains Dr. Miki Ben-Dor of the Jacob M. Alkov Department of Archaeology at Tel Aviv University, first author of the paper.

“This comparison is futile, however, because two million years ago hunter-gatherer societies could hunt and consume elephants and other large animals — while today’s hunter-gatherers do not have access to such bounty. The entire ecosystem has changed, and conditions cannot be compared. We decided to use other methods to reconstruct the diet of stone-age humans: to examine the memory preserved in our own bodies, our metabolism, genetics, and physical build. Human behavior changes rapidly, but evolution is slow. The body remembers.”

The team trawled through almost 400 scientific papers from various disciplines, trying to determine whether stone-age humans were carnivores or omnivores. They collected around 25 lines of evidence, mostly from papers dealing with genetics, metabolism, physiology, and morphology, that can help us determine this.

One of the tidbits cited by the team is the acidity of the human stomach. This is “high when compared to omnivores and even to other predators”, they explain, which means our bodies have to spend extra energy to keep it that way. But it also provides some protection from bacteria often found in meat, suggesting that this was an adaptation meant to help our ancestors eat meat. Ancient peoples hunted large animals whose meat would feed the group for days or weeks, meaning they often ate old meat laden with bacteria.

Another clue they list is the way our bodies store fat. Omnivores, they explain, tend to store fat in a relatively small number of large cells. Predators do the opposite, using a large number of relatively small cells — an approach humans share. A comparison with chimpanzees also shows that areas of our genetic code are inactivated in ways that specialize us for a fat-rich diet (in chimps, the corresponding changes support a sugar-rich diet).

Archeological evidence also supports the meat-eating hypothesis. Isotope ratio studies on the bones of ancient humans, alongside evidence of how they hunted, suggest our ancestors specialized in hunting large or medium-sized animals that had a lot of fat. Large social predators today also hunt large animals and get over 70% of their energy from animal sources, the team writes, and this parallel suggests that early human groups acted a lot like hypercarnivores.

“Hunting large animals is not an afternoon hobby,” says Dr. Ben-Dor. “It requires a great deal of knowledge, and lions and hyenas attain these abilities after long years of learning. Clearly, the remains of large animals found in countless archaeological sites are the result of humans’ high expertise as hunters of large animals.”

“Many researchers who study the extinction of the large animals agree that hunting by humans played a major role in this extinction — and there is no better proof of humans’ specialization in hunting large animals. Most probably, like in current-day predators, hunting itself was a focal human activity throughout most of human evolution. Other archaeological evidence — like the fact that specialized tools for obtaining and processing vegetable foods only appeared in the later stages of human evolution — also supports the centrality of large animals in the human diet, throughout most of human history.”

The findings go against the grain of our previous hypotheses on how humans evolved. Previously, it was assumed that humans’ dietary flexibility allowed them to adapt to a wide range of situations and environments, giving them an evolutionary edge; but the current findings suggest that we evolved largely as predators instead. That’s not to say they ate only meat — there is well-documented evidence of plant-eating during this time — but plants only gained a central place in their diets in the latter days of the Stone Age.

Stone tools specialized for processing plants started appearing around 85,000 years ago in Africa and about 40,000 years ago in Europe and Asia, the team adds, suggesting plants were increasingly being eaten. The researchers also explain that such tools show an increase in local uniqueness over time, a process similar to that seen in 20th-century hunter-gatherer societies. In contrast, during the time when the team believes humans acted more like apex predators, stone tools maintained very high degrees of similarity and continuity regardless of local ecological conditions.

“Our study addresses a very great current controversy — both scientific and non-scientific. It is hard to convince a devout vegetarian that his/her ancestors were not vegetarians, and people tend to confuse personal beliefs with scientific reality,” adds Prof. Ran Barkai, also of the Jacob M. Alkov Department of Archaeology at Tel Aviv University, and a co-author of the paper.

“Our study is both multidisciplinary and interdisciplinary. We propose a picture that is unprecedented in its inclusiveness and breadth, which clearly shows that humans were initially apex predators, who specialized in hunting large animals. As Darwin discovered, the adaptation of species to obtaining and digesting their food is the main source of evolutionary changes, and thus the claim that humans were apex predators throughout most of their development may provide a broad basis for fundamental insights on the biological and cultural evolution of humans.”

The paper “The evolution of the human trophic level during the Pleistocene” has been published in the American Journal of Physical Anthropology.

A new study looked at how early complex European cultures farmed and ate

New research is shedding light onto the social and agricultural customs of early Bronze Age societies.

Map showing the maximum territorial extension of the El Argar culture, with the locations of the analyzed sites (La Bastida and Gatas).
Image credits Corina Knipper et al. (2020), PLOS One.

The El Argar society is known from a site in the south-eastern corner of the Iberian peninsula (today’s Spain). It is believed, however, that it held cultural and political sway over a larger area during its day, from 2200-1550 cal BCE. It also developed sophisticated pottery and ceramics, which it traded with other groups in the Mediterranean region.

New research based on El Argar gravesites and the layouts of their settlements reports that it was likely a strongly-hierarchical society that revolved around complex, “monumentally fortified” hilltop settlements. The findings showcase the potential of including trophic (dietary) analysis in anthropology, and help to reveal the complexity that societies in this period could achieve.

Farming for success

“It is essential to not only investigate human remains, but also comparative samples of different former food stuffs as well as to interpret the data in the light of the archaeological and social historical context,” explains Dr. Corina Knipper from the Curt Engelhorn Center Archaeometry, the paper’s lead author.

The team used carbon and nitrogen isotope analysis on samples recovered from two El Argar hilltop settlements: a large fortified urban site (La Bastida, in today’s Murcia region) and a smaller settlement at Gatas (in today’s Almería region). The samples analyzed include remains from 75 different individuals across all social levels, 28 domestic animal and wild deer bones, 75 grains of charred barley, and 29 grains of charred wheat. All the samples hail from the middle to late phases of the El Argar civilization.

The findings showed no significant difference in isotope values between males and females, indicating that both sexes shared similar diets. However, the team did find a difference between social strata — remains from individuals that made up the elite of La Bastida showed higher carbon and nitrogen isotope values than their peers. This could indicate that individuals there ate more animal-based products (nitrogen becomes more concentrated the further up the food chain you go). However, the team also reported that while the nitrogen values for barley were similar at both sites, domestic animals at La Bastida showed higher nitrogen values; this means the same general diet at both sites could still have produced the different nitrogen levels seen in people.
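As a rule of thumb from the stable-isotope literature (ballpark textbook values, not figures taken from this particular study), each step up the food chain enriches nitrogen-15 by a few per mil:

δ¹⁵N(consumer) ≈ δ¹⁵N(diet) + 3-5‰

So, all else being equal, a group eating more animal products will show higher δ¹⁵N in its remains, which is why the elevated values at La Bastida could reflect either a meatier diet or animals that were themselves enriched.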

The latter view is further strengthened by the finding that these communities relied heavily on cereal farming, which they only supplemented with livestock. Analysis of the wheat and barley suggests that the land they grew on was dry and unirrigated, but likely fertilized with animal manure, judging from the high nitrogen levels the grains contain. Cereals and their by-products also seem to have provided most of the forage for domesticated animals (sheep, goats, cattle, and pigs).

The study is based on a small sample size, which limits the reliability of the results. However, it does highlight the role trophic chain analysis can play in helping archaeologists piece together the past from human remains. It also goes a long way toward showing that El Argar farmers had developed relatively sophisticated practices for their time, which allowed them to feed a thriving community.

The paper “Reconstructing Bronze Age diets and farming strategies at the early Bronze Age sites of La Bastida and Gatas (southeast Iberia) using stable isotope analysis” has been published in the journal PLOS One.

Earliest evidence of milk consumption comes from Stone Age Britain

Researchers, led by archaeologists at the University of York, have found the earliest evidence of milk consumption ever observed in the teeth of prehistoric British farmers.

Image credits Myriam Zilles.

The team identified a milk protein called beta-lactoglobulin (BLG) in the mineralized dental plaque of seven individuals who lived around 6,000 years ago. The findings will help improve our understanding of when humans developed lactase persistence (LP), the ability to digest lactose in milk. It’s also the earliest confirmed sighting of the BLG molecule so far.

Luckily they didn’t brush their teeth

“The fact that we found this protein in the dental calculus of individuals from three different Neolithic sites may suggest that dairy consumption was a widespread dietary practice in the past,” says lead author Dr. Sophy Charlton, from the Department of Archaeology at the University of York.

Dental plaque, while not something you want to have, can be used to gain insight into the diets of ancient people. The material traps proteins from food, through saliva, which are then mineralized in plaque or tartar. The samples of dental plaque analyzed in this study are the oldest to be investigated for protein content, the team explains.

The Neolithic period in Britain ran from 4,000 to 2,400 BC and saw the transition from hunter-gatherer communities to farming, mostly revolving around the growing of wheat and barley and the domestication of animals such as cows, sheep, pigs, and goats. This time also saw the emergence of complex cultural practices such as the construction of monumental and burial sites.

The remains used in this study come from three different Neolithic sites in England: Hambledon Hill, Hazleton North (both in the south of England), and Banbury Lane (in the East Midlands). Individuals from all three sites had milk proteins from goats, cows, and sheep, suggesting that multiple domesticated species were reared at the same time.

“It would be a fascinating avenue for further research to look at more individuals and see if we can determine whether there are any patterns as to who was consuming milk in the archaeological past — perhaps the amount of dairy products consumed or the animals utilised varied along the lines of sex, gender, age or social standing,” says Dr. Charlton.

Finding these proteins in the ancient teeth is particularly exciting, as previous genetic work has suggested that people living at the time did not yet have the ability to digest lactose.

Overall, it means that these ancient farmers either consumed milk in small amounts or processed it into foods such as cheese (which removes most of the lactose). Lactase persistence, our ability to digest milk into adulthood, is the result of a mutation in the genes controlling production of lactase, the enzyme that breaks down lactose. How and why we evolved this ability is of considerable interest to researchers, as milk and dairy products played an important part in past diets, as well as those of today — and this study gives us a better idea of when the mutation occurred, the conditions that helped it appear, and how people dealt with lactose intolerance before it.
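For context, the reaction lactase catalyzes is a simple hydrolysis, splitting milk sugar into two easily absorbed sugars:

lactose + H₂O → glucose + galactose

Without the enzyme, undigested lactose ferments in the gut instead, which is what makes unprocessed milk so unpleasant for lactose-intolerant adults, and why turning it into lower-lactose foods such as cheese would have mattered to these early farmers.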

“Because drinking any more than very small amounts of milk would have made people from this period really quite ill, these early farmers may have been processing milk, perhaps into foodstuffs such as cheese, to reduce its lactose content,” says Dr. Charlton.

“Identifying more ancient individuals with evidence of BLG in the future may provide further insights into milk consumption and processing in the past, and increase our understanding of how genetics and culture have interacted to produce lactase persistence.”

The paper “New insights into Neolithic milk consumption through proteomic analysis of dental calculus” has been published in the journal Archaeological and Anthropological Sciences.


In the Stone Age, people recycled flint on purpose to produce precision blades

Research from Tel Aviv University (TAU) shows that recycling may, in fact, be an ancient tradition. Prehistoric humans deliberately “recycled” discarded or broken flint tools 400,000 years ago to create smaller, more specialized tools.


Tuber cutting with a small recycled flake, alongside a close-up.
Image credits Flavia Venditti / AFTAU.

In collaboration with colleagues from the University of Rome, researchers from TAU’s Department of Archaeology and Ancient Near Eastern Cultures used two different spectrometry methods to analyze small, peculiar tools that have been uncovered at prehistoric sites throughout Europe and North Africa. Their edges show signs of use, the team reports, and they were likely used in food preparation. This theory is also supported by micro-residues found embedded in the edges.

Recycling, before it was cool

“Recycling was a way of life for these people,” says Prof. Ran Barkai from TAU, the paper’s corresponding author. “It has long been a part of human evolution and culture. Now, for the first time, we are discovering the specific uses of the recycled ‘tool kit’ at Qesem Cave.”

The site of Qesem Cave is located just outside Tel Aviv. It was discovered during road construction work underway in the area in 2000. Together with caves in Spain and North Africa and digs in Italy and Israel, Qesem produced the tiny blades the team analyzed in the study. Along with other material retrieved from these sites, the tiny blades show signs that prehistoric humans recycled broken tools, or those that were no longer needed, into tinier but more specialized blades.

Due to these caves’ microclimates, the flint tools were preserved in excellent condition, along with residue material from their use embedded in their edges — allowing for their proper analysis. The researchers used two techniques to do so: Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy coupled with energy dispersive X-ray spectroscopy (SEM-EDX).

“We used microscopic and chemical analyses to discover that these small and sharp recycled tools were specifically produced to process animal resources like meat, hide, fat and bones,” explains Dr. Flavia Venditti of the TAU and lead author of the study.

“We also found evidence of plant and tuber processing, which demonstrated that they were also part of the hominids’ diet and subsistence strategies.”

Signs of use were found on the outer edges, the team reports, indicative of cutting. Residues on the blades suggest that they were used in activities related to the consumption of food: butchery, and the processing of tubers, hide, and bone.

The team took great pains to meticulously analyze the tools to “demonstrate that [they] were used in tandem with other types of utensils,” according to Dr. Venditti. This would suggest that the recycling was a deliberate process, used specifically to produce a more specialized tool to be used as part of a larger kit.

“The research also demonstrates that the Qesem inhabitants practiced various activities in different parts of the cave: The fireplace and the area surrounding it were eventually a central area of activity devoted to the consumption of the hunted animal and collected vegetal resources, while the so-called ‘shelf area’ was used to process animal and vegetal materials to obtain different by-products,” she adds.

The study touches on two hot topics in the field of stone-age archaeology, looking at both the role of small tools and that of recycling in prehistoric communities. The findings show that recycling was an established, ongoing practice at Qesem Cave rather than an opportunistic one. The people in this area had ample access to flint, the team also notes, so it wasn’t a question of scarcity. Rather, it seems that this group deliberately recycled tools to produce these tiny blades because it was the most effective way to do so. The blades had to be tiny yet sharp, as they were used in tasks where “precision and accuracy were essential,” Venditti concludes.

The paper “Recycling for a purpose in the late Lower Paleolithic Levant: Use-wear and residue analyses of small sharp flint items indicate a planned and integrated subsistence behavior at Qesem Cave (Israel)” has been published in the Journal of Human Evolution.


Ice ages may be caused by tectonic activity in the tropics, new study proposes

New research says that the Earth’s past ice ages may have been caused by tectonic pile-ups in the tropics.


A crevasse in a glacier.
Image via Pixabay.

Our planet has braved three major ice ages in the past 540 million years, seeing global temperatures plummet and ice sheets stretching far beyond the poles. Needless to say, these were quite dramatic events for the planet, so researchers are keen to understand what set them off. A new study reports that plate tectonics might be the culprit.

Cold hard plates

“We think that arc-continent collisions at low latitudes are the trigger for global cooling,” says Oliver Jagoutz, an associate professor in MIT’s Department of Earth, Atmospheric, and Planetary Sciences and a co-author of the new study.

“This could occur over 1-5 million square kilometers, which sounds like a lot. But in reality, it’s a very thin strip of Earth, sitting in the right location, that can change the global climate.”

“Arc-continent collisions” is a term that describes the slow, grinding head-butting that takes place when a piece of oceanic crust hits a continent (i.e. continental crust). Generally speaking, oceanic crust (OC) will slip beneath the continental crust (CC) during such collisions, as the former is denser than the latter. Arc-continent collisions are a mainstay of orogen (mountain range) formation, as they cause the edges of CC plates to ‘wrinkle up’. But in geology, as is often the case in life, things don’t always go according to plan.

The study reports that the last three major ice ages were preceded by arc-continent collisions in the tropics which exposed tens of thousands of kilometers of oceanic, rather than continental, crust to the atmosphere. The heat and humidity of the tropics then likely triggered a chemical reaction between calcium and magnesium minerals in these rocks and carbon dioxide in the air. This would have scrubbed huge quantities of atmospheric CO2 to form carbonate rocks (such as limestone).
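A simplified version of that weathering reaction, written here for the calcium end-member (the magnesium silicates react analogously), is:

CaSiO₃ + CO₂ → CaCO₃ + SiO₂

In other words, a calcium silicate mineral plus atmospheric carbon dioxide yields calcium carbonate (the stuff of limestone) plus silica, locking the carbon away in solid rock.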

Over time, this led to a global cooling of the climate, setting off the ice ages, they add.

The team tracked the movements of two suture zones (the areas where plates collide) in today’s Himalayan mountains. Both sutures were formed during the same tectonic migrations, they report: one collision 80 million years ago, when the supercontinent Gondwana moved north, creating part of Eurasia, and another 50 million years ago. Both collisions occurred near the equator and preceded global atmospheric cooling events by several million years.

In geological terms, ‘several million years’ is basically the blink of an eye. So, curious to see whether one event caused the other, the team analyzed the rate at which oceanic rocks known as ophiolites can react to CO2 in the tropics. They conclude that, given the location and magnitude of the events that created them, both of the sutures they investigated could have absorbed enough CO2 to cool the atmosphere enough to trigger the subsequent ice ages.

Another interesting find is that the same processes likely led to the end of these ice ages. The fresh oceanic crust progressively lost its ability to scrub CO2 from the air (as the calcium and magnesium minerals transformed into carbonate rocks), allowing the atmosphere to stabilize.

“We showed that this process can start and end glaciation,” Jagoutz says. “Then we wondered, how often does that work? If our hypothesis is correct, we should find that for every time there’s a cooling event, there are a lot of sutures in the tropics.”

The team then expanded their analysis to older ice ages to see whether they were also associated with tropical arc-continent collisions. After compiling the locations of major suture zones on Earth from pre-existing literature, they reconstructed their movement, and that of the plates which generated them, over time using computer simulations.

All in all, the team found three periods over the last 540 million years in which major suture zones (those about 10,000 kilometers in length) were formed in the tropics. Their formation coincided with three major ice ages, they add: one in the Late Ordovician (455 to 440 million years ago), one in the Permo-Carboniferous (335 to 280 million years ago), and one in the Cenozoic (35 million years ago to present day). This doesn’t appear to be a coincidence, either: the team explains that no ice ages or glaciation events occurred during periods when major suture zones formed outside of the tropics.

“We found that every time there was a peak in the suture zone in the tropics, there was a glaciation event,” Jagoutz says. “So every time you get, say, 10,000 kilometers of sutures in the tropics, you get an ice age.”

Jagoutz notes that there is a major suture zone active today in Indonesia. It includes some of the largest bodies of ophiolite rocks in the world today, and Jagoutz says it may prove to be an important resource for absorbing carbon dioxide. The team says that the findings lend some weight to current proposals to grind up these ophiolites in massive quantities and spread them along the equatorial belt in an effort to counteract our CO2 emissions. However, they also point to how such efforts may, in fact, produce additional carbon emissions — and also suggest that such measures may simply take too long to produce results within our lifetimes.

“It’s a challenge to make this process work on human timescales,” Jagoutz says. “The Earth does this in a slow, geological process that has nothing to do with what we do to the Earth today. And it will neither harm us, nor save us.”

The paper “Arc-continent collisions in the tropics set Earth’s climate state” has been published in the journal Science.


Engaging in cultural activities can stave off depression in old age

Hit the movies when you’re feeling down, or go to the theater. It’ll help.


Image via Pixabay.

Regularly attending cultural events can help fight depression as we age, a new study reports. The researchers showed that older people can cut their risk of developing depression by 32% simply by attending cultural activities once every few months. The more you do it, the better it works, too: people attending at least one such event per month lowered their risk of developing depression by 48%.

Culturally fit

The results come from a decade-long study that looked at the relationship between cultural engagement — plays, movies, concerts, and museum exhibits — and the risk of developing depression. That study, the English Longitudinal Study of Ageing (ELSA), followed roughly 2,000 men and women, all from England and over the age of 50, for 10 years.

The ELSA used interviews and surveys to gauge both depression incidence and the frequency with which study participants attended the theater, concerts, the opera, movies, art galleries and/or museums.

The present study’s lead author, Daisy Fancourt of University College London, suggested that there are probably many positive “side effects” generated by cultural participation, all of which seem to tone down the risk of developing depression.

“For example, going to concerts or the theater gets people out of the house,” Fancourt explained, “which reduces sedentary behaviors and encourages gentle physical activity, which is protective against depression.”

“It also provides social engagement, reducing social isolation and loneliness. Engaging with the arts is stress-reducing, associated with lower stress hormones such as cortisol, and also lower inflammation, which is itself associated with depression.”

These activities are mentally stimulating, which makes them useful for reducing the risk of depression, but they also help prevent cognitive decline as we age. By stimulating the mind, evoking positive feelings, and creating opportunities for social interaction, such activities help enhance overall mental health. Fancourt adds that cultural engagement can also help trigger the release of dopamine, often called the “feel good” neurotransmitter.

On the whole, the end result is likely not only a lower risk for depression but also lower risk for dementia, chronic pain, and even premature death, she concludes.

“So in the same way we have a ‘five-a-day’ [recommendation] for fruit and vegetable consumption, regular engagement in arts and cultural activities could be planned into our lives to support healthy aging,” she advised.

It has to be noted that the paper spotted an association, not a robust cause-and-effect relationship. Still, the results held true for all participants, regardless of age, gender, health, income, educational background, relationships with family and friends, participation in non-arts-related social groups, or their exercise habits (or lack thereof). The results apparently even held for those with a predisposition to depression.

So why not book a ticket to a nearby play? It will help gently set you back into motion after the holidays (and all the food) and might just stave off depression in your later years. Win-win.

The paper “Cultural engagement and incident depression in older adults: evidence from the English Longitudinal Study of Ageing” has been published in The British Journal of Psychiatry.


Proper hydration helps seniors get the full benefit of exercise and keeps their minds limber

When your hairs start turning gray, the water bottle should be your mainstay — at least while exercising. New research shows that middle-aged and older adults should drink more water to gain the full benefits of exercise.


Image credits Gustavo Jeronimo / Wikimedia.

Few things will ruin your workout quite like dehydration. Even if you power through and keep to your routine despite the cottonmouth, you won’t benefit that much from it: dehydration has been shown to impair exercise performance and brain function in young people. However, the effect of dehydration during exercise for older individuals was poorly studied, and thus poorly understood, as there are some key metabolic differences between these age groups.

“Middle-age and older adults often display a blunted thirst perception, which places them at risk for dehydration and subsequently may reduce the cognitive health-related benefits of exercise,” the authors wrote.

Age slows down our metabolic rate, meaning we need fewer calories. Coupled with the fact that we generally tend not to be as physically active as we age, elderly people tend to experience a decrease in appetite too. By eating less food, they get less hydration from solid food sources — humans generally get about half their daily water requirement from solid foods, as well as from fruit and vegetable juices.

To get a better understanding of how this impacts the health benefits of exercise, the New England-based team of researchers recruited recreational cyclists who took part in a large cycling event on a warm day (78-86°F or 25.5-30°C). The participants’ average age was 55.

The cyclists were asked to go through a “trail-making” executive function test: they had to connect numbered dots on a piece of paper, being graded both on their speed and accuracy. Executive functions are a set of processes that all have to do with managing oneself and one’s resources in order to achieve a goal. They include the ability to plan, focus, remember, and multitask. Exercise has been shown to improve intellectual health, including executive function.

The team also tested the volunteers’ urine before they exercised, and divided them into two groups based on the results: a ‘normal hydration’ group and a ‘dehydrated’ group.

Those in the normal hydration group showed a noticeable improvement in completion speed of the trail-making test after cycling (relative to their initial results). The dehydrated group also completed the task more quickly after cycling, but the difference in completion times wasn’t significant, the researchers noted.

“This suggests that older adults should adopt adequate drinking behaviors to reduce cognitive fatigue and potentially enhance the cognitive benefits of regular exercise participation,” the researchers wrote.

The paper “Dehydration impairs executive function task in middle-age and older adults following endurance exercise” was presented on Sunday, April 22, at the American Physiological Society (APS) annual meeting Experimental Biology 2018 in San Diego.

Older fathers tend to raise geekier children

New research shows that older fathers tend to have more intelligent children, who are less concerned about fitting in but more focused on their own interests — traits usually bunched together as ‘geekiness’.


Luke I *hhhhhh* am your father. I’m also pretty old, that’s why you like tinkering with racing craft.
Image credits Jordi Voltordu.

Researchers from King’s College London and The Seaver Autism Center for Research and Treatment at the Icahn School of Medicine at Mount Sinai in the United States wanted to see how a father’s age influences their children’s personality. Towards this end, they looked at cognitive data from 15,000 pairs of UK twins, recorded as part of the Twins Early Development Study (TEDS). All in all, the older the father, the geekier their children tended to be, the team reports.

At the age of 12, the twins completed online tests that measured several of their cognitive traits, including some most people would bunch up as being ‘geeky’, such as non-verbal IQ, the strength of their focus on a subject of interest, and levels of social aloofness. Their parents were also asked to rate how much their child cares about the way their peers perceive them, and whether they have any interests that take up a substantial chunk of their time. Using this data, the team calculated a ‘geek index’ for every child in the study.

Overall, children who scored higher on the index tended to have older fathers. This correlation held even after the team corrected for the family’s socioeconomic conditions, parent’s levels of education, and employment.

“Our study suggests that there may be some benefits associated with having an older father,” said Dr. Magdalena Janecka from King’s College London and The Seaver Autism Center at Mount Sinai. “We have known for a while about the negative consequences of advanced paternal age, but now we have shown that these children may also go on to have better educational and career prospects.”

Among these benefits, the team points out that the children who scored higher on the index tended to do better in school and score higher in exams taken several years after the measurements, particularly in STEM (science, technology, engineering, and mathematics) subjects.

The team believes that there are several reasons why older fathers may geekify their kids. For example, older fathers tend to have more well-established careers and a higher socioeconomic standing than their younger counterparts — so their children are more likely to be brought up in richer environments, have better education, and a higher exposure to STEM fields.

The findings could also help us understand the links between higher paternal age, ‘geeky’ characteristics, and neurological conditions. Previous research has shown that children of older fathers are at a higher risk of some adverse outcomes, including autism and schizophrenia. Although the team couldn’t measure any effect directly, they hypothesize that some of the genes behind geeky characteristics overlap with some of those that promote traits associated with autism — and older fathers are more likely to pass them along.

“When the child is born only with some of those genes, they may be more likely to succeed in school,” Dr Janecka adds. “However, with a higher ‘dose’ of these genes, and when there are other contributing risk factors, they may end up with a higher predisposition for autism.”

“This is supported by recent research showing that genes for autism are also linked with higher IQ.”

The full paper “Advantageous developmental outcomes of advancing paternal age” has been published in the journal Translational Psychiatry.

Facial reconstruction shows what British people looked like 3,700 years ago

The facial reconstruction of Ava, who died more than 3,700 years ago.
Image credits Hew Morrison.

Archaeologists and forensic artists have completed the facial reconstruction of a woman who died around 3,700 years ago in the Scottish Highlands. The woman is believed to have belonged to the Beaker culture, which became prominent in Europe in the Bronze Age for their metalwork and characteristic pottery.

The woman has been named Ava, an abbreviation of Achavanich, Caithness, where she was found in 1987. A specialist examination at the time of the discovery in the 1980s suggested that the skeletal remains were those of a young Caucasian woman aged 18-22.

She became the subject of a long-term research project by archaeologist Maya Hoole, as her burial stands out from others during the Bronze Age. Her bones were discovered in a pit dug into solid rock — which is highly unusual as excavating such a hard medium was extremely laborious — along with several artifacts. Even more puzzlingly, her skull has an abnormal shape, which some believe is the result of deliberate binding.

Forensic artist Hew Morrison, a graduate of the University of Dundee’s Forensic Art MSc programme, created the reconstruction. As the skull was missing its jaw bone, he had to estimate the shape of the lower jaw from her skull dimensions, as well as the depth of her skin. Morrison also used a chart of modern average tissue depths as a reference.

“The size of the lips can be determined by measuring the enamel of the teeth and the width of the mouth from the position of the teeth,” he explained.

Morrison added layer after layer of muscle and tissue over her face, drawing on a large database of high-resolution facial images to recreate her features. These were then tailored to the anatomy of her skull after constructing her facial muscles. The features were then “morphed together”, using computer software to create the reconstructed face.

Ava’s skull was first discovered in 1987.
Image credits Michael Sharpe.

“Normally, when working on a live, unidentified person’s case not so much detail would be given to skin tone, eye or hair colour and hair style as none of these elements can be determined from the anatomy of the skull,” Morrison said.

“So, creating a facial reconstruction based on archaeological remains is somewhat different in that a greater amount of artistic licence can be allowed.”

He added: “I have really appreciated the chance to recreate the face of someone from ancient Britain. Being able to look at the faces of individuals from the past can give us a great opportunity to identify with our own ancient ancestors.”

“When I started this project I had no idea what path it would take, but I have been approached by so many enthusiastic and talented individuals – like Hew – who are making the research a reality,” Hoole added.

“I’m very grateful to everyone who has invested in the project and I hope we can continue to reveal more about her life.”


Forever young: ants don’t seem to age

Most people don’t have that much of an issue with dying itself; what really gets to them is being freaking old. Being old is a drag. You gain weight, your skin gets wrinkled, your mind and body weaken — and it all gets gradually worse until you expire. Ants don’t seem to share this human tragedy. By all accounts, the ants in a new study don’t seem to age at all, dying in still-youthful bodies.


Left: some grumpy old men. Right: Pheidole dentata, a native of the southeastern U.S. The ant isn’t immortal, but doesn’t seem to age.

This is according to researchers at Boston University who followed Pheidole dentata worker ants in a lab environment. These ants live to around 140 days old, and the team suspected that they would show the same signs of aging as most organisms, seeing as they develop new behavioral repertoires as they grow older. “We expected that there would be a normal curve for these kinds of functions — they’d improve, they’d peak and then they’d decline,” said James Traniello, one of the study’s authors.

The researchers carefully looked for signs of aging, from dead cells in the brain, to lower dopamine levels, to declining performance in daily tasks. None were observed. These ants seem to perform with flying colours right up until they die, like a bright flame that’s suddenly extinguished when the job is done. Moreover, the ants got better and better at anty-stuff (carrying food, finding resources) and became more active with each passing day of their lives.

Such displays are rare in the animal kingdom. Another notable example is the naked mole rat, which is arguably even more impressive: it can live for up to 30 years and stays spry for most of that time.

For now, scientists don’t know why these ants don’t seem to age, but being extremely social (part of the hive) might have something to do with it, they report in Proceedings of the Royal Society B. Follow-up studies on other ant species are sure to follow.

“Maybe the social component could be important,” says Ysabel Giraldo, who studied the ants for her doctoral thesis at Boston University. “This could be a really exciting system to understand the neurobiology of aging.”

This might seem like an epiphany. Maybe there’s a way to transfer the ants’ secret fountain of youth to humans. That would certainly make a lot of people happy, but it would likely never work. Ants are alien compared to humans. For one, worker ants don’t reproduce, and they use a lot less oxygen than we do. Given how complex the human organism is, it would never be feasible to mimic the ant’s way of life.

Don’t look so glum. Being human has its perks. Guess we’ll just have to come to terms with old age, until someone finds the Holy Grail.