New research is showcasing how a more healthy, balanced diet — including more legumes, whole grains, and nuts, while cutting down on red and processed meat — can lead to longer lives.
“You are what you eat” is an age-old saying, but a new study from the University of Bergen suggests that we also live as long as what we eat. The healthier and more diverse our diets, the longer our life expectancy (LE) becomes, it reports.
The paper estimates the effect of such changes to the typical Western diet for both sexes at various ages; the earlier these guidelines are incorporated into our eating habits, the larger the improvements in LE, but older people stand to benefit from significant (if smaller) gains as well.
Change your meals, enjoy more meals
“Our modeling methodology used data from [the] most comprehensive meta-analyses, data from the Global Burden of Disease study, life-table methodology, and added analyses on [the] delay of effects and combination of effects including potential effect overlap”, says Lars Fadnes, a Professor at the Department of Global Public Health at the University of Bergen who led the research, in an email for ZME Science.
“The methodology provides population estimates under given assumptions and is not meant as individualized forecasting, with uncertainty that includes time to achieve full effects, the effect of eggs, white meat, and oils, individual variation in protective and risk factors, uncertainties for future development of medical treatments; and changes in lifestyle.”
Dietary habits are estimated to contribute to 11 million deaths annually worldwide, and to 255 million disability-adjusted life-years (DALYs). One DALY, according to the World Health Organization “represents the loss of the equivalent of one year of full health”. In other words, there’s a lot of room for good in changing what we eat.
The team drew on existing databases to develop a computerized model to estimate how a range of dietary changes would impact life expectancy. The model is publicly available as the online Food4HealthyLife calculator, which you can use to get a better idea of how changing what you eat can benefit your lifespan. The team envisions that their calculator would also help physicians and policy-makers to understand the impact of dietary choices on their patients and the public.
For your typical young adult (20 years old) in the United States, the team reports that changing from the typical diet to an optimal one (as described by their model) could provide an increase in LE of roughly 10.7 years for women and 13 years for men. There is considerable uncertainty in these results — increases for women range between 5.9 and 14.1 years, and for men between 6.9 and 17.3 — due to factors the model doesn’t account for, such as preexisting health conditions, socioeconomic status, and so on. Changing diets at age 60 would still yield an increase in LE of 8 years for women and 8.8 years for men.
“The differences in life expectancy estimates between men and women are mainly due to differences in background mortality (and particularly cardiovascular disease such as coronary heart disease, where men generally are at higher risk at an earlier age compared to women),” Prof. Fadnes explained for ZME Science.
The largest gains in LE would be made by eating more legumes, more whole grains, more nuts, less red meat, and less processed meat.
So far, the research focused on the impact of diet on LE, but such changes could be beneficial in other ways, as well. Many of the suggestions the team makes are also more environmentally sustainable and less costly, financially. The team is now hard at work incorporating these factors into their online calculator, in order to help people get a better understanding of just how changes in diet can improve their lives, on all levels involved.
“We are working to include sustainability aspects in Food4HealthyLife too. Based on former studies, the optimal diets are likely to have substantial benefits compared to a typical Western diet also in terms of reduction in greenhouse gas emissions, land use, and other sustainability facets,” he added for ZME Science. “We have not systematically investigated financial aspects yet, but several of the healthy options could also be cheap, such as legumes and whole grains.”
The paper “Estimating the Impact of Food Choices on Life Expectancy: A Modeling Study” has been published in the journal PLoS Medicine.
More nutritious and healthy diet options can also help the climate, says a new analysis from the University of Leeds.
Our combined dietary habits can be a significant source of greenhouse gas emissions. Worldwide, food production accounts for roughly one-third of all emissions. This isn’t very surprising, since everybody needs to eat; but there are little tweaks we can apply to our lives which, added up, can lead to significant benefits for the climate.
New research at the University of Leeds reports that more nutritious, less processed, and less energy-dense diets can be much more sustainable from an environmental point of view than more common alternatives. While “less energy-dense” might sound like a bad thing, calorie content doesn’t translate into nutrient content. In other words, many energy-rich foods may actually just leave us fatter and malnourished.
“We all want to do our bit to help save the planet. Working out how to modify our diets is one way we can do that,” the authors explain. “There are broad-brush concepts like reducing our meat intake, particularly red meat, but our work also shows that big gains can be made from small changes, like cutting out sweets, or potentially just by switching brands.”
Similar analyses of the impacts of dietary options on the environment have been performed in the past. While their findings align well with the conclusions of the study we’re discussing today, they focused on broad categories of food instead of specific items. The team wanted to improve the accuracy of our data on this topic.
For the study, they pooled together published research on greenhouse gas emissions associated with food production to estimate the environmental impact of 3,233 specific food items. These items were selected from the UK Composition Of Foods Integrated Dataset (COFID). This dataset contains nutritional data regarding every item on the list and is commonly used to gauge the nutritional qualities of individuals’ diets.
The team used this data to evaluate the diets of 212 participants, who were asked to report what foods they ate during three 24-hour periods. In the end, this provided a snapshot of each participant’s usual nutritional intake and the greenhouse emissions generated during the production phase of all the items they consumed.
What the results show, in broad strokes, is the environmental burden of different types of diets, broken down by their constituent elements.
According to the findings, non-vegetarian diets had an overall 59% higher level of greenhouse gas emissions compared to vegetarian diets. This finding isn’t particularly surprising; industrial livestock farming is a big consumer of resources such as food and water and produces quite a sizeable amount of emissions from the animals themselves, the production of fodder, and through the processing and storage of meat and other goods.
Overall, men’s diets tended to be associated with higher emissions — 41% more on average than women’s diets — mainly due to higher meat consumption.
People who exceeded the recommended sodium (salt), saturated fat, and carbohydrate intake as set out by World Health Organization guidelines generated more emissions through their diets than those who did not.
Based on these findings, the authors offer their support for policies aimed at encouraging sustainable diets, especially those that are heavily plant-based. One other measure they are in support of is policy that promotes the replacement of coffee, tea, and alcohol with more sustainable alternatives.
The current study offers a much higher-resolution view of the environmental impact of different food items, but it is not as in-depth as it could be. In the future, the authors hope to be able to expand their research to include elements such as brand or country of origin to help customers better understand what choices they’re making. They also plan to include broader measures of environmental impact in their analyses, not just greenhouse gas emissions.
For now, the findings are based only on data from the UK, so they may not translate perfectly to other areas of the globe.
The paper “Variations in greenhouse gas emissions of individual diets: Associations between the greenhouse gas emissions and nutrient intake in the United Kingdom” has been published in the journal PLOS One.
Every time you queue at the hot dog stand, you’re not just wasting time standing idle. According to a new study by health and nutrition scientists at the University of Michigan, a single hot dog could take 36 minutes off your life due to the ill effects of highly processed foods. On the other hand, the same study found that fresh foods like fruits, nuts, legumes, and non-starchy vegetables add valuable moments to your lifespan with each bite.
The researchers, led by Olivier Jolliet, a professor of environmental health sciences at the University of Michigan, analyzed 5,853 foods found in the diets of Americans and compared how healthy or unhealthy they were using a single standardized measure: time added to or removed from our lifespan.
In order to index the beneficial and detrimental health burden of each food, the researchers used the most recent nutritional scientific literature to estimate morbidities associated with certain classes of foods. For instance, the authors of the study assumed 0.45 minutes are lost per gram of processed meat. Conversely, 0.1 minutes per gram of fruit are added to your lifespan when you consume these foods.
The number of healthy minutes of life gained or lost per serving of food is measured by the Health Nutritional Index (HENI), which the researchers introduced to the scientific literature.
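The per-gram bookkeeping described above can be sketched in a few lines. The two rates come straight from the article (0.45 minutes lost per gram of processed meat, 0.1 minutes gained per gram of fruit); the function name, the 80-gram portion, and the idea of summing only these two factors are illustrative assumptions — the real HENI covers 15 dietary factors.

```python
# Per-gram rates quoted in the article; signs follow the convention that
# positive values add healthy minutes and negative values remove them.
PER_GRAM_MINUTES = {
    "processed_meat": -0.45,  # minutes of healthy life lost per gram
    "fruit": +0.10,           # minutes of healthy life gained per gram
}

def heni_minutes(grams_by_factor):
    """Net healthy minutes gained (positive) or lost (negative) for a meal."""
    return sum(PER_GRAM_MINUTES[factor] * grams
               for factor, grams in grams_by_factor.items())

# A hypothetical 80 g portion of processed meat on its own:
print(heni_minutes({"processed_meat": 80}))                 # -36.0
# Pairing it with 100 g of fruit offsets only part of that:
print(heni_minutes({"processed_meat": 80, "fruit": 100}))   # -26.0
```

Note that an 80 g portion of processed meat alone already accounts for the 36-minute figure; in the study itself, a hot dog’s total also reflects sodium and trans fatty acids.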
“HENI takes into account 15 dietary factors from the Global Burden of Disease, which studies the burden of disability and death from a number of causes. These cover health benefits associated with food containing milk, nuts and seeds, fruits, calcium, omega-3 fatty acids from seafood, fibers and polyunsaturated fatty acids (PUFAs) and health damages associated with food containing processed meat, red meat, trans fatty acids, sugar-sweetened beverages and sodium. For each of these dietary factors, we estimated the healthy minutes of life lost or gained per gram of food consumed,” wrote Katerina Stylianou, a research associate at the University of Michigan School of Public Health and the director of public health information and data strategy at the Detroit Health Department.
Using these estimates, the researchers calculated, for instance, that a standard beef hot dog on a bun takes 36 minutes off your life, considering its high content of processed meat, sodium, and trans fatty acids.
If this assumption reflects reality, it spells very bad news for professional competitors in hot dog eating contests. Miki Sudo, who won every edition of the women’s competition at the Nathan’s Hot Dog Eating Contest since 2014, with an average of 40 hot dogs eaten per contest, could have lost 10,080 minutes or seven days of her life. That’s not counting the hot dogs she ate to train for the famous competition, which takes place every 4th of July in New York City. In 2020, the men’s competition was won by Joey Chestnut, who set a new record by eating 75 hot dogs at the cost of 2,700 minutes of his lifespan.
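The contest figures above are plain arithmetic on the study’s 36-minutes-per-hot-dog estimate; the quick check below reproduces them. The seven-contest count (2014–2020) is the one implied by the article’s seven-day total.

```python
# Minutes of healthy life lost per hot dog, per the study's estimate.
MINUTES_PER_HOT_DOG = 36

def minutes_lost(hot_dogs):
    """Total healthy minutes lost for a given number of hot dogs eaten."""
    return hot_dogs * MINUTES_PER_HOT_DOG

# Miki Sudo: seven wins at an average of 40 hot dogs per contest
sudo_minutes = minutes_lost(7 * 40)
print(sudo_minutes, sudo_minutes / (60 * 24))  # 10080 minutes, 7.0 days

# Joey Chestnut's 2020 record of 75 hot dogs in one sitting
print(minutes_lost(75))  # 2700 minutes
```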
Other popular processed foods that may shorten your life include bacon (6 minutes and 30 seconds per serving), pizza (7 minutes and 8 seconds), and double cheeseburgers (8 minutes and 8 seconds). On the opposite end of the spectrum, foods that add to your lifespan include salmon (13 minutes and 5 seconds per serving), bananas (13 minutes and 30 seconds), and avocados (2 minutes and 8 seconds).
A peanut butter and jelly sandwich surprisingly adds 33 minutes and 6 seconds to your lifespan, thanks to the nut butter that is rich in healthy fats, protein, and fiber. Seafood ranges from about 10 minutes of extra life to about 70 minutes, a broad range that is due to the healthy omega-3 fatty acid content that can vary wildly from fish to crustaceans.
These estimates are not meant to be exact, but rather to serve as guidelines to help consumers make healthier dietary choices. Every individual is different, after all, and that includes their reaction to certain foods. Modern nutritional research broadly agrees that ultra-processed foods significantly increase the risk of premature death, being associated with cancer and cardiovascular disease.
The takeaway is to cut back on processed foods, not to do the math every time you feel guilty about a hot dog with too many toppings. Combining burgers with servings of peanut butter so they cancel each other out is likely a very bad idea and would miss the point of these findings.
The HENI index was described at length in a study published in the journal Nature Food.
Basing your meals on unprocessed plant-based foods is healthy for your heart at any age, according to a duo of studies published in the Journal of the American Heart Association.
Eating meals rich in unprocessed plants, including fruits and vegetables, whole grains, low-fat dairy products, skinless poultry and fish, nuts, legumes, and non-tropical vegetable oils, is a good way to keep your heart healthy all throughout your life. New research says that eating such diets in young adulthood is associated with lower risks of developing cardiovascular disease in midlife.
Eat your veggies
“Earlier research was focused on single nutrients or single foods, yet there is little data about a plant-centered diet and the long-term risk of cardiovascular disease,” said Yuni Choi, Ph.D., lead author of one of the studies and a postdoctoral researcher in the division of epidemiology and community health at the University of Minnesota School of Public Health in Minneapolis.
The paper looked at the occurrence of heart disease in 4,946 adults, all of whom were enrolled in the Coronary Artery Risk Development in Young Adults (CARDIA) study. All participants were aged 18 to 30 at the time of enrollment in the study, were free of cardiovascular disease, and were also analyzed by education level (equivalent to more than high school vs. high school or less). The sample included 2,509 black adults and 2,437 white adults, and 54.9% of participants were women.
Each participant had eight follow-up exams between the enrollment period (1985-1986) and the study’s end (2015-2016), which included lab tests, physical measurements, as well as assessments of their medical histories and lifestyle factors. The participants were not instructed to change their habits in any way, such as being told to include or exclude certain items from their diets, and were not told their scores on the diet measures during the trial, so as not to influence the outcome.
The quality of each participant’s diet was scored based on the A Priori Diet Quality Score (APDQS) composed of 46 food groups at years 0, 7 and 20 of the study. The food groups were classified into beneficial (fruits, vegetables, beans, nuts, and whole grains), neutral (such as potatoes, refined grains, lean meats, and shellfish), and adverse (fried potatoes, high-fat red meat, salty snacks, pastries, and soft drinks) based on what we know of their relationship to the risk of developing cardiovascular disease. Under this methodology, higher scores were indicative of diets that more heavily revolved around nutritionally rich plant-based items.
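The classification scheme described above can be sketched as a toy scoring function: food groups are binned as beneficial, neutral, or adverse, and a diet scores higher the more it leans on beneficial groups. The group lists come from the article; the +1/0/−1 weights and the servings-based scoring are illustrative assumptions — the actual APDQS uses 46 groups with its own weighting scheme.

```python
# Illustrative weights: +1 for beneficial, 0 for neutral, -1 for adverse
# food groups, following the article's classification (not the real APDQS).
GROUP_WEIGHTS = {
    # beneficial groups
    "fruits": +1, "vegetables": +1, "beans": +1, "nuts": +1, "whole_grains": +1,
    # neutral groups
    "potatoes": 0, "refined_grains": 0, "lean_meats": 0, "shellfish": 0,
    # adverse groups
    "fried_potatoes": -1, "high_fat_red_meat": -1, "salty_snacks": -1,
    "pastries": -1, "soft_drinks": -1,
}

def diet_score(servings_by_group):
    """Weighted sum of weekly servings; higher = more plant-centered."""
    return sum(GROUP_WEIGHTS.get(group, 0) * servings
               for group, servings in servings_by_group.items())

# Two hypothetical weekly diets:
plant_centered = {"vegetables": 10, "whole_grains": 7, "beans": 4, "lean_meats": 3}
typical_western = {"refined_grains": 8, "high_fat_red_meat": 5,
                   "soft_drinks": 7, "fried_potatoes": 3, "vegetables": 3}
print(diet_score(plant_centered) > diet_score(typical_western))  # True
```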
Based on the data from this study, two papers measured how healthy plant food consumption influences cardiovascular health, in young adults or postmenopausal women. Both of these groups saw benefits, the papers report, as members of both were less likely to develop cardiovascular disease when they ate more healthy plant foods.
During the 32-year follow-up period, 289 participants developed cardiovascular disease (including heart attack, stroke, heart failure, heart-related chest pain or clogged arteries anywhere in the body). However, those who scored in the top 20% on the long-term diet quality score were 52% less likely to develop cardiovascular disease, after controlling for factors such as age, sex, education, and a host of other relevant factors. Those who improved their diet score the most between the ages of 25 and 50 were 61% less likely to develop subsequent cardiovascular disease compared to those whose quality of diet declined between the same ages.
The team notes that the study included very few vegetarians, so it could not record the effects of strict vegetarianism (which excludes all animal products, including meat, dairy, and eggs) on cardiovascular health; its findings instead reflect general dietary habits.
“A nutritionally rich, plant-centered diet is beneficial for cardiovascular health. A plant-centered diet is not necessarily vegetarian,” Choi said. “People can choose among plant foods that are as close to natural as possible, not highly processed. We think that individuals can include animal products in moderation from time to time, such as non-fried poultry, non-fried fish, eggs and low-fat dairy.”
That being said, the study is observational. In other words, it can show that certain dietary habits are correlated to certain health outcomes, but it can’t say for sure that one causes the other. Still, the findings are relevant for all of us, and it’s better to err on the side of caution. So maybe help yourself to some extra veggies and greens during your next lunch break.
The first paper “Relationship Between a Plant‐Based Dietary Portfolio and Risk of Cardiovascular Disease: Findings From the Women’s Health Initiative Prospective Cohort Study” has been published in the Journal of the American Heart Association.
The second paper “Plant‐Centered Diet and Risk of Incident Cardiovascular Disease During Young to Middle Adulthood” has been published in the Journal of the American Heart Association.
Stone age humans used to dine mainly on meat, a new study reports. It was only as megafauna (the huge animals of yore, like mammoths) died off that vegetables increasingly made their way onto the menu.
A new paper offers a fresh and interesting interpretation of how humanity made the trek from hunting to agriculture. According to the findings, ancient humans were primarily carnivores, with game meat making up an important part of their diet. But as the species they hunted died out, vegetables and plant matter made up a growing part of their diets. These extinctions likely also led to the domestication of plants and animals, as our ancestors needed to secure sources of food.
“So far, attempts to reconstruct the diet of stone-age humans were mostly based on comparisons to 20th-century hunter-gatherer societies,” explains Dr. Miki Ben-Dor of the Jacob M. Alkov Department of Archaeology at Tel Aviv University, first author of the paper.
“This comparison is futile, however, because two million years ago hunter-gatherer societies could hunt and consume elephants and other large animals — while today’s hunter-gatherers do not have access to such bounty. The entire ecosystem has changed, and conditions cannot be compared. We decided to use other methods to reconstruct the diet of stone-age humans: to examine the memory preserved in our own bodies, our metabolism, genetics, and physical build. Human behavior changes rapidly, but evolution is slow. The body remembers.”
The team trawled through almost 400 scientific papers from various disciplines, trying to determine whether stone-age humans were carnivores or omnivores. They collected around 25 lines of evidence, mostly from papers dealing with genetics, metabolism, physiology, and morphology, that can help us determine this.
One of the tidbits cited by the team includes the acidity of the human stomach. This is “high when compared to omnivores and even to other predators”, they explain, which means our bodies have to spend extra energy to keep them so. But it also provides some protection from bacteria often found in meat, suggesting that this was an adaptation meant to help our ancestors eat meat. Ancient peoples hunted large animals whose meat would feed the group for days or weeks, meaning they often ate old meat laden with bacteria.
Another clue they list is the way our bodies store fat. Omnivores, they explain, tend to store fat in a relatively small number of large cells. Predators do it the other way around — humans also share this latter approach of using a large number of relatively small cells. A comparison with chimpanzees also shows that areas of our genetic code are inactivated to specialize us for a fat-rich diet (in chimps, these changes support a sugar-rich diet).
Archeological evidence also supports the meat-eating hypothesis. Isotope ratio studies on the bones of ancient humans, alongside evidence of how they hunted, suggest our ancestors specialized in hunting large or medium-sized animals that had a lot of fat. Large social predators today also hunt large animals and get over 70% of their energy from animal sources, the team writes, and this parallel suggests that early human groups acted a lot like hypercarnivores.
“Hunting large animals is not an afternoon hobby,” says Dr. Ben-Dor. “It requires a great deal of knowledge, and lions and hyenas attain these abilities after long years of learning. Clearly, the remains of large animals found in countless archaeological sites are the result of humans’ high expertise as hunters of large animals.”
“Many researchers who study the extinction of the large animals agree that hunting by humans played a major role in this extinction — and there is no better proof of humans’ specialization in hunting large animals. Most probably, like in current-day predators, hunting itself was a focal human activity throughout most of human evolution. Other archaeological evidence — like the fact that specialized tools for obtaining and processing vegetable foods only appeared in the later stages of human evolution — also supports the centrality of large animals in the human diet, throughout most of human history.”
The findings go against the grain of our previous hypotheses on how humans evolved. Previously, it was assumed that humans’ dietary flexibility allowed them to adapt to a wide range of situations and environments, giving them an evolutionary edge; but the current findings suggest that we evolved largely as predators instead. That’s not to say that they ate only meat — there is well-documented evidence of plant-eating during this time — but plants only gained a central place in their diets in the latter days of the stone age.
Stone tools specialized for processing plants started appearing around 85,000 years ago in Africa and about 40,000 years ago in Europe and Asia, the team adds, suggesting plants were increasingly being eaten. The researchers also explain that such tools show an increase in local uniqueness over time, a process similar to that seen in 20th-century hunter-gatherer societies. In contrast, during the time when the team believes humans acted more like apex predators, stone tools maintained very high degrees of similarity and continuity regardless of local ecological conditions.
“Our study addresses a very great current controversy — both scientific and non-scientific. It is hard to convince a devout vegetarian that his/her ancestors were not vegetarians, and people tend to confuse personal beliefs with scientific reality,” adds Prof. Ran Barkai, also of the Jacob M. Alkov Department of Archaeology at Tel Aviv University, and a co-author of the paper.
“Our study is both multidisciplinary and interdisciplinary. We propose a picture that is unprecedented in its inclusiveness and breadth, which clearly shows that humans were initially apex predators, who specialized in hunting large animals. As Darwin discovered, the adaptation of species to obtaining and digesting their food is the main source of evolutionary changes, and thus the claim that humans were apex predators throughout most of their development may provide a broad basis for fundamental insights on the biological and cultural evolution of humans.”
The paper “The evolution of the human trophic level during the Pleistocene” has been published in the American Journal of Physical Anthropology.
We often hear how we’re living in a more interconnected world than ever before — and that is true. But people have never lived in complete isolation from others. New research comes to support this view, by showing that long-distance trade in food and spices was already taking place between Asia and the Mediterranean region over 3000 years ago.
Spices such as turmeric and foods including bananas were known and present in the Mediterranean region during the Bronze Age, the paper explains, much earlier than previously assumed. The authors further note that such plants were not endemic to the Mediterranean, so the only way they could get there was via long-distance trade.
“Exotic spices, fruits, and oils from Asia had reached the Mediterranean several centuries, in some cases even millennia, earlier than had been previously thought,” says Philipp Stockhammer from LMU, who led the research. “This is the earliest direct evidence to date of turmeric, banana, and soy outside of South and East Asia.”
The international team of researchers analyzed the tartar (dental deposits) on the teeth of 16 people unearthed in excavations at the Megiddo and Tel Erani sites in modern-day Israel. This area mediated any ancient travel and trade between the Mediterranean, Asia, and Egypt. If you wanted to travel between these places in the 2nd millennium BCE, you had to go through the Levant.
What the researchers were looking for was food residue, such as proteins or plant microfossils, that remained preserved in the dental plaque over the last thousands of years. From there, they hoped, they could reconstruct the local diet.
“This enables us to find traces of what a person ate,” says Stockhammer. “Anyone who does not practice good dental hygiene will still be telling us archaeologists what they have been eating thousands of years from now!”
The techniques they used fall under the domain of paleoproteomics, a relatively new field of science concerned with the study of ancient proteins. The team managed to identify both “ancient proteins and plant residues” from the teeth, revealing that their owners had consumed foods brought from faraway lands.
It was quite surprising for the team as well. Such techniques are difficult to use, they explain, because you have to piece together what food people ate judging solely from the proteins they contained. The proteins themselves must also survive for thousands of years until analyzed, so there’s also quite a lot of luck required to pull it off.
The team confirmed the presence of sesame in local diets at the time (sesame is not endemic to the Levant), suggesting that it had become a staple food here by the 2nd millennium BCE. The teeth of one individual from Megiddo showed turmeric and soy proteins, while one individual from Tel Erani showed traces of banana proteins — all of them likely entering the area from South Asia.
“Our analyses thus provide crucial information on the spread of the banana around the world. No archaeological or written evidence had previously suggested such an early spread into the Mediterranean region,” says Stockhammer. “I find it spectacular that food was exchanged over long distances at such an early point in history.”
Naturally, the team can’t rule out that this individual traveled or lived in South Asia for a period of time, consuming local foodstuffs during this time. They also can’t estimate the scale of any trades going on, only find evidence that such networks probably existed.
Still, the findings showcase how early long-distance trade began, and they go to show that people have been living in and building an interconnected world for a very long time now. While definitely interesting and important from an academic point of view, such results also help to put our current social dialogues around globalization, trade, and immigration into perspective.
The paper “Exotic foods reveal contact between South Asia and the Near East during the second millennium BCE” has been published in the journal PNAS.
The food industry could be actively working against public health by influencing the results of studies in their favor.
New research reports that around 13.4% of the nutrition studies it analyzed disclosed ties to the food industry. Studies in which the industry was involved were more likely to produce results that were favorable to its interests, the team adds, raising questions about the merits of these findings.
“This study found that the food industry is commonly involved in published research from leading nutrition journals. Where the food industry is involved, research findings are nearly six times more likely to be favourable to their interests than when there is no food industry involvement,” the authors note.
It’s not uncommon for industry to become involved with research — after all, they have a direct stake in furthering knowledge in their field of activity. This can range from offering funding to assigning employees to research teams for support or active research.
The current paper shows that, at least in the food industry, such involvement is actively skewing and biasing nutrition research. It is possible, the team reports, that this can put public health at risk, as corporate interests can start dictating what findings see the light of day, where, and in what form. Such findings are worrying since corporations are notorious for putting profits above anything else, including truth or the common good.
In order to get a better idea of just how extensive the influence of industry is in food-related research, the team — led by Gary Sacks of Deakin University in Melbourne, Australia — analyzed all papers published in the top 10 peer-reviewed academic journals related to diet or nutrition. They looked at which had ties to the industry such as funding from food companies or affiliated organizations, and then whether or not the authors went out of their way to support industry interests.
Roughly 13.4% of the articles had some level of industry involvement, with some journals bearing more of the blame than others. The authors explain that studies with industry involvement were over five times more likely to favor industry interests compared to a random sample of studies without involvement (55.6% vs 9.7% for the latter).
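The “over five times more likely” figure follows from the two proportions reported above. Taking the simple ratio of proportions is a rough check — the paper’s own comparison may use an adjusted measure — but it lands where the authors say it does.

```python
# Proportions reported in the article: studies favorable to industry
# interests, with and without food-industry involvement.
favorable_with_industry = 0.556
favorable_without_industry = 0.097

# Simple ratio of proportions (a rough, unadjusted check).
ratio = favorable_with_industry / favorable_without_industry
print(round(ratio, 2))  # 5.73 -- "nearly six times more likely"
```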
Such figures offer a pretty big warning sign that industry involvement could promote research bias or help push an agenda at the expense of quality science (such as the neglect of topics that are important for public health but go against industrial interests). The authors suggest several mechanisms that could be employed to preserve the quality of nutrition research.
The paper “The characteristics and extent of food industry involvement in peer-reviewed research articles from 10 leading nutrition-related journals in 2018” has been published in the journal PLOS One.
Removing all meat from the human diet to protect the environment isn’t a workable solution outside rich countries, a new paper reports.
Calls to remove all meat from our diets to limit CO2 emissions are only realistic in rich, industrialized regions. In low- or middle-income countries, livestock can represent a critical source of income and food, the paper argues, making such changes practically impossible for locals.
Let’s meat halfway
“Conclusions drawn in widely publicized reports argue that a main solution to the climate and human health crisis globally is to eat no or little meat but they are biased towards industrialized, Western systems,” said Birthe Paul, the lead author and environmental scientist at the Alliance of Bioversity International and the International Center for Tropical Agriculture (CIAT).
Animal-sourced foodstuffs such as meat and dairy are a much heavier burden on the environment than plant-sourced items. As such, many governments and organizations around the world are urging citizens to reduce their intake of the former and include more of the latter. As a bonus, plant-based items tend to be healthier, too.
But we should not delude ourselves into thinking this is all it will take to address climate change. For many people, such a shift is simply impossible without a massive blow to their and their families’ financial and food security. Livestock are extremely important sources of food and repositories of value for people in low- and middle-income countries. Asking them to give up animal products is asking them to shoot themselves in the foot, the team argues.
Of all the scientific literature published on livestock since 1945, only 13% covers Africa, they note — yet Africa houses around 20%, 27%, and 32% of global cattle, sheep, and goat populations, respectively. Although livestock is a key pillar of local economies in Africa, eight of the world’s top 10 institutes publishing livestock research are based overseas. Only two, including the International Livestock Research Institute (ILRI), are headquartered in Africa.
The authors argue that this has biased livestock research. As Western nations focus more and more on climate change, they’re driven to understand the effects the livestock industry has on the climate. This leaves out a lot of the picture, they add, including the positive role such animals can play, both from an environmental and a socio-economic point of view. It also glosses over a huge difference — animals in Africa are rarely reared the same way they are in highly-industrialized nations.
“Mixed systems in low- and middle-income countries, where animal production is fully linked with crop production, can actually be more environmentally sustainable,” said An Notenbaert, from the Alliance of Bioversity International, co-author of the paper.
“In sub-Saharan Africa, manure is a nutrient resource which maintains soil health and crop productivity; while in Europe, huge amounts of manure made available through industrialized livestock production are overfertilizing agricultural land and causing environmental problems.”
A common approach in African savannas is to keep herds in pens at night, which has been shown to increase the levels of nutrients available in the whole ecosystem, the authors argue. Feed is also produced more locally and in a more sustainable fashion, whereas industrialized nations import most of their feed (which means more fuel and infrastructure is needed to transport it). Such imports are also a driver of ecological damage — the authors note that soybeans grown for export as animal feed to Vietnam and Europe are a leading cause of deforestation in the Amazon.
While livestock are an important source of greenhouse gases, we simply don’t have the data needed to establish national mitigation strategies in this regard. The authors also urge that we look beyond making animals more productive and instead examine how we can be more resource-efficient and what systems can be put in place to limit emissions.
“Meat production itself is not the problem. Like any food, when it is mass-produced, intensified and commercialized, the impact on our environment is multiplied,” said Polly Ericksen, Program Leader of Sustainable Livestock Systems at the International Livestock Research Institute and co-author of the paper.
“Eliminating meat from our diet is not going to solve that problem. While advocating a lower-meat diet makes sense in industrialized systems, the solution is not a blanket climate solution, and does not apply everywhere.”
Meat consumption in sub-Saharan Africa is also much lower than in developed countries. The paper cites estimates from the Food and Agriculture Organization, according to which average yearly meat consumption per capita in the region will be roughly 13 kg by 2028; in the US, this figure is expected to reach 100 kg in the same timeframe.
The authors point to a range of higher-impact environmental solutions. Among them is improved animal feed, so that animals emit fewer greenhouse gases, such as methane, per kilogram of milk or meat. Better land management and approaches such as using manure and crop byproducts as fertilizer (by plowing them into the soil) would also have a significant positive impact on farm output as well as the environment.
The paper “Sustainable livestock development in low and middle income countries – shedding light on evidence-based solutions” has been published in the journal Environmental Research Letters.
As plant-based diets are becoming mainstream across the world, tofu is gaining momentum as a healthy and versatile food option.
It might look bland or even intimidating at first, but this protein-rich food is actually easy to cook and it can be very tasty thanks to its ability to take on the flavors of anything you are cooking it with — all while providing you with plenty of important nutrients.
Tofu is essentially a food produced from condensed soy milk pressed into solid white blocks, a process somewhat similar to making cheese (although, contrary to popular belief, tofu is not meant to be a cheese replacement). It originated in China and has been a staple of Asian cuisines for thousands of years; it is now becoming popular in Western cuisine, particularly among those looking for replacements for animal protein.
It is believed that tofu was discovered by a Chinese chef more than 2,000 years ago when he accidentally mixed fresh soy milk with nigari — the liquid or powder that remains when salt is extracted from seawater. Nigari is a coagulant rich with minerals that help tofu solidify and keep its form.
Tofu can be purchased in bulk or individual packages. It can also be found dehydrated, freeze-dried, jarred, or canned — its versatility being one of the main reasons why it has become a favorite of many.
It’s a cheap way to include plant-based protein in a diet, usually costing less than $2 for a two to four serving block. It can be made at home if you really know what you’re doing, but you’re probably better off with the off-the-shelf options.
How do I eat tofu?
First thing you need to know: on its own, tofu is pretty bland and flavorless. This is exactly why so many people are put off by it. But just because it’s bland on its own doesn’t mean it’s bland when cooked. Tofu is a flavor magnet: it will soak up the flavor of anything you cook it with. This makes tofu really versatile, whatever flavors you prefer. Tofu can also be steamed, grilled, baked, pan-cooked, and even fried, especially in the air fryer — which again, makes it all the more versatile. Some people even prefer to freeze it to give it a more exquisite, meat-like texture.
Tofu’s high water content makes it necessary to first drain and press it to remove the excess liquid. You can simply use dish towels and cookbooks to press the water out. Otherwise, it won’t absorb flavors well or develop a firm texture when you cook it.
After you have pressed it, cut the tofu into whatever shape and size you desire before you start cooking, such as slices, cubes, or slabs. Tofu will absorb whatever sauce, marinade, or spices you add, so there’s no need to let it sit for too long while cooking.
You can find tofu in the refrigerated section at the supermarket, either raw or pre-baked and seasoned. There are several types of tofu available, including silken, soft, firm, and extra-firm. Silken is sometimes used for things like omelets, or even smoothies and desserts; soft is great for soups and stews; while firm and extra-firm are used for baking and frying at high temperatures.
Tofu needs to be stored in the refrigerator. Unopened packs remain good for five to seven days after the “sell by” date listed on the package. Freezing is also an option, lasting up to six months. But before you do that, better drain the excess liquid and wrap it in a freezer bag.
How healthy is tofu?
Overall, tofu has a lot of protein and contains all the essential amino acids that the body needs. But that’s not all. Tofu is an excellent food from a nutritional and health perspective, as it provides a wide array of vitamins and minerals, fats, and carbs. Recent studies have consistently found that sources of plant protein such as tofu are linked with better health and increased longevity.
Soybeans used to produce tofu contain natural plant compounds called isoflavones, which can attach to and activate estrogen receptors in the body. Studies have shown that people who consume large amounts of isoflavones have lower blood pressure and better blood flow in the arteries.
Depending on which type of tofu you end up buying, it may also be fortified with vitamins or minerals, such as calcium, vitamin D, or vitamin B12. These are nutrients vegetarians and vegans often don’t get enough of, and they are useful in any balanced diet.
Tofu is made from soybeans, and most of the soybeans grown in the US are genetically modified (GMO), which some see as controversial. Research has so far not found GMOs to be harmful to human health, but the evidence is not always conclusive, so if you want to be extra safe, you have the option of buying non-GMO or organic tofu brands.
Can tofu reduce heart disease risk?
There aren’t that many studies yet that have looked at the effect of tofu on heart health. But research has shown that high consumption of legumes such as soy can lead to a lower rate of heart disease. Tofu is also low in saturated fat, which makes it a good choice for the heart.
The previously mentioned isoflavones in soybeans reduce blood vessel inflammation and improve vessel elasticity, which is good news for the heart. One study found that a dose of 80 mg of isoflavones per day for 12 weeks improved blood flow by almost 70% in people at risk of stroke.
At the same time, a study in postmenopausal women found that a high intake of soy isoflavones can lead to several heart-protective factors, such as improvements in waist circumference and good HDL cholesterol. Tofu also contains saponins, which are thought to have protective effects on heart health.
Can tofu reduce risk of cancer and diabetes?
Several studies have looked at the effects of tofu on different types of cancer. Research showed that women who eat soy products at least once a week have, on average, a 50% lower risk of breast cancer, likely thanks to isoflavones. Exposure to soy during childhood and adolescence is believed to be most protective. Higher intakes of tofu have also been linked to an up to 61% lower risk of stomach cancer in men and women. A review of several studies recently linked higher soy intake to a 7% lower risk of cancers of the digestive system. A lower risk of prostate cancer was also associated with higher soy consumption.
Soy isoflavones were also found to improve blood sugar control. A study on postmenopausal women found that consumption of 100 mg of isoflavones per day lowered blood sugar levels by 15% and insulin levels by 23%. Another study showed that taking isoflavones every day for a year improved insulin sensitivity.
Much of the health effects of tofu boil down to animal protein vs plant protein. Studies have consistently found that plant protein is typically healthy in a number of ways. Here’s what a recent review found:
“Higher intake of total protein was associated with a lower risk of all cause mortality, and intake of plant protein was associated with a lower risk of all cause and cardiovascular disease mortality. Replacement of foods high in animal protein with plant protein sources could be associated with longevity.”
Can tofu also cause problems?
It’s generally considered safe to eat tofu and other soy foods every day. Nevertheless, you might want to moderate your intake if you have estrogen-sensitive breast tumors (due to tofu’s weak hormonal effects) or poor thyroid function (because of tofu’s goitrogen content).
A recent report by the European Food Safety Authority (EFSA) found that soy and soy isoflavones pose no concerns for thyroid function or breast and uterine cancers. Nevertheless, if you have any concerns about eating tofu or making changes to your diet, it’s always better to discuss them with your doctor.
Drilled down to its essence, the ketogenic diet (keto for short) is low on carbs and high in fats and proteins. Because there is little blood sugar from food circulating in the bloodstream, the body will start burning stored fats for energy. This makes it an extremely efficient diet for weight loss, helping people lose many kilograms in a very short time span.
Keto is a very restrictive diet, which is why many fail to follow through, giving up before they can reap the full weight-loss benefits. Many also consume too much protein or too many poor-quality fats from processed foods, sometimes both. The carb restrictions also mean keto followers eat very few fruits and vegetables, or none at all.
Obviously, this can be a problem as studies have shown time and time again that a diet rich in vegetables and fruits can lower blood pressure, reduce the risk of heart disease and stroke, prevent some types of cancer, lower risk of eye and digestive problems, and have a positive effect upon blood sugar, which can help keep appetite in check.
In short, keto can be an extremely effective weight loss diet but it could also cause some health problems if it is followed improperly. Additionally, some patients with underlying health conditions, such as kidney disease, should stay away from it as the diet might worsen their condition. Another thing to keep in mind is that, like with any fad diet, many end up regaining the weight they quickly lost soon after they get off the keto bandwagon.
How does keto work?
The body uses sugars to generate energy for cellular processes. But if a person’s carbohydrate intake is very low or non-existent, the body will start breaking down stored fats into molecules called ketones. The resulting metabolic state is called ketosis, hence the name of the diet, and it typically starts 4-5 days after a person begins eating fewer than 20-50 grams of carbs per day. At this point, cells will use ketones for energy until the person starts eating carbs again.
“The Keto diet is an extremely low-carbohydrate diet (minimal carbohydrates), high protein/high fat diet. It is similar to the Atkin’s or other low-carb diets out there, but the difference is, if done CORRECTLY, you put your body into a state of Ketosis, meaning, that instead of using the preferred substrate for muscles and brain cells (glucose), your body is using primarily ketone-bodies which derive from fat molecules. This type of diet basically is teaching your body to burn fat and use it as fuel instead of glucose,” Dana Ellis Hunnes, Adjunct Assistant Professor at the Department of Community Health Sciences at UCLA, told ZME Science.
It’s not yet entirely clear why ketosis causes weight loss. One hypothesis is that ketosis suppresses appetite and may affect hormones like insulin that regulate hunger and metabolism. Since fats and proteins are more satiating, people may feel fuller on a lower calorie diet.
”In the short-term consequences may include less insulin response to meals since you are eating minimal carbohydrate which may be good for blood glucose levels and diabetes. However, with this in mind, many fibers come from carbohydrate-rich foods and without those in the diet, it may be more difficult to eliminate (have bowel movements) regularly. long-term consequences include potential increased risk for heart disease since it is such a high-protein, high-fat diet (though, this may be reduced (the risk) if healthy plant-based fats are the primary sources of fats instead of saturated/animal-based fats). There is a possibility of weight loss, but to me, the risk to vascular system is not worth it,” Hunnes said.
Types of keto diet
Although keto is a fad diet that has risen to prominence in the last decade, it is not by any means new. The keto diet was first designed in the 1920s by doctors looking to treat epilepsy. For reasons not entirely understood even to this day, fueling cells with ketones instead of glucose reduces the number of seizures experienced. Although there are now anti-seizure medications, some patients who don’t respond to treatment can still reap benefits by going on keto.
But since then, keto has been recognized for its weight loss potential. For instance, in the 1970s, Dr. Atkins popularized his famous low-carb weight-loss diet, which starts with a two-week ketogenic phase.
Like all diets, there is no single plan and every individual is different. However, any plan that calls for eating fewer than 50 grams of carbs a day can be called keto.
Some versions of the ketogenic diet include:
Standard ketogenic diet: 75% fat, 20% protein, and 5% carbs
Cyclical ketogenic diet: intermittent ketosis coupled with periods of higher-carb intake. Example: 5 ketogenic days followed by 2 high-carb days.
Targeted ketogenic diet: standard keto with added carbs around workouts.
High-protein ketogenic diet: 60% fat, 35% protein, and 5% carbs.
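To turn one of these splits into daily targets, note that the percentages apply to calories, not weight, so they can be converted into grams using the standard calorie densities of each macronutrient (roughly 9 kcal/g for fat, 4 kcal/g for protein and carbs). A minimal sketch, assuming a hypothetical 2,000-kcal daily budget:

```python
# Approximate calorie density of each macronutrient (kcal per gram)
KCAL_PER_GRAM = {"fat": 9, "protein": 4, "carbs": 4}

def macro_grams(daily_kcal, split):
    """Convert a calorie split (fractions summing to 1) into grams per day."""
    return {macro: round(daily_kcal * fraction / KCAL_PER_GRAM[macro], 1)
            for macro, fraction in split.items()}

# Standard ketogenic split from the list above: 75% fat, 20% protein, 5% carbs
standard_keto = {"fat": 0.75, "protein": 0.20, "carbs": 0.05}
print(macro_grams(2000, standard_keto))
# → {'fat': 166.7, 'protein': 100.0, 'carbs': 25.0}
```

Note that the resulting 25 g of carbs sits well inside the under-50-gram range typically cited as the threshold for ketosis.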
What are the benefits of keto?
We have nearly 100 years of evidence that ketosis reduces the frequency of seizures, sometimes on par with modern medication. Due to its neuroprotective effects, keto may help with other brain disorders such as Parkinson’s, Alzheimer’s, multiple sclerosis, sleep disorders, and autism. However, studies supporting the use of ketosis to treat such conditions are limited or lacking at the moment.
However, the main use and benefit of keto is tied to weight loss. For patients with type 2 diabetes, for which being overweight or obese can be life-threatening, keto and other low-carb diets might prove particularly advantageous.
One small study found that 7 of 21 overweight participants with type 2 diabetes were able to stop using medication after they consumed less than 20 grams of carbohydrates per day over the course of 16 weeks. Because the ketogenic diet “can be very effective at lowering blood glucose, patients on diabetes medication who use this diet should be under close medical supervision or capable of adjusting their medication,” the researchers wrote.
In yet another study on 49 volunteers with obesity and type 2 diabetes, researchers found that 95% of those on the ketogenic diet were able to stop or reduce diabetes medication compared to 62% on a low-glycemic, reduced-calorie diet (500 kcal/day deficit).
However, things can get complicated. A 1991 study on 12 Pima Indians and 12 Caucasians, all nondiabetic, found that a high-fat modern diet (50% fat and 30% carbohydrate) was associated with a decrease in glucose tolerance and higher cholesterol. In other words, a high-fat diet might actually put people at risk of developing diabetes.
There’s also something to be said about the quality of food. Replacing carbs with animal fats and proteins can be tricky to pull off in a healthy and sustainable manner. Most people turn to beef, pork, lamb, chicken, and cheese, which are associated with increased mortality and inflammation. Using plant-derived fats and proteins instead would be less risky — though this option is hardly accessible to most people who are already on a very restrictive diet.
“I do not advocate a keto diet for pretty much anyone except those who the keto diet was originally designed for – patients who have severe epilepsy/seizure disorders. Most people who go on a “keto” diet are not doing it correctly, meaning they are eating too many carbohydrates to really get into a state of ketosis which can be measured in the urine or blood. Moreover, if you lose weight too quickly as can happen with this type of diet, again, especially if not doing it correctly, you can lose significant amounts of muscle as your body feels it is in starvation. micronutrient deficiencies are also possible, again because of the low-carb (even from produce) diet,” Hunnes said.
“My go-to diet of choice is a whole-foods, plant-based diet. Studies time and again demonstrate a whole-foods plant-based diet can reduce the risk of a myriad of chronic diseases (cancers, stroke, heart disease, obesity, diabetes) and is full of fiber, water from the foods themselves, micronutrients (vitamins and minerals), potassium, and sufficient in protein. It is also far better for the environment than a keto diet high in animal products would be as it requires less water, less land, and produces far less greenhouse gases,” she added.
Is keto safe?
Not much is known about the long-term effects of keto specifically. However, studies suggest that low-carb diets reduce lifespan compared to diets with a moderate carbohydrate intake.
A 2018 study that followed 15,428 American adults aged 45-64 years from 1987 until 2012 correlated health outcomes with diet. According to the results, over a 25-year period, people who had a moderate carbohydrate intake (50-55% of daily calories) had an average life expectancy of 83 years — that’s four years longer than those with low carb intake (40%), who lived only 79 years on average. Participants with high carb intake (more than 70% of daily calories) had an average life expectancy of 82 years, slightly lower than the moderate carbs intake group.
Another long-term study published in The Lancet by researchers at Brigham and Women’s Hospital in Boston looked at the dieting habits of a staggering 432,000 people in more than 20 countries. The results suggest that those who consumed a moderate amount of carbs (around half of their daily calories) lived the longest. Conversely, those who had a diet of more than 70% carbs or less than 40% carbs were more likely to die earlier than those in the moderate carb-intake group.
Previous studies showed that low carbohydrate diets are beneficial for short-term weight loss and reduce cardiometabolic risk. In the long-term, however, low-carb diets seem to shave off 4-5 years of lifespan compared to moderate-carb intake.
“I would really only recommend a keto diet under the advisement of a dietitian who specializes in a ketogenic diet (and those dietitians typically work at major medical centers with patients who have severe epilepsy and seizures). This type of diet is ripe for increasing the risk of micronutrient deficiencies in people who eat it without proper counseling. It is also a recipe for increasing the risk of certain cardiovascular diseases. Even in people who have severe epilepsy/seizures, this type of diet is generally followed for 6 months to a year maximum and then regular foods are re-integrated. Those would be my recommendations,” Hunnes concluded.
Bottom line: keto can help overweight people shed weight fast; however, you risk reverting to your previous weight just as quickly after getting off the diet. Keto is often called a “yo-yo diet” for good reason.
Patients suffering from seizures and type 2 diabetes might reap additional benefits from keto, particularly thanks to improved blood sugar control. However, the long-term effects of keto are unknown and even the most die-hard fans of keto agree that people shouldn’t stay on a ketogenic diet more than a couple of months per year.
It may sound cliche, but like most things in life, the key to a healthy life is moderation. Consuming around 50% carbohydrates relative to daily calories seems to hit the sweet spot.
A balanced, unprocessed diet, rich in fruits and vegetables, lean meats, fish, whole grains, nuts, seeds, olive oil, and lots of water seems to show the best evidence for a longer and healthier life.
New research is shedding light onto the social and agricultural customs of early Bronze Age societies.
The El Argar society is known from a site in the south-eastern corner of the Iberian Peninsula (today’s Spain). It is believed, however, to have held cultural and political sway over a larger area in its day, from 2200-1550 cal BCE. It also developed sophisticated pottery and ceramics, which it traded with other groups in the Mediterranean region.
New research based on El Argar gravesites and the layouts of their settlements reports that it was likely a strongly hierarchical society that revolved around complex, “monumentally fortified” hilltop settlements. The findings showcase the potential of including trophic (food) analysis in anthropology, and help reveal the complexity that societies of this period could achieve.
Farming for success
“It is essential to not only investigate human remains, but also comparative samples of different former food stuffs as well as to interpret the data in the light of the archaeological and social historical context,” explains Dr. Corina Knipper from the Curt Engelhorn Center Archaeometry, the paper’s lead author.
The team used stable carbon and nitrogen isotope analysis on remains recovered from two El Argar hilltop settlements: a large fortified urban site (La Bastida, in today’s Murcia region) and a smaller settlement at Gatas (in today’s Almería region). The samples analyzed include remains from 75 different individuals across all social levels, 28 domestic animal and wild deer bones, 75 grains of charred barley, and 29 grains of charred wheat. All the samples hail from the middle to late El Argar period.
The findings showed no significant difference in isotope values between males and females, indicating that both sexes shared similar diets. However, the team did find differences between social strata — remains of individuals from La Bastida’s elite showed higher levels of both carbon and nitrogen isotopes than their peers. This could indicate that these individuals ate more animal-based products (nitrogen becomes more concentrated the further up the food chain you go). However, the team also reported that, while nitrogen values for barley were similar at both sites, domestic animals at La Bastida showed higher nitrogen values — meaning the same general diet at both sites could still have produced the different nitrogen levels seen.
The latter view is further strengthened by the finding that these communities relied heavily on cereal farming, which they only supplemented with livestock. Analysis of the wheat and barley suggests that the landscapes they grew in were dry and unirrigated, but likely fertilized with animal manure, judging from the grains’ high nitrogen levels. Cereals and their by-products also seem to have provided most of the forage for domesticated animals (sheep, goats, cattle, and pigs).
The study is based on a small sample size, which limits the reliability of the results. However, it does highlight the role trophic chain analysis plays in helping archeologists piece together the past from human remains. It also goes a long way toward showing that El Argar farmers had developed relatively sophisticated practices for their time, which allowed them to feed a thriving community.
The paper “Reconstructing Bronze Age diets and farming strategies at the early Bronze Age sites of La Bastida and Gatas (southeast Iberia) using stable isotope analysis” has been published in the journal PLOS One.
Healthier diets could save the US around $50 billion in healthcare costs annually, according to a new study.
Unhealthy diets are a leading cause of poor health, as they promote the development of cardiometabolic diseases (CMDs) such as heart disease, stroke, and type 2 diabetes. A new study led by Brigham and Women’s Hospital researchers estimates that unhealthy diets account for 45% of all CMD-related deaths in the US, creating a healthcare burden of around $50 billion nationally.
Fooding the bill
“There is a lot to be gained in terms of reducing risk and cost associated with heart disease, stroke, and diabetes by making relatively simple changes to one’s diet,” said corresponding author Thomas Gaziano, MD, MSc, of the Division of Cardiovascular Medicine at the Brigham. “Our study indicates that the foods we purchase at the grocery store can have a big impact. I was surprised to see a reduction of as much as 20 percent of the costs associated with these cardiometabolic diseases.”
In collaboration with researchers at the Friedman School of Nutrition Science and Policy at Tufts University, the team looked at the impact of 10 dietary factors — fruits, vegetables, nuts/seeds, whole grains, unprocessed red meat, processed meat, sugar-sweetened beverages, polyunsaturated fats, seafood omega-3 fats, and sodium — on annual CMD-related health costs.
To this end, they used data from the National Health and Nutrition Examination Survey (NHANES) to create a representative U.S. population sample of individuals aged between 35 and 85. Then, using a model they developed, the team analyzed how individual CMD risk shifts based on the dietary patterns of NHANES respondents. Finally, they calculated what the overall CMD-related costs would be if everyone followed an optimal diet with respect to the 10 factors.
They conclude that suboptimal diets cost around $301 per person per year, for a total of over $50 billion nationally. The team explains that this sum represents 18% of all heart disease, stroke and type 2 diabetes costs in the United States. Costs were highest for those with Medicare ($481/person) and those who were eligible for both Medicare and Medicaid ($536/person).
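As a rough sanity check, the per-person and national figures imply a modeled population on the order of 166 million adults; a back-of-the-envelope sketch (the derived population size is an inference, not a number from the study):

```python
# Back-of-the-envelope consistency check of the reported figures.
# Both inputs are from the study; the implied population is an inference.
cost_per_person = 301    # USD per person per year, suboptimal diet
national_total = 50e9    # USD per year, nationwide

implied_people = national_total / cost_per_person
print(f"~{implied_people / 1e6:.0f} million people")  # → ~166 million people
```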
High consumption of processed meats and low consumption of nuts, seeds, and omega-3-rich foods (such as seafood) were the biggest drivers of CMD risk and additional costs, the team explains.
“We have accumulating evidence […] to support policy changes focused on improving health at a population level. One driver for those changes is identifying the exorbitant economic burden associated with chronic disease caused by our poor diets,” said co-senior author Renata Micha of the Friedman School of Nutrition Science and Policy at Tufts.
“This study provides additional evidence that those costs are unacceptable. While individuals can and do make changes, we need innovative new solutions — incorporating policy makers, the agricultural and food industry, healthcare organizations, and advocacy/non-profit organizations — to implement changes to improve the health of all Americans.”
The results of this study may underestimate the total cost of unhealthy diets, the team explains, as such diets can contribute to other health complications besides CMDs. Additionally, factors beyond the 10 used in this study could drive health risks and costs, they add. Finally, the NHANES study relied on self-reported data — participants were asked to recall what they ate in the past 24 hours — which isn’t very reliable.
The paper “Cardiometabolic disease costs associated with suboptimal diet in the United States: A cost analysis based on a microsimulation model” has been published in the journal PLOS Medicine.
Many low- to middle-income countries struggle with issues of undernutrition. Around a third of them, however, are faced with a very unusual problem: undernutrition and obesity at the same time.
Obesity and undernutrition have become increasingly connected in recent decades, a new paper reports. It explains that modern food systems are negatively impacting the health of poorer countries around the world, with the poorest being particularly affected. The authors also look at the causes, context, and possible solutions to this issue.
The faults in our food
“We are facing a new nutrition reality where major food system changes have led the poorest countries to have high levels of overweight and obesity along with undernutrition,” says Barry M. Popkin, lead author of the paper and W.R. Kenan Jr. Distinguished Professor of Nutrition at the University of North Carolina.
“Our research shows that overweight and obesity levels of at least 20% among adults are found in all low-income countries. Furthermore, the double burden of high levels of both undernutrition and overweight occurs primarily in the lowest-income countries — a reality that is driven by the modern food system. This system has a global reach and is preventing low- and even moderate-income countries and households from consuming safe, affordable, and healthy diets in a sustainable way.”
Global estimates place the total number of obese children and adults in the world at some 2.3 billion, the paper explains. It’s just one half of the issue known as the double burden of malnutrition — the other being undernutrition, a deficiency of calories or (in this context) essential nutrients.
For the study, the team used survey data from low- and middle-income countries in the 1990s and 2010s to estimate which of them were experiencing the double burden of malnutrition. A country was considered to face this double burden if over 15% of its population showed wasting, over 30% showed stunting, over 20% of its women showed thinness, and over 20% of its citizens overall were overweight.
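As a rough sketch, the classification rule described above can be written as a simple check (a toy illustration; the function and argument names are our own, not the paper's):

```python
def has_double_burden(wasting, stunting, women_thinness, overweight):
    """Check a country against the double-burden criteria described above.

    All arguments are prevalence percentages:
    - wasting: share of the population with wasting (threshold: over 15%)
    - stunting: share of the population with stunting (threshold: over 30%)
    - women_thinness: share of women showing thinness (threshold: over 20%)
    - overweight: share of all citizens who are overweight (threshold: over 20%)
    """
    undernutrition = wasting > 15 and stunting > 30 and women_thinness > 20
    return undernutrition and overweight > 20
```

A country clearing all four thresholds counts as experiencing both forms of malnutrition at once.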
Over a third of low- and middle-income countries satisfy this condition — 45 of 123 countries in the 1990s and 48 of 126 countries in the 2010s — meaning they’re experiencing both forms of malnutrition. It was most commonly seen in sub-Saharan Africa, East Asia and the Pacific, and South Asia where 29, 9, and 7 countries were affected, respectively. In the 2010s, 14 more countries (with some of the lowest incomes in the world) had started to experience this double burden of malnutrition compared to the 1990s.
In comparison, low- and middle-income countries that enjoy the highest incomes in the category were much less likely to experience this issue, the team adds. In their view, this is indicative of a growing number of overweight people in the poorest countries even as large segments of the population face stunting, wasting, and thinness.
“Emerging malnutrition issues are a stark indicator of the people who are not protected from the factors that drive poor diets,” Popkin says. “The poorest low- and middle-income countries are seeing a rapid transformation in the way people eat, drink, and move at work, home, in transport, and in leisure.”
“The new nutrition reality is driven by changes to the food system, which have increased the global availability of ultra-processed foods that are linked to weight gain while also adversely affecting infant and preschooler diets. These changes include disappearing fresh food markets, increasing numbers of supermarkets, and the control of the food chain by supermarkets and global food, catering, and agriculture companies in many countries.”
But how can someone be both underfed and overfed at the same time? It comes down to the quality of the food they can access. Ultra-processed foods are a very attractive option for people with low incomes: they're convenient (they require very little time to prepare), they seem hearty, and they're widely available. However, while they usually pack a caloric punch, they're very poor in nutrients; in essence, they're empty calories. Worse still, they usually contain high levels of additives meant to make them more appealing and extend shelf life, which can have adverse effects on health and body mass.
In an ironic twist of fate, healthy options such as fresh vegetables can be effectively out of reach for people with low incomes who may not have the purchasing power or time necessary to acquire and prepare them, or simply haven’t been educated on the drawbacks of their current diet.
The authors recommend “double-duty” policies aimed at reducing both the risk of nutritional deficiency and that of obesity and its related effects. They call for a concerted effort from local governments, civil society, academia, the private sector, and the United Nations to devise and implement such strategies and to create the economic conditions needed to address the double burden of malnutrition.
We also shouldn’t make the error of believing this issue is limited solely to ‘someplace else’. Previous research has highlighted that over half of America’s calories come from ultra-processed foods, and that they are responsible for 90% of the total added sugar intake in the country.
The paper “Dynamics of the double burden of malnutrition and the changing nutrition reality” has been published in the journal The Lancet.
Expanding agriculture can not only affect the diversity and abundance of wildlife but also alter the diet and habitat of wild mammals, especially those living in fragmented forest areas near crops or pastures, according to new research.
“Forest remnants and the agricultural matrix aren’t separate. There’s an interface between these areas. It’s hardly news that animals need to find food in plantations, but this practice hadn’t been quantified until now. I should stress that the diet in question isn’t ideal. It’s a matter of survival,” said Marcelo Magioli, the lead author.
Magioli and his fellow researchers looked at stable carbon and nitrogen isotopes in the animals' fur, a method that reveals the kinds of food eaten over the previous three months. They used hair traps and collected droppings so as not to disturb the animals analyzed, many of which are threatened with extinction.
They collected samples in four areas of the Brazilian state of Sao Paulo: two near croplands, in Campinas and Botucatu, and two in conserved areas. The samples came from 29 species of mammals, and more than half were taken from animals living in human-modified areas.
“From previous studies using GPS collars and camera traps, we knew the animals moved through these areas,” Magioli said regarding their research. “However, stable isotope analysis told us where they were feeding and how important each food source was in their diet.”
The results showed that 34.5% of the animals fed only on agricultural resources from human-modified areas, while the remaining 65.5% survived on forest resources. Frugivores and insectivores ate the same no matter where they lived, while herbivores and omnivores were the most affected, feeding mainly on agricultural resources.
Species like the cougar, capybara, brocket deer, ocelot, and crab-eating raccoon were among those mentioned in the study as having adapted their diets in response to agricultural expansion. The margay, a small wild cat, for example, eats animals that live near sugarcane plantations.
“Our findings point to the need for more favorable agricultural management to support these animals and underscore the importance of the Brazilian Forest Code and of maintaining legal reserves and permanent conservation areas [APPs],” Katia María Ferraz, co-author, said.
Researchers, led by archaeologists at the University of York, have found the earliest evidence of milk consumption ever observed in the teeth of prehistoric British farmers.
The team identified a milk protein called beta lactoglobulin (BLG) in the mineralized dental plaque of seven individuals who lived around 6,000 years ago. The findings will help improve our understanding of when humans developed lactose persistence (LP), the ability to digest lactose in milk. It’s also the earliest confirmed sighting of the BLG molecule so far.
Luckily they didn’t brush their teeth
“The fact that we found this protein in the dental calculus of individuals from three different Neolithic sites may suggest that dairy consumption was a widespread dietary practice in the past,” says lead author Dr. Sophy Charlton, from the Department of Archaeology at the University of York.
Dental plaque, while not something you want to have, can be used to gain insight into the diets of ancient people. The material traps food proteins carried in saliva, which are then mineralized into plaque, or tartar. The samples of dental plaque analyzed in this study are the oldest ever investigated for protein content, the team explains.
The Neolithic period in Britain ran from 4,000 to 2,400 BC and saw the transition from hunter-gatherer communities to farming, mostly revolving around the growing of wheat and barley and the domestication of animals such as cows, sheep, pigs, and goats. This time also saw the emergence of complex cultural practices such as the construction of monumental and burial sites.
The remains used in this study come from three different Neolithic sites in England: Hambledon Hill, Hazleton North (both in the south of England), and Banbury Lane (in the East Midlands). Individuals from all three sites had milk proteins from goats, cows, and sheep, suggesting that multiple domesticated species were reared at the same time.
“It would be a fascinating avenue for further research to look at more individuals and see if we can determine whether there are any patterns as to who was consuming milk in the archaeological past — perhaps the amount of dairy products consumed or the animals utilised varied along the lines of sex, gender, age or social standing,” says Dr. Charlton.
Finding these proteins in the ancient teeth is particularly exciting, as previous genetic work has suggested that people living at the time did not yet have the ability to digest lactose.
Overall, this means the ancient farmers either consumed milk in small amounts or processed it into foods such as cheese (which removes most of the lactose). Lactose persistence, our ability to digest milk into adulthood, is the result of a mutation in the genes controlling the production of lactase, the enzyme that breaks down lactose. How and why we evolved this ability is of considerable interest to researchers, as milk and dairy products played an important part in past diets, as they do in those of today. This study gives us a better idea of when the mutation occurred, the conditions that helped it appear, and how people dealt with lactose intolerance before it.
“Because drinking any more than very small amounts of milk would have made people from this period really quite ill, these early farmers may have been processing milk, perhaps into foodstuffs such as cheese, to reduce its lactose content,” says Dr. Charlton.
“Identifying more ancient individuals with evidence of BLG in the future may provide further insights into milk consumption and processing in the past, and increase our understanding of how genetics and culture have interacted to produce lactase persistence.”
The paper “New insights into Neolithic milk consumption through proteomic analysis of dental calculus” has been published in the journal Archaeological and Anthropological Sciences.
If the consumption of meat and dairy doesn’t fall, at least one-quarter of the world’s tropical lands could disappear by the end of the century, according to new research which studied the impacts of consumption trends on biodiverse regions across the globe.
Researchers at the University of Edinburgh and Karlsruhe Institute of Technology estimate that large swathes of natural land could potentially vanish if the demand for animal products continues to grow. The study was published in the Global Environmental Change journal.
About 9% of natural land — 95% of which is in the tropics — could go within 80 years unless global dietary habits change, the scientists said, looking at consumption and agriculture patterns.
“Reducing meat and dairy consumption will have positive effects on greenhouse gas emissions and human health. It will also help biodiversity, which must be conserved to ensure the world’s growing population is fed. Changing our diets will lead to a more sustainable future and complement food security goals while addressing global food inequalities,” lead author Dr Roslyn Henry said.
As incomes increase across the globe, consumption has shifted from staples such as starchy roots and pulses to meat, milk, and refined sugars. Meat and dairy products are associated with higher land and water use and higher greenhouse gas emissions than any other foods.
By replacing animal products with plant-based alternatives, the researchers predict that the global demand for agricultural land could be reduced by 11%. Industrial feed systems also reduce agricultural expansion but may increase environmental degradation due to agricultural pollutants such as fertilizer, they said.
The study comes only a week after a report on land use by the Intergovernmental Panel on Climate Change (IPCC), which identified reducing meat consumption and changing diets to plant-based as an important focus for climate change mitigation.
Insect-based dinners might not sound very enticing, but new research shows they're definitely packed full of antioxidants.
Scrumptious! Image credits Will Brown / Flickr.
A new study reports that edible insects and other creepy crawlies are comparable to foods such as olive oil and orange juice in antioxidant content. The findings are part of an effort to further entice people to consider insects as part of their diet, a move that would have huge implications for the sustainability and environmental footprint of agriculture worldwide.
Young grasshopper — sautéed
“At least 2 billion people — a quarter of the world’s population — regularly eat insects,” says Prof. Mauro Serafini, lead author of the study published in Frontiers in Nutrition. “The rest of us will need a bit more encouragement.”
“Edible insects are an excellent source of protein, polyunsaturated fatty acids, minerals, vitamins and fiber. But until now, nobody had compared them with classical functional foods such as olive oil or orange juice in terms of antioxidant activity.”
The fact of the matter is that what most of us put on the table, combined with how many people the Earth currently houses, simply doesn't make for a sustainable future. Insects can help us address this issue: they have a much more modest environmental footprint than livestock and are a great source of nutrients. However, most people are quite reluctant to come anywhere near these animals, let alone put them in their mouths.
Those who do, however, will likely see the benefits, the new paper reports. According to the analysis, crickets pack 75% of the antioxidant power of fresh OJ, and silkworm fat twice that of olive oil. The team hopes the findings will provide the nudge many people need to consider including insects in their diets. Taste and presentation are key elements of food, they write, but they hope that the 'selfish and immediate incentives' provided by the insects' antioxidant properties will be enough to convince some consumers.
“Consumption of foods rich in antioxidants, such as fruit and vegetables, plays an important role in the prevention of oxidative stress-related diseases such as cardiovascular disease, diabetes and cancer,” the study explains.
Antioxidants are substances that bind free radicals: uncharged, typically highly reactive and short-lived molecules that damage cells and tissues. The team tested a range of commercially available insects and invertebrates for antioxidant activity. The inedible parts of these animals (such as wings and stingers) were removed, after which the insects were ground up.
Two parts were extracted from each species: a fat- and a water-soluble fraction. Each extract was then tested for antioxidant content and activity. Water-soluble extracts of grasshoppers, silkworms, and crickets have “antioxidant capacity 5-fold higher than fresh orange juice,” the authors report. The fat-soluble fractions of evening cicadas and silkworms showed twice the antioxidant activity of olive oil. “For perspective, using the same setup we tested the antioxidant capacity of fresh orange juice and olive oil — functional foods that are known to exert antioxidant effects in humans,” adds Serafini.
Trolox Equivalent Antioxidant Capacity (TEAC) of fat-soluble extracts compared to olive oil. Image credits Selena Ahmed et al., (2019), Frontiers.
Trolox Equivalent Antioxidant Capacity (TEAC) of water-soluble extracts compared to fresh orange juice. Image credits Selena Ahmed et al., (2019), Frontiers.
However, these values are for the dry, isolated extracts, which aren't something you'd want to eat on their own. The water content of the insects ranged between 2% and 7%, while orange juice is 88% water; most foods fall somewhere in between. A drink of 88% water and 12% grasshopper or silkworm extract would have around three-quarters of the antioxidative effect of a glass of OJ.
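The dilution arithmetic above can be sketched in a few lines (a back-of-envelope illustration only, not the study's methodology; the 5-fold figure is the reported capacity of the water-soluble extracts relative to orange juice):

```python
def diluted_antioxidant_effect(extract_capacity_vs_oj, extract_fraction):
    """Antioxidant effect of a water/extract drink relative to a glass of OJ.

    extract_capacity_vs_oj: antioxidant capacity of the dry extract,
        expressed as a multiple of fresh orange juice
    extract_fraction: share of the drink that is extract (the rest is water,
        assumed to contribute no antioxidant activity)
    """
    return extract_capacity_vs_oj * extract_fraction

# A drink that is 12% extract with a 5-fold capacity lands at roughly 0.6
# of a glass of OJ, in the same rough range as the "three-quarters" figure
# quoted in the article (the exact multiple varies by species).
```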
Another interesting finding is that the insects showed a lower total content of polyphenols (a major source of plant-derived antioxidant activity) across the board compared to orange juice. However, this compound alone couldn’t account for the full antioxidant capacity seen in the study — suggesting that insects also contain a yet-unknown substance with antioxidant capacity.
“The in vivo efficiency [i.e. in humans] of antioxidant-rich food is highly dependent on bioavailability and the presence of an ongoing oxidative stress. So as well as identifying other antioxidant compounds in insects, we need tailored intervention studies to clarify their antioxidant effects in humans,” Serafini says.
“In the future, we might also adapt dietary regimens for insect rearing in order to increase their antioxidant content for animal or human consumption.”
The paper “Antioxidant Activities in vitro of Water and Liposoluble Extracts Obtained by Different Species of Edible Insects and Invertebrates” has been published in the journal Frontiers in Nutrition.
Some crocodile species are vegetarians — but also extinct.
Image credits Sasin Tipchai.
A study on fossilized teeth revealed that several ancient groups of crocodyliforms, the lineage that includes crocodiles and alligators, were not carnivores at all; in fact, they were vegetarians. The team reports that at least three (but potentially up to six) different species have relied on a plant-based diet in the past. They all are now extinct.
Mean green veggie machine
“The most interesting thing we discovered was how frequently it seems extinct crocodyliforms ate plants,” said Keegan Melstrom, a doctoral student at the University of Utah. “Our study indicates that complexly-shaped teeth, which we infer to indicate herbivory, appear in the extinct relatives of crocodiles at least three times and maybe as many as six.”
All crocodilians living today share the same general body shape and ecology, living as generalist, semi-aquatic carnivores. As carnivores, they have relatively simple, conical teeth used to rip and tear through flesh and not much else. Melstrom and his graduate advisor, Randall Irmis, chief curator of the Natural History Museum of Utah, compared the tooth complexity of extinct and living crocodyliforms using a method originally developed for living mammals. Overall, they measured 146 teeth from 16 different species of extinct crocodyliforms.
It quickly became clear that the extinct species showed a different pattern of tooth structure. Some species showed multiple specializations that are not seen in living species today, including a feature known as heterodonty: regionalized differences in tooth size or shape.
“Carnivores possess simple teeth whereas herbivores have much more complex teeth,” Melstrom explained. “Omnivores, organisms that eat both plant and animal material, fall somewhere in between. Part of my earlier research showed that this pattern holds in living reptiles that have teeth, such as crocodilians and lizards.”
“So these results told us that the basic pattern between diet and teeth is found in both mammals and reptiles, despite very different tooth shapes, and is applicable to extinct reptiles.”
From these dental measurements and those of other morphological features, the team reconstructed the diets of the extinct crocodyliforms. The results suggest that these species had a wider range of dental complexity, and thus of diet, than previously estimated.
Plant-eating crocodyliforms popped up quite early in the group’s evolutionary history, the team explains, just after the mass extinction at the end of the Triassic. These species lived up until the end of the Cretaceous, when the dinosaur mass extinction occurred. The team’s analysis shows plant-eating species developed at least three times, possibly up to six times, during the Mesozoic.
“Our work demonstrates that extinct crocodyliforms had an incredibly varied diet,” Melstrom said. “Some were similar to living crocodilians and were primarily carnivorous, others were omnivores and still others likely specialized in plants.”
“The herbivores lived on different continents at different times, some alongside mammals and mammal relatives, and others did not. This suggests that an herbivorous crocodyliform was successful in a variety of environments!”
Their work is not yet done, however. Some fossil crocodyliforms are missing teeth and, armed with the knowledge of the present study, Melstrom plans to reconstruct their diets as well. He also wants to find out why these extinct crocodiles diversified so radically after one mass extinction but not another, and whether dietary ecology could have played a role.
The paper “Repeated Evolution of Herbivorous Crocodyliforms during the Age of Dinosaurs” has been published in the journal Current Biology.
Researchers at Drexel University, Pennsylvania want to help you cut down on excessive sugar consumption by playing a game.
Image via Pixabay.
The U.S. Department of Health and Human Services estimates that over half of American adults consume excessive amounts of added sugars, with detrimental effects to their health. A new study led by Evan Forman, Ph.D., a psychology professor at Drexel University’s College of Arts and Sciences, reports that computer games can be used to train players to wean off this sugar and help them to improve their health and manage their weight more easily.
“Added sugar is one of the biggest culprits of excess calories and is also associated with several health risks including cancer,” said Forman, who also leads the Center for Weight, Eating and Lifestyle Science (WELL Center) at Drexel.
“For these reasons, eliminating added sugar from a person’s diet results in weight loss and reduced risk of disease.”
The team developed and tested the effectiveness of a "brain training game" that targets the brain area which inhibits our impulses. The aim was to train people to better resist the lure of foods with added sugars, specifically to decrease their consumption of sweets. Such systems have proven effective in helping people quit other unhealthy habits, such as smoking. Forman says that this study is the first to look at how "highly personalized and/or gamified inhibitory control training" can help with weight loss through repeated, at-home training sessions.
In collaboration with Michael Wagner, a professor and head of the Digital Media department in Drexel’s Westphal College of Media Arts & Design and a group of digital media students, the team developed a game they named “Diet DASH”.
Diet DASH is built to integrate with each player's particular habits. It automatically customizes itself to focus on the sweets each participant tends to eat and adjusts its difficulty according to how well the player is resisting the temptation to eat those sweets. To test how well it worked, the team worked with a randomized group of 109 overweight participants who reported a strong preference for sweets. Before starting the game, each participant attended a workshop explaining why sugar is detrimental to health, which foods to avoid, and how to avoid them.
“Prior to randomization, all participants attended a 2-h workshop in which they were provided with a dietary prescription (to eat only foods without added sugar or with very low amounts of added sugar, such as certain low-sugar breakfast cereals) as well as guidance in making dietary modifications (e.g., reading food labels, shopping and cooking substitutions). Explanatory text, figures, and tables that allowed participants to easily identify targeted foods with added sugar were distributed,” the paper explains.
“The workshop helped give participants strategies for following a no-sugar diet. However, we hypothesized that participants would need an extra tool to help manage sweets cravings,” said Forman. “The daily trainings could make or break a person’s ability to follow the no-added sugar diet. They strengthen the part of your brain to not react to the impulse for sweets.”
Image credits Evan M. Forman et al., (2019), JoBM.
Each participant played the game for a few minutes every day for six weeks and then once a week for two weeks. The game itself places players in a grocery store, with the goal of putting the correct (healthy) food in a grocery cart as fast as possible while refraining from choosing incorrect food (their preferred sweets). Players were awarded points for correct items placed in carts.
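The scoring mechanic described above can be sketched roughly as follows (a toy illustration; names and point values are assumptions, not the actual Diet DASH code):

```python
def score_cart(cart, healthy_foods, points_per_correct=10):
    """Score one trip through the virtual grocery store.

    cart: list of foods the player placed in the cart
    healthy_foods: the set of 'correct' foods for this player
    Points are awarded only for correct items; grabbing one of the
    player's tempting sweets simply earns nothing.
    """
    return sum(points_per_correct for item in cart if item in healthy_foods)
```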
Participants were randomly assigned to a highly-gamified version of the game (with better graphics and sounds) or a less-gamified version. The team reports that the gamification level didn’t seem to matter much as far as weight loss was concerned. However, the (few) male participants in the study reacted better to the highly gamified version than the women in the study.
Over half of the participants in the study showed a strong preference for sweets. For this group, the game helped them lose as much as 3.1% of their total body weight over eight weeks. Participants also rated how satisfying they found the daily training, whether or not it became part of their daily routine, and whether they would wish to continue training if it became publicly available.
“The study’s findings offer qualified support for the use of a computerized cognitive training to facilitate weight loss,” said Forman.
The WELL Center is now conducting a new trial with the highly gamified version of this training program specifically for men and is actively recruiting participants.
The paper “Computerized neurocognitive training for improving dietary health and facilitating weight loss” has been published in the Journal of Behavioral Medicine.
Edge-to-edge overbite of ancient hunter-gatherer woman (left) vs overbite configuration seen in Bronze age male (right). Credit: Science.
When humans invented agriculture, the world changed forever. With a steady and predictable food supply, humans were free to diversify their labor and pursuits, effectively ushering in civilization as we know it. By cultivating cereals and raising livestock, our diets also changed. This altered our face structure and led to less tooth wear. Now, a new study says that these biomechanical shifts may have allowed humans to produce new sounds such as “v” and “f”. In other words, language also changed along with diet.
Blame dairy for the “F” word
There are thousands of languages and dialects still spoken today, although most have only a handful of surviving speakers left. Not only are these languages generally mutually unintelligible, they can also differ radically in how the sounds that convey meaning are produced. This is why most scholars believe that the biological machinery for producing human speech has remained largely unchanged since humans emerged hundreds of thousands of years ago.
A new study, however, suggests that language is more malleable by cultural influence (in this case, agriculture) than previously thought. In 1985, renowned linguist Charles Hockett claimed that hunter-gatherers would find it difficult to pronounce "f" and "v" sounds, which linguists call labiodentals, due to their jaw structure. Before the advent of agriculture, humans, like most other primates, had teeth aligned edge to edge, a result of their diet of hard food. As humans started eating softer foods like cheese, tooth wear became less pronounced and, as a result, more and more people kept an overbite into adulthood.
Steven Moran and colleagues at the University of Zurich put Hockett's theory to the test, performing a statistical analysis of interdisciplinary evidence from linguistics, anthropology, and phonetics. A biomechanical computer model that mimics human speech showed that having an overbite allows humans to produce "f" and "v" sounds using 29% less energy than an edge-to-edge configuration.
“In Europe, our data suggests that the use of labiodentals has increased dramatically only in the last couple of millennia, correlated with the rise of food processing technology such as industrial milling,” Moran said in a statement. “The influence of biological conditions on the development of sounds has so far been underestimated.”
The findings are compelling but they’re definitely not the last word on the matter. Human speech organs do not use all that much energy to begin with — not relative to movement, for instance. If energy expenditure played a very important role, difficult speech sounds would have been gradually sifted out. But this is not the case, since many languages employ difficult speech sounds, such as clicks in some languages native to southern Africa.
But the authors claim that although the probabilities of generating labiodentals accidentally are low, over generations these sounds could have become incorporated into language — and having a diet-induced overbite helps to improve the odds. In the future, the researchers believe that their method could be used to reconstruct how ancient written languages were spoken aloud.
“Our results shed light on complex causal links between cultural practices, human biology and language,” Balthasar Bickel, project leader and UZH professor, said in the press release. “They also challenge the common assumption that, when it comes to language, the past sounds just like the present.”