Tag Archives: agriculture

Fields in North America will see their first robot tractors by the end of the year

American farm equipment manufacturer John Deere has teamed up with French agricultural robot start-up Naio to create a driverless tractor that can plow fields by itself while being supervised by farmers through a smartphone.

Image credits CES 2022.

There are more people alive in the world today than ever before, and not very many of us want to work the land. A shortage of laborers is not the only issue plaguing today’s farms, however: climate change, and the need to limit our environmental impact, are further straining our ability to produce enough food to go around.

In a bid to address at least one of these problems, John Deere and Naio have developed a self-driving tractor that can get fields ready for crops on its own. It combines John Deere’s 8R tractor with a plow, a GPS guidance suite, and 360-degree cameras, all of which a farmer can control remotely from a smartphone.

Plowing ahead

The machine was shown off at the Consumer Electronics Show in Las Vegas, an event that began last Wednesday. According to a presentation held at the event, the tractor only needs to be driven into the field, after which the operator can send it on its way with a simple swipe on their smartphone.

The tractor is equipped with an impressive sensory suite — six pairs of cameras, able to fully perceive the machine’s surroundings — and is run by artificial intelligence. These work together to check the tractor’s position at all times with a high level of accuracy (within an inch, according to the presentation) and keep an eye out for any obstacles. If an obstacle is detected, the tractor stops and sends a warning to its user.
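The stop-and-alert behavior described above can be sketched as a simple supervisory control loop. The Python snippet below is purely illustrative — the function, threshold, and return values are our assumptions for the sake of the example, not John Deere’s actual software:

```python
# Illustrative sketch only: one cycle of the supervisory logic described
# above (position checks against a planned path, stop-and-alert when an
# obstacle appears). All names and thresholds here are hypothetical.

POSITION_TOLERANCE_M = 0.025  # roughly one inch, per the claimed accuracy


def control_step(position_error_m, obstacle_detected, notify_user):
    """Decide the tractor's action for one control cycle."""
    if obstacle_detected:
        # Stop immediately and warn the remote operator's smartphone.
        notify_user("Obstacle detected: tractor stopped")
        return "stop"
    if position_error_m > POSITION_TOLERANCE_M:
        # Drifted off the planned path beyond tolerance: steer back.
        return "correct_course"
    return "continue"
```

The point of the sketch is the priority ordering: obstacle safety overrides path-following, which overrides normal operation.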

John Deere Chief Technology Officer Jahmy Hindman told AFP that the autonomous plowing tractor will be available in North America this year, although no price has yet been specified.

While the tractor, so far, can only plow by itself, the two companies plan to expand into more complicated processes — such as versions that can seed or fertilize fields — in the future. However, they add that combine harvesters are more difficult to automate, and there is no word yet on a release date for such vehicles.

However, with other farm equipment manufacturers (such as New Holland and Kubota) working on similar projects, they can’t be far off.

“The customers are probably more ready for autonomy in agriculture than just about anywhere else because they’ve been exposed to really sophisticated and high levels of automation for a very long time,” Hindman said.

Given their price and relative novelty, automated farming vehicles will most likely first be used for specialized, expensive, and labor-intensive crops. It may be a while before we see them working vast cereal crop fields, but they will definitely get there, eventually.

There is hope that, by taking over the most labor-intensive and unpleasant jobs on the farm, such as weeding and crop monitoring, automation can help boost yields without increasing costs, while also reducing the need for mass use of pesticides or fungicides. That would shrink the environmental impact of the agricultural sector while also making for healthier food on our tables.

Asia’s languages developed and spread alongside rice, millet agriculture

New research is peering into the shared past of the Transeurasian (or ‘Altaic’) family of languages. According to the findings, the hundreds of millions of people who speak one such language today can trace their shared legacy back to a single group of millet farmers that lived 9,000 years ago in what today is northeast China.

Integration of linguistic, agricultural, and genetic expansions in Northeast Asia. Red arrows show the eastward migrations of millet farmers in the Neolithic, alongside Koreanic and Tungusic languages. Green arrows mark the integration of rice agriculture in the Late Neolithic and the Bronze Age, alongside the Japonic language. Image credits Martine Robbeets et al., (2021), Nature.

This family of languages spans peoples and countries all across Eurasia, with notable members including Japanese, Korean, Tungusic, Mongolic, and Turkic — making it a very populous language family. Exactly how Transeurasian languages came to be, however, is still a matter of heated debate. Their history is rife with expansions and with population and linguistic dispersals, making it exceedingly difficult to trace the family back to a single origin.

New research, however, aims to shed light on this topic. The study combined three disciplines — historical linguistics, ancient DNA research, and archaeology — to determine where Transeurasian languages first originated. According to the findings, its roots formed around 9,000 years ago in modern China and then spread alongside the development and adoption of agriculture throughout Eurasia.

Hard to pinpoint

“We developed a method of ‘triangulation’, bringing linguistics, archaeology, and genetics together in equal proportions in a single approach,” Prof. Dr. habil Martine Robbeets, the corresponding author of the paper, said for ZME Science. “Taken by itself, linguistics alone will not conclusively resolve the big issues in the science of human history but taken together with genetics and archaeology it can increase the credibility and validity of certain scenarios.”

“Aligning the evidence offered by the three disciplines, we gained a more balanced and richer understanding of Transeurasian prehistory than each of the three disciplines could provide us with individually.”

The origin of Transeurasian languages can be traced back to a group of millet farmers — the “Amur” people — in the Liao valley, according to the team’s findings.

These languages spread throughout Eurasia in two major phases. The first one took place during the Early–Middle Neolithic (Stone Age), when sub-groups of the Amur spread throughout the areas around the West Liao River. During this time, the five major branches of the Transeurasian linguistic family started to develop among the different groups, as the distance between them allowed for the creation of dialects.

The second phase involved contact between these five daughter branches during the Late Neolithic, Bronze Age, and Iron Age. It was characterized by these intergroup interactions as well as by genetic inflows from (and possible linguistic imports from) populations in the Yellow River area, western Eurasian peoples, and Jomon populations. Agriculturally speaking, this period also saw the adoption of rice farming (from the Yellow River area), the farming of crops native to west Eurasia, and pastoralism.

Although the spread of Transeurasian languages was largely driven by the expansion of a single ethnic group, it was not limited to a single one. Several peoples mixed together with the descendants of those millet farmers from the Liao River over time to create the rich tapestry of language, customs, and heritages seen in Eurasia today.

“Our [results] show that prehistoric hunter-gatherers from Northeast Asia as well as Neolithic farmers from the West Liao and Amur all project within the cluster of present-day Tungusic speakers. We call this shared genetic profile Amur-like ancestry,” explains Dr. Robbeets for ZME Science. “Turkic and Mongolic speakers and their ancestors preserve some of this Amur ancestry but with increasing gene flow from western Eurasia from the Bronze Age onwards.”

“As Amur-related ancestry can also be traced back to speakers of Japanese and Korean, it appears to be the original genetic component common to all speakers of Transeurasian languages. So the languages spread with a certain ethnic group, but this ethnic group got admixed with other ethnic groups as it spread across North and East Asia.”

Although we can trace these interactions in the genomes of individuals from across Eurasia, there are still a lot of unknowns. For example, we can’t estimate the degree or direction of linguistic and cultural exchanges between different groups. We can tell that an increasing degree of Yellow River genetic legacy was woven into the peoples of the West Liao River, but there is no record by which we can gauge whether there was an exchange of words or cultural practices between these groups. Similarly, we can’t estimate the magnitude of the influence this exchange had on the two groups.

Still, one of the points Dr. Robbeets wants to underline with these findings is that truly understanding the history of languages in Northeast Asia will require a different approach from the one generally used today.

“Archaeology and linguistics in Northeast Asia have tended to be conducted within the framework of modern nation-states,” she explained in an email for ZME Science. “Accepting that the roots of one’s language, culture, or people lie beyond the present national boundaries is a kind of surrender of identity, which some people are not yet prepared to make. Powerful nations such as Japan, Korea, and China are often pictured as representing one language, one culture, and one genetic profile but a truth that makes people with nationalist agendas uncomfortable is that all languages, cultures, and humans, including those in Asia, are mixed.”

“Our results show that a much more flexible and international framework is needed.”

Another, more direct implication of these findings is that sedentarism and agriculture took root in the area much earlier than assumed until now. Previously, the emergence of the Transeurasian family of languages was believed to have coincided with the adoption of livestock herding in Asia’s Eastern Steppes. Tying it to agricultural practices in the Liao River area, however, pushes the timeline of its emergence back by roughly 4,000 years.

The paper “Triangulation supports agricultural spread of the Transeurasian languages” has been published in the journal Nature.

Nitrogen-fixing bacteria could make farming possible even in Martian soils

New research is investigating the role bacteria could play in future efforts to grow food on planets such as Mars. While such an approach has been shown to boost the growth of clover plants, more work needs to be done to determine exactly how to proceed with off-world farming.

Image credits Kathleen Bergmann.

Nitrogen is a key nutrient for plant growth, one which typically acts as a bottleneck here on Earth. Although it is abundant in the atmosphere, nitrogen gas cannot be directly assimilated by plants or animals. Nature has found a workaround to this issue through symbiotic relationships between the roots of plants and nitrogen-fixing bacteria: the bacteria supply essential nitrogen compounds to the roots, which, in turn, feed the bacterial nodules.

Martian soil, or regolith, also lacks essential nutrients, including nitrogen compounds, which would severely limit our ability to grow food in space. In a bid to understand whether we could enrich alien dirt with the aid of Earth-born bacteria, a new study reports on efforts to grow clover in simulated regolith.

Clover for good luck

“Nodule forming bacteria Sinorhizobium meliloti has been shown to nodulate in Martian regolith, significantly enhancing the growth of clover (Melilotus officinalis) in a greenhouse assay. This work increases our understanding of how plant and microbe interactions will help aid efforts to terraform regolith on Mars,” the study reads.

For the study, the team planted clover plants in a man-made regolith substitute that closely resembles that found on Mars. Some of the plants were inoculated with nitrogen-fixing, nodule-forming bacteria, while the others were left to fend for themselves. Sinorhizobium meliloti is a common bacterium on Earth that naturally forms symbiotic relationships with clover plants. Previous research has shown that clover plants can grow in regolith substitutes, the authors explain, but didn’t explore the effects of nitrogen-fixing bacteria on their growth rate.

One of the key findings of the study was that inoculated plants experienced a significantly higher rate of growth than the controls. They recorded 75% more growth in the roots and shoots of these plants compared to clovers which didn’t have access to the bacteria.

Although the bacteria had a positive effect on the plants themselves, the team also reports not seeing any increase in ammonium (NH4) levels in the regolith. In other words, the soil itself did not become enriched in any meaningful way in key nitrogen compounds that other plants could tap into. Furthermore, the symbiotic relationship between bacteria and clovers planted in regolith showed some functional differences compared to those of clovers planted in potting soil.

This suggests that even with the benefit of nitrogen-fixing bacteria on their side, crops sown in alien soils might still develop at different rates to crops on Earth.

All in all, however, the research proves that there is a case to be made for growing crops on alien worlds. Although there are still many unknowns regarding this topic, and even considering a lower yield rate, it remains an attractive proposition. Shuttling materials to outer space remains extremely expensive. It’s also a very long trip to Mars. Both of these factors make it impractical to rely on food transports from Earth to feed a potential colony.

But we are making strides towards offering space explorers greater autonomy. For example, we’re exploring new ways to produce building materials from astronauts’ own bodies and waste. We’re also working on ways to obtain water from regolith.

We’re likely not ready to grow crops in space, however, and the authors note that more research is needed to understand exactly how such a process should be handled. Chief among their goals, they want to expand their research to other types of crops and to address possible issues of plant toxicity in regolith.

The paper “Soil fertility interactions with Sinorhizobium-legume symbiosis in a simulated Martian regolith; effects on nitrogen content and plant health” has been published in the journal PLOS ONE.

The system is broken: 90% of farm subsidies are bad for the environment and our health

Current agricultural support policies are destroying nature, affecting people’s health, and worsening the climate crisis, according to a new report. The same report argues that the money would be well spent in other areas, improving the world’s food systems and the environment. 

Image credit: Flickr / Chris Shervey

Food producers receive $540 billion from governments every year to support their activities, according to a United Nations report. Almost 90% of those subsidies are “price distorting and environmentally and socially harmful,” with beef and milk, the biggest source of greenhouse gas emissions, receiving most of the funding.

“This report is a wake-up call for governments around the world to rethink agricultural support schemes to make them fit for purpose to transform our agri-food systems and contribute to the four betters: better nutrition, better production, better environment and a better life,” Qu Dongyu, head of the UN Food and Agriculture Organization (FAO), said in a statement. 

Previous studies have warned about the many challenges faced by the global food system, with millions experiencing chronic hunger while others are obese and overweight. Meanwhile, a third of the food produced every year is wasted, which amounts to more than one billion tons. The system is essentially broken, the researchers explain. 

Between 2013 and 2018, support to agricultural producers averaged almost $540 billion per year, representing around 15% of total agricultural production value, the UN found. Of this, about $294 billion was provided in the form of price incentives and around $245 billion as fiscal subsidies to farmers, the majority tied to the production of a specific commodity.
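The report’s headline figures are easy to sanity-check. The short Python snippet below simply re-derives the ~$540 billion total from its two components, plus the total production value implied by the 15% share (the variable names are ours, not the report’s):

```python
# Sanity-checking the UN report's figures (2013-2018 yearly averages).
price_incentives_bn = 294  # USD billions per year
fiscal_subsidies_bn = 245  # USD billions per year

total_support_bn = price_incentives_bn + fiscal_subsidies_bn
print(total_support_bn)  # 539 -- "almost $540 billion"

# If support is ~15% of total agricultural production value,
# production value comes out at roughly $3.6 trillion per year.
implied_production_bn = total_support_bn / 0.15
print(round(implied_production_bn))  # 3593
```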

Price incentives and fiscal subsidies have many negative implications for food systems, according to the report. They encourage practices and behaviors that are harmful to the health, sustainability, equity, and efficiency of food systems. Price incentives can distort food trade and production, while subsidies can lead to the promotion of monoculture, for example.

Unhealthy products (like sugar) and emission-intensive commodities (such as beef and dairy) receive the most support worldwide, despite the potentially negative impacts on health as well as on the environment. The repercussions on climate are especially relevant for rich countries, as they consume more dairy and beef per capita than poor countries, the UN said. 

“Agriculture contributes a quarter of greenhouse gas emissions, 70% of biodiversity loss and 80% of deforestation,” Joy Kim from the UN Development Program said in a statement. “International finance pledges for climate change are $100bn a year but governments are providing $470bn in farm support that has a huge damaging impact on climate and nature.”

The road ahead

Farmers will be getting more money in the coming years. Global support to farmers is projected to increase to almost $1.8 trillion in 2030 under a business-as-usual scenario. About 73% of this (USD 1.3 trillion) would be in the form of border measures, which affect trade and domestic market prices. The rest would be fiscal subsidies.

Simply removing agricultural support can have important adverse trade-offs. Eliminating all forms of agricultural support by 2030 without repurposing them would bring emissions down by 78.4 million tons of CO2 equivalent. This would also cause a decline in crop production, livestock farming, and farm employment of 1.3%, 0.2%, and 1.27%, respectively. 

That’s why the UN wants to repurpose rather than eliminate agricultural subsidies. Any fiscal savings from support reduction should be repurposed towards healthier, more sustainable, equitable, and efficient ways of supporting agriculture. This includes measures to mitigate negative short-term impacts, such as cash transfer schemes.

While there’s no one-size-fits-all optimal repurposing strategy, the report includes a guide for governments to create their own repurposing strategy. This includes estimating the amount of support, identifying its impact, designing the approach for repurposing, estimating the impact of that strategy, reviewing it before implementation, and monitoring the outcomes. 

“The transformation to healthier, more sustainable, equitable and efficient food systems needs to be accelerated if we are to meet the SDGs. While a few countries have started repurposing and reforming their agricultural support, broader, deeper and faster reforms are needed for food systems transformation,” the report reads.

The full report can be accessed here. 

Biochar can help us keep climate change at bay and more food on the table, according to a new meta-study

Biochar — organic material baked in oxygen-starved environments — can help power up the agriculture industry while also fighting against climate change, according to a new paper.

Image via Wikipedia.

Coal is naturally produced underground, over millions of years, from ancient biomass. This organic matter that got buried in some way or another was then compressed and heated up through geological processes, which broke down its original structure and increased its carbon content. Biochar is produced in a very similar way, but instead of letting natural (and slow) geological processes cook it up, we make it ourselves.

This material can help fertilize soils and, thus, increase crop yields. At the same time, by preventing the carbon within it from being released back into the atmosphere, the use of biochar in agriculture can help fight climate change.

Very, very, very well done

“Biochar can draw down carbon from the atmosphere into the soil and store it for hundreds to thousands of years,” says Stephen Joseph, lead author of the paper, and a Visiting Professor in the School of Materials Science and Engineering at the University of New South Wales Science. “This study also found that biochar helps build organic carbon in soil by up to 20 percent (average 3.8 percent) and can reduce nitrous oxide emissions from the soil by 12 to 50 percent, which increases the climate change mitigation benefits of biochar.”

Biochar is usually made from aggregated organic waste — a mixture of waste biomass from agriculture, forestry, and household sources. For such an unassuming substance, it could lend a sizable hand in fighting climate change and putting more food on our tables. The findings are supported by the Intergovernmental Panel on Climate Change’s recent Special Report on Climate Change and Land, which estimated there was important climate change mitigation potential available through biochar. This report estimated that biochar use “could mitigate between 300 million and 660 million tons of carbon dioxide [globally] per year by 2050,” Prof. Joseph explains.

“Compare that to Australia’s emissions last year—an estimated 499 million tons of carbon dioxide—and you can see that biochar can absorb a lot of emissions. We just need a will to develop and use it.”

The meta-study reviewed 300 papers on the topic, including 33 meta-analyses that together covered around 14,000 biochar studies published over the last 20 years. According to its results, mixing biochar into crop soils can boost yields by 10% to 42%, reduce the levels of heavy metals in plant tissues by between 17% and 39%, and increase the bioavailability of phosphorus, a critical nutrient that often acts as a bottleneck for the development of plants.

All in all, its use helps plants grow faster and larger, while also helping them better resist environmental stresses such as toxic metals, diseases, organic stressors such as herbicides and pesticides, and water stress.

The paper also explains how biochar acts on the roots of plants, boosting them. In the first three weeks of a plant’s life, it explains, biochar particles react with soils and stimulate germination (i.e., they help seeds ‘catch’) and the development of the fledgling plant. Over the next six months or so, biochar particles in the soil form reactive surfaces which help draw nutrients towards the roots. As these particles start to age — something that happens around three to six months after being mixed into the soil, depending on environmental conditions — they break down and form microaggregates with other chemicals. This, in turn, helps protect roots and prevents the decomposition of organic matter.

Biochar yielded the best effects when used in acidic or sandy soils together with fertilizers, the authors explain.

“We found the positive effects of biochar were dose-dependent and also dependent on matching the properties of the biochar to soil constraints and plant nutrient requirements,” Prof. Joseph says. “Plants, particularly in low-nutrient, acidic soils common in the tropics and humid subtropics, such as the north coast of NSW and Queensland, could significantly benefit from biochar.”

“Sandy soils in Western Australia, Victoria and South Australia, particularly in dryland regions increasingly affected by drought under climate change, would also greatly benefit.”

Prof. Joseph has been studying biochar ever since he was introduced to the practice by Indigenous Australians in the seventies. He explains that Indigenous groups in Australia, Latin America (especially in the Amazon basin), and Africa have been using biochar to maintain soil health and improve crops for centuries. Despite this, it hasn’t really been adopted as a commercial product, and most countries only produce a small amount of biochar every year.

To really make an impact, he explains, biochar needs to be integrated with farming operations on a wide scale. The first step towards that, he feels, is to tell farmers that biochar is an alternative they can opt for, and establish demonstrations so farmers can see that the benefits are real, not just words.

“This is in part due to the small number of large-scale demonstration programs that have been funded, as well as farmers’ and government advisors’ lack of knowledge about biochar, regulatory hurdles, and a lack of venture capital and young entrepreneurs to fund and build biochar businesses,” he explains. “We’ve done the science; what we don’t have is enough resources to educate and train people, to establish demonstrations so farmers can see the benefits of using biochar, to develop this new industry.”

The paper “How biochar works, and when it doesn’t: A review of mechanisms controlling soil and plant responses to biochar” has been published in the journal GCB Bioenergy.

Humans started growing cannabis 12,000 years ago — for food, fibers, and probably to get high

A new study has traced the origin of cannabis agriculture back nearly 12,000 years, to East Asia. During this time, cannabis was likely a multipurpose crop — it was only 4,000 years ago that farmers started growing different strains for either fiber or drug production.

Cannabis landraces in Qinghai province, central China. Credit: Guangpeng Ren.

Although it’s largely understudied for legal reasons, cannabis is one of the first plants to have been domesticated by humans. Archaeological studies have found traces of cannabis in various cultures across the centuries, but when and where exactly cannabis was domesticated was still unclear.

Many botanists believed the plant emerged in central Asia, but a new study shows that east Asia (including parts of China) is the origin of domesticated cannabis.

The research team, led by Luca Fumagalli of the University of Lausanne, involved scientists from Britain, China, India, Pakistan, Qatar, and Switzerland. The researchers compared and analyzed 110 whole genomes of different plants, ranging from wild-growing feral plants and landraces to historical cultivars and modern hybrids.

They concluded that the ancestral domestication of cannabis plants occurred some 12,000 years ago, during a period called the Neolithic, and that the plants likely had multiple uses.

“We show that cannabis sativa was first domesticated in early Neolithic times in East Asia and that all current hemp and drug cultivars diverged from an ancestral gene pool currently represented by feral plants and landraces in China,” the study reads.

“Our genomic dating suggests that early domesticated ancestors of hemp and drug types diverged from Basal cannabis [around 12,000 years ago] indicating that the species had already been domesticated by early Neolithic times”, the study adds. The results go against a popular theory regarding the plant’s origin, the researchers add.

“Contrary to a widely-accepted view, which associates cannabis with a Central Asian center of crop domestication, our results are consistent with a single domestication origin of cannabis sativa in East Asia, in line with early archaeological evidence.”

When a study can land you in jail

Cannabis grown for drugs. Image credits: Esteban Lopez.

It’s hard to study cannabis, regardless of what your reasons are. You can’t just go around picking or buying plants because the odds are that’ll get you in trouble. To make matters even more difficult, if you want to see where a domesticated plant originated from, you have to collect samples from different parts of the world — which is even more likely to get you in trouble.

So for decades, researchers looked at indirect evidence. Most cannabis strains appear to be from Central Asia, and several cultures of that region have used cannabis for thousands of years, so that seems like a likely place of origin. It’s a good guess, but not exactly true.

Cannabis grows pretty much everywhere — that’s why it’s called “weed” — and just because people in Central Asia were quick to adopt the plant doesn’t necessarily mean they were the first ones to grow it.

After crossing legal and logistic hurdles, Fumagalli was able to gather around 80 different types of cannabis plants, either cultivated by farmers or growing in the wild. They also included 30 previously sequenced genomes in the analysis.

With this, they found that the likely ancestor of modern cannabis (the initial wild plant that was domesticated) is probably extinct. However, its closest relatives survive in parts of northwestern China. This fits very well with existing archaeological evidence, which includes hemp cord markings from some 12,000 years ago. In particular, it seems to fit with a 2016 study which concluded that the earliest cannabis records come mostly from China and Japan.

The early domestication of cannabis in the Neolithic could be a big deal. Cannabis isn’t exactly a food crop. You can indeed use it to get oil, and the seeds can be consumed, but its main uses are for fibers and for intoxication. Usually, when archaeologists look at a population domesticating a crop, they naturally think of food as a priority — but this would suggest that Neolithic folk also had, uhm, other priorities. Or simply, cannabis was a multi-purpose crop.

Diversifying crops

The team also identified the genetic changes that farmers brought over the centuries through selective breeding. They found that some 4,000 years ago, farmers started to focus on either plants that would produce fibers, or on those better suited for producing drugs.

For instance, hemp strains bred for fiber production have mutations that inhibit branching, which makes them grow taller and produce more fibers. Meanwhile, strains bred for drug production have mutations that encourage branching and reduce vertical growth, resulting in shorter plants that produce more flowers. In addition, plants grown for drug production also have mutations that boost the production of tetrahydrocannabinol (THC).

For millennia, hemp (the cannabis grown for fibers) has been an important crop. Clothes, ropes, and various other products used hemp fibers, but the emergence of modern metalworking and modern synthetic fibers (such as nylon) led to its downfall, and the once-popular plant became all but forgotten. Until recently.

A modern cannabis greenhouse. Image credits: Richard T.

Recently, we’ve seen a resurgence in the interest in cannabis, for sustainable fiber production as well as medicinal and recreational purposes. With more and more countries decriminalizing the possession and growth of cannabis, the plant may be making a comeback — and for researchers looking to study its origin, that’s great news.

While this study offers an unprecedented view into the evolutionary history of cannabis, it’s still a relatively small sample size. Finding wild samples is hard — and feral samples you find today aren’t really wild, they’re just grown varieties that escaped and are now feral. Furthermore, even gaining access to cultivars can be difficult.

Maybe, as society becomes more inclined to consider cannabis, researchers can gain access to more resources about it as well. By studying its genomic history, scientists can also provide valuable insights into the desired functional properties of plants, helping growers develop better varieties both for medicine and for other uses.

The study has been published in Science Advances.

Novel technologies could reduce agriculture emissions by 70%

A combination of innovations in digital agriculture, crop and microbial genetics, and electrification could bring down emissions from agriculture by up to 70% in the next 15 years, according to a new study. It’s a win-win situation — these technologies are already available and could help agriculture decarbonize while increasing resilience, profits, and productivity.  

Image credit: Flickr / State of Israel

Agriculture is a difficult topic to tackle. On one hand, it has to keep up with the growing demand for food from a growing population, but on the other hand, it contributes to about a quarter of our total emissions. And the emissions have to go down if we want to avoid catastrophic climate change.

Agricultural activities, from crop to livestock production, contribute to emissions in several ways. Soil management practices can increase the availability of nitrogen, resulting in emissions of nitrous oxide. Livestock, especially ruminants, produce methane through a digestive process known as enteric fermentation. Tractors and other equipment used for farming produce emissions as well.

In a new study, researchers from the United States Department of Energy’s Argonne National Laboratory identified a set of technologies and agricultural practices that could significantly reduce greenhouse gas emissions. They grouped them under three phases and used a model to simulate adoption on grain production in the US. 

“Our study emphasizes the importance of a two-pronged approach—reducing farming emissions and maximizing soil carbon storage—to addressing the climate crisis through agriculture,” Dan Northrup, lead author of the study, said in a statement. “Developing and broadly applying emission reduction technologies, including seed genetics, is critical to achieving net negative production.”

The initial step focuses on reducing chemical use, mainly nitrogen fertilizer. Improving the timing, placement, and formulation of fertilizers using commercially available nitrogen fertilizer additives can reduce emissions by delaying nitrate delivery to the roots and increasing plant nitrogen uptake, according to the researchers. Simply put, the whole process can be made more efficient — not necessarily sustainable, but definitely better.

Phase two then focuses on replacing current technology with near-mature, low-emission alternatives: substitutes for current tools that now face low barriers to adoption. Implementing them would reduce emissions by 41%. This includes crop genetics for improved nitrogen use efficiency and the electrification of farming operations.

Finally, phase three involves a complete redesign of agricultural practices for emission reduction and soil carbon storage. This means using fewer chemicals and novel inputs, such as locally produced, low-concentration fertilizers, as well as adopting soil health and carbon sequestration practices such as cover crops and no-till farming.

While this production system would be difficult to adopt today, years of transition, preparation for these tools, ecosystem service payments, and sector interest should make adoption significantly easier, the researchers argue. Substantial, ongoing technical innovation will be required, along with the confidence of farmers to adopt these tools soon.

“Every field is unique and individualized emission reduction plans will be needed that optimize the combination of technologies. The need for an individualized plan highlights a significant social barrier to adoption of new technology. Producers will need information to adopt new practices,” the researchers wrote. 

The study was published in the journal PNAS.

Here are the world’s most favorite fruits — judging by production figures, at least

We’ve all heard time and time again how eating fruits and vegetables is healthy for us, and it definitely is. Hopefully, everybody here is getting their five-a-day. But that also raises an interesting question: which fruits do people prefer?

It’s practically impossible to track exactly how much of each type of fruit people consume worldwide, so we’ll use global production figures as a proxy. Presumably, farmers would be loath to grow produce that nobody buys, so production figures should be a reliable indicator of consumption as well.

Now, we all have our own preferences, and nowhere is that more true than where food is concerned. Don’t feel the need to change yours because of this list. But I always find it fascinating to see how individual choices compound on a global level. There are billions of people living on Earth today, and our combined diets have, throughout history, shaped the world around us.

So let’s see what fruits we’re all munching on — statistically speaking.


Tomatoes

The least fruit-tasting fruit out there is, actually, the one that sees the highest production levels worldwide.

Image credits Hans Braxmeier.

Tomatoes are a bit of an outlier on this list. Taxonomically speaking, they are fruits (berries, to be specific). But from a practical point of view, they’re employed as vegetables for salads, sauces, or cooked dishes.

The tomato originates from the American continents and was introduced to the rest of the world following the Columbian exchange, the single largest transfer of people, plants, and animals in history. Our earliest records suggest tomatoes were being cultivated by locals in the areas where they’re endemic (in the Andes, Peru, Chile, Ecuador, and the western stretches of Bolivia) since around 700 AD. Today, they’re virtually indispensable in multiple culinary traditions, including Mediterranean cooking.

Spanish and Portuguese explorers brought tomatoes back to Europe — and from there, the world — but what really made them a hit was that, at first, rich people died trying to eat them.

When they were first introduced to Europe, tomatoes were, quite understandably, very expensive. Because of that, only well-off people could really afford to buy them — and they were probably also quite interested in doing so, both as a status symbol and out of sheer curiosity. Another thing rich people of the time used to show off their wealth and status was metal plates and cutlery, generally made of pewter. And it was these plates that would make the tomato one of the most feared fruits in 1700s Europe, when it was widely known as the ‘poison apple’.

You see, tomato juice is quite acidic. Pewter is an alloy that, at the time, contained a large proportion of lead, which leaches out when exposed to a strong enough acid. Eat enough lead and you get lead poisoning and die. People at the time didn’t understand this process, but they could observe that nobles would eat tomatoes and die sometime later. So people started avoiding tomatoes, which dramatically lowered their price.

This, it turns out, was a huge boon for the tomato, because poor, hungry people aren’t picky. They also didn’t own pewter plates, so they wouldn’t get any lead poisoning from eating tomatoes.

Another issue that plagued the tomato during its early days is that the plant and roots themselves are quite toxic, even if the fruits are not. Until people learned to avoid these parts, this toxicity further helped lower the price of tomatoes, making them a staple of the common people.

Tomatoes today are virtually everywhere, and very popular for their versatility. They’re a great source of umami flavor, and one of the few plants out there that contain it, which would further explain their popularity.

In 2019, the world produced 181 million tonnes of tomatoes, with China being the main producer.


Bananas

The first undeniable fruit on the list also harbors a few secrets.

Image via Pixabay.

For starters, all the bananas you’ve ever eaten are, most likely, completely identical genetically. In essence, you’ve been eating the same banana over and over again. That’s because banana plants meant for commercial use are spread through saplings — they’re all clones.

They didn’t start out this way. The next time you bite into one, look for the very small seeds throughout the fruit’s pulpy core. Banana plants are spread through saplings because these seeds are very, very rarely viable. We’ve made them so. Wild bananas have large seeds in the middle of the fruit, to such an extent that eating one isn’t a pleasurable experience — it makes them borderline inedible, actually.

While the seeds of domesticated bananas are used in breeding programs, they have a low chance of germinating (growing into a plant). Furthermore, spreading the plants through samples of rhizome (a specialized type of root structure) allows farmers to reliably grow banana plants with similar productivity, ensuring that their crops remain economically viable. This is made easier by the fact that bananas are parthenocarpic: they don’t need to be pollinated to bear fruit.

Naturally, there are also downsides to this approach: for one, the root samples can carry diseases or pests from one plant to the next. Secondly, since all the plants in a crop are clones, a single pest or disease can wipe them all out. In theory, one could destroy whole cultivars. It may sound like a pretty abstract issue, but it has actively lowered the quality of our bananas over time. Today, the Cavendish is the most common cultivar of banana, but up until the 1950s, what you were most likely to find in a store was the Gros Michel variety. Taste-wise, these were reportedly much more enjoyable than the Cavendish. In fact, artificial banana flavoring today tastes more like bananas than bananas themselves because it was based on the Gros Michel cultivar.

Sadly, the Panama disease virtually wiped out the Gros Michel — which, just like the Cavendish, were all clones of one another. The Cavendish cultivar was bred specifically to be more resistant to certain pests and diseases. That being said, in the wild or on small independent farms, bananas have much, much greater genetic diversity. Hopefully, this will act as an insurance policy, so we never have to give up bananas.

Another unusual aspect of the banana is how surprisingly radioactive it is. Large batches have been known, for example, to trigger sensors meant to identify smuggled nuclear material. This comes down to their high content of potassium (which is a good thing). One isotope of this element, potassium-40, is naturally radioactive. But worry not — unless you plan to eat a few million bananas in one sitting, you’re not getting radiation poisoning. And, honestly, if you reach that point, radiation won’t be your main issue.
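The “few million bananas” figure is easy to sanity-check with a back-of-envelope calculation. The numbers below are common rough assumptions rather than figures from this article: the informal “banana equivalent dose” of about 0.1 microsieverts per banana, and roughly 1 sievert as an acute threshold for radiation sickness.

```python
# Back-of-envelope: how many bananas would it take to reach an acutely
# dangerous radiation dose? Both constants are rough assumptions, not
# figures from the article.
DOSE_PER_BANANA_SV = 0.1e-6   # ~0.1 microsieverts ("banana equivalent dose")
ACUTE_SICKNESS_SV = 1.0       # rough threshold for acute radiation sickness

bananas_needed = ACUTE_SICKNESS_SV / DOSE_PER_BANANA_SV
print(f"{bananas_needed:,.0f} bananas in one sitting")  # 10,000,000
```

Under these assumptions, the answer is on the order of ten million bananas, consistent with the article’s “few million” ballpark. In reality, the body regulates its potassium level, so the dose from eating bananas doesn’t even accumulate; the point is only one of scale.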

Today, bananas are among the most cultivated plants out there, being the 4th biggest crop worldwide. In 2019, around 117 million metric tons of this yellow fruit were produced across the world, with India being the single largest grower.


Watermelons

Although their name implies the existence of earth-, fire-, and air-melons, so far we’ve only encountered watermelons.

Image credits Pete Linforth.

But boy oh boy are we happy we did. Watermelons are one of the most popular fruits on Earth in terms of how much is eaten, where it’s enjoyed, and how long it’s been enjoyed. Originally an African species, watermelons are part of the Cucurbitaceae family and closely related to the cucumber, squash, zucchini, and gourds. Biologically speaking, despite its looks, the watermelon is, again, a berry.

Our earliest evidence of watermelon farming comes from ancient Egypt, around 4,000 to 5,000 years ago. Seeds of various cultivars have even been found buried with the Pharaohs, which showcases just how popular and appreciated these fruits were even back then.

It quickly spread to any and all areas with a favorable climate. By the 7th century, watermelons had reached India, and by the 10th century, China. Between the 10th and 12th centuries, the fruit was also introduced to Europe, mainly by Muslim peoples from Northern Africa, and it became quite common there by the 17th century. From here, it made its way to the New World; Native Americans are documented to have grown watermelons in the Mississippi and Florida areas in the 17th century. Pacific Islanders also eagerly adopted the crop when European explorers first reached them.

Why would explorers have watermelons on them? Well, with a water content that can reach up to 92% by weight, they make excellent canteens, especially on long voyages where wooden barrels holding drinking water would routinely rot or become salty.

One especially interesting variety of watermelon is the seedless kind. While it’s tempting to assume that someone messed around with some watermelon DNA to produce them, that isn’t exactly the case. Seedless watermelons are actually produced by crossing a variety with 22 chromosomes with one that has 44 chromosomes. This results in an infertile, seedless hybrid, much like a mule.
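The cross behind seedless watermelons works out as simple chromosome arithmetic. This is a sketch of the standard textbook explanation, which goes slightly beyond what the article spells out:

```python
# Each parent contributes half its chromosomes to the hybrid.
diploid_parent = 22      # standard seeded watermelon
tetraploid_parent = 44   # chromosome-doubled variety

hybrid = diploid_parent // 2 + tetraploid_parent // 2
print(hybrid)  # 33

# With 33 chromosomes (three copies of each of 11), the sets can't pair
# up evenly during meiosis, so the hybrid is sterile and its fruit
# develops without viable seeds -- much like a mule.
```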

But if you get the variety with seeds, you can try your hand at breaking a world record: specifically, the seed-spitting world record. You’re trying to beat Jason Schayot who, according to the Guinness Book of World Records, spat watermelon seeds a distance of 75 feet 2 inches (22.91 meters) on August 12, 1995, at a seed-spitting festival in Georgetown, Texas. If you’d rather not spit, the seeds are actually edible, and quite nutritious.

In 2019, around 100.41 million metric tons of watermelon were grown worldwide, with China leading production.


Apples

The humble apple is iconic in European and Asian cultures and is one of the oldest domesticated fruits on the planet.

Image credits S. Hermann & F. Richter.

Since it’s been grown for so long and carried around by various groups of people, exactly where it originates is still a matter of some debate — but for now, the consensus is that the apple was born somewhere in central Asia. According to our best estimates, people found and first domesticated the apple around the Tian Shan mountain range between 4,000 and 10,000 years ago.

In those days, they most likely resembled crab apples in both appearance and taste. These are considerably smaller and less sweet than the apples you’re used to today, and can be quite sour and hard to bite into.

Apple trees today are mainly grown through grafting. Basically, this involves cutting the mid-upper parts of a growing tree, and attaching (grafting) an apple tree cutting on top. It’s quite like making a Frankenstein tree, and it’s not that hard to pull off if you know how to do it.

One interesting tidbit regarding the apple is that it often pops up in mythology as the ‘golden’ apple, usually for a hero to retrieve from some monster or another. Probably the earliest example of this (at least in Europe) comes from Greek mythology. But — and this is an important but — in Middle English, spoken until roughly the end of the 15th century, the word ‘apple’ was used to refer to any fruit (apart from berries), so ‘golden apples’ aren’t necessarily apples. That being said, other languages didn’t have this peculiarity, so the golden apples of Greek or Romanian myth were, indeed, apples.

Everybody here knows what apples are. Sweet, crunchy, juicy. They keep doctors away. We won’t dwell too much on them. However, there is one last tidbit I’d like to discuss here. You might have heard that apple seeds are toxic — they are. Apple seeds contain amygdalin, which is broken down into hydrogen cyanide during digestion. Hydrogen cyanide is a decidedly deadly compound. But there’s no need to panic if you’ve bitten into a seed or six — an adult would need to ingest between 150 and a few thousand apple seeds (depending on how crushed or chewed they are) to have any issues. And, if you don’t chew them at all, they just pass harmlessly through you.
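The “between 150 and a few thousand seeds” figure can be sanity-checked with some hedged arithmetic. The per-seed cyanide yield and the toxicity thresholds below are rough assumptions on my part, not numbers from this article:

```python
# Back-of-envelope: how many thoroughly chewed apple seeds before
# cyanide becomes a problem? Assumed figures (not from the article):
# ~0.2 mg of cyanide released per well-chewed seed (i.e. ~5 seeds per
# mg), and acute toxicity starting around 0.5-3.5 mg per kg body weight.
SEEDS_PER_MG_CYANIDE = 5   # assuming ~0.2 mg cyanide per chewed seed
BODY_WEIGHT_KG = 70        # assumed adult body weight

low = 0.5 * BODY_WEIGHT_KG * SEEDS_PER_MG_CYANIDE    # mildest harmful dose
high = 3.5 * BODY_WEIGHT_KG * SEEDS_PER_MG_CYANIDE   # severe dose
print(f"roughly {low:.0f} to {high:.0f} chewed seeds")  # roughly 175 to 1225
```

Under these assumptions, the danger zone for a 70 kg adult spans roughly 175 to 1,225 chewed seeds, in line with the article’s “150 to a few thousand” range; swallowing seeds whole releases essentially none of the cyanide.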

In 2019, global production of apples reached around 87.2 million metric tons, with China being the leading producer.

And now, in last place on this list, we have a bit of a tie!

Oranges and Grapes

Oranges — the common name for ‘sweet oranges’ — aren’t actually a naturally occurring fruit. They were developed by people, as a cross between the pomelo and the mandarin orange. Our earliest written evidence of the orange comes from around 300 BC, from Chinese literature.

Image via Pixabay.

Interestingly, despite its artificial origin, the sweet orange is the most cultivated fruit tree in the world, and accounts for most of the citrus production worldwide.

On the other hand, we have grapes. Unlike the sweet orange, these are naturally occurring fruits (berries, again). Grapes are believed to have originated in the Middle East, and we estimate they’ve been cultivated for a very long time now: between 6,000 and 8,000 years.

Needless to say, you can’t make wine without grapes. But that’s true in more ways than one: yeast, probably the first domesticated microorganism and one used since time immemorial to produce alcohol, lives on the skin of grapes. Perhaps unsurprisingly, then, our earliest evidence of wine-making hails from around 8,000 years ago in present-day Georgia (the country, not the US state). They didn’t waste any time getting to brewing, did they?

You can’t talk about any of the ancient European civilizations, nor ancient Egypt, without mentioning grapes and wine. The Phoenicians, Greeks, Romans, and people of Cyprus grew grapes for consumption and wine-making. Ancient Egyptians also grew the purple variety. These are pigmented with anthocyanins, a class of colored compounds that give red wines their incredible hues.

So, why are these fruits tied? Is it because they’re both tasty and a good base for drinks? No. Is it their bright colors? Their preference for warm climates? Not really. It’s just that, production-wise, they’re pretty much neck and neck.

In 2019, global production of oranges reached 78.7 million metric tons, while that of grapes was around 77.14 million metric tons. Brazil was the single largest producer of oranges that year, while China led the way on grapes.
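Pulling together the 2019 production figures quoted throughout this list, the full ranking can be reproduced in a few lines (figures in million metric tons, as given above):

```python
# 2019 production figures quoted in this article, in million metric tons.
production_mt = {
    "tomatoes": 181.0,
    "bananas": 117.0,
    "watermelons": 100.41,
    "apples": 87.2,
    "oranges": 78.7,
    "grapes": 77.14,
}

# Sort from most to least produced to recover the article's ranking.
ranking = sorted(production_mt, key=production_mt.get, reverse=True)
for rank, fruit in enumerate(ranking, start=1):
    print(f"{rank}. {fruit}: {production_mt[fruit]} Mt")
```

The near-tie shows up clearly at the bottom: oranges and grapes are separated by less than two million tonnes, while more than 100 million tonnes separate grapes from tomatoes at the top.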

Goodbye, pesticides? This new robot can kill 100,000 weeds per hour using lasers

Can lasers rid us of pesticides? According to Seattle-based company Carbon Robotics, they surely can. The company just presented a new generation of an “autonomous weeder,” a tractor-sized farming robot that uses cameras and lasers to kill weeds. And it’s already sold out.

Image credit: Carbon Robotics

Pesticides have become a common ally for farmers seeking to boost their production and rid their fields of unwanted weeds. But the widespread use of pesticides has turned them into a growing problem, as they can contaminate soil, water, turf, and other vegetation. They can kill weeds but are also harmful to other organisms such as birds, fish, and insects — sometimes, including humans.

That’s why many farmers are now moving away from the use of pesticides and are instead looking for alternative options. Consumers are also taking note, and demand for non-pesticide products is surging. While some move to organic production, others are embracing new technological developments. And this is where Carbon Robotics enters with its ground-breaking robots packed with lasers and cameras.

“AI and deep learning technology are creating efficiencies across a variety of industries and we’re excited to apply it to agriculture. Farmers, and others in the global food supply chain, are innovating now more than ever to keep the world fed,” Paul Mikesell, CEO of Carbon Robotics, said in a statement. “Our goal is to create tools to address weed management and elimination.”

The Autonomous Weeder safely and effectively drives through crop fields to identify, target, and eliminate weeds. Unlike other weeding technologies, the robots utilize high-power lasers to eradicate weeds through thermal energy, without disturbing the soil. This allows farmers to use fewer pesticides and reduce costs to remove unwanted plants.

The farming robot essentially looks like a large cube on wheels. As it drives itself down rows of crops, its 12 cameras scan the ground. An onboard computer, powered by AI, identifies weeds, and the robot’s carbon dioxide lasers then zap and kill the plants. It can eliminate more than 100,000 weeds per hour and weed 15 to 20 acres of crops in one day.
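The quoted rates imply an interesting weed density per acre, which is worth a rough sanity check. The hours of daily operation below are an assumption on my part, not a figure from Carbon Robotics:

```python
# Sanity check on the quoted throughput figures.
WEEDS_PER_HOUR = 100_000   # from the article
ACRES_PER_DAY = 15         # low end of the quoted 15-20 acres per day
HOURS_PER_DAY = 20         # assumed near-continuous operation (not stated)

weeds_per_day = WEEDS_PER_HOUR * HOURS_PER_DAY
weeds_per_acre = weeds_per_day / ACRES_PER_DAY
print(f"~{weeds_per_acre:,.0f} weeds cleared per acre")  # ~133,333
```

Under these assumptions, the robot would be clearing on the order of a hundred thousand weeds per acre, a plausible density for a heavily infested field; if it runs fewer hours, the implied density drops proportionally.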

“The potential with these new robots is the highest I’ve seen with any technology as a farmer. I expect the robots to go mainstream because of how effectively they address some of farming’s most critical issues, including the overuse of chemicals, process efficiency, and labor. These robots work with a variety of crops,” James Johnson, a Carzalia Valley farmer, said in a statement.

For Carbon Robotics, farmers who deploy the robots will likely see a significant increase in crop yield and quality. The lack of herbicides and soil disruption paves the way for a regenerative approach, which leads to healthy crops and higher yields. Overall costs also decline, as the robots enable farmers to reduce the costs of manual labor and of inputs such as pesticides and fertilizers.

Incorporating the Autonomous Weeder also represents an economical path to organic farming, Carbon Robotics argued. One of the largest obstacles to organic farming is cost-effective weed control. The farming robot provides a solution to weed management that doesn’t require herbicides or an increase in manual labor, helping farmers go organic.

Still, the rise of automation in agriculture has raised the alarm for workers who rely on agriculture as a main source of income. Edgar Franks, political director at the union Familias Unidas por La Justicia, based in Burlington, Washington, told the Seattle Times: “What’s going to happen to the workers who made the industry so profitable, all of a sudden to be kicked out?” Agriculture, like many fields, finds itself at a potentially defining crossroads.

For now, Carbon Robotics has sold its farming robots directly to farmers, mostly on the US West Coast. But it has already sold out for 2021, with new models for the 2022 growing season soon available for pre-order. The company said it plans to improve further on the software side, giving farmers more access to data and real-time information.

Consumers would pay more for sustainably produced food, study finds

If the environment isn’t being polluted and soils aren’t being damaged in its production, consumers in Finland would be OK to pay extra for food, according to a new study. Researchers found that 79% of households there are willing to pay extra for food produced using cropping diversification and other sustainable agricultural practices.

Image credit: Flickr / Andrew Fogg.

Monocultures are strongly linked to biodiversity loss around the world, and northern Europe is no exception. High-input practices, often associated with monocultures, have been found to cause soil degradation and nutrient leaching into water bodies, negatively affecting ecosystems such as rivers and lakes. For these reasons, soil organic matter content is gradually decreasing in Finnish croplands.

Cereal monocultures dominate in large parts of southern Finland despite many alternative crops being available for diversification of monocultures. The area under protein crops, oilseeds, potatoes, sugar beets, and other edible plants in Finland is relatively small due to limited domestic demand and excessive imports of protein feed for livestock.

Researchers affiliated with the Diverfarming project, which encourages crop diversification, focused on the value of such practices in southern Finland, the prime crop production region in the country. They then quantified consumers’ willingness to pay for these items, after explaining the benefits of a larger diversity of cultivation practices and crop rotation.

Diversified farming practices under low-input and organic systems sustain and supply multiple agroecosystem services, thus reducing their environmental strain and the need for off-farm inputs. They can improve the resilience of cropping systems to multiple environmental stresses and thus make food production more stable.

Despite the importance of agroecological ecosystem services (these are things ecosystems do to support the growth of crops, such as nutrient recycling, that we don’t need to pay for), their total value is not currently included in the prices of food and agricultural products. There are few studies in Finland focusing on the non-market value of these services, but none is specifically targeted to the benefits of cropping diversification.

The researchers presented three valuation scenarios to a sample of 600 consumers. The first one focused on agroecosystem services on cropland, the second on wider socio-cultural effects, and the third was a combination of both. They found most consumers would be willing to pay $270 per household per year for crop diversification.

“Positive societal implications of cropping diversification were valued slightly higher than direct field-level effects of diversification. In particular, improved maintenance of domestic food production and processing, reduced nutrient runoffs from agriculture, maintained food culture and tradition, as well as improved carbon balance of agriculture and the number of jobs in rural areas were valued high,” the researchers wrote.

As demand for food and fiber increases, fueled by growing populations, rising incomes, and global integration, the negative effects of conventional agriculture increase in scale. A key question is then how society can motivate farmers to reduce negative side effects while meeting the demand for agricultural production.

For the researchers, the high willingness to pay for more sustainable agriculture represents an important message for policy makers and other key actors in the food chain. While agroecology has to be further developed, a higher contribution by consumers can also likely fund future transition towards more sustainable food production.

The study was published in the journal Environmental Management.

If we want to reduce global inequality, we could learn a thing or two from Mario Kart

For Boston University researcher Andrew Reid Bell, the popular Mario Kart is much more than a racing video game. In a new study, Bell argues that the principles of Mario Kart can serve as a useful guide for creating more equitable and favorable social and economic programs for low-income farmers.

Image credit: Flickr / Yamashita Yoel

“Farming is an awful thing to have to do if you don’t want to be a farmer. You have to be an entrepreneur, you have to be an agronomist, put in a bunch of labor…and in so many parts of the world people are farmers because their parents are farmers and those are the assets and options they had,” Bell said in a media statement.

For Bell, agriculture was once a path to prosperity for the world’s poor, but that’s no longer the case. He traveled across several countries in southern Africa and found that small-scale farmers face many challenges; life is a perpetual uphill battle for them. New mechanisms for alleviating poverty are needed, and this is where the Mario Kart metaphor enters the stage.

In the game, when players drift to the back of the pack, they get powerful items such as stars and lightning bolts that can help them get back into the race, boosting those at the back while slowing the cars at the front. Players at the front can also get power-ups, such as bananas or green shells, but these are much less effective. The worse you’re doing in the race, the more likely you are to get a powerful bonus.

“In any room of professionals or decision-makers, anywhere in the world, someone or their kid plays Mario Kart,” Bell told Vice. “That makes it potentially powerful, because the same people who might launch the next social or environmental program are people who can relate to Mario Kart. It shows us this important social feedback mechanism that’s rare in practice.”

Of course, using the concept of rubber banding to help agricultural families and communities who are in need is much more complicated in the real world than in the game. Still, Bell is optimistic about the prospects. Governments could create a program through which a third party would pay farmers to adopt better agricultural practices – a concept known as Payments for Ecosystem Services (PES).

Ecosystems support plant and animal life by maintaining the overall balance in nature. When functioning well, ecosystems also bring multiple benefits to people. These benefits range from the provision of basic commodities, such as food and fuel, to spiritual benefits – for example, the visually pleasing landscapes that we all enjoy. PES can support farmers who take care of those services for everyone to enjoy.

Bell acknowledges that a big challenge would be finding companies willing to pay for ecosystem services and linking them with farmers who are open to changing their agricultural practices. The good news is that the more people participate in such programs, the more are likely to join – a concept that Bell calls “crowding in” in his paper.

He highlighted that mobile phone adoption has increased significantly across most of the developing world. This could help governments and organizations find individuals searching for a better livelihood through more sustainable agricultural practices. Still, access to mobile devices remains far from universal.

“So many of the things we do in practice—think, reinvesting profits in a business, paying for schools with local property taxes—are reinforcing loops that tend to increase gaps between groups, and it’s really helpful to have this shared, relatable gaming experience to build on,” Bell told Vice.

The study was published in the journal Nature Sustainability.

Climate change is reducing agricultural productivity

Researchers have quantified for the first time the impact of climate change on global agricultural productivity, and it’s bad: the sector is 21% down from where it could have been without the growing emissions. That’s the equivalent of losing about seven years of farm productivity increases since the 1960s, the researchers estimated.

Image credit: Flickr / StateOfIsrael

Enhancing agricultural productivity is vital not just for feeding the world, but also for lifting global living standards. Investments in agricultural research have boosted agricultural productivity in the past decades in several ways, but this has been distributed unequally across the world — with growing signs that progress is slowing down in certain regions.

Research to date on the historical impact of climate change on agricultural productivity has focused on yields of major cereal crops or on total GDP. While relevant, these measures have limited value for assessing overall productivity, for several reasons. Cereal crops only represent 20% of agriculture’s global net production value, for example. In other words, we’re still missing an important part of the picture. A new study aims to fill in some of that void.

Agriculture and climate change

Agriculture not only contributes to climate change but is also largely impacted by it. Changing rainfall patterns, extreme weather events, and higher average temperatures are already challenging farmers around the world. Even if some changes may be positive for some regions, most will be negative, especially in parts of the world already suffering from environmental or other changes.

A team of researchers from the University of Maryland, Cornell University, and Stanford University developed a model of weather effects on productivity, looking at productivity both in the presence and in the absence of climate change. They linked changes in weather and productivity measures with climate models over the last six decades.

“Our study suggests climate and weather-related factors have already had a large impact on agricultural productivity,” Robert Chambers, co-author of the research and professor at the University of Maryland, said in a statement. “We used the model in this paper to estimate what total factor productivity patterns would have looked like in the absence of climate change.”

The researchers calculated the total factor productivity of the agricultural sector, a measure used to gauge the growth of an industry. Agriculture is a unique industry, as not all the inputs that determine productivity are under the farmer’s control — the weather, for instance. They incorporated weather data as part of their analysis, bringing a new perspective to productivity data.

“Productivity is essentially a calculation of your inputs compared to your outputs, and in most industries, the only way to get growth is with new inputs,” Chambers said in a statement. “Agricultural productivity measurement hasn’t historically incorporated weather data, but we want to see the trends for these inputs that are out of the farmer’s control.”

The findings showed a 21% overall reduction in global agricultural productivity since 1961. The situation is much more severe in warmer regions: Africa saw a drop in productivity of 34%, followed by Latin America and the Caribbean at 25.9%. Cooler regions were less affected, such as North America (12.5%) and Europe and Central Asia (7.1%).

“It’s not what we can do, but it is where we are headed,” Chambers said in a statement. “This gives us an idea of trends to help see what to do in the future with new changes in the climate that are beyond what we’ve previously seen. We are projected to have almost 10 billion people to feed by 2050, so making sure our productivity is not just stable but growing faster than ever before is a serious concern.”

The study was published in the journal Nature Climate Change.

How much of our emissions come from agriculture?

Between a quarter and a third of all the emissions mankind is producing come from agriculture. Estimates vary, but they consistently land in the 25%-35% range, and a ten-percentage-point difference in global emissions is a huge deal. So where does this difference come from, and what can we do to reduce these emissions?

Reducing red meat is one of the most eco-friendly things you can do. It’s also healthy, and it’s not like we all need to go vegetarian: even small reductions can help.

Why so much greenhouse gas?

Although people are becoming increasingly aware of the environmental impact of their food, it can come as quite a shock to see just how much of our emissions are caused by what we eat. How is it that so much of global emissions comes from agriculture? Meat alone is responsible for more emissions than all the cars and planes in the world put together. So where does all that come from?

From planting a seed to having something served on a plate, our food undergoes quite the journey, and we don’t often think about everything it involves. Our food’s emissions can roughly be split into four categories:

  • Land use: even before a single calorie has been consumed, deforestation and land clearing can produce emissions. The drainage and burning of soils, and the degradation of peatlands and other carbon-rich soils also contribute.
  • Agricultural production: everything from fertilizer to fuel used for machines, methane from cows, burning of agricultural waste, etc.
  • Packaging and distribution: food processing, packaging, transport, and retail also produce a hefty chunk of emissions.
  • Cooking and waste: this part sometimes gets left out of studies, but cooking food and throwing it away can also produce substantial emissions.

Overall, this is what a breakdown of our food’s emissions would look like:

Why estimates differ

The chart above, compiled by the folks from Our World in Data, is based on a 2021 study by Crippa et al. Overall, the study found that a third of our total emissions comes from agriculture. It was a landmark study that clearly highlighted just how big of a role agriculture plays in the ongoing climate crisis, and how if we want to truly address the crisis, we need to look at more than just electric cars and renewable energy.

This was, at a basic level, not surprising at all. Previous studies have also warned that agriculture is a major contributor to emissions, and in general terms, the main takeaway message is the same. But beneath the takeaway message, why are the estimates different?

For instance, a 2018 study by Poore and Nemecek claimed that about a quarter of our emissions comes from agriculture, as opposed to a third, as per Crippa et al.

The difference between ‘a third of our emissions’ and ‘a quarter of our emissions’ may not seem like much, but it is huge. That gap is four times larger than the entire aviation industry, and about as much as India’s entire emissions. Going into the nuts and bolts of this difference may be unglamorous, but it’s what can help us better understand how to address the problem. So where do the differences come from?
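The size of that gap is easy to check with back-of-the-envelope arithmetic. The two constants below are assumed ballpark figures, not numbers from either study.

```python
# Ballpark global figures, assumed for illustration (not from either study):
GLOBAL_EMISSIONS_GT = 50.0  # total annual anthropogenic GHG emissions, Gt CO2e
AVIATION_GT = 1.0           # annual aviation emissions, Gt CO2e

# The gap between 'a third' and 'a quarter' of global emissions:
gap_gt = GLOBAL_EMISSIONS_GT * (1 / 3 - 1 / 4)
print(f"Gap: {gap_gt:.1f} Gt CO2e, about {gap_gt / AVIATION_GT:.0f}x aviation")
```

Roughly 4 gigatonnes of CO2-equivalent per year sit between the two estimates, which is why pinning down the sources of the discrepancy matters.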

For starters, Poore and Nemecek don’t always include cooking and post-consumer emissions. That alone is a big difference between the numbers, but not the only one. Poore and Nemecek only looked at food agriculture, whereas the other study also looked at non-edible agricultural products, like cotton and leather. Other differences also come from different estimates used, like for instance how much deforestation each study attributes to agriculture.

A comparison between the two studies would look like this:

So which is it? How much of our emissions actually comes from agriculture? Well, if you include all agriculture, not just food, it probably produces around a third of our emissions. If you only look at food, the figure is probably somewhere over 25%, because the 26% figure from Poore and Nemecek doesn’t include post-retailer emissions. Hannah Ritchie, Head of Research at Our World in Data, sums it up thusly:

“The amount of uncertainty in these estimates means it’s helpful to understand where the differences come from, and that they all fall within a reasonably narrow range. If someone asks me, my response is usually “around 25% to 30% from food. Around one-third if we include all agricultural products.””

Meat is a problem, eating local doesn’t help much

Being aware of the problem is important, but it can only do so much. At the end of the day, we also need solutions. When it comes to reducing agriculture emissions, meat seems like the first place to strike.

An important finding of the Poore and Nemecek study is that meat’s emissions are more than just direct emissions. For instance, crops grown for animal feed account for 6% of total food emissions, and land use for livestock accounts for 16%. In other words, 22% of food emissions were camouflaged under other categories. When you add it all up, livestock and fisheries make up more than half of our food’s emissions.

No matter how you look at it, this is a lot. A kilogram of beef emits 60 kilograms of greenhouse gases (CO2-equivalents), while a kilogram of peas, for instance, emits just 1 kilogram. Sure, meat can be very calorie-rich and high in protein, but its footprint is still disproportionate. Some meats are worse than others, but plant-based alternatives fare much better environmentally across the board.
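Using the per-kilogram figures cited above, the beef-to-peas comparison comes down to a one-line ratio:

```python
# Figures cited above: kg of CO2-equivalent emitted per kg of product
emissions_per_kg = {"beef": 60.0, "peas": 1.0}

ratio = emissions_per_kg["beef"] / emissions_per_kg["peas"]
print(f"Beef emits {ratio:.0f}x the greenhouse gas of peas, per kilogram")
```

Even generous adjustments for calorie or protein content leave beef far ahead of legumes in emissions intensity.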

The good, the bad, and the ugly

The world has pledged to do its best to keep the planet from heating more than 2 degrees Celsius over pre-industrial levels, and virtually every country on the planet has signed on to this pledge. The bad news is that we’re really not on course to do it. If current trends continue, we’re headed for disastrous warming.

By now, hopefully, it’s become clear that agriculture is a big part of this problem. To put it another way: we have an emissions budget, and a third of that budget goes to food. If we’re trying to cut expenses, it makes a lot of sense to look for cheaper food (read: less carbon-intensive food).

This is the good news: we know what needs to be done, and it’s already starting to happen. According to one recent report, Europe and the US are on track to reach “peak meat” by 2025, thanks especially to plant-based alternatives. It seems that as people pass a threshold of income and awareness, they start to shift to more plant foods — that’s great.

The ugly part is that only a small share of the world seems to have reached that threshold, and before countries reach it, meat consumption actually grows. Simply put, highly developed countries are starting to eat less meat, while the rest of the world eats more and more as it becomes richer.

Overall, meat consumption is growing worldwide, especially in Asia.

There are, of course, other things that can be done. Reducing deforestation is one; using fertilizers more sustainably is another. On-farm renewable energy and electric tractors will also help, as will paying more attention to crop rotation and sustainable agricultural practices that keep the soil healthy and prevent erosion. As consumers, though, we have little control over that, other than choosing producers who implement sustainable practices.

As consumers, the only real power we’ve got is what we choose to eat. Sometimes, carbon-intensive food is cheaper, more accessible, or takes less time to cook. Understandably, it can be easier to simply not look at this side of things. But if we want to truly address the climate crisis, this is the type of thing we need to look at.

3 technologies poised to change food and the planet

Image credits: Jan Kopřiva.

Roughly 40 per cent of the Earth’s suitable land surface is used for cropland and grazing. The number of domestic animals far outweighs remaining wild populations. Every day, more primary forest falls against a tide of crops and pasture and each year an area as large as the United Kingdom is lost. If humanity is to have a hope of addressing climate change, we must reimagine farming.

COVID-19 has also exposed weaknesses in current food systems. Agricultural scientists have known for decades that farm labour can be exploitative and hard, so it should surprise no one that farm owners had trouble importing labour to keep farms running while struggling to keep food workers free from the virus.

Similarly, “just enough, just in time” food supply chains are efficient but offer little redundancy. And pushing farmland into the wilds connects humans with reservoirs of viruses that — when they enter the human population — prove devastating.

To address these challenges, new technologies promise a greener approach to food production and focus on more plant-based, year-round, local and intensive production. Done right, three technologies — vertical, cellular and precision agriculture — can remake the relationship to land and food.

Farm in a box

Vertical farming — the practice of growing food in stacked trays — isn’t new; innovators have been growing crops indoors since Roman times. What is new is the efficiency of LED lighting and advanced robotics that allow vertical farms today to produce 20 times more food on the same footprint as is possible in the field.

Currently, most vertical farms only produce greens, such as lettuce, herbs and microgreens, as they are quick and profitable, but within five years many more crops will be possible as the cost of lighting continues to fall and technology develops.

The controlled environments of vertical farms slash pesticide and herbicide use, can be carbon-neutral, and recycle water. For both cold and hot climates where field production of tender crops is difficult or impossible, vertical agriculture promises an end to expensive and environmentally intensive imports, such as berries, small fruits and avocados from regions such as California.

Cellular agriculture, or the science of producing animal products without animals, heralds even bigger change. In 2020 alone, hundreds of millions of dollars flowed into the sector, and in the past few months, the first products have come to market.

This includes Brave Robot “ice cream” that involves no cows and Eat Just’s limited release of “chicken” that never went cluck.

Precision agriculture is another big frontier. Soon self-driving tractors will use data to plant the right seed in the right place, and give each plant exactly the right amount of fertilizer, cutting down on energy, pollution and waste.

Taken together, vertical, cellular and precision farming should allow us to produce more food on less land and with fewer inputs. Ideally, we will be able to produce any crop, anywhere, any time of year, eliminating the need for long, vulnerable, energy intensive supply chains.

Is agriculture 2.0 ready?

Of course, these technologies are no panacea — no technology ever is. For one thing, while these technologies are maturing rapidly, they aren’t quite ready for mainstream deployment. Many remain too expensive for small- and medium-sized farms and may drive farm consolidation.

Some consumers and food theorists are cautious, wondering why we can’t produce our food the way our great-grandparents did. Critics of these agricultural technologies call for agri-ecological or regenerative farming that achieves sustainability through diversified, small-scale farms that feed local consumers. Regenerative agriculture is very promising, but it isn’t clear it will scale.

While these are serious considerations, there is no such thing as a one-size-fits-all approach to food security. For instance, alternative small-scale mixed-crop farms also suffer labour shortages and typically produce expensive food that is beyond the means of lower-income consumers. But it doesn’t have to be an “either/or” situation. There are benefits and drawbacks to all approaches and we cannot achieve our climate and food security goals without also embracing agricultural technology.

Agriculture’s hopeful future

By taking the best aspects of alternative agriculture (namely the commitment to sustainability and nutrition), the best aspects of conventional agriculture (the economic efficiency and the ability to scale) and novel technologies such as those described above, the world can embark on an agricultural revolution that — when combined with progressive policies around labour, nutrition, animal welfare and the environment — will produce abundant food while reducing agriculture’s footprint on the planet.

Read more: Diet resolutions: 6 things to know about eating less meat and more plant-based foods

This new approach to agriculture, a “closed-loop revolution,” is already blooming in fields (and labs) from advanced greenhouses of the Netherlands and the indoor fish farms of Singapore to the cellular agriculture companies of Silicon Valley.

Hydroponic cucumbers can be grown indoors with LED lights. (Lenore Newman)

Closed-loop farms use little pesticide, are land and energy efficient, and recycle water. They can allow for year-round local production, reduce repetitive hand labour, improve environmental outcomes and animal welfare. If these facilities are matched with good policy, then we should see the land not needed for farming be returned to nature as parks or wildlife refuges.

Today’s world was shaped by an agricultural revolution that began ten thousand years ago. This next revolution will be just as transformative. COVID-19 may have put the problems with our food system on the front page, but the long-term prospect for this ancient and vital industry is ultimately a good news story.

Authors: Lenore Newman, Canada Research Chair, Food Security and the Environment, University of the Fraser Valley and Evan Fraser, Director of the Arrell Food Institute and Professor in the Dept. of Geography, Environment and Geomatics, University of Guelph

This article is republished from The Conversation under a Creative Commons license. Read the original article.

It’s not just oil and coal. We need to tackle agriculture emissions too, study shows

A thorough inventory of the sector’s emissions underlined just how much agriculture contributes to our greenhouse gas emissions. If we want to avoid catastrophic damage, we’d be wise to address this, researchers say.

Image credit: Flickr / StateOfIsrael

Land-use and agriculture emissions are on the rise in most countries and this could cause the world to fail its climate targets, which could cause devastating damage for the entire planet.

Historically, human land use has affected the environment in multiple ways: it transformed and fragmented ecosystems, degraded biodiversity, disrupted carbon and nitrogen cycles, and added emissions to the atmosphere. But in contrast to fossil fuels, trends and drivers of emissions from land-use change haven’t been analyzed as thoroughly.

The first problem is complexity. Compared to fossil fuels, land-use emissions are more difficult to assess. They are spatially diffuse, temporally distributed (for example, emissions from a deforested area may occur over many years), and require substantially more data and disciplinary knowledge to estimate. They are also comparatively more difficult to avoid.

A group of researchers from the University of California carried out a country-level analysis of trends in global land-use emissions in 1961–2017 and their demographic, economic, and technical drivers. They used annual time-series data on population, crop and livestock production, land area harvested, and agricultural emissions.

“We estimated and attributed global land-use emissions among 229 countries and areas and 169 agricultural products,” lead author Chaopeng Hong said in a statement. “We looked into the processes responsible for higher or lower emissions and paid particularly close attention to trends in net CO2 emitted from changes in land use.”

Despite steady increases in population and agricultural production per capita, as well as smaller increases in emissions per land area used, land-use emissions remained relatively constant at about 11 gigatons CO2-equivalent per year until 2001, the study showed. This was mainly due to decreases in the land required per unit of agricultural production.

But it all changed after 2001. Driven by rising emissions per land area, emissions grew by 2.4 gigatons CO2-equivalent per decade, reaching 14.6 gigatons CO2-equivalent in 2017, the researchers found. That represents about 25% of total anthropogenic emissions, making agriculture one of the largest contributors to global emissions.
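As a back-of-the-envelope check, the reported figures are roughly consistent with a linear trend (the real trajectory was not exactly linear, so the estimate lands slightly above the reported total):

```python
# Figures from the study: ~11 Gt CO2e/year around 2001, growing ~2.4 Gt per decade
base_year, base_gt = 2001, 11.0
growth_per_decade = 2.4

est_2017 = base_gt + growth_per_decade * (2017 - base_year) / 10
print(f"Linear estimate for 2017: {est_2017:.1f} Gt CO2e")  # close to the reported 14.6
```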

Latin America, Southeast Asia, and sub-Saharan Africa are the three highest-emitting regions, accounting for 53% of global land-use emissions and more than two-thirds of global emissions growth over the period from 1961 to 2017. This is linked to cropland expansion and concomitant spikes in the emissions intensity of land use.

In Latin America, increases in emissions after 2000 reversed earlier long-term declines, which had brought the region’s emissions down to roughly 75% of 1961 levels by the 1990s. By contrast, emissions in Southeast Asia and sub-Saharan Africa have trended upwards throughout most of that period, driven by significant growth in production.

A meaty problem

The researchers also looked at different food groups and found some striking differences. Emissions per calorie of beef and other meat are 30 times greater than the average intensity of other products. Although these red meats supply just 1% of total calories produced worldwide, they account for 25% of total land-use emissions.

Between 1961 and 2017, beef production increased much less (+144%) than chicken and pork production (+483%), reflecting a widespread shift in the type of meat consumed. This shift reduced per capita meat emissions in 2017 by 44%, driving a 14% decline in per capita land-use emissions over the study period.

“While the situation in low-income countries is critical, mitigation opportunities in these places are large and clear,” senior author Steve Davis said in a statement. “Improving yields on already cultivated land can avoid clearing more carbon-dense forests for cultivation of soybeans, rice, maize and palm oil, thereby drastically reducing land-use emissions in these countries.”

The researchers argued countries can tackle the agricultural sector’s emissions by reducing food waste, improving soil quality, managing livestock waste better, and using more efficient tilling and harvesting methods. At the same time, dietary changes could also make a big difference, as previous studies have highlighted.

Recent research has also demonstrated some promising mitigation options, they added. For example, rice cultivars and non-continuous rice-paddy flooding practices may achieve substantial reductions in CH4 while also increasing yields, and dietary supplements for cattle have reduced methane emissions up to 95% in pilot studies.

The study was published in the journal Nature.

Despite good intentions, 5G might widen the gap between farmers

Mobile devices have revolutionized farming. When is it going to rain? Bring up an app. What are the grain prices? Bring up an app. Want to track your spraying? Bring up an app. While this technology has become an essential part of farming in first-world countries, those less fortunate could soon see the digital revolution pass them by, especially with the introduction of 5G.

Credit: Pixabay.

A new study out of the International Center for Tropical Agriculture (CIAT) has confirmed what many in farming are already experiencing: producers in developing countries are seeing a widening gap between themselves and their more technologically advanced counterparts.

Across many locations in sub-Saharan Africa, which has the potential to be a global breadbasket, fewer than 40% of farming households have internet access. Unlike Asia and Latin America, where mobile phone ownership is nearly universal, fewer than 70% of farmers in sub-Saharan Africa have handheld devices. Access to the 4G networks required to run more sophisticated apps stands at just nine percent.

“There’s an assumption that we’re going to be able to target everyone with these new technologies and everyone is going to be able to benefit,” said Zia Mehrabi, a scientist at the University of British Columbia who led the analysis published in Nature Sustainability, in a statement.

The study also showed major differences in mobile network service by farm size. Globally, only 24-37% of farms under one hectare had access to 3G or 4G networks, while availability reached 80% for farms over 200 hectares.

The researchers’ affordability analysis found that for many rural poor who do live in areas with coverage, getting connected could eat up the majority of their household budget.

“The study points to the need not only to expand coverage but vastly reduce the costs to make it affordable,” said Andy Jarvis, a co-author from the Alliance of Bioversity International and CIAT, in a statement. “We need to consider digital connectedness as a basic need, and design next-generation innovations to work in every corner of Africa.”

There are plans in the works to keep the schism from widening to extreme proportions, however. Probably the most notable is Elon Musk and SpaceX’s Starlink. The service aims to provide high-speed internet globally in a cost-effective manner by leveraging a constellation of several thousand satellites. It’s hailed by agricultural groups but vilified by astronomers, who say it will ruin the night sky for research.

“There’s a lot of 5G coming online. If access is not addressed at lower-end technologies, this is only going to aggravate the divide and create more inequality,” said Mehrabi.

The study included authors from the World Bank and the Helmholtz Centre for Environmental Research in Germany.

Climate warming is changing the US planting zones

The latest iteration of the plant hardiness map. The redder the area, the warmer its coldest winter temperatures. The map is based on the average annual minimum winter temperature, divided into 10-degree F zones. See a high-resolution version of the map here. Image credits: USDA.

As climate heating starts to take its toll more and more, it’s becoming increasingly clear that planting patterns are also affected by these changes — and many plants are struggling to adapt. They do this in several ways, but one of the more direct ways is by changing their range. Simply put, as the climate becomes hotter and hotter, plants “move” to the north in the US (conversely, south of the equator, they migrate southward).

The most important factor for plants is the coldest winter temperature — this is crucial for the plants’ survival. According to a USDA study, the average coldest temperatures of 1989-2018 are more than 3°F warmer for the average city compared to the 1951-1980 baseline. Temperatures have significantly increased at almost all of the 244 stations analyzed. A warming climate shifts the natural ranges of plants all around the country and farmers and gardeners need to consider the ‘new normal’, the USDA urges.
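The 10-degree bands behind the hardiness map follow a simple rule; here is a minimal sketch of that mapping (simplified: the real USDA map also splits each zone into 5°F “a”/“b” half-zones):

```python
def usda_hardiness_zone(avg_min_temp_f):
    """Map an average annual minimum winter temperature (deg F) to a USDA zone.

    Zones are 10 deg F bands: zone 1 covers -60 to -50 F, zone 13 covers 60 to 70 F.
    """
    zone = int((avg_min_temp_f + 60) // 10) + 1
    return max(1, min(zone, 13))  # clamp to the map's 13 zones

# A city whose coldest winters warm from -12 F to -8 F shifts from zone 5 to zone 6:
print(usda_hardiness_zone(-12), "->", usda_hardiness_zone(-8))  # 5 -> 6
```

A 3°F warming of the coldest winter temperature is thus enough to push many locations near a zone boundary into the next zone, which is exactly the shift gardeners are seeing.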

These findings are echoed by the Third National Climate Assessment, which summarizes the impacts of climate change on the United States.

“Landscapes and seascapes are changing rapidly, and species, including many iconic species, may disappear from regions where they have been prevalent or become extinct, altering some regions so much that their mix of plant and animal life will become almost unrecognizable,” the assessment reads.

“Timing of critical biological events, such as spring bud burst, emergence from overwintering, and the start of migrations, has shifted, leading to important impacts on species and habitats.”

This is important to consider not only for gardeners but also for urban and rural planners. North Carolina Arboretum Director George Briggs says that people need to be climate-literate and make better decisions in the face of a shifting climate.

The National Oceanic and Atmospheric Administration (NOAA) also creates interactive plant hardiness maps which paint a similar picture.

“There is telling evidence that climate change is affecting plant life around the world and here at Longwood,” says Paul Redman, Director of Longwood Gardens in Pennsylvania. “Sharing the important work of NOAA with our staff, guests, and community is integral to our mission and continues Longwood Gardens’ commitment to environmental stewardship.”

In the grand scheme of things, it is yet another reminder that climate heating affects us in many (and often indirect) ways. It is a problem unfolding now, and that we need to address as soon as possible.

To find out your area’s plant hardiness zone or see the distribution of planting zones in your state, check out the USDA service here.

Intensive farming increases the risk of epidemics

Low genetic diversity and the increasing use of antibiotics increase the likelihood of pathogens becoming a major health risk.

Image credits: Johny Goerend.

Agriculture has been one of the most impactful inventions in human history, but it has changed dramatically in modern times.

Modern practices affect the entire planet, changing the distribution of animal species, triggering soil erosion, and producing a hefty amount of greenhouse gas emissions and pollution. The impact of agriculture isn’t just macroscopic, either — it’s also microscopic.

Previous research has shown that agricultural systems are highly conducive to the emergence and spread of pathogens, and intensive agriculture can increase the risk of zoonotic pathogens. A new study adds new evidence to that concern.

The study focused on the evolution of Campylobacter jejuni. C. jejuni is one of the most common causes of food poisoning in Europe and the Americas. The pathogen can cause bloody diarrhea in humans and is generally transferred from eating contaminated meat and poultry. Although it’s not as dangerous as typhoid or cholera, it can still cause serious illnesses, particularly in patients suffering from underlying health conditions.

Most cases occur as isolated events, not outbreaks, but around 1 in 7 people suffer an infection at some point in their life. It’s estimated that 20% of all cattle spread the pathogen through their faeces, and the bug is also highly resistant to antibiotics, due to the antibiotics used in farming.

In the new study, researchers analyzed the genetic evolution of the pathogen, finding that cattle-specific strains started emerging at the same time as cattle numbers increased in the 20th century — and intensive farming became a thing. According to the study conclusions, intensive agriculture brought changes in cattle diet, anatomy, and physiology — and these changes helped the bacterium to cross the species barrier, infecting humans.

Combine this with the increased movement of farm animals globally and you end up with a perfect gateway for the pathogen to move from farm animals to humans.

Professor Sam Sheppard from the Milner Centre for Evolution at the University of Bath, explains:

“There are an estimated 1.5 billion cattle on Earth, each producing around 30 kg of manure each day; if roughly 20 per cent of these are carrying Campylobacter, that amounts to a huge potential public health risk.”

This is not an isolated event — our interaction with animals (both farm animals and wild animals) can increase our risk of pathogen outbreaks. It is, perhaps, no coincidence that COVID-19 is also a zoonotic disease, potentially emerging as a result of our interaction with wildlife.

“Over the past few decades, there have been several viruses and pathogenic bacteria that have switched species from wild animals to humans: HIV started in monkeys; H5N1 came from birds; now Covid-19 is suspected to have come from bats.”

The results come with a warning: if we continue along the same lines, we are essentially encouraging pathogens to make the leap to humans, and the effects can cascade into long-term global health issues.

“Our work shows that environmental change and increased contact with farm animals has caused bacterial infections to cross over to humans too. I think this is a wake-up call to be more responsible about farming methods, so we can reduce the risk of outbreaks of problematic pathogens in the future.”

Professor Dave Kelly from the Department of Molecular Biology and Biotechnology at the University of Sheffield gives a similar warning:

“Human pathogens carried in animals are an increasing threat and our findings highlight how their adaptability can allow them to switch hosts and exploit intensive farming practices.”

Evangelos Mourkas et al., “Agricultural intensification and the evolution of host specialism in the enteric pathogen Campylobacter jejuni,” PNAS (2020). www.pnas.org/cgi/doi/10.1073/pnas.1917168117

Lack of irrigation water challenges farmers in the US

Agriculture is an important sector of the US economy. Crops, livestock, and seafood contribute more than $300 billion to the economy each year. But the sector is highly dependent on climate, which is already changing due to global warming.

Credit Flickr

Many farmers in the Western US rely on snowmelt to help irrigate their crops. But the timing and the availability of snowmelt could be severely altered because of climate change, according to a new study.

A team of researchers looked at monthly irrigation water demand and snowmelt runoff across global basins from 1985 to 2015, hoping to establish where irrigated agriculture has depended on snowmelt runoff in the past and how that might change with a higher temperature.

The next step was looking at the projected changes in snowmelt and rainfall runoff if the Earth warms by 2 or 4 degrees Celsius (about 3 ½ or 7 degrees Fahrenheit), which will potentially put snow-dependent basins at risk.

The findings showed many basins globally are at risk of not having enough water available at the right times for irrigation because of changes in snowmelt patterns. Of those most affected, two are the San Joaquin and Colorado river basins in the western United States.

“In many areas of the world, agriculture depends on snowmelt runoff happening at certain times and at certain magnitudes,” said Yue Qin, lead author of the study, in a statement. “But climate change is going to cause less snow and early melting in some basins, which could have profound effects on food production.”

Under a 4-degree Celsius warming scenario, the researchers project that the share of irrigation water demand met by snowmelt in the San Joaquin Basin decreases from 33 to 18%. In the Colorado Basin, the share of water demand met by snowmelt decreases from 38 to 23%. Other basins in which agriculture is at particular risk because of changes in snowmelt are located in southern Europe, western China, and Central Asia, the authors report.

Depending on the magnitude and the timing, rainfall runoff may be able to compensate for declines in snowmelt runoff in meeting irrigation water demand – but only for some basins. “In many basins, future changes in rainfall do not compensate for the lost snowmelt in crops’ growing seasons,” the study reads.

The researchers looked at the potential availability of reservoir storage and groundwater to help satisfy the additional irrigation need created by less snowmelt and early melting. In some basins, those additional requirements would pose great challenges in trying to make up for changing snowmelt patterns.

“Irrigation demands not met by rainfall or snowmelt currently already represent more than 40 percent of reservoir water storage in many Asian and North American basins,” Steve Davis, co-author, said. “And in a warming world, agriculture won’t be the only added demand on reservoirs and other alternative water supplies like groundwater.”

The study also examined which crops globally were most at risk because of snowmelt changes resulting from climate change. Findings showed that rice and cotton grown during the Northern Hemisphere summer, as well as wheat and managed grassland grown in spring, were particularly snow-dependent.

The results of the study could be used to prioritize and inform methods to minimize the impact of changing snowmelt on water supplies for agriculture, the researchers said. In some cases, policymakers may have to consider extra groundwater pumping and reservoir development.

The study was published in the journal Nature.

The Amazon was an early agricultural hotspot, new study shows

The Amazon is a vast region that spans eight rapidly developing countries in Latin America. Harboring half of the planet’s remaining tropical forest, it is one of the most important biodiversity hotspots in the world.

A forest island in the Amazon. Credit José Capriles

The Amazon is now severely affected by deforestation, both legal and illegal, mainly to make room for agriculture. While this human impact is often seen as something recent, a new study finds that some regions of the Amazon were profoundly altered by humans as far back as 10,000 years ago.

Crops were first domesticated in China, with the extensive use of rice; in the Middle East, with grains and pulses; in Mesoamerica, with beans and squash; and in the Andes, with potatoes and quinoa. But those weren’t the only regions that engaged in agriculture.

Everybody was kung-fu-farming

A study discovered a fifth global domestication area of early agriculture in southwestern Amazonia. Squash, manioc, and other edibles were used as garden plants during the early Holocene, over 10,000 years ago, modifying the landscape in the region.

“Our results confirm the Llanos de Moxos as a hotspot for early plant cultivation, and demonstrate that ever since their arrival, humans have caused a profound alteration of Amazonian landscapes, with lasting repercussions for habitat heterogeneity and species conservation,” the researchers wrote.

Located in Bolivia, the Llanos de Moxos is a savannah of approximately 48,700 square miles in southwestern Amazonia. It has a landscape dotted by earthworks, including raised fields, mounds, canals, and forest islands. The researchers looked at the forest islands located within the vast savannah for signs of early gardening.

“We basically mapped large sections of forest islands using remote sensing,” said José Capriles, assistant professor of anthropology, Penn State. “We hypothesized that the regularly shaped forest islands had anthropic origin.”

According to the researchers, there are more than 4,700 artificial forest islands in the Llanos de Moxos savannah. The team examined approximately 30 of these islands and showed that many likely served as human planting areas.

“Archaeological evidence for plant domestication is very poorly available, especially in Amazonia where the climate destroys most organic materials,” said Capriles. “There is no stone in this area because it is an alluvial plain (water deposited) and it is hard to find evidence of early hunter-gatherers.”

Amazonian crops

The researchers analyzed phytoliths, tiny mineral particles that form inside plants, from radiocarbon-dated samples taken from forest island archaeological excavations and sedimentary cores. The shape of the silica-based phytoliths depends on the plants in which they form, allowing archaeologists to identify the plants that were grown in the forest islands.

The team found evidence of manioc 10,350 years ago, and squash 10,250 years ago. Early maize appears 6,850 years ago. Manioc, squash, maize and other carbohydrate-rich foods such as sweet potato and peanuts probably made up the bulk of the diet in Llanos de Moxos, supplemented by fish and large herbivores.

Researchers have argued for many years that this area was a likely center of early plant domestication, because many important cultivars like manioc, squash, peanuts, and some varieties of chili pepper and beans are genetically very close to wild plants living in the region.

The data indicate that the earliest inhabitants of Southwestern Amazonia were not just hunter-gatherers, but engaged in plant cultivation in the early Holocene. The earliest people in the area may have arrived in the region already possessing a mixed economy.

“It’s interesting in that it confirms again that domestication begins at the start of the Holocene period, when we have this climate change that we see as we exit from the ice age,” said Umberto Lombardo, the study’s first author. “We entered this warm period, when all over the world at the same time, people start cultivating.”

The study was published in the journal Nature.