The Kingdom of Thailand wants to seal its commitment to green energy with its new hybrid solar-hydropower generation facility that covers a water reservoir in the northeast of the country.
The installation covers an immense 720,000 square meters of the reservoir’s surface and produces clean electricity around the clock: solar power during the day, hydropower at night. Christened the Sirindhorn dam farm, this is the “world’s largest floating hydro-solar farm”, and the first of 15 such farms planned to be built by Thailand by 2037. They are a linchpin in the kingdom’s pledge for carbon neutrality by 2050.
Floating towards the future
“We can claim that through 45 megawatts combined with hydropower and energy management system for solar and hydro powers, this is the first and biggest project in the world,” Electricity Generating Authority of Thailand (EGAT) deputy governor Prasertsak Cherngchawano told AFP.
At the 2021 United Nations Climate Change Conference (COP26), Thailand’s Prime Minister Prayut Chan-O-Cha officially announced his country’s goal of reaching carbon neutrality by 2050, and a net-zero greenhouse gas emissions target by 2065. Thailand also aims to produce 30% of its energy from renewables by 2037 as an interim goal.
The Sirindhorn dam farm project, which went into operation last October, is the cornerstone of that pledge. The farm contains over 144,000 solar cells and can output 45 MW of electricity. This is enough to reduce Thailand’s carbon dioxide emissions by an estimated 47,000 tons per year.
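As a rough plausibility check (not a figure from the article), the 45 MW and 47,000-ton numbers can be related with back-of-envelope arithmetic. The capacity factor below is an assumed typical value for tropical solar, not a reported one:

```python
# Back-of-envelope check of the Sirindhorn figures.
CAPACITY_MW = 45          # installed solar capacity, from the article
CO2_SAVED_T = 47_000      # estimated tons of CO2 avoided per year, from the article
CAPACITY_FACTOR = 0.18    # assumed typical for tropical solar; NOT from the article

# Average output times hours in a year gives annual energy.
annual_mwh = CAPACITY_MW * CAPACITY_FACTOR * 24 * 365

# Dividing the claimed CO2 savings by that energy gives the implied
# emission intensity of the grid power being displaced.
implied_t_per_mwh = CO2_SAVED_T / annual_mwh

print(f"Estimated annual output: {annual_mwh:,.0f} MWh")
print(f"Implied grid emission factor: {implied_t_per_mwh:.2f} t CO2/MWh")
```

The implied figure of roughly 0.66 t CO2 per MWh is in the right range for a fossil-heavy grid, so the article's two numbers are mutually consistent under this assumption.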
Thailand’s energy grid continues to rely heavily on fossil fuels; some 55% of the country’s power generation as of October last year was derived from such fuels, while only 11% came from renewable sources such as solar or hydropower, according to Thailand’s Energy Policy and Planning Office, a department of the ministry of energy. Still, projects such as Sirindhorn show that progress is being made.
The $35 million project took two years to build, with delays caused by the pandemic, which saw technicians falling sick and deliveries of solar panels repeatedly pushed back. EGAT plans to install floating hydro-solar farms at 15 more dams across Thailand by 2037, which would total an estimated 2,725 MW of power.
Currently, power generated at Sirindhorn is being distributed mainly to domestic and commercial users in the lower northeastern region of the country.
Thailand is also betting that its floating solar farms will be of interest to tourists. Sirindhorn comes with a 415-meter (1,360-foot) long “Nature Walkway” which gives a breathtaking view of the reservoir and the solar cells floating across its surface. Locals are already flocking to see the solar farm, and time will tell if international travelers will be drawn here as well.
Local communities report that with the solar floats installed, catches of fish in the reservoir have decreased — but they seem to be positive about it. State authorities say that the project will not affect agriculture, fishing, or other community activities in the long term, and are committed to taking any steps necessary towards this goal.
“The number of fish caught has reduced, so we have less income,” village headman Thongphon Mobmai, 64, told AFP. “But locals have to accept this mandate for community development envisioned by the state.”
“We’ve used only 0.2 to 0.3 percent of the dam’s surface area. People can make use of lands for agriculture, residency, and other purposes,” said EGAT’s Prasertsak.
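Prasertsak's percentage and the 720,000-square-meter figure quoted earlier can be cross-checked with simple arithmetic; this is a consistency check derived from the article's numbers, not reported data:

```python
# Implied reservoir surface area if 720,000 m² is 0.2-0.3% of it.
FARM_M2 = 720_000  # floating farm area, from the article

low_km2 = FARM_M2 / 0.003 / 1e6   # if the farm covers 0.3% of the reservoir
high_km2 = FARM_M2 / 0.002 / 1e6  # if it covers 0.2%

print(f"Implied reservoir surface: {low_km2:.0f}-{high_km2:.0f} km²")
```

An implied surface of 240-360 km² is the right order of magnitude for a large hydropower reservoir, so the two figures hang together.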
We often think of climate science as something that started only recently. The truth is that, like almost all fields of science, it started a long time ago. Advancing science is often a slow and tedious process, and climate science is not an exception. From the discovery of carbon dioxide until the most sophisticated climate models, it took a long time to get where we are.
Unfortunately, many scientists who played an important role in this climate journey are not given the credit they deserve. Take, for instance, Eunice Newton Foote.
Foote was born in 1819 in Connecticut, USA. She spent her childhood in New York and later attended classes in the Troy Female Seminary, a higher education institution just for women. She married Elisha Foote in 1841, and the couple was active in the suffragist and abolitionist movements. They participated in the “Women’s Rights Convention” and signed the “Declaration of Sentiments” in 1848.
Eunice was also an inventor and an “amateur” scientist, a brave endeavor in a time when women were scarcely allowed to participate in science. However, one of her discoveries turned out to be instrumental in the field of climate science.
Why do we need jackets in the mountains?
In 1856, Eunice conducted an experiment to explain why air at low altitudes is warmer than air in the mountains. Back then, scientists were not sure why, so she decided to test it. She published her results in the American Journal of Science and Arts.
Foote placed two cylinders under the Sun and later in the shade, each with a thermometer. She made sure both cylinders started at the same temperature. After three minutes, she measured the temperature in both situations.
She noticed that rarefied air didn’t heat up as much as dense air, which explains the difference between mountaintops and valleys. Later, she compared the influence of moisture with the same apparatus, adding calcium chloride to one cylinder to make sure it was dry. The cylinder with moist air ended up much warmer than the dry one. This was a first step towards explaining the processes in the atmosphere: water vapor is one of the greenhouse gasses that sustain life on Earth.
But that wasn’t all. Foote went further and studied the effect of carbon dioxide, which heated the air even more strongly. Eunice didn’t dwell on it at the time, but in her measurements the moist-air cylinder ran 6% warmer, while the carbon dioxide cylinder was 9% warmer.
Surprisingly, Eunice’s concluding paragraphs came with a simple deduction on how the atmosphere would respond to an increase in CO2. She predicted that adding more gas would lead to an increase in the temperature — which is pretty much what we know to be true now. In addition, she talked about the effect of carbon dioxide in the geological past, as scientists were already uncovering evidence that Earth’s climate was different back then.
We now know that during different geologic periods of the Earth, the climate was significantly warmer or colder. In fact, between the Permian and Triassic periods, the CO2 concentration was nearly 5 times higher than today’s, causing a 6ºC (10.8ºF) temperature increase.
Eunice Foote’s discovery made it to Scientific American in 1856, and her paper was presented by Joseph Henry at the Eighth Annual Meeting of the American Association for the Advancement of Science (AAAS). Henry also reported her findings in the New-York Daily Tribune, but stated they were not significant. Her study was mentioned in two European reports, and her name was largely ignored for over 100 years, until she finally received credit for her observations in 2011.
The credit for the discovery long went to John Tyndall, an Irish physicist. He published his findings in 1861, explaining how much radiation (heat) was absorbed and which kind of radiation it was: infrared. Tyndall was an “official” scientist: he had a doctorate and recognition from previous work, everything necessary to be respected.
But a few things draw the eye regarding Tyndall and Foote.
Dr Tyndall was part of the editorial team of a magazine that reprinted Foote’s work. It is possible he didn’t actually read the paper, or simply ignored it because it came from an American scientist (a common practice among European scientists back then) or because of her gender. But it’s also possible that he drew some inspiration from it, without citing it.
It should be said that Tyndall’s work was more advanced and precise. He had better resources and he was close to the newest discoveries in physics that could support his hypothesis. But the question of why Foote’s work took so long to be credited is hard to answer without going into misogyny.
Today, whenever a finding is published, even one made with a low-budget apparatus, the scientist responsible for the next advance on the topic needs to cite their colleague. A good example involves another important discovery by another female scientist: Edwin Hubble used Henrietta Swan Leavitt’s discovery of the relationship between the brightness and period of Cepheid variables. Her idea was part of the method for measuring galaxies’ velocities and distances that later proved the universe is expanding. Hubble said she deserved to share the Nobel Prize with him; unfortunately, she had already died by then.
It’s unfortunate that researchers like Foote don’t receive the recognition they deserve, but it’s encouraging that the scientific community is starting to finally recognize some of these pioneers. There’s plenty of work still left to be done.
Few places are as exposed to Russia’s oil and gas as the European Union (EU) in the wake of the invasion of Ukraine. The EU gets about 40% of its gas from Russia at a cost of over $110 million a day. Moving with surprising speed, the EU has now introduced a strategy to cut its reliance on this fuel source by two-thirds within a year — and this could mean a lot both economically and environmentally.
The REPowerEU plan hopes to make Europe independent of Russian fossil fuels by 2030, with initial efforts focused on gas. The roadmap proposes finding alternative supplies of gas in the next few months, as well as increasing energy efficiency and doubling down on renewable energy sources in the medium to longer term.
“We simply cannot rely on a supplier who explicitly threatens us. We need to act now to mitigate the impact of rising energy prices, diversify our gas supply for next winter and accelerate the clean energy transition,” Commission President Ursula von der Leyen said in a statement. “We’ll work swiftly to implement these ideas.”
The road ahead
The new proposal will make it a legal requirement for EU countries to maintain a minimum level of gas storage. The objective is to have gas stocks at 90% capacity by autumn, up from about 30% now. Discussions are already taking place with existing gas suppliers such as Norway and Algeria to increase flows and compensate for the crackdown on Russian gas. Environmentally, this alone won’t make a substantial difference, as only the source of the gas will change.
The Commission pictures ending reliance on all fossil fuels from Russia “well before” 2030. In the short term, gas would be imported from the US and Africa and some countries might have to increase the use of coal in the months ahead. While this will mean higher carbon emissions, the longer-term goal is a shift to renewable energy — which will make a difference environmentally.
Another area of focus for the EU in the coming months will be higher imports of Liquefied Natural Gas (LNG) from suppliers including the US, Qatar, and Australia. Germany has already announced plans for two new LNG terminals to increase supplies, which has raised concerns among experts over a longer dependency on fossil fuels.
Frans Timmermans, Executive Vice-President for the European Green Deal, called for a “dash into renewable energy at lightning speed,” as renewables are cheaper, cleaner, and a potentially endless source of energy. The Russian invasion shows the urgency of accelerating Europe’s transition to cleaner energy sources, Timmermans said.
As well as finding new gas supplies, the Commission argued the reliance on Russia will be eased because of new renewable energy projects that will soon come online. Countries should consider using the revenues they raised from the Emissions Trading Scheme, the world’s largest carbon market, to pay for further green energy sources, the Commission said. Solar energy will be a particular point of focus, with a 4-stage plan aimed at delivering 1TW by 2030:
Multiply rooftop PV development through mandatory solar on new buildings, bans on fossil-fuel boilers, and significant investment.
Facilitate utility-scale development by freezing grid connection fees, and mandating member states to identify suitable solar PV sites, aiming to fast-track developments.
Pave the way for smart solar and hybrid projects using dedicated funding.
Accelerate the deployment of EU solar PV manufacturing capacity with €1bn.
The proposal says renewable energy projects have to be fast-tracked, with a large potential in domestic rooftop solar power. Up to a quarter of the EU’s electricity consumption could be obtained from panels on buildings and farms, the Commission said – also calling for a large increase in the use of biogas, made from agricultural and food waste.
EU leaders will meet in Versailles, France, later this week to discuss the plan, which won’t be cheap and might lead to some dissenting voices. Meanwhile, campaigners are asking governments to ensure the poorest are protected. Europe is already facing an energy poverty crisis and no one should have to choose between heating and eating, the NGO Global Witness said in a statement.
Bees and other pollinators play a key role in ensuring a healthy ecosystem and are also critical to our food security. However, they are in decline in many parts of the world, hit hard by the loss of habitats and the widespread use of toxic pesticides.
In recent years, many of these pesticides have been banned due to pressure from researchers and environmental groups. But they can also come back.
A nasty comeback
Thiamethoxam is a pesticide belonging to a group known as neonicotinoids, which are widely used around the world. In 2018, the most toxic ones, including thiamethoxam, were banned from outdoor use in the EU and the UK amid a growing body of evidence of the harm they cause to bees and other pollinators.
When poisoned by these chemicals, bees experience paralysis of their flight muscles and a failure in the homing behavior of foragers — which means less food for the colony. A single exposure is already enough to cause significant damage, and thiamethoxam is increasingly regarded as a problematic pesticide that is best banned. Neonicotinoids in general can also cause environmental contamination, leaching into soil and water and affecting the entire ecosystem.
However, these pesticides continue to be used even where they are banned, as countries can grant an “emergency derogation” when there’s the danger of a virus that can’t be contained by any other “reasonable” means. The UK is the most recent example, allowing the use of thiamethoxam for sugar beet against the advice of its own government experts.
It’s not the first time something like this has happened. In January 2021, the UK also planned a special derogation for the pesticide to save sugar beet plants from the beet yellow virus. However, there were lower levels of disease than expected and it was announced that the conditions for emergency use had not been met. This time, things look to be different.
Environmental and health organizations grouped under The Pesticide Collaboration have launched a legal challenge. The UK government decision, even temporary, isn’t consistent with halting wildlife decline, they argue. Farmers should be supported to reduce the reliance on harmful chemicals, finding alternative solutions, they added.
The sugar beet crisis
Over half the sugar consumed in the UK comes from sugar beet grown in England. A large amount of land is put aside every year to satisfy the country’s sugar demand, but climate change is now causing problems for the crop. This has resulted in pressure from farming lobby groups for the government to allow the use of harmful pesticides.
Unfortunately, this winter is much warmer than normal, and scientific modeling predicts a 68% level of virus incidence, which means the threshold for the use of the pesticide has been met, a government statement reads.
“The decision to approve an emergency authorization was not taken lightly and based on robust scientific assessment. We evaluate the risks very carefully and only grant temporary emergency authorizations for restricted pesticides in special circumstances when strict requirements are met and there are no alternatives,” a UK government spokesperson said in a statement.
There are about 3,000 farmers who grow sugar beet in the UK, according to the National Farmers Union (NFU). Farmers will be banned from growing flowering plants for 32 months after the sugar beet crop to minimize the risk to bees. The NFU said in a statement that growers are relieved by the decision amid severe pest pressure across the country.
Campaigners argue only 5% of the pesticide actually reaches the crop, with the rest accumulating in the soil and causing a higher level of contamination than in pollen and nectar. This can then be a route of exposure for many organisms, including bee species that nest underground. It’s also absorbed by the roots of many plants visited by bees, such as wildflowers.
“Allowing a bee-harming pesticide back into our fields is totally at odds with ministers’ so-called green ambitions, not to mention directly against the recommendation of their own scientists. This decision comes just two months after the government enshrined in law a target to halt species loss by 2030,” Sandra Bell, campaigner at Friends of the Earth said in a statement.
Situations like this are more likely to emerge as environmental regulations become tighter and climate change also puts additional pressure on agriculture. It remains to be seen what other countries will do in the UK’s position.
Although humans make up only a tiny fraction of all life on the planet, our impact upon diversity and wildlife has been enormous. By some accounts, human activity is responsible for the loss of 80% of all wild animals and about 50% of all plants. Much of this loss was necessary to make way for farmed livestock for human consumption.
Just consider this fact: 70% of all birds on Earth are chickens and other poultry, whereas wild birds comprise a meager 30%. Were an alien archaeologist to visit our planet after humans went extinct, they would surely be staggered by the abundance of chicken fossils.
But before we became hooked on chicken eggs and hot wings, we most likely first started with geese.
Japanese archaeologists performing excavations at Tianluoshan, a Stone Age site dated between 7,000 and 5,500 years ago in China, found extensive evidence of goose domestication. They claim this is the earliest evidence of bird domestication reported thus far.
The team identified 232 goose bones, which paint a convincing picture that Tianluoshan may be the cradle of modern poultry.
First and foremost, the researchers performed radiocarbon dating on the bones themselves, rather than the sediments which covered the remains. This lends confidence that the goose bones are really as old as 7,000 years.
At least four bones belonged to juveniles no older than 16 weeks. This shows that they must have hatched at the site because it would have been impossible for them to fly in from somewhere else at their age. This is likely the case for the adult geese found there as well, given that wild geese don’t breed in the area today and probably didn’t 7,000 years ago either.
But, to be sure, the team led by Masaki Eda at Hokkaido University Museum in Sapporo, Japan, thoroughly broke down the chemical makeup of the ancient bones, showing the water they drank was local. The strikingly uniform size of the bred geese is also very indicative of captive breeding.
Although not by any means definitive, all of these lines of evidence converge to the same conclusion: geese were probably the first birds humans have domesticated, and this happened more than 7,000 years ago in China.
New Scientist reports that other studies have claimed that chickens were the first domesticated birds, as early as 10,000 years ago, also in avian-loving northern China. But the evidence, in this case, has proven contentious. Genetic analysis suggests chickens were domesticated from wild birds called red junglefowl, but these birds do not live that far north. Furthermore, the chicken bones weren’t directly dated. The firmest evidence of chicken domestication only appeared 5,000 years ago.
While most domestication research has focused on dogs and cattle, it’s refreshing to see new perspectives on the evolutionary history of poultry, upon which our food security depends so much.
For some time now, EU governments have been pushing for natural gas and nuclear energy as an essential part of the energy transition from carbon-intensive fossil fuels like coal and oil. But since Ukraine was invaded, Europe’s reliance on Russian gas has triggered a sudden push towards energy independence, mainly via renewables. It’s increasingly looking like Putin’s invasion may succeed in pushing Europe towards renewable energy.
In Germany, Chancellor Olaf Scholz said renewable energy is “crucial” for the EU’s energy security, and Finance Minister Christian Lindner called renewables “freedom energies.” Meanwhile, in France, Barbara Pompili, Minister for Ecological Transition, said that ending the dependency on fossil fuels, especially Russian ones, is essential.
In response, the Stand with Ukraine coalition, which groups hundreds of organizations including environmental groups like Greenpeace, said a ban on Russian energy imports would be step one on a path to ending fossil fuel production. They called for “bold steps” towards global decarbonization and for a transition to “clean and safe” renewables.
The EU imported 155 billion cubic meters of natural gas from Russia in 2021, almost half (45%) of its gas imports and nearly 40% of the total amount used, according to the International Energy Agency (IEA). But the war has largely disrupted this. Now, the European Commission is expected to present an updated energy strategy, which will likely give renewables a larger role.
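The IEA figures above also imply the overall size of the EU's gas market. This is derived from the article's numbers rather than reported directly:

```python
# Working backwards from the Russian share to the EU totals.
RUSSIAN_GAS_BCM = 155  # Russian gas imported by the EU in 2021, from the article

total_imports_bcm = RUSSIAN_GAS_BCM / 0.45  # Russia was ~45% of gas imports
total_use_bcm = RUSSIAN_GAS_BCM / 0.40      # and ~40% of total gas used

print(f"Implied total EU gas imports: {total_imports_bcm:.0f} bcm")
print(f"Implied total EU gas consumption: {total_use_bcm:.0f} bcm")
```

In other words, the EU imported roughly 345 bcm of gas overall and consumed close to 390 bcm, which shows just how large a gap alternative suppliers and renewables would have to fill.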
The race to end this Russian dependence will likely require boosting imports from countries like the US and Qatar in the short term, and will likely lead to more domestic fossil fuel production. However, this doesn’t have to be the path ahead, climate experts argue, suggesting energy independence via clean energy such as solar and wind. The most likely option is a mixture between the two.
No more illusions
Europe has pledged to cut its greenhouse gas emissions by at least 55% by 2030, reaching net zero emissions by 2050. According to preliminary data, EU emissions dropped 10% from 2019 to 2020 – strongly related to the Covid-19 pandemic. By comparison, EU emissions declined 4% from 2018 to 2019. Despite being one of the more ambitious climate pledges around, it’s still nowhere near what is necessary if we want to avoid the worst of climate change effects.
If Europe wants to rid itself of Russian fossil fuels, it will need some sources of oil and gas in the meantime — but focusing on renewables is the smart long-term bet, researchers emphasize.
The argument that Europe could limit its dependence on Russian gas by focusing on local fossil fuel sources and importing liquid natural gas from the US is neither realistic nor cost-effective, according to the think tank Carbon Tracker. It would take decades to build new gas infrastructure and develop local deposits, meaning price pressures won’t be solved right away.
By contrast, solar and wind energy sources can be significantly scaled up as part of existing decarbonization policies. This would be more cost-effective because of the large drop in renewable energy prices. The think tank Wuppertal Institute released a study this week showing how heating in the EU could run completely on renewables by 2030 thanks to electric heat pumps.
Meanwhile, the IEA came up with a road map to help Europe in its energy transition. The plan would reduce the bloc’s dependence on Russian natural gas by one-third in just one year while delivering on the bloc’s climate pledges. It’s a collection of actions designed to diversify the energy supply, focused on renewables.
“Nobody is under any illusions anymore. Russia’s use of its natural gas resources as an economic and political weapon shows Europe needs to act quickly to be ready to face considerable uncertainty over Russian gas supplies next winter,” IEA Executive Director Fatih Birol said in a written statement announcing the plan.
The recommendations include not renewing gas supply contracts with Russia, which are due to expire at the end of the year; increasing biogas and biomethane supply; storing more gas to have a security buffer; accelerating the deployment of renewables; protecting vulnerable customers; and improving the energy grid’s reliability and flexibility.
In her new book, Kimberly Ridley pairs beautiful vintage illustrations with essays that detail the role of different phenomena in nature, from small to big organisms. Ridley, a science essayist and science writer, decided to celebrate nature’s most brilliant designers and builders in “Wild Design: Nature Architects”.
The book has eight chapters with unusual information on everything from beavers to fungi to birds. It’s packed with illustrations — paintings and drawings created by natural historians from the 17th to the 20th centuries. These allow the eye to focus on important features of the natural world, creating a sense of connection to the inner workings of the natural world.
In an interview with ZME Science, Ridley said she wrote the book as a love letter to the natural world and an invitation to readers to see nature with a new set of eyes, rekindling their sense of wonder. There are countless marvels surrounding us, Ridley said, but when we fail to notice them, we become disconnected from the living world.
“We often conflate wonder with naivete, but I think cultivating a sense of wonder is an important survival skill,” Ridley told ZME Science. “I wrote Wild Design to speak to that sense of wonder, which I find on my daily walks. I want to gently take readers by the hand and show them nature’s gorgeous and brilliant designs all around us.”
From the intricate weave of an oriole’s nest and the winged elegance of maple seeds to the ingenious “cases” of caddisfly larvae, which they meticulously construct from pebbles and sand, there are gorgeous and brilliant designs all around us, Ridley explains. “The more I thought about design in nature, the more curious I became,” she added.
The idea of the book originated from Ridley’s own curiosity, as she started to come up with questions regarding nature’s architects. She discovered many design wonders that are right under our noses. She wrote most of the book in her own backyard in Maine. “I set up a table, chair, and my laptop and got to work,” she says.
The role of illustrations
Ridley said she discussed several illustration possibilities with her editor, but that from early on she wanted to use natural history illustrations. This is for several reasons. First, she wanted Wild Design to feel like a miniature cabinet of illustrations. Second, because the illustrations are wonders themselves, created by hand and sometimes in the field.
“I wanted the visual narrative of this book to present a glimpse of the visual expression in the heyday of natural history exploration and discovery. These amazing works were central to scientific discovery, and introduced the public to the wonders of the living world,” Ridley told ZME Science. “I want this book to invite readers to slow down and observe.”
Ridley said creating this book opened her eyes wider and deepened her sense of wonder for the wild world around us. The book has helped her appreciate more deeply the interconnectedness of all life. Now, on her hikes along the rocky coast where she lives, she has a new appreciation of geology and is always looking for bird nests and admiring fungi.
Her own new experiences with nature are what she hopes happens with everyone who reads the book, which invites them to explore nature’s beauty, strangeness, and mystery in their own back yards or parks. Nature is a living library, Ridley concludes, a repository of knowledge that has accumulated through billions of years through evolution.
“Nature’s wild designers offer how-to manuals and encyclopedias for helping to solve human design challenges without creating pollution or trashing the neighborhood. So, I hope this small book inspires awareness on every level,” she added.
Governments and companies have a key role to play in preventing the worst effects of climate change — but we can also pitch in. Individuals can make a big difference, claims a new study, by implementing a simple six-step plan. If everyone followed this plan, it would account for a quarter of the emissions reduction needed to keep global warming down to 1.5ºC.
Last week, the Intergovernmental Panel on Climate Change (IPCC), comprised of the world’s leading climate scientists, said in a new report that the climate crisis is causing “dangerous and widespread” adverse impacts in nature and affecting the lives of billions of people. The situation is much worse than predicted in previous reports, and while we still have a chance to avoid the worst results, the window is closing quickly.
“This pioneering analysis ends once and for all the debate about whether citizens can have a role in protecting our earth. We don’t have time to wait for one group to act, we need action from all actors now,” Tom Bailey, co-founder of the campaign, said in a statement. “The JUMP is a grassroots movement that comes together to make practical changes.”
The good news is there is still plenty we can do.
Climate change and individual action
The research was carried out by academics at Leeds University and analyzed by the C40 network of world cities and the global engineering company Arup. It was published alongside the launch of a new climate movement to persuade and support well-off people to make “The Jump” and sign up to six pledges to reduce their emissions.
The study looked at the impact of consumption on greenhouse gas emissions. It showed that in order to avoid ecological breakdown, a 2/3 reduction in the greenhouse gas impact of consumption in rich countries is required within 10 years. This shift can be achieved through changes across key sectors such as buildings, energy, food, transport, appliances, trade, and textiles.
Citizens have primary influence over 25-27% of the changes needed by 2030 by making key lifestyle changes. In other words, we can’t control most of the changes that need to happen — but we can control some of them. Not everyone is equally responsible, though: higher-income groups must take faster and bigger action.
“This analysis shows the collective impact that individuals, and individual choices and action, can contribute to combating climate change,” Rachel Huxley, director of knowledge at C40 cities, said in a statement. “This is really important in showing that citizen action really does add up, and alongside government and private sector action, individuals can make a major contribution.”
The six actions
So, here are the six lifestyle changes everyone should take to address climate change:
Eat green: Combing reducing household food waste to zero and a shift to a mostly plant based diet, would deliver 12% of the total savings needed by North American and European countries.
Dress retro: By reducing the number new items of clothing to a target of three, maximum eight, delivering 6% of the total savings needed.
Holiday local: As close as is possible, reduce personal flights to one short-haul flight every three years, and one long-haul every eight years.
Travel fresh: For those who can, reducing vehicle ownership and if possible moving away from personal vehicle ownership, would deliver 2% of the total savings needed by 2030.
End clutter: Keeping electronics and appliances for at least seven years, optimising their lifetime, would deliver 3% of the total savings needed.
Change the system: To influence the remaining 73% of emissions, citizens can take action that encourages and supports industry and government to make the urgently needed, high-impact systemic changes. For instance, swapping to a green energy supplier, changing to a green pension, retrofitting our homes, or taking political action.
In the 1920s, researchers realized that you can add lead to gasoline to help keep car engines healthy for longer. But while leaded gasoline was good for cars, it was bad for humans.
Leaded gasoline is highly toxic. In addition to causing a range of health problems, lead can cross the blood-brain barrier and accumulate in parts of the brain, where it can, among other things, reduce intelligence. According to a new study, exposure to car exhaust from leaded gasoline affected the IQs of over 170 million Americans alive today, costing the country a collective 824 million IQ points.
The findings come from a new study published by Aaron Reuben, a PhD candidate in clinical psychology at Duke University, and Michael McFarland and Mathew Hauer, both professors of sociology at Florida State University. The researchers started from publicly available data on US childhood blood-lead levels and leaded-gasoline use. They then determined the likely lifelong burden of lead exposure of every American alive in 2015. From this, they calculated how much of an intelligence burden this exposure to lead proved to be. While IQ isn’t a perfect proxy for intelligence, it’s still a good population-level indicator.
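The aggregation logic behind this kind of estimate can be sketched with entirely hypothetical cohort numbers (the study’s actual data are far more granular): multiply each birth cohort’s exposed population by that cohort’s estimated mean IQ deficit, then sum across cohorts.

```python
# Back-of-envelope sketch of the cohort-aggregation idea, using made-up
# illustrative numbers (NOT the paper's actual data): for each birth cohort,
# multiply the number of people exposed above the clinical threshold by that
# cohort's estimated mean IQ deficit, then sum.

cohorts = {
    # birth decade: (people exposed, mean IQ points lost) -- hypothetical
    "1950s": (30_000_000, 4.0),
    "1960s": (40_000_000, 6.0),   # peak leaded-gasoline exposure
    "1970s": (45_000_000, 5.0),
    "1980s": (35_000_000, 2.5),
    "1990s": (20_000_000, 1.0),
}

total_points_lost = sum(n * deficit for n, deficit in cohorts.values())
total_exposed = sum(n for n, _ in cohorts.values())
print(f"Total IQ points lost: {total_points_lost:,.0f}")
print(f"Average per exposed person: {total_points_lost / total_exposed:.1f}")
```

The cohort structure matters: because exposure peaked in the 1960s-70s, the population total can exceed what a single flat per-person average would suggest.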
Previous studies have suggested an association between lead exposure in childhood and a drop in IQ. But when the results came in, even the researchers were surprised.
“I frankly was shocked,” McFarland said. “And when I look at the numbers, I’m still shocked even though I’m prepared for it.”
The results show that over half of all Americans (170 million out of an entire population of 330 million) had clinically significant levels of lead in their blood, resulting in lower IQ levels as adults, as well as a number of potential health problems (such as reduced brain size, greater likelihood of mental illness, and increased cardiovascular disease). The people affected by lead exposure would have each lost, on average, 3 IQ points.
“Lead is able to reach the bloodstream once it’s inhaled as dust, or ingested, or consumed in water,” Reuben said. “In the bloodstream, it’s able to pass into the brain through the blood-brain barrier, which is quite good at keeping a lot of toxicants and pathogens out of the brain, but not all of them.”
Three IQ points may not seem like much, but keep in mind that this is an average for a whopping 170 million people. At its worst, people born in the mid-late 1960s may have lost 6 IQ points on average. At a population level, this is a considerable margin — and even though leaded gasoline was banned in the US in 1996, the effects of the problem are still visible today.
“Millions of us are walking around with a history of lead exposure,” Reuben said. “It’s not like you got into a car accident and had a rotator cuff tear that heals and then you’re fine. It appears to be an insult carried in the body in different ways that we’re still trying to understand but that can have implications for life.”
Thankfully, the era of leaded gasoline is finally over. Most countries banned it two decades ago, but the last stocks were only used up in 2021, in Algeria, which had continued to produce leaded gasoline until July of that year.
Leaded gasoline is a good example of an exciting technology that turns out to be very bad for the environment and for human health. But while leaded gasoline has been phased out, there are plenty of other sources of pollution still affecting our brains, lungs, and hearts.
Journal Reference: “Half of US Population Exposed to Adverse Lead Levels in Early Childhood,” Michael J. McFarland, Matt E. Hauer, Aaron Reuben. Proceedings of the National Academy of Sciences, March 7, 2022. DOI: 10.1073/pnas.2118631119
A combination of climate change, deforestation, and fires has put immense strain on the Amazon basin — home to the single largest remaining tropical rainforest in the world, housing at least 10% of the world’s known biodiversity — since the early 2000s. A new study suggests that over three-quarters of the Amazon region is showing signs that rainforests may be nearing a tipping point, where they could turn into a savannah.
“There is a lot of discussion about the future of the Amazon rainforest and its tipping point. This comes from model studies that originally showed a fast loss of the Amazon rainforest. Since then there has been a lot of uncertainty about its future based on models not agreeing with each other, different future scenarios of climate change, etc. This leads us to look at the real world Amazon to actually see what is going on, and why wouldn’t you if the data is there? We use well-established indicators to measure the changing resilience of the forest, finding that 75% of the forest is losing resilience,” Chris Boulton, Associate Research Fellow at the University of Exeter in the UK, told ZME Science.
AR(1) values at each location are measured over time and approximate how much memory the forest has (how similar the forest is compared to how it was previously). Higher values suggest more memory, meaning the forest is responding more slowly to weather events and has lower resilience to them. Over the years, the increasing AR(1) values at individual locations, as well as the average behaviour over the region (shown by the time series), show that there has been a loss of resilience in the Amazon rainforest, particularly over the last 20 years. Credit: Boulton, et al.; Nature Climate Change
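The AR(1) indicator is simply the lag-1 autocorrelation of a vegetation time series: how strongly each value resembles the previous one. A minimal sketch on synthetic data (not the study’s satellite series) shows why a slow-recovering system scores higher:

```python
# Sketch of the AR(1) "memory" indicator used in resilience studies: the
# lag-1 autocorrelation of a (demeaned) time series. Values closer to 1 mean
# the system responds slowly to perturbations, i.e. it retains more "memory"
# and has lower resilience. Synthetic data only.
import numpy as np

def lag1_autocorr(x):
    """Lag-1 autocorrelation: correlation of the series with itself shifted by one step."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

rng = np.random.default_rng(42)

# A resilient system: fluctuations are mostly independent noise (low memory).
resilient = rng.normal(size=500)

# A low-resilience system: each state depends strongly on the last,
# x_t = 0.9 * x_{t-1} + noise, so perturbations decay slowly.
slow = np.zeros(500)
for t in range(1, 500):
    slow[t] = 0.9 * slow[t - 1] + rng.normal()

print(lag1_autocorr(resilient))  # near 0: fast recovery
print(lag1_autocorr(slow))       # near 0.9: slow recovery, low resilience
```

A rising AR(1) trend over the years, rather than any single value, is what signals declining resilience.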
Resilience, Boulton added, refers to an ecosystem’s ability to recover from strenuous events such as droughts. Monitoring ecosystem resilience is paramount because it can help determine the magnitude and timing of ecological interventions, such as environmental watering, as well as provide trajectories we can expect in highly disturbed ecosystems subject to ongoing change. And few regions across the world are under as much stress as the Amazon basin is currently experiencing.
Aggressive modern human economic invasion in the area over the past decades has supplanted once tropical foliage with roads, dams, cattle farms, and huge soy plantations. Adding insult to injury are the hundreds of wildfires that lit large chunks of the iconic rainforest up in flames. In 2020 alone, fires razed more than 19 million acres of the world’s largest tropical forest.
With the forest habitat shredded, many endemic species are under threat of extinction, their previous role being filled by often invasive animals. For instance, we’re seeing giant anteaters being replaced by rats and Brazil nut trees making way for weeds.
Using remote satellite sensing data, Boulton and colleagues modeled changes in the resilience of the Amazon rainforest between 1991 and 2016, coming to some stark conclusions. The analysis revealed that 75% of the Amazon has been steadily losing resilience since the early 2000s, which in simple terms means that the rainforests are finding it increasingly difficult to recover after a big drought or fire.
“I think the biggest challenge with this work was the amount of robustness checking that needed to be done. To have such a striking result, all of our coauthors had to be confident that what we were seeing stood up to various tests,” Boulton said.
These concerning developments suggest to the study’s authors that the Amazon may be approaching a critical threshold. Once crossed, key regions of the Amazon may irreversibly transition into a new state, from lush rainforests to savannas.
The loss of resilience is most prominent in areas that are closer to human activity, as well as in regions that receive less rainfall. That was to be expected. But what was particularly surprising was finding that the loss of resilience did not necessarily overlap with loss in forest cover. That’s worrisome because it suggests ecosystems that look to be doing well from above may actually be more vulnerable to changing their mean state than previously thought.
“On the surface, the Amazon may appear comfortable (by looking at the state of the forest), but you need indicators like the ones we use to really see its health. There is a section in the new IPCC report regarding the ‘committed response’ of the Amazon; that in the future, the Amazon may appear stable but the climate it is experiencing may not be good enough for it to survive. Because the forest overall responds slowly to change, it may have passed a tipping point without being realized from the outside,” Boulton said.
The study did not attempt to offer a timeline for this possible transformation of the rainforests. When such a threshold could be reached if things continue business as usual is a big enigma at this stage. But these alarming findings suggest that, if ecosystem resilience is any indication, the Amazon basin is heading towards this critical point of no return. Furthermore, the level of uncertainty is compounded by the many dependencies that characterize such a complex ecosystem like the Amazon.
“Losing part of the forest will also affect rainfall in other areas, which could create losses of resilience in areas where we do not see it at the moment. As for when, I think this is tough to answer, I am surprised to see these signals now over such a large area, and if others are too then it could give people a wake-up call to do something about it,” Boulton said.
Where biology and technology meet, evolutionary robotics is spawning automatons evolving in real-time and space. The basis of this field, evolutionary computing, sees robots possessing a virtual genome ‘mate’ to ‘reproduce’ improved offspring in response to complex, harsh environments.
Hard-bodied robots are now able to ‘give birth’
Robots have changed a lot over the past 30 years, already capable of replacing their human counterparts in some cases — in many ways, robots are already the backbone of commerce and industry. Performing a flurry of jobs and roles, they have been miniaturized, mounted, and molded into mammoth proportions to achieve feats way beyond human abilities. But what happens when unstable situations or environments call for robots never seen on earth before?
For instance, we may need robots to clean up a nuclear meltdown deemed unsafe for humans, explore an asteroid in orbit or terraform a distant planet. So how would we go about that?
Scientists could guess what the robot may need to do, running untold computer simulations based on realistic scenarios that the robot could be faced with. Then, armed with the results from the simulations, they can send the bots hurtling into uncharted darkness aboard a hundred-billion dollar machine, keeping their fingers crossed that their rigid designs will hold up for as long as needed.
But what if there was a better alternative? What if there was a type of artificial intelligence that could take lessons from evolution to generate robots that can adapt to their environment? It sounds like something from a sci-fi novel — but it’s exactly what a multi-institutional team in the UK is currently doing in a project called Autonomous Robot Evolution (ARE).
Remarkably, they’ve already created robots that can ‘mate’ and ‘reproduce’ progeny with no human input. What’s more, using the evolutionary theory of variation and selection, these robots can optimize their descendants depending on a set of activities over generations. If viable, this would be a way to produce robots that can autonomously adapt to unpredictable environments – their extended mechanical family changing along with their volatile surroundings.
“Robot evolution provides endless possibilities to tweak the system,” says evolutionary ecologist and ARE team member Jacintha Ellers. “We can come up with novel types of creatures and see how they perform under different selection pressures.” It offers a way to explore evolutionary principles by setting up an almost infinite number of “what if” questions.
What is evolutionary computation?
In computer science, evolutionary computation is a family of algorithms inspired by biological evolution, in which candidate solutions are generated and continually “evolved”. Each new generation removes the less desirable solutions and introduces small adaptive changes, or mutations, to produce a cyber version of survival of the fittest. It’s a way to mimic biological evolution, resulting in the best version of the robot for its current role and environment.
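The loop itself is simple enough to sketch in a few lines. Here is a toy version (not ARE’s actual software): candidates carry a “genome”, are scored on a task, and the fittest recombine and mutate to form the next generation. The stand-in task is the classic OneMax problem, maximizing the number of 1-bits in the genome.

```python
# Toy evolutionary loop: score, select, recombine, mutate, repeat.
import random

random.seed(0)
GENOME_LEN, POP_SIZE, GENERATIONS = 32, 40, 60

def fitness(genome):
    return sum(genome)  # stand-in for "how well did the robot do its job?"

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)  # single-point recombination
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.02):
    # Flip each bit with a small probability.
    return [bit ^ (random.random() < rate) for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    # Selection: only the top half get to reproduce.
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]
    # Reproduction: mutated offspring of randomly paired parents.
    population = [mutate(crossover(*random.sample(parents, 2)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(fitness(best))  # best score after evolution; close to GENOME_LEN
```

ARE replaces the bitstring with a genome encoding body parts and a controller, and the fitness function with performance on a physical task, but the select-recombine-mutate cycle is the same.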
Evolutionary robotics begins at ARE in a facility dubbed the EvoSphere, where newly assembled baby robots download an artificial genetic code that defines their bodies and brains. This is where two parent robots come together to mingle their virtual genomes to create improved young, incorporating both their genetic codes.
The newly evolved offspring is built autonomously via a 3D printer, after which a mechanical assembly arm translating the inherited virtual genomic code selects and attaches the specified sensors and means of locomotion from a bank of pre-built components. Finally, the artificial system wires up a Raspberry Pi computer acting as a brain to the sensors and motors – software is then downloaded from both parents to represent the evolved brain.
1. Artificial intelligence teaches newborn robots how to control their bodies
In most animal species, newborns undergo brain development and learning to fine-tune their motor control. This process is even more intense for these robotic infants due to breeding between different species. For example, a parent with wheels might procreate with another possessing a jointed leg, resulting in offspring with both types of locomotion.
But, the inherited brain may struggle to control the new body, so an algorithm is run as part of the learning stage to refine the brain over a few trials in a simplified environment. If the synthetic babies can master their new bodies, they can proceed to the next phase: testing.
2. Selection of the fittest: who can reproduce?
A specially built inert nuclear reactor housing is used by ARE for testing, where young robots must identify and clear radioactive waste while avoiding various obstacles. After completing the task, the system scores each robot according to its performance, which it then uses to determine who will be permitted to reproduce.
Software simulating reproduction then takes the virtual DNA of two parents and performs genetic recombination and mutation to generate a new robot, completing the ‘circuit of life.’ Parent robots can either remain in the population, have more children, or be recycled.
Evolutionary roboticist and ARE researcher Guszti Eiben says this sped-up evolution works because “robotic experiments can be conducted under controllable conditions and validated over many repetitions, something that is hard to achieve when working with biological organisms.”
3. Real-world robots can also mate in alternative cyberworlds
In her article for the New Scientist, Emma Hart, ARE member and professor of computational intelligence at Edinburgh Napier University, writes that by “working with real robots rather than simulations, we eliminate any reality gap. However, printing and assembling each new machine takes about 4 hours, depending on the complexity of its skeleton, so limits the speed at which a population can evolve. To address this drawback, we also study evolution in a parallel, virtual world.”
This parallel universe entails the creation of a digital version of every mechanical infant in a simulator once mating has occurred, which enables the ARE researchers to build and test new designs within seconds, identifying those that look workable.
Their cyber genomes can then be prioritized for fabrication into real-world robots, allowing virtual and physical robots to breed with each other, adding to the real-life gene pool created by the mating of two material automatons.
The dangers of self-evolving robots – how can we stay safe?
Even though this program is brimming with potential, Professor Hart cautions that progress is slow, and furthermore, there are long-term risks to the approach.
“In principle, the potential opportunities are great, but we also run the risk that things might get out of control, creating robots with unintended behaviors that could cause damage or even harm humans,” Hart says.
“We need to think about this now, while the technology is still being developed. Limiting the availability of materials from which to fabricate new robots provides one safeguard.” Therefore: “We could also anticipate unwanted behaviors by continually monitoring the evolved robots, then using that information to build analytical models to predict future problems. The most obvious and effective solution is to use a centralized reproduction system with a human overseer equipped with a kill switch.”
A world made better by robots evolving alongside us
Despite these concerns, she counters that even though some applications, such as interstellar travel, may seem years off, the ARE system may have more immediate uses. And as climate change reaches dangerous proportions, it is clear that robot manufacturers need to become greener. She proposes that they could reduce their ecological footprint by using the system to build novel robots from sustainable materials that operate at low energy levels and are easily repaired and recycled.
Hart concludes that these divergent progeny probably won’t look anything like the robots we see around us today, but that is where artificial evolution can help. Unrestrained by human cognition, computerized evolution can generate creative solutions we cannot even conceive of yet.
And it would appear these machines will now evolve us even further as we step back and hand them the reins of their own virtual lives. How this will affect the human race remains to be seen.
Countries from the European Union (EU) play a major role as suppliers and traders in the global shark trade, which is driving many species towards extinction, according to a new report. EU member states were the source of 45% of shark-fin-related products imported to Hong Kong, Singapore, and Taiwan in 2020, with Spain being the top exporter for fin trade.
Sharks are currently declining very fast on a global scale. One way humans hunt them is by using a practice called shark finning – the process of slicing off a fin and discarding the rest of the body, usually by throwing it back into the ocean, which leads to a slow and painful demise.
Fins are specifically targeted as they are used to make shark fin soup, considered a symbol of status in parts of Asia. Fishermen sometimes even prefer to practice shark finning instead of selling whole sharks in the market, as fins are much more valuable and fetch a good price for relatively little work.
Finning is having a big impact on shark populations worldwide. About 100 million sharks are killed globally every year, and many species, such as the scalloped hammerhead, are now at risk of extinction.
Population plunges don’t only affect sharks but also entire ecosystems, causing a ripple effect. For example, the decline of the smooth hammerhead causes their prey, rays, to increase. If there are more rays, they eat more scallops and clams, which provide valuable services for the entire ecosystem. Simply put, if you remove the top predators from the ecosystem, the entire ecosystem’s biodiversity is affected.
The role of EU countries
In a new report, the International Fund for Animal Welfare (IFAW) analyzed almost two decades of customs data in three Asian trading hubs from 2003 to 2020. While the main market for fin-related products is in Asia, EU countries – especially Spain, the Netherlands, France, Italy, and Portugal – are big suppliers to this legal market.
“Small or large, coastal or high seas, shark species are disappearing, with the piecemeal management efforts to date failing to stop their decline,” report co-author and IFAW’s EU manager Barbara Slee said in a statement. “The EU, demonstrated by our report to be a key player in global shark markets, has an important responsibility.”
Over 188,000 tons of shark fin products were imported by Singapore, Taiwan, and Hong Kong from 2003 to 2020, with the EU responsible for almost a third. Spain was the top source of imports with over 51,000 tons shipped from 2003 to 2020, an annual average of 2,877 tons, according to the report. Portugal ranked second with 642 tons.
Shark finning is banned in EU countries, but the landing and sale of whole sharks are permitted, except for species protected under the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). That’s why IFAW is now calling for all sharks to be listed under CITES, which would give them further protection.
Shark populations have been shown to recover when effective management is put in place, hence the importance of the CITES listing. If the EU took a leadership role in ensuring the accuracy of trade records and the enactment of sustainability requirements for sharks in trade, other players would follow suit, Barbara Slee added.
“Global shark declines are driven by international demand for shark fins and meat,” report co-author Stan Shea said in a statement. “Although many place the burden of change on the consumptive countries, primarily in Asia, equally responsible for declines in shark populations are all countries with internationally operating fishing fleets.”
When the pandemic hit and economies around the world went into lockdown, governments frequently promised to “build back better” or to carry out a “green new deal” once economies reopened. Turns out, it was mostly hot air.
Jonas Nahm, a researcher at Johns Hopkins School of Advanced International Studies, and his colleagues looked at national fiscal stimulus efforts for G20 economies between 1 January 2020 and 31 December 2021. The researchers chose these countries as they account for more than 80% of global emissions and 85% of global economic activity — these are the climate elephants in the room.
The 20 largest economies injected stimuli of at least US$14 trillion during that period — close to China’s annual gross domestic product, for comparison. While most of the money went toward shoring up healthcare systems, wages, and welfare, only 6% (or about $860 billion) went to areas that will cut emissions, such as installing renewable energy plants.
This green investment is smaller than what followed previous recessions, the researchers argued. After the global financial crisis of 2007-09, for example, 16% of global stimulus spending was directed at emissions cuts (about $520 billion). If a similar share had been committed today, the total would be about $2.2 trillion.
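A quick consistency check on the article's rounded figures bears this out:

```python
# Sketch using the article's rounded numbers: green share of the ~$14 trillion
# COVID-era G20 stimulus vs. the 16% green share seen after 2007-09.
total_stimulus = 14e12   # G20 fiscal stimulus, 2020-2021 (USD)
green_covid = 860e9      # portion directed at emissions cuts

covid_share = green_covid / total_stimulus
print(f"COVID-era green share: {covid_share:.1%}")  # roughly 6%

# If the post-2008 share (16%) had been matched this time around:
matched_green = 0.16 * total_stimulus
print(f"At 16%, green spending would be ${matched_green / 1e12:.2f} trillion")
```

The 16% counterfactual works out to about $2.24 trillion, matching the researchers' "$2.2 trillion" figure.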
So all in all, investments in renewables and other green infrastructure were severely lagging behind what was promised.
The study showed some governments did more than others. The EU and South Korea led the pack, as each dedicated more than 30% of their COVID-19 fiscal stimulus to emissions-reducing measures. Brazil, Germany, and Italy also spent over 20%. India, China, and South Africa were at the other extreme, focusing on fossil fuel spending.
Looking at the reasons behind this trend, Jonas Nahm told ZME Science governments were preoccupied with the pandemic and not as focused on making structural changes to the sources of growth in the economy. Lobbying by interest groups in the fossil fuel industry could be another reason. However, he argues further research is needed to fully answer why this happened.
The road ahead
There’s still time to improve, the researchers argued, highlighting a set of lessons governments can learn from their recovery efforts. First, they should apply environmental conditions to stimulus bills. It is cheap and effective. Attaching climate targets to corporate bailouts can shift sectors onto more sustainable trajectories.
Governments should also focus on recovery measures that have direct emissions impacts. This means accelerating public spending on renewables to reduce the use of fossil fuels and increase the energy efficiency of housing, as South Korea did. Or even investing in vehicle electrification, as Germany did by buying EVs for the government.
At the same time, the researchers believe governments should position their economies strategically to compete in a post-carbon world. This means focusing investments on low-carbon industries, building institutions to make economies more resilient to future shocks, and also helping fossil-based industries to transition.
“We hope that showing these aggregate numbers will highlight where we fall short and provide motivation to do things differently going forward. There are also many concrete policy lessons that can be learned from the things governments did do to reduce emissions, even if they didn’t amount to a sufficient response overall,” Nahm told ZME Science.
The study was published as a commentary piece in Nature.
Surface water, including lakes, canals, rivers, and streams, is a key resource for agriculture, industries, and domestic households. It’s quite literally essential to human activity. However, it’s also very susceptible to pollution, and cleaning it up is rarely easy. But we may have a new ally in this fight: nanobots.
According to the UN, 90% of sewage in developing countries is dumped untreated into water bodies. Industries are also to blame, as they dispose of between 300 and 400 megatons of polluted water in water bodies every year. Nitrate, used extensively by agriculture, is the most common pollutant currently found in groundwater aquifers.
Once these pollutants enter surface water, it’s very difficult and costly to remove them through conventional methods, and hence, they tend to remain in the water for a long time. Heavy metals have been detected in fish from rivers, posing risks to human health. Water pollution can also lead to massive disease outbreaks.
The use of nanotechnology in water treatment has recently gained wide attention and is being actively investigated. In water treatment, nanotechnology has three main applications: remediating and purifying polluted water, detecting pollution, and preventing it. This has led to a big demand lately for nanorobots with high sensitivity.
However, there’s a technical challenge. Most nanorobots use catalytic motors, which cause problems during their use. These catalytic motors are easily oxidized, which can restrict the lifespan and efficiency of nanorobots. This is where the new study comes in.
A new type of nanorobot
Martin Pumera, a researcher at the University of Chemistry and Technology in the Czech Republic, and his colleagues developed a new type of nanorobot, using a temperature-sensitive polymer material and iron oxide. The polymer acts like small hands that pick up and dispose of the pollutants, while the oxide makes the nanorobots magnetic.
The robots created by Pumera and his team are 200 nanometers wide (300 times thinner than a human hair) and are powered by magnetic fields, allowing the researchers to control their movement. Unlike other nanorobots out there, they don’t need any fuel to function and can be used more than once. This makes them sustainable and cost-effective.
In the study, the researchers showed that the uptake and release of pollutants in the surface water are regulated by temperature. At a low temperature of 5ºC, the robots scattered in the water. But when the temperature was raised to 25ºC, they aggregated and trapped any pollutants between them. The pollutant-laden robots can then be removed with the use of a magnet.
The nanorobots could eliminate about 65% of the arsenic in 100 minutes, based on the 10 tests done by the researchers for the study. Pumera told ZME Science that the technology is scalable, which is why he and his team are currently in conversations with wastewater treatment companies, hoping to move the system from bench to proof-of-concept solutions.
In general, scientists are very aware of the environmental footprint of their research. It’s a noble cause, but many labs consume vast amounts of plastics, generate waste, and emit greenhouse gases. Many such labs are trying to find ways to go green. In a new study, researchers in Ireland showed how this could be done — while also saving money.
Jane Kilcoyne and her colleagues at the Marine Institute in Ireland run a monitoring program for the detection of biotoxins in shellfish. Aware of labs being “resource-hungry workplaces” contributing to climate change, Kilcoyne told ZME Science they wanted to limit the impacts of their work on the environment while raising awareness overall.
The world’s scientific laboratory sector is massive. There are about 20,500 labs around the world that carry out medical, biological, or agricultural research. Most of them are big consumers of plastics. While the average person in the US consumes 106 kilograms of plastics per year, the average scientist uses 1,000 kilograms per year.
Labs also use large amounts of solvents for sample extraction and analysis, which could be treated and recycled to reduce costs and emissions. Paper consumption for printing is also high. This can translate into deforestation and pollution. Labs consume a lot of energy as well – five to ten times more energy per square meter than office buildings.
That’s why it is critical for labs to adopt good environmental practices. Many are acknowledging the need to operate in more sustainable ways and have already implemented changes to their working practices to reduce waste and energy consumption; University College London, for example, is set to be carbon neutral by 2030.
Tackling carbon footprint
With a team of seven staff members, the Marine Institute’s national monitoring program for the detection of biotoxins in shellfish implemented a set of sustainable practices in their laboratory, hoping to reduce the overall environmental footprint. As it turns out, it was a success, making the lab a much greener place than before.
They were able to reduce their consumption of single-use plastics by 69% thanks to a transition to more sustainable consumables. Recycling polystyrene (also used in the construction industry as insulation) and composting shellfish waste also meant that over 95% of the non-chemical waste generated by the laboratory was diverted from landfills.
The researchers could reduce their hazardous chemical waste by about 23% by extending expiry dates and only preparing what’s strictly needed for experiments. They also addressed their fume hood (which uses 3.5 times the energy of an average home) and reduced cold storage equipment energy consumption by 30% through improved management.
The actions implemented led to annual cost savings of about $17,000. But this isn’t the end of the road, as further sustainability efforts are still required, they argued. The team will continue working toward the ultimate goal of achieving green lab certification from My Green Lab, an NGO that promotes sustainability in science.
“The strategies adopted could be implemented in any laboratory. In fact, going green in any workplace setting is a win-win. Introducing more sustainable work practices into our monitoring program led to reduced environmental and financial costs, enhanced efficiencies, and boosted staff engagement,” Kilcoyne told ZME.
Despite some efforts to reduce its risks, the climate crisis is already causing “dangerous and widespread” adverse impacts in nature and affecting the lives of billions of people, according to a new landmark report on the climate crisis.
The situation is much worse than predicted in previous reports, and avoiding catastrophic damage will require far more decisive action.
The Intergovernmental Panel on Climate Change (IPCC), comprised of the world’s leading climate scientists, published a new report that updates the global knowledge of man-made global warming. Specifically, it goes deep into the growing impacts of the climate crisis and future risks if global emissions don’t drop further.
The report comes after an earlier publication by the IPCC last year when scientists concluded that major “unprecedented” changes were being seen – many of which were likely “irreversible.” Now, this second part focuses on how the changes to the climate are affecting people’s lives – including floods, heatwaves, and melting glaciers.
“This report is a dire warning about the consequences of inaction,” Hoesung Lee, Chair of the IPCC, said in a statement. “It shows that climate change is a grave and mounting threat to our wellbeing and a healthy planet. Our actions today will shape how people adapt and nature responds to increasing climate risks.”
All in all, the report reads like a gloomy prophecy.
We’re already in trouble
With the 1.1ºC of global warming we’re seeing now, climate change is already causing widespread disruption in every region of the planet, the IPCC said. Extreme heat, record floods, and crushing droughts threaten food security and livelihoods for millions of people. Since 2008, over 20 million people have been forced to leave their homes due to floods and storms.
Half of the global population currently faces water insecurity at least one month per year, a phenomenon driven by the climate crisis. Wildfires are affecting much larger areas than ever before in many parts of the world, while higher temperatures are enabling the spread of vector-borne diseases, such as malaria and Lyme disease.
“The science is now conclusive – and governments have endorsed this – we are in the era of unavoidable climate disasters causing loss and damage. Every fraction of a degree of warming will cause compounding and cascading climate impacts,” Harjeet Singh, Senior Adviser at Climate Action Network International, said in a statement.
People living in cities face higher risks of heat stress, lack of water, food shortages, and other impacts caused by climate change, according to the report. The fastest increase in vulnerability happened in informal settlements. This is especially problematic in sub-Saharan Africa, where about 60% of the urban population lives in these vulnerable areas.
Rural communities also face growing climate risks, especially indigenous people and those whose livelihoods depend on sectors exposed to the climate crisis. As climate change impacts worsen, many won’t have much choice but to move to urban centers. The IPCC projects that droughts across the Amazon basin will lead to rural migrations to cities.
Even if greenhouse gas emissions were drastically reduced today, the greenhouse gases already in the atmosphere and current emission trends make many big impacts unavoidable through 2040. In the next decade alone, climate change will drive between 32 million and 132 million more people into extreme poverty, according to the report.
“These reports are important as they can drive public policies of countries. But science is not being heard or respected. Governments only care about whether they are gaining power or money,” Gregorio Mirabal, head of COICA, an indigenous community umbrella organization, told ZME Science. “We are seeing the impacts of the climate crisis every day.”
Challenges for nature
The extent and magnitude of climate change impacts on nature are larger than previously expected, the IPCC said. Changes are happening faster and are more disruptive and widespread than what scientists expected. This adds to the other stressors faced by ecosystems, such as deforestation, pollution, and overfishing.
Climate change is currently destroying species and entire ecosystems. Animals such as the golden toad (Incilius periglenes) are going extinct due to the warming world, while others such as corals and seabirds are experiencing mass die-offs. Many species are also moving to higher latitudes and elevations to adapt to the higher temperatures.
Global warming of 2ºC by 2100 would mean an extinction risk for up to 18% of all species on land. If the world warms up to 4ºC, every second plant or animal species will be threatened. This is especially concerning for species living in high mountains or in polar regions, where the impacts of the climate crisis are unfolding much faster. But make no mistake: no place on Earth is spared.
Farmers, fishers, and other people who directly rely on nature’s services are experiencing severe effects. Even in a world with low greenhouse gas emissions (where global warming would reach 1.6ºC), 8% of today’s farmland will be climatically unsuitable by 2100. Under these conditions, fishermen in Africa could lose up to 41% of their yield.
“Drought and searing heat, ecosystem destruction, stronger storms and massive floods, species extinction – this is not a list of scenes in an apocalyptic film. Instead, it is the content of an authoritative scientific report detailing the climate impacts that are already wreaking havoc on our planet and its people,” Stephen Cornelius, WWF Global Lead for IPCC, said in a statement.
Today’s young people and future generations will witness stronger negative effects of climate change, the report goes on. Children aged ten or younger in 2020 will experience a nearly four-fold increase in extreme events under 1.5°C of global warming by 2100 and a five-fold increase under 3°C warming.
The share of the population exposed to deadly heat stress is projected to increase from today’s 30% to 48-76% by the end of the century, depending on future warming levels and location. Outdoor workers in parts of sub-Saharan Africa and South America will face a growing number of workdays with climatically stressful conditions.
Climate change will also further impact water quality and availability for hygiene, food production, and ecosystems due to floods and droughts. The IPCC estimates that between 800 million and three billion people will experience chronic water scarcity due to droughts at 2°C of warming, rising to four billion at 4ºC.
Children growing up in South America will face an increasing number of days with water scarcity and restricted water access, especially those living in cities and in rural areas that depend on water from glaciers. As the Andean glaciers and snowcaps shrink or disappear entirely, the amount of available water decreases.
The warmer it gets, the more difficult it will become to grow or produce, transport, distribute, buy, and store food – a trend that is projected to hit poor populations the hardest. Depending on future policies and the climate and adaptation actions taken, the number of people suffering from hunger in 2050 will range from 8 million to 80 million.
Multiple climate hazards will occur simultaneously more often in the future. They may reinforce each other and result in increased impacts and risks to nature and people that are more difficult to manage. For example, reductions in crop yields due to heat and drought, compounded by lower labor productivity caused by heat stress, will increase food prices and reduce incomes.
“This report presents a harrowing catalog of the immense suffering that climate change means for billions of people, now and for the decades to come. It’s the most hard-hitting compilation of climate science the world has ever seen. You can’t read it without feeling sick to your stomach,” Teresa Anderson, Climate Justice Lead at ActionAid International, said in a statement.
The importance of adaptation
National and local governments, as well as corporations and civil society, acknowledge the growing need for adaptation, the IPCC said, with at least 170 countries and cities already including adaptation in their policies and planning. Nevertheless, efforts are still largely incremental, reactive, and small scale, with most focusing on current impacts or near-term risks.
There’s a big gap between the necessary adaptation levels and what’s actually being done. The IPCC estimates that developing countries will need $127 billion per year by 2030 and $295 billion per year by 2050. At the moment, adaptation accounts for just 4% to 8% of climate finance, so there’s still a long way to go.
The good news is that existing adaptation policies can reduce climate risks – if funded properly and implemented faster. The report analyzes the feasibility, effectiveness, and potential of several adaptation measures. These include social programs that improve equity, ecosystem-based adaptation, and new technologies and infrastructure.
Climatic risks to people can also be lowered by strengthening nature, meaning that we invest in protecting nature and rebuilding ecosystems to benefit both people and biodiversity. Flood risk along rivers, for instance, can be reduced by restoring wetlands and other natural habitats in flood plains, by restoring natural courses of rivers, and by using trees to create shade.
“Different interests, values, and world views can be reconciled. By bringing together scientific and technological know-how as well as Indigenous and local knowledge, solutions will be more effective. Failure to achieve climate-resilient and sustainable development will result in a suboptimal future for people and nature,” IPCC co-chair Debra Roberts said in a statement.
The bottom line
The next few years will be crucial in terms of reaching a sustainable future for all. Changing course will need an immediate, ambitious, and organized response to cut emissions, build resilience, and conserve ecosystems. Governments, civil society, and the private sector have to step up. As the IPCC report makes clear, we have a window of opportunity, but that window is closing fast.
Unlike cats, which lack the ability to taste sweetness, dogs find chocolate just as appealing as humans. But while the dark treat can be a euphoric delight for us, it can be poisonous to canines.
That’s not to say that all dogs get poisoned by chocolate or that a candy bar is enough to necessarily kill your pet canine. The dose makes the poison. The weight of the dog also matters, so large canines should be able to handle a small amount of chocolate whereas smaller breeds might run into serious trouble.
Although you shouldn’t panic if your dog accidentally ingests chocolate, candy and other chocolate sweets should never be offered to dogs. Generally, you should treat chocolate as toxic to dogs and should make an effort to keep it away from them.
Why chocolate can be dangerous to dogs
Among the many chemical compounds found in dark chocolate and cocoa is theobromine. Formerly known as xantheose, theobromine is a bitter alkaloid compound that acts as a mild stimulant for the human body.
Because dogs can’t break down, or metabolize, theobromine as well as humans can, the compound is toxic to dogs above a certain threshold, which depends on their body weight.
Mild symptoms of chocolate toxicity occur when a canine consumes 20 mg of theobromine per kilogram of body weight. Cardiac symptoms occur at around 40 to 50 mg/kg, and dangerous seizures occur at doses greater than 60 mg/kg.
This explains why a candy bar may cause a chihuahua (average weight 2 kg) to run in circles while a Great Dane (average weight 70 kg) might feel just fine.
Darker, purer varieties of chocolate tend to be the most dangerous because they contain the highest concentration of theobromine. According to the USDA nutrient database, various chocolate/cocoa products contain the following amounts of theobromine per 100 grams:
Unsweetened cocoa powder: 2634 mg;
Baking chocolate (unsweetened): 1297 mg;
Dark chocolate (70% cocoa): 802 mg;
Mars Twix (twin bar): 39.9 mg;
White chocolate: 0 mg;
As a rule of thumb, chocolate poisoning in dogs generally occurs after the ingestion of 3.5g of dark chocolate for every 1kg they weigh, or 14g of milk chocolate for every kilogram.
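The arithmetic above can be put together into a rough estimator. This is a minimal sketch using only the figures quoted in this article (theobromine per 100 g and the mg/kg symptom thresholds); the function names are illustrative, and no such calculation is a substitute for calling a veterinarian.

```python
# Theobromine content per 100 g, from the USDA figures quoted above (mg).
THEOBROMINE_MG_PER_100G = {
    "cocoa_powder": 2634,
    "baking_chocolate": 1297,
    "dark_chocolate_70": 802,
    "twix_twin_bar": 39.9,
    "white_chocolate": 0,
}

def theobromine_dose_mg_per_kg(product: str, grams_eaten: float, dog_weight_kg: float) -> float:
    """Theobromine dose in mg per kg of the dog's body weight."""
    total_mg = THEOBROMINE_MG_PER_100G[product] * grams_eaten / 100
    return total_mg / dog_weight_kg

def toxicity_level(dose_mg_per_kg: float) -> str:
    """Map a dose onto the symptom thresholds quoted in the article."""
    if dose_mg_per_kg >= 60:
        return "seizure risk"
    if dose_mg_per_kg >= 40:
        return "cardiac symptoms"
    if dose_mg_per_kg >= 20:
        return "mild symptoms"
    return "likely below symptomatic threshold"

# A 100 g bar of 70% dark chocolate: 802 mg of theobromine.
# For a 2 kg chihuahua that is ~401 mg/kg (well past the seizure threshold);
# for a 70 kg Great Dane it is only ~11.5 mg/kg.
chihuahua_dose = theobromine_dose_mg_per_kg("dark_chocolate_70", 100, 2)
great_dane_dose = theobromine_dose_mg_per_kg("dark_chocolate_70", 100, 70)
```

This mirrors the chihuahua vs. Great Dane comparison made earlier: the same bar of chocolate is a medical emergency for one and a non-event for the other, purely because of body weight.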
Signs that your dog may be suffering from chocolate poisoning
Chocolate poisoning mainly affects the heart, central nervous system, and kidneys. The symptoms of theobromine toxicity usually appear within 6 to 12 hours after your dog eats too much chocolate and may last up to 72 hours. They include an elevated or abnormal heart rate and, in extreme cases, collapse and death.
Can chocolate kill dogs?
In short, yes. However, fatalities in dogs due to chocolate poisoning are very rare. According to the Veterinary Poisons Information Service in the U.K., out of 1,000 dog chocolate toxicity cases recorded in its database, only five dogs died.
What to do if your dog eats chocolate
If you caught your dog eating chocolate or you suspect this may have happened, it is best to call your veterinarian and ask for advice on how to proceed. Based on your dog’s size and the amount and kind of chocolate ingested, the veterinarian may recommend monitoring your dog for symptoms of poisoning or ask you to come to the clinic immediately.
If there are good reasons to believe potentially dangerous chocolate poisoning may be imminent, and as long as your pet consumed the chocolate less than two hours ago, the veterinarian may induce vomiting.
In very extreme cases of poisoning, the veterinarian might administer medications and/or intravenous fluids to provide additional treatment.
Keep chocolate away from dogs
There’s no reason to believe chocolate isn’t as tasty to dogs as it is to humans. Unfortunately, many dog owners are unaware that chocolate can poison their pets and offer chocolate snacks as a treat.
Usually, this isn’t a problem for very large breeds when they ingest small amounts of chocolate, but smaller dogs can suffer greatly and even die in extreme cases due to theobromine poisoning.
Once you are aware that chocolate can poison your pet, there is no excuse to leave sweets accessible. It is advisable to keep any chocolate items on a high shelf, preferably in a closed-door pantry. Guests and children should be kindly reminded that chocolate is bad for dogs and that they shouldn’t offer chocolate treats regardless of how much the pet begs for them.
Most chocolate poisoning in dogs occurs around major holidays such as Christmas, Easter, or Valentine’s Day, so these are times when you should be extra careful.
From tourism to research activities, humans are leaving a mark in Antarctica – and not a very good one. A new study found that black carbon pollution from human activities in Antarctica is likely increasing snowmelt by about 83 tons per visitor. The remote continent is already one of the places in the world most affected by man-made global warming, experiencing almost 3ºC (5.4 Fahrenheit) of warming in the past 50 years, much higher than the global average of 0.9ºC (1.6 Fahrenheit).
Every summer, tourists and scientists flock to Antarctica by boat and plane. What used to be a very remote continent is now becoming much more accessible. There are more than 70 research stations housing thousands of researchers. During the 2019-2020 season, the number of tourists reached 74,000, with most of them traveling by ship.
As you can imagine, this is leaving a physical mark with lasting consequences. While trash and human waste are flown or shipped off the continent for disposal, some forms of waste are not so easily removed. Virtually every activity in Antarctica burns fuel, and in doing so releases microscopic particles of what’s known as black carbon.
Black carbon is mostly produced during combustion in engines, wildfires, coal burning, and residential wood burning. While it stays in the atmosphere for a limited period of time, it can be transported regionally or intercontinentally. As a result, it has been found in snow samples in the Arctic, North America, the Andes, and Antarctica.
In a new study, researchers sampled the snow yearly between 2016 and 2020 at 28 sites in Antarctica – going from the Ellsworth Mountains to the continent’s northern tip. They focused on the Antarctic peninsula, as that’s where half of the research facilities are currently located and also where over 95% of the tourist trips are made.
“The black carbon footprint of local activities in Antarctica has likely increased as human presence in the continent has surged. Vessels, airplanes, diesel power plants, generators, helicopters, and trucks are all local black carbon-rich sources that affect snow several kilometers downwind,” the researchers wrote in the journal Nature.
Black carbon and snowmelt
In their study, the researchers analyzed the quantity and type of light-absorbing particles in snow samples. The samples were passed through filters and analyzed for their optical properties so as to identify the type of particulates. Antarctic snow contains many kinds of light-absorbing impurities, but in minuscule quantities.
All samples obtained near human settlements had black carbon levels above the usual Antarctic background – a sign of human emissions. High levels of black carbon reduce how much sunlight the snow reflects, a property known as albedo, and snow with a lower albedo melts faster. The black carbon content of the snow samples could then be used to estimate how much snowmelt increased due to human activity.
Human-produced black carbon could be causing surface snow to melt by up to 23 millimeters every summer. Looking at tourism specifically, the study found that every visitor between 2016 and 2020 effectively melted 83 tons of snow through emissions from cruise ships. Scientific activities also contribute their fair share due to the use of equipment and vehicles.
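Combining this per-visitor figure with the visitor numbers reported earlier gives a rough sense of scale. This is a back-of-envelope sketch only, assuming the 83-ton average applies across a full 74,000-visitor season like 2019-2020:

```python
# Figures quoted in this article.
snowmelt_per_visitor_tons = 83   # extra snowmelt attributed to each visitor
visitors_2019_2020 = 74_000      # tourists in the 2019-2020 season

# Naive season-wide estimate: 83 t/visitor x 74,000 visitors.
total_tons = snowmelt_per_visitor_tons * visitors_2019_2020
print(f"~{total_tons:,} tons of tourist-attributable snowmelt per season")
# -> ~6,142,000 tons
```

That is on the order of six million tons of snow in a single season from tourism alone, which helps explain the researchers’ call to limit visitor numbers and clean up ship fuels.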
Mechanisms to mitigate black carbon impacts are needed, the researchers argued. They called for global agencies to limit tourism while pushing for a faster transition to clean fuel and hybrid or electric ships. Simultaneously, the size and footprint of research facilities should be addressed by adopting renewable energy power plants and energy efficiency standards.
While genetics play a big role, many other factors have been speculated to cause attention deficit hyperactivity disorder (ADHD) – from eating too much sugar to watching TV. Now, researchers have found that high levels of air pollution and limited access to green areas can also increase the risk of developing the condition.
ADHD is a common neurodevelopmental condition in children, which sometimes continues in adulthood. It’s a complex condition, difficult to diagnose, and with no cure. If left unchecked, ADHD can impact children’s performance at school and their relationships with parents and peers. It’s more common in boys than girls and affects 1 in 20 children.
The disorder is generally diagnosed during the first years of school but it can manifest differently from child to child. Its cause, however, has been a subject of debate among researchers. In 2018, a study identified regions of the DNA associated with ADHD, for instance. But scientists have also been studying other factors, with no clear answers on many of them so far.
It seems like a lot of things could be responsible for ADHD, and the latest suspect is air pollution. According to previous research, it may cause ADHD through induced systemic oxidative stress, which disturbs brain development and leads to cognitive deficits. Noise exposure can also increase stress, which is associated with psychological disorders such as hyperactivity. However, results from previous research have so far been inconsistent or limited.
In a new study, researchers at the Barcelona Institute for Global Health (ISGlobal) looked at the links between environmental exposures (greenness, air pollution and noise) in early life and later ADHD incidence – using environmental exposure metrics in combination with a population-based birth cohort linked with administrative data.
“We observed that children living in greener neighborhoods with low air pollution had a substantially decreased risk of ADHD. This is an environmental inequality where, in turn, those children living in areas with higher pollution and less greenness face a disproportionally greater risk”, lead author Matilda van den Bosch said in a statement.
ADHD and air pollution
For the study, the researchers used birth data from the metropolitan area of Vancouver, Canada from 2000 to 2001 and also retrieved data on ADHD cases from hospital records. They estimated the percentage of green spaces in the participants’ neighborhoods as well as the levels of air and noise pollution, using exposure models.
The study identified a total of 1,217 ADHD cases, which represents 4.2% of the sampled population. The participants living in areas with a larger percentage of vegetation had a lower risk of ADHD. More specifically, the study showed that a 12% increase in vegetation was linked with a 10% drop in the risk of having ADHD.
The opposite association was observed with air pollution. Participants with higher exposure to PM2.5 (fine particulate matter) had a higher risk of ADHD. Specifically, every 2.1 microgram-per-cubic-meter increase in PM2.5 levels meant an 11% increase in the risk of ADHD. No link was found between noise pollution or NO2 and ADHD.
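To see how these two per-unit associations combine, here is a minimal sketch. It assumes a log-linear dose-response (a common epidemiological simplification that the study text above does not state) and treats the greenness and PM2.5 effects as independent; the function name is hypothetical and the numbers come only from the effect sizes quoted above.

```python
# Effect sizes quoted in the article (hedged: log-linear scaling is assumed).
RR_PER_12PCT_GREEN = 0.90   # 10% lower ADHD risk per +12% vegetation
RR_PER_2_1_UG_PM25 = 1.11   # 11% higher ADHD risk per +2.1 ug/m3 PM2.5

def approx_relative_risk(delta_green_pct: float, delta_pm25_ugm3: float) -> float:
    """Approximate combined relative risk of ADHD for a change in exposures,
    scaling each reported association log-linearly and multiplying them."""
    rr = RR_PER_12PCT_GREEN ** (delta_green_pct / 12.0)
    rr *= RR_PER_2_1_UG_PM25 ** (delta_pm25_ugm3 / 2.1)
    return rr

# +12% vegetation alone reproduces the reported 10% risk reduction (RR 0.90);
# +2.1 ug/m3 PM2.5 alone reproduces the reported 11% increase (RR 1.11).
```

Under this toy model the two exposures roughly cancel when both rise together, which is consistent with the attenuation the authors describe in the next paragraph, though the study itself estimated the effects jointly rather than by simple multiplication.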
“Our findings also show that the associations between PM2.5 and ADHD were attenuated by residential green space and vice versa as if the beneficial effects of vegetation and the harmful effects of PM2.5 neutralized each other,” Weiran Yuchi, a researcher at the University of British Columbia and first author of the study, said in a statement.
Until Antoni van Leeuwenhoek first discovered bacteria in 1676, we didn’t even have a name for these tiny microbes. Viruses, which are an order of magnitude smaller than bacteria and require even more powerful microscopes to observe, weren’t discovered until 1892 when Dmitry Ivanovsky isolated the tobacco mosaic virus. Imagine everyone’s surprise when scientists recently discovered a bacterium so large it’s visible to the naked eye.
This giant string-like bacterium, native to the Caribbean mangroves, can grow up to 2 centimeters in length, about the size of a fly. It’s about 5,000 times larger than most other microbes, stretching the limits of what we thought biologically possible for a single-celled organism.
Christened Thiomargarita magnifica by an international team of researchers, including scientists from the Lawrence Berkeley National Laboratory in the US and the CNRS in France, the newly discovered organism dwarfs other so-called “giant bacteria” by about 50-fold.
“All too often, bacteria are thought of as small, simple, ‘unevolved’ life forms—so-called ‘bags of proteins,’” Chris Greening, a microbiologist at Monash University, Clayton, who was not involved in the study, told Science. “But this bacterium shows this couldn’t be much further from the truth.”
The secret to T. magnifica‘s chunky size may lie in the arrangement of its genetic material, which is totally atypical. Bacteria and other single-cell microbes called archaea are classed as prokaryotes, while multicellular organisms like trees and humans are classed as eukaryotes. One of the defining differences between the two is that prokaryotes have free-floating DNA, while eukaryotes package their genetic code in a nucleus.
However, T. magnifica is blurring the lines between eukaryotes and prokaryotes because its huge genome is not free-floating inside its cell as in other bacteria. Instead, it’s encased in a membrane. When researchers sequenced the genome, they were amazed by its size: 11 million bases that line up to form 11,000 genes. For comparison, your run-of-the-mill bacterium only has about 4 million bases and about 3,900 genes.
“Importantly, we show that centimeter-long Thiomargarita filaments represent individual cells with genetic material and ribosomes compartmentalized into a novel type of membrane-bound organelle. Sequencing and analysis of genomes from five single cells revealed insights into distinct cell division and cell elongation mechanisms,” the researchers wrote in a paper that appeared in the preprint server bioRxiv. The findings haven’t been peer-reviewed yet.
These extraordinary findings suggest that the two major branches of life aren’t all that different after all — and T. magnifica could be a missing link that explains how complex life evolved from the most primitive single-celled organisms more than a billion years ago.
The giant bacterium’s DNA is encased in a sac embedded in its membrane. It has another much larger membrane sac that is filled with water and takes up 73% of the microbe’s volume. This external sac allows the organism to grow so large, while keeping its essential organelles packed in a small space to facilitate the diffusion of molecules in and out of the microbe.
“These unique cellular features likely allow the organism to grow to an unusually large size and circumvent some of the biophysical and bioenergetic limitations on growth,” said the researchers, who compared T. magnifica to other microbes called Large Sulfur Bacteria, which form very long filaments that can reach several centimeters in length. However, unlike T. magnifica, these sulfur bacteria are composed of thousands of individual cells, each no larger than 200 micrometers.
The origin of complex life is one of the most important, yet unanswered, questions in biology. Most bacteria are tiny and dull, but some are complex and feature innovative biological machinery. T. magnifica is a prime example of the latter, with its large genome, gigantic cell size, and its unheard-of compartmentalization of genetic material in its membrane.
“T. magnifica adds to the list of bacteria that have evolved a higher level of complexity. It is the first and only bacteria known to date to unambiguously segregate their genetic material in membrane-bound organelles in the manner of eukaryotes and therefore challenges our concept of a bacterial cell,” the researchers wrote.