As climate heating takes an ever greater toll, it’s becoming increasingly clear that planting patterns are shifting with it — and many plants are struggling to adapt. Plants respond in several ways, but one of the most direct is by changing their range. Simply put, as the climate gets hotter, plants in the US “move” north (conversely, south of the equator, they migrate southward).
The most important factor for plants is the coldest winter temperature, which is crucial to their survival. According to a USDA study, the average coldest temperatures of 1989–2018 are more than 3°F warmer for the average city than the 1951–1980 baseline. Temperatures have increased significantly at almost all of the 244 stations analyzed. A warming climate is shifting the natural ranges of plants all around the country, and farmers and gardeners need to adjust to the ‘new normal’, the USDA urges.
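The zones the USDA refers to are defined in fixed temperature bands: each full zone spans 10°F of average annual extreme minimum temperature, starting at -60°F for zone 1, and splits into a colder ‘a’ and a warmer ‘b’ half-zone of 5°F each. As a rough sketch — assuming a temperature within the charted range — the mapping looks like this:

```python
def hardiness_zone(avg_min_temp_f):
    """Map an average annual extreme minimum temperature (deg F)
    to a USDA hardiness zone label such as "6a".

    Zones are 10 deg F bands starting at -60 deg F (zone 1); each
    zone splits into a colder 'a' and warmer 'b' 5 deg F half-zone.
    """
    offset = avg_min_temp_f + 60        # degrees above the zone-1 floor
    zone = int(offset // 10) + 1        # 10 deg F per full zone
    half = "a" if offset % 10 < 5 else "b"
    return f"{zone}{half}"
```

So, for instance, a city whose average coldest night warmed from -12°F to -8°F would move from zone 5b to zone 6a — exactly the kind of shift the USDA study documents.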
“Landscapes and seascapes are changing rapidly, and species, including many iconic species, may disappear from regions where they have been prevalent or become extinct, altering some regions so much that their mix of plant and animal life will become almost unrecognizable,” the assessment reads.
“Timing of critical biological events, such as spring bud burst, emergence from overwintering, and the start of migrations, has shifted, leading to important impacts on species and habitats.”
This is important to consider not only for gardeners but also for urban and rural planners. North Carolina Arboretum Director George Briggs says that people need to be climate-literate and make better decisions in the face of a shifting climate.
“There is telling evidence that climate change is affecting plant life around the world and here at Longwood,” says Paul Redman, Director of Longwood Gardens in Pennsylvania. “Sharing the important work of NOAA with our staff, guests, and community is integral to our mission and continues Longwood Gardens’ commitment to environmental stewardship.”
In the grand scheme of things, it is yet another reminder that climate heating affects us in many (and often indirect) ways. It is a problem unfolding now, and that we need to address as soon as possible.
To find out your area’s plant hardiness zone, or to see the distribution of planting zones in your state, check out the USDA service here.
One day after President Trump announced that the US will stop funding the World Health Organization (WHO), its Director-General, Dr. Tedros Adhanom Ghebreyesus, said he “regrets the decision” and called again for global solidarity.
While Dr. Ghebreyesus held firm to his track record of tactfully reminding everyone of the importance of working together through the outbreak, others were much more vocal in their criticism. President Trump’s decision has been called “dangerous, short-sighted”, “politically motivated”, and “a typically petulant act”. I daresay I agree.
WHO’s the bad guy here?
“We regret the decision of the President of the United States to order a halt in funding to the World Health Organization,” Dr. Tedros Adhanom Ghebreyesus said in a press briefing Wednesday. “With support from the people and government of the United States, WHO works to improve the health of many of the world’s poorest and most vulnerable people.”
Dr. Ghebreyesus called again for global unity and continued focus on saving lives and fighting the common enemy, COVID-19, during the briefing.
The WHO was established in 1948 under the United Nations and is supported by its member states; it received around 15% of its funding from the US (a permanent member of the UN Security Council). After accusing it of “severely mismanaging” the outbreak, however, President Trump announced that he will halt funding until his administration has had a chance to review the organization’s response.
It is still unclear how this massive cut (around one-sixth of all its resources) will impact the WHO, which is guiding the global response to COVID-19 while also fighting polio, measles, malaria, Ebola, HIV, tuberculosis, malnutrition, cancer, diabetes, mental illness, and many other diseases and conditions — all while helping countries shore up their national health systems and improve their capabilities.
For context, here are the disease-combating efforts that the Trump administration is having a net positive effect on:
In his briefing, Dr. Tedros explained that the WHO is currently reviewing its budget and plans to work with its remaining partners to keep the body going as efficiently as possible. Furthermore, he noted that member states and independent bodies will review the WHO’s response to the pandemic “in due course,” as they have done after every major health event.
“No doubt, areas for improvement will be identified and there will be lessons for all of us to learn,” he said. “But for now, our focus—my focus—is on stopping this virus and saving lives. This is a time for all of us to be united in our common struggle against a common threat—a dangerous enemy. When we are divided, the virus exploits the cracks between us.”
People are criticizing this
I can’t shake the feeling that I’m watching a kid play at President — only it’s a spoiled kid, prone to episodes of Cartmanesque “I’m going home” mentality whenever something doesn’t go exactly his way.
Dr. Tedros himself has long asked politicians, the public, and the media to not “politicize COVID,” which would hurt our efforts to combat it. And he’s not shy about stating exactly why that fight is important.
“Please quarantine politicizing COVID,” he said in an April 8 press conference when asked about Trump’s previous criticism of the organization. “We will have many body bags in front of us if we don’t behave. When there are cracks at national level and global level that’s when the virus succeeds.”
“For God’s sake, we have lost more than 60,000 citizens of the world.”
Since April 8th, that figure has more than doubled, and there are now over 2 million confirmed cases of COVID-19 worldwide. The US is now the worst affected country with more than 632,000 cases and nearly 28,000 deaths.
Amid that backdrop, Dr. Robert Redfield, the head of the US Centers for Disease Control and Prevention, distanced himself from the decision, telling Good Morning America in an interview that the “CDC and WHO has had a long history of working together in multiple outbreaks throughout the world, as we continue to do in this one. And so, we’ve had a very productive public health relationship. We continue to have that.” There will be time to look at what happened with this outbreak, but only “once we get through it together.”
Both Bill and Melinda Gates tweeted warnings that this move is especially dangerous during a global health crisis.
Other experts are further weighing in on the decision, and they’re not happy at all about what they’re seeing. A simple Google search will yield ample responses from individuals and institutions around the world, but I’ll leave you with some of the more powerful ones I’ve found on The Science Media Center.
Prof Robert Dingwall, Professor of Sociology, Nottingham Trent University:
“The freeze on funding for WHO by the US government is a typically petulant act against an international organization that has sought to maintain its integrity and impartiality rather than to bow to President Trump’s transient and volatile prejudices.”
Dr. Peter Piot, the director of the London School of Hygiene & Tropical Medicine:
“Halting funding to the WHO is a dangerous, short-sighted, and politically motivated decision, with potential public health consequences for all countries in the world, whether they are rich or poor.”
Dr. Gail Carson, director of Network Development at ISARIC (International Severe Acute Respiratory and Emerging Infection Consortium) at the University of Oxford:
“Let’s hope President Trump and the review team realize quickly that now is not the time for division and potentially weakening the UN authority on health who are busy coordinating the global response to the pandemic. Look at facts, and there is plenty of evidence of all the good WHO has done during this pandemic.”
Joshua Moon, a senior research fellow at the Science Policy Research Unit, University of Sussex Business School:
“To see Trump threatening to pull funding from WHO in the middle of a pandemic is truly heartbreaking. The WHO has received so much criticism in the past decade surrounding its role in various public health emergencies. I have been one of those critics myself. However, this attack on WHO is a purely political move designed to distract and pander to Trump’s base.”
“At its core: the loss of US funding for WHO is a huge problem that will impact the response to COVID-19 globally, invite new and potentially unaccountable actors into the position of power that the US previous held, and is a contemptible falsehood being peddled by a politician who in my opinion is trying to hide his own mistakes from his supporters.”
Stephen Griffin, a medical professor and expert in viral diseases at the University of Leeds:
“This most recent intervention in public health policy by President Trump is perhaps one of the least productive, most short-sighted, self-motivated, and hypocritical acts I have ever witnessed. As far as I can ascertain, it has no foundation in reality. The situation in the US and the world over amounts to a crisis, and one in which we must stand together. WHO is perhaps one of the best means of achieving this and deserves the support and respect of all countries.”
I agree with each and every one of them. The WHO definitely isn’t perfect, but it has always been committed to improving itself and learning from its shortcomings, as evidenced by their openness to reviews from member states. The WHO is perhaps the single greatest tool we have against the current pandemic. There aren’t enough votes in the whole USA to wash away the deaths it can prevent, Mr. President.
“Nearly every day, dangerous flooding occurs somewhere in the United States and widespread flooding is in the forecast for many states in the months ahead,” Ed Clark, director of NOAA’s National Water Center, said in a statement.
NOAA expects ongoing rainfall, highly saturated soil, and above-normal precipitation in the coming months, especially in the Mississippi River basin, the Missouri River basin, and the Red River of the North.
According to NOAA’s projections, which are based on factors such as drought, soil moisture, and snowpack, any substantial local rainfall could cause flooding in these areas, where soil moisture is already high.
Aware of the upcoming challenge, a group of regional mayors has been working together to prepare for the flooding, also coordinating with federal agencies and Congress. But the coronavirus outbreak is complicating those plans, as officials now have to handle several crises at the same time.
“The current situation with COVID-19 presents us a fight on two fronts: on one front, we have the ongoing coronavirus pandemic, and on the other, what promises to be a very active spring 2020 flood season,” Sharon Broome, mayor of Baton Rouge, Louisiana, told BuzzFeed News.
Having to work on the two fronts simultaneously will be a gargantuan challenge for many states. But it will also be an opportunity to understand better the type of overlapping impacts that will get more common as the world gets warmer.
Human-induced global warming has increased the frequency and intensity of extreme weather events across the globe. In the US, more intense rains and longer periods of warmer weather are now more common, according to the National Climate Assessment.
To prepare for the floods, states have been stockpiling personal protective equipment for flood first responders, including masks, gloves, and head coverings. Some districts have also started distributing premade sandbags to vulnerable communities to avoid a last-minute crisis.
At the same time, the Red Cross will soon be working with local officials on plans to shelter people who may be displaced by the flooding, hoping to relocate them to hotels and dorms. But doing so in the middle of a coronavirus outbreak will be tricky, as officials will have to ensure proper distancing between people.
Setting up a shelter will require screening people for symptoms of COVID-19 before allowing them access — checking their temperatures, for example. Those who exhibit symptoms would have to be isolated from the rest, Trevor Riggen, senior vice president of disaster cycle services at the American Red Cross, told BuzzFeed.
The COVID-19 pandemic is shaping up to be one of the most pressing challenges the country has faced in peacetime. Any additional pressure could prove absolutely devastating unless serious prevention and preparation measures are taken.
Careful preparation is more important than ever, as is staying calm and focused, authorities warn.
“I feel like more than anything, we’re just trying to keep our people in some state of calm as we daily put out these bleak circumstances, these bleak numbers,” said Belinda Constant, mayor of Gretna, Louisiana. “We’re just praying every day that we don’t end up in a state of anarchy.”
Almost half of all adult Americans will be obese, and around a quarter will be severely obese, by 2030, according to a new study.
The paper, led by researchers at the Harvard T.H. Chan School of Public Health, predicts that more than half of the adult population of 29 US states will be obese by 2030 and that, among adults, all states will have an obesity prevalence of over 35% by the same year. They further estimate that the current rates of adult obesity and severe adult obesity in the US are around 40% and 18%, respectively.
Too big to fall
“The high projected prevalence of severe obesity among low-income adults has substantial implications for future Medicaid costs,” said Zachary Ward, programmer/analyst at Harvard Chan School’s Center for Health Decision Science and lead author of the study.
“In addition, the effect of weight stigma could have far-reaching implications for socioeconomic disparities as severe obesity becomes the most common BMI category among low-income adults in nearly every state.”
The prediction is quite troubling, as obesity is associated with several health and economic impacts, at both the individual and the social scale. Severe obesity, in particular, is linked to increased rates of chronic disease, higher medical spending, and drastic reductions in life expectancy, the team explains.
The team drew on self-reported body mass index (BMI) data from adults who participated in the Behavioral Risk Factor Surveillance System Survey (BRFSS) between 1993 and 2016 (for a total of 6.2 million data points). The BMI is calculated by dividing a person’s weight in kilograms by the square of their height in meters. A BMI of over 30 is considered indicative of obesity, while one of 35 or higher corresponds to severe obesity.
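That calculation, together with the thresholds the study uses, is simple enough to sketch directly:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(weight_kg, height_m):
    """Classify a BMI using the thresholds cited in the study:
    30+ indicates obesity, 35+ severe obesity."""
    value = bmi(weight_kg, height_m)
    if value >= 35:
        return "severe obesity"
    if value >= 30:
        return "obesity"
    return "below obesity threshold"
```

For example, a person weighing 100 kg at 1.70 m tall has a BMI of about 34.6 — in the obesity, but not the severe obesity, category.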
Since self-reported data in general, and self-reported BMIs in particular, tend to be unreliable (as people skew their answers toward their own biases), the team developed and used novel statistical methods to correct the data. Furthermore, using the wealth of information collected by the BRFSS, they looked at obesity rates for specific states, income levels, and subpopulations.
Several US states will have adult obesity rates close to 60% by 2030, they report, while the least-affected states will still record rates close to 40%. On a national average, they report, severe obesity will become the most common BMI category for women, non-Hispanic black adults, and those with annual incomes below $50,000 per year.
The team hopes that their study will help guide policy meant to prevent such a situation. For example, they cite previous research showing that a tax on sugar-sweetened beverages is an efficient and cost-effective prevention method for obesity.
“Prevention is going to be key to better managing this epidemic,” said Ward.
The paper “Projected U.S. State-Level Prevalence of Adult Obesity and Severe Obesity” has been published in the New England Journal of Medicine.
In certain cases, rivers have lost as much as 50% of their flow.
Image via Pixabay.
New research led by a hydrologist at the University of Arizona warns that massive groundwater pumping since the 1950s is bleeding rivers dry. The findings can help shape policy for the proper management of U.S. water resources, the authors say, and should be of interest especially for states such as Arizona that manage groundwater and surface water separately.
“We’re trying to figure out how that groundwater depletion has actually reshaped our hydrologic landscape,” said first author Laura Condon, a University of Arizona assistant professor of hydrology and atmospheric sciences.
“What does that mean for us, and what are the lasting impacts?”
According to Condon, this is the first study to look at the impact of past groundwater pumping across the entire U.S. Other research has dealt with this issue, but only on smaller scales.
The team started by using computer models to see what the state of U.S. surface waters would have been today in the absence of human consumption. They then compared that with surface water changes recorded since large-scale groundwater pumping first began in the 1950s.
The model maps ground and surface waters onto a grid of squares (0.6 miles per side) that covers most of the U.S., excluding coastal regions. It included all the groundwater down to 328 feet (100 meters) below the land surface. The analysis focused primarily on the Colorado and Mississippi River basins and looked exclusively at the effects of past groundwater pumping because those losses have already occurred.
Estimates from the U.S. Geological Survey (USGS) place the loss of groundwater volume between 1900 and 2008 at 1,000 cubic kilometers. “The rate of groundwater depletion has increased markedly since about 1950,” it adds, peaking between 2000 and 2008, “when the depletion rate averaged almost 25 km3 per year (compared to 9.2 km3 per year averaged over the 1900–2008 timeframe).” One thousand cubic kilometers of water corresponds to one quadrillion liters, or roughly 264 trillion gallons.
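A conversion like this is easy to get wrong by several orders of magnitude, so here is a quick sanity check (1 km³ is 10⁹ m³, and each m³ holds 1,000 liters):

```python
# Sanity-checking the unit conversion of the USGS depletion figure.
KM3_TO_LITERS = 1e12          # 1 km^3 = 10^9 m^3, 1 m^3 = 1,000 L
LITERS_TO_GALLONS = 0.264172  # US gallons per liter

depleted_km3 = 1000           # USGS estimate, 1900-2008
liters = depleted_km3 * KM3_TO_LITERS
gallons = liters * LITERS_TO_GALLONS  # ~2.64e14, i.e. ~264 trillion
```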
“We showed that because we’ve taken all of this water out of the subsurface, that has had really big impacts on how our land surface hydrology behaves,” she said. “We can show in our simulation that by taking out this groundwater, we have dried up lots of small streams across the U.S. because those streams would have been fed by groundwater discharge.”
Too much of a good thing
Groundwater is a very valuable resource across the world. When surface water sources are scarce, absent, or overtaxed, groundwater is pumped to supply our domestic and economic needs. When overexploited, it can lead to enormous crises, like the one facing India today.
Among other things, it is also used for agriculture and provides hydration for wild vegetation. Some native vegetation like cottonwood trees will eventually die if the water table drops below their roots. In the United States, it is the source of drinking water for about half the total population and nearly all of the rural population, and it provides over 50 billion gallons per day for agricultural needs, according to the same article from USGS.
The team found that streams, lakes, and rivers in western Nebraska, western Kansas, eastern Colorado and other parts of the High Plains have been particularly hard hit by groundwater pumping. Those findings agree with other smaller-scale studies in the region.
“With this study, we not only have been able to reconstruct the impact of historical pumping on stream depletion, but we can also use it in a predictive sense, to help sustainably manage groundwater pumping moving forward,” says Reed Maxwell, the paper’s co-author.
“We can do things with these model simulations that we can’t do in real life. We can ask, ‘What if we never pumped at all? What’s the difference?'”
The regions most sensitive to a lowering water table are east of the Rocky Mountains, where the water table was initially shallow (at depths of 6-33 feet, or 2-10 meters). Ground and surface waters are more closely linked in these areas, so depleting the groundwater is more disruptive for streams, rivers, and, by extension, vegetation. The western U.S. has deeper groundwater, so reducing its volume didn’t have as powerful an effect on surface waters.
Condon says that other research has shown that the areas of the Midwest where precipitation used to equal evaporative demand — i.e. where irrigation wasn’t required for crops — are becoming more arid. Those are some of the regions where groundwater pumping has reduced surface waters.
“In the West, we worry about water availability a lot and have many systems in place for handling and managing water shortage,” Condon said. “As you move to the East, where things are more humid, we don’t have as many systems in place.”
The paper “Simulating the sensitivity of evapotranspiration and streamflow to large-scale groundwater depletion” has been published in the journal Science Advances.
A new open-access tool developed at Stanford University reveals that, in certain U.S. states, solar panels now account for over 10% of total energy generation.
The interactive map of the United States on the DeepSolar website. Image credits DeepSolar / Stanford University
Policy-makers, utility companies, researchers, and engineers currently have a hard time estimating just how many solar panels are installed throughout the country. Stanford University researchers have come to their aid, however, with a new algorithm that makes it easier than ever to quantify them and analyze their development. The tool (accompanied by an open-access website) draws on high-resolution satellite data and automated image analysis.
“With these methods, we can not only maintain and update a high-fidelity database of solar installations, but also correlate them at the census-tract level with the amount of incoming solar radiation as well as non-physical factors such as household income and education level,” says co-senior author Arun Majumdar, a mechanical engineering professor at Stanford and co-Director of the Precourt Institute for Energy.
The tool, dubbed DeepSolar, offers unprecedented insight into the trends that drive solar power adoption by society at large, the team says. The algorithm works by analyzing high-resolution images across the U.S., looking for solar panels. When it finds a match, the program records the location and calculates its size.
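In broad strokes, the pipeline the team describes — classify each image tile, then record the location and estimated size of positive detections — can be sketched like this (note that `looks_like_panel` is a hypothetical stand-in; the real DeepSolar classifier is a trained neural network):

```python
def looks_like_panel(tile):
    """Hypothetical stand-in classifier. DeepSolar uses a neural
    network trained on labeled satellite imagery; here we just
    threshold a precomputed score so the control flow is runnable."""
    return tile.get("panel_score", 0.0) > 0.5

def scan_tiles(tiles):
    """Record the location and estimated area of each positive tile."""
    detections = []
    for tile in tiles:
        if looks_like_panel(tile):
            detections.append({
                "lat": tile["lat"],
                "lon": tile["lon"],
                # area is estimated from the matched pixels
                "area_m2": tile["pixel_count"] * tile["m2_per_pixel"],
            })
    return detections
```

The aggregate of these per-tile records is what makes up the database behind the interactive map.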
In stark contrast to its predecessors, DeepSolar isn’t painfully slow. “Previous algorithms were so slow that they would have needed at least a year of computational time” to identify most of the solar panels in the U.S., says co-senior author Ram Rajagopal, a civil engineering professor at Stanford. Meanwhile, DeepSolar only needs a “fraction” of that time.
The team reports using DeepSolar to locate roughly 1.47 million individual solar installations across the country, including rooftop panels, solar farms, and utility-scale installations. The software should help optimize solar development at the aggregate level, the team explains, especially since the decentralized nature of solar power has made it hard to keep track of all the panels being installed.
DeepSolar interactive map showing solar panel distribution by county in the region surrounding Chicago. Image credits DeepSolar / Stanford University.
One area the team hopes to make an immediate impact with DeepSolar is in the U.S. power grid. The tool, they say, could be used to better integrate solar into the grid by accounting for daily and seasonal fluctuations in incoming sunlight.
“Now that we know where the solar panels are, or are likely to be in the future, we can feed that information into questions of modeling the electricity system and predicting where storage units and substations should go,” says Majumdar.
DeepSolar could also help in pinpointing new areas for solar deployment. The team used the program to establish correlations between solar installation density and variables such as population density or household income — which, when pooled together, allowed them to create a model predicting which areas are most likely to adopt solar in the future.
“Utilities, companies that install solar panels, even community planners that are thinking about sustainability, they all can benefit from this high-resolution spatial data and a website where they can explore and analyze the different trends involved,” Rajagopal says.
The team plans to expand the DeepSolar database to include solar installations in other countries with suitably high-resolution satellite images and to improve its ability to estimate energy output based on characteristics such as the angle of incoming light.
The paper “DeepSolar: A Machine Learning Framework to Efficiently Construct Solar Deployment Database in the United States” has been published in the journal Joule.
Here’s an interesting thought: speaking the world’s most widely used and dominant language might actually be a detriment. Just 20% of American students from kindergarten through 12th grade take a foreign language course, compared to a whopping 92% in Europe, which might put young Americans at a disadvantage, both directly and indirectly.
Image credits: Pew Research Center.
Legislation is a major aspect
Aside from offering the direct benefit of being able to communicate with more people, learning a foreign language is associated with a wide variety of intellectual and health benefits. Bilingual kids tend to be smarter, scoring higher in cognitive tests, and bilingual brains also tend to be healthier and stave off age-related decline.
At first glance, the reason for the difference between the US and Europe seems pretty obvious — Americans speak English, so most of them feel like they just don’t need to learn another language. But in Europe, things couldn’t be more different. In eight European countries, 100% of students study at least one foreign language, and in almost all the others, the rate is above 80%.
But when you dig a bit deeper, a systemic difference becomes apparent — and it has a lot to do with how the educational process is designed and legislated.
European students typically start learning their first foreign language between the ages of 6 and 9, and learning a second foreign language is compulsory in more than 20 European countries. Overall, 92% of students in Europe study at least one foreign language — vastly different from the US.
For starters, learning foreign languages is typically mandated by law in Europe, whereas in the US, each state makes its own regulations. Overall, the vast majority of US states have less than 25% participation, with only 9% of students studying a foreign language in New Mexico, Arizona, and Arkansas.
Image credits: Pew Research Center.
[panel style=”panel-default” title=”UK and Ireland” footer=””]The Pew report doesn’t include the UK and Ireland, for which it did not have satisfactory data. However, they invite an interesting comparison of their own — after all, the native language in both countries is English, as it is in the US, so perhaps they would make a more apt benchmark.
Well, the UK definitely lags behind the rest of Europe in terms of foreign language, and the Brits are notorious on the continent for their lack of foreign language skills. However, 38% of Britons speak at least one foreign language. In England and Scotland, primary school pupils are generally expected to learn a foreign language, though that is not always the case in Wales and Northern Ireland.
Regarding Ireland, in addition to English, all pupils at primary level study Irish. However, Irish is not considered a foreign language. [/panel]
This has a lot to do with what skills are required to get a job. In Europe, a foreign language is considered essential for many qualified jobs, while in the US that’s hardly the case. But that’s not the only reason. Europe tends to be much more culturally diverse, even between the borders of one country. In Switzerland, for instance, there are 4 official languages (German, French, Italian and Romansh), while in Belgium, which reports one of the lowest overall percentages of students learning another language, there are three official languages: French, German, and Dutch. Since Belgians are very divided on the matter of what languages should be spoken within the country, English has become the country’s best apolitical linguistic option, as Quartz’s Nikhil Sonnad recently noted.
Europeans also share more borders with other countries, and it’s very easy to travel from country to country, at least within the EU — you can do so with just your ID, no visa needed.
Lastly, on top of it all, there’s still the practical necessity. With English, you can probably get around in most parts of the world, so there’s not much incentive to brush up on your French. But if you’re from the Netherlands, the odds are you may need to know a language other than Dutch at some point.
American kids don’t feel as pressured to learn another tongue, but while they may easily find work and get around using only English, they might be missing out on developing cultural intelligence, which is increasingly important in modern times.
The Massachusetts installation — christened “Vineyard Wind” — will be constructed in state waters some 14 miles (22.5 km) off of Martha’s Vineyard and is planned to be ready surprisingly fast: the farm is earmarked to start feeding the grid as soon as 2021, reports Green Tech Media. The two companies that won the contract — Avangrid Renewables and Copenhagen Infrastructure Partners, both based in Europe — will share ownership of the project equally. The two will begin negotiations for transmission services and power purchase agreements shortly, according to a joint press release.
Vineyard Wind comes as part of Massachusetts’ recently-approved goal of building 1.6GW of wind energy by 2027 — and should cover half of that pledged capacity. Overall, it’s expected to reduce the state’s carbon emissions by over 1.6 million tons per year, roughly equivalent to the exhaust of 325,000 cars.
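The car equivalence holds up to a quick back-of-the-envelope check — it works out to just under five tons of CO2 per car per year, roughly in line with commonly cited figures for a typical passenger vehicle:

```python
# Back-of-the-envelope check of the car-equivalence figure.
emissions_tons = 1.6e6  # projected annual CO2 reduction, tons
cars = 325_000          # stated car equivalent

tons_per_car = emissions_tons / cars  # ~4.9 tons per car per year
```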
The project is likely to propel further offshore wind development in the area, similarly to what we’ve seen happen in Europe. The port of New Bedford has already been retrofitted to handle the immense load of traffic and infrastructure that development of Vineyard Wind will require, notes the New York Times — which is likely to make further development even more attractive and convenient.
The second contract, awarded by Rhode Island to Deepwater Wind, aims to provide 400MW of capacity — although not on such short notice. Construction on the farm, called Revolution Wind, could begin “as soon as 2020,” writes Megan Geuss of ArsTechnica, citing a company spokesperson. Deepwater Wind is a US-based firm that previously collaborated with the state of Rhode Island to build the first offshore wind farm in the US: the 30MW unit off the coast of Block Island.
The added capacity from this farm will help Rhode Island to reach 1GW of renewable energy by 2020, a goal that state Gov. Gina Raimondo recently called for. Deepwater Wind will also need to start power purchase negotiations and get federal regulatory approval before construction can begin. Revolution Wind, like Vineyard Wind, will be built in state waters.
Aerial view of the Block Island offshore wind farm. Image credits Ionna22 / Wikimedia.
Judging by what happened in Europe, however, both Massachusetts and Rhode Island stand to gain a lot in the long term from these offshore wind developments. Europe currently hosts roughly 15.7GW of offshore wind, and the experience energy companies have gleaned there has consistently knocked down installation costs — which is what makes the tech so attractive even in the US.
Similarly, the early experience and logistical base these two states will gain could provide them with a decisive edge in further offshore developments in the US — which are bound to pop up as installation costs drop. For example, the Department of the Interior recently opened 390,000 acres of federally-controlled waters off the coast of Massachusetts for offshore wind. New Bedford is ideally suited to provide shipping and support for developments here without any further investments — so Massachusetts will surely stand to benefit as more actors join the US offshore wind market.
And more are joining already — the state of New Jersey is also eager to plug its grid into offshore wind farms, with Governor Phil Murphy signing into law on Wednesday a commitment to 3,500 MW, the largest state offshore wind policy to date. The Union of Concerned Scientists applauds the developments-to-be, writing that these will likely spur states such as Connecticut, New York, Maryland, or Virginia into dipping their toes in offshore wind.
But it’s not just about what states gain. We’ve written before about the benefits renewables bring to local communities. These range from jobs (here and here), air quality improvements, reductions in carbon emissions, and a lower energy bill once the projects are up and running. All good things, I’m sure you’ll agree.
As a new royal baby was born to Kate Middleton and Prince William, the UK was abuzz, with word spreading of a lavish, luxurious birth. But the price for the birth and the mother’s recovery, which was $8,900, is significantly lower than what the average US woman pays under normal conditions.
The US is the most expensive place in the world for giving birth, with the average price being $10,800 in 2015. This doesn’t include pre- and post-birth care, which raises the price to roughly $30,000.
Kate Middleton. Image via Wikipedia.
The UK takes its royalty very seriously — and the birth of a new royal baby is no small matter. So it’s only natural that the media was abuzz with the event, presenting even the tiniest details about Kate and William’s preparations. Among these details, it was revealed that the baby was delivered in a private room in St. Mary’s Hospital’s Lindo Wing. Perks include an “en suite” bathroom, a refrigerator, and a menu of “nutritious” meals — which, call me crazy, sounds decent rather than luxurious for a woman going through the struggles of childbirth. Still, the $8,900 price tag is nothing to scoff at and seems very luxurious — until you look at figures for the USA.
According to figures compiled by The Economist and circulated by Statista, this deluxe package for 24 hours, including the non-Caesarian delivery, still costs less than an average birth in the United States, which amounts to $10,800 (2015 figures). The Guardian reports that, including all expenses, US hospitals charged $32,093 for an uncomplicated vaginal birth and newborn care, and $51,125 for a standard cesarean section.
Of course, you can make a very valid case that the UK royal house incurs too many expenses, that they’re ultimately funded through public money, and that they’re often quite lavishly wasteful. But really, a more important takeaway is that, even in these extremely troubling times, the British healthcare system (be it public or private) somehow manages to be more price-efficient than the US healthcare system. Even though American insurers often negotiate lower prices, the associated costs are still much higher. This is a recurring problem for the US, which spends more on healthcare than any other country but in many aspects falls way behind other developed nations.
It’s not like the British system is a paragon either — other developed countries have much lower birth-associated prices. For instance, in Spain, it costs about $1,950 to deliver a child. In Australia, the price is around $5,000, and even in Switzerland, a notoriously expensive country, it’s under $8,000.
In an effort to show the actual effects climate change will have on each of our lives in the future, the National Oceanic and Atmospheric Administration (NOAA) put together some chilling maps showing a scorching future.
“Melting Ice Cream Truck” by Glue Society. Image via thisiscolossal.
Climate change is going to cause a whole lot of changes to the world as we know it. Sea level rise, species extinction, and food and water uncertainty for many. While we’re aware of these future threats, the problem is that we’re just not very good, from a psychological point of view, at dealing with future problems that require collective action. It doesn’t feel like it matters to us directly, it doesn’t feel like we can do anything about it, so we don’t really care. Our brains sport more of a cross-that-bridge-when-we-get-there type of wiring.
That, in the context of climate change, is a really bad strategy. It takes time for this damage to build, but it takes just as long for us to work on preventing it — and much longer to fix it after the deed is done.
Would you like some ice with that?
NOAA, however, knows what’s up. NOAA is also sneaky and knows what everybody hates: scorching summer temperatures. So, to help us better understand what path we’re walking down, they’ve compiled some maps to show just how hot things are going to get by the end of the century. For example, here’s what mean July temperatures looked like in the US in 2010 — and how they will look in 2090.
Scorching, right? Well, don’t take out the ice cubes just yet because this is one of the better-case scenarios — one where we pursue and achieve fairly ambitious reductions in greenhouse gas emissions and reforestation. Yep, fairly ambitious greenhouse gas reduction and reforestation. With the new, 2.0, ‘clean’ coal administration currently ruling the US, neither of those targets seems very likely, does it?
NOAA agrees; that’s why they’ve also made predictions for a business-as-usual scenario, in which we keep polluting as we do now and make no policy changes in regard to the environment. If we go down that road, July 2090 looks like a time when no amount of ice cream will cut it:
For those of you with an unnatural fear of metrics, deep red (the thing covering most of the US in the picture) corresponds to average daily highs of 100°F (37.8°C). Daily. For a month at least, year after year, after year. I don’t know about you guys, but when I see the thermometer hitting 30°C I know it’s going to be a bad day. At 35°C, I can’t function any longer. I just fill my bathtub with cold water and camp in it.
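For reference, the conversion between the two scales used above is just a linear formula — a minimal sketch:

```python
def fahrenheit_to_celsius(f):
    """Plain linear conversion: subtract 32, scale by 5/9."""
    return (f - 32) * 5 / 9

# The deep-red daily highs on NOAA's map:
print(round(fahrenheit_to_celsius(100), 1))  # 37.8
```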
I’m not so special in that regard; people and high temperatures don’t seem to mix that well. Currently, some 658 people die from extreme heat in the US every year, mostly in states such as Arizona or Texas. That number is bound to skyrocket as these states themselves, along with the rest of the US, start getting hotter.
Winters will also warm up. Even under a best-case scenario, average highs during winter months will look something like this (the video starts with current mean temperatures).
I like what NOAA did with these maps because I feel it helps put that infamous 2°C Paris goal into context. It’s often used as a reference point in many discussions around climate change, and a benchmark that many official bodies, scientists, and publications use — but it doesn’t convey much to the average Joe.
Knowing that global temperatures will increase by 2°C doesn’t sound like much, and there’s probably a lot of ‘globe’ around so that doesn’t tell me much about what I’m going to have to face. Even worse, that figure is the mean annual temperature — putting one more layer of abstraction between it and what effects I’ll feel.
Hopefully, NOAA’s work will help us better understand what’s waiting for us down the road. They note that most people in the US will have to contend with scorching heat as soon as 2050 — and such conditions will take their toll on the economy and our quality of life (high temperatures, among other things, make it harder for us to sleep).
Our lawmakers do have the power to change where we’re heading. Quite possibly, most of them don’t particularly care — but they do care about votes. Call them up, ask them why they want you to suffer through 100°F during lunch. Then ask why you should vote for them.
The US spends almost twice as much as other high-income countries on health care, and yet has consistently poorer results in many areas, with the lowest life expectancy and highest infant mortality rate of all developed countries. A new study analyzed why this happens, and what can be done to improve it.
The US spends about three times more on healthcare, per capita, than the UK. Image credits: Papanicolas et al, 2018 / JAMA.
Despite this, the US still lags behind all other developed countries when it comes to the quality of said healthcare. Image credits: Papanicolas et al, 2018 / JAMA.
With only 2.9 beds per 1,000 people, the US falls way below other developed countries, especially compared to Japan’s 13.2 and Germany’s 8.2. Similar figures pop up for many metrics relating to healthcare availability and efficiency. However, on a per capita basis, the US spends much more than any other country: $9,451 in 2015, compared to Germany’s $5,267.
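The gap is easier to appreciate as ratios — a small sketch using only the figures quoted above:

```python
# Per-capita healthcare spending (USD, 2015) and hospital beds per 1,000 people,
# taken from the figures quoted in the text.
spending = {"US": 9451, "Germany": 5267}
beds_per_1000 = {"US": 2.9, "Germany": 8.2, "Japan": 13.2}

spend_ratio = spending["US"] / spending["Germany"]
beds_ratio = beds_per_1000["Japan"] / beds_per_1000["US"]
print(f"US spends {spend_ratio:.1f}x what Germany does per capita")
print(f"Japan has {beds_ratio:.1f}x the US's hospital beds per 1,000 people")
```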
The US is also the only developed country which doesn’t offer universal healthcare.
Of course, much ink has been spilled over health care in the past decades, and the causes are complex and difficult to thoroughly assess. But in a new study, Harvard researchers took on that gargantuan task. This is what they found:
In 2016, the US spent nearly two times more than other high-income countries on healthcare.
Despite this, the country had significantly poorer health outcomes in many areas. Out of all the developed countries, the US had the lowest life expectancy and highest infant mortality rate.
Contrary to popular belief, high utilization of healthcare services and low spending on social services are not the main reasons for the costs and lack of efficiency.
Instead, the main drivers of higher healthcare spending in the U.S. are generally high prices, particularly for medical devices and pharmaceuticals. The US spends much more than other countries on planning, regulating, and managing health systems and services.
Other causes of unneeded spending are the overuse of expensive health services, low social spending, and the lack of an adequate number of primary health physicians.
The US also pays higher salaries for nurses and physicians (on average).
The good news is that despite poor overall outcomes, when people are sick, the quality of delivered healthcare is quite high.
The main problem, researchers say, is that most policies regarding health care have focused on utilization. However, the authors write that “efforts targeting utilization alone are unlikely to reduce the growth in health care spending”. Instead, an effort to reduce prices and administrative costs is needed.
“We know that the U.S. is an outlier in healthcare costs, spending twice as much as peer nations to deliver care. This gap and the challenges it poses for American consumers, policymakers, and business leaders was a major impetus for healthcare reform in the U.S., including delivery reforms implemented as part of the Affordable Care Act,” said senior author Ashish Jha, a professor at the Harvard Global Health Institute (HGHI).
“In addition, the reasons for these substantially higher costs have been misunderstood: These data suggest that many of the policy efforts in the U.S. have not been truly evidence-based.”
Several studies have already found that counterintuitive measures, such as increasing social spending, can actually reduce expenses in the long term. However, while the US spends a bit less on social care than other countries, it’s not necessarily an outlier. The study also contradicts several common beliefs, such as the idea that America uses more healthcare services than peer countries (it actually has lower rates of physician visits and days spent in the hospital than other nations) and that the quality of healthcare is always lower than in other countries. The US actually has excellent healthcare for those who have heart attacks or strokes but is below average in avoidable hospitalizations for things like diabetes and asthma.
The problem is that despite investing heavily in health care, Americans don’t have access to the quality they’re paying for. This is an old, systemic problem for the country, but the good news is that it can be fixed, researchers conclude. What’s needed is a reduction in unnecessary costs and an investment in the areas where the country is still lagging behind.
“As the U.S. continues to struggle with high healthcare spending, it is critical that we make progress on curtailing these costs. International comparisons are very valuable — they allow for reflection on national performance and serve to promote accountability,” said first author Irene Papanicolas, visiting assistant professor in the Department of Health Policy and Management at Harvard Chan School.
Journal Reference: “Health Care Spending in the United States and Other High-Income Countries,” Irene Papanicolas, Liana R. Woskie, Ashish K. Jha, JAMA, online March 13, 2018, doi: 10.1001/jama.2018.1150
Non-medical vaccination exemptions and wide misinformation on their efficiency are pulling America back into endemic measles outbreaks, a paper reports.
Back of female with measles. Image credits Wellcome Trust.
The US took great pains (in the form of strict, nationwide vaccination campaigns) to eliminate measles back in 2000. Luckily, these efforts proved fruitful. Outbreaks did spring up here and there, mostly from people who travel to and from other countries, but they numbered from a few dozen up to a few hundred cases yearly — a really small number. Overall, the measles virus has been considered no longer endemic (present in the country) since the turn of the millennium.
But rejoice not! The US is slowly inching back to pre-2000 days, when the measles virus roamed free and deadly, researchers from the Stanford and Baylor College of Medicine warn. At the heart of the issue are non-medical vaccine exemptions and non-medical delays, coupled with wide public misinformation about vaccines.
A high toll
The two researchers, Nathan Lo, BS, and Dr. Peter Hotez, MD, PhD, report that a 5% decrease in measles-mumps-rubella (MMR) vaccination rates among kids aged 2-11 would triple measles cases in the age group and end up draining the public health system some $2.1 million in additional costs. But wait, it gets even better/worse — ages 2-11 make up only about a third of measles cases in current outbreaks, but it was the only age interval the researchers had sufficient data to work with. They fully expect those numbers to become much higher once enough data to model “social mixing and immunization status of adults, teens, and infants under two” becomes available.
“The results of our study find substantial public health and economic consequences with even minor reductions in MMR coverage due to vaccine hesitancy and directly confront the notion that measles is no longer a threat in the United States,” they write.
The duo says they conducted this study out of concern for growing vaccine hesitancy and use of non-medical exemptions — both largely driven by shoddy data or outright lies pertaining to the safety of vaccines, and the downplaying of just how dangerous these diseases can be.
And measles is up there on the dangerous scale. The virus is ridiculously infectious and can keep floating in the air for hours after a carrier coughed or sneezed. Those infected develop high fevers, skin rashes, inflamed eyes, and flu-like coughs and runny noses. About 30% of cases also come with highly desirable complications such as pneumonia, brain swelling, and even blindness. While this does make it really simple to spot someone sick so you can stay away, carriers can spread the virus days before symptoms pop up.
Get your kid vaccinated!
So if the Eyeball Mk.1 we all come pre-equipped with can’t spot the danger, what do we do to stay safe? Well, we immunize the herd. So to speak. Basically, the idea behind herd immunity is to make such a large proportion of the population (around 90 to 95% of everybody) immune to the virus that it simply won’t be able to spread around effectively. There aren’t enough viable carriers to spread it around.
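That 90-95% figure falls out of a standard epidemiological formula: the herd immunity threshold is 1 − 1/R0, where R0 is the number of people one carrier infects in a fully susceptible population. Measles’ R0 is commonly put at 12-18 (those values are the widely cited range, not figures from this particular study) — which is exactly why its threshold sits so high:

```python
def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune to block sustained spread."""
    return 1 - 1 / r0

# Measles: one carrier infects roughly 12-18 susceptible people.
for r0 in (12, 18):
    print(f"R0 = {r0}: {herd_immunity_threshold(r0):.0%} of the population must be immune")
```

Compare that to, say, seasonal flu (R0 around 1.3), where the same formula gives a threshold of only ~23% — measles is in a league of its own.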
It’s an all-for-one and one-starts-an-epidemic scenario. If immunity levels drop below that percentage, a single infected individual has a much higher chance of starting an outbreak — which, in turn, will have a much easier time infecting huge numbers of people. The bad news is that in many areas of the US, immunity levels are just shy of falling below that range, and vaccination rates keep going down. Some 18 states allow parents to forego vaccination on the grounds of personal belief, and almost all (except Mississippi and West Virginia) allow for religious and/or philosophical exemptions, according to the NCSL.
So, to get a feel for what these exemptions will do in the long run, the duo mathematically modeled the way measles spreads based on the virus’ known behavior, data on current vaccination rates from the CDC, and the “social mixing patterns” of kids aged 2-11. To get a rough estimate of the costs these outbreaks will impose on the health system, they factored in things like medical staff wages, the cost of laboratory analyses, and money spent on outbreak surveillance. Each measles case, they estimate, costs about $20,000.
They then checked and calibrated their model based on data from past measles outbreaks in the US and UK. After they made sure their model worked, they pushed up the vaccine exemption rate from 1% to 8% to see what would happen. Unsurprisingly, larger exemption rates led to more cases and bigger outbreaks. Eliminating the exemptions, however, would take MMR coverage in the US to 95%, a very comfy percentage when talking about herd immunity.
In other words, when you choose not to vaccinate your kid, you’re putting both their health and that of others at risk. Stop believing the stupid stuff people say, believe, or write on shady websites over what your physician spent years learning in med school.
This U.S. flag is only a couple of nanometers wide — thousands of times thinner than a human hair. It’s practically invisible to the human eye and the tiniest American flag ever. This pattern appeared unexpectedly when researchers heated the “stripe” material molybdenum ditelluride. Credit: University of Texas at Dallas.
Captioned above is an intriguing, two-dimensional sheet that’s just about to transform into an array of nanowires, each just a few atoms wide. This image shows how the material momentarily morphs into the familiar Old Glory. Completely by accident, the researchers made the tiniest U.S. flag ever. Moreover, this patriotic material could lead to transistors that are ten times smaller than the current state of the art.
An unexpected discovery
A team from the University of Texas at Dallas was investigating new materials whose properties look promising for small, energy-efficient transistors in tomorrow’s next-generation electronics. One such material was an atomically-thin sheet made of one layer of molybdenum atoms and two layers of tellurium atoms. It belongs to a class of materials called transition metal dichalcogenides (TMDs), which have the potential to replace silicon in transistors. Silicon has had a fantastic run, but it’s increasingly becoming limited. If a good replacement isn’t found soon, Moore’s Law — the observation that the number of transistors in a dense integrated circuit doubles approximately every two years — will cease to be relevant.
“We wanted to understand the thermal stability of this particular material,” said lead researcher Dr. Moon Kim in a statement. “We thought it was a good candidate for next-generation nanoelectronics. Out of curiosity, we set out to see whether it would be stable above room temperature.”
Kim and colleagues cranked the heat to 450 degrees Celsius and almost immediately two things started happening.
First, the researchers were surprised to see a new pattern emerge that was “aesthetically pleasing to the eye,” Kim said. The repeating rows or stripes of molybdenum ditelluride transformed into a shape that resembled six-pointed stars. Later, the researchers learned that, in fact, the material was transitioning into hexa-molybdenum hexa-telluride, a huge mouthful that’s a one-dimensional wire-like structure. Essentially, it’s a structure consisting of six central atoms of molybdenum surrounded by six atoms of tellurium.
The “stripes” and “stars” looked very much like the United States flag, so the researchers made a false-color version with a blue field behind the stars and half of the stripes colored in red, to great effect. It’s a nanoflag!
“Then, when we examined the material more closely, we found that the transition we were seeing from ‘stripes’ to ‘stars’ was not in any of the phase diagrams,” Kim said in a statement. “Normally, when you heat up particular materials, you expect to see a different kind of material emerge as predicted by a phase diagram. But in this case, something unusual happened — it formed a whole new phase.”
Each of these nanowires from the array is a semiconductor, meaning it can switch current on and off — a basic property for any transistor. But when these individual nanowires are grouped together, they start behaving like a metal.
“We would want to use the nanowires one at a time because we are pushing the size of a transistor as small as possible,” Kim said. “Currently, the smallest transistor size is about 10 times larger than our nanowire. Each of ours is smaller than 1 nanometer in diameter, which is essentially an atomic-scale wire.”
“If used in future technology, [it] would result in powerful energy-efficient devices,” Kim said.
It’s interesting to note that the material’s behavior could not be predicted in theory. When subjected to varying external conditions such as temperature or pressure, certain materials will undergo a phase change. One classic example is water’s phase change from a liquid into vapor when it’s boiled at temperatures in excess of 100 degrees Celsius (at sea level). Some materials, however, undergo phase changes differently: the atoms rearrange and redistribute, changing the structure and composition of the material. This can, of course, affect the material’s properties. Scientists have run so many experiments on phase-changing materials that there’s enough data to make so-called phase diagrams, which can predict property changes in a material when it changes phase.
However, no phase diagram could predict this exotic material’s behavior. Next, Kim and colleagues plan on running more tests to determine how to turn the material into functioning devices. Particularly challenging will be separating out the individual nanowires. “But this is a start,” Kim said.
Concentrated poverty is on the rise in the US again, with the number of neighborhoods where 40% or more of the population lives below the federal poverty level increasing, across all races, for the first time since the 1990s, Penn State demographers report.
Venice Beach, California. Image credits Thomas Galvez / Flickr.
While general poverty levels only look at how many people live on less than a standard income in a particular place, the concept of poverty concentration takes into account how poverty is spread out throughout an area. Poverty on its own is really bad news, but concentrated poverty makes things a lot worse for everyone — it’s a cascading effect of ever less money available in the community, meaning health services, educational services, and other civic institutions work with reduced efficiency or grind to a halt altogether. Concentrated poverty amplifies the poor’s struggle by making society around them poorer, less able to help, in a self-reinforcing cycle.
And it’s on the rise in the US for the first time in two decades, warns John Iceland, a professor of sociology and demography at Penn State and research associate at the Population Research Institute. Using data gathered by the U.S. Census Bureau from 1980-2000 and data gathered through the American Community Survey from 2000-2014, the team says concentrated poverty — which rose in the 1980s and gradually eased during the 1990s — is making a comeback across all demographics in the US.
Iceland points to growing residential separation and isolation of the poor from the rest of American society in metropolitan areas, as well as an overall increase in poverty since the early 2000s as the biggest factors driving this rise.
“I personally was curious about this volatility — what explains it? Why did we see this increase in the 1980s and the decline in the 1990s and why has it been rebounding?” said Iceland.
“As a social demographer, I’m particularly interested in the changing composition of people living in certain neighborhoods and what types of broad population processes help explain the general trend.”
Not only is the US experiencing a rise in concentrated poverty levels, but it’s also undergoing a shift in who and where is getting the worst of it. The authors note that both the demographics and the location of high-poverty neighborhoods have changed since the 1990s.
“It used to be thought of as black, inner-city poverty, but now more Hispanics and a higher proportion of whites are living in high-poverty neighborhoods,” Iceland said. “They are less likely to be just in the inner core of cities, but oftentimes in inner suburbs.”
“We find that changes in the segregation of the poor explained the largest share of the change in concentrated poverty over most of the time period, with the exception of the 1990s, where the plunge in both black and white poverty rates had the largest role in explaining the considerable decline in concentrated poverty in that decade for both groups.”
Poverty and poverty concentration are different concepts but it’s possible the two are related, Iceland added. Working together with sociology and demography graduate student Erik Hernandez, Iceland also looked at how fluctuations in overall poverty affected its concentration throughout the US.
“There could be a certain percentage of the population in a country that is poor, but what the concentration of poverty looks at is to what extent are they concentrated in relatively few neighborhoods,” he said.
They found that poverty concentration followed the trends set by overall poverty. The country’s recent economic hardships, such as the 2007-2009 recession, have pushed up individual poverty, neighborhood-wide (social) poverty, and the share of people — and of poor people in particular — living in high-poverty neighborhoods, the researchers said.
In the 2000s, some 20.5% of poor blacks lived in high-poverty neighborhoods, a figure which increased to 23.1% between 2010 and 2014. For poor non-Hispanic whites, that number went from 5.8% to 8.2% during this time. Overall, the total percentage of poor Americans living in high-poverty neighborhoods went from 11.4% to 14.1%. This concentration can affect governmental services — health, police, education — as well as limit job opportunities, further impoverishing those living in the affected areas.
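Laid out side by side, the increases are strikingly uniform — a small sketch computing the percentage-point changes from the study’s figures above:

```python
# Shares (%) of each group living in high-poverty neighborhoods:
# early 2000s vs. 2010-2014, from the figures quoted in the text.
shares = {
    "poor blacks": (20.5, 23.1),
    "poor non-Hispanic whites": (5.8, 8.2),
    "all poor Americans": (11.4, 14.1),
}

for group, (early, late) in shares.items():
    print(f"{group}: +{late - early:.1f} percentage points")
```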
“A lot of resources are tied to neighborhoods — the quality of schooling and the amount of a school’s economic resources vary across neighborhoods, for example,” said Iceland.
“People have talked about how there’s more crime and social disorganization in places with high poverty levels. And this all has consequences for quality of life.”
The full paper “Understanding trends in concentrated poverty” has been published in the journal Social Science Research.
China hopes to usher in breakthroughs in the field of high-performance processors and other key systems by building the first exascale supercomputer, said Meng Xiangfei, director of applications at the National Super Computer Tianjin Centre, on Monday.
When they say supercomputer, they mean it. Dubbed Tianhe-3 (meaning ‘Heavenriver-3’), the supercomputer would make any device you’re reading this on cry digital tears of shame. Operating in the ‘exascale’ means that the system will be able to handle a quintillion (10^18) calculations each second.
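To put ‘exascale’ in perspective, here’s a rough sketch comparing it to an ordinary desktop (the ~100 gigaflops desktop figure is an illustrative ballpark of my own, not a number from the article):

```python
EXASCALE = 1e18  # calculations per second: a quintillion
DESKTOP = 1e11   # ~100 gigaflops, an illustrative ballpark for a fast desktop CPU

# How long the desktop would need to match one second of exascale work.
seconds = EXASCALE / DESKTOP
print(f"One exascale-second would keep such a desktop busy for ~{seconds / 86400:.0f} days")
```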
The prototype device is expected to be ready in 2018, with full operational capability scheduled for 2020.
One computer to rule them all
There has been somewhat of a supercomputer ‘arms race’ going on, with China and the US both vying for supremacy in the computational arena. In 2013, the US could boast 252 of the most powerful 500 supercomputers in the world, dwarfing China’s 66. But the same year, China’s Tianhe-2 wrestled the title of most powerful supercomputer from Oak Ridge’s Titan, outperforming it by an almost 2-to-1 factor.
Titan. Image credits Oak Ridge National Laboratory.
Still, the system was not without its limitations. First of all was its huge power bill. Critics also pointed to the lack of usable software as its Achilles’ heel. Since the main drive was on developing the hardware, users would often have to write their own programs.
“It is at the world’s frontier in terms of calculation capacity, but the function of the supercomputer is still way behind the ones in the US and Japan,” Chi Xuebin, deputy director of the Computer Network and Information Centre under the Chinese Academy of Sciences, said about Tianhe-2 in 2014.
“Some users would need years or even a decade to write the necessary code”, he added.
Since then, China has made huge progress. It ranked first for the total number of top supercomputers in June 2016, holding 168 of the top 500 systems and briefly overtaking the US. By November 2016, the two contenders were evenly matched, with 171 systems each. China has the most powerful one in the world — the Sunway TaihuLight, the country’s first supercomputer built with domestically designed processors, which clocked in at 125 quadrillion calculations per second — but it’s second by total computational power. You can check out the ebb and flow between the two countries on Top500. It’s pretty cool.
Tianhe-3 is expected to crown the country’s achievements. The delivery date puts China in first place in the exascale race, overtaking the US’s ECP (Exascale Computing Project), which aims to produce the first device of this kind by 2021 — a full three years ahead on the prototype and one year ahead on full operational capability.
“Its computing power is on the next level, cementing China as the world leader in supercomputer hardware,” said Meng Xiangfei.
Sunway Taihulight. Image credits Top500.
But it’s not only about flexing their industrial muscles. Because supercomputers can crunch calculations that would make mere computers give up and blue-screen, access to this class of devices opens up a lot of possibilities for researchers. Tianhe-1 for example, the first Chinese system to pass the 1-quadrillion (10^15) calculations per second mark, is now busy solving more than 1,400 tasks each day, furthering research in fields from biology to astronomy.
Tianhe-3 is expected to be 100 times faster than its grandfather, and ten times as powerful as the Sunway. It will be available for public use and will “help us tackle some of the world’s toughest scientific challenges with greater speed, precision, and scope”, Meng added. It’s already been earmarked to analyze smog distribution throughout China, as current systems can only handle the models on a district-level, China Daily reported.
It will also be powerful enough to simulate earthquake and epidemic patterns in greater detail than ever before, improving the government’s ability to respond to such events, Meng added. Alternatively, it can be used to unravel genetic sequences and protein structures with unprecedented scale and speed — data which can be used to create more efficient medicine in the future, Meng said.
Tianhe-3 will be produced using only domestic sources, with Chinese industry and know-how supplying everything from the processors to the operating system.
And finally, this video put together by the American Meteor Society shows the meteorite’s estimated trajectory and visibility range. It also places the rock’s final resting place in the middle of Lake Michigan.
If there’s any bit of the meteorite that didn’t burn up, that’s most likely where it ended up. But considering it was probably quite small to begin with (not much larger than a baseball or a football), chances are slim that anything survived the burn and the crash.
A trucking accident on a Dodge County highway revealed the US livestock industry’s sweetest secret — farmers have been feeding cows defective Skittles on the down-low to avoid paying for corn.
Image credits Dodge County Sheriff’s Office / Facebook.
Wisconsin cattle farmers are in a sticky situation with customers after a truck spilled thousands of Skittles intended as animal feed on County Highway S. The candy, all pink and carrying the brand’s distinctive white ‘S’, didn’t meet quality standards and was actually cheaper than corn.
The Sheriff’s department reported that the Skittles were boxed up in the back of a flatbed truck. Due to rain, the crates got wet, slipped onto the road, and broke apart, spilling candy everywhere. Highway maintenance teams were deployed to dispose of the sweets.
In the wake of the incident, public voices raised concerns that the practice would negatively impact the quality of meat. Experts, however, say that there’s no cause for alarm — the practice has been going on for a few years now. The candy (especially defective batches) is not only cheaper than traditional feed, but may actually provide a host of other benefits.
“Cows need carbohydrates, as well. They need sugar. It provides energy and calories for them,” said Liz Binversie, Brown County UW-Extension Agriculture Educator. “Your body doesn’t really distinguish candy vs syrup vs corn vs whatever,” she added.
“It actually has a higher ratio of fat (than) actually feeding them straight corn,” said Joseph Watson, owner of United Livestock Commodities, who swapped for candy during the 2012 drought when corn prices skyrocketed.
And some argue that the practice may also be more environmentally friendly than using traditional feed and throwing the candy out. John Waller, a professor of animal nutrition at the University of Tennessee, told NBC:
“I think it’s a viable (diet).”
“It keeps fat material from going out in the landfill, and it’s a good way to get nutrients in these cattle. The alternative would be to put (the candy) in a landfill somewhere.”
I do see his argument — waste is nobody’s friend. But how can Skittles be ‘defective’? It’s candy. It’s supposed to be sweet, and that’s all it has to be. If you look at the bigger picture, producing a pound of the stuff has a higher impact than producing a pound of corn. It makes sense to feed it to cows rather than dumping it, sure, but I’d rather not have to make the choice in the first place.
Still, with corn prices at an all-time high, it’s unlikely that farmers will wean themselves off candy any time soon. At the end of the day, the cows get to chow down on some sweets, and I guess that’s nice.
Another upside to the whole story is that the Dodge County Sheriff’s posts about the incident are pure gold:
They later said that the crash actually helped, since the roads had been icy for days and the candy provided “extra traction”.
A Texas hospital has performed four uterus transplants from live donors, one of which seems to be successful so far. This marks the first time such a procedure has been performed in the US.
Image credits Hey Paul Studios / Flickr.
The four women underwent transplants in September. They all have a condition called Mayer-Rokitansky-Küster-Hauser syndrome, which caused them to be born without a uterus. So far, three of the organs had to be removed as they weren’t getting enough blood flow, and the doctors feared the possible complications that might have developed. The fourth patient is stable and the transplant seems to be doing well so far. A statement by Baylor Scott & White, the hospital that performed the transplants, says that the surgical team was “cautiously optimistic” that the fourth uterus would be functional.
“This is the way we advance, from learning from our mistakes,” lead surgeon at Baylor University Medical Centre in Dallas, Giuliano Testa, told Time.
“I am not ashamed of being the one who will be remembered as the guy who did four [transplants] in the beginning and three failed. Even if through failure, I am going to make this work.”
The procedure is still experimental and has a high failure rate. More research and operations are needed before it’s deemed safe. Even if the surgery becomes widely available, it’s very likely to take a huge financial toll on potential patients. But for women born without a uterus, or for those who’ve had it damaged or removed, undergoing such a transplant might be their only chance at getting pregnant and having children.
[panel style=”panel-success” title=”Here’s the basic rundown” footer=””]Surgeons take the uterus and part of the vagina from a donor, living or deceased.
This is then implanted in the patient. Surgeons connect the uterus to the body’s circulatory system and attach it to the vagina and pelvis. No nerves need to be attached.
If the transplant is successful, the patient should be able to safely get pregnant in about 6 to 12 months’ time. In vitro fertilisation will be used (as the uterus is not connected to the ovaries).
The woman will have to deliver via a C-section.[/panel]
At this point, the doctors are sadly not sure if the fourth case will be a success or not.
UK doctors plan to perform the procedure using non-living donors in the near future, but for now, Sweden is the only country apart from the US where such transplants have been performed successfully. The nine procedures in Sweden used live donors — like the Texas ones — and some of the women went on to have children. The Swedish experts helped the Baylor team during the operations.
With the Zika virus running rampant through South America, outbreaks could pop up in several US cities. A study from the National Center for Atmospheric Research (NCAR) estimated this hazard in the largest cities in the US, finding that the south and especially the southeast is quite vulnerable to the threat posed by Zika.
Many US cities face potential risk in summer of low, moderate, or high populations of the mosquito species that transmits Zika virus (colored circles). The mosquito has been observed in parts of the United States (shaded portion of map) and can establish populations in additional cities because of favorable summertime meteorological conditions. In addition, Zika risk may be elevated in cities with more air travelers arriving from Latin America and the Caribbean (larger circles). Credit: Image based on data mapped by Olga Wilhelmi, NCAR GIS program.
Key factors can combine to produce a devastating Zika outbreak, and those unfortunate conditions may very well align in some American cities. The Aedes aegypti mosquito, which is spreading the virus in much of Latin America and the Caribbean, will move farther and farther north as the weather warms up. The east coast is in a similar situation, with higher temperatures than most of the country. NCAR models showed that summertime weather conditions are highly favorable for mosquito populations as far north as New York City and across the southern tier of the country as far west as Phoenix and Los Angeles. Wintertime conditions are too cold across the whole country, bar southern Florida and Texas. However, it’s these (often impoverished) areas that are especially vulnerable in the case of an outbreak.
“This research can help us anticipate the timing and location of possible Zika virus outbreaks in certain U.S. cities,” said NCAR scientist Andrew Monaghan, the lead author of the study.
“While there is much we still don’t know about the dynamics of Zika virus transmission, understanding where the Aedes aegypti mosquito can survive in the U.S. and how its abundance fluctuates seasonally may help guide mosquito control efforts and public health preparedness.”
“Even if the virus is transmitted here in the continental U.S., a quick response can reduce its impact,” added NCAR scientist Mary Hayden, a medical anthropologist and co-author of the study.
The study doesn’t propose a fixed chance for this year, but even in the case of an outbreak, it wouldn’t be as dramatic as it was in South America. A higher percentage of Americans live in air-conditioned homes or sealed offices, and green areas and parks are often sprayed with insecticide. But this doesn’t mean that there is no risk. Aside from meteorological conditions, poverty and lack of access to proper sanitation also favor the spread of the virus. Add to this the high mobility of people in the US, and you could end up with a recipe for disaster. All in all, this is a complex issue, and a disease we don’t properly understand yet. If we want to avoid a global outbreak, basic precautions have to be taken.
“The results of this study are a step toward providing information to the broader scientific and public health communities on the highest risk areas for Zika emergence in the United States,” said Kacey Ernst, an epidemiologist at the University of Arizona and co-author of the study. “We hope that others will build on this work as more information becomes available. All areas with an environment suitable to the establishment of Aedes aegypti should be working to enhance surveillance strategies to monitor the Aedes aegypti populations and human populations for disease emergence.”
“This research highlights the complex set of human and environmental factors that determine whether a mosquito-borne disease is carried from one area to another, and how severely it affects different human populations,” said Sarah Ruth, program director in NSF’s Division of Atmospheric and Geospace Sciences. “By integrating information on weather, travel patterns, mosquito biology, and human behavior, the project team has improved our ability to forecast, deal with, and possibly even prevent future outbreaks of Zika and other serious diseases.”
Andrew Monaghan, Cory Morin, Daniel Steinhoff, Olga Wilhelmi, Mary Hayden, Dale Quattrochi, Michael Reiskind, Alun Lloyd, Kirk Smith, Christopher Schmidt, Paige Scalf and Kacey Ernst. On the seasonal occurrence and abundance of the Zika virus vector mosquito Aedes aegypti in the contiguous United States. PLOS Currents Outbreaks.
While most of the world is trying to reach a climate agreement that would help preserve our planet’s climate for future generations, some of the US presidential candidates just don’t get it – or don’t want to get it.
Despite an overwhelming scientific consensus (97% of climate scientists), despite obvious effects and forecasts, and despite economic indicators, many of the Republican candidates for the presidency remain unconvinced.
While George W. Bush completely ignored climate change, his brother Jeb goes even further — “I’m a skeptic. I’m not a scientist” seems to be his catchphrase — even attempting to discredit climate science.
“I think the science has been politicized. I would be very wary of hollowing out our industrial base even further … It may be only partially man-made. It may not be warming by the way. The last six years we’ve actually had mean temperatures that are cooler [this is not correct]. I think we need to be very cautious before we dramatically alter who we are as a nation because of it.”
He also doesn’t seem to understand that everybody is allowed to have their own personal view – but when you’re in a leadership position, you need to take into account the cold hard facts.
“I think global warming may be real. … It is not unanimous among scientists that it is disproportionately manmade. What I get a little tired of on the left is this idea that somehow science has decided all this so you can’t have a view.”
Unfortunately, it’s not the first time we’ve covered Ted Cruz and his anti-science actions. After trying to prevent NASA from studying climate change, he is now flat-out denying mountains of climate change-related evidence.
“I just came back from New Hampshire where there’s snow and ice everywhere. And my view actually is simple. Debates on this should follow science and should follow data. And many of the alarmists on global warming, they’ve got a problem because the science doesn’t back them up. And in particular, satellite data demonstrate for the last 17 years there’s been zero warming, none whatsoever.”
His view is definitely simple – simple and wrong. But wait, you may be thinking that quote was somehow taken out of context… unfortunately, it’s not. He keeps repeating the same thing over and over again, simply ignoring factual reality.
“The last 15 years, there has been no recorded warming. Contrary to all the theories that they are expounding, there should have been warming over the last 15 years. It hasn’t happened.”
Rand Paul has generally been careful not to alienate any of his voters, and he never really went full-on climate change denier, at least not like Cruz or Donald Trump. But his views, although indirectly expressed, are evident.
“There is some question as to the validity of the science” that shows carbon emissions cause climate change.
He went on to admit that climate change is happening, but doesn’t believe it’s because of humans.
“While I do think man may have a role in our climate, I think nature also has a role,” Paul said. “The planet’s 4.5 billion years old. We’ve been through geologic age after geologic age. We’ve had times when the temperature’s been warmer, we’ve had times when the temperature’s been colder. We’ve had times when the carbon in the atmosphere’s been higher.”
Image via Wikipedia.
Ah yes, we’re finally here – the king of climate change denial, the tsar of preposterous statements, the master of ignoring facts, Donald Trump.
“You can’t watch the news anymore. It’s always weather,” said Mr. Trump, who added that China hasn’t done anything to address climate change – spoiler alert, they have.
Climate change is a hoax in his view, created by the Chinese… somehow.
The concept of global warming was created by and for the Chinese in order to make U.S. manufacturing non-competitive.
Wait, it gets even better:
It’s freezing and snowing in New York—we need global warming! This very expensive GLOBAL WARMING bullshit has got to stop. Our planet is freezing, record low temps,and our GW scientists are stuck in ice.
Not one to be easily outdone, Ben Carson simply stated:
“Gravity, where did it come from?”
I’ll just leave it at that…
Marco Rubio acknowledges that climate change is happening, but doesn’t believe that humans are causing it.
“I do not believe that human activity is causing these dramatic changes to our climate the way these scientists are portraying it.” Rubio’s energy plan would “effectively nullify an international climate change accord.”
Some would rather let their actions speak for themselves and not talk about it – such is the case with Chris Christie, whose measures in New Jersey included closing the Office of Climate Change. However, out of all the people on this list, Christie is probably the most moderate, suggesting that we should invest in all types of energy – including solar and wind.
“We worked with the private sector to make solar affordable and available to businesses and individuals in our state,” he said during Wednesday’s Republican debate.
Kasich is a strange case – deeply religious, he at one point said that climate change is a real problem requiring us to “protect” the “creation that the Lord has given us.” However, he went on to say that while he does believe in climate change (he has actually oscillated on this issue), he doesn’t want to do much about it – “we shouldn’t worship the environment,” he concluded.