Tag Archives: temperature

Last year was the warmest humanity ever recorded without El Niño, NASA warns

The Earth continues to heat up according to a NASA analysis that revealed 2017 was the second warmest year since global estimates became possible in 1880. More worryingly, it was only second to 2016, which saw increases in temperature caused by El Niño, the warm phase of the cyclical El Niño Southern Oscillation (ENSO). When correcting for the phenomenon, 2017 takes the lead.

Candles.

Image credits Cuddy Wifter.

The year continued the ignoble, decades-long warming trend of the globe — 17 out of the 18 warmest years on record have occurred between 2001 and today. NASA’s Goddard Institute for Space Studies (GISS) reports that globally-averaged temperatures in 2017 were 1.6° Fahrenheit higher than the 1951-1980 mean.

It came second as the warmest-ever recorded year behind 2016. However, temperatures then were bumped up by El Niño, which pushes warmer fronts of water from the western tropical Pacific Ocean towards the coast of South America. This movement of warm water causes warming effects across the globe.

When El Niño is factored out, 2017 becomes the warmest year ever recorded by humanity.

The Earth warmed up overall, but weather dynamics mean that this effect wasn't homogeneous across the face of the planet — as such, different locations experienced different amounts of warming.

“Despite colder than average temperatures in any one part of the world, temperatures over the planet as a whole continue the rapid warming trend we’ve seen over the last 40 years,” said GISS Director Gavin Schmidt.

The strongest warming trends were seen in the Arctic, which continued to bleed ice cover and volume in 2017, the report adds.

National Oceanic and Atmospheric Administration (NOAA) researchers produced an independent analysis on the subject, which strongly confirms NASA’s findings, with the exception that NOAA lists 2017 as the third warmest year on record.

The two agencies used different methods to analyze temperatures across the globe, which created this small difference in ranking. NASA tracks temperatures using a combination of data from 6,300 weather stations, ship- and buoy-based recordings of sea-surface temperatures, and readings from research stations in the Antarctic. The algorithm this data is fed through is designed to account for sources of interference and produce global average temperature deviations from the 1951-1980 baseline period, according to NASA.
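To make the notion of a baseline anomaly concrete, here is a minimal sketch (this is not NASA's GISTEMP code, and the yearly values below are entirely made up) of how a global figure becomes a deviation from the 1951-1980 mean:

```python
import numpy as np

# Hypothetical yearly global mean temperatures (degrees C), keyed by year.
temps = {year: 14.0 + 0.01 * (year - 1950) for year in range(1951, 2018)}

# The baseline is the 1951-1980 mean, the reference period NASA GISS uses.
baseline = np.mean([temps[y] for y in range(1951, 1981)])

# An anomaly is simply a year's departure from that baseline mean.
anomaly_2017 = temps[2017] - baseline
print(f"2017 anomaly relative to 1951-1980: {anomaly_2017:+.2f} C")
```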

This processing (and the fact that NOAA uses its own algorithms) is why the two agencies’ results diverge slightly. However, the findings reported in both documents largely overlap, and both agree that the five warmest years on record have taken place since 2010.

Temp recording over time.

They say an image is worth a thousand words.
Image credits NASA.

“NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth’s polar regions and global temperatures,” the NASA report reads.

NASA estimates that the global mean change in temperature they calculated is accurate to within 0.1°F at a 95% certainty level — which is very solid. The uncertainty arises from changes in measurement practices over time and from some weather stations being relocated over the studied period.

 


Gemstones prove asteroid impact was the hottest event ever recorded on Earth

Snow-clad Canada boasted, for the briefest of moments, the highest temperature ever recorded on Earth. The blistering event, clocking in at some 2370°C (4300°F), was the product of a chunk of space rock crashing down on the planet. Researchers looking into the event say we had no idea “real rocks can get that hot,” underscoring how devastating meteorite impacts can be.

Mistastin lake.

Image credits Jcmurphy / Wikimedia.

Some 40 million years ago, in what we now call Canada, an otherworldly visitor dropped in. The meteorite crashed with such speed and force that it heated rocks at the impact site to nearly half the temperature of the Sun's surface — some 2370°C (4300°F). This event, which holds the distinction of being the highest temperature witnessed on our planet, was recorded in gemstones formed under that immense release of heat.

Falling skies

Impacts between space-faring matter and the Earth release a monumental amount of energy — which leads to some mind-bogglingly high temperatures developing in the collision zone. In fact, the more energetic of such impacts (those which involved larger pieces of space rock) shaped the planet into what we know today. They affect the chemical makeup of the atmosphere and crust, directly playing a part in Earth’s habitability, and could even have created the Moon.

Pinning an actual number on these temperatures, however, is a tricky proposition. For one, notable impacts took place a very long time ago, on the order of millions of years. Secondly, because they're near-apocalyptic events, the shock waves released during impacts tend to literally vaporize both the meteorites and the rocks they hit, which is really bad news if you're trying to analyze those rocks and estimate how much heat they were subjected to.

So researchers have an idea of the upper extremes these temperatures can reach — estimated to be well over 2000°C (3632°F) — but no way to refine that estimate since there’s no geological evidence to test it against. After all, you’d need something that can shrug off an event of such magnitude it turns rocks into thin air.

Thankfully, a team led by Nicholas Timms, a senior researcher and lecturer in geology at Curtin University in Perth, Australia, has found one such substance. Their work focused on the 28-kilometer (17.4-mile) wide Mistastin Lake crater in Labrador, Canada, estimated to have been the site of a violent impact some 40 million years ago. The team reports that there was enough energy released during the impact to fuse rock-borne zircon into gem-like cubic zirconia, whose minimum formation temperature is 2370 °C.

“These new results underscore just how extreme conditions can be after asteroids strike,” Timms says.

“Nobody has even considered using zirconia as a recorder of temperatures of impact melts before. This is the first time that we have an indication that real rocks can get that hot.”

The finding showcases how extreme conditions can become in the few minutes after an asteroid impact, and offers us a sort of benchmark for shelters if we're ever faced with such an event — for which we may be long overdue.

It also helps offer us a glimpse into the environment of early Earth, which was constantly and repeatedly bombarded from space. The team says these impacts could have churned the crust enough to keep hydrogen, carbon, and sulfur in circulation in the atmosphere. These elements are fundamental to life as we know it (you need to mix both hydrogen and oxygen to get water). However, they point out that too severe a bombardment would have negatively impacted the planet’s climate and chemical balance, making it less habitable in the long run.

The paper “Cubic zirconia in >2370 °C impact melt records Earth’s hottest crust” has been published in the journal Earth and Planetary Science Letters.


Corrected satellite data shows 2.4 times faster warming than previously indicated

New findings could throw (another) huge wrench in the workings of climate denier rhetoric. After correcting for an error in satellite data acquisition, scientists report not only that global warming is taking place in the lower atmosphere — but that it's way worse than we thought.

Satellite.

Beep beep.
Image via Pixabay.

A big Achilles Heel for people who actually base their opinions on facts and real information (go team!) is that we’re swayed by facts and information. So, naturally, we were quite bummed and more than a little perplexed when the folks over in the climate denier corner (booo) first threw satellite data on the table: it seemed to indicate that the Earth’s lower atmosphere wasn’t warming as quickly as we thought. This was especially strange since it was the one piece that didn’t really fit in with any other data — all of which pointed to rising temperatures caused by human activity.

But now, we know why it didn’t fit in: it was wrong. A new paper explains why this happened and describes a method to correct for the errors, showing that, in fact, warming is way faster than we’ve believed.

Bad timing

Satellites don’t measure ground temperature directly — however, atmospheric temperature readings can be used to infer surface and sea temperatures, and a warmer troposphere (the lowest layer of the atmosphere, where weather takes place) is consistent with what climate models show would happen as greenhouse gas levels increase in the atmosphere.

The faulty data was caused by satellites' tendency to gradually inch closer to Earth as they're slowed down by friction with the gases in the upper atmosphere (orbital decay). These satellites are programmed to re-record and compare temperature data at various points at the same time of day as they circle back around, and thus spot any difference in temperature — over time, this data can be used to find a general trend, be it cooling, warming, or a flat trend-line.

However, as they're calibrated to record data based on a constant orbit, changes to that orbit skew the figures they measure. Slight alterations may not impact the data very much, but for some satellites the accumulated drift makes a huge difference — they might fly over a spot they're programmed to record at 2 pm as late as 6 pm, or even 8 pm, with a massive effect on temperature readings.

Satellite passing times.

Equator crossing times for each of the satellites used in this study. Thinner lines indicate portions of mission that were not used for the study.
Image credits Mears et al., 2017.

Using information from the satellites, Dr Carl Mears and Frank Wentz of Remote Sensing Systems, a California-based research company, developed a way to correct for changes in data recording time. Correcting satellite data is, in fact, a pretty common occurrence. We've only had satellites a short while — we've only been using them to measure temperature since the '70s, and the first data sets were put together in the '90s — so researchers have had to fix various teething problems while learning how to use them properly. This time, the team had to make corrections:

“[…] primarily due to the changes in the adjustment for drifting local measurement time. The new dataset shows more warming than most similar datasets constructed from satellites or radiosonde [meteorological balloon] data,” the paper explains.

They used various approaches such as combining readings from different satellites which recorded the same spot both on mornings and evenings, using climate models to account for atmospheric temperature changes during the day, and including data from surface or weather balloon readings and other instruments to verify their data.
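As a rough illustration of what a drift-time correction does — the authors' actual method blends multiple satellites and model-derived diurnal cycles, so treat the sinusoidal model and the numbers below as purely hypothetical:

```python
import math

def diurnal_cycle(hour, amplitude=1.0, peak_hour=15.0):
    """Toy model of the daily temperature cycle: a sinusoid peaking mid-afternoon."""
    return amplitude * math.cos(2 * math.pi * (hour - peak_hour) / 24.0)

def correct_to_nominal_time(reading, actual_hour, nominal_hour=14.0, amplitude=1.0):
    """Shift a reading taken at actual_hour back to the nominal overpass time."""
    drift_bias = diurnal_cycle(actual_hour, amplitude) - diurnal_cycle(nominal_hour, amplitude)
    return reading - drift_bias

# A satellite that drifted from a 2 pm to a 6 pm overpass samples a cooler part of
# the day; the correction adds back the slice of the diurnal cycle it now misses.
print(correct_to_nominal_time(reading=250.0, actual_hour=18.0))
```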

The corrected data set doesn’t paint a cool picture at all.

The results varied based on the methods the team used to correct the initial data and the time of day when it was recorded, with temperature trends over 1979-2016 ranging from +0.13°C/decade to +0.22°C/decade. The duo says that the most ‘reasonable’ set of parameters was chosen, with a final temperature trend of +0.17°C/decade. That's over one-third (36%) faster than what previous satellite readings indicated for the 1979-1998 interval, and nearly 2.4 times (140%) faster for the 1998-2016 interval.
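For clarity, "X% faster" translates into a multiple of the earlier rate like so (simple arithmetic on the figures quoted above):

```python
def times_as_fast(percent_faster):
    """Convert 'X% faster' into 'Y times the earlier rate'."""
    return 1.0 + percent_faster / 100.0

print(times_as_fast(36))   # 1.36 times the earlier 1979-1998 estimate
print(times_as_fast(140))  # 2.4 times the earlier 1998-2016 estimate
```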

Overly under-estimated

A worrying implication of the new data set is that we may have underestimated how quickly the lower troposphere is warming from ground-level measurements alone — which means that climate, on the whole, may be warming up much faster than we’ve believed. Talking with Carbon Brief about the findings, data scientist Dr Zeke Hausfather (who wasn’t part of the study) said that “the new data actually shows more warming than has been observed on the surface, though still slightly less than predicted by most climate models.”

“Since the temperature changes since 1979 are on the order of 0.6C or so, it is relatively easy for bias, due to changing observation times, to swamp the underlying climate signal.”

All in all, ground measurements are way more reliable than satellite data since there are a lot more measurements taken at ground level to compare between and exclude errors. In comparison, there are only a few satellites which can measure temperatures in a spot at any one time — a total of 15 temperature-reading satellites have been launched since 1979, and only about 2 or so are on active measuring duty at any one time. So errors such as the one this paper addresses can slip through quite easily.

But that also shows us we’re moving in the right direction. Correlating satellite data more closely with surface trends (which we know are reliable) only helps to increase confidence in the former.

The paper “A satellite-derived lower tropospheric atmospheric temperature dataset using an optimized adjustment for diurnal effects” has been published in the Journal of Climate, a journal of the American Meteorological Society (AMS).


Scorching exoplanet is hotter than most stars. It’s so hot it might even leave a trail of atomic gas like a comet

Some 650 light-years away from Earth lies one of the hottest planets known to astronomers. Called KELT-9b, the planet's temperature runs in excess of 4,300°C (7,700°F), only about 1,300°C short of our sun's surface temperature. For comparison, Venus, the hottest planet in our solar system and a common illustration of what hell ought to look like, registers only about 460°C at its surface.


An artist’s conception of the KELT-9 system, which has a host star (left) that’s almost twice as hot as our sun. Credit: NASA/JPL-Caltech/R. Hurt (IPAC).

A planet hotter than most stars we know of

This extreme gas giant was first discovered in 2014 by scientists working with the Kilodegree Extremely Little Telescope, or KELT for short. To find KELT-9b, researchers simply had to measure its parent star's brightness as it dimmed when the exoplanet passed between the star and the telescope's lens. These dips occurred extremely fast, about every 36 hours, which means a year on KELT-9b lasts only about one and a half Earth days. Hardly any scientist expected to need absorption coefficients of molecules and atoms for exoplanet atmospheres beyond 3,000 Kelvin, but here we are. That's super extreme by many accounts. And that's not all.

Unlike the planets of our solar system, which all orbit around the sun's equator, KELT-9b orbits around its parent star's poles. It's also tidally locked, always facing its star with the same side, like the moon does in relation to our planet. This means the hottest part of the planet is the side facing its star, but the night-side is also very hot — hotter than Proxima Centauri, our nearest stellar neighbor, which has an exoplanet of its own.

The data from the observations was so flabbergasting that the researchers couldn't tell at first whether this was a star or an exoplanet. Scott Gaudi, a professor of astronomy at Ohio State University, came down on the ‘this is a planet’ side and even made a bet with a colleague over a bottle of single-malt scotch, which he apparently won.

So what would KELT-9b look like? Probably like Jupiter in a frying pan. At such extreme temperatures, molecular bonds are broken apart, so everything exists as individual atoms. There can't be any methane, water vapor, or CO2. The intense heat puffs up this cloud of atomic gas like a soufflé — a world nearly three times more massive than Jupiter but only half as dense, as reported in the journal Nature.

“Its day side would be very bright orange. Its night side would be very dark red. And it would have a cloud of evaporating hydrogen and helium, which would actually look violet,” Gaudi told NPR. 


Credit: NASA/JPL-Caltech/R. Hurt (IPAC).

That's a very strange sight indeed, which is why some scientists are calling this a planet-star hybrid, with the important distinction that there is no nuclear fusion going on inside KELT-9b's core. One wild hypothesis surrounding KELT-9b is that it might have a tail akin to a comet's, shedding hydrogen gas as it travels around its solar system. But that's just a wild hypothesis with no substantial proof behind it at this point. We might learn more if Gaudi and colleagues successfully convince the right people to point the Hubble telescope at KELT-9b.


KELT-9b, the hottest gas giant we know of, peculiarly orbits its parent star over the poles, not the equator. Credit: NASA/JPL-Caltech/R. Hurt (IPAC).

“KELT-9 radiates so much ultraviolet radiation that it may completely evaporate the planet. Or, if gas giant planets like KELT-9b possess solid rocky cores as some theories suggest, the planet may be boiled down to a barren rock, like Mercury,” co-author Keivan Stassun, professor of physics and astronomy at Vanderbilt, told Wired.

Whatever the case, KELT-9b shines brightly, almost like a star. It's hotter than many K-type yellow-orange stars, after all. And, as expected, it lives fast and dies young: as its host star, KELT-9, runs out of hydrogen, it should cool and swell to three times its current size, eating our hot tomato in the process. That's in a billion years or so.

“As we seek to develop a complete picture of the variety of other worlds out there, it’s important to know not only how planets form and evolve, but also when and under what conditions they are destroyed,” said Stassun.

KELT-9b is certainly the hottest gas giant we've come across so far, but it's not the hottest exoplanet. That distinction belongs to Kepler-70b, a small, rocky planet with a surface temperature of 6,870 degrees Celsius (12,398 degrees F).

 

 


Urban heat island effect could almost triple the cost of climate change in cities, burn economies to a crisp

Cities might feel the heat of climate change almost twice as badly as the rest of the world due to the urban heat island effect, a new paper reports. Unless cities adapt to ensure more incoming energy is reflected or otherwise put to use, this effect will put a huge dent in the economy, it further reads.

Tokyo.

Tokyo, one of the densest urban centers in the world. Not a place you want to be in on a warm day.

It's a hot day, your apartment is stifling, and the AC just doesn't cut it — so you decide to take a stroll in the park by the water to cool down. Or maybe go on that long-overdue hike and get a break from the city altogether. Congratulations! You've unknowingly felt the effects of, and then counteracted, the urban heat island effect.

There are millions more who, just like you, are feeling the heat. And that's actually part of the problem. The ‘urban heat island effect’ is just what it sounds like. These areas of higher ambient temperatures form when natural surfaces like vegetation or water, which tend to reflect or use incoming energy, are replaced by artificial surfaces such as concrete or asphalt, which trap incoming heat from sunlight. The high concentration of cars, air-conditioning heat sinks, people, and so on in cities also means there's a lot of anthropogenic heat, which further drives up ambient temperatures.

Throw climate change in the mix, and it’s only going to get worse. Worse enough, in fact, that it’s going to tank the economy.

Paying the cooling bills

Published by an international team of economists, the study is the first to look at how major cities will fare under global as well as local changes in climate — and it's not at all encouraging. The analysis included 1,692 cities around the world to quantify the effects of rising temperatures throughout climate zones (and across countries and cultures) on urban GDP, the backbone of modern economies, and found that the costs of climate change for cities this century could be 2.6 times higher than we've believed once you factor in the heat island effect.

Overall, the team reports that we're likely looking at a decrease of 5.6% of Gross World Product by the end of the century, but the effects won't be distributed uniformly. In the worst-affected cities, for example, climate change-induced losses could shave off as much as 10.9% of GDP by the end of the century.

City sunset.

You could say the profits will melt away.
Image credits Rogerio Rogeriomda.

Particularly bad news since cities, although they cover only around 1% of Earth’s surface, churn out about 80% of Gross World Product, consume about 78% of the world’s energy, and house more than half of the world’s population.

So how do the two tie together?

Well, on the one hand, mean temperatures exceeding 13 degrees Celsius (or 55 Fahrenheit) seem to reduce human productivity, and an unstable climate will also eat away at entrepreneurship, meaning there's less going into the lump sum we call GDP. On the other hand, higher temperatures mean higher expenses: more energy (which translates to costs) used for cooling, higher medical care costs due to falling air quality, lower productivity, and even the costs of rioting and social unrest over lack of food and higher levels of aggression — all very nice stuff.

As a side-note, a lot of very important cities might not be viable anymore — no city, no city’s GDP.

The research puts the issue of climate change into perspective. The discussion today revolves around tackling this change — as it well should be. But at the same time, it’s easy to lose sight of the fact that local interventions to mitigate the effects of warming climate are equally important for our economies and quality of life.

“Any hard-won victories over climate change on a global scale could be wiped out by the effects of uncontrolled urban heat islands,” said Richard S.J. Tol, Professor of Economics at the University of Sussex, in a statement.

“We show that city-level adaptation strategies to limit local warming have important economic net benefits for almost all cities around the world.”

The paper further looks at the measures which could limit the costs of this effect, and whose implementation should, therefore, be a top priority for ruling bodies.

De-islanding

To find the most desirable solution, the team performed a cost-benefit analysis for a number of local policies including ‘cool’ pavements and roofs, which are designed to reflect sunlight and thus absorb less heat, increasing vegetation cover including green roofs, and so on.

One Central Park facades.

One Central Park, Sydney — because you can do good for the climate, bring down temperature, and look awesome while doing it.

Medium-scale implementation of cool pavements and roofs came out on top, echoing findings regarding climate change in general that mitigation is the best policy. Turning 20% of a city's roofs and pavement surface to the cool variety would save up to 12 times their installation and maintenance costs and reduce ambient temperatures by 0.8 degrees Celsius over the following century — not a bad result. The 20% point is just the sweet spot — implementing this policy on a wider scale will provide even more benefits, but at a lower cost-efficiency. Another thing to keep in mind is that successful global climate change mitigation efforts will compound the effects of these local policies, so we should really be working on both fronts here. As Professor Tol concludes:

“It is clear that we have until now underestimated the dramatic impact that local policies could make in reducing urban warming. However, this doesn’t have to be an either/or scenario. In fact, the largest benefits for reducing the impacts of climate change are attained when both global and local measures are implemented together.”

“And even when global efforts fail, we show that local policies can still have a positive impact, making them at least a useful insurance for bad climate outcomes on the international stage.”

The full paper “A global economic assessment of city policies to reduce climate change impacts” has been published in the journal Nature Climate Change.


Watch the Grand Canyon overflowing with clouds in the wake of atmospheric inversion

Filmmaker Harun Mehmedinovic recorded this breathtaking video of the Grand Canyon turned sea-of-clouds in the wake of a total temperature inversion and a particularly chilly night.

Timelapse_01.

For the most part, air is warm near the ground and gets progressively colder the further up you go, because it's the ground that radiates heat. Because of how thermal expansion works, however, the lower layer of air pushes its way up once it gets warm enough. This upward motion ultimately culminates in the formation of clouds, as the upward drafts of air carry moisture from the ground up to colder layers, where it condenses. This effect leads to a variety of different types of clouds.

But in some very rare conditions, the bodies of air can undergo a spectacular phenomenon known as a total temperature inversion. In case the name wasn’t a dead giveaway, it basically consists of cold air (which is denser) getting trapped at ground-level under a cap of warm air. Although the two bodies of air are flipped over, the moisture is still at ground level — and now the body of cold air there is too, so clouds form around your feet.
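A minimal way to picture this is to check where temperature rises with height instead of falling; the profile below is made up purely for illustration:

```python
# Hypothetical temperature readings (degrees C) at increasing altitudes (meters).
profile = [(0, 2.0), (200, 4.5), (400, 7.0), (600, 6.0), (800, 4.0)]

def find_inversion_layers(profile):
    """Return altitude pairs where temperature rises with height -- an inversion."""
    layers = []
    for (z1, t1), (z2, t2) in zip(profile, profile[1:]):
        if t2 > t1:  # normally temperature drops as you climb
            layers.append((z1, z2))
    return layers

# Cold, moist air trapped below roughly 400 m is where the 'sea of clouds' forms.
print(find_inversion_layers(profile))  # [(0, 200), (200, 400)]
```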

That’s exactly what you see happening in the video above. It’s part of the Skyglow Project, a crowdfunded project that aims to explore the effects of urban light pollution by examining some of the darkest skies across North America. It was shot after a cool, rainy night on the Grand Canyon. Moisture got trapped and condensed in the canyon, filling it to the brim with a sea of clouds.

Because inversions are rare in and of themselves and the Grand Canyon is usually so dry, you can catch the fog-filled vistas here only once every several years.

Video credit to Skyglow Project. Still taken from video.

Bio-compatible wireless sensors developed to monitor brain injury

An international research team has developed miniaturized devices to monitor living brain tissue. When no longer needed, the devices can be deactivated to dissolve and be reabsorbed into the soft tissue. The wireless sensors were implanted into the brains of rats and successfully took intracranial pressure and temperature readings.


Dissolvable brain implant consisting of pressure and temperature sensors (bottom right) connected to a wireless transmitter. Kang et al, 2017

Electronic implants have long been used in the treatment of medical conditions. Ranging from the humble pacemakers and defibrillators given to cardiac patients all the way to futuristic brain-computer interfaces or injectable meshes that fuse with your brain, it’s hard to imagine today’s medical field without these devices.

Whether implanted permanently or only for short periods of time, the procedure always carries some risk — the devices can hurt surrounding tissues during implantation and their metallic components are prime real estate for bacteria, possibly leading to infections in the area. Not to mention the added risk and distress involved in removing these devices.

The novel device, developed by a research team with members from the United States and South Korea, is described in the journal Nature and could potentially overcome these limitations. Each device houses a pressure sensor and a temperature sensor, each smaller than a grain of rice, on a biodegradable silicone chip that rests on the brain and sends data via wireless transmitters attached to the outside of the skull.

The devices were successfully tested on live rats and recorded pressure and temperature changes that occurred as the animals drifted in and out of consciousness under anesthesia. They proved to be at least as accurate as, if not more accurate than, other devices currently available.

The team showed that by tweaking the sensors they could take measurements either at the surface of the brain or from up to about 5 mm below it. The researchers say the device can easily be modified to monitor a wide range of other important physiological parameters of brain function, such as acidity and the motion of fluids. It could also be used to deliver drugs to the brain and, with the incorporation of microelectrodes, to stimulate or record neuronal activity.

However, what truly makes these sensors unique is the materials that go into building them, the so-called “green electronics.” These materials are designed to be stable for a few weeks, then dissolve. If immersed in watery fluids (such as cerebrospinal fluid), these fully biodegradable and bio-compatible materials take about a day to fully dissipate. When the team examined the animals' brains after the tests, they found no indication of inflammation or scarring around the implantation site.

As well as being safer for the patient, the fabrication process is also cheaper and more environmentally friendly than that employed in existing technologies.

The next step in development is to test the devices in human clinical trials, the researchers said.

The full paper describing these devices is available online in the journal Nature.

The oxygen in the oceans has decreased by 2% in the past 50 years

A new study published in the science journal Nature reports that the ocean’s oxygen levels have dropped by more than 2% between 1960 and 2010 — and climate change is only a part of the picture.

Oceans are getting hotter, and they’re also losing oxygen. Credits: MODIS Aqua sea surface temperature 2003-2011 average.

Losing oxygen

Oxygen is a vital requirement both on land and in the sea. Ocean dwellers need oxygen just as we do, and lack of available oxygen can have dire consequences. In this study, oceanographers Dr. Sunke Schmidtko, Dr. Lothar Stramma and Prof. Dr. Martin Visbeck from GEOMAR Helmholtz Centre for Ocean Research Kiel, analyzed data on ocean salinity, temperature, depth, and oxygen since 1960 from several databases, and mapped it around the world. The startling discovery emerged when they mapped out oxygen — 80 billion metric tons, or 2% of the total oxygen in the oceans, was gone.
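Back-of-the-envelope arithmetic using only the figures above gives a sense of scale (an illustrative calculation, not part of the study itself):

```python
oxygen_lost_gt = 80.0    # billion metric tons lost between 1960 and 2010
fraction_lost = 0.02     # the reported 2% of the oceans' total oxygen
years = 2010 - 1960

total_oxygen_gt = oxygen_lost_gt / fraction_lost
print(f"Implied total oceanic oxygen: ~{total_oxygen_gt:,.0f} billion metric tons")
print(f"Average loss rate: ~{oxygen_lost_gt / years:.1f} billion metric tons per year")
```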

“Since large fishes in particular avoid or do not survive in areas with low oxygen content, these changes can have far-reaching biological consequences,” said Schmidtko, the lead author of the study.

While 2% might not seem like much, it really is. Even slight changes can have domino effects and can severely threaten the well-being of marine life. No one is safe from an oxygen depletion except for some bacteria.

“Just a little loss of oxygen in coastal waters can lead to a complete change in ecosystems — a small decrease in oxygen like this can transform from something desirable to very undesirable,” said David Baker, Assistant Professor at the University of Hong Kong’s Swire Institute of Marine Sciences.

This is the first time oxygen depletion was quantified for the oceans. Oceanic data is typically more sparse and the gaps are hard to fill in.

Image in public domain.

“To quantify trends for the entire ocean, however, was more difficult since oxygen data from remote regions and the deep ocean is sparse,” explains Dr. Schmidtko, “we were able to document the oxygen distribution and its changes for the entire ocean for the first time. These numbers are an essential prerequisite for improving forecasts for the ocean of the future.”

Like with climate change itself, this depletion is not uniform, and some areas are affected more than others. All areas experienced some oxygen loss, but the North Pacific experienced the biggest losses.

Causes

As so often happens these days, climate change is one of the culprits — but surprisingly, it's not the biggest one. Just 15% of this depletion can be attributed to rising temperatures. Oxygen dissolves into water, and as temperatures rise, the ocean loses its ability to hold it. This has another cascading effect: not only does oxygen escape from the surface, but rising temperatures also decrease the density of surface water, making it harder for oxygen-rich surface water to be carried down to the deeper parts of the ocean.

“It’s almost like the oceans are getting ready for a heart attack,” said Baker. “You’re essentially slowing the heartbeat of the ocean, and you’re getting less oxygen to the ocean.”

Another significant cause is the melting of sea ice, which leads to more plankton growth. More plankton means more plankton decomposition, which further decreases oxygen levels. The study also mentions “dead zones” — low-oxygen areas in the ocean's shallow waters — multiplying around the world's oceans. These areas pump out greenhouse gases, which further amplify climate change. Fertilizers running into the ocean encourage algal blooms, which greatly contribute to the development of these zones. Yet all in all, the causes of this oxygen depletion remain very difficult to gauge, and further research is still required to get the whole picture.

The authors conclude the study on a pessimistic note, stating that “far-reaching implications for marine ecosystems and fisheries can be expected.” They also foreshadow what might come:

“The oceans are really a mirror of human health — if they’re sick and dying, then that’s the future of humanity as well,” said Baker.
Journal Reference: Sunke Schmidtko et al — Decline in global oceanic oxygen content during the past five decades, Nature (2017). DOI: 10.1038/nature21399

Tiny aluminium drum cooled beyond quantum limit proves we can make things even colder. Possibly down to absolute zero


This microscopic vibrating drum was, at one point, colder than anything found in nature. Credit: Teufel/NIST

Nothing can be chilled below absolute zero (−273.15°C), the point where molecular motion essentially stops; strictly speaking, per Heisenberg's uncertainty principle, particles always retain some residual quantum motion, so their velocities never drop exactly to zero. It's a fundamental limit that can't seem to be broken. That's all fine and good — what bothers scientists, however, are other limits that keep them from cooling things near absolute zero.

For decades, researchers have used lasers to cool atoms to within a hair of absolute zero. However, when you try to cool something macroscopic, like a power cable or even a coin, close to zero, you hit a brick wall — a ‘quantum limit’ that keeps mechanical objects from getting too cold.

Physicists from the National Institute of Standards and Technology (NIST) weren't sure this really is a fundamental limit, and it's a good thing they experimented, because their findings suggest macroscopic objects can be cooled more than previously thought possible.

[ALSO SEE] The minimum and maximum temperatures 

Using lasers, the NIST team cooled an aluminum drum to 360 microKelvin or 10,000 times colder than the vacuum of space. The tiny vibrating membrane is 20 micrometers in diameter and 100 nanometers thick. It’s the coldest thing we’ve ever seen that’s larger than a few atoms across.

“The colder you can get the drum, the better it is for any application,” said NIST physicist John Teufel, who led the experiment. “Sensors would become more sensitive. You can store information longer. If you were using it in a quantum computer, then you would compute without distortion, and you would actually get the answer you want.”

“The results were a complete surprise to experts in the field,” Teufel’s group leader and co-author José Aumentado said. “It’s a very elegant experiment that will certainly have a lot of impact.”

Everyone's familiar with lasers, but firing lasers to cool stuff? It sounds counter-intuitive, because we all know lasers warm their targets — but that's only part of the story. The kind of lasers used for cooling fire at a specific angle and frequency, and typically multiple lasers are used. As a result of this clever tweaking, the photons actually end up snatching energy from the target instead of delivering it, and it's all done by literally pushing the atoms.

Confused? It gets elementary once you understand or remember what temperature actually is — the motion of atoms. That’s it. When we feel warm, atoms are whizzing past us faster. When it’s cold outside, the molecules in the air are moving slower. So, what scientists do when they fire lasers is they push these atoms in the opposite direction of their motion. As the photon gets absorbed by the target atom(s), the photon’s momentum is transferred.
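To put a number on "temperature is the motion of atoms", here's a quick sketch using the textbook kinetic-theory relation (3/2)k_B·T = (1/2)m⟨v²⟩. The drum is a collective oscillator rather than a gas of free atoms, so treat this purely as an order-of-magnitude illustration:

```python
import math

K_B = 1.380649e-23           # Boltzmann constant, J/K
M_AL = 26.98 * 1.66054e-27   # mass of a single aluminum atom, kg

def rms_speed(temperature_kelvin, mass_kg=M_AL):
    """Root-mean-square speed of a free particle at the given temperature."""
    return math.sqrt(3 * K_B * temperature_kelvin / mass_kg)

print(f"Room temperature (300 K): ~{rms_speed(300):.0f} m/s")
print(f"NIST drum (360 microkelvin): ~{rms_speed(360e-6):.3f} m/s")
```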

Laser light, however, like any light, comes in discrete packets of energy called quanta. This means there are gaps between packets, which give the atoms time to resume their motion. That's just how light works, and quantum mechanics seems to set a limit on how cold this approach can get things. Previously, NIST researchers used sideband cooling to reduce the thermal motion of a microscopic aluminum membrane that vibrates like a drumhead down to one-third of a quantum of motion.

The NIST researchers took laser cooling a step further by using ‘squeezed light’ — light that’s more organized in one direction than any other. By squeezing light, the noise, or unwanted fluctuations, is moved from a useful property of the light to another aspect that doesn’t affect the experiment. The NIST team used a special circuit to generate microwave photons that were purified or stripped of intensity fluctuations, which reduced inadvertent heating of the drum.

“Noise gives random kicks or heating to the thing you’re trying to cool,” Teufel said. “We are squeezing the light at a ‘magic’ level—in a very specific direction and amount—to make perfectly correlated photons with more stable intensity. These photons are both fragile and powerful.”

The NIST paper published in Nature seems to suggest squeezed light removes the generally accepted cooling limit. Teufel says their proven technique can be refined to make things even cooler — possibly even down to exactly absolute zero. And that, ladies and gentlemen, is the coolest thing you'll hear today.

“In principle if you had perfect squeezed light you could do perfect cooling,” he told the Washington Post. “No matter what we’re doing next with this research, this is now something we can keep in our bag of tricks to let us always start with a colder and quieter and better device that will help with whatever science we’re trying to do.”

‘Goldilocks area’ not nearly enough for habitable planets – internal temperature is also important

A Yale University researcher claims that the so-called Goldilocks planetary area only tells half of the story – in order for a planet to have the necessary temperature to support life, the starting internal temperature also needs to be right.

A new study suggests a planet must start with an internal temperature that is “just right” in order to support life. Credit: Michael S. Helfenbein/Yale University

In astronomy and astrobiology, the circumstellar habitable zone (CHZ), or simply the habitable zone, is the range of orbits around a star in which a planet can support liquid water, the fundamental condition for life as we know it. This habitable zone is colloquially called the ‘Goldilocks area’ as a metaphor from the beloved children’s fairy tale.

Since the concept was first presented in 1953, many stars have been shown to have a Goldilocks area, and some of them have one or several planets in this area. Naturally, distance from the star is the main factor. In our solar system, Venus is too close to the Sun, Mars is too far away, but Earth is at just the right distance.

Astronomers generally thought that a planet's mantle might self-regulate its internal temperature through convection (hot plumes are lighter, so they rise towards the surface, cool down, become heavier, and sink again, creating a regulating mechanism). However, this might not be the case, and if it isn't, then the distance to the star isn't the be-all and end-all for temperature.

“If you assemble all kinds of scientific data on how Earth has evolved in the past few billion years and try to make sense out of them, you eventually realize that mantle convection is rather indifferent to the internal temperature,” said Jun Korenaga, author of the study and professor of geology and geophysics at Yale. Korenaga presents a general theoretical framework that explains the degree of self-regulation expected for mantle convection and suggests that self-regulation is unlikely for Earth-like planets.

The implications of this are huge, and might force us to tighten the leash on what we thought to be Earth-like planets.

“The lack of the self-regulating mechanism has enormous implications for planetary habitability,” Korenaga said. “Studies on planetary formation suggest that planets like Earth form by multiple giant impacts, and the outcome of this highly random process is known to be very diverse.”

The curious thing is that mantle convection does exist, at least on Earth. But Korenaga says that if our planet’s starting temperature wasn’t in the right range, this would have never happened.

“What we take for granted on this planet, such as oceans and continents, would not exist if the internal temperature of Earth had not been in a certain range, and this means that the beginning of Earth’s history cannot be too hot or too cold.”

The research was published in the journal Science Advances.

Spider personalities are influenced by temperature

Although they might not be as unique as human personalities, animal personalities show a fairly large variation in specific traits such as shyness and aggressiveness, and scientists have long wondered why these differences exist and how they came to be. Now, a new study from researchers at the University of North Carolina (UNC) at Chapel Hill has discovered a connection between spider personalities and temperature changes, potentially bringing us closer to answering these questions.


Image credit Alex Wild

The team examined the Anelosimus studiosus, also known as the tangle web spider, which inhabits North Carolina as well as numerous regions across North and South America. Among the spiders in this species, there are two distinct personality types: highly aggressive and docile. Typically, these two types share the same living space and co-exist to care for brood and capture prey.

The study looked at the effect of temperatures from 75 to 93 degrees Fahrenheit on the spiders' ability to survive and reproduce individually within the colony. The results revealed that while aggressive spiders had a harder time surviving and reproducing at higher temperatures, docile spiders showed an opposing pattern: difficulty surviving and reproducing at lower temperatures.

Interestingly, when colonies possessed a mix of the two spider personalities, these effects disappeared – aggressive spiders didn't die off at higher temperatures and docile spiders didn't die off at cooler temperatures.

“Some aspect about living in a diverse society shields these aggressive spiders from selective pressures that would otherwise kill them,” said Spencer Ingley, a postdoctoral fellow at UNC College of Arts and Sciences and co-author of the study. “Without these diverse personalities, these spider societies would be more susceptible to extreme fluctuations in temperature – and it is interesting to think if our own society could benefit from diversity in a similar way.”

The results are particularly relevant today – with the planet's average temperature projected to rise by three to 12 degrees Fahrenheit by 2100 and numerous studies linking global warming to the death of coral and megafauna, scientists are continuing to keep their eyes peeled for the many unique effects of our planet's temperature increase.

“We live in a time of global change,” Ingley said. “Scientists are seeing that these changes can have a huge impact on individual organisms and groups of organisms. But people have rarely looked at personalities and how the personalities of groups can alter their response to these changes, particularly in different temperature environments.”

Could our planet’s continual warming be affecting our personalities in a way that we have yet to realize? It’s definitely possible, but we’ll just have to wait for further research to give us the final answer.

Journal Reference: Thermal effects on survival and reproductive performance vary according to personality type. 21 June 2016. DOI: 10.1093/beheco/arw084


‘Cool’ light improves learning and academic performance. ‘Yellow light’ better for relaxing

Our mood and concentration are heavily influenced by light, both natural and artificial. There are numerous benefits to sunlight, yet in this day and age most people are forced by circumstances to spend their daytime in offices or schools. Managers and parents alike might be interested in learning the best artificial light settings for optimized performance. New research investigated various lighting scenarios and reported the findings: for optimal learning performance, “cool” light is better, while “yellow” or “warm” light is the most relaxing. Ready to switch your light bulbs?


Image: Pixabay

Kyungah Choi and Hyeon-Jeong Suk, two researchers at the Korea Advanced Institute of Science and Technology (KAIST) in South Korea, studied learning in the context of different correlated color temperatures (CCTs). The CCT provides a way to describe the color appearance of a light source in terms of temperature. Yellowish white light, corresponding to a low CCT, sits below 3,500 Kelvin, while bluish cool light, corresponding to a high CCT, is found above 5,000 Kelvin.

Sunlight has a CCT of about 6,500 K, while incandescent bulbs emit light between 2500 K and 3000 K. Luckily LED lights can easily adjust their CCT, unlike incandescent or fluorescent lamps whose CCT stays fixed.
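A quick sketch of the warm-to-cool scale described above, using the article's own thresholds (the helper function is just for illustration):

```python
def describe_cct(cct_kelvin):
    """Rough warm/neutral/cool label for a correlated color temperature."""
    if cct_kelvin <= 3500:
        return "warm (yellowish white) - the relaxing end of the scale"
    elif cct_kelvin < 5000:
        return "neutral white"
    else:
        return "cool (bluish white) - the alerting end of the scale"

for cct in (3500, 5000, 6500):  # the three CCT conditions tested in the study
    print(f"{cct} K: {describe_cct(cct)}")
```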


The original fluorescent lighting (left) replaced with tunable LEDs (right). Credit: OSA Publishing

Choi and Suk first exposed a group of adult volunteers to various CCT conditions (3,500 K, 5,000 K, and 6,500 K) while electrodes strapped to their wrists and ankles measured heart rate. The highest CCT, 6,500 K, caused the highest level of psychological alertness among the participants, while 3,500 K was found to be the most relaxing.

 

Next, the same three lighting conditions were tested in a classroom setting with fourth-graders. The results suggest that the 6,500 K setting, which corresponds to heightened alertness, also enhanced academic performance. The fourth-graders performed better on recess activities under 3,500 K light.

[MUST READ] The LED sun: artificial light that completely mimics properties of sunlight

Using the three lighting presets derived from the empirical study, a dynamic lighting system was designed as a mobile application to generate highly optimized lighting according to students' activities. The application allows teachers to select the most appropriate lighting via three presets: “easy,” “standard,” and “intensive” modes.


Operational flow of the dynamic lighting system: 1) students are focusing on solving mathematical problems; 2) select the “intensive” lighting preset to support students’ academic performance; and 3) lower the illuminance level by pinching in the intensity icon for high intensity of daylight. Credit: OSA Publishing

The findings confirm the Yerkes-Dodson Law, an observation stated more than a century ago by psychologists Robert Yerkes and John Dodson. The “law” says there's a curvilinear relationship between mental arousal and performance: tests of the Yerkes-Dodson law suggest people perform at their best when their state of mental arousal is intermediate. In the discussion about light, this intermediate state is reached at 6,500 K, meaning that lights with an even higher CCT would make you too alert and drag performance back down from its optimal level.

“We believe that small changes in classroom environment, such as lighting conditions, could make a dramatic difference in supporting students’ learning,” Suk said.


Novel polymer changes shape just by touching with a finger — lifts 1,000 times its own weight doing so

This polymer can change shape and release tremendous amounts of stored elastic energy relative to its weight simply by being exposed to a temperature change. This in itself isn’t exactly new, but the team led by Chemical Engineering Professor Mitch Anthamatten at the University of Rochester innovated by making the polymer react to room temperature — a first.


A multiple-exposure image of a new shape-memory polymer reverting to its original shape after being exposed to body temperature. (University of Rochester photo / J. Adam Fenster)

The material belongs to a class called shape-memory polymers. These materials can be programmed to retain a specific shape until triggered to return to the original shape — in our case, by heat.

“Our shape-memory polymer is like a rubber band that can lock itself into a new shape when stretched. But a simple touch causes it to recoil back to its original shape,” said Anthamatten in a statement.

 

To reach this level of control, the crystallization that occurs when the polymer gets cooled or stretched was tweaked just right. When deformed, chains of the memory polymer become stretched, while smaller segments align in the same direction across small areas. This fixes the material in its new shape. The more of these aligned areas the material contains, the more stable the shape and the more difficult it is for the material to revert back to its original form.

Eventually, the researchers learned to tune the temperature at which the material rewinds by including molecular linkers that connect the individual polymer strands. In one demonstration, the polymer's trigger was set just above room temperature, so touching the material with one finger was enough to make it revert.

What's amazing is that, once triggered, the polymer releases its stored elastic energy. Anthamatten's shape-memory polymer is capable of lifting an object one thousand times its weight. For example, a polymer the size of a shoelace—which weighs about a gram—could lift a liter of soda. As a demonstration, his team used a small band of the material to pull a toy truck up an incline and to lift weights.

This is cool, but any practical use for it? Anthamatten says his shape-memory polymer could prove very handy as sutures, artificial skin, body-heat assisted medical dispensers, and self-fitting apparel.

We’re in December, but Washington’s flowers and trees are blooming

Image via CBS.

There's almost no need to say it again – it's been an exceptionally warm December, and an exceptionally warm year. In fact, it's been the hottest year on record, with 7 of 11 months so far breaking the record. Things aren't very different in the capital of the US, where temperatures have exceeded 50 degrees Fahrenheit (10 Celsius) almost every day… it's no wonder flowers are confused and have started blooming.

After cherry trees started blooming in November, roses, irises, azaleas and even vegetables have reportedly been blooming in Washington DC… and it’s not just DC. Twitter’s going crazy posting pictures of blooming plants, which should just not happen in the middle of winter.

If you have any pictures of plants going crazy in December in your area, feel free to share them with us and we’ll post them.


Smallest Swiss cross made of only 20 atoms demonstrates atom manipulation at room temp

Some applications require such a degree of precision that everything needs to be in exact order at the atomic scale. In an awesome feat of atomic manipulation, physicists from the University of Basel, in cooperation with teams from Japan and Finland, have placed 20 atoms atop an insulating surface in the shape of a Swiss cross. Such experiments have been achieved with success before, but the real highlight is that this is the first time anything like this was made at room temperature.


20 bromine atoms positioned on a sodium chloride surface using the tip of an atomic force microscope at room temperature, creating a Swiss cross with the size of 5.6nm. The structure is stable at room temperature and was achieved by exchanging chlorine with bromine atoms. Photo: Department of Physics, University of Basel

Since the 1990s, scientists have been able to manipulate surface structures by individually moving and positioning atoms. These sorts of demonstrations, however, were mainly made atop conducting or semiconducting surfaces, and only at very low temperatures. Fabricating artificial structures on fully insulating surfaces at room temperature has always proven to be a challenge, but the international effort proved it is possible.

[RELATED] IBM develops smallest storage device: 12 atoms for a single bit!

The team led by Shigeki Kawai and Ernst Meyer from the Department of Physics at the University of Basel used an atomic force microscope to place single bromine atoms on a sodium chloride surface. Upon reacting with the surface, the bromine atoms would exchange positions with chlorine atoms, and the researchers carefully repeated each step until they formed a lovely Swiss cross made up of 20 such atoms. The whole structure measures just 5.6 nanometers across. Effectively, the demonstration represents the largest number of atomic manipulations ever achieved at room temperature.

[ALSO READ] Incredible molecular imaging shows how chemical bonds really look like for the first time

By proving that atomic manipulation at this scale is achievable at room temperature, the scientists help pave the way for the next generation of electromechanical systems, advanced atomic-scale data storage devices, and logic circuits that will most likely use a scaled version of their process.

The paper appeared in the journal Nature Communications.


Temperature control and monitoring achieved at the cellular level

Temperature is an important physical parameter which greatly influences a system. Monitoring and/or manipulating this state parameter with great accuracy is thus of great importance to scientists. Recently, researchers who are part of DARPA's Quantum-Assisted Sensing and Readout (QuASAR) program demonstrated a new technique that allowed them to measure and control temperatures at the nanometer scale inside living cells. Measuring temperatures at such fine spatial resolutions will allow for better assessment of the thermal performance of high-grade materials, cell-specific treatments, and other health-related applications.

The QuASAR team, led by researchers from Harvard University, was able to measure sub-degree variations over a large range of temperatures, in both organic and inorganic systems, at length scales as small as 200 nanometers. They did so with an ingenious set-up built around gold nanoparticles and tiny diamond sensors; the latter is where the innovative part of the research lies.

Confocal scan of a single cell. The white cross corresponds to the position of the gold nanoparticle used for heating, while the red and blue circles represent the location of diamond sensors used for thermometry. The dotted white line outlines the cell membrane. (Credit: DARPA)

Diamonds have naturally occurring imperfections known as nitrogen-vacancy (NV) color centers, each capable of trapping one electron. Temperature fluctuations cause the diamond lattice to expand or contract, just as at the macroscale; think of railway tracks dilating when it's hot and contracting when it's cold.

This variation in the lattice also changes the spin properties of the trapped electron, which can be read out optically with a laser and then correlated with temperature.
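As a rough illustration of the read-out idea: the frequency of the NV centre's spin resonance drifts with temperature, so a measured frequency shift can be translated into a temperature change. The sketch below uses the commonly quoted room-temperature values for the NV zero-field splitting (~2.870 GHz) and its temperature coefficient (~ -74 kHz per kelvin); treat both as illustrative assumptions, not figures taken from this particular study.

# Minimal sketch of NV-centre thermometry read-out (illustrative values only).
# D0_HZ: zero-field splitting at the reference temperature.
# DDDT_HZ_PER_K: its temperature coefficient near room temperature.
# Neither number is taken from the DARPA/Harvard study itself.
D0_HZ = 2.870e9
DDDT_HZ_PER_K = -74e3

def temperature_shift(measured_splitting_hz: float) -> float:
    """Convert a measured zero-field splitting into a temperature change (K)."""
    return (measured_splitting_hz - D0_HZ) / DDDT_HZ_PER_K

# Example: a splitting lowered by 370 kHz corresponds to roughly +5 K of heating.
print(temperature_shift(2.870e9 - 370e3))  # -> ~5.0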

To demonstrate the technique, the scientists implanted 100-nanometer-diameter gold particles into a human cell alongside the diamond sensors, then heated the gold with a laser. By monitoring the diamond sensors, the researchers could characterize the local thermal environment around the cell, and by varying the power of the heating laser or the concentration of gold nanoparticles, they could modify that environment.

Using such a technique, researchers gain access to high-resolution thermal information and can study, for instance, nanoscale cracking and degradation caused by temperature gradients in materials and components operating at high temperatures. The electronics industry, where finely tuned tiny components are at play, stands to benefit a great deal. And since diamond is inert and doesn't interfere with chemical reactions, the technique could also be used to monitor and control chemical reactions through tiny shifts in temperature.

The findings were reported in the journal Nature.

 

At a few million atmospheres, hydrogen nears metallic conductivity

Hydrogen is the most common element in the Universe. It's the first element in the periodic table, with just one proton and one electron. Understanding how it behaves under very high pressures is crucial to our picture of matter and the nature of hydrogen-rich planets.


Under typical conditions, hydrogen is a diatomic molecule (H2); but as pressure increases, these molecules start to change. The different forms hydrogen takes are called phases, and it has three well-known solid phases. It has also been speculated that at very high pressures hydrogen starts acting like a metal and conducts electricity. A few bolder physicists believe it can even become a superconductor, or a superfluid that never freezes – a completely new and exotic state of matter.

In a new paper, a team from Carnegie's Geophysical Laboratory examined the structure, bonding, and electronic properties of highly compressed hydrogen using infrared spectroscopy.

The team found the new form to occur from about 2.2 million atmospheres at roughly 25 degrees Celsius (about 80 degrees Fahrenheit) up to at least 3.4 million atmospheres at about -70 degrees Celsius (roughly -95 degrees Fahrenheit).
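For readers more used to the gigapascal units common in the high-pressure literature, those figures translate to a few hundred gigapascals; a one-line conversion (1 atm = 101,325 Pa) makes that explicit.

# Convert the quoted pressures from atmospheres to gigapascals.
ATM_IN_PA = 101_325.0

def atm_to_gpa(atm: float) -> float:
    return atm * ATM_IN_PA / 1e9

print(atm_to_gpa(2.2e6))  # -> ~223 GPa
print(atm_to_gpa(3.4e6))  # -> ~345 GPa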

Their results showed that under these conditions, hydrogen acts like no other structure we know of. It contains two very different types of molecules: one that interacts only very weakly with its neighboring molecules (highly unusual for matter at such high pressures), and another that bonds with its neighbors, forming surprising planar sheets.

“This simple element–with only one electron and one proton–continues to surprise us with its richness and complexity when it is subjected to high pressures,” Russell Hemley, Director of the Geophysical Laboratory, said. “The results provide an important testing ground for fundamental theory.”

Via Carnegie

The minimum and maximum possible temperatures

Since the start of the year, I've received quite a few questions regarding absolute temperatures – the highest and the lowest possible – so I decided to start a brief discussion around the two values and lay out the basic facts about them. Feel free to step in and add more info or questions.

Absolute zero

In thermodynamics, absolute zero is the temperature at which entropy reaches its minimum value – entropy being a property that describes the energy not available for work or, in layman's terms, the degree of 'molecular disorder' of a substance. Absolute zero, or 0 K (zero on the Kelvin scale, which is typically used for absolute values), equals −273.15° on the Celsius scale and −459.67° on the Fahrenheit scale, and it is impossible to actually reach.
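Converting between the three scales is simple arithmetic; the quick sketch below just applies the standard formulas (kelvin to Celsius by subtracting 273.15, Celsius to Fahrenheit by the usual 9/5 rule).

# Standard temperature-scale conversions.
def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9.0 / 5.0 + 32.0

absolute_zero_c = kelvin_to_celsius(0.0)
print(absolute_zero_c)                         # -> -273.15
print(celsius_to_fahrenheit(absolute_zero_c))  # -> -459.67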

Scientists have managed to get extremely close to absolute zero – down to about 100 picokelvin, or 10⁻¹⁰ K – but, as I said, actually reaching it is impossible, at least with our current knowledge. Researchers have noted some remarkable properties of matter close to this temperature, such as superconductivity.

Now, quite a lot of people know about this; but what many don't know is that, just as there is a minimum possible temperature, there is also a maximum one, called the Planck temperature.

The Planck temperature and the maximum temperature

In the Planck temperature scale, 0 is absolute zero, 1 is the Planck temperature, and every other temperature is expressed as a fraction of it. This maximum temperature is believed to be 1.416833(85) × 10³² kelvin, and at temperatures above it, the laws of physics as we know them simply break down.

However, many don’t agree with this rather cosmological model and believe that as we continue to find out more and more things about the Universe we live in, the maximum temperature will continue to grow.

From what I have been able to find, the highest temperature obtained on Earth was about 3.6 billion degrees – well over a hundred times hotter than the interior of the Sun, yet still an utterly insignificant fraction of 10³² degrees.
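To put "insignificant fraction" in numbers, the back-of-the-envelope arithmetic looks like this: even 3.6 billion degrees is only about 10⁻²³ of the Planck temperature.

# How far is the hottest lab temperature from the Planck temperature?
PLANCK_TEMPERATURE_K = 1.416833e32   # kelvin
hottest_lab = 3.6e9                  # the ~3.6-billion-degree figure quoted above

fraction = hottest_lab / PLANCK_TEMPERATURE_K
print(f"{fraction:.1e}")  # -> ~2.5e-23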

A novel way to generate electricity

In the ever-growing search for new energy sources, scientists have started looking for simpler solutions, and what they found is that heat can be an incredible ally.

“In the search for new sources of energy, thermopower – the ability to convert temperature differences directly into electricity without wasteful intervening steps – is tremendously promising,” says Junqiao Wu of Berkeley Lab’s Materials Sciences Division (MSD), who led the research team. Wu is also a professor of materials science and engineering at the University of California at Berkeley. “But the new effect we’ve discovered has been overlooked by the thermopower community, and can greatly affect the efficiency of thermopower and other devices.”

What they found is that temperature gradients (differences in temperature) in semiconductors – when one end is hotter than the other – can produce whirlpools of electric current, and at the same time create magnetic fields at right angles to both the plane of the swirling currents and the direction of the heat gradient.
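For comparison with this new vortex effect, conventional thermopower – the effect Wu describes above – boils down to the Seebeck relation: the open-circuit voltage is simply the Seebeck coefficient times the temperature difference. A minimal sketch, using an assumed ballpark coefficient of 200 µV/K for a doped semiconductor (not a figure from the Berkeley Lab study):

# Conventional thermopower (Seebeck effect): V = S * dT.
# S = 200 microvolts per kelvin is an assumed, ballpark value, not from the study.
SEEBECK_V_PER_K = 200e-6

def thermovoltage(delta_t_kelvin: float) -> float:
    """Open-circuit voltage generated across a temperature difference."""
    return SEEBECK_V_PER_K * delta_t_kelvin

print(thermovoltage(50.0))  # -> 0.01 V for a 50 K difference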

Wu says, “There are four well-known effects that relate thermal, electric, and magnetic fields” – for example, the familiar Hall effect, which describes the voltage difference across an electric conductor in a perpendicular magnetic field – “but in all these effects the magnetic field is an input, not an outcome. We asked, ‘Why not use the electric field and the heat gradient as inputs and try to generate a magnetic field?'”

These remarkable results, Wu explains, can also be duplicated by other kinds of inhomogeneous excitation – for example, by the way light falls on a solar cell.

“Different intensities or different wavelengths falling in different areas of a photovoltaic device will produce the same kinds of electronic vortices and could affect solar cell efficiency. Understanding this effect may be a good path to better efficiency in electronics, thermal power, and solar energy as well.”

Here is the published study.

Large Hadron Collider hints at infant Universe

Despite several setbacks and technical difficulties, the Large Hadron Collider is already starting to live up to its nickname, the Big Bang machine. Researchers have pinpointed what may very well be the dense, hot state of matter that is believed to have filled the Universe during its first nanoseconds.

Generally speaking, quarks are bound together in groups of two or three, held together by gluons. However, right after the Big Bang it was so hot that the quarks broke free, and matter became a free-flowing mix of quarks and gluons – a state known as a quark-gluon plasma.

A similar flow has now been observed in snapshots taken by the LHC's detectors.

Full report here.