Author Archives: Dragos Mitrica

About Dragos Mitrica

Dragos has been working in geology for six years, and loving every minute of it. Now, his more recent focus is on paleoclimate and climatic evolution, though in his spare time, he also dedicates a lot of time to chaos theory and complex systems.

What Is the Golden Dental Proportion in Dentistry?

The concept of the golden dental proportion can be traced back to the times of the ancient Greeks. According to their beliefs, true beauty is defined by the constant proportion or ratio of large and small elements.

While the Greeks used this in reference to beauty in nature, it was eventually adopted in other aspects of beauty. Today, the concept remains appealing and has branched out into various fields, including dentistry.

Image credits: Lesly Juarez.

Simply put, the golden dental proportion is based on a mathematical ratio of 1.618:1, the ratio of the larger length to the smaller (or, in dental applications, of the larger tooth to the smaller). Put differently, the smaller tooth should be about 62% of the larger tooth’s size.
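The 62% figure is simply the reciprocal of 1.618. A quick sketch in Python (the incisor width here is a hypothetical value, purely for illustration, not a clinical measurement):

```python
PHI = 1.618  # the golden ratio, larger : smaller

larger_tooth_mm = 8.5                      # hypothetical central incisor width
smaller_tooth_mm = larger_tooth_mm / PHI   # its "golden" companion width

# 1 / 1.618 ≈ 0.618, i.e. the smaller element is about 62% of the larger
print(round(1 / PHI, 3))           # 0.618
print(round(smaller_tooth_mm, 2))  # 5.25
```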

While beauty trends inevitably come and go, the idea of a “golden ratio” has remained surprisingly enduring. At the same time, the concept of the golden proportion has been the center of controversy — and dentistry is no exception in this regard. Its application in the dental sphere has seen significant adjustments, simply because an ideal proportion is difficult to achieve.

But as dental technologies continue to advance, this is no longer an impossible goal. According to Dentaly, the concept of golden proportions should not be viewed solely in terms of tooth width but also in connection with oral health. It should be looked at as a whole, in terms of facial aesthetics (dental alignment included) and improving dental wellness.

What Is the Golden Ratio?

Two quantities are in the golden ratio if their ratio is the same as the ratio of their sum to the larger of the two quantities. That ratio works out to (1 + √5)/2, or approximately 1.618.
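That definition pins the ratio down exactly: if a/b = (a+b)/a, then φ² = φ + 1, whose positive root is (1 + √5)/2. A quick numeric check:

```python
import math

phi = (1 + math.sqrt(5)) / 2  # positive root of x**2 = x + 1

# Two quantities in the golden ratio: the larger is phi, the smaller is 1
a, b = phi, 1.0
print(round(phi, 3))                     # 1.618
print(math.isclose(a / b, (a + b) / a))  # True
```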

The golden ratio of 1.618 has been used extensively in music, nature, architecture, and art. Ancient Greek mathematicians and Leonardo da Vinci are among those who have employed it in their works.

The “golden spiral”, based on the Fibonacci progression.

When it comes to your smile, the golden ratio is applied to the two front teeth. These two teeth are measured in terms of height and width. The ideal shape is for the two front incisors to form a golden rectangle using the phi ratio. 

This formula offers just one possible mathematical approach to determining a desirable size and shape for the teeth, particularly the maxillary teeth. Before applying the golden proportion rule, dentists have to determine the position of the incisal edge, the incisal plane, the gingival plane, and the length of the central incisor.

Dentists also measure width in two ways: viewing width and clinical width. Both are taken into consideration when calculating an accurate dental proportion.

Applying the Golden Proportion in Dentistry

The golden proportion provides a mathematical formula on which to base the ratio of the smaller and larger teeth in relation to each other.

In the constant search for perfection in the dental field, the golden proportion presents itself as the secret to achieving the perfect smile. People nowadays spend thousands of dollars per year to correct the alignment of their teeth or to have teeth replaced. Dentists are now examining the importance (or lack thereof) of the golden proportion when it comes to correcting people’s smiles.

Of course, cosmetic interventions should not be done frivolously, and you should consult with your specialist before any potential intervention.

However, in the quest to achieve that ideal proportion, various cosmetic dentistry procedures are used. One example is dental implants. A dental implant can not only help retain the integrity of your facial and bone structure but also improve overall facial aesthetics. A missing tooth can cause your other teeth and jawbone to shift, and this movement creates a lack of proportion in your facial features.

How Can Cosmetic Dentistry Help?

Of course, there is no replacement for the good old-fashioned toothbrush — but dentists can help in multiple ways.

The quest for a perfect smile never ends. New advancements in cosmetic dentistry technology have only created more discussion around achieving perfectly proportioned teeth. Procedures such as dental implants and braces are a few of the methods available to achieve aligned teeth. A good cosmetic dentist will be able to assess a patient’s natural teeth and determine the ideal procedure for achieving the golden ratio. The best cosmetic dentists are not only well-versed in the latest cosmetic dentistry techniques but also knowledgeable about smile aesthetics and how achieving the best natural proportion can produce a better smile.

Cosmetic dentistry includes a wide range of procedures and treatments aimed at full mouth reconstruction. Even a simple procedure such as teeth whitening can do a lot to enhance your smile. Advanced techniques and treatments can, therefore, restore the proper alignment and harmony of your teeth. Whether you have protruding teeth, missing teeth, or crooked teeth alignment, these can now be fixed with modern dentistry techniques. 

With the advancement of cosmetic dentistry, it is only a matter of time until dentists can reliably achieve the golden dental proportion — putting that perfect smile within your reach!

Facebook and Twitter put your privacy at risk — even when you don’t use them

A new study casts new light on how social networks can gather information about you — even if you don’t have an account.

In a way, social media is like smoking — but instead of being bad for your health, it’s bad for your privacy. There’s another striking similarity between the two: just like second-hand smoke is a thing, affecting those who might not even smoke, social media might also affect the privacy of those around you, even if they’re not users themselves.

The new study from researchers at the University of Vermont and the University of Adelaide gathered more than thirty million public posts on Twitter from 13,905 users.

The first concerning finding is that it takes only 8 or 9 messages from a person’s contacts to predict that person’s later tweets “as accurately as if they were looking directly at that person’s own Twitter feed”. In other words, information about you on social media can also be derived indirectly.

“You alone don’t control your privacy on social media platforms,” says UVM professor Jim Bagrow, one of the authors of the study. “Your friends have a say too.”

“You think you’re giving up your information, but you’re giving up your friends’ information too!” adds Bagrow, who led the research.

UVM professor Jim Bagrow led a new study, published in Nature Human Behaviour, that suggests privacy on social media networks is largely controlled by your friends. Image credits: Joshua Brown.

The study also found that if a person leaves social media (or never joined it in the first place), 95% of this predictive accuracy still stands: the scientists were generally successful at predicting a person’s identity and future activities even without any data from them.

This raises fundamental questions about how privacy can be protected. Intuitively, you would think that if you’re not on a social network, nothing can be known about you.

However, scientists have also shown that there is a fundamental limit to how much predictability can come with this type of data.

“Due to the social flow of information, we estimate that approximately 95% of the potential predictive accuracy attainable for an individual is available within the social ties of that individual only, without requiring the individual’s data,” researchers conclude.

The study has been published in Nature Human Behaviour.

Seal hitches a ride on the back of a whale

Australian photographer Robyn Malcolm has captured a mind-blowing picture: a seal hitching a ride on the back of a humpback whale off the coast of New South Wales, Australia.

“I was surprised to find photos of the cheeky seal in amongst the other shots as I didn’t notice him at the time,” Malcolm told Lucy Cormack over at The Sydney Morning Herald. “I don’t think he stayed there for long.”

Humpbacks are baleen whales: they have no teeth, and use their baleen filter system to sift small fish, plankton, and krill from the water. They can consume enormous quantities of food, up to 1,400 kg every day, which is likely why the seal was around.

Whale expert Geoff Ross from New South Wales National Parks and Wildlife Service said:

“Humpbacks force fish into very tight bait balls, that means everyone can dart through the inside or the middle – anything that makes it easier to catch fish, seals will be involved.”

To my honest surprise, he explained that this behavior is not unheard of.

“The only other time was a seal trying to get away from a killer whale,” said Ross. “The seal hopped on the back of the pectoral fins of a humpback whale.”

Right now, humpback whales are making their way down the Australian coast en masse as they head to Antarctica for the summer, resting and feeding as they go.

Americans believe climate change is happening, but they aren’t aware of the scientific consensus

Despite almost 3 in 4 Americans believing in climate change, 87% of them are not aware of the scientific consensus on man-made climate change.

A recent report carried out by researchers from Yale University assessed how Americans feel about climate change — whether it’s happening or not, who is causing it, and how certain we are of things. They didn’t tell the 1,266 adults in the survey that 97 percent of climate scientists concur that human-caused global warming is happening, and they avoided any piece of information that could sway the subjects one way or another.

Several findings were interesting, some actually encouraging (such as 70% of Americans believing in climate change, or 58% reporting that climate change is man-made), but something really stood out: Americans really aren’t up to date with the science.

This is quite troubling since it leaves quite a door open for debate, when in reality, the time for debating whether or not man-made climate change is happening has long passed. Study after study has pitched in, and nowadays we have over 20,000 peer-reviewed studies on the subject. It’s a situation which can only be classified as a scientific consensus.

“Public misunderstanding of the scientific consensus – which has been found in each of our surveys since 2008 – has significant consequences,” the researchers note in the Yale-GMU report. “Other research has identified public understanding of the scientific consensus as an important ‘gateway belief’ that influences other important beliefs,” the researchers said, including the beliefs that global warming is happening, is human caused, is a serious problem and is solvable.

As we previously reported, it’s not just a lack of science literacy that’s causing this issue, though that’s definitely playing a part too. The US media has launched an all-out attack on climate science, nowadays spearheaded by a White House administration with an unprecedented anti-scientific approach. Ideas launched by President Trump and his administration (‘climate change is a Chinese hoax,’ ‘make coal great again,’ ‘CO2 doesn’t cause warming‘) have fueled a part of the media already at the mercy of the fossil fuel lobby. That facts and simple truths are so easily discarded by both politicians and media outlets is worrying and can cause a growing rift of misinformation.

Here are some other key findings from the Yale report:

  • Americans are also more certain global warming is happening: 46% are “extremely” or “very” sure it is happening, the highest level since the survey began in 2008.
  • 30% of Americans that believe in climate change believe this is happening due to natural causes. It’s a startlingly high number, but it’s the smallest figure since 2008.
  • Some 57% of Americans are “worried” about climate change; 17% are “very worried.”
  • 59% of Americans believe climate change is affecting the weather in the US.
  • 35% of Americans believe Americans are suffering due to climate change right now.
  • Most Americans (71%) believe the threat of climate change is distant, affecting future generations. Perhaps this is the main reason the US is so slow to take action against the phenomenon: they just don’t see it as an urgent threat. Again, it’s an instance where the population has not caught up to the science. Climate change is affecting people all over the world right now, including in the US, and 40% of Americans say they have experienced its effects first hand.
  • Four in ten Americans (39%) think the odds that global warming will cause humans to become extinct are 50% or higher.
  • Most Americans (78%) believe children should be taught the science of climate change, as well as potential solutions, which is highly encouraging.

So to sum it up, Americans largely understand that climate change is happening, though some still believe it’s not a man-made phenomenon. They think it’s a distant problem, which will affect future generations, and even has the potential to wipe out humanity. Yet it’s a problem many have experienced themselves. It’s a complex, sometimes contradictory image, with a lot of both positives and negatives, and definitely a lot of voids that need to be filled — by science.

Scientists find water clouds and exotic, primitive atmosphere on a “warm Neptune”

It’s a type of planet which we didn’t even know existed — a Neptune-sized planet closer to a star than its namesake.

Artistic representation of the newly discovered “warm Neptune.” Image credits: NASA/GSFC.

A pioneering new study has revealed what astronomers believe to be evidence of water vapor and exotic clouds around a planet located some 437 light-years from Earth. The planet, called HAT-P-26b, is a so-called warm Neptune — a Neptune-sized gas giant orbiting very close to its star, which makes it much hotter. The new study also showed that the planet has an atmosphere composed almost entirely of hydrogen and helium, something which was even more unexpected.

“This exciting new discovery shows that there is a lot more diversity in the atmospheres of these exoplanets than we have previously thought,” David Sing, an astrophysics professor at the University of Exeter in England, said in a statement.

The chemical composition indicates a primitive atmosphere. Researchers believe that, compared to the gas giants in our own solar neighborhood, this planet either developed later in the history of its solar system or formed closer to its star — or both. It’s a quirk, but it’s a quirk which can be very useful: it breaks the pattern we’ve been seeing in other, similar planets, and this allows astronomers to look at such planets in a new light and better understand the formation and evolution of different solar systems.

“Astronomers have just begun to investigate the atmospheres of these distant Neptune-mass planets, and almost right away, we found an example that goes against the trend in our Solar System,” says one of the researchers, Hannah Wakeford from NASA’s Goddard Space Flight Center. “This kind of unexpected result is why I really love exploring the atmospheres of alien planets.”

The study itself didn’t employ any new technique. Basically, when the planet passes between its star and the Earth, a fraction of the light emitted by its star is captured and filtered by the planet’s atmosphere — but only at some wavelengths. By analyzing the wavelengths of the light which manages to reach us, we can infer the composition of the atmosphere. However, the study was innovative in that it applied the technique to a much smaller planet than previous efforts. This was made possible by the planet’s unusual orbit — the fact that it’s so close to its star. HAT-P-26b completes a full orbit around its star in just 4.23 days, which makes such observations much easier.
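The filtering technique described above can be sketched as a toy transit-depth calculation. The radii below are illustrative placeholders, not HAT-P-26b’s measured parameters:

```python
# During a transit, the planet blocks a fraction (Rp/Rs)**2 of the starlight.
# At wavelengths the atmosphere absorbs (e.g. in a water band), the planet's
# effective radius is slightly larger, so the dip in brightness is deeper.
# All values here are illustrative, not measurements of HAT-P-26b.

R_STAR = 1.0           # stellar radius (arbitrary units)
R_PLANET = 0.060       # effective planet radius outside absorption bands
R_PLANET_BAND = 0.062  # effective radius inside an absorption band

def transit_depth(r_planet, r_star):
    """Fraction of starlight blocked during transit: (Rp/Rs)**2."""
    return (r_planet / r_star) ** 2

extra_ppm = (transit_depth(R_PLANET_BAND, R_STAR)
             - transit_depth(R_PLANET, R_STAR)) * 1e6
print(f"extra dip in the band: {extra_ppm:.0f} ppm")
```

Comparing how much deeper the transit looks at each wavelength is what lets astronomers reconstruct the atmosphere’s composition.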

“This ‘warm Neptune’ is a much smaller planet than those we have been able to characterize in depth, so this new discovery about its atmosphere feels like a big breakthrough in our pursuit to learn more about how solar systems are formed, and how it compares to our own,” added Sing, the co-leader of a new study about HAT-P-26b that was published online today (May 11) in the journal Science.

So what would the sky of this alien planet look like? If you were looking through the water clouds, you’d likely see a washed-out, gray sky. While there is some water vapor, the clouds themselves are much more exotic, likely made of disodium sulfide. These clouds scatter all colors, which is why you’d likely end up with a grayscale sky.

In recent years, telescopes and telescope arrays such as NASA’s Kepler have revealed several intriguing planets, greatly expanding our understanding of alien worlds and their solar systems. The variety we are seeing is staggering and sometimes unexpected, but studies like this go a long way towards helping astronomers understand it.

Journal Reference: H.R. Wakeford et al., “HAT-P-26b: A Neptune-mass exoplanet with a well-constrained heavy element abundance,” Science (2017).

This is why space armor is becoming more important

The European Space Agency recently shared this image showing the havoc a tiny, 10-cm object can wreak on even the strongest space armor we have.

ESA space debris studies, an impact sample. This is the kind of damage even a small projectile can cause. Image credits: ESA.

There is growing concern regarding the sheer number of random objects in outer space, be they natural or man-made. Needless to say, all these objects pose a great risk to spacecraft because they travel at extremely high velocities: impact speeds can reach 15 km/s for space debris and 72 km/s for meteoroids. At such speeds, an object just 10 cm across can inflict catastrophic damage and potentially cause the disintegration of the target. To give you an idea, bullets rarely travel faster than 400 meters per second, so debris moves about 37 times faster than a bullet.
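Because kinetic energy grows with the square of speed, that velocity gap understates the danger. A back-of-the-envelope comparison (the 1-gram mass is illustrative):

```python
def kinetic_energy_j(mass_kg, speed_m_s):
    """Kinetic energy in joules: E = 0.5 * m * v**2."""
    return 0.5 * mass_kg * speed_m_s ** 2

BULLET_SPEED = 400.0     # m/s, as quoted above
DEBRIS_SPEED = 15_000.0  # m/s, upper end for orbital debris

print(DEBRIS_SPEED / BULLET_SPEED)  # 37.5, i.e. "about 37 times faster"

# Same 1-gram mass, vastly different energies: the ratio is (37.5)**2
ratio = kinetic_energy_j(0.001, DEBRIS_SPEED) / kinetic_energy_j(0.001, BULLET_SPEED)
print(ratio)  # 1406.25
```

So a fleck of debris carries roughly 1,400 times the energy of an equal-mass fragment at bullet speed, which is why even paint flakes leave craters.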

Even extremely small objects can have a major impact. Recently, the ISS’ Cupola — the dreamy vantage point which astronauts use to take amazing pictures — was chipped by a paint flake or small metal fragment no bigger than a few thousandths of a millimetre across. The problem is not only the impact itself but also that the speed of these rogue objects causes additional shockwaves which further the damage. The ESA explains:

“Beyond 4 km/s (depending on the materials), an impact will lead to a complete break­up and melting of the projectile, and an ejection of crater material to a depth of typically 2–5 times the diameter of the projectile. In hypervelocity impacts, the projectile velocity exceeds the speed of sound within the target material. The resulting shockwave that propagates across the material is reflected by the surfaces of the target, and reverses its direction of travel. The superimposition of progressing and reflected waves can lead to local stress levels that exceed the material’s strength, thus causing cracks and/or the separation of spalls at significant velocities.”

This was caused by “possibly a paint flake or small metal fragment no bigger than a few thousandths of a millimetre across,” writes the ESA.

It’s counterintuitive, but big objects aren’t really as problematic as small objects. Larger objects can be tracked and studied, and perhaps avoided — or at the very least, prepared for. But smaller objects are virtually untrackable and can strike out of nowhere. According to NASA, there are millions of pieces of debris or ‘space junk’ orbiting Earth. The ESA recently shared its latest figures, according to which there are around 5,000 objects larger than 1 meter in orbit, 20,000 larger than 10 cm, and 750,000 larger than 1 cm. All of these pose a risk to spacecraft, which is why researchers are trying to develop better and safer armor. Notably, the ESA is working on Whipple shields with aluminium and Nextel–Kevlar bumper layers.

Whipple shields are quite clever in their approach. They consist of a relatively thin outer bumper spaced some distance from the main spacecraft wall. The bumper is not expected to stop the particle or even absorb most of its energy, but rather to break it up and disperse its energy, dividing the original particle into many fragments spread across a greater surface. Intermediate fabric layers further slow this cloud of fragments, so the original particle’s energy is spread more thinly over a larger wall area, which is more likely to withstand it. Whipple shields have now reached a stage of maturity, and they’ll likely be incorporated into the next generation of spacecraft — potentially even SpaceX vehicles.

A 7.5 mm-diameter aluminium bullet was shot at 7 km/s towards the same ‘stuffed Whipple shield’ design used to protect the ATV and the other International Space Station manned modules. Image credits: ESA.

Future research will try to further our understanding of such impacts, because the risks get higher every day. If we want to start exploring Mars or other areas of the solar system, or even just secure Earth’s orbit for future spacecraft, armor is key. With every piece of spacecraft and satellite we launch, the risks get higher.


Nanoribbons pave the way for switching graphene ‘on-off’

Among its many stellar properties, graphene is an amazing electrical conductor. However, if graphene is to reach its full potential in the field of electronics, it needs to be coaxed to turn current on or off like silicon transistors. Physicists at the Department of Energy’s Oak Ridge National Laboratory (ORNL) present a recent breakthrough that may enable graphene to act like a semiconductor. The trick is to grow graphene in curled nanoribbons rather than in flat 2-D sheets.

This graphene nanoribbon is only seven carbon atoms in width. Credit: Chuanxu Ma and An-Ping Li


When arranged in wide sheets, hexagon-linked graphene doesn’t have a band gap, which means you can’t use it in modern electronics like computer chips or solar panels. It’s a great electrical wire but useless as a transistor. That only applies to its traditional configuration, though: graphene can work as a semiconductor in other arrangements. Doping graphene with various impurities (such as the DNA and copper ions another team demonstrated previously) can enable the material to switch on and off.

The team from ORNL, however, made semiconductive graphene with no additional material by fashioning it into ribbons, because when graphene becomes very narrow, it develops an energy gap. The narrower the ribbon, the wider the energy gap, and the ribbons made at ORNL are definitely narrow: a single nanoribbon can be just one nanometer wide or less.
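The width–gap relationship can be sketched with a toy inverse model; the prefactor below is an assumed illustrative constant, not a fitted value from the ORNL work:

```python
ALPHA_EV_NM = 1.0  # eV·nm, assumed prefactor for illustration only

def band_gap_ev(width_nm, alpha=ALPHA_EV_NM):
    """Toy model: for narrow armchair ribbons, E_gap scales roughly as alpha / width."""
    return alpha / width_nm

# Narrower ribbon -> wider energy gap
for w in (0.5, 1.0, 2.0, 5.0):
    print(f"width {w:.1f} nm -> gap ~{band_gap_ev(w):.2f} eV")
```

The point of the sketch is just the trend: halving the width doubles the estimated gap, which is why a ribbon one nanometer wide or less behaves so differently from a wide sheet.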

Besides narrowness, another important factor is the shape of the edge. When graphene’s hexagon is cut along the side, its shape resembles an armchair, and this edge shape enables the material to act like a semiconductor.

Previously, scientists made graphene nanoribbons by growing them on a metal substrate. This was necessary but undesirable because the metal hinders some of the ribbons’ useful electrical properties.

The scanning tunneling microscope injects charge carriers called “holes” into a polymer precursor. Credit: Oak Ridge National Laboratory, U.S. Dept. of Energy


The ORNL team took a different route, getting rid of the metal substrate altogether. To trigger the chemical reactions that control the width and edge structure from polymer precursors, they used the tip of a scanning tunneling microscope to inject positive charge carriers called ‘holes’. The reaction could be triggered at any point of the polymer chain by moving the tip to the right spot. This method yielded ribbons only seven carbon atoms wide, with edges neatly wrapped in the armchair configuration.

“We figured out the fundamental mechanism, that is, how charge injection can lower the reaction barrier to promote this chemical reaction,” said An-Ping Li, a physicist at the Department of Energy’s Oak Ridge National Laboratory.

Moving forward, the researchers plan to make heterojunctions with different precursor molecules. One exciting possibility is a new electronic device with graphene semiconductors in which current could be carried with virtually no resistance even at room temperature — a long-standing dream in solid-state physics.

“It’s a way to tailor physical properties for energy applications,” Li said. “This is an excellent example of direct writing. You can direct the transformation process at the molecular or atomic level.”

Scientific reference: Chuanxu Ma et al, Controllable conversion of quasi-freestanding polymer chains to graphene nanoribbons, Nature Communications (2017). DOI: 10.1038/ncomms14815.

NASA releases breathtaking gallery of Jupiter photos

Aside from delivering a trove of valuable information about Jupiter and its nearby environment, the Juno probe has also sent back a number of spectacular photos. After entering Jupiter’s orbit, getting closer than ever to the gas giant, and gaining an unprecedented view of Jupiter’s clouds, Juno has quite the stories to share — it’s Jupiter like you’ve never seen it before.

Contrast and color changes, both major and subtle, bring out details; the longest-wavelength color channel was also removed to improve sharpness. All image credits: NASA.

The $1.1 billion Juno spacecraft was launched from Cape Canaveral Air Force Station on August 5, 2011, and entered Jupiter’s orbit in 2016. Its major objectives are to understand the origin and evolution of Jupiter, look for a solid planetary core, map its magnetic field, measure water and ammonia in its deep atmosphere, and observe its auroras.

Jupiter’s north pole. Just look at this majestic planet! Image Credits: NASA / JPL-Caltech / SwRI / MSSS / Roman Tkachenko

Juno is currently engaged in repeated swings around Jupiter, in a wide arc — to minimize exposure to the planet’s intense radiation belts, which can damage sensitive electronics. NASA planned to fire Juno’s thrusters in October to increase the frequency of these flybys but had to cancel plans due to a malfunction of the engine valves. But that doesn’t prevent Juno from carrying on its mission.

This mosaic was built by merging the last three flybys over the southern hemisphere. HDR toning was also applied in Photoshop to enhance the color contrast. Image credits: Gervasio Robles / NASA.

Here, we picked just some of our favorites (sometimes enhanced by photo editing software, check the description). Head on to Juno’s page to check out the full gallery.


An image created by processing the PJ-4 image 106 (“Oval BA”) raw framelets. This is a perspective view that shows Jupiter from Juno’s vantage point when the original image data was obtained. The effects of global illumination have been removed and the contrast, color and sharpness exaggerated. In this view one of the “string of pearl” ovals is visible – the oval called A1. Image credits: Bjorn_Jonsson / NASA/JPL-Caltech/SwRI/MSSS/Björn Jónsson.

This shows the 3 images covering the southern hemisphere, plus context images. The positions of the known circulations and jets are indicated. Image credits: Philosophia-47 / NASA / SwRI / MSSS / John Rogers

Astronomers create the most accurate map of dark matter in the Universe

Yale researchers have created one of the most accurate maps of dark matter, helping us uncover the secrets of the universe.

This is a 3-D visualization of reconstructed dark matter clump distributions in a distant galaxy cluster, obtained from the Hubble Space Telescope Frontier Fields data. The unseen matter in this map consists of a smooth heap of dark matter on which clumps form. We don’t know why the clumps form or what they mean. Credit: Yale University

Whenever we look at the Universe, we’re mostly blind. According to our current understanding, the universe is roughly 27 percent dark matter, 68 percent dark energy, and about 5 percent ordinary matter. We’ve seen quite a bit of that 5 percent of ordinary matter… but the other 95% remains hidden from our sight.

Take dark matter, for instance. It’s an unidentified type of matter which does not emit or interact with electromagnetic radiation (such as light), so we can’t see it directly. We know it exists because we see its gravitational effects — including something called gravitational lensing.

A gravitational lens is observed when a distribution of matter (such as a cluster of galaxies) lies between an observer (the Earth) and a distant light source. As light travels from the source to the observer, it bends when it passes close to the cluster of galaxies. By studying the shapes and orientations of large numbers of distant galaxies, astronomers can average those orientations to measure the shear of the lensing field in any region. This, in turn, can be used to reconstruct the mass distribution in the area: in particular, the background distribution of dark matter.

“With the data of these three lensing clusters we have successfully mapped the granularity of dark matter within the clusters in exquisite detail,” said Yale astrophysicist Priyamvada Natarajan. “We have mapped all of the clumps of dark matter that the data permit us to detect, and have produced the most detailed topological map of the dark matter landscape to date.”

So basically, the researchers see the lensing and the regular mass around it; the rest of the lensing, the part that can’t be explained by regular matter, indicates the presence of dark matter. Do this several times, in several directions, and you end up with a map of dark matter.

“While we now have a precise cosmic inventory for the amount of dark matter and how it is distributed in the universe, the particle itself remains elusive,” Natarajan said.

What’s remarkable is that this map closely resembles previous computer simulations, indicating that astronomers have a pretty good idea of what’s going on. The fact that with only indirect evidence, astronomers have predicted dark matter with such accuracy is impressive — and it’s also a testament to how much the field has advanced in recent years.

Journal Reference: Priyamvada Natarajan, Urmila Chadayammuri, Mathilde Jauzac, Johan Richard, Jean-Paul Kneib, Harald Ebeling, Fangzhou Jiang, Frank van den Bosch, Marceau Limousin, Eric Jullo, Hakim Atek, Annalisa Pillepich, Cristina Popa, Federico Marinacci, Lars Hernquist, Massimo Meneghetti, Mark Vogelsberger. Mapping substructure in the HST Frontier Fields cluster lenses and in cosmological simulations. Monthly Notices of the Royal Astronomical Society, 2017; DOI: 10.1093/mnras/stw3385


In 2015, the record temperature in Antarctica was 17.5°C (63.5°F). Yes, you read that right

The temperature was recorded in 2015, but it was just now released.

For a long time, it seemed that Antarctica was immune to global warming. How the times have changed! Image credits: NASA.

Is this the Antarctic or the Mediterranean?

Antarctica has been called “the last place on Earth.” People have made countless expeditions to explore the rugged terrain and ungodly temperatures. Some of them have not returned, falling to the might of the seemingly endless ice. But some days… you could enjoy the Antarctic Sun in a T-shirt, with temperatures more fit for Spain or southern Italy.

The reason this is extremely surprising (if it needs any explaining) is that polar regions are cold — really cold. The South Pole’s mean temperature is -76°F (-60°C) in winter and -18°F (-28.2°C) in summer, according to data from the Woods Hole Oceanographic Institution. Antarctica as a whole is usually not as cold as the South Pole itself, but on most days, temperatures range between -10°C and -40°C. Sometimes, though, things just go a bit crazy — especially when climate change kicks in.
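
The Celsius/Fahrenheit figures above follow from the standard conversion formula — a quick sanity check, nothing more:

```python
# Sanity-check the Celsius/Fahrenheit figures quoted in this article.
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

print(c_to_f(17.5))   # 63.5: the 2015 Antarctic record
print(c_to_f(-60.0))  # -76.0: South Pole winter mean
print(c_to_f(-28.2))  # about -18.8: South Pole summer mean
```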

Randy Cerveny, an Arizona State University professor of geographical science and urban planning who works with the World Meteorological Organization (WMO), said in a press release:

“The temperatures we announced today are the absolute limit to what we have measured in Antarctica […] The polar regions of our planet have been termed the ‘canary’ in our global environment […] because of their sensitivity to climate changes, sometimes the first influences of changes in our global environment can be seen in the north and south polar regions.”

Antarctic warming

Image credits: NASA / John Sonntag.

It’s not like this took us completely by surprise; climatologists have known that global warming greatly affected the Antarctic, but the sheer magnitude is stunning. Putting it all into perspective, much of the Antarctic is expected to melt by the end of the century — even under the most optimistic scenarios. It’s generally accepted that it’s no longer a question of whether the West Antarctic Ice Sheet will melt, it’s a question of when. We already see a massive crack (above), over 100 kilometers long and 100 meters wide (60 miles x 300 feet) with much more expected to come in the not-so-distant future. But seriously, this is pretty much the opposite of hell freezing over.

You might be wondering why this only emerged now, two years after it happened. WMO’s Archive of Weather and Climate Extremes is tasked with documenting changes at the edges of the continent, but that’s no easy feat. The thing is, we don’t really know how common or uncommon such events are in the Antarctic. Because the land is so inaccessible and funding is still scarce in many instances, we have few data points (read: scientific bases) scattered around Antarctica.

“The Antarctic and the Arctic are poorly covered in terms of weather observations and forecasts, even though both play an important role in driving climate and ocean patterns and in sea level rise,” said Michael Sparrow, a polar expert with the World Climate Research Programme.

“Verification of maximum and minimum temperatures help us to build up a picture of the weather and climate in one of Earth’s final frontiers.”

Due to this, satellite-based observations such as NASA’s Earth Sciences program are vital and should be maintained. Hopefully, NASA will be allowed to continue doing its job.


European Space Agency makes all its pictures and videos free to share and use

The doors to the European Space Agency (ESA) have been thrown wide open: everything the agency has ever released is now open access.

An iconic image recently released by the ESA. Mars as seen by Rosetta’s OSIRIS camera (2007). Credit: ESA & MPS for OSIRIS Team MPS/UPD/LAM/IAA/RSSD/INTA/UPM/DASP/IDA, 2007, CC BY-SA 3.0 IGO

We knew something was up because, for the past few weeks, ESA has been uploading more and more of its archive to the open access site. Now, the trove of images and videos has been placed under the Creative Commons Attribution-ShareAlike 3.0 IGO (Intergovernmental Organisation) licence. This means you can use, share, and adapt whatever you want, for all purposes — even commercially — as long as you credit ESA as the author.

“This evolution in opening access to ESA’s images, information and knowledge is an important element of our goal to inform, innovate, interact and inspire in the Space 4.0 landscape,” said Jan Woerner, ESA Director General. “It logically follows the free and open data policies we have already established and accounts for the increasing interest of the general public, giving more insight to the taxpayers in the member states who fund the Agency,” he added.

But this isn’t just about images and videos: the recent open-access releases also include data that can be used by scientists, professionals, and even students. Among many other things, you can now freely access:

  • Images and data from Earth observation missions (Envisat, Earth Explorer, European Remote Sensing missions, Copernicus; example here).
  • ESA/Hubble images and videos
  • The entire ESA Planetary Science Archive Data (PSA). The PSA is the central European repository for all scientific and engineering data returned by ESA’s planetary missions. You can see pretty much everything ESA has ever done: ExoMars 2016, Giotto, Huygens, Mars Express, Rosetta, SMART-1, Venus Express, and many more.
  • Sounds from Space: ESA’s official SoundCloud channel hosts a multitude of sounds and so-called sonifications from Space, including the famous ‘singing comet’, a track that has been reused and remixed thousands of times by composers and music makers worldwide.
  • 3D Models of a comet.

It’s a great step the ESA has taken, one which follows NASA’s similar decision. Yes, everything that NASA has is also open access — and this truly is tremendous. Now more than ever, we need access to information and data, and now more than ever NASA and the ESA have more data on their hands than they can analyze. By making it available to everyone, they are not only helping researchers, students, and the media, they are also helping advance science. The shift toward open-access data is something we applaud — and something that can lead to important discoveries. The trove of data is now open for everyone to access. Congrats, ESA!

NASA satellite spots mile-long iceberg breaking off from Antarctic glacier

NASA’s satellites witnessed the dramatic breaking of the iceberg. The icy surface first cracked and then, a mile-long chunk of ice ripped apart.

Pine Island Glacier shedding a block of ice the size of Manhattan in January CREDIT: MODIS/NASA

The immense Pine Island Glacier is known for its instability, but we’ve rarely witnessed something happening at this scale. Calving is not uncommon here, and the glacier accounts for 20 percent of the ice sheet’s total ice flow to the ocean, according to NASA scientists. Especially in recent years, the ice has been constantly retreating, giving way to liquid water.

The Earth-watching Landsat satellite (you know, the type some US politicians want to retire because ‘NASA shouldn’t be watching the Earth‘) captured the event unfolding in all its glory.

The glacier’s last major iceberg break took place back in July 2015, when an iceberg measuring almost 225 square miles separated. Though this event is much smaller than the one from 2015, it’s yet more proof of the glacier’s instability in the face of climate change.

“I think this event is the calving equivalent of an ‘aftershock’ following the much bigger event,” Ian Howat, a glaciologist at The Ohio State University, said in a statement. “Apparently, there are weaknesses in the ice shelf — just inland of the rift that caused the 2015 calving — that are resulting in these smaller breaks.”

More calving is expected in the near future, and even more calving might be taking place without us knowing it. The West Antarctic Ice Sheet might completely collapse in the next 100 years. In the meantime, we’ll see more and more chunks breaking apart and melting.

Yet again, this shows just how vital Earth monitoring is. If we want to truly understand and tackle processes taking place at such scales, the importance of Landsat-like missions cannot be overstated.

The science of uptime: How big websites manage huge traffic loads 24×7

We live in a very connected and mobile world. Technology and media have become such integral parts of our lives that we spend 10 hours a day looking at screens. A fifth of that time is spent on social media, and YouTube viewers spend an average of 40 minutes per session watching videos. All of this happens around the clock and around the globe. So have you ever wondered how these websites manage to be available anytime, anywhere, on demand?

Image via Pixabay.

Websites like YouTube, Facebook, and other popular names need to run 24x7 and keep up with demand, remaining accessible to billions of users worldwide. The time a website spends up and running is called “uptime.” Maintaining uptime is a challenge, especially for high-traffic websites: it requires significant computing power to process all the requests and data coming in from across the globe.

Websites are hosted on powerful computers called servers. Many small websites can be hosted on a single machine — even your home PC or laptop can function as a server. Top websites, however, need far more computing power than that. Because of their high traffic volumes, they pool together many computers and network devices to share the computing work. These clusters of servers are usually found in data centers — entire rooms or even buildings designed to accommodate server clusters, such as those run by Akeno Hosting.

Here are some of the things that help huge websites keep their uptime.

Redundancy and failover

Among the common reasons for website downtime, server problems deserve a special mention. They can be caused by faulty hardware. Servers are typically built to handle more stress than regular PCs, but like any electronic device, they can break. Imagine if the server’s hard drive fails. If there are no backups available, the data may be corrupted and lost.

To keep all data safe, data centers rely on redundancy. In the case of Google Docs, every document you create is automatically synchronized, live, to multiple data centers located far away from each other. This way, the data is backed up to at least two locations at once, so if one fails, the data is still safe.

To make sure the service doesn’t break when something fails, failover measures automatically switch serving duties to a redundant or backup server.
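
A failover switch can be sketched in a few lines of Python. The server names and failure behavior below are made up for illustration — each “server” is just a function, where a real system would use health checks and network calls:

```python
# Minimal failover sketch: try the primary server; on failure, move to a backup.
def serve(request, servers):
    for server in servers:              # primary first, then backups in order
        try:
            return server(request)
        except ConnectionError:
            continue                    # this replica is down: fail over
    raise RuntimeError("all replicas are down")

def primary(request):
    raise ConnectionError("primary offline")   # simulate a hardware failure

def backup(request):
    return f"served {request} from backup"

print(serve("GET /doc", [primary, backup]))    # the user never notices
```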

Load balancing

Servers can also be overloaded by sheer computational load. Say a post or a video goes viral — it isn’t uncommon for such content to be viewed and shared hundreds of thousands of times a day. That surge can cause a server to get overloaded and crash. To prevent this from happening, data centers employ load balancing, distributing the load across multiple servers so that no single server takes the brunt of it.
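
Round-robin is the simplest load-distribution method; real balancers also weigh server health and current load. A minimal sketch, with hypothetical server names:

```python
from itertools import cycle

# A bare-bones round-robin load balancer: requests are handed out in rotation.
class RoundRobinBalancer:
    def __init__(self, servers):
        self._rotation = cycle(servers)

    def route(self, request):
        """Send each incoming request to the next server in the rotation."""
        return f"{request} -> {next(self._rotation)}"

lb = RoundRobinBalancer(["server-1", "server-2", "server-3"])
for request in ["req-A", "req-B", "req-C", "req-D"]:
    print(lb.route(request))
# req-D wraps back around to server-1, so no single machine takes every hit
```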

Data centers typically use specialized hardware to do this, but thanks to cloud computing technologies, it can even be done in the cloud rather than in the data center. Cloud-based load balancing services like Incapsula employ a variety of load distribution methods and algorithms to optimize the use of servers across data centers worldwide. One advantage of cloud-based load balancing is that it can be deployed across servers in different data centers or even different parts of the world, unlike appliance-based load balancing, which only works within a single data center.

Power and cooling

Servers also need a consistent power supply to stay up. Data centers can’t risk power interruptions, so they usually have uninterruptible power supplies (UPS) with high-capacity, high-drain batteries, plus backup generators. If power to the data center goes out, the backup power kicks in.

Computers can also consume a lot of energy and produce a lot of heat. Just observe your computer or laptop’s fans hum and expel hot air when you play graphics-intensive games or perform processing-heavy work like video editing or photo editing.

To prevent servers from overheating (which could cause them to crash), data centers feature clever climate control systems that regulate both temperature and humidity. Some of the more progressive tech companies are going further with green technologies: powering their data centers with renewables like solar and wind, and even channeling seawater to cool their facilities.

Response to natural disasters

Data centers also need to take into consideration their geographic location. There are areas with a higher risk of natural disasters that can disrupt operations.

For example, many tech companies are based in earthquake-prone California. Data centers cope with this through clever engineering. Structures are reinforced, server racks are restrained to prevent them from toppling over, and even the servers are tested to withstand vibrations and tremors.

Some of the bigger tech firms even have data centers located all over the world to serve as redundancy and backups to their stateside data centers. Many advanced data centers also have fire control and flood control to keep equipment safe in case of these disasters.

Monitoring and protection

Data centers employ engineers to monitor the status of servers and the data center and respond if anything comes up. They are responsible for making sure the infrastructure is running in optimal conditions. They also function as support personnel and coordinate with other service providers and data centers so that the whole system works reliably.

However, despite all this, you may still have experienced downtime even on the top websites. For example, Twitter and Spotify went down in late 2016 due to a cyberattack on their DNS provider, Dyn.

Today, among the biggest security threats to data centers are distributed denial-of-service (DDoS) attacks where attackers try to overwhelm a network with traffic so that the data centers overload. Most security services are now bundled together with cloud-based load balancing, firewalls, and DDoS protection.

Guaranteeing uptime

It takes a lot to guarantee uptime. If you’ve ever shopped for web hosting services, you’ve probably noticed that everyone claims robust infrastructure, but no one claims 100 percent uptime. The best most providers will offer is 99.999 percent. Converted to the hours in a year, that works out to roughly five minutes of allowed downtime annually — a provision for maintenance and other unexpected occurrences. Overall, given all the threats to uptime, it is quite amazing that the top sites can actually provide near-100-percent uptime.
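
The arithmetic behind those uptime guarantees is easy to check:

```python
# How much downtime does an uptime guarantee actually allow per year?
def downtime_minutes_per_year(uptime_percent):
    minutes_in_a_year = 365 * 24 * 60   # 525,600 minutes
    return minutes_in_a_year * (1 - uptime_percent / 100)

for uptime in (99.9, 99.99, 99.999):
    print(f"{uptime}% uptime -> {downtime_minutes_per_year(uptime):.1f} min/year")
# "Five nines" (99.999%) allows only about 5.3 minutes of downtime per year
```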

You can search for your own exoplanets — Researchers make huge dataset available

The search for extraterrestrial life is about to get some new recruits — us.

“There seems to be no shortage of exoplanets,” says Jennifer Burt, a Torres postdoctoral fellow in MIT’s Kavli Institute for Astrophysics and Space Research. “There are a ton of them out there, and a ton of science to be done.” Credit: Ricardo Ramirez

Keck and us

The W. M. Keck Observatory is a two-telescope astronomical observatory located in Hawaii, US. These are two of the largest and most advanced telescopes currently available, and they’re also some of the best-known instruments helping astronomers hunt for alien planets. The only “problem” astronomers have is that there’s just too much data to analyze — though one could argue that’s hardly a problem. With this in mind, researchers from the Carnegie Institution for Science and MIT have released a massive dataset, gathered over two decades at the W. M. Keck Observatory. The data comes with an open-source software package to process it and an online tutorial, so you can start hunting for planets right away.

The data features observations taken by the High Resolution Echelle Spectrometer (HIRES). HIRES splits incoming light into its constituent wavelengths, which are then analyzed to determine the characteristics of the starlight. The technique is called the radial velocity method. The radial velocity of an object with respect to a given point is the rate of change of the distance between the object and that point. The radial velocity of a star (and of other luminous distant objects as well) can be measured accurately by taking a high-resolution spectrum and comparing the measured wavelengths of known spectral lines to wavelengths from laboratory measurements. This is what the dataset is all about, and the scientists hope to draw new eyes to these observations, which include 61,000 measurements of more than 1,600 nearby stars.
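
As an illustration of the radial velocity method — the wavelength shift below is invented for the example, not taken from the HIRES data:

```python
# Radial velocity from the Doppler shift of a spectral line (non-relativistic
# approximation): light from a receding star is stretched to longer wavelengths.
C = 299_792_458  # speed of light, m/s

def radial_velocity(observed_nm, rest_nm):
    """v = c * (lambda_observed - lambda_rest) / lambda_rest"""
    return C * (observed_nm - rest_nm) / rest_nm

# A shift of just 0.0001 nm on the 656.281 nm hydrogen-alpha line:
v = radial_velocity(656.2811, 656.2810)
print(f"{v:.1f} m/s")  # tens of m/s: the scale of wobble an orbiting planet induces
```

Detecting shifts this small is exactly why a high-resolution spectrometer like HIRES, compared against laboratory reference wavelengths, is needed.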

“This is an amazing catalog, and we realized there just aren’t enough of us on the team to be doing as much science as could come out of this dataset,” says Jennifer Burt, a Torres Postdoctoral Fellow in MIT’s Kavli Institute for Astrophysics and Space Research. “We’re trying to shift toward a more community-oriented idea of how we should do science, so that others can access the data and see something interesting.”

Find your own planet

Although it wasn’t initially designed to look for exoplanets, MIT Torres postdoctoral researcher and team member Jennifer Burt said HIRES was pivotal in the discovery of new stars and has also proven effective at planet hunting.

“HIRES was designed back in the late ‘80s and early ‘90s to go and look at these faint, fuzzy galaxies,” Burt told Gizmodo. “The professor who designed it—Steven Vogt, who’s on the planet hunting team—was one of my advisors in grad school. Steve went through when designing HIRES and included the machinery that you would need to turn it into an exoplanet machine.”

Over the past two decades, astronomers have trained HIRES on more than 1,600 “neighborhood” stars, all within a relatively close 100 parsecs (about 326 light years) of Earth. So there’s no shortage of stars and, almost certainly, no shortage of planets to be found. The sheer number of exoplanets shocked astronomers at first, but we’re now starting to understand just how many of them there are.

“We’ve gone from the early days of thinking maybe there are five or 10 other planets out there, to realizing almost every star next to us might have a planet,” Burt says.

So can we realistically find new planets? The answer is definitely ‘yes.’ For the aspiring astronomer, or for the casual scientist who wants to try something new, the odds are looking good. If you’re starting from scratch, the beginning will be a bit challenging, but if you persevere, you’re certainly in for some space goodies. The best part? The data set will continue to grow, so once you get accustomed to it, you’ll have more and more stuff to play with.

“This dataset will slowly grow, and you’ll be able to go on and search for whatever star you’re interested in and download all the data we’ve ever taken on it. The dataset includes the date, the velocity we measured, the error on that velocity, and measurements of the star’s activity during that observation,” Burt says. “Nowadays, with access to public analysis software like Systemic, it’s easy to load the data in and start playing with it.”


Astronomers discover the first “middleweight” black hole

Generally speaking, black holes fall into two categories: small, with a mass comparable to that of the Sun, or supermassive, weighing millions or even billions of Suns. Researchers have postulated that middleweight black holes should also exist but were unable to actually find one — until now.

An artist rendition of the newly discovered middleweight black hole. New research suggests that a 2,200 solar-mass black hole resides at the center of the globular cluster 47 Tucanae.
Credit: CfA / M. Weiss

When most black holes are discovered, astronomers see X-rays coming from a hot disk of material swirling around them — but this only works if the black hole is actively feeding on nearby gas. Supermassive black holes can also be identified by the gravitational effect they have on nearby stars, but that too works only in a limited number of cases. There’s a good chance that many black holes meet neither of these criteria, and a swarm of them may be lying undiscovered in our galaxy. Harvard astronomers have been on the lookout for such intermediate-sized black holes.

“We want to find intermediate-mass black holes because they are the missing link between stellar-mass and supermassive black holes. They may be the primordial seeds that grew into the monsters we see in the centers of galaxies today,” says lead author Bulent Kiziltan of the Harvard-Smithsonian Center for Astrophysics (CfA).

They focused on a globular cluster called 47 Tucanae, located in the constellation Tucana, about 16,700 light years from Earth and 120 light years across. It can even be seen with the naked eye, as it contains thousands of stars, as well as about two dozen pulsars. This isn’t the first time 47 Tucanae has been investigated in the hope of finding a black hole at its center, but previous attempts were unsuccessful. Now, two pieces of evidence point to the existence of such a black hole.

The first clue is the overall motion of the stars throughout the cluster. The globular cluster is so dense that heavy stars sink towards the center while lighter ones orbit farther out. The extra gravity from the black hole acts like a spoon “stirring the pot” of stars, causing them to slingshot faster and over greater distances. This change, though subtle, is measurable. The second line of evidence comes from the pulsars mentioned above.

Pulsars are highly magnetized, rotating neutron stars. The radio signals these pulsars emit are very recognizable and easy for astronomers to detect. These objects, too, are flung about by the black hole’s gravity, and they sit much farther from the center of the cluster than you’d expect.

So although we can’t see the black hole directly, we get a good glimpse of its gravitational effect. Kiziltan believes the black hole has a mass of about 2,200 solar masses, which would make it a perfect fit for the middleweight category they were looking for. The team now wants to examine similar clusters, to see if the same analysis could reveal other hidden black holes.

Journal Reference: Bülent Kızıltan, Holger Baumgardt, Abraham Loeb. An intermediate-mass black hole in the centre of the globular cluster 47 Tucanae. Nature, 2017; 542 (7640): 203 DOI: 10.1038/nature21361

European Space Agency gets 9.5% budget increase in 2017

It’s good news for space exploration: Europe has decided to increase its space agency’s budget. However, most of this extra money won’t go into “boldly going where no man has gone before” — it will instead be invested in navigation, technology support, and space situational awareness.

Main Control Room / Mission Control Room of ESA at the European Space Operations Centre (ESOC) in Darmstadt, Germany. Image credits: ESA.

NASA’s budget woes are a secret to no one, although the space agency did secure an impressive budget in 2016 — over $19 billion. But NASA isn’t the only space agency out there. Five other government space agencies have full launch capabilities and are all conducting valuable research: the Indian Space Research Organisation (ISRO), the European Space Agency (ESA), the China National Space Administration (CNSA), the Japan Aerospace Exploration Agency (JAXA), and the Russian Federal Space Agency (Roscosmos). Although none of these can rival NASA in resources, Roscosmos and the ESA still get significant funding and have consistent accomplishments.

ESA’s money comes from its member states, along with a substantial contribution from the European Union. There were some concerns regarding Britain’s proposed exit from the European Union, but ESA Director-General Jan Woerner stressed that Brexit will have little or no effect on Britain’s participation in the ESA.

Much of the funding will be dedicated to managing Galileo and the Copernicus Earth observation network, for which new satellites are scheduled to launch in the coming months. Copernicus is one of the most ambitious Earth observation programmes to date. Its goal is to provide “accurate, timely and easily accessible information to improve the management of the environment, understand and mitigate the effects of climate change and ensure civil security.” With the Trump administration being very vocal against such programs, Europe’s role becomes even more important and Copernicus might bring some much needed data in a world struggling to meet the Paris Agreement goals.

Another notable element in ESA’s budget is the extra 400 million euros ($430 million) injected into the Euro-Russian ExoMars exploration program, which strives to find clear evidence of life on Mars. The mission will search for biosignatures of Martian life, past or present, employing several spacecraft elements sent to Mars on two launches — the second scheduled for 2020.

Here are just a few of the ESA’s most impressive accomplishments (or plans) from 2016:


Blood-repelling surface might finally put an end to clotting in medical implants


Blood, plasma and water droplets beading on a superomniphobic surface. Colorado State University researchers have created a titanium surface that’s specifically designed to repel blood. (Credit: Kota Lab / Colorado State University)

Medical implant designers have always found it challenging to make their prostheses both biocompatible and safe from blood clotting. The solution might lie at the interface between materials science and biomedical engineering, as Colorado State University engineers recently demonstrated. A team there designed a “superhemophobic” titanium surface that is extremely repellent to blood. Tests run in the lab suggest that blood would stay clear of an implant coated with this surface, averting the clots and infections that often force doctors to perform surgery again.

Arun Kota and Ketul Popat, both from Colorado State University’s mechanical engineering and biomedical engineering departments, combined their expertise in an effort to design a surface that repels blood. Kota is an expert in superomniphobic materials (the kind that can repel virtually any liquid) while Popat’s work has been focused on tissue engineering and bio-compatible materials.

The two had to venture through unexplored terrain, as the typical approach has so far been the opposite: medical implant engineers usually design “philic” surfaces that attract, rather than repel, blood, on the assumption that these are more biocompatible.

“What we are doing is the exact opposite,” Kota said. “We are taking a material that blood hates to come in contact with, in order to make it compatible with blood.”

That may sound confusing, but the finished piece performed as intended. The researchers started with plain sheets of titanium whose surfaces they chemically altered to create a ‘phobic’ geometry that blood cannot come into contact with. It’s akin to how the lotus leaf repels water thanks to its nanoscale texture, only Kota and Popat’s surface was specifically designed to repel blood. Experiments suggest very low levels of platelet adhesion — the biological process that can eventually lead to blood clotting and even biological rejection of the foreign material.


What the titanium’s chemically altered surface looks like. These ‘spikes’ repel the blood. Credit: Colorado State Uni.

Because the blood is ‘tricked’ into sensing no surface blocking its flow, for all intents and purposes there is no foreign material.

“The reason blood clots is because it finds cells in the blood to go to and attach,” Popat said.

“Normally, blood flows in vessels. If we can design materials where blood barely contacts the surface, there is virtually no chance of clotting, which is a coordinated set of events. Here, we’re targeting the prevention of the first set of events.”

Next on the drawing board is to test new textures and chemistries. So far, fluorinated nanotubes seem to offer the best protection against clotting. Other clotting factors will also be examined and hopefully the Colorado State team may soon have the chance to test their work with real medical devices.

The findings were reported in the Advanced Healthcare Materials journal.

New NASA phone app is basically ‘Plant Sims in Space’

Science fans and gamers rejoice — NASA just released a phone app (iOS / Android) which allows you to take part in virtual experiments carried on the International Space Station (ISS).

“Welcome to the International Space Station! As the newest member of the ISS crew, it’s your task to familiarize yourself with the station, and help out with the plant growth experiment” — App description.

Released last month, NASA Science Investigations: Plant Growth is both fun and educational. As a former avid Sims player, I feel NASA’s app has many similarities to the classic game. You walk around the ISS, interact with another astronaut, and solve various tasks.

Image credits: NASA / Play Store.

For starters, you learn to navigate the ISS just like a real astronaut would — the game is a faithful replica of the real station, down to the last shelf. After that, you start talking to another astronaut, Naomi, who convinces you to start growing some plants. This is the highlight of the game, and it mimics a real experiment: NASA is actively researching growing plants in outer space, and the ISS hosts a real blooming garden.

“In anticipation of long duration missions in the future, plant growth in space will become more important for several reasons,” Sharon Goza, IGOAL Project Manager at NASA’s Johnson Space Center, told Gizmodo. “Growing plants for food in space not only provides a variety of nutrients, but also may provide psychological benefits.”

So if you want to get your kid more interested in science or if you want to get a feel of the ISS life yourself, and maybe grow some space plants, be sure to check out NASA Science Investigations: Plant Growth.


Rare prediction: Star collision will be visible with the naked eye in 2022

Some scientists have made an unprecedented prediction: a pair of stars in the constellation Cygnus will collide in approximately 2022, creating an explosion so bright it will be visible to the naked eye.

The stars expected to merge are located in the Cygnus constellation. Image via NASA.

Calvin College professor Larry Molnar worked with his students and researchers from the Apache Point Observatory (Karen Kinemuchi) and the University of Wyoming (Henry Kobulnicky) to make an unprecedented claim. He and his team believe that two stars he is monitoring in the constellation Cygnus will merge and explode in 2022 (give or take a year), lighting up in the sky.

“It’s a one-in-a-million chance that you can predict an explosion,” Molnar said of his bold prognostication. “It’s never been done before.”

“It will be a very dramatic change in the sky, as anyone can see it,” Calvin College astronomer Larry Molnar told National Geographic. “You won’t need a telescope to tell me in 2023 whether I was wrong or I was right.”

If it does happen, then you should see it easily without a telescope or any other specialized tools.

The two stars, jointly called KIC 9832227, are located some 1,800 light-years away from Earth. Astronomers expect them to merge at some point, but the exact time is hard to predict. Stellar collisions of this kind occur only about once every 10,000 years, and scientists have only recently been able to observe a stellar merger at all. If we do see the explosion in 2022, it means the merger actually happened roughly 1,800 years ago, and its light simply hasn't reached us yet. As we see them now, the stars orbit extremely close to one another. Daniel Van Noord, who also worked on the study, said they share an atmosphere “like two peanuts sharing a single shell.”
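The light-travel arithmetic is worth spelling out. A minimal sketch, using the approximate distance and arrival date from the article:

```python
# If the light from the merger reaches Earth in 2022, and the system
# sits roughly 1,800 light-years away, the merger itself happened
# roughly 1,800 years before that, around the 3rd century AD.
DISTANCE_LY = 1800     # approximate distance to KIC 9832227, in light-years
ARRIVAL_YEAR = 2022    # predicted year the light reaches Earth

event_year = ARRIVAL_YEAR - DISTANCE_LY
print(event_year)  # → 222
```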

It was Van Noord who started the study in 2013, when he realized that what astronomers then considered a single star was actually a binary system.

“He looked at how the color of the star correlated with brightness and determined it was definitely a binary,” said Molnar. “In fact, he discovered it was actually a contact binary, in which the two stars share a common atmosphere, like two peanuts sharing a single shell.”

“From there, Dan determined a precise orbital period from Kinemuchi’s Kepler satellite data (just under 11 hours) and was surprised to discover that the period was slightly less than that shown by earlier data,” Molnar continued.

This reminded them of the work of Romuald Tylenda, who made similar observations of a star in 2008. That star system exhibited the same behavior, a steadily shrinking orbital period, and then exploded, prompting Molnar to believe that the same thing will happen here. Extrapolating from Tylenda's data, they made the bold prediction and even set a date for it: 2022.
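To illustrate the kind of extrapolation involved, here is a sketch with entirely made-up numbers (the team's actual analysis fits years of Kepler and ground-based timing data): if the orbital period is shrinking at an accelerating rate, you can fit a curve to the period measurements and extrapolate forward to a hypothetical critical period at which a merger would be expected.

```python
# Hypothetical, synthetic (year, orbital period in hours) measurements;
# the real KIC 9832227 period is just under 11 hours and shrinking.
data = [(2013.0, 10.950), (2015.0, 10.930), (2017.0, 10.890)]

def period(t):
    """Quadratic through the three points (Lagrange interpolation)."""
    total = 0.0
    for i, (ti, pi) in enumerate(data):
        term = pi
        for j, (tj, _) in enumerate(data):
            if i != j:
                term *= (t - tj) / (ti - tj)
        total += term
    return total

CRITICAL_PERIOD = 10.0  # hours; a made-up merger threshold for illustration
t = data[-1][0]
while period(t) > CRITICAL_PERIOD:
    t += 0.1
print(round(t, 1))  # the year the extrapolated period crosses the threshold
```

With these invented numbers the extrapolation crosses the threshold around 2031; the point is only the method, not the date.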

A red nova explosion like the one we might expect in 2022. Image via Space Telescope Science Institute.

Whether or not this will happen, we should pay close attention, Molnar says, because this event will broaden our understanding of stars.

“Bottom line is we really think our merging star hypothesis should be taken seriously right now and we should be using the next few years to study this intensely so that if it does blow up we will know what led to that explosion,” he says.

The study is also interesting because, unlike most astronomical observations, which involve numerous teams and expensive equipment, this one was small-scale and low-cost; that, as Molnar himself says, is why he can afford to make such a risky prediction.

“Most big scientific projects are done in enormous groups with thousands of people and billions of dollars,” he said. “This project is just the opposite. It’s been done using a small telescope, with one professor and a few students looking for something that is not likely.”

“Nobody has ever predicted a nova explosion before. Why pay someone to do something that almost certainly won’t succeed? It’s a high-risk proposal. But at Calvin it’s only my risk, and I can use my work on interesting, open-ended questions to bring extra excitement into my classroom. Some projects still have an advantage when you don’t have as much time or money.”

Hubble Telescope maps Voyager’s road trip

As Voyager continues to go where no other mission has gone before, it receives a bit of help from the Hubble Telescope.

An artist’s conception of Voyager 1, which is now in interstellar space, and the Solar System the probe left behind. Image credits: NASA.

Voyager 1 has entered interstellar space. The NASA spacecraft, which left Earth on a September morning in 1977, has traveled farther than anyone, or anything, in history. As it prepares to sail even deeper into the unknown, the probe is getting a bit of help from the Hubble Telescope, which is detailing what lies ahead of it: in particular, rich clouds of hydrogen.

“If the Voyager spacecraft are the Google Street View car going around your neighbourhood taking pictures on the street, then Hubble is providing the overview, the road map for the Voyagers on their trip through interstellar space,” says Julia Zachary, an undergraduate student at Wesleyan University in Middletown, Connecticut.

It’s a rare marriage between two emblematic missions, two landmark projects which have marked space exploration. There are two Voyager spacecraft, both of which launched in 1977 on a mission to visit and study the giant planets of our solar system: Jupiter, Saturn, Uranus, and Neptune. After that, however, they were set to continue beyond the solar system, drifting away into cold, vast space. Voyager 1 is already in interstellar space, revealing some rather surprising information about the edge of our solar system. Meanwhile, Voyager 2 is still just barely within the Solar System, but already more than 17 billion kilometers away from Earth.

The thing is, the two missions will be leaving the solar system at slightly different angles, and Hubble is peering ahead of both their trajectories. Although the Google Street View analogy is quite neat, this is no ordinary map Hubble is developing. The telescope analyzes light coming from distant stars and reads the chemical signatures trapped in that light. Most notably, it picked up signatures showing that some of the hydrogen clouds Voyager 1 is heading towards contain small amounts of carbon. As Voyager reaches those parts of space, it will send back information to be analyzed and compared with what Hubble found. It's almost unique in astronomy to pair indirect measurements (from Hubble) with direct, on-the-spot measurements (from Voyager).

“As an astronomer, I’m not used to having measurements from the place I’m observing,” says Seth Redfield, an astronomer at Wesleyan and a member of the team.

The Voyager missions also carry phonograph records: 12-inch gold-plated copper disks containing sounds and images selected to portray the diversity of life and culture on Earth.