Tag Archives: fusion

New theoretical framework will keep our fusion reactors from going ‘boom’

New theoretical work finally paves the way to viable fusion reactors and abundant energy for all.

Hand touching a plasma lamp.
Image credits Jim Foley.

A team of physicists from the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) at Princeton University’s Forrestal Campus, New Jersey, may have finally solved a long-standing problem in physics — how to tame fusion for energy production. Their work lays down the groundwork needed to stabilize the temperature and density levels of plasma in fusion reactors, an issue that plagued past efforts in this field.

Wild and energetic

Plasma is one of the four natural states of matter. That may sound confusing, since we’re all taught that stuff is either a gas, a liquid, or a solid, but there’s a good explanation for this: plasma is such a violent and energetic state of matter that it rarely occurs freely on Earth. It is, however, the stuff that most stars are made of.

Think of plasma as a soup, only instead of veggies, it’s full of protons and electrons (essentially, hydrogen atoms stripped apart by heat) that smack together to create helium. This process requires a lot of energy to get going — you need to heat the hydrogen to about 100 million degrees Celsius — but it will generate monumental amounts of energy if you manage to keep it running.

It’s easy to understand, then, why fusion is often hailed as the harbinger of infinite, free energy for everybody — ever. So far, we’ve successfully recreated plasma in fusion reactors — the donut-shaped tokamaks or funky stellarators, for example — but we’ve yet to find a way of keeping this super-heated soup of charged particles stable for more than a few seconds.

One of the biggest hurdles we’ve encountered is that plasma in fusion reactors tends to fluctuate wildly in terms of temperature and density. Such turbulence is very dangerous, as any inkling of runaway plasma will eat through a reactor’s wall like a lightsaber through butter. Faced with such odds, researchers have little choice but to shut down experimental reactions before they run amok.

Plasma confined in the MAST tokamak at the Culham Centre for Fusion Energy in the UK. Magnetic field lines combine to act like an invisible bottle for the plasma.
Image credits ITER / CCFE.

The most frustrating thing is that we know what we have to do, but not how to do it. We need to contain the plasma in an orderly fashion and keep the reaction going long enough for it to start being net-energy-positive — i.e. generate more energy than we put in.
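Net-energy-positive operation is usually quantified as the fusion gain factor Q, the ratio of fusion power produced to heating power put in. A minimal sketch (the ITER design target of 500 MW out for 50 MW in is the one widely quoted figure; the function name is just illustrative):

```python
# Fusion gain factor Q = fusion power out / heating power in.
# Q > 1 is scientific "breakeven"; practical power plants need much more.

def fusion_gain(power_out_mw: float, power_in_mw: float) -> float:
    """Q = P_out / P_in for a fusion device."""
    return power_out_mw / power_in_mw

# ITER's design target: 500 MW of fusion power from 50 MW of heating power.
print(fusion_gain(500, 50))  # 10.0
```

Every experimental device so far has operated well below Q = 1, which is why reactions are still net consumers of energy.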

Stars can cash in on their sheer mass to press plasma into playing nice, but we don’t have that luxury. Instead, we use massively-powerful magnetic fields (some 20,000 times stronger than that of the Earth) to keep it away from the reactor’s walls.

Go with the flow

This is where the present paper comes in. Certain types of plasma flows (like those inside stars) have been found to be very stable over time, free of dangerous turbulence. We didn’t know how to make plasma flow like this, but the PPPL researchers report it comes down to a mechanism called magnetic flux pumping, which forces the flow at the core of the plasma body to stay stable.

According to the flow simulations the team ran, magnetic flux pumping can take place in hybrid scenarios — a mix of the standard flow regimes currently known from theoretical and experimental models. These standard regimes include high-confinement mode (H-mode) and low-confinement mode (L-mode).

In L-mode, an electrically-balanced scenario — meaning it has a perfect ratio of positively to negatively charged particles — that forms at lower temperatures, turbulence allows the plasma to leak away some of its energy. L-mode is unstable, as high-temperature plasma at the core is thrown out to the surface, destabilizing the reaction. If this mode can be surpassed and the reaction enters H-mode, the overall temperature of the plasma body increases and the reaction stabilizes. H-mode is an energy-imbalanced mode, but the plasma is kept stable and confined by electric fields it itself generates (T. Kobayashi et al., Nature, 2016).

In a hybrid scenario, however, the flow is kept orderly only at the plasma body’s core. This generates an effect similar to that encountered inside the Earth, the team reports, where the solid iron core acts as a ‘mixer’, generating a magnetic field. The interactions between this field, the one applied by the generator, and the two types of plasma flow stabilize the reaction.

Even better, this magnetic flux pumping mechanism is self-regulating, the simulations show. If the mixer becomes too strong, the plasma’s current drops just below the point where it would go haywire.

And, even better #2, the authors suggest that ITER — widely held to be the most ambitious nuclear fusion project, currently under construction in Provence, France — may be suited to experiment with developing magnetic flux pumping by using the same hardware it employs to heat up the plasma.

The paper “Magnetic flux pumping in 3D nonlinear magnetohydrodynamic simulations” has been published in the journal Physics of Plasmas.

Functional hydrogen-boron fusion could be here “within the next decade”, powered by huge lasers

Viable fusion may be just around the corner, powered by immensely powerful lasers. Even better, the new technique requires no radioactive fuel and produces no toxic or radioactive waste.

The hydrogen bomb is, to date, the only man-made device to achieve a large-scale net release of energy from fusion.
Image credits National Nuclear Security Administration / Nevada Site Office.

One of the brightest burning dreams of sci-fi enthusiasts the world over is closer to reality than we’ve ever dared hope: sustainable fusion on Earth. Drawing on advances in high-power, high-intensity lasers, an international research team led by Heinrich Hora, Emeritus Professor of Theoretical Physics at UNSW Sydney, is close to bringing hydrogen-boron reactions to a reactor near you.

Energy from scratch

In a recent paper, Hora argues that the path to hydrogen-boron fusion is now viable and closer to implementation than other types of fusion we’re toying with — such as the deuterium-tritium fusion system being developed by the US National Ignition Facility (NIF) and the International Thermonuclear Experimental Reactor under construction in France.

Hydrogen-boron fusion has several very appealing properties which Hora believes puts it at a distinct advantage compared to other systems. For one, it relies on precise, rapid bursts from immensely powerful lasers to squish atoms together. This dramatically simplifies reactor construction and reaction maintenance. For comparison, its ‘competitors’ have to heat fuel to the temperatures of the Sun and then power massive magnets to contain this superhot plasma inside torus-shaped (doughnut-like) chambers.

Furthermore, hydrogen-boron fusion doesn’t release any neutrons in its primary reaction — in other words, it’s not radioactive. It requires no radioactive fuel and produces no radioactive waste. And, unlike most other energy-generation methods, which heat water as an intermediary medium to spin turbines — fossil fuel or nuclear, for instance — hydrogen-boron fusion can convert the energy it releases directly into electricity.

All of this goodness comes at a price, however, one which has so far kept it beyond our grasp. Hydrogen-boron fusion requires immense pressures and temperatures — the reaction only gets comfortable upwards of 3 billion degrees Celsius or so, some 200 times hotter than the Sun’s core.

Back in the 1970s, Hora predicted that this fusion reaction should be feasible without the need for thermal equilibrium, i.e. in temperature conditions we can actually reach and maintain. We had nowhere near the technological basis needed to prove his theory back then, however.

Why not blast it with a laser?

Laser fusion reactor.

Image credits Hora et al., 2017, Laser and Particle Beams.

The dramatic advances we’ve made in laser technology over the last few decades are making the two-laser approach to the reaction Hora developed back then tangibly possible today.

Experiments recently performed around the world suggest that an ‘avalanche’ fusion reaction could be triggered by bursts from a petawatt-scale laser — a pulse packing a quadrillion watts of power. If scientists could exploit this avalanche, Hora said, a breakthrough in proton-boron fusion was imminent.

“It is a most exciting thing to see these reactions confirmed in recent experiments and simulations,” he said.

“Not just because it proves some of my earlier theoretical work, but they have also measured the laser-initiated chain reaction to create one billion-fold higher energy output than predicted under thermal equilibrium conditions.”

Working together with 10 colleagues across six countries, Hora created a roadmap for the development of hydrogen-boron fusion based on his design. The document takes recent breakthroughs into account and points to the areas we still need to work on to develop a functional reactor. The patent for the process belongs to HB11 Energy, an Australian-based spin-off company, which means it’s not open for everyone to experiment with.

“If the next few years of research don’t uncover any major engineering hurdles, we could have a prototype reactor within a decade,” said Warren McKenzie, managing director of HB11.

“From an engineering perspective, our approach will be a much simpler project because the fuels and waste are safe, the reactor won’t need a heat exchanger and steam turbine generator, and the lasers we need can be bought off the shelf,” he added.

The paper “Road map to clean energy using laser beam ignition of boron-hydrogen fusion” has been published in the journal Laser and Particle Beams.

What’s the difference between nuclear fission and fusion

Fission vs fusion reaction. Credit: Duke Energy.

In both fusion and fission, nuclear processes alter atoms to generate energy. Despite having some things in common, the two can be considered polar opposites.

Put simply, nuclear fusion is the combination of two lighter atoms into a heavier one. Nuclear fission is the exact opposite process, whereby a heavier atom is split into two lighter ones.

Fission vs fusion at a glance

Nuclear fission:
  • A heavy nucleus breaks up to form two lighter ones.
  • It involves a chain reaction, which can lead to dangerous meltdowns.
  • The heavy nucleus is bombarded with neutrons.
  • There is established, decades-old technology to control fission.
  • Nuclear waste, a byproduct of fission, is an environmental challenge.
  • Raw material like plutonium or uranium is scarce and costly.

Nuclear fusion:
  • Two light nuclei combine to form a heavier nucleus.
  • There is no chain reaction involved.
  • Light nuclei have to be heated to extremely high temperatures.
  • Scientists are still working on a controlled fusion reactor that offers more energy than it consumes.
  • There is no nuclear waste.
  • Raw materials are very easily sourced.
  • Fusion reactions have energy densities many times greater than fission reactions.
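The energy-density claim can be checked with a back-of-the-envelope calculation. The reaction energies below (17.6 MeV for deuterium-tritium fusion, roughly 200 MeV for U-235 fission) are standard textbook values, and the result is only a rough per-kilogram-of-fuel comparison:

```python
# Back-of-the-envelope energy densities (J/kg) for D-T fusion vs U-235 fission.
MEV_TO_J = 1.602e-13   # joules per MeV
U_TO_KG = 1.6605e-27   # kilograms per atomic mass unit

def energy_density(mev_per_reaction: float, fuel_mass_u: float) -> float:
    """Energy released per kilogram of fuel consumed."""
    return (mev_per_reaction * MEV_TO_J) / (fuel_mass_u * U_TO_KG)

fusion = energy_density(17.6, 5.0)      # D (2 u) + T (3 u) -> He-4 + n, 17.6 MeV
fission = energy_density(200.0, 235.0)  # U-235 -> fragments + neutrons, ~200 MeV

print(f"D-T fusion:    {fusion:.2e} J/kg")
print(f"U-235 fission: {fission:.2e} J/kg")
print(f"ratio: {fusion / fission:.1f}x")
```

Per kilogram of fuel, D-T fusion comes out roughly four times more energetic than U-235 fission — and both dwarf chemical fuels by factors of millions.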

From Einstein to nuclear weapons

On November 21, 1905, physicist Albert Einstein published a paper in Annalen der Physik called “Does the Inertia of a Body Depend Upon Its Energy Content?” This was one of Einstein’s four Annus Mirabilis papers (from the Latin annus mīrābilis, “extraordinary year”), in which he described what has become the most famous equation in physics: E = mc² (energy equals mass times the speed of light squared).

This deceptively simple equation can be found everywhere, even in pop culture. It’s printed on coffee mugs and T-shirts. It’s been featured in countless novels and movies. Millions of people recognize it and can write it down by heart, even though they might not understand anything about the physics involved.

Before Einstein, mass was considered a mere material property that described how strongly an object resists being set in motion. For Einstein, however, relativistic mass — which takes into account the fact that mass increases with speed — and energy are simply two different names for one and the same physical quantity. This gave us a new way to measure a system’s total energy simply by looking at its mass, which is a super-concentrated form of energy.

It didn’t take scientists long to realize there was a massive amount of energy waiting to be exploited. When fission splits uranium atoms, for instance, a huge amount of energy is released, along with neutrons. Interestingly, if you count all the particles before and after the process, you’ll find the total mass of the latter is slightly smaller than that of the former. This difference is called the ‘mass defect’, and it’s precisely this missing matter that has been converted into energy, in an amount you can compute with Einstein’s famous equation. The mass discrepancy might be tiny, but once you multiply it by c² (the speed of light squared), the equivalent energy can be huge.

Of course, this mass-energy equivalence holds true across all domains, in both relativistic and classical physics. A common example is spontaneous oxidation or, more familiarly, combustion. The same formula applies: if you measure the difference between the rest mass of the unburned material and the rest mass of the burned object plus its gaseous byproducts, you’ll also find a tiny mass difference. Multiply it by c² and you wind up with the energy set free during the chemical reaction.
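A quick sketch of both cases, using rounded illustrative figures (roughly 8.2 × 10¹³ J released per kilogram of U-235 fissioned, and roughly 2.4 × 10⁷ J per kilogram of coal burned):

```python
C = 2.998e8  # speed of light, m/s

def mass_defect(energy_joules: float) -> float:
    """Mass converted into energy, from m = E / c^2."""
    return energy_joules / C**2

# Fissioning 1 kg of U-235 releases roughly 8.2e13 J...
nuclear = mass_defect(8.2e13)   # ~9e-4 kg: about 0.1% of the fuel's mass
# ...while burning 1 kg of coal releases roughly 2.4e7 J.
chemical = mass_defect(2.4e7)   # ~2.7e-10 kg: far too small to weigh

print(f"nuclear: {nuclear:.1e} kg, chemical: {chemical:.1e} kg")
```

The nuclear mass defect is about a gram per kilogram; the chemical one is a fraction of a nanogram — the same formula, vastly different scales.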

We’ve all burned a match, though, and there was no mushroom cloud. It can only follow that the square of the speed of light only partly explains the huge difference in energy released between nuclear and chemical reactions. Markus Pössel, managing scientist of the Center for Astronomy Education and Outreach at the Max Planck Institute for Astronomy in Heidelberg, Germany, provides a great explanation for why nuclear reactions can be so much more violent.

“To see where the difference lies, one must take a closer look. Atomic nuclei aren’t elementary and indivisible. They have component parts, namely protons and neutrons. In order to understand nuclear fission (or fusion), it is necessary to examine the bonds between these components. First of all, there are the nuclear forces binding protons and neutrons together. Then, there are further forces, for instance the electric force with which all the protons repel each other due to the fact they all carry the same electric charge. Associated with all of these forces are what is called binding energies – the energies you need to supply to pry apart an assemblage of protons and neutrons, or to overcome the electric repulsion between two protons.”

Nuclear binding energy curve. Credit: hyperphysics.phy-astr.gsu.edu

“The main contribution is due to binding energy being converted to other forms of energy – a consequence not of Einstein’s formula, but of the fact that nuclear forces are comparatively strong, and that certain lighter nuclei are much more strongly bound than certain more massive nuclei.”

Pössel goes on to mention that the strength of the nuclear bond depends on the number of neutrons and protons involved in the reaction. What’s more, binding energy is released both when splitting a heavy nucleus into smaller parts (fission) and when merging lighter nuclei into heavier ones (fusion). This, along with chain reactions, explains why nuclear bombs can be so devastating.

How nuclear fission works

Nuclear fission is a process in nuclear physics in which the nucleus of an atom splits into two or more smaller nuclei, called fission products, usually along with some by-product particles.

Based on Albert Einstein’s eye-opening prediction that mass could be changed into energy and vice-versa, Enrico Fermi’s team built the first nuclear fission reactor in 1942.

When a nucleus fissions, either spontaneously (very rare) or following controlled neutron bombardment, it splits into several smaller fragments, or fission products, each about half the original mass. In the process, two or three neutrons are also emitted. The rest mass difference, about 0.1 percent of the original mass, is converted into energy.

Nuclear fission of Uranium-235. Credit: Wikimedia Commons.

The energy released by a nuclear fission reaction can be tremendous. For instance, completely fissioning one kilogram of uranium-235 releases about as much energy as burning three million kilograms of coal.
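A rough check of that comparison, assuming ~200 MeV per fission (about 8.2 × 10¹³ J per kilogram of U-235) and a typical heat of combustion for coal:

```python
E_FISSION_PER_KG = 8.2e13  # J released by fully fissioning 1 kg of U-235
E_COAL_PER_KG = 2.9e7      # J released by burning 1 kg of coal (typical value)

coal_equivalent_kg = E_FISSION_PER_KG / E_COAL_PER_KG
print(f"1 kg of U-235 ~ {coal_equivalent_kg:.1e} kg of coal")
```

The ratio comes out near three million to one, which is why nuclear fuel is so absurdly compact compared to chemical fuel.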

To trigger nuclear fission, you fire a neutron at the heavy nucleus to make it unstable. Notice in the example above that fragmenting U-235, the most important fissile isotope of uranium, produces three neutrons. These neutrons, if they encounter other U-235 atoms, can and will initiate further fissions, producing even more neutrons. Like falling dominoes, the neutrons unleash a continuing cascade of nuclear fissions called a chain reaction.

In order to sustain the chain reaction, it’s critical that each fission release more neutrons than were used to trigger it. It follows that only isotopes whose fission releases an excess of neutrons can support a chain reaction. The isotope U-238, for instance, can’t sustain the reaction. Most nuclear power plants in operation today use uranium-235 and plutonium-239.

Another prerequisite for the fission chain reaction is a minimum amount of fissionable matter. If there is too little material, neutrons can shoot out of the sample before they have a chance to interact with a U-235 nucleus, causing the reaction to fizzle. This minimum amount of fissionable matter is what nuclear scientists call critical mass; anything below this threshold is called a subcritical mass.

U-235 fission chain reaction. Credit: Wikimedia Commons.
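The critical/subcritical behavior described above can be sketched with a toy model (purely illustrative — real neutron transport depends on geometry, enrichment, and moderation): each fission generation multiplies the neutron population by an effective factor k, the average number of follow-on fissions each fission causes.

```python
# Toy chain-reaction model: if each fission's neutrons cause, on average,
# k new fissions, the neutron population scales by k every generation.

def neutron_population(k: float, generations: int, start: float = 1.0) -> float:
    """Neutron count after a number of generations at multiplication factor k."""
    return start * k ** generations

for k in (0.9, 1.0, 1.1):
    print(f"k = {k}: {neutron_population(k, 50):.3g} after 50 generations")
# k < 1 (subcritical): the cascade fizzles out
# k = 1 (critical): a steady, controlled reaction, as in a power plant
# k > 1 (supercritical): runaway exponential growth
```

Even a modest k = 1.1 grows the population a hundredfold in 50 generations, which is why control rods in a reactor work by nudging k back down to exactly 1.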

How nuclear fusion works

Fusion occurs when two smaller atoms collide at very high energies to merge, creating a larger, heavier atom. This is the nuclear process that powers the sun’s core, which in turn drives life on Earth.

As with fission, there’s a mass defect — the fused mass is less than the sum of the masses of the individual nuclei — and this missing mass is the source of the energy released by the reaction. That’s the secret of the fusion reaction. Fusion reactions have an energy density many times greater than fission reactions, which are themselves millions of times more energetic than chemical reactions.

Nuclear fusion is what powers the sun’s core. Credit: NASA.

Nuclear fusion could one day provide humanity with inexhaustible amounts of energy. When that day may come is not clear at this point since progress is slow, but that’s understandable. Harnessing the same nuclear forces that drive the sun presents significant scientific and engineering challenges.

Normally, light atoms such as hydrogen or helium don’t fuse spontaneously, because the positive charge of their nuclei makes them repel each other. Inside hot stars such as the sun, however, extremely high temperatures and pressures strip atoms into their constituent protons, electrons, and neutrons. Inside the core, the pressure is millions of times higher than at the surface of the Earth, and the temperature reaches more than 15 million Kelvin. These conditions remain stable because the core sits in a never-ending tug of war between the sun’s self-gravity, which squeezes it, and the thermal pressure generated by fusion, which pushes outward.

Thanks to quantum-tunneling effects, protons crashing into one another at high energy can fuse into helium nuclei after a number of intermediate steps. Fusion inside the star, a process called the proton-proton chain, follows this sequence:

The proton-proton fusion process that is the source of energy from the Sun. Credit: Wikimedia Commons.

  1. Two pairs of protons fuse, forming two deuterons. A deuteron — the nucleus of deuterium, a stable isotope of hydrogen — consists of 1 proton and 1 neutron.
  2. Each deuteron fuses with an additional proton to form helium-3;
  3. Two helium-3 nuclei fuse to create beryllium-6, but this is unstable and disintegrates into two protons and a helium-4;
  4. The reaction also releases two neutrinos, two positrons, and gamma rays.

Since the helium-4 nucleus has less rest mass (and therefore less energy) than the 4 protons which initially came together, the difference is radiated out of the core and across the solar system.

To shine as brightly as it does, the sun fuses about 600 million tons of hydrogen nuclei (protons) into helium every second, releasing 384.6 trillion trillion joules of energy per second. This is equivalent to the energy released in the explosion of 91.92 billion megatons of TNT per second. Of all the mass that undergoes this fusion process, though, only about 0.7% of it is turned into energy.
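These figures are consistent with E = mc²; the quick check below uses the paragraph’s rounded numbers:

```python
C = 2.998e8               # speed of light, m/s
HYDROGEN_FUSED = 6.0e11   # kg of protons fused per second (~600 million tons)
MASS_FRACTION = 0.007     # ~0.7% of that mass is converted to energy

luminosity = HYDROGEN_FUSED * MASS_FRACTION * C**2  # watts (J/s)
print(f"{luminosity:.2e} W")  # ~3.8e26 W, close to the quoted 384.6 trillion trillion J/s
```

The small remaining gap comes from rounding the inputs; the measured solar luminosity is about 3.85 × 10²⁶ W.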

Though scientists have been trying to harness fusion for decades, we’ve yet to fulfill the fusion dream that promises unlimited clean energy.

While it’s relatively easy to split an atom to produce energy, fusing hydrogen nuclei is a couple of orders of magnitude more challenging. To replicate the fusion process at the core of the sun, we have to reach a temperature of at least 100 million degrees Celsius — about six times hotter than the sun’s core — because we don’t have the intense pressure created by the gravity of the sun’s interior.

That’s not to say that we haven’t achieved fusion yet. It’s just that all experiments to date put more energy in enabling the required temperature and pressure to trigger significant fusion reactions than the energy generated by these reactions.

Promising new technologies like magnetic confinement and laser-based inertial confinement could one day surprise all of us with a breakthrough moment. One of the most important projects in the field is the International Thermonuclear Experimental Reactor (ITER), a joint fusion experiment still under construction in France. Its doughnut-shaped fusion machine, called a tokamak, is expected to start fusing atoms in 2025.

Elsewhere, in Germany, the Wendelstein 7-X reactor, which uses a complex design called a stellarator, was fired up for the first time in late 2015. It worked as expected, though, like all other fusion reactors, it is still inefficient. The Wendelstein reactor, however, was built as a proof of concept for the stellarator design, which adds several twists to the tokamak ring to increase stability. The UK and China have their own experimental fusion reactors as well.

The United States, on the other hand, wants to significantly revamp the classical fusion reactor. Physicists at the Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) are proposing a more efficient shape that employs spherical tokamaks, more akin to a cored apple.  The team writes that this spherical design halves the size of the hole in the doughnut, meaning we can use much lower energy magnetic fields to keep the plasma in place.

It seems like we’re still decades away from seeing an efficient fusion reactor. When we do get our own sun in a jar, though, be ready to embrace the unexpected, for nothing will be the same again.

Another breakthrough for fusion reported in South Korea

Scientists working on nuclear fusion have announced the breaking of another record – they’ve maintained ‘high performance’ plasma in a stable state for 70 seconds this week, the longest time ever recorded for this type of reaction.

Image credits: Michel Maccagnan/Wikimedia Commons.

Fusion power is regarded by many researchers as the Holy Grail of clean energy. If we could create functional fusion machines, we could generate clean energy for thousands of years with little more than seawater. Researchers have been dreaming of clean fusion for decades, but so far it remains a work in progress – and many doubt its feasibility. In recent years, however, several breakthroughs have shown promise and are bringing the technology closer to reality. In Germany, a device called a stellarator is reportedly working as planned, and in Korea, a different type of reactor (a tokamak) has sustained fusion for the longest time ever.

The problem with fusion power is that you have to maintain ungodly high temperatures – up to 300 million degrees Celsius (about 540 million degrees Fahrenheit). These temperatures are required for hydrogen atoms to fuse together and create helium, the process which releases energy – similar to what happens inside the Sun and other stars. This is why fusion energy is sometimes called “a star in a jar.”

But maintaining such high temperatures is no easy feat, and it involves incredibly strong magnetic fields. Ultimately, what you get is a tradeoff between temperature, pressure, and time: you can reach high temperatures at high pressures, which is what you want, but you’ll have trouble sustaining them for long. That’s why 70 seconds might not seem like a lot, but it is.
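This trade-off is commonly quantified by the Lawson ‘triple product’ of density, temperature, and confinement time. A minimal sketch, assuming the approximate deuterium-tritium ignition threshold and made-up plasma parameters (not figures from KSTAR or any specific machine):

```python
# Lawson criterion sketch: for D-T fuel, ignition needs roughly
# n * T * tau >= 3e21 keV*s/m^3 (density x temperature x confinement time).

IGNITION_THRESHOLD = 3e21  # keV * s / m^3, approximate D-T requirement

def triple_product(density_m3: float, temperature_kev: float,
                   confinement_s: float) -> float:
    """The n*T*tau figure of merit for a confined plasma."""
    return density_m3 * temperature_kev * confinement_s

# A hypothetical tokamak-like plasma: 1e20 particles/m^3, 10 keV, 1 s.
ntt = triple_product(1e20, 10.0, 1.0)
print(f"{ntt:.1e} keV*s/m^3 -> ignites: {ntt >= IGNITION_THRESHOLD}")
```

Falling short in any one of the three factors can be compensated by the others, which is why long confinement times like KSTAR’s 70 seconds matter even at modest pressures.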

The KSTAR reactor is housed at the National Fusion Research Institute (NFRI), and has shown great progress in recent years.

“The world record for high-performance plasma for more than a minute demonstrated that the KSTAR is the forefront in steady-state plasma operation technology in a superconducting device,” NFRI said in a statement today. “This is a huge step forward for realization of the fusion reactor.”

Sure enough, the device still consumes more energy than it creates, but at this point it’s just a proof of concept. As they push the limits of the device further and further, the researchers hope to ultimately harness the energy of fusion – and usher in a new age of cheap and clean energy.

New record gets us closer to fusion energy

Scientists have just broken the record for plasma pressure – the key ‘ingredient’ for fusion, bringing us one step closer to clean fusion energy.

“This is a remarkable achievement that highlights the highly successful Alcator C-Mod program at MIT,” said physicist Dale Meade of Princeton Plasma Physics Laboratory, who wasn’t involved in the experiments.

Image credits: Bob Mumgaard/Plasma Science and Fusion Centre

Fusion energy has been touted as a clean energy source for decades, to the point where many believe it to be a pipe dream. There is no question about fusion’s theoretical feasibility, because it happens naturally in stars, but whether we could actually harvest this energy for ourselves is a different story.

Fusion power is the generation of energy by nuclear fusion. Fusion reactions occur when two or more atomic nuclei come close enough for the strong nuclear force pulling them together to exceed the electrostatic force pushing them apart, fusing them into heavier nuclei. For nuclei lighter than iron-56, the process releases energy, which can be harvested. But achieving a stable system for fusion energy generation remains far away.

Still, this is a notable landmark. The research team achieved a pressure of 2.05 atmospheres – a 15 percent jump over the previous record of 1.77 atmospheres. That might not seem like much, but when you consider that the plasma temperature was 35 million degrees Celsius (63 million degrees Fahrenheit) – over twice as hot as the Sun’s core – it becomes easier to understand why this matters. The device they created sustained fusion for 2 seconds, producing a total of 600 trillion fusion reactions.

These three variables – temperature, pressure, and time – are considered a trade-off. You can have high temperatures for a long time, but not at high pressures, and so on. Pressure has proven especially difficult to achieve under these conditions.

The record was achieved at the Alcator C-Mod reactor at MIT, which unfortunately has reached the end of its life after 23 years; funding is being moved to the ITER machine being constructed in France. Right now, existing fusion machines still consume more energy than they produce, but it is hoped that ITER could become the first sustainable fusion machine in the world, paving the way for clean and virtually limitless energy.

If you have any more questions or anything you’d like to learn about this technology, the researchers behind it will be hosting a Reddit AMA on October 20.

The U.S. plans to build the most advanced fusion reactor ever

The US government has put its weight behind efforts to create an economically viable fusion reactor, endorsing a new category of designs that could become the most efficient and viable yet.

Test cell of the NSTX-U.
Image credits Elle Starkman / PPPL Office of Communications.

Re-creating on Earth the atom-fusing processes that sustain the sun has long been one of the holy grails of modern physics. Hydrogen fusion has been powering our Sun for the past 4.5 billion years, and it’s still going strong — a machine that could safely and stably harness these processes would offer humanity safe, clean, and virtually endless energy.

But, at the risk of stating the obvious, making a star isn’t easy. Physicists have seen some progress in this field, but a viable fusion reactor still remains out of their grasp. We’re inching forward, however, and in an effort to promote progress the US government has just backed plans for physicists to build a new kind of nuclear fusion device that could be the most efficient design yet.

Harnessing the atom…again

Our nuclear plants today rely on nuclear fission — the splitting of an atom into tinier atoms and neutrons — to produce energy, and they’re really good at it. Per unit of mass, nuclear fission releases millions of times more energy than coal-burning. The downside is that you have to deal with the resulting radioactive waste, which is really costly and really hard to get right.

But merging atoms, in nuclear fusion, produces no radioactive waste. If you heat up the nuclei of two lighter atoms to a high enough temperature, they merge into a heavier one, releasing massive amounts of energy, with the main reaction product being the fused atom. It’s an incredibly efficient process, one that sustains all the stars in the Universe, our sun included.

So there’s understandably a lot of interest into taking that process, scaling it down, and harvesting it to power our lives. Physicists have been trying to do just that for the past 60 years and still haven’t succeeded, a testament to how hard it can be to put “a star in a jar.” The biggest issue, as you might have guessed, is that stars are incredibly hot.

While fission can be performed at temperatures of just a few hundred degrees Celsius, fusion takes place at star-core temperatures of several millions of degrees. And because our would-be reactors have to jump-start the reaction from scratch, they need to generate temperatures in excess of that — a successful reactor should be able to withstand at least 100 million degrees Celsius. Which is a lot.

“During the process of nuclear fusion, atoms’ electrons are separated from their nuclei, thereby creating a super-hot cloud of electrons and ions (the nuclei minus their electrons) known as plasma,” Daniel Oberhaus wrote for Motherboard.

“The problem with this energy-rich plasma is figuring out how to contain it, since it exists at extremely high temperatures (up to 150 million degrees Celsius, or 10 times the temperature at the Sun’s core). Any material you can find on Earth isn’t going to make a very good jar.”

So what scientists usually do to keep the plasma from vaporizing the device is contain it with magnetic fields. So far, the closest anyone has gotten to sustainable fusion is at the Wendelstein 7-X stellarator in Greifswald, Germany, and at China’s Experimental Advanced Superconducting Tokamak (EAST) – both teams have been working on holding onto super-heated plasma for as long as possible.

The German device managed to heat hydrogen gas to 80 million degrees Celsius and sustain a cloud of hydrogen plasma for a quarter of a second last year. That doesn’t sound like much, but it was a huge milestone in the world of physics. Back in February, the Chinese team reported that it had generated hydrogen plasma at nearly 50 million degrees Celsius and held onto it for 102 seconds. Neither device has proved that fusion can produce net energy, only that the reaction is possible in a controlled environment.

Physicists at the US Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) think that progress has been so slow because we’ve been working with the wrong jar. They plan to redesign the fusion reactor to incorporate better materials and a more efficient shape — instead of using the traditional tokamak to contain the plasma in a doughnut-like shape, they suggest employing spherical tokamaks, more akin to a cored apple. The team writes that this spherical design halves the size of the hole in the doughnut, meaning we can use much lower energy magnetic fields to keep the plasma in place.

Traditional tokamak.
Image credits Matthias W. Hirsch / Wikimedia.

The smaller hole could also allow for the production of tritium – a rare isotope of hydrogen – which can fuse with another isotope of hydrogen, called deuterium, to produce fusion reactions.

They’ve also set their sights on replacing the huge copper magnets employed in today’s tokamak designs with high-temperature superconducting magnets, which are far more efficient because electricity flows through them with zero resistance.

To save development time, the team will be applying these improvements to two existing spherical tokamaks: the UK’s Mega Ampere Spherical Tokamak (MAST), which is in the final stages of construction, and PPPL’s National Spherical Torus Experiment Upgrade (NSTX-U), which came online last year.

“We are opening up new options for future plants,” one of the researchers behind the study, NSTX-U program director Jonathan Menard, said in a statement.

“[These facilities] will push the physics frontier, expand our knowledge of high temperature plasmas, and, if successful, lay the scientific foundation for fusion development paths based on more compact designs,” added PPPL director Stewart Prager.

Right now, all we can do is wait and see the results. But if this works, we’ll be one step closer to creating stars right here on Earth — then plugging them right into the grid to power our smartphones.

The full paper titled “Fusion nuclear science facilities and pilot plants based on the spherical tokamak” has been published in Nuclear Fusion.


The ‘Next Big Things’ in Science Ten Years from Now

ZME Science reports the latest trends and advances in science on a daily basis. We believe this kind of reporting helps people keep up with an ever-changing world, while also fueling inspiration to do better.

But it can also get frustrating when you read about 44%-efficiency solar panels that you, as a consumer, can’t have. Of course, there is always a time lag as the wave of innovation travels from early adopters to mainstream consumers. The first fully functional digital computer, the ENIAC, was invented in 1946, but it wasn’t until 1975 that Ed Roberts introduced the first personal computer, the Altair 8800. Think touch screen tech is a new thing? The first touch screen was invented by E.A. Johnson at the Royal Radar Establishment in Malvern, UK, between 1965 and 1967. In the ’80s and ’90s, companies like Hewlett-Packard and Microsoft introduced several touch screen products with modest commercial success. It wasn’t until Apple released the first iPhone in 2007 that touch screens truly became popular and accessible. And the list goes on.


The point I’m trying to make is that all the exciting stuff we’re seeing coming out of cutting-edge labs around the world will take time to mature and become truly integrated into society. It’s in the bubble stage, and for some the bubble will pop and the tech won’t survive. Other inventions and research might resurface many decades from now.

So, what will the future look like ten years from now? What’s the next big thing? It’s my personal opinion that, given the current pace of technological advancement, these sorts of estimates are very difficult, if not impossible, to make. As such, here are just a few of my guesses as to which technologies — some new, others improved versions of what’s already mainstream today — will become an integral part of society in the future.

The next five years

Wearable devices

A hot trend right now is integrating technology into wearable devices. Glasses with cameras (such as Google Glass) or watches that answer your phone calls (like the Apple Watch) are just a few of the products that are very popular right now. Industry experts believe we’re just scratching the surface, though.

Thanks to flexible electronics, clothing will soon house computers, sensors, or wireless receivers. But most of these need to connect to a smartphone to work. The real explosion of wearable tech might happen once these are able to break free and work independently.

“Smart devices, until they become untethered or do something interesting on their own, will be too complicated and not really fulfill the promise of what smart devices can do,” Mike Bell, head of Intel’s mobile business, said. “These devices have to be standalone and do something great on their own to get mass adoption. Then if they can do something else once you pair it, that’s fine.”

Internet of Things

In line with wearable devices is the Internet of Things — machines talking to one another, with computer-connected humans observing, analyzing, and acting upon the resulting ‘big data’ explosion. Refrigerators, toasters, and even trash cans could be computerized and, most importantly, networked. One of the better-known examples is Google’s Nest thermostat.

This Wi-Fi-connected thermostat allows you to remotely adjust the temperature of your home via your mobile device and also learns your behavioral patterns to create a temperature-setting schedule. Nest was acquired by Google for $3.2 billion in 2014. Another company, SmartThings, which Samsung acquired in August, offers various sensors and smart-home kits that can monitor things like who is coming in and out of your house and can alert you to potential water leaks to give homeowners peace of mind. Fed by sensors soon to number in the trillions, working with intelligent systems in the billions, and involving millions of applications, the Internet of Things will drive new consumer and business behavior the likes of which we’ve yet to see.

Big Data and Machine Learning

Big data is a hyped buzzword nowadays that’s used to describe massive sets of (both structured and unstructured) data which are hard to process using conventional techniques. Big data analytics can reveal insights previously hidden by data too costly to process. One example is peer influence among customers revealed by analyzing shoppers’ transaction, social, and geographical data.

With more and more information being stored online, especially as the Internet of Things and wearable tech gain in popularity, the world will soon reach an overload threshold. Sifting through this massive volume of data is thus imperative, and this is where machine learning comes in. Machine learning doesn’t refer to household robots, though. Instead, it’s a concept much closer to home. For instance, your email has a spam folder where emails that fit a certain pattern are filtered out by an algorithm that has learned to distinguish between “spam” and “not spam”. Similarly, your Facebook feed is filled with posts from your closest friends because an algorithm has learned what your preferences are based on your interactions — likes, comments, shares, and clickthroughs.
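Spam filtering of this kind is classically done with a naive Bayes classifier. Here is a minimal, self-contained Python sketch; the tiny training set and simple word-splitting are purely illustrative, as real filters train on millions of labeled emails and use far richer features:

```python
from collections import Counter
import math

# Toy training data; real filters learn from millions of labeled emails.
spam = ["win money now", "free money offer", "win a free prize"]
ham = ["meeting at noon", "project update attached", "lunch at noon"]

def word_counts(docs):
    c = Counter()
    for d in docs:
        c.update(d.split())
    return c

spam_counts, ham_counts = word_counts(spam), word_counts(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_likelihood(text, counts, total):
    # Laplace smoothing so unseen words don't zero out the score.
    return sum(math.log((counts[w] + 1) / (total + len(vocab)))
               for w in text.split())

def classify(text):
    s = log_likelihood(text, spam_counts, sum(spam_counts.values()))
    h = log_likelihood(text, ham_counts, sum(ham_counts.values()))
    return "spam" if s > h else "not spam"

print(classify("free money"))    # spam
print(classify("noon meeting"))  # not spam
```

The classifier simply asks which class makes the observed words more probable, which is why spammy phrasing ends up in the junk folder even if the exact message was never seen before.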

Where big data and machine learning meet, an informational revolution awaits, and no field has greater transformative potential than medicine. Doctors will be aided by smart algorithms that mine each patient’s dataset, complete with previous diagnoses and genetic information. The algorithm would comb through these vast records and correlate them with the medical literature. For instance, when a cancer patient comes in for treatment, the doctor could be informed that, since the patient carries a certain gene or set of genes, a customized treatment would apply. Amazing!


Cryptocurrency

You might have heard of Bitcoin, but it’s not the only form of cryptocurrency. Today, there are thousands of cryptocurrencies. Unlike government-backed currencies, which are regulated and created by a central bank, cryptocurrencies are generated by computers solving complex mathematical problems and rely on decentralized, peer-to-peer networks. While these were just a fad a few years ago, things are a lot more serious now. Shortly after Bitcoin’s creation, one user spent 10,000 Bitcoin on two pizzas. That same amount of Bitcoin would be worth about $8 million just a few years later. Today, it’s worth around $63 million.
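The “complex series of algorithms” is, in Bitcoin’s case, a proof-of-work puzzle: miners race to find a number (a nonce) that makes the block’s hash start with enough zeros. Here is a heavily simplified Python sketch; Bitcoin actually hashes a binary block header with double SHA-256, and expresses difficulty differently:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    """Find a nonce whose SHA-256 hash starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Hypothetical block contents, for illustration only.
nonce = mine("block #1: Alice pays Bob 5 coins", difficulty=4)
print(nonce)  # the first nonce producing a hash with four leading zeros
```

Because hashes are unpredictable, the only way to find a valid nonce is brute force, which is what makes the work costly and the resulting chain hard to rewrite.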

There’s much debate surrounding cryptocurrency. For instance, because it’s decentralized and anonymous, Bitcoin has been and still is used to fund illegal activities. There’s also always the risk of a computer crash erasing your wallet, or of a hacker ransacking your virtual vault. Most of these concerns aren’t all that different from those surrounding traditional money, though, and with time, cryptocurrencies could become very secure.

Driverless cars

In 2012, California was the first state to formally legalize driverless cars. The UK is set to follow this year.

Some 1.2 million people worldwide die in car accidents every year. Tests so far have shown that driverless cars are very safe and should greatly reduce motor accidents. In fact, if all the cars on a motorway were driverless and networked, then theoretically no accident should ever occur. Moreover, algorithms would ensure the best possible traffic flow, calculating the velocity each car should travel relative to the others so that the whole column moves forward at maximum speed. Of course, this would mean that most people would have to give up driving, which isn’t an option for those who enjoy it. Even so, you could get to work alone in the car without a driver’s license. “Almost every car company is working on automated vehicles,” says Sven Beiker, the executive director of the Center for Automotive Research at Stanford.

3D printing

A 3D printer reads every slice (or 2D image) of your virtual object and proceeds to create it, fusing layer upon layer into a single 3D object. It’s not exactly new: companies, especially in R&D or the automotive business, have been using 3D printers to make molds and prototypes for more than two decades. What’s new is how this technology has reached the common folk. Nowadays, you can buy a decent 3D printer for less than $600. With it, you can print spare parts for your broken machines, make art, or whatever else suits your fancy.

You don’t even have to know how to design. Digital libraries for 3D parts are growing rapidly and soon enough you should be able to print whatever you need. The technology itself is also advancing. We’ve seen 3D printed homes, cars, or ears, and this is just the beginning. Scientists believe they can eventually 3D print functioning organs that are custom made for each patient, saving millions of lives each year.

Virtual reality

The roots of virtual reality can be traced to the late 1950s, at a time when computers were confined Goliaths the size of a house. A young electrical engineer and former naval radar technician named Douglas Engelbart saw the computer’s potential as a digital display and laid the foundation for virtual reality. Fast forward to today and not that much has come of VR — at least not the way we’ve seen in movies.

But if we were to try on the proverbial VR goggles, what insight into the future might they grant? Well, you’d see a place for VR that goes far beyond video games, like the kind Oculus Rift strives towards. Multi-player VR provides the foundation for a class of students to tour the Egyptian pyramids together, for a group of friends to watch the latest episode of “Game of Thrones” side by side, or for the elderly to share a visit with grandkids who may be halfway around the world. Where VR might be most useful is not in fabricating fantasies, but in enriching reality by connecting people like never before. It’s terribly exciting.


Cheap DNA sequencing

It’s been 10 years since the human genome was first sequenced. In that time, the cost of sequencing one person’s genome has fallen from $2.7bn to just $5,000! Raymond McAuley, a leading genomics researcher, predicted in a lecture at Singularity University’s Exponential Finance 2014 conference that we will be sequencing DNA for pennies by 2020. When sequencing is applied to a mass population, we will have mass data, and who knows what that data will reveal?
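A quick back-of-the-envelope check shows just how fast that price drop is. Taking the article’s figures of $2.7 billion down to $5,000 over roughly ten years:

```python
import math

# Figures from the article: dollars per genome, roughly a decade apart.
cost_start, cost_now = 2.7e9, 5e3
years = 10

fold_drop = cost_start / cost_now               # ~540,000-fold
halving_time = years / math.log2(fold_drop)     # years per halving of cost
print(f"{fold_drop:,.0f}-fold drop; cost halves every "
      f"{halving_time * 12:.0f} months")        # ~6 months
```

A halving time of about six months is far faster than Moore’s law, which is why sequencing went from moonshot to routine lab work in a single decade.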

The next ten years


Nanotechnology

There is increasing optimism that nanotechnology applied to medicine and dentistry will bring significant advances in the diagnosis, treatment, and prevention of disease. Many researchers believe scientific devices dwarfed by dust mites may one day be capable of grand biomedical miracles.

Donald Eigler is renowned for his breakthrough work in the precise manipulation of matter at the atomic level. In 1989, he spelled the letters IBM using 35 carefully manipulated individual xenon atoms. He imagines one day “hijacking the brilliant mechanisms of biology” to create functional non-biological nanosystems. “In my dreams I can imagine some environmentally safe virus, which, by design, manufactures and spits out a 64-bit adder. We then just flow the virus’s effluent over our chips and have the adders attach in just the right places. That’s pretty far-fetched stuff, but I think it less far-fetched than Feynman in ’59.”

Angela Belcher is widely known for her work on evolving new materials for energy, electronics, and the environment. The W. M. Keck Professor of Energy, Materials Science & Engineering and Biological Engineering at the Massachusetts Institute of Technology, Belcher believes the big impact of nanotechnology and nanoscience will be in manufacturing — specifically clean manufacturing, with new routes to the synthesis of materials, less waste, and self-assembling materials.

“It’s happening right now, if you look at the manufacturing of certain materials for, say, batteries for vehicles, which is based on nanostructuring of materials and getting the right combination of materials together at the nanoscale. Imagine what a big impact that could have in the environment in terms of reducing fossil fuels. So clean manufacturing is one area where I think we will definitely see advances in the next 10 years or so.”

David Awschalom is a professor of physics and electrical and computer engineering at the University of California, Santa Barbara. A pioneer in the field of semiconductor spintronics, Awschalom would like to see the emergence of genuine quantum technology in the next decade or two. “I’m thinking about possible multifunctional systems that combine logic, storage, communication as powerful quantum objects based on single particles in nature. And whether this is rooted in a biological system, or a chemical system, or a solid state system may not matter and may lead to revolutionary applications in technology, medicine, energy, or other areas.”


Graphene

ZME Science has never backed down from praising graphene, the one-atom-thick carbon allotrope arranged in a hexagonal lattice — and for good reason, too. Here are just a few highlights we’ve reported: it can repair itself; it’s the thinnest compound known to us; the lightest material (with 1 square meter coming in at around 0.77 milligrams); the strongest compound discovered (between 100-300 times stronger than steel and with a tensile stiffness of 150,000,000 psi); the best conductor of heat at room temperature; and the best conductor of electricity (studies have shown electron mobility at values of more than 15,000 cm2·V−1·s−1). It can be used to make anything, ranging from aircraft, to bulletproof vests ten times more protective than steel, to fuel cells. It can also be turned into an anti-cancer agent. Most of all, however, its transformative potential is greatest in the field of electronics, where it could replace poor old silicon, which is being pushed to its limits by Moore’s law.

Reading all this, it’s easy to hail graphene as the wonder material of the new age of technology that is to come. So, what’s next? Manufacturing, of course. The biggest hurdle scientists are currently facing is producing bulk graphene that is pure enough for industrial applications at a reasonable price. Once this is settled, who knows what will happen.

Mars Colony

After Neil Armstrong’s historic moonwalk, the world grew drunk with dreams of conquering space. You’ve probably seen or heard ‘prophecies’ made in those days about what the world would look like in the year 2000. But no, we don’t have moon bases, flying cars, or a cure for cancer — yet.

In time, interest in manned space exploration dwindled, something that has unfortunately been reflected in NASA’s present budget. Progress has still been made, albeit not at the pace some might have liked. The International Space Station is a fantastic collaborative effort now nearing two decades of continuous manned operation. Only two years ago, NASA landed the Curiosity rover, which is currently roaming the Red Planet and relaying startling facts about our neighbor. By all signs, humans will walk on Mars, and when this happens, as with Armstrong before, a rejuvenated wave of enthusiasm for space exploration will ripple through society. Ultimately, this will be consolidated with a manned outpost on Mars. I know what you must be thinking, but if we’re to lend our ears to NASA officials, this target isn’t that far off. By all accounts, it will most likely happen during your lifetime.

NASA’s powerful Space Launch System rocket is slated to become operational in 2018, testing new abilities for space exploration, such as a planned manned landing on an asteroid in 2025. Human missions to Mars will rely on Orion and an evolved version of SLS that will be the most powerful launch vehicle ever flown. Hopefully, NASA will fly astronauts to Mars (marstronauts?) sometime during the 2030s. Don’t get your hopes up too much for Mars One, however.

Wireless electricity

We’ve known about the possibility for more than a century, most famously demonstrated by the great Tesla during his lectures. The scientist would hang a light bulb in the air and it would light up — all without any wires! The audience was dazzled every time. But this wasn’t a parlor trick, just a matter of current transferred by induction.

Basically, Tesla relied on sets of huge coils that generated a magnetic field, which in turn induced a current in the light bulb. Voila! In the future, wireless electricity will be accessible to anyone — as easy as WiFi is today. Smartphones will charge in your pocket as you wander around, televisions will flicker with no wires attached, and electric cars will refuel while sitting in the driveway. In fact, the technology is already in place; what’s required is a huge infrastructure leap. Essentially, wirelessly charged devices need to be compatible with the charging stations, and this requires a lot of effort from both the charging suppliers and the device manufacturers. We’re getting there, though.
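The physics behind this is Faraday’s law of induction: a changing magnetic flux through the receiver coil induces a voltage in it. A rough Python sketch with purely illustrative numbers, not the specifications of any real charger:

```python
import math

# Hypothetical receiver coil in an oscillating field (illustrative values).
N = 200          # number of turns
radius = 0.02    # coil radius in meters
B_peak = 1e-4    # peak magnetic field from the transmitter, in tesla
freq = 100e3     # field oscillation frequency in Hz

area = math.pi * radius**2
# Faraday's law: emf = -N * dPhi/dt. For B = B_peak * sin(2*pi*f*t),
# the peak emf is N * B_peak * A * 2*pi*f.
emf_peak = N * B_peak * area * 2 * math.pi * freq
print(f"peak induced EMF: {emf_peak:.1f} V")  # ~15.8 V
```

Note how the voltage scales with turns, coil area, field strength, and frequency: this is why practical systems use high-frequency resonant coupling rather than Tesla’s brute-force giant coils.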

Nuclear Fusion

Nuclear fusion is essentially the opposite of nuclear fission. In fission, a heavy nucleus is split into smaller nuclei. With fusion, lighter nuclei are fused into a heavier nucleus.

The fusion process is the reaction that powers the Sun. There, in a series of nuclear reactions, four hydrogen-1 nuclei are fused into a single helium-4 nucleus, releasing a tremendous amount of energy. The goal of scientists for the last 50 years has been the controlled release of energy from a fusion reaction. If the energy from a fusion reaction can be released slowly, it can be used to produce electricity in virtually unlimited quantities. Furthermore, there are no waste materials to deal with or contaminants to harm the atmosphere. To achieve the nuclear fusion dream, scientists need to overcome three main constraints:

  • temperature (you need to put in a lot of energy to kick off fusion; hydrogen nuclei need to be heated to around 40,000,000 kelvin — that’s hotter than the sun!)
  • time (charged nuclei must be held together close enough and long enough for the fusion reaction to start)
  • containment (at those temperatures the fuel is a plasma, and any solid container would vaporize on contact, so confinement is a major challenge).
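While the Sun burns hydrogen-1, most reactor designs target the easier deuterium-tritium reaction. Its energy yield can be checked directly from the mass defect and E = mc², using standard atomic masses:

```python
# Energy released by a single deuterium-tritium fusion, via E = mc^2.
# Masses in unified atomic mass units (u); 1 u = 931.494 MeV/c^2.
m_D, m_T = 2.014102, 3.016049      # deuterium, tritium
m_He4, m_n = 4.002602, 1.008665    # helium-4, neutron

mass_defect = (m_D + m_T) - (m_He4 + m_n)   # mass converted to energy
energy_MeV = mass_defect * 931.494
print(f"{energy_MeV:.1f} MeV per reaction")  # ~17.6 MeV
```

Roughly 17.6 MeV per reaction is millions of times the energy of a typical chemical bond, which is why a small amount of fusion fuel goes such a long way.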

Though other projects exist elsewhere, nuclear fusion today is championed by the International Thermonuclear Experimental Reactor (ITER) project, conceived in 1985, when the Soviet Union proposed to the U.S. that the two countries work together to explore the peaceful applications of nuclear fusion. Since then, ITER has grown into a 35-country project with an estimated $50 billion price tag.

Key structures are still being built at ITER; when ready, the reactor will stand 100 feet tall, weigh 23,000 tons, and house a core hotter than the sun. If successfully turned on, ITER could solve the world’s energy problems for the foreseeable future and help save the planet from environmental catastrophe.

Newly discovered star’s chemistry puzzles researchers

A team of Argentinian astronomers, peering up at the night sky from the Astronomical Observatory of Córdoba, has found a young, lithium-rich giant star, designated KIC 9821622. Drawing on data from the GRACES high-resolution spectrograph, they were able to determine the star’s mass, radius, and age, as well as the chemical abundances of 23 elements in the celestial body. The results will be published in the December issue of the journal Astronomy & Astrophysics.

This image from the New Technology Telescope at ESO’s La Silla Observatory shows Nova Centauri 2013 in July 2015 as the brightest star in the center of the picture. This was more than eighteen months after the initial explosive outburst. This nova was the first in which evidence of lithium has been found.
Image via Phys.org.

The analysis of this star was performed during an on-sky test of the GRACES system (Gemini Remote Access to CFHT ESPaDOnS Spectrograph) conducted in July of this year. The team’s report shows that KIC 9821622 is an intermediate-mass giant star weighing in at about 1.64 times our Sun’s mass, located some 5,300 light-years from Earth in the Kepler field. But what really caught scientists’ interest is the star’s chemical composition: it is very rich in lithium, and a high concentration of this element is very rarely seen in stars — it’s estimated that only 1-2 percent of all known stars boast comparable levels.

Lithium is the lightest metal and the least dense solid element in the periodic table. With an atomic number of just 3, you’d expect the element to be abundant in stars, as a logical progression from hydrogen (Z=1) and helium (Z=2) fusion, but it’s actually very rarely seen in stars large and hot enough to sustain fusion — lithium is consumed inside them. At the immense temperatures required for hydrogen fusion, lithium atoms collide with protons to form helium in a process known as lithium burning, and the element isn’t re-created afterwards. In fact, a high lithium concentration is usually a good indicator that a celestial body is substellar, such as a brown dwarf — the basis of the lithium test proposed by Rafael Rebolo and colleagues.

This is why scientists are quite puzzled by the find. One theory the authors propose is that the high lithium concentration comes from a fresh batch of the element synthesized near the luminosity bump. Another possibility they’re looking into is that KIC 9821622 gained its lithium by accreting planets or brown dwarfs and hasn’t had time to “digest” the element yet.

However, neither theory is more than speculation at this point. While the second is a bit shaky — so far, we’ve found no trace of any orbiting planet or binary companion near KIC 9821622, let alone one that could have provided the required amount of lithium — the team feels this is where we should focus our efforts.

“Lithium enhancement in giant stars can be the result of the engulfment of a brown dwarf or planet,” the paper reads.

It’s very important that we understand how and where KIC 9821622 got its lithium, the authors say. They underline the need for further study of the star, as lithium abundance in stellar photospheres is an important tool in our understanding of stellar evolution as a whole.

“To advance in our understanding of these rare objects, it is essential not only to continue the search of lithium rich giants, but also to derive their chemical abundances and to unambiguously establish their evolutionary status,” they write in the paper.

The authors recommend that the star be observed at longer wavelengths than those employed by the GRACES system, such as mid-infrared or submillimeter, as they believe the accretion of a planet would leave behind a shell of ejected material detectable as infrared excess.

Aside from being lithium-rich, KIC 9821622 also shows high abundances of carbon, nitrogen, and oxygen. The team also derived the star’s precise spectroscopic fundamental parameters, including effective temperature, surface gravity, metallicity, and microturbulent velocity.

“KIC 9821622 is certainly a unique and interesting object that deserves further scrutiny to reveal the real mechanism behind the observed anomalous abundances. In this sense, high-resolution chemical analysis of more of these young giants might help to understand their origin,” the paper concludes.

Advances in magnet technology could bring cheaper, modular fusion reactors from sci-fi to sci-reality in less than a decade

Advances in magnet technology have allowed MIT scientists to design a cheaper, more compact, modular fusion reactor efficient enough for commercial use. The era of clean, practically inexhaustible energy may be upon us in as little as a decade, the scientists report.

MIT PhD candidate Brandon Sorbom holds REBCO superconducting tapes (left), enabling technology behind the ARC reactor.
When cooled to liquid nitrogen temperature, the superconducting tape can carry as much current as the large copper conductor on the right, enabling the construction of extremely high‑field magnets, which consume minimal amounts of power.
Photo: Jose‑Luis Olivares/MIT

The team used newly available rare-earth barium copper oxide (REBCO) superconducting tapes to produce high-magnetic field coils.

“[The implementation of these magnets] just ripples through the whole design,” says Dennis Whyte, professor of Nuclear Science and Engineering and director of MIT’s Plasma Science and Fusion Center. “It changes the whole thing.”

Bigger bang for your magnet

But how do magnets help us build a mini-star? Well, fusion reactors generate electricity by using the same physical process that powers stars: two lighter atoms are mushed together to create heavier elements. And just like natural stars, reactors generate immensely hot plasma — a state of matter similar to an electrically charged gas.

The stronger magnets, and the stronger magnetic fields they generate, allow the plasma to be contained in a much smaller space than previously possible. This translates to fewer materials, less space, and fewer hours of work needed to build the reactor — in short, a cheaper, more affordable machine.

The proposed reactor, which uses a tokamak (donut-ish) geometry, is described in a paper in the journal Fusion Engineering and Design, co-authored by Whyte, PhD candidate Brandon Sorbom, and 11 others at MIT.

A cutaway view of the proposed ARC reactor. Thanks to powerful new magnet technology, the much smaller, less-expensive ARC reactor would deliver the same power output as a much larger reactor.
Illustration credit: the MIT ARC team.

Power plant prototype

The basic concept of the reactor and its associated elements rely on well-tested and proven principles that have been developed over decades of study.

The new reactor is intended to allow basic research on fusion and to potentially function as a prototype power plant capable of producing significant quantities of power.

“The much higher magnetic field,” Sorbom says, “allows you to achieve much higher performance.”

The reactor fuses hydrogen into helium, releasing enormous amounts of energy. To sustain the reaction and make it energy-positive (releasing more energy than it consumes), the plasma has to be heated to temperatures hotter than the cores of stars. This is where the new magnets come in handy: they trap the heated particles in the center of the tokamak.

Cutaway of the inner workings of the ITER reactor. Structurally, the tokamak is not much different; the increase in power comes from the magnets. Notice the solid blanket covering the reactor.
Image via Nature.

“Any increase in the magnetic field gives you a huge win,” Sorbom says.

This is because, in a fusion reactor, the strength of the magnetic field has a dramatic effect on the reaction: available fusion power scales with the fourth power of the magnetic field. Doubling the field would thus produce a 16-fold increase in the power generated by the device.
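That scaling relation is easy to play with. A one-liner in Python, treating the quoted fourth-power rule as given (the real dependence on plasma parameters is more involved):

```python
def fusion_power_ratio(field_ratio: float) -> float:
    """Fusion power scales roughly as the fourth power of field strength."""
    return field_ratio ** 4

print(fusion_power_ratio(2.0))   # 16.0: doubling the field -> 16x the power
print(fusion_power_ratio(1.78))  # ~10x, roughly the gain the study cites
```

Even a modest field increase pays off enormously, which is the whole argument for building with the new REBCO superconducting magnets.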

Ten times more power

The new magnets don’t quite double the field strength, but they are strong enough to increase the reactor’s power generation tenfold compared with previous superconducting technology, the study says. This opens the path for a series of improvements to the standard reactor design.

The world’s most powerful planned fusion reactor, a huge device under construction in France called ITER, is expected to cost around US$40 billion. It was designed and put into production before the new superconductors became available. Sorbom and the MIT team believe their new design could produce about the same power as the French reactor while being only half the diameter, costing a fraction of the price, and taking less time to construct.

But despite the difference in size and magnetic field strength, the proposed reactor, called ARC, is based on “exactly the same physics” as ITER, Whyte says.

“We’re not extrapolating to some brand-new regime,” he adds.

The team also plans to include a method for removing the fusion core from the reactor without having to dismantle the entire device. This would lend itself well to research aimed at further improving the system by testing different materials or core designs.

In addition, as with ITER, the new superconducting magnets would enable the reactor to operate in a sustained way, producing a steady power output — unlike today’s experimental reactors, which can only operate for a few seconds at a time before their copper coils overheat.

Molten core and liquid cover

Another key breakthrough in the reactor’s design is that it replaces the blanket of solid materials surrounding the fusion chamber with a liquid material that can be easily circulated and replaced. This curbs the operating costs associated with replacing materials that degrade over time.

“It’s an extremely harsh environment for [solid] materials,” Whyte says, so replacing those materials with a liquid could be a major advantage.

In its current state, the reactor should be capable of producing about three times as much electricity as is needed to keep the reaction going. Sorbom says the design could probably be fine-tuned to crank that up to about five or six times as much. So far, no completed fusion reactor has produced net energy (they all use more juice than they make), so the kind of net energy production ARC is expected to deliver would be a major breakthrough in fusion technology, the team says. They estimate that the design should be able to produce electricity for about 100,000 people.
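To make those gain figures concrete, here is a minimal sketch of what a gain of 3 versus 5–6 means in net terms; the 100 MW input figure is a made-up placeholder for illustration, not a number from the ARC design.

```python
# Net output for a reactor with gain Q (electricity out per unit of
# electricity needed to sustain the reaction): net = input * (Q - 1).
# The 100 MW input below is purely illustrative.

def net_power_mw(input_mw: float, q: float) -> float:
    """Net electrical output for a given input power and gain factor Q."""
    return input_mw * (q - 1)

print(net_power_mw(100, 3))  # gain of ~3 -> 200 MW net
print(net_power_mw(100, 6))  # a tuned gain of ~6 -> 500 MW net
```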

“Fusion energy is certain to be the most important source of electricity on earth in the 22nd century, but we need it much sooner than that to avoid catastrophic global warming,” says David Kingham, CEO of Tokamak Energy Ltd. in the UK, who was not connected with this research. “This paper shows a good way to make quicker progress,” he says.

The MIT research, Kingham says, “shows that going to higher magnetic fields, an MIT speciality, can lead to much smaller (and hence cheaper and quicker-to-build) devices.” The work is of “exceptional quality,” he says; “the next step … would be to refine the design and work out more of the engineering details, but already the work should be catching the attention of policy makers, philanthropists and private investors.”

The research was supported by the U.S. Department of Energy and the National Science Foundation.

The preamplifiers of the National Ignition Facility. The unified lasers deliver 1.8 megajoules of energy and 500 terawatts of power — 1,000 times more than the United States uses at any one moment. (Credit: Damien Jemison/LLNL)

Closer than ever to nuclear fusion, according to physicists

Physicists have been dreaming of achieving controlled nuclear fusion for decades, and year by year we’ve been getting closer to turning it into reality. A recent paper published in the journal Physics of Plasmas reports improvements in the design of an experimental setup capable of igniting a self-sustained fusion reaction with high yields of energy. Researchers at the National Ignition Facility (NIF) say they are currently tackling one big obstacle that stands in the way of fusion ignition.

Nuclear fusion is a nuclear reaction in which two or more atomic nuclei collide at very high speeds and energies and fuse together, creating a new, heavier nucleus. At the dawn of the Universe there was essentially only hydrogen, and it is through fusion that all the other elements appeared. Fusing elements lighter than iron releases energy, and lots of it; fusing elements heavier than iron absorbs energy instead. This is why fission, the reverse of fusion, splits very heavy elements to release energy.
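As a concrete example of that energy release, the deuterium-tritium reaction (the one most reactor designs target) can be checked from published atomic masses via E = Δm·c²:

```python
# D + T -> He-4 + n releases about 17.6 MeV per reaction, which follows
# from the mass defect Δm between reactants and products: E = Δm * c^2.

AMU_KG = 1.66053906660e-27    # kg per atomic mass unit (CODATA)
C = 299_792_458.0             # speed of light, m/s
J_PER_MEV = 1.602176634e-13   # joules per MeV

# Atomic masses in u: D = 2.014102, T = 3.016049, He-4 = 4.002602, n = 1.008665
dm = (2.014102 + 3.016049) - (4.002602 + 1.008665)
energy_mev = dm * AMU_KG * C**2 / J_PER_MEV
print(f"{energy_mev:.1f} MeV")  # -> 17.6 MeV
```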

Nuclear fusion, however, requires tremendous amounts of energy to kick-start. The challenge lies in designing reactors capable of producing more energy than goes into igniting the reaction. Even then, there are technical challenges in achieving the highly stable, precisely directed implosion required for ignition. One such obstacle is outlined by the NIF researchers in their report.

Closer to a fusion dream

Schematic of NIF ignition target and capsule (credit: M. J. Edwards et al., Physics of Plasmas)

To achieve ignition, NIF researchers fire 192 laser beams simultaneously inside a specially designed cryogenic hollow chamber called a hohlraum (German for “hollow room”), about the size of a pencil eraser. Together, the combined lasers deliver 1.8 megajoules of energy and 500 terawatts of power – 1,000 times more than the United States uses at any one moment – inside the hohlraum in billionth-of-a-second pulses. All this power is directed at a ball-bearing-size capsule containing two hydrogen isotopes, deuterium and tritium (D-T), inside the hollow chamber, creating a sort of “X-ray oven” that implodes the isotope capsule to temperatures and pressures similar to those found at the center of the Sun.
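A quick sanity check on the quoted figures: power is energy divided by pulse duration, so 1.8 MJ at a peak of 500 TW implies pulses lasting a few nanoseconds.

```python
# Sanity-check the quoted NIF figures: peak power P = E / t, so the
# pulse duration implied by 1.8 MJ at 500 TW is t = E / P.

energy_j = 1.8e6        # 1.8 megajoules of delivered laser energy
peak_power_w = 500e12   # 500 terawatts of peak power

pulse_s = energy_j / peak_power_w
print(f"{pulse_s * 1e9:.1f} ns")  # -> 3.6 ns
```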

“What we want to do is use the X-rays to blast away the outer layer of the capsule in a very controlled manner, so that the D-T pellet is compressed to just the right conditions to initiate the fusion reaction,” explained John Edwards, NIF associate director for inertial confinement fusion and high-energy-density science. “In our new review article, we report that the NIF has met many of the requirements believed necessary to achieve ignition—sufficient X-ray intensity in the hohlraum, accurate energy delivery to the target and desired levels of compression—but that at least one major hurdle remains to be overcome, the premature breaking apart of the capsule.”

The NIF researchers used monitoring tools to diagnose the capsule breaking step by step.

 “In some ignition tests, we measured the scattering of neutrons released and found different strength signals at different spots around the D-T capsule,” Edwards said.

“This indicates that the shell’s surface is not uniformly smooth and that in some places, it’s thinner and weaker than in others. In other tests, the spectrum of X-rays emitted indicated that the D-T fuel and capsule were mixing too much — the results of hydrodynamic instability — and that can quench the ignition process.”

The NIF scientists are now concentrating their efforts on determining the exact nature of this instability and mitigating it. This is just one of the big obstacles they face, and many major milestones remain to be reached, but the advances reported so far are promising – we’re getting there: a tremendously large source of clean, safe energy.

A concept image of a spacecraft powered by a fusion-driven rocket. In this image, the crew would be in the forward-most chamber. Solar panels on the sides would collect energy to initiate the process that creates fusion. (c) University of Washington

Fusion rocket a step closer to taking man on Mars in 30 days

Billions of dollars and decades’ worth of research have been invested in fusion propulsion technology, in the hope that one day we might breach current spaceflight limitations, which offer little hope of straying too far from our planet. Researchers at the University of Washington have recently made great strides in this respect, successfully testing each stage of their fusion rocket in the lab. If they manage to combine all the stages to work together, spaceflight will be in for nothing short of a revolution. For instance, using a fusion-powered rocket, a trip to Mars would take only 30 days, compared to years using current technology.

The team of scientists, led by John Slough, has been working for the past few years on a unique approach to harnessing nuclear fusion, and so far their tests of the individual stages have proven successful in the lab. The next step is to combine these isolated stages into a working fusion rocket, but to achieve this they need to generate more power than the fusion process requires to kick-start – a feat that is a lot more difficult than it sounds, since it has never been achieved to this day, despite 60 years of research and billions invested.

What makes fusion rockets particularly appealing is their immense energy density – fusion fuel is about 7 million times more energy-dense than conventional rocket fuel. This means you can build not only more powerful rockets – powerful enough to make the trip to Mars in a mere 30 days at 200,000 miles per hour – but lighter ones as well, significantly reducing costs.
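The 30-day figure roughly checks out as simple arithmetic, assuming a constant cruise speed and an average Earth-Mars distance of about 225 million km – both loose simplifications that ignore acceleration phases and orbital mechanics:

```python
# Back-of-the-envelope check of the "30 days to Mars" figure, assuming a
# constant cruise speed (no acceleration/deceleration phases) and an
# average Earth-Mars distance of ~225 million km.

speed_mph = 200_000
speed_kmh = speed_mph * 1.609344   # mph -> km/h
distance_km = 225e6                # average Earth-Mars distance, km

trip_days = distance_km / speed_kmh / 24
print(f"{trip_days:.0f} days")     # -> 29 days, roughly a month
```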

“Using existing rocket fuels, it’s nearly impossible for humans to explore much beyond Earth. We are hoping to give us a much more powerful source of energy in space that could eventually lead to making interplanetary travel commonplace,” Slough was quoted by the university as saying.

The fusion driven rocket test chamber at the UW Plasma Dynamics Lab in Redmond. The green vacuum chamber is surrounded by two large, high-strength aluminum magnets. These magnets are powered by energy-storage capacitors through the many cables connected to them. (c) University of Washington

New age of space travel once fusion rockets kick in

The technology basically relies on a plasma encased in its own magnetic field, which produces nuclear fusion when compressed, similar to how a diesel engine compresses its fuel to produce combustion. The plasma is deuterium-tritium (hydrogen isotopes), surrounded by metal rings of lithium. When the plasma reaches a certain point in the combustion chamber, a magnetic field acts on the rings and causes them to compress the plasma. The force of the implosion is so immense that the rings compress the plasma into nuclear fusion. The metal rings would then be ejected out of the rocket at 67,000 mph (108,000 km/h), generating thrust in a process repeated every 10 seconds, allowing the rocket to accelerate to somewhere around 200,000 miles per hour. That is the plan; so far, only the individual stages have been tested separately.
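One way to see why the ejected rings make an efficient propellant is to convert the quoted exhaust speed into a specific impulse (Isp = v_e / g0); the comparison with chemical engines here is a standard rocketry benchmark, not a figure from the UW team.

```python
# Specific impulse implied by the quoted 67,000 mph exhaust speed.
# For comparison, the best chemical (hydrogen-oxygen) engines reach ~450 s.

G0 = 9.80665                   # standard gravity, m/s^2

v_exhaust = 67_000 * 0.44704   # mph -> m/s (about 30 km/s)
isp_s = v_exhaust / G0

print(f"Isp = {isp_s:.0f} s")  # -> Isp = 3054 s
```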

Their work is cut out for them since, again, they need to solve the energy problem: producing more energy than is pumped in to kick-start the reaction, and then maintaining it. It’s the latter that causes so many hurdles; starting nuclear fusion isn’t that difficult, but sustaining it and then converting the thermal energy into electrical current is something that hasn’t been achieved yet. Still, the University of Washington researchers’ work seems promising enough.

Most likely, however, they won’t be able to finish a working fusion rocket any time soon, if ever – certainly not in time for Dennis Tito’s 2018 manned expedition to Mars, slated to last 500 days. For more on how fusion rockets work, check out the video below.

The project is funded through NASA’s Innovative Advanced Concepts Program. Last month at a symposium, Slough and his team from MSNW, of which he is president, presented their mission analysis for a trip to Mars, along with detailed computer modeling and initial experimental results.