Category Archives: Genetics

Scientists discover how genes from our parents may shape our behavior

Credit: Pixabay.

One major point of contention among psychologists has always been the nature versus nurture debate — the extent to which particular aspects of our behavior are a product of either inherited (i.e. genetic) or acquired (i.e. learned) influences. In a new study on mice, researchers at the University of Utah Health focused on the former, showing that genes inherited from each parent have their own impact on hormones and important neurotransmitters that regulate our mood and behavior.

Intriguingly, some of these genetic influences are sex-specific. For instance, the scientists found that genes inherited from mom can shape the decisions and actions of sons, while genes from dad have biased control over daughters.

I got it from my Mom and Dad

Like chromosomes, genes come in pairs. Mom and dad each carry two copies, or alleles, of each of their genes, but each parent passes only one copy of each along to the child. These genes determine many traits, such as hair and skin color.
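The mechanics of that inheritance can be pictured as a classic Punnett square. Here is a minimal sketch, with purely illustrative allele names (the function and example are hypothetical, not from the study):

```python
from itertools import product

# Each parent carries two copies (alleles) of a gene and passes exactly
# one to the child. Enumerating every combination gives the classic
# Punnett square of possible child genotypes.
def punnett(mom: str, dad: str) -> list[str]:
    """Return all possible child genotypes as sorted allele pairs."""
    return ["".join(sorted(pair)) for pair in product(mom, dad)]

# Illustrative example: both parents carry one dominant (A) and one
# recessive (a) allele for some trait.
print(punnett("Aa", "Aa"))  # ['AA', 'Aa', 'Aa', 'aa']
```

Each of the four combinations is equally likely, which is why two carrier parents have a one-in-four chance of a child with two recessive copies.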

But it’s not only our outward appearance that is influenced by genes. In a new study, researchers found that tyrosine hydroxylase and dopa decarboxylase — two genes heavily involved in the synthesis of hormones and neurotransmitters like dopamine, serotonin, norepinephrine, and epinephrine — are expressed differently depending on whether the gene copy was inherited from the mother or the father. These chemicals play a crucial role in regulating an array of important functions, from mood to movement.

The genes are also involved in the production of the hormone adrenaline by the adrenal gland, which triggers the “fight or flight” response when we encounter danger or stress. Together, these pathways form the brain-adrenal axis.

“The brain-adrenal axis controls decision making, stress responses, and the release of adrenaline, sometimes called the fight or flight response. Our study shows how mom’s and dad’s genes control this axis in their offspring and affect adrenaline release. Mom’s control the brain and dad’s control the adrenal gland,” Christopher Gregg, principal investigator and associate professor in the Department of Neurobiology at the University of Utah Health, told ZME Science.

In order to investigate how inherited gene copies introduce maternal or paternal biases in the brain-adrenal axis, the researchers genetically modified mice to attach a fluorescent tag to the dopa decarboxylase enzyme. Using a microscope, they could tell if a gene was inherited from the mother (colored red) or from the father (colored blue).

An investigation of the entire mouse brain revealed 11 regions containing groups of neurons that use only mom’s copy of the dopa decarboxylase gene. Conversely, in the adrenal gland, there were groups of cells that exclusively expressed the gene copy inherited from dad.

These findings immediately raised a question: could our behavior be influenced by these genetic biases? To answer it, the researchers analyzed mice with mutations that switched off one parent’s copy of the gene in a select group of cells while the rodents were foraging for food.

The mice were left to explore freely so that external influences were kept to a minimum. Their behavior had to be as natural as possible as they encountered various obstacles, which prompted them to either take risks or retreat to safety before resuming their search for food.

These movements and behaviors look random and chaotic, but a machine learning algorithm developed by the researchers was able to pick up subtle but significant patterns. When the foraging patterns were broken down into modules, the researchers were able to identify behavioral differences associated with each parent’s copy of the dopa decarboxylase gene.

“We have faced a lot of skepticism from the scientific community. The way we study decision-making by using machine learning to detect patterns was hard for scientists to understand. The community was surprised to find that such well-studied genes (Th and Ddc) express the Mum and Dad’s gene copies in different brain and adrenal cells. We had to do a lot of work to show how strong the evidence is for our discovery,” Gregg said.

Christopher Gregg pictured. Credit: Jen Pilgreen.

Gregg had been interested in how biological factors influence our decisions since he first came across Daniel Kahneman’s work in behavioral economics while he was still a postdoc. In the 1970s, Kahneman and Amos Tversky introduced the term ‘cognitive bias’ to describe our systematic but flawed patterns of responses to judgment and decision problems.

For instance, the gambler’s fallacy makes us believe that if a coin has landed heads up five times in a row, it’s much more likely to land tails up on the sixth flip. The odds are, in fact, still 50-50. One of the most pervasive and damaging biases is confirmation bias, which leads us to look for evidence confirming what we already think or suspect. If you’re disgruntled by the current political divides across the world, where each side seems unable to concede that the other might be right about some things, you can often point the finger at confirmation bias. There are many other biases, though, with Wikipedia listing at least 185 entries.
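You can check that 50-50 claim yourself with a quick simulation (a toy sketch, not part of the study): flip a fair virtual coin many times and, every time five heads occur in a row, record what the very next flip does.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def heads_rate_after_streak(trials: int = 200_000, streak: int = 5) -> float:
    """Flip a fair coin `trials` times; whenever the previous `streak`
    flips were all heads, record the outcome of the next flip."""
    hits = heads_after = run = 0
    for _ in range(trials):
        flip = random.random() < 0.5  # True = heads
        if run >= streak:             # this is "the flip after 5 heads in a row"
            hits += 1
            heads_after += flip
        run = run + 1 if flip else 0
    return heads_after / hits

# The rate hovers around 0.5: the coin has no memory of past streaks.
print(round(heads_rate_after_streak(), 2))
```

However strongly intuition insists tails is “due”, the post-streak heads rate stays near one half.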

Now, Gregg seems convinced that these cognitive biases and some decision processes are deeply rooted in our biology, as well as that of other mammals. And with more research, it may be possible to modify maladaptive behaviors in a clinical setting, with potential new treatments for conditions like anxiety or depression.

The main caveat, however, is that all of this work has been performed on mice. Gregg and colleagues now want to develop and apply a new artificial intelligence platform called Storyline Health to human decision-making and behavior. They expect to discover genetic factors that control our behavior and cognition in a similar way to rodents.

“I am very excited about this new area that emerges from our work and merges decision making, machine learning and genetics. We are going to discover a lot of important new things about the factors that shape our decisions,” he said.

The findings appeared in the journal Cell Reports.

World’s tiniest antenna is made from DNA

Illustration of the fluorescent-based DNA antennae. Credit: Caitlin Monney.

Chemists at the Université de Montréal have devised a nano-scale antenna using synthetic DNA to monitor structural changes in proteins in real-time. It receives light in one color and, depending on the interaction with the protein it senses, transmits light back in a different color, which can be detected. The technology could prove useful in drug discovery and the development of new nanotechnologies.

DNA contains all the instructions needed for an organism to develop, survive, and reproduce. The blueprint of life is also extremely versatile thanks to the self-assembly of DNA building blocks.

Using short, synthetic strands of DNA that work like interlocking Lego bricks, scientists can make all sorts of nano-structures for more sophisticated applications than ever possible before. These include “smart” medical devices that target drugs selectively to disease sites, programmable imaging probes, templates for precisely arranging inorganic materials in the manufacturing of next-generation computer circuits, and more.

Inspired by these properties, the Canadian researchers led by chemistry professor Alexis Vallée-Bélisle have devised a DNA-based fluorescent nanoantenna that can characterize the function of proteins.

“Like a two-way radio that can both receive and transmit radio waves, the fluorescent nanoantenna receives light in one color, or wavelength, and depending on the protein movement it senses, then transmits light back in another color, which we can detect,” said Professor Vallée-Bélisle.

The receiver of the nanoantenna reacts chemically with molecules on the surface of the target proteins. The 5-nanometer-long antenna produces a distinct signal when the protein is performing a certain biological function, which can be detected based on the light released by the DNA structure.

“For example, we were able to detect, in real-time and for the first time, the function of the enzyme alkaline phosphatase with a variety of biological molecules and drugs,” said Scott Harroun, the study’s first author. “This enzyme has been implicated in many diseases, including various cancers and intestinal inflammation.”

These nanoantennas can be easily tweaked to optimize their size and performance for a range of functions. For instance, it’s possible to attach a fluorescent molecule to the synthesized DNA and then attach the entire setup to an enzyme in order to probe its biological function. Furthermore, these crafty DNA-based machines are ready to use by virtually any research lab across the world. Vallée-Bélisle is now setting up a startup to bring the product to market.

“Perhaps what we are most excited by is the realization that many labs around the world, equipped with a conventional spectrofluorometer, could readily employ these nanoantennas to study their favorite protein, such as to identify new drugs or to develop new nanotechnologies,” said Vallée-Bélisle.

The findings appeared in the journal Nature Methods.

New vaccine could remove “zombie” cells that cause aging

Researchers in Japan have developed a new vaccine that they argue could remove senescent cells, also known as zombie cells, which are associated with aging and several diseases. Mice administered the vaccine showed decreased levels of zombie cells: the animals produced antibodies that attached to the cells, which were then removed.

Image credit: Pixabay (Creative Commons).

Professor Toru Minamino of Juntendo University and a team of researchers identified a protein in senescent cells in humans and mice, then created a peptide vaccine that targets it. When applied, the body creates antibodies that attach themselves to the cells, which are then removed by white blood cells that adhere to these antibodies, Minamino told The Japan Times.

The researchers first administered the new vaccine to mice with arterial stiffness, reporting positive results. Many of the accumulated zombie cells were removed, and the areas affected by the disease shrank. They then applied the vaccine to older mice, in which aging progressed more slowly than in mice that hadn’t been vaccinated.

“Senolytic vaccination also improved normal and pathological phenotypes associated with aging, and extended the male lifespan of progeroid mice,” the researchers wrote in their paper in the journal Nature Aging, reporting on the results. “Our results suggest that vaccination targeting seno-antigens could be a potential strategy for new senolytic therapies.”

Understanding zombie cells

A wide array of stress factors can harm our body’s cells. Ideally, damaged cells self-destruct through a process called apoptosis and are cleared away by the immune system. But as we get older, the body isn’t as effective at removing dysfunctional cells. This can compound an already weakened immune system and less efficient biological processes, triggering disease.

Over the years, researchers have been exploring whether better management of senescent cells can revitalize aging tissues and increase active years of life. These cells are quite unique: they eventually stop multiplying but don’t die when they should. Instead, they continue releasing chemicals that can cause inflammation, like a moldy fruit spoiling the rest of the basket.

The older we are, the more zombie cells we have in our body. And since our immune system is less efficient, these cells accumulate and affect healthy ones. This can affect our ability to cope with illness or stress, recuperate from injuries and even learn new things, like another language – as zombie cells also degrade our brain’s cognitive functions. 

Senescent cells have been linked with a set of age-related conditions, such as cancer, stroke, Alzheimer’s disease, diabetes, cardiovascular disease, and even eyesight problems. Researchers have been studying these cells since the early 1960s, and are currently investigating a potential connection with the cytokine storms induced by COVID-19.

Back in May, researchers at the University of California, San Francisco (UCSF) reported having discovered how immune cells naturally clear the body of zombie cells. Their finding, based on laboratory experiments in mice, could open the door to new approaches and strategies to treat age-related diseases with immunotherapy, they argued.

The study behind the new vaccine was published in the journal Nature Aging. 

New gene-editing technology creates single-sex mice

A group of researchers at the Francis Crick Institute, working with the University of Kent, has used gene-editing technology to create male-only and female-only mice litters. The technology could avoid the destruction of hundreds of thousands of unwanted mice in research, where typically only males or only females are required.

Image credit: Flickr / Nick Harris.

Whether we like it or not, there’s still a great deal of research that requires animal subjects. However, this demand isn’t uniform across the sexes: for any given task, there’s usually demand for only male or only female animals, not just in scientific research but also in farming.

Laboratory studies sometimes require only animals of the sex being studied, while in farming only female animals are needed for egg production and in dairy herds. That’s why it’s a common practice for animals of the undesired sex to be culled after birth. But that could change soon.

By deactivating a gene involved in embryo development, mice can be programmed to produce embryos of only one sex from an early stage of development, the researchers explained. The approach worked with 100% accuracy in experiments, and the next step will be pilot studies, which will hopefully prove the feasibility of the method.

This could prevent millions of animals from being culled and have far-reaching implications, researchers say. It could be transformative — but it’s also a form of animal eugenics, and we shouldn’t rush into it without discussing the implications at a societal level.

“The implications of this work are potentially far-reaching when it comes to improving animal welfare, but should be considered at ethical and regulatory levels,” Peter Ellis, study author, said in a statement. “Before any use in agriculture, there would need to be extensive public conversation and debate, as well as changes to legislation.”

What’s behind this technology 

Sex chromosomes determine whether a mammal develops as male or female. Males have a Y chromosome from their father and an X chromosome from their mother, while females have two X chromosomes. In the study, the researchers found a way to deactivate a gene and prevent either XX or XY mouse embryos from developing.

This is how it works. The team embedded one half of the gene-editing tool CRISPR-Cas9, which deactivates the gene, into the father’s X or Y chromosome (depending on the sex needed) and the other half into the mother’s DNA. The gene is only deactivated when both halves of CRISPR-Cas9 are linked together, the researchers said.

“This method works as we split the genome editing process in half, between a male and female, and it is only when the two halves meet in an embryo through breeding, that it is activated. Embryos with both halves cannot develop beyond very early cell stages,” Charlotte Douglas, first author and scientist at the Crick, said in a statement. 

Surprisingly, litters from the edited mice didn’t turn out half the usual size (as you might expect with one of the sexes gone). Instead, litter sizes were around 30-40% smaller than the control litters, because mice produce more eggs than needed. This means that when only one sex is needed, fewer breeding animals would be required to produce the same number of offspring.

The offspring that do survive carry only half of the CRISPR-Cas9 elements in their genome. This prevents sex selection from being inherited by further generations – unless they are bred with an individual of the opposite sex carrying the other half. It’s a different approach from “gene-drive” methods, which spread a mutation widely through a population.

It’s not the first time something like this has been proposed. Billions of male chicks are slaughtered each year, as only females are useful for egg-laying, and researchers are developing ways to select the sex of chick embryos.

The study was published in the journal Nature Communications. 

Eugenics: how bad science was used to promote racism and ableism

Eugenics is the idea of selectively ‘improving’ humankind by allowing only specific physical and mental characteristics to exist. It focuses on systematically eradicating ‘undesirable’ physical traits and disabilities, and although it has long been discredited as a science, some of its ideas are still surprisingly prevalent in today’s society.

A Eugenics Society poster (1930s) from the Wellcome Library Eugenics Society Archive. Wikimedia Commons.

In some forms, eugenics actually has a remarkably long history. Some indigenous peoples of Brazil practiced infanticide against children born with physical abnormalities, and in ancient Greece, the philosopher Plato argued in favor of selective mating to produce a superior class. The Roman Empire and some Germanic tribes also practiced some forms of eugenics. However, eugenics didn’t truly become a large-scale idea until the 20th century.

Progress didn’t just happen in Europe

The foundation of eugenics lies in racist beliefs and ideologies — especially something called scientific racism: a pseudoscientific effort to use empirical evidence to support or justify racism.

In 1981, American paleontologist Stephen Jay Gould wrote ‘The Mismeasure of Man’, a book discussing the problems with the persistent belief in biological determinism that fed into eugenics. He gave examples of scientific racism and of how some scientists provided ‘evidence’ for the supposed superiority of white people, shaping faulty beliefs for decades. The book contains a remarkable list of horrid theories and studies whose authors insisted on ranking one race above another.

The most famous ranking of races was developed by 19th-century physician Samuel George Morton. Morton, believing himself to be objective, used his collection of skulls from different American ethnicities to compare cranial capacities and try to prove the superior intelligence of one group over another. His study essentially ranked average skull sizes (which are not directly connected to intelligence), but he mixed people of different statures in his samples, introducing an obvious bias into the analysis. The analysis was strongly skewed toward linking intelligence with white men, and Morton concluded that white men were the most intelligent race on the face of the Earth. Gould criticized Morton’s data (though he does note that the bias may have been unconscious), pointing out analytical errors, manipulated sample compositions, and selectively reported data. Gould classifies this as one of the main instances of scientific racism.

But it gets even worse. Colonialism worked hand in hand with the idea that Europeans were carrying out a ‘civilizing mission’: white Europeans were performing nothing but a generous act of ‘helping’ ‘inferior’ races to develop and become civilized. This patronizing notion is easily debunked with historical evidence. We know, for instance, that Mesoamerican and Andean civilizations built empires without any foreign influence. Or take Stonehenge, a monument in England believed to have been built between roughly 3000 and 2000 BC. Though impressive and complex, it is not as advanced as the Giza pyramid complex in Egypt, built around the same period, which shows that civilizations evolved independently and at their own pace.

Social Darwinism

Image credits: Gennie Stafford.

Another interesting aspect of eugenics is so-called social Darwinism. Social Darwinists believe that “survival of the fittest” also happens in society — some people become powerful in society because they are somehow innately better.

Social Darwinism was championed by one of the founders of eugenics, Sir Francis Galton, a cousin of Charles Darwin. He believed that eugenics should ‘help’ the human race reach its ultimate ‘potential’ by accelerating ‘evolution’: eliminating the ‘weak’ and keeping the ‘appropriate races’.

The problem is that this does not fit the scientific evidence. First, genetics has clearly shown that there is no biological separation into races; race is a social construct more than a genetic one. Differences do exist, but they have to do with common ancestry. Our species shares 99.9% of its DNA, regardless of race. No ethnicity is better than another in anything: not in appearance, behavior, or intelligence.

The other misconception lies in natural selection itself. Evolution, for humans, is a slow process; it takes a long time for a genetic trait to become dominant in a species. Social change, on the other hand, is much faster: regimes fall, presidents change, policies change. Such changes can be beneficial or harmful for particular groups. Under one regime, everyone may have easy access to vaccines and survive an epidemic; under another, people may get sick for lack of these basic rights, or worse, die simply because they do not have enough to eat. This has nothing to do with one group being stronger than another, and everything to do with the choice to leave some people unassisted. Simply put, social Darwinism has little scientific evidence to back it up — and a lot of evidence against it.

How technology fits in

Morton’s ideas are obviously flawed, but scientists took them as objective analysis for decades — and that’s when the chaos started. One scientist cites another, and another, propagating false ideas whose echoes travel through history, affecting millions of lives. More theories like these emerged as science evolved, but the insistence on ranking white men at the top persisted. Even leading scientists can fall prey to racist ideas and dress them up as science.

Even with modern machine learning and big data, these ideas can still propagate. If the scientists involved don’t make sure their code isn’t susceptible to bias, the computer won’t be objective. That happened with a machine learning system trained on data from hospitals in the US. The algorithm was meant to identify high-risk patients, and one easy proxy for risk is the amount of money a patient spends on healthcare in a year. That seems reasonable, but the model ended up excluding a large number of Black patients, for an obvious reason: our society is biased, and spending reflects access to care rather than a patient’s actual condition.
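The mechanism is easy to see in a toy sketch with made-up numbers (these figures are purely illustrative, not the real hospital data): if two groups are equally ill but one has less access to care and therefore spends less, a spending-based score ranks that group as lower-risk.

```python
# Toy illustration of proxy-label bias: using past healthcare spending
# as a stand-in for health need penalizes any group with less access
# to care, even at identical levels of illness.

def predicted_risk_score(illness_level: float, spend_per_unit: float) -> float:
    """A spending-based 'risk score': dollars spent per unit of illness."""
    return illness_level * spend_per_unit

need = 10.0                                   # identical underlying illness
group_a = predicted_risk_score(need, 1000)    # good access to care
group_b = predicted_risk_score(need, 500)     # less access, so lower spending

# Equal need, but the spending proxy ranks group B as "lower risk".
print(group_a, group_b, group_a > group_b)  # 10000.0 5000.0 True
```

The bias isn’t in the arithmetic, which is perfectly correct; it’s in the choice of spending as the label, which smuggles unequal access into the model.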

Machine learning is based on statistics, and some of the fathers of statistics are intertwined with eugenics. If you’ve ever taken a statistics course, you may have heard the name ‘Pearson’. Karl Pearson developed hypothesis testing, the use of p-values, the chi-squared test, and many other tools still used in science today. Yet Pearson held strong social Darwinist beliefs and even supported wars against ‘inferior races’. In 2020, University College London renamed lecture halls and a building that originally honored Pearson and Francis Galton.

The search for the ‘special mind’

Besides ethnicity, another eugenicist target was intelligence. The French psychologist Alfred Binet invented what we know today as the first version of the IQ test. He wanted the test to be used to help kids at school: those who performed poorly would be sent to special classes for help adapting. He did not want it to become a label used to segregate people. However, his ideas were distorted by some scientists in the USA, where the test was used to reinforce the old fallacies of ranking people, even becoming a mechanism for selecting immigrants.

In time, the IQ test became the one you know today. The problem is that it’s often used to segregate people without accounting for the cultural or socioeconomic factors that can affect IQ scores. That’s not all: American psychologist Henry Goddard, the man responsible for corrupting Binet’s ideas, defended the idea that ‘feeble-minded’ people should not have children. He and his colleagues also chose words like ‘idiot’, ‘moron’, and ‘feeble-minded’ to classify people — words we still use today as insults.


The ultimate goal of eugenics is perpetuating only the ‘good’ genes — which means not allowing those who have ‘bad’ genes to reproduce.

This led to the forced sterilization of people with mental disorders, most famously sanctioned by the 1927 US Supreme Court case Buck v. Bell. Between the 1920s and 1950s, over 60,000 sterilizations were carried out in the United States, most of them on people whose conditions were labeled ‘feeble-minded’ or ‘insane’.

These procedures were typically carried out in asylums or prisons, with a medical supervisor having the authority to decide whether an inmate’s reproductive system should be altered. The practice is now considered a violation of human rights, and its stated motivations are considered bogus: that it would ‘improve inmates’ lives’, concern about the ‘financial burden’ of any children the inmates might have, punishment, and of course ‘avoiding the reproduction of the unfit’. Under California’s law, the person had no right to objection or appeal.


A lot happened between Goddard’s time and the 1930s and 1940s, when autism was first described. Know the famous name Hans Asperger? He was an Austrian pediatrician who worked under the Nazi regime, known for describing one ‘type’ of autism, later named Asperger Syndrome. The diagnostic criteria for Asperger Syndrome were removed from the Diagnostic and Statistical Manual of Mental Disorders in 2013; there are no longer sub-diagnoses, and it is all called Autism Spectrum Disorder (ASD).

Asperger observed that some autistic children were more ‘adaptable’ to social norms and could act ‘normal’, so he labeled those children “high functioning” and the others “low functioning”. The “low functioning” children were considered a burden, unfit for the Third Reich because they couldn’t perform the tasks of a “normal” person; in other words, they wouldn’t be profitable. Asperger would then transfer these ‘genetically inferior’ children to the ‘euthanasia’ killing programs, choosing who was worthy of living and who wasn’t. Next time you meet autistic people, ask whether they want to be connected to that idea before calling anyone low functioning, high functioning, or ‘aspie’ — spoiler: they almost certainly don’t.

Genetic research can be eugenicist without mentioning the word or directly defending the idea. Nobody seems to ask autistic people what types of research could make their lives better; the concern is usually ‘how parents should not have a burden’. Pay attention to the advertisements: do they show autistic people in successful positions, or pictures of children with their parents?

More recently, the Spectrum 10K study was paused. The UK-based researchers wanted to survey and collect DNA from autistic people and their relatives, but the autistic community was not consulted and questioned who the data would be shared with. Advocates found that people involved in the project had a history of questionable research on autistic people’s DNA, and after they protested, the study was paused with a promise to listen to autistic people.

“People with disabilities are genuinely concerned that these developments could result in new eugenic practices and further undermine social acceptance and solidarity towards disability – and more broadly, towards human diversity,” said Catalina Devandas, the UN Special Rapporteur on the rights of persons with disabilities, on 28 February 2020.

Gould saw the problem persisting into the 1990s: he revised his book to add the biased ‘research’ of his own time, hoping to warn scientists not to repeat the same mistakes. Today’s world has no more room for racist and ableist science like this, so why is it OK for labels born in those eras to linger in machine learning, therapists’ offices, and schools? It’s about time we cut eugenics out of our civilization.

We’re getting a better idea of how moles turn into melanoma, and environment is key

New research is upending what we knew about the link between skin moles and melanoma.

Image via Pxhere.

Moles and melanomas are both types of skin tumors, and they originate from the same cells — the pigment-producing melanocytes. However, moles are harmless, and melanomas are a type of cancer that can easily become deadly if left untreated. The close relationship between them has been investigated in the past, in a bid to understand the emergence of melanomas.

New research at the Huntsman Cancer Institute (HCI), the University of Utah, and the University of California San Francisco (UCSF) throws a wrench into our current understanding of that link. According to the findings, the current “oncogene-induced senescence” model of how melanomas emerge isn’t accurate. The research aligns with other recent findings on this topic and proposes a different mechanism for the emergence of skin cancer.


“A number of studies have challenged this model in recent years,” says Robert Judson-Torres, a researcher at HCI. “These studies have provided excellent data to suggest that the oncogene-induced senescence model does not explain mole formation but what they have all lacked is an alternative explanation — which has remained elusive.”

Melanocytes are tasked with producing the pigments in our skin which protect us from harmful solar radiation. Changes (mutations) in one specific gene in the genome of melanocytes, known as BRAF gene mutations, are heavily associated with moles; such mutations are found in over 75% of skin moles. At the same time, BRAF gene mutations are encountered in 50% of melanoma cases.

Our working theory up to now — oncogene-induced senescence — held that when melanocytes develop the BRAFV600E mutation, it blocks their ability to divide, turning them into a mole. When other mutations develop alongside BRAFV600E, however, melanocytes can start dividing uncontrollably, developing into cancer.

The team investigated mole and melanoma tissues donated by patients at the UCSF Dermatology clinic in San Francisco and the HCI Dermatology clinic in Salt Lake City. Their analysis revolved around two methods: transcriptomic profiling and digital holographic cytometry. The first allowed them to determine molecular differences between the cells in moles and those in melanomas; the second was used to track changes inside individual cells.

“We discovered a new molecular mechanism that explains how moles form, how melanomas form, and why moles sometimes become melanomas,” says Judson-Torres.

The team reports that melanocytes don’t need to have mutations besides BRAFV600E to morph into melanoma. What does play a part, however, are environmental factors, transmitted to the melanocytes through the skin cells around them. Depending on exactly what signals they’re getting from their environment, melanocytes express different genes, making them either stop dividing or divide uncontrollably.

“Origins of melanoma being dependent on environmental signals gives a new outlook in prevention and treatment,” says Judson-Torres. “It also plays a role in trying to combat melanoma by preventing and targeting genetic mutations. We might also be able to combat melanoma by changing the environment.”

The authors hope that their findings will help researchers identify biomarkers that can predict the emergence of melanoma earlier than is possible today. The results could also pave the way to more effective topical medicines that prevent melanoma or delay its progress.

The paper “BRAFV600E induces reversible mitotic arrest in human melanocytes via microrna-mediated suppression of AURKB” has been published in the journal eLife.

Test tube baby population: from 1 to a few million in less than 50 years

The idea of so-called “test-tube babies” (technically called in vitro fertilization) is not new, but it has developed and matured incredibly rapidly — up to the point where in developed countries, it’s become a fairly routine procedure. The technique can help with fertility problems, enabling millions to conceive a child.

But the technique can be used for more than just conceiving. It has been used to screen embryos for hereditary genetic diseases, and even for features unrelated to disease, such as sex selection, which has raised a number of ethical questions and concerns.

A newborn baby. Image credits: Jan Canty.

IVF is the same process that was first employed in the late 1970s to deliver the world's first test-tube baby, and it has come a long way since.

When a couple is unable to conceive children naturally (whether it’s due to physiological or reproductive issues), doctors carry out the artificial fertilization of their sperm and egg under laboratory conditions. This external lab-based fertilization is called IVF and the baby born from using this method is colloquially referred to as a test-tube baby.

To date, about eight million children have been born through IVF globally. But IVF is not limited to treating infertility: entrepreneurs and many medical experts believe that IVF could also play a key role in human genetic engineering, genetic diagnosis, and numerous other advanced medical technologies in the future. For these reasons, the technology has also garnered its fair share of critics.

History of IVF and the first test tube baby

In 1891, Cambridge University professor Walter Heape performed the first-ever mammal embryo transfer. More than 50 years later, American scientists John Rock and Miriam Menkin introduced the concept of biochemical pregnancy by extracting and fertilizing oocytes (immature eggs) and sperm cells in-vitro.

In 1958, a paper concerning in-vitro fertilization by researchers Anne McLaren and John Biggers was published in Nature; it was the first study to propose that fertilization outside of a woman's body was possible. The following year, biologist M.C. Chang achieved the birth of a live rabbit through in-vitro fertilization, a groundbreaking achievement that led to a spree of in-vitro fertilization experiments across the globe. Things were moving quickly, and researchers were already looking forward to the world's first "test-tube baby". But the time was not yet ripe.

In-vitro fertilization using human gametocytes (the precursors of male and female reproductive cells) would not be performed until 1973, when a team of Australian embryologists (Alan Trounson, Carl Wood, John Leeton) created a biochemically conceived human embryo that survived for just a couple of days. The same year, American gynecologist Landrum Shettles also attempted a human IVF experiment, but it was called off before completion. Then, it finally happened.

In November 1977, Lesley Brown and her husband Peter decided to conceive a child through IVF. The couple had their gametocytes fertilized in a laboratory dish at Dr. Kershaw's Cottage Hospital in Royton, England, under the supervision of Dr. Patrick Steptoe, Dr. Robert Edwards, and embryologist Jean Purdy. About nine months later, on July 25, 1978, Lesley gave birth to the world's first test-tube baby, Louise Joy Brown.

The birth of Louise Brown made headlines around the world. Daily Mail issue published on July 27, 1978.

Just two months after Louise's birth, a second test-tube baby was born in Kolkata, India. The newborn girl was named Durga, and Dr. Subhash Mukherjee and embryologist Sunit Kumar Mukherjee were responsible for her conception through IVF.

Both Louise and Durga (officially named Kanupriya Agarwal) are now 43 years old and mothers of naturally conceived children. Louise's younger sister Natalie was also born through IVF, and she was the first IVF-born person to give birth to children.

For his exceptional work in the field of in-vitro fertilization, Robert Edwards was awarded the 2010 Nobel Prize in Physiology or Medicine. Steptoe and Purdy had passed away by that time, and since the prize is not awarded posthumously, they were not eligible.

IVF Facts

In-vitro fertilization (IVF) taking place. Image credits: DrKontogianniIVF/Pixabay.

IVF has enabled hundreds of thousands of families to have children of their own, and this assisted reproductive technology (ART) has emerged as the most successful treatment for infertility. However, a number of myths and controversies surround test-tube babies as well:

  • IVF has been a subject of debate among various religious communities. The Catholic Church and many Sunni Islamic scholars have opposed IVF, believing that assisted reproductive techniques are immoral and interfere with the natural process of reproduction.
  • Unmarried couples and people with certain contagious medical conditions are not allowed to undergo IVF in China. In India, IVF is allowed, but prenatal sex determination (detecting the sex of the fetus) during IVF is a punishable offense.
  • In the US, the pineapple has emerged as a symbol of hope among many couples facing infertility or undergoing IVF treatment. Some believe that eating pineapple increases the probability of becoming pregnant, but there is no scientific evidence that validates this belief.
  • Many people also believe that there is no risk of ectopic pregnancy when a couple conceives through in-vitro fertilization. This is not true: research shows that the incidence of ectopic pregnancy in IVF is between 2% and 8.6%, compared with 1% to 2% for natural conception.
  • There is plenty of room for IVF to grow, and it likely will. In the US alone, infertility affects 10% of women, and approximately 1.9% of all infants born in the United States every year are conceived using assisted reproductive techniques.
  • Plenty of factors affect IVF success rates, but the most important is the woman's age. And while complications become more common after the age of 40, much older women can give birth through IVF. Until recently, Adriana Iliescu from Romania held the record as the oldest woman to give birth using IVF and a donor egg, having delivered in 2004 at the age of 66. In September 2019, a 74-year-old woman became the oldest ever to give birth after she delivered twins at a hospital in Guntur, Andhra Pradesh.
  • In the US, the average IVF cycle can cost anywhere from $12,000 to $25,000. Prices vary in other parts of the world, but IVF remains a relatively expensive technique.

Impact and future of the test-tube baby technique  

Apart from families coping with infertility, the test-tube method has also allowed same-sex couples, single individuals, and older partners to conceive children. IVF techniques have made parenthood accessible to more people than ever before.

A mother playing with her child at the beach. Many couples who once faced infertility now enjoy parenthood thanks to IVF treatment. Source: Pexels.

According to one report, the IVF market is expected to be worth around $25.56 billion by 2026. Increasingly delayed pregnancies, rising birth success rates, and growing acceptance of IVF all suggest that the test-tube baby technique (along with other ART methods such as artificial insemination and surrogacy) will become even more popular in the coming years.

The birth success rate of test-tube babies has also increased considerably over the years and now stands at 52% (for people below the age of 35). During IVF treatment, doctors are able to choose the embryo that is least likely to carry genetic disorders. Moreover, scientists are now trying to go one step further: they are looking for ways to manipulate the genes of in-vitro embryos so that genetically "superior" individuals could be born. Needless to say, many other scientists (and important parts of civil society) are strongly against this idea.

Concerns still loom regarding the potential use of IVF and related techniques for eugenics: the "improvement" of the embryo through the selection of desired hereditary traits. If you could make your baby more likely to be tall, intelligent, and blue-eyed, would you? Millions likely would, but this opens up a can of worms. Many researchers and philosophers fear it could steer humanity down a darker path that spirals out of control, leading to discrimination and, in the long term, a less rich gene pool that could even increase the risk of our species going extinct.

Ultimately, the technology has had a significant and positive impact on humanity, and that impact will likely keep growing as the technique matures. The debate around what's acceptable for IVF is far from settled, and the discussion will likely continue for decades. It's up to researchers and civil society to steer the technology in a positive direction and stay clear of dystopian applications.

We’re evolving right now: scientists see how our genome is changing in recent history

A new study from Europe has identified 755 traits that have changed over the past 2,000-3,000 years of human evolution. These traits are linked to things like pigmentation, nutritional intake, and several common diseases and disorders.

We sometimes tend to think of humans as the pinnacle of evolution, the tip of the biological pyramid. Not only does that show just how self-centered we humans can be, it's not really correct either. And even if it were, evolution hasn't stopped: it's happening right as you're reading this.

Natural selection (the process through which individuals better adapted to an environment are more likely to reproduce) isn’t just happening in the animal world, it’s happening for humans too. Granted, the pressures that drive this can be quite different, but the process is taking place nonetheless — and it’s been happening since the dawn of human history.

Understanding the patterns behind our past and present evolution isn’t just a scientific curiosity, it could have important applications in the field of medicine and human biology.

“The genetic architecture of present-day humans is shaped by selection pressures in our history. Understanding the patterns of natural selection in humans can provide valuable insights into the mechanisms of biological processes, the origin of human psychological characteristics and key anthropological events,” the researchers write in the new study.

The shells of individuals within a bivalve mollusk species showing diverse coloration and patterning in their phenotypes. Image credits: Debivort.

The team, led by Weichen Song from Shanghai Jiao Tong University School of Medicine, analyzed modern human genetic data from the UK Biobank alongside ancient human DNA from across Europe. They examined 870 polygenic traits (traits whose phenotype, the set of observable characteristics of an organism, is influenced by more than one gene), comparing differences between the ancient and modern genetic groups.

They found that 755 of these traits (roughly 87%) underwent significant change in the past 2,000-3,000 years. Some of these changes were linked to pigmentation, body size, and nutritional intake.

“One of our most interesting results was the finding that pigmentation, body measurement, and dietary traits were continuously under intense selection pressure across various human development timescales,” the study also reads.

However, the researchers caution that their findings are limited exclusively to European data, and it's not clear whether the associations between genetic variants and phenotypes are causal.

“In sum, we provide an overview of signals of selection on human polygenic traits and their characteristics across human evolution, based on a European subset of human genetic diversity. These findings could serve as a foundation for further populational and medical genetic studies,” the researchers write.

Nevertheless, this could serve as a foundation for larger, wider studies, aiding future research into human genetics and evolution.

The study “A selection pressure landscape for 870 human polygenic traits” was published in Nature Communications.

Moral judgment condemning drug use and casual sex may be rooted in our genes

Prior research suggests that people who condemn drug use on moral grounds also tend to harshly judge others who engage in promiscuous, non-monogamous sex. A new study involving more than 8,000 twins not only confirmed this link but also showed that the association may be mediated by genes. Those who wrap their negative views of sexuality and drug use in a veneer of morality may, deep down, be looking out for their own reproductive strategy, shaming others in order to control their environment.

Public condemnation of casual sex and illicit drug use has never really gone away, despite massive cultural shifts during the 1960s counterculture movement. Although upbringing certainly has a part to play in shaping one's worldview and moral compass, psychologists have amassed increasing evidence that many of the instances when we righteously point our fingers may be acts of self-interest.

It's common for people who disapprove of illicit drug use to also frown upon casual sex. In principle, neither behavior should bother people it doesn't directly affect. But past studies have shown that openness to casual sex is partially explained by genes, and those inclined toward noncommittal sex are also more likely to use recreational drugs.

“People adopt behaviors and attitudes, including certain moral views, that are advantageous to their own interests. People tend to associate recreational drug use with noncommitted sex. As such, people who are heavily oriented toward high commitment in sexual relationships morally condemn recreational drugs, as they benefit from environments in which high sexual commitment is the norm,” said Annika Karinen, a researcher at Vrije Universiteit Amsterdam in the Netherlands and lead author of the new study.

Karinen and colleagues decided to investigate whether there is any genetic basis surrounding moral views on both sex and illicit drug use. They employed a dataset from a survey of 8,118 Finnish fraternal and identical twins. Identical twins share almost all their genes while fraternal twins share roughly half of their genes. As such, twin studies are the perfect natural laboratory that allows scientists to tease out genetic factors from environmental ones when assessing behaviors.

Each participant answered a set of questions that measured their moral views surrounding drug use and their openness to non-committed sex, as well as political affiliation, religiosity, and other factors.

When comparing the results of the questionnaires between fraternal and identical twin pairs, the Dutch psychologists found that moral views concerning both recreational drugs and casual sex are approximately 50% heritable, while the other 50% can be explained by the environment in which people grew up and the unique experiences not shared by the twins. Moreover, the relationship between openness to casual sex and views on drugs is about 75% attributable to genetic effects.
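Twin-based heritability estimates of this kind are often illustrated with Falconer's classic formula, which takes heritability to be twice the gap between identical-twin and fraternal-twin trait correlations. The sketch below is a simplified illustration with made-up correlation values, not the actual model used in the study (which relies on more elaborate twin modeling):

```python
# Illustrative only: a classical Falconer estimate of heritability from
# twin correlations. The correlation values below are hypothetical.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Estimate heritability as twice the difference between
    identical-twin (MZ) and fraternal-twin (DZ) trait correlations."""
    return 2 * (r_mz - r_dz)

# Hypothetical within-pair correlations for a moral-attitude score:
r_mz = 0.75  # identical twins (share ~100% of segregating genes)
r_dz = 0.50  # fraternal twins (share ~50% of segregating genes)

h2 = falconer_heritability(r_mz, r_dz)
print(f"Estimated heritability: {h2:.2f}")  # prints "Estimated heritability: 0.50"
```

The intuition matches the study's logic: if identical twins resemble each other on a trait markedly more than fraternal twins do, the surplus similarity is attributed to their extra shared genes.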

“These findings run counter to the idea that within-family similarities in views toward drugs and sex reflect social transmission from parents to offspring; instead, such similarities appear to reflect shared genes,” the researchers wrote in the journal Psychological Science.

Those who frown upon casual sex and the drug use they associate with it may be protecting a sexual strategy that revolves around committed relationships, into which they've invested substantial resources. People who engage in casual sex are seen as a threat to the monogamous reproductive strategy, because an environment where casual sex is deemed acceptable carries the risk of losing one's partner. By judging other people's sexuality and drug use from a moral high ground, people who prefer monogamous relationships gain a weapon they can wield to control the sexuality of others in the service of their own interests.

“Important parts of hot-button culture-war issues flow from differences in lifestyle preferences between people, and those differences in lifestyle preferences appear to partly have a genetic basis,” Karinen added.

Horse domestication traced to 4,200 years ago in the Western Eurasian steppe

Credit: Pixabay.

The domestication of wild horses has had a huge impact on human history, offering important advantages in terms of mobility, nutrition, and warfare, among other things. But there are still many unknowns with regard to when and where humans' affinity for horses first began. These questions may have finally been answered by a new genetic study that traces the earliest domestication of horses to the Pontic-Caspian steppes, north of the Caucasus, around 4,200 years ago.

Ludovic Orlando, a molecular archaeologist from France's CNRS research agency in Toulouse, was among the first researchers to study the horse genome. His lab houses the world's largest collection of wild and tamed horse DNA, some of it up to 50,000 years old. This massive trove of genetic data gave him and his colleagues a much clearer understanding of how humans shaped equine evolution. But only recently were Orlando and a team of more than 160 international scientists able to pinpoint the origin of domestic horses as we know them today.

Initially, the researchers turned their attention to Kazakhstan, where excavations of ancient Botai settlements had suggested these herders were among the first to domesticate horses. Yet although these 5,500-year-old horses from Botai showed signs of domestication, their DNA proved without a doubt that they were not the ancestors of modern domestic horses. So, the researchers moved to other possible origin spots like Anatolia, Siberia, and even the Iberian Peninsula, which each ultimately proved to be dead ends.

“We knew that the time period between 4,000 to 6,000 years ago was critical but no smoking guns could ever be found,” said Orlando in a statement.

In response, Orlando and colleagues went back to the drawing board and opted for a new strategy. They widened their net to compare the genomes of 273 horses that lived between 52,000 and 2,200 years ago.

This strategy eventually paid off, showing that horse populations in Eurasia underwent a dramatic change between 4,200 and 4,000 years ago. A single genetic profile, previously confined to the Pontic steppes north of the Caucasus, spread rapidly beyond its native region, replacing all the wild horse populations from the Atlantic to Mongolia within just a few centuries.

“That was a chance: the horses living in Anatolia, Europe, Central Asia, and Siberia used to be genetically quite distinct,” notes Dr. Pablo Librado, first author of the study published today in the journal Nature.

“The genetic data also point to an explosive demography at the time, with no equivalent in the last 100,000 years,” Orlando added. “This is when we took control over the reproduction of the animal and produced them in astronomic numbers.”

This particular early population of domesticated horses carried two key genomic regions (GSDMC and ZFPM1) with adaptations that made the animals appealing to humans: one is linked to more docile behavior, while the other helps horses develop a stronger backbone. In time, these advantageous characteristics were further selected and helped domestic horses spread from the Western Eurasian steppe.

These genetic characteristics surfaced at the same time as archaeological evidence suggests spoke-wheeled chariots and Indo-Iranian languages began to spread throughout Asia. This combination of technology and culture helped the new horse lineage replace all previous populations across Eurasia.

These early domestic horses underwent further significant rounds of selection. For instance, after the Arab expansion into Europe in the 7th century, Arabian stallions outcompeted males from other breeds and passed their Y chromosomes on to virtually all modern horses alive today. Today's horses are much faster and stronger than their counterparts from 1,000 years ago, let alone those that lived 4,000 years ago. At the same time, their genetic diversity is smaller than it has ever been, allowing potentially deleterious mutations to accumulate and raising the risk of genetic disease.

People in the Philippines are the most Denisovan in the world

Genetic analysis has found clear traces that humans and Denisovans interbred in the past. The Philippine ethnic group known as the Ayta Magbukon has the highest level of Denisovan ancestry in the world.

The Negritos of the Philippines comprise some 25 different ethnic groups scattered throughout the archipelago. They were once considered a single population, but the more researchers looked into it, the more they found that Negritos are actually very diverse.

In the new study, Maximilian Larena of Uppsala University and colleagues set out to establish the demographic history of the Philippines. Their project involved indigenous cultural communities, local universities, as well as official and non-governmental organizations from the area. With everyone working together, they were able to analyze 2.3 million genotypes from 118 ethnic groups in the Philippines — including the diverse Negrito populations.

The results were particularly intriguing for a population called the Ayta Magbukon, which still occupy vast swaths of their ancestral land and continue to coexist with the lowland population surrounding them. The Ayta Magbukon seem to possess the highest level of Denisovan ancestry in the world.

“We made this observation despite the fact that Philippine Negritos were recently admixed with East Asian-related groups—who carry little Denisovan ancestry, and which consequently diluted their levels of Denisovan ancestry,” said Larena. “If we account for and mask away the East Asian-related ancestry in Philippine Negritos, their Denisovan ancestry can be up to 46 percent greater than that of Australians and Papuans.”

This finding, along with the recent discovery of a small-bodied hominin called Homo luzonensis, suggests that multiple hominin species inhabited the Philippines prior to the arrival of modern humans — and these groups likely interbred multiple times.

The Denisovans are a mysterious group of hominins identified in 2010 based on mitochondrial DNA (mtDNA) extracted from a juvenile female finger bone found in Siberia's Denisova Cave. Although researchers have recovered only scant physical remains of Denisovans, they've discovered traces of their DNA in modern populations. This group in the Philippines carries the highest percentage of Denisovan DNA in the world, at least that we've found so far.

“This admixture led to variable levels of Denisovan ancestry in the genomes of Philippine Negritos and Papuans,” co-author Mattias Jakobsson said. “In Island Southeast Asia, Philippine Negritos later admixed with East Asian migrants who possess little Denisovan ancestry, which subsequently diluted their archaic ancestry. Some groups, though, such as the Ayta Magbukon, minimally admixed with the more recent incoming migrants. For this reason, the Ayta Magbukon retained most of their inherited archaic tracts and were left with the highest level of Denisovan ancestry in the world.”

Researchers hope to sequence more genomes and better understand “how the inherited archaic tracts influenced our biology and how it contributed to our adaptation as a species,” Larena concludes.

Journal Reference: Larena et al. “Philippine Ayta possess the highest level of Denisovan ancestry in the world”, Current Biology (2021).

African scientists used CRISPR to edit bananas and make them more resilient to disease

Bananas are under threat from disease and climate change. A genetic tool can help.

Bananas are one of the most important food crops in the world. They're an essential source of food and income for millions of farmers in resource-poor countries, and overall banana production worldwide surpasses 155 million tons a year. But bananas are under pressure.

All the cultivated banana varieties are susceptible to diseases — and Banana xanthomonas wilt (BXW) is particularly problematic. BXW is a bacterial disease that has emerged as one of the largest threats to bananas. Overall economic losses from the disease were estimated at US$ 2–8 billion over a decade.

While all crops have pests, the fact that bananas are essentially clones doesn't help: they are commercially propagated through cuttings, which means that growers virtually clone their plants. This lack of genetic variety makes them doubly susceptible to pests and disease, and we've seen in the past that infections can wipe out entire cultivars (until the 1950s, the Gros Michel cultivar was dominant, until an outbreak of Panama disease wiped it out; now, Cavendish bananas account for around half of global production, but they too are vulnerable).

With this in mind, researchers from the International Institute of Tropical Agriculture (IITA) in Kenya set out to use genetic modification to produce more resilient bananas. They used CRISPR/Cas9, a precise but relatively affordable gene-editing tool whose discovery earned the 2020 Nobel Prize in Chemistry.

“Recent advances in CRISPR/Cas-based genome editing can accelerate banana improvement,” the researchers write in the study. “The availability of reference genome sequences and the CRISPR/Cas9-editing system has made it possible to develop disease-resistant banana by precisely editing the endogenous gene.”

They focused on a gene called downy mildew resistance 6 (DMR6), which has previously been shown to play a role in disease response across many plants. During pathogen infection, the expression of this gene reduces or suppresses the plant's immune function, so if the gene were switched off, the plant's immune system could be turbocharged.

Rapid bioassay of the edited bananas. Image credits: Tripathi et al.

The plants edited with CRISPR showed increased resistance to the disease, in some cases by up to 66%. Beyond the increased resistance, there seemed to be no differences.

“Growth trial of three replicates of the potted plants of all the edited events under the greenhouse conditions showed normal growth with no morphological differences,” the study reads.

However, the researchers note that the study needs to be replicated on a larger sample size and in more realistic soil conditions, as this study was carried out on potted plants.

With bananas under threat from multiple pathogens, approaches such as this one can make all the difference. It’s not just pathogens, either — climate change has also been shown to have a damaging effect on bananas.

The study was published in Plant Biotechnology Journal.

Sugar just got a bit CRISPR: precise gene edits can improve sugarcane resilience, reduce its environmental impact

Ayman Eid, CABBI Postdoctoral Research Associate at the University of Florida, displays gene-edited sugarcane with reduced chlorophyll content. Credit: Rajesh Yarra, UF/IFAS Agronomy.

Sugarcane is one of the most important plants on Earth — at least for us humans. Not only does it provide 80% of the sugar and 30% of the bioethanol consumed worldwide, but the oil in its leaves and stems is also used to make bioplastics.

But there are two big problems with sugarcane. The first is its environmental impact. It takes huge amounts of water to grow and refine sugar (around nine gallons for a single teaspoon), and the whole process produces a lot of waste. To make matters even worse, sugar takes up large portions of agricultural land, fueling deforestation in several parts of the world.

For researchers, this environmental impact is also an opportunity: an opportunity to change the plant and make it more sustainable. But there's another, different problem with sugarcane: it has a complex and messy genome, which makes it very difficult to edit. It often takes over a decade for a single sugarcane cultivar to be properly developed, and crossbreeding sugarcane is notoriously difficult.

But new genetic tools can finally enable researchers to edit sugarcane in desired ways, says Fredy Altpeter, Professor of Agronomy at the University of Florida's Institute of Food and Agricultural Sciences.

“Now we have very effective tools to modify sugarcane into a crop with higher productivity or improved sustainability,” Altpeter said. “It’s important since sugarcane is the ideal crop to fuel the emerging bioeconomy.”

Altpeter and Postdoctoral Research Associate Ayman Eid used the so-called “genetic scissors”, CRISPR. CRISPR is derived from a family of DNA sequences found in the genomes of some bacteria and archaea; paired with a Cas enzyme, it can be used to edit the genomes of both plants and animals, eliminating some sequences and replacing them with more desirable ones. This approach can be used to treat diseases in humans or animals, but also to improve crops.

In two studies, the two researchers and their colleagues did just that: they edited the genes of sugarcane using CRISPR. The first study, more of a proof of concept to see whether the approach worked, changed a few genes to alter the appearance of the plant: the team turned off several copies of a gene that helps sugarcane produce chlorophyll, making the plants turn light green or even yellow. The light green plants seemed to require less fertilizer while producing the same biomass, with no detectable side effects, the researchers note.

In the second study, the researchers replaced individual nucleotides (the building blocks of both RNA and DNA) with versions they hoped would give sugarcane more resistance to herbicides. Essentially, this meant co-opting the plant's own DNA repair process to make it more tolerant of herbicides.

The fact that both attempts worked offers great hope for breeding useful new varieties of sugarcane that can help reduce its dreadful environmental impact.

With conventional breeding, two different types of sugarcane would be cross-bred to reshuffle their genetic information, in the hope that a desirable trait (such as needing less fertilizer) is enhanced. The problem is that this process can't be fully controlled: genes are transferred from parents to offspring in blocks, which means a desired gene comes linked to other, superfluous genes. Researchers often have to do multiple rounds of breeding and screen the plants to see exactly what changed in the offspring. Genetic tools offer a more elegant, cheaper, and quicker way to accomplish the same thing.

Of course, whether or not consumers will accept CRISPR-edited plants on their plates remains to be seen. Consumers are almost always wary of modifying the genes of plants, even when the scientific process has been shown to be safe.

Journal Reference: Ayman Eid et al, Multiallelic, Targeted Mutagenesis of Magnesium Chelatase With CRISPR/Cas9 Provides a Rapidly Scorable Phenotype in Highly Polyploid Sugarcane, Frontiers in Genome Editing (2021). DOI: 10.3389/fgeed.2021.654996

Mehmet Tufan Oz et al, CRISPR/Cas9-Mediated Multi-Allelic Gene Targeting in Sugarcane Confers Herbicide Tolerance, Frontiers in Genome Editing (2021). DOI: 10.3389/fgeed.2021.673566

Just how “human” are we? At most, 7% of your DNA is uniquely human, study finds

A landmark study found that only 1.5% to 7% of the human genome contains uniquely (modern) human DNA. The rest is shared with relatives such as Neanderthals and Denisovans.

However, the DNA that is unique to us is pretty important, as it’s related to brain development and function.

Image in public domain.

Researchers used DNA from fossils of our close relatives (Neanderthals and Denisovans) dating from around 40,000-50,000 years ago and compared them with the genomes of 279 modern people from around the world. They used a new computational method that allowed them to disentangle the similarities and differences between different DNA sequences in greater detail.

Many people around the world (all non-African populations) still carry genes from Neanderthals, a testament to past interbreeding between the two species. But the importance of this interbreeding may have been understated. The new study found that just 1.5% of humans’ genome is both unique and shared among all people living now, and up to 7% of the human genome is more closely related to that of other humans than to that of Neanderthals or Denisovans.

This doesn’t mean that we’re 93% Neanderthal. In fact, just 20% of Neanderthal DNA survives in modern humans, and non-African humans carry just around 1.5-2% Neanderthal DNA. But different people have bits of Neanderthal DNA in different places, so if you add up all the parts where someone has Neanderthal DNA, they end up covering most of the human genome, although not the same parts for everyone. The 1.5% to 7% of uniquely human DNA refers to human-specific tweaks that are present in no other species, strictly unique to Homo sapiens.

In addition, this doesn’t take into account the places where humans gained or lost DNA through other means such as duplication, which could have also played an important role in helping us evolve the way we are today.

What makes us human

The research team was surprised to see just how little DNA is ours and ours alone. But those small areas that make us unique may be crucial.

“We can tell those regions of the genome are highly enriched for genes that have to do with neural development and brain function,” University of California, Santa Cruz computational biologist Richard Green, a co-author of the paper, told AP.

The exact biological function of those bits of DNA remains a major problem to disentangle. Our cells are filled with “junk DNA”, which we don’t really use (or we just don’t understand how our bodies use it yet) — but we still seem to need it. We’re not even sure what the non-junk DNA bits do. Understanding the full instructions and role that genes have is another massive challenge that’s not yet solved.

What this study seems to suggest is that interbreeding played a much bigger role in our evolutionary history than we thought. Previous archaeological studies also suggest this: humans interbred with Neanderthals, Denisovans, and at least one other mysterious species we haven’t discovered yet (but we carry its DNA). Researchers are finding more and more evidence that these interbreeding events weren’t necessarily isolated exceptions but could have happened multiple times and over a longer period than initially thought. It’s up for future studies to reconcile the archaeological and anthropological evidence with the genetic one.

The study also found that the human-specific mutations seemed to emerge in two distinct bursts: 600,000 years ago and 200,000 years ago, respectively. It’s not clear what triggered these bursts; it could have been an environmental challenge or some other event, which at this point is unknown.

Researchers say that studying this 1.5-7% of our genome could help us better understand Neanderthals and other ancient populations, but it could also help us understand what truly makes us human. For instance, you could set up a laboratory dish experiment where you’d edit the human-specific genes back to their Neanderthal versions and compare the molecular results of this change. It wouldn’t exactly be like bringing back a Neanderthal, but it could help us deduce how Neanderthals would have been different from modern humans — or, conversely, what makes humans stand out from our closest relatives.

The study “An ancestral recombination graph of human, Neanderthal, and Denisovan genomes” has been published in Science.

We’ve identified the genetic roots of OCD, pointing the way towards new treatments

New research led by the Vagelos College of Physicians and Surgeons, Columbia University, has linked certain patterns of genetic mutation to obsessive-compulsive disorder (OCD) in humans.

Image credits Benjamin Watson / Flickr.

The findings confirm that targeting certain genes can be a valid avenue for treatment against OCD, which affects between 1% to 2% of the population. We’ve known that there is a genetic component to this disorder, as it often runs in the family, but the causes of OCD had remained elusive so far.

Genes made me do it

“Many neurological diseases are influenced by strongly acting mutations which can cause disease by themselves,” says David Goldstein, PhD, director of the Institute for Genomic Medicine at Columbia and a senior author on the new paper.

“These mutations are individually very rare but important to find because they can provide a starting point for the development of therapeutics that target precise underlying causes of disease.”

Previous work on this topic had used a “candidate gene” model, the authors explain, in which researchers focus on particular genes they believe might be involved in a certain pathogenesis — in this case, OCD. While there was some success, such approaches can also miss important genes or lead to errors in the statistical interpretation of the data. In other words, they can miss parts of the story and skew our overall understanding of what causes a condition.

However, there has been a recent shift towards genome-wide analyses in this field, the team explains. In short, this approach looks at all genes in a genome at the same time, checking each of them for evidence that they’re increasing the risk of developing OCD.

In collaboration with researchers from the Johns Hopkins University’s psychiatry department, the team used this genome-wide approach to identify relevant genes in the genomes of a cohort of over 1,300 OCD patients. They compared their sequences to a similar group of control participants. Scientists from the University of North Carolina at Chapel Hill, the David Geffen School of Medicine in Los Angeles, Harvard Medical School, and SUNY Downstate Medical Center in Brooklyn were also involved in the study.

They found a strong correlation between OCD and several rare mutations, but one in particular — of a gene called SLITRK5 — seemed to have the strongest association. This gene had also been identified in previous candidate-gene studies for OCD. However, the results of this study are much more reliable and the authors hope they will spur further research and development in drugs targeting the gene.

“When you look at genes that do not tolerate variation in the human population, those are the genes most likely to cause disease, and with OCD, we see an overall increased burden of damaging mutations in those genes compared to controls,” Goldstein says. “That’s telling us that there are more OCD genes to be found and where to find them.”

OCD is a condition that causes patients to have uncontrollable, recurring, and intrusive thought patterns and behaviors. It isn’t what we colloquially call OCD, such as a minor urge to straighten a stack of books. In some patients, the compulsions are so severe that the person feels an impending sense of dread until an action, whatever it may be, is completed. This can lead to a person spending twenty minutes opening and closing a door until they feel it has locked correctly — which is to say, it can heavily interfere with a patient’s daily life. OCD is also relatively common, affecting between 2-3% of US adults, roughly twice as common as conditions such as schizophrenia.

Currently-available treatments include serotonin reuptake inhibitors and cognitive-behavioral therapy. Both are highly effective when they work, but around half of patients are resistant to one or both. A treatment that targets the genetic roots of OCD would thus be very welcome for patients and doctors alike.

The paper “Exome sequencing in obsessive–compulsive disorder reveals a burden of rare damaging coding variants” has been published in the journal Nature Neuroscience.

We’ve identified a gene variant that seems to make people immune to the effects of COVID — but not to catching the virus

Researchers in the UK are closing in on a possible genetic defense against COVID-19. The findings could help explain why some people can catch the virus without getting sick.

A team of researchers led by members at Newcastle University, UK, reports that the gene HLA-DRB1*04:01 likely confers some protection on its bearers against the coronavirus, or at least against its more severe symptoms. This conclusion was drawn from the observation that the gene is found, on average, three times as often in asymptomatic patients as in symptomatic ones.

The study worked with patients from the same communities in the UK in order to limit the influence of other factors such as environment, location, and socioeconomic status.

Genetically insulated

According to the authors, this is the first clear evidence of genetic resistance against COVID-19. While previous research has worked with whole genomes, that approach is far less effective than focusing on individual genes, as the current paper does. A genome-wide view can miss important tidbits of information, much as looking at the whole forest means you don’t focus on individual trees. The current research compared symptomatic to asymptomatic members of the same community, making it easier to spot how individual genes or alleles (variations of the same gene) can help protect us from COVID-19.

HLA-DRB1*04:01, a human leukocyte antigen gene, was identified as a prime candidate in this regard. The finding is based on samples taken from 49 patients with severe COVID-19 symptoms who had been hospitalized with respiratory failure, 69 hospital workers who had tested positive for the virus but were asymptomatic, and a control group.

These samples were analyzed so that the team could study the different HLA alleles present in the general population of North East England during the first lockdown. The asymptomatic patients, on average, were three times as likely to carry the HLA-DRB1*04:01 allele as symptomatic patients (16.7% vs. 5.1% after adjustment for age and sex).
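As a back-of-envelope check, the “three times as likely” figure follows directly from the two allele frequencies quoted above (an illustrative sketch, not the study’s actual statistical analysis, which adjusted for age and sex):

```python
# Illustrative arithmetic on the adjusted carrier frequencies reported
# in the study: 16.7% of asymptomatic vs. 5.1% of symptomatic patients
# carried the HLA-DRB1*04:01 allele.
asymptomatic_freq = 0.167
symptomatic_freq = 0.051

ratio = asymptomatic_freq / symptomatic_freq
print(f"Allele is ~{ratio:.1f}x as frequent in the asymptomatic group")
# prints: Allele is ~3.3x as frequent in the asymptomatic group
```

That ratio of roughly 3.3 is what the paper summarizes as “three times as likely”.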

From previous research, we know that the incidence of the HLA-DRB1*04:01 allele in the general population is directly correlated with latitude and longitude: people in the North and West of Europe are more likely to carry it. Sadly, because asymptomatic carriers can spread the virus without knowing it, this also means that these areas may have a harder time keeping the virus under control.

“This is an important finding as it may explain why some people catch Covid but don’t get sick,” explains Dr. Carlos Echevarria from the Translational and Clinical Research Institute, Newcastle University, a Respiratory Consultant in the Newcastle Hospitals NHS Foundation Trust, and co-author of the paper. “It could lead us to a genetic test which may indicate who we need to prioritize for future vaccinations.”

“At a population level, this is important for us to know because when we have lots of people who are resistant, so they catch Covid but don’t show symptoms, then they risk spreading the virus while asymptomatic.”

Populations of European descent, the authors add, are most likely to remain asymptomatic but still carry and transmit the disease to individuals who do not enjoy the same levels of genetic protection. The link between gene expression and geolocation is a well-established scientific concept: genes are selected for by the unique sets of demands placed on different groups by their environment, so people living in different areas will evolve different types of genetic defences. HLA genes are no different: they develop over generations as a response to pathogens.

“Some of the most interesting findings were the relationships between longitude and latitude and HLA gene frequency,” adds co-author David Langton, whose company ExplantLab helped fund the study. “It has long been known that the incidence of multiple sclerosis increases with increasing latitude. This has been put down in part to reduced UV exposure and therefore lower vitamin D levels. We weren’t aware, however, that one of the main risk genes for MS, that is DRB1*15:01, directly correlates to latitude.”

“This highlights the complex interaction between environment, genetics and disease. We know some HLA genes are vitamin D responsive, and that low vitamin D levels are a risk factor for severe COVID and we are doing further work in this area.”

Still, the team notes that more studies will be needed (both in the UK and other areas) as there may be different copies of the HLA genes providing resistance in other populations.

The paper “The influence of HLA genotype on the severity of COVID‐19 infection” has been published in the journal HLA.

Scientists may have finally sequenced the entire human genome

In 2003, after nearly $3 billion in funding and 13 years of painstaking research, scientists with the Human Genome Project (HGP) announced they had finally mapped the first human genome sequence. This was a momentous breakthrough in science that would revolutionize genomics. However, the initial draft and the updates that followed were not 100% complete. But now, scientists with the Telomere-to-Telomere (T2T) Consortium claim they’ve addressed the remaining 8% of the human genome that was missing.

“The Telomere-to-Telomere (T2T) Consortium has finished the first truly complete 3.055 billion base pair (bp) sequence of a human genome, representing the largest improvement to the human reference genome since its initial release,” wrote the scientists in a paper published in the pre-print server bioRxiv, meaning it has yet to be peer-reviewed.

The first truly complete genome of a vertebrate

The genome is the sum of all the DNA and mitochondrial DNA (mtDNA) sequences in the cell. It contains all the instructions a living being needs to survive and replicate, consisting of chemical building blocks or “bases” (G, A, T, and C), whose order encodes biological information.

In diploid organisms, such as humans, the size of the genome is considered to be the total number of bases in one copy of its nuclear DNA. Humans and other mammals contain duplicate copies of almost all of their DNA. For instance, we have pairs of chromosomes, with one chromosome of each pair inherited from each parent. But scientists are only interested in sequencing the sum of the bases of one copy of each chromosome pair. A person’s actual genome is roughly six billion bases in size, but a single “representative” copy of the human genome is about three billion bases in size.

Because the human genome is so large, its bases cannot be read in order end-to-end in one single step. What HGP scientists did to sequence the genome was to first break down the DNA into smaller pieces, with each piece then subjected to various chemical reactions that allowed the identity and order of its bases to be deduced. These bits and pieces were then put back together to deduce the sequence of the starting genome.

Although genome sequencing technology has advanced a lot since the HGP announced the first draft of the human genome in 2001, a complete sequence of the entire genome was never achieved. Around 8% of the genome was missing, which corresponds to areas where DNA sequences are made up of long repeating patterns. Some of these repeating patterns, such as those found in the centromeres of chromosomes (the ‘knot’ that ties chromosomes together), play important biological roles, but standard technology hasn’t been able to decode them properly.

Using revolutionary new technology, scientists affiliated with T2T now claim that they’ve filled these gaps.

“You’re just trying to dig into this final unknown of the human genome,” Karen Miga, a researcher at the University of California, Santa Cruz, who co-led the international consortium, told STAT News. “It’s just never been done before and the reason it hasn’t been done before is because it’s hard.”

According to Miga and colleagues, the genome breakthrough was made possible thanks to new DNA sequencing technologies developed by Pacific Biosciences in California and Oxford Nanopore in the UK. These technologies do not cut the DNA into tiny pieces for later assembly, which can result in errors. Instead, Oxford Nanopore tech runs the DNA molecule through a nanoscopic hole, resulting in a long sequence. Meanwhile, lasers developed by Pacific Biosciences read the same DNA sequence again and again, which makes the readout far more accurate than previous technology.

Both technologies complemented each other to reveal the missing parts of the genome that had been eluding scientists for almost two decades. According to T2T, the number of DNA bases has been increased from 2.92 billion to 3.05 billion, a 4.5% improvement. However, the number of genes only increased by 0.4%, to 19,969 — that’s because the vast majority of DNA sequences do not code for proteins but rather regulate the expression and activity of genes.
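The 4.5% figure is simple arithmetic on the two quoted base counts (a quick illustrative check, not part of the study’s methods):

```python
# Illustrative check of the quoted figures: the T2T assembly grew the
# human reference from roughly 2.92 to 3.05 billion base pairs.
old_bases = 2.92e9
new_bases = 3.05e9

added = new_bases - old_bases
increase_pct = added / old_bases * 100
print(f"Added ~{added / 1e6:.0f} million bases, a {increase_pct:.1f}% increase")
# prints: Added ~130 million bases, a 4.5% increase
```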

“The complete, telomere-to-telomere assembly of a human genome marks a new era of genomics where no region of the genome is beyond reach. Prior updates to the human reference genome have been incremental and the high cost of switching to a new assembly has outweighed the marginal gains for many researchers. In contrast, the T2T-CHM13 assembly presented here includes five entirely new chromosome arms and is the single largest addition of new content to the human genome in the past 20 years,” wrote the researchers.

“This 8% of the genome has not been overlooked due to its lack of importance, but rather due to technological limitations. High accuracy long-read sequencing has finally removed this technological barrier, enabling comprehensive studies of genomic variation across the entire human genome. Such studies will necessarily require a complete and accurate human reference genome, ultimately driving adoption of the T2T-CHM13 assembly presented here,” they added.

The genome that the researchers sequenced didn’t come from a person but rather from a hydatidiform mole, a rare mass or growth that forms inside the womb (uterus) at the beginning of a pregnancy. This tissue forms when a sperm fertilizes an egg with no nucleus, so it contains only 23 chromosomes, just like a gamete (sperm or egg), rather than the 46 found in a typical human cell. These cells make the computational effort simpler but may constitute a limitation.

We will find out more once the paper is peer-reviewed and properly scrutinized by the international scientific community. If the findings hold water, they may mark a new age of genomics — one where no nook or cranny of DNA is left unexplored. 

Genetically modified grass saves soils destroyed by military target practice

A common species of prairie grass can help clean soils of dangerous chemicals released by military-grade compounds, a new paper reports. The only catch (at least in the eyes of some) is that we need to genetically modify it for the task.

A plot of switchgrass. Image credits Great Lakes Bioenergy Research Center / Flickr.

Genetically modified (GM) switchgrass (Panicum virgatum) can be used to purge soils of RDX residues, according to new research. RDX belongs to the nitramide chemical family and is tasteless, odorless, and extremely explosive. Pound for pound, it’s more powerful than dynamite. Given its high stability and explosive power, RDX has been used in military-grade munitions during (and since) WW2. You’ve probably heard of C-4; RDX is its main component, alongside some plasticizing agents.

One downside of using RDX on a wide scale (one that, admittedly, wouldn’t factor in very much during an active conflict) is that it can be quite damaging to the environment. In particular, compounds produced by RDX after it detonates (in combat or on firing ranges) spread around the point of impact and accumulate in groundwater, where they can pose a very real threat to any humans or wildlife they come into contact with. RDX stored in munition dumps, buried in minefields, or in rounds discarded improperly will also leach such compounds into the environment.

Genetically modified help

However, one species that’s traditionally employed against soil erosion can be modified to remove these compounds from the soil. The study, led by members at the University of York, has shown that this approach holds promise, at least for land on live-fire training ranges, munitions dumps, and minefields. Theoretically, however, it should be applicable wherever switchgrass can grow.

“The removal of the toxic RDX from training ranges is logistically challenging and there is currently a lack of cost-effective and sustainable solutions,” explains Dr. Liz Rylott from the Department of Biology and Director of the Centre for Novel Agricultural Products (CNAP), co-author of the study.

“Our research demonstrates how the expression, in switchgrass, of two bacterial genes that have evolved specifically to degrade RDX gives the plants the ability to remove and metabolize RDX in the field at concentrations relevant to live-fire military ranges. We demonstrated that by inserting these genes into switchgrass, the plant then had the ability to degrade RDX to non-detectable levels in the plant tissue.”

RDX-bearing ammo is still commonly used at firing ranges for training purposes, and has been for several decades already. This has led to high and widespread levels of groundwater contamination around such sites, which is never good news.

The authors explain that their approach involved inserting into switchgrass two genes from bacteria that are known to break down RDX. These plants — essentially GMOs at this point — were then grown on contaminated soil at one US military site. The plants grew well and, by the end of the experiment, had degraded the targeted compounds to below detectable levels in their own tissues.

All in all, the grass degraded RDX at a rate of 27 kg per hectare, which isn’t bad at all. According to the team, this is the most successful attempt to use plants to clean organic pollutants in the field to date. Processes that use plants for this purpose are collectively known as phytoremediation, and they’re a subset of the greater field of bioremediation, which involves the use of any type of organism or biological process for this task.

The findings here are of particular interest as organic pollutants, in general, tend to interact heavily with their environment (meaning they cause quite a lot of damage) while also being resistant to natural degradation processes (meaning they last for a long time in the wild). RDX in particular is of growing concern in the US. The Environmental Protection Agency (EPA) has it designated as a priority pollutant, with more than 10 million hectares of military land in the US being contaminated with weapons-associated compounds, RDX making up a sizable chunk of that contamination.

“The recalcitrance of RDX to degradation in the environment, combined with its high mobility through soil and groundwater, mean that plumes of toxic RDX continue to spread below these military sites, threatening drinking water supplies,” explains Professor Neil Bruce, also from CNAP, the study’s corresponding author.

One example the paper cites is that plumes of RDX pollution were found in groundwater and aquifers beneath the Massachusetts Military Reservation training range in Cape Cod back in 1997. This aquifer was, in effect, the only source of drinking water for half a million people, and the discovery prompted the EPA to ban the use of all live ammo during training at this site.

The paper “Field trial demonstrating phytoremediation of the military explosive RDX by XplA/XplB-expressing switchgrass” has been published in the journal Nature Biotechnology.

Masculinity may be literally toxic: DNA in Y chromosome may shorten lifespan in males

The average lifespan is about 5 years longer for women than men in the U.S. Worldwide, the difference is even more staggering, with women living 7 years longer than men, on average. And it’s not just humans either: the average mammalian female lifespan is 18.6% longer than that of males, a much greater difference in longevity between the sexes than that observed in human populations, according to a 2020 study published by researchers at the University of Southern Denmark and University Lyon 1.

Scientists have proposed a number of explanations for why females tend to live longer. One is that males are often larger and invest more energy into sexually dimorphic characteristics such as larger horns. Another explanation is that males produce more androgens such as testosterone than females. When present in high amounts, male hormones can impair some aspects of the immune system, making males more susceptible to infection and disease.

These may all be factors that contribute to some extent to the observed differences in lifespan among the sexes, and a new study may add a new one that is encoded in our genes.

In a new study published today in the journal PLOS Genetics, researchers led by Doris Bachtrog of the University of California, Berkeley, reported that male fruit flies have repetitive sections of the Y chromosome that create toxic effects.

Heterochromatin enrichment across chromosomes. Immunofluorescence staining for H3K9me3 in male mitotic chromosomes. Scale bar is 50μm. Credit: PLOS Genetics.

The Y chromosome is a unique part of the genome since it does not recombine over some or most of its length and is transmitted only through males. In many organisms, a significant fraction of the genomic DNA is highly repetitive; in humans, over two-thirds of the sequence consists of repetitive elements.

“Y chromosomes are derived from ordinary autosomes (that is, they used to be homologous with the X), and over time, they degenerate. Degeneration includes the loss of genes initially present on the Y, but also the accumulation of repetitive “selfish DNA” – such as mobile elements, that can replicate themselves, and spread across the genome. These selfish elements are particularly likely to spread on the Y. Mobile elements are harmful to the organism because they can insert themselves into genes, and destroy their function,” Bachtrog told ZME Science.

Earlier, many of these repetitive DNA sequences were viewed as ‘junk DNA’, but we now know they can play a major role in genome evolution. These include transposable elements (TEs), which are mobile DNA sequences that can change position, and satellite DNA, which is found near centromeres.

Both males and females carry repeat DNA sequences in their genomes, but Bachtrog and colleagues suspected that a large number of repeats lie within the Y chromosome.

To put this hypothesis to the test, the researchers used a technique called chromatin-immunoprecipitation to study the genomic distribution of heterochromatin in young and old male and female fruit flies (Drosophila miranda), as well as RNA-sequencing to study the genetic activity of mobile DNA.

The researchers found that males have twice as much repetitive DNA as females. Specifically, young males had compromised heterochromatin and much higher activity of mobile DNA, both of which contribute to shorter lifespans. Humans also have heterochromatin, which is tightly packed, repetitive DNA.

As the male flies age, their DNA assumes a looser form that activates more repeat sections, resulting in ‘toxic’ side effects, such as induced genomic instability and DNA damage. The new study essentially shows there may be a link between repeat DNA and aging, which is currently still poorly understood.

“Our data provide empirical support that toxic Y chromosomes can diminish male fitness, and contribute to sex-specific aging in species with heteromorphic sex chromosomes,” the researchers wrote in their study.

Other studies published previously found an association between more active repeat sections and impaired memory, shorter lifespan, and DNA damage. It is perhaps such damage to the Y chromosome that may explain some of the lifespan differences in humans as well, although more research is needed. 

“Humans contain large stretches of repetitive DNA, and heterochromatin alterations have been identified as drivers of human aging. Like in flies, human males tend to live shorter than females, suggesting that differences in heterochromatin content may contribute to sex-specific aging in our species as well,” Bachtrog said.

Many plants have been “naturally GMO’d” by bacteria

Much of the controversy around genetically modified (GM) plants is that they aren’t “natural”, and somehow dangerous. But we may want to reconsider exactly what “natural” is.

Genetic modification is a process that sometimes happens naturally, at the hands of bacteria, a new study concludes. Dozens of plants, including bananas, peanuts, hops, cranberries, and tea, were found to contain DNA from the Agrobacterium microbe — the exact bacterium that scientists use to create GM crops.

“Horizontal gene transfer from Agrobacterium to dicots is remarkably widespread,” the study reads, reporting that around 1 in 20 plants are naturally transgenic.

Transgenic means that one or more DNA sequences from another species have been introduced into an organism’s genome — in other words, the organism has been modified genetically. In unicellular prokaryotes, this is a fairly common process, but it is less understood (and less common) in macroscopic, complex organisms.

The ability of Agrobacterium to transfer genes to plants and fungi, however, is well known. Researchers have known this for a while, as they use this exact type of bacterium to produce desired genetic changes in plants. But before researchers thought of this, the method emerged naturally.

In 2015, an impactful study found that sweet potatoes are naturally transgenic — they’ve been GM’d by Agrobacterium. This came as a surprise for many consumers, but many biologists suspected that sweet potatoes weren’t that unique and that several other plants had gone through a similar process. Tatiana Matveeva and Léon Otten studied the genomes of some 356 dicot species and found 15 naturally occurring transgenic species.

It’s still a rare occurrence, but 1 in 20 is too much to just chalk it up to a freak accident. “This particular type of horizontal gene transfer (HGT) could play a role in plant evolution,” the researchers say.

It’s unclear if humans may have had something to do with this. It’s possible that the horticultural process of grafting plants could have accelerated this phenomenon, leading to the exchange of genes — which would mean that humans have been GM-ing plants for millennia. It could also have nothing to do with human activity.

“We are only at the start of this,” says Léon Otten at the Institute of Molecular Biology of Plants in Strasbourg, France, for NewScientist.

At any rate, this goes to show that in the biological world, GMOs may not be as freak an occurrence as many believe. It could also have practical implications: the European Union recently mandated that its GMO regulations exclude organisms modified through “natural” processes — so if a plant was GMO’d through “natural” processes, it would technically not be a GMO. Whether consumers will accept this or not, however, remains a completely different problem.

The study was published in Plant Molecular Biology.