
Left, right, or ambidextrous: What determines handedness?

Credit: YouTube capture.

Although on the outside our bodies look symmetrical, our body movements are anything but. If you’re like most people, you write, use a phone, eat, and perform just about any task that requires manual dexterity with your right hand. A small fraction of the population, around 10%, is left-handed. Rarer still are those who can use either hand with equal ease for various, though not necessarily all, tasks. These people are known as ambidextrous, and fewer than 1% of the population is capable of this feat.

It isn’t well understood why some people are ambidextrous, but the limited research conducted so far suggests it all starts in the brain. Ambidexterity isn’t as great as it sounds, either: studies have associated inconsistent handedness with poorer cognitive and mental health outcomes.

What determines hand preference?

The brain is divided into left and right hemispheres by a deep groove called the longitudinal fissure; the two halves are connected by a thick bundle of nerve fibers called the corpus callosum. You probably know about these hemispheres, and you may have also heard that the left hemisphere handles language, learning, and other analytical processes while the right hemisphere processes images and emotions, among other things. This has inevitably led to the erroneous notion that “more logical” people are left-brained while “more creative” people are right-brained.

Despite this enduring belief, there’s no such thing as being “right-brained” or “left-brained.” We’re actually “whole-brained,” since we use both hemispheres when speaking, solving math problems, or playing an instrument. But that’s not to say that the brain’s two regions aren’t specialized — and the actual science of how the two halves of the brain work together may be stranger than fiction.

Credit: ResearchGate.

Without going into lengthy details about how the brain performs its division of labor across all areas, we can simply observe our motor functions to see brain lateralization in action. In all vertebrates, the right hemisphere controls the left side of the body via the spinal cord and vice versa. The jury’s still out on why that is, but some scientists believe that this basic organizational feature of the vertebrate nervous system evolved even before the appearance of vertebrates.

Over 90% of humans are naturally right-handed, a proclivity that may start as early as the womb. This suggests that handedness — the tendency to be more skilled and comfortable using one hand instead of the other for tasks such as writing and throwing a ball — is genetic in nature. However, like most aspects of human behavior, it’s likely a complex trait influenced by numerous other factors, including the environment and chance.

Until not too long ago, it was thought that a single gene determined handedness, but scientists have since identified up to 40 genes that may contribute to this trait. Each gene has a weak effect in isolation, but together the whole is greater than the sum of its parts, playing an important role in establishing hand preference.

These genes are associated with some of these brain asymmetries, especially in language-related regions, suggesting links between handedness and language during human development and evolution. For instance, one implicated gene is NME7, which is known to affect the placement of the visceral organs (heart, liver, etc.) along the left-right body axis, a possible connection between brain and body asymmetries in embryonic development.

However, handedness is not a simple matter of inheritance — not in the way eye color or skin tone is, at least. While children born to left-handed parents are more likely to be left-handed than children of right-handed parents, the overall chance of being left-handed is relatively low in the first place. Consequently, most children of left-handed parents are still right-handed. Even many identical twin pairs have opposite hand preferences.

According to a 2009 study, genetics contribute around 25% toward handedness, the rest being accounted for by environmental factors such as upbringing and cultural influences.

In the majority of right-handed people, language dominance is on the left side of the brain. However, that doesn’t mean that the sides are completely switched in left-handed individuals — only a quarter of them show language dominance on the right side of the brain. In other words, hand preference is just one type of lateralized brain function and need not represent a whole collection of other functions.

Since writing activates language and speech centers in the brain, it makes sense that most people use their right hand. However, most individuals do not show as strong a hand preference for other tasks, using the left hand for some and the right hand for others, with the notable exception of tasks involving tools. For instance, even people with a strong preference for their right hand tend to be better at catching a moving ball with their left hand, which is consistent with the right hemisphere’s specialization for processing spatial tasks and controlling rapid responses.

Ambidexterity may hijack brain asymmetry — and that may actually be a bug, not a feature

This brings us to mixed-handedness, in which people prefer different hands for different tasks. A step beyond are ambidextrous people, who are thought to be exceptionally rare and can perform tasks equally well with either hand.

But if the picture of what makes people left or right handed is murky, ambidexterity is even more nebulous. We simply don’t know why a very small minority of people, fewer than 1%, is truly ambidextrous. And from the little we know, it doesn’t sound like such a good deal either.

Studies have linked ambidexterity with poorer academic performance and mental health outcomes. Ambidextrous people perform worse than both left- and right-handers on various cognitive tasks, particularly those involving arithmetic, memory retrieval, and logical reasoning. Being ambidextrous is also associated with language difficulties and ADHD-like symptoms, as well as greater age-related decline in brain volume. The findings suggest that the brain is more prone to faulty neuronal connections when the information it processes has to shuttle back and forth between hemispheres.

Again, no one is sure why this is the case. Nor are any of these studies particularly robust: ambidextrous people make up such a small fraction of the general population that any study of them will involve a small sample size, inviting caution when interpreting the results in a statistically meaningful way. All scientists can say for now is that naturally ambidextrous people have atypical brain lateralization, meaning their brain circuitry and function likely differ from the patterns seen in right-handed and left-handed people.

Of course, it’s not all bad news for the handedness-ambivalent. Being able to use both hands with (almost) equal ease certainly has its perks, which can really pay off, especially in sports, arts, and music.

Can you train yourself to be ambidextrous?

Left-handers have long been stigmatized, often being punished in school and forced to use their non-dominant right hand. However, starting in the late 19th century, people not only became more tolerant of left-handedness, but some went as far as praising the merits of ambidexterity and actively promoting it by teaching others to use both hands well.

For instance, in 1903, John Jackson, a headteacher of a grammar school in Belfast, founded the Ambidextral Culture Society. Jackson believed that the brain’s hemispheres are distinct and independent. Being either right or left hand dominant effectively meant that half of your brainpower potential was being wasted. To harness this potential, Jackson devised ambidexterity training that, he claimed, would eventually allow each hand “to be absolutely independent of the other in the production of any kind of work whatever… if required, one hand shall be writing an original letter, and the other shall be playing the piano, with no diminution of the power of concentration.”

Although these claims have since been debunked, to this day you can find dubious online programs claiming to teach you to become ambidextrous. Training involves routines such as using your non-dominant hand for writing, brushing your teeth, and other daily activities that require the fine manipulation of a tool. Doing so would supposedly strengthen neural connections in the brain and activate both hemispheres, which may help you think more creatively — or so they claim. But that’s never been shown by any study I could find. On the contrary, if anything, ambidextrous training may actually hamper cognition and mental health, judging from studies on naturally ambidextrous people.

“These effects are slight, but the risks of training to become ambidextrous may cause similar difficulties. The two hemispheres of the brain are not interchangeable. The left hemisphere, for example, is typically responsible for language processing, whereas the right hemisphere often handles nonverbal activities. These asymmetries probably evolved to allow the two sides of the brain to specialize. To attempt to undo or tamper with this efficient setup may invite psychological problems,” Michael Corballis, professor of cognitive neuroscience and psychology at the University of Auckland in New Zealand, wrote in an article for Scientific American.

“It is possible to train your nondominant hand to become more proficient. A concert pianist demonstrates superb skill with both hands, but this mastery is complementary rather than competitive. The visual arts may enhance right-brain function, though not at the expense of verbal specialization in the left hemisphere. A cooperative brain seems to work better than one in which the two sides compete.”

Handedness is a surprisingly complex trait that isn’t easily explained by inheritance. Whether you’re left- or right-handed, neither makes you inherently smarter or better than the other. Brain lateralization exists for a reason, and that should be celebrated.

Just two glasses of wine could exceed a whole day’s sugar intake

Yes, wine is good, but here’s the thing: there’s sugar in all wines, from whites to reds to cooking wine and everything in between. But how much sugar are we talking about? Quite a lot, according to a new analysis. Researchers reviewed 30 bottles of different types of wine sold in the UK and found that two glasses might exceed the recommended daily sugar limit for adults.

Image credit: Flickr / David.

The Alcohol Health Alliance, a group of over 60 non-profit organizations from the UK, commissioned a laboratory to analyze 30 bottles of red, white, fruit, rosé, and sparkling wine from the top 10 leading wine brands in the UK. The results showed wide variation in sugar and calorie content between products – information missing from most alcohol labels.

“Alcohol’s current exemption from food and drink labeling rules [in the UK] is absurd. Shoppers who buy milk or orange juice have sugar content and nutritional information right at their fingertips. But this information is not required when it comes to alcohol,” Professor Ian Gilmore, Chair of the Alcohol Health Alliance UK, said in a statement.

Wine and sugar

As you likely know, wine is made from grapes, which naturally contain sugar. To produce wine, the grapes have to be fermented – a process through which yeast is added and the sugars are transformed into alcohol. Any sugars that aren’t converted in the process are called residual sugars. So basically, wine does contain sugar, but it’s technically less than if you ate the grapes.

But the story is a bit more complicated, as sugar content varies from wine to wine. Aged wine, for example, has less sugar because it ferments for longer. Winemakers can also add more sugar after fermentation, depending on the desired sweetness. In the US, for example, demand for sweeter wines is higher, so more sugar tends to be added.

The problem is that most bottles lack nutritional information on their labels. In the UK, as in many countries, this isn’t currently required by law, so campaigners are calling for a change to better inform wine drinkers about the calories and sugars they are consuming. It’s also something consumers want, according to recent surveys.

The National Health Service (NHS) in the UK recommends that adults consume no more than 30 grams of free sugars per day, and a lot of foods have more sugar than you think. The analysis by the Alcohol Health Alliance UK shows it’s possible to reach that level by drinking two medium-sized glasses of some wines. Lower-strength wines have the most sugar, according to the research.
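As a back-of-the-envelope illustration of how quickly that limit can be reached, here is a minimal sketch; the sugar-per-100 ml figure is an assumed value for a sweeter, lower-strength wine, not one taken from the report.

```python
# Illustrative arithmetic only: the per-100 ml sugar figure is an assumption,
# not a number from the Alcohol Health Alliance analysis.
GLASS_ML = 175          # a UK "medium" glass of wine
SUGAR_PER_100ML = 8.8   # grams of sugar per 100 ml (assumed sweeter wine)
DAILY_LIMIT_G = 30      # NHS recommended daily maximum for free sugars

# Total sugar in two medium glasses
sugar_two_glasses = 2 * GLASS_ML / 100 * SUGAR_PER_100ML

print(f"Two glasses: {sugar_two_glasses:.1f} g of sugar "
      f"({'over' if sugar_two_glasses > DAILY_LIMIT_G else 'under'} "
      f"the {DAILY_LIMIT_G} g daily limit)")
```

Under these assumptions, two glasses come to about 30.8 g, just past the 30 g limit, which matches the report’s headline claim that some wines get you there in two medium glasses.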

Alcohol accounts for about 10% of the daily calorie intake of adults who drink in the UK, with over three million adults consuming an extra day’s worth of calories each week. That’s two months of food each year, and it’s basically just empty calories. The problem goes beyond wine: one study found up to 59 grams of sugar in some ready-to-drink cocktails on the market.

“The alcohol industry has dragged their feet for long enough – unless labeling requirements are set out in law, we will continue to be kept in the dark about what is in our drinks. People want and need reliable information directly on bottles and cans, where it can usefully inform their decisions,” Alison Douglas from Alcohol Focus Scotland said in a statement.

The full report on sugar and wine can be accessed here.

Employers that help workers manage headaches stand to win big on productivity

Researchers at the University of Copenhagen, in Denmark, are looking into how migraines and tension headaches are impacting our ability to do productive work. The findings raise some interesting questions regarding how we think about and treat such conditions, and also point out that both workers and employers stand to benefit from better management of employees suffering from frequent headaches.

Image credits Robin Higgins.

Both migraines and tension headaches can be debilitating. People suffering from either become hypersensitive to outside stimuli, from a door slamming to a curtain being drawn. Quite understandably, such situations make it nearly impossible for them to be productive and virtually guarantee that the quality of their work will drop.

A migraine attack can last for up to 72 hours if untreated, and tension headaches can drag on for up to a week. Needless to say, such events represent a huge drain on a workforce’s overall productivity. In Denmark alone (population 5.8 million), the authors explain, roughly 770,000 people suffer from migraines or frequent tension headaches. Their study is the first to focus on the effect these conditions have on our ability to work.

A head full of trouble

“Migraine is the leading cause of functional impairment among people under the age of 50. And headaches have negative effects on sick leave and productivity. So, it would benefit workplaces to open their eyes to the untapped potential that you find here. Indeed, we cannot afford not to take it seriously,” says corresponding author of the study, Kirsten Nabe-Nielsen.

“It is especially the ability to remember, make quick decisions and do hard physical work that cause difficulties for people with these headache disorders.”

Migraines are bouts of moderate to severe, pulsating headaches, often accompanied by nausea, vomiting, and sensitivity to light and sound. Tension headache is characterized by mild to severe pain, on both sides of the head, but usually without nausea. Both are considered ‘chronic’ if they occur for more than 14 days a month.

Roughly 24% of women and 10% of men in the Danish working population suffer from migraines or frequent tension headaches. How well they can adapt their work during a headache depends largely on their type of employment.

Those in academic positions will often have the luxury of going home a little earlier, working remotely, or postponing the tasks that demand the most focus. Others, especially those in physical jobs such as cleaning or nursing, don’t often have this option. Instead, workers in these fields may have to call in sick due to migraines or headaches. There is some evidence that headaches are the second-most common cause of sick leave, Nabe-Nielsen explains, second only to infectious disease.

Managers and workers can, however, collaborate to find solutions that work for both parties and don’t force an ailing employee to give up an entire day of work. For example, the work schedule can be rearranged so the employee swaps the most demanding tasks for ones that can be done at a leisurely pace or in a quiet space until the pain subsides.

She also believes that there are still many unknowns in the general public regarding the importance of headache disorders. For example, she explains that taking too many painkillers can actually lead to more headaches.

“Most people have experienced headaches. Therefore, it may be difficult to understand how debilitating migraine and frequent headaches may be for a colleague, friend or family member. People still have the notion that it will be sufficient to swallow a pill.”

For the study, the team drew on information about migraines and frequent headaches from the literature and tracked the painkiller usage of over 5,000 Danish participants with different educational backgrounds. Participants also provided information about their health, depressive symptoms, and pain in muscles and joints. They were also asked about their “ability to cope with seven different, specific requirements at work” to give the researchers an accurate picture of their ability to perform professionally.

One of the key findings of the study was that depressive symptoms and pain in muscles or joints are associated with headache disorders and their ill effect on our ability to work. Handling these depressive symptoms and pain may therefore help reduce the symptoms of people with headache disorders and improve their ability to perform. These findings align with previous research that found a link between headaches, muscle and joint pain, and depressive symptoms.

These findings mean that neck pain may be a warning sign of an impending migraine attack, just as frequent headache attacks may negatively affect mood. Mood changes, in turn, may be indicative of an upcoming headache, the team adds.

The two groups in the study whose ability to work was most affected by migraines were participants who took no painkillers at all, and those who used them daily. This suggests that the two groups are under- and over-treated, respectively. The first is feeling the full debilitating effect of the pain, while the other is likely not receiving the correct medication and may even be suffering the symptoms of medication overuse.

“On the other hand, when you look at the group who does not take medication at all, it seems to indicate that they are undermedicated. And maybe it has to do with the fact that they do not consider their illness to be severe enough to seek medical attention — but that is just our guess,” says Kirsten Nabe-Nielsen.

Based on these findings, the team makes three recommendations. The first is that people take their headaches seriously and visit their doctor for advice and medical treatment, if needed. Secondly, employers should consider steps to adapt work during an employee’s headache attacks, which will reduce absenteeism. Thirdly, people with headache disorders should take steps to handle other types of pain disorders (such as neck-shoulder pain) and protect their mental health, to help prevent headaches as much as possible and protect their quality of life.

The paper “Demand-specific work ability among employees with migraine or frequent headache” has been published in the International Journal of Industrial Ergonomics.

Teenage pregnancy is a big problem. Sex education can help fight it

Sex education is a hotly debated topic in many parts of the world, but researchers are increasingly finding benefits to it. In a new study, researchers found that access to sex education programs can reduce teenage pregnancy.

The study was carried out in the US and compared teenage pregnancy rates in different counties over a 20-year period. Some counties had implemented sexual education programs over a decade ago, while others had not. Results showed that in places where sexual education was introduced, teenage pregnancy dropped significantly.

In particular, teenage pregnancy rates dropped by 1.5% the first year programs were introduced, and by approximately 7% in the fifth year of funding.

“​​Sex education in the United States has been hotly debated among researchers, policy makers, and the public,” says Nicholas Mark, a doctoral candidate in New York University’s Department of Sociology and the lead author of the paper, which appears in the Proceedings of the National Academy of Sciences (PNAS). “Our analysis provides evidence that funding for more comprehensive sex education led to an overall reduction in the teen birth rate at the county level of more than 3 percent.”

The study focused specifically on the Teen Pregnancy Prevention program (TPP), which was initiated in 2010 and awards funding at the county level. This program offers comprehensive information on sex, contraception, and reproductive health. Meanwhile, some counties focus more on abstinence-only programs, which have proven to be largely ineffective.

“We’ve known for some time that abstinence-only programs are ineffective at reducing teen birth rates,” adds Lawrence Wu, a professor in NYU’s Department of Sociology and the paper’s senior author. “This work shows that more wide-reaching sex education programs—those not limited to abstinence—are successful in lowering rates of teenage pregnancy.”

Among developed countries, the US has a relatively high teen pregnancy rate. Although the rate has generally declined over the past two decades, it remains relatively high: three in ten American girls will become pregnant before age 20, adding up to almost 750,000 pregnancies a year.

Previous studies have shown that abstinence-only programs are not only ineffective, but also unethical. In fact, teaching only abstinence can lead to more (not less) teenage pregnancy.

Studies such as this one add more weight to the idea that sex education programs provide tangible benefits to society. Furthermore, studies suggest that teens are having sex earlier than before, which means sex-ed programs are needed more than ever. In addition, a majority of US voters support the introduction of sexual education programs.

The researchers emphasize that the findings are consistent with previous research. Usually, when sex education programs are introduced, there’s a small decline in teenage birth rates, and over time, the rate seems to drop further.

Ultimately, the team concludes, this argues for federal funding toward comprehensive sex education.

The study was published in Proceedings of the National Academy of Sciences.

Your microbiota will be having non-stop sex this Valentine’s Day

Even if you’re alone this Valentine’s Day, there’s no need to worry: some parts of your body will be getting plenty of action. In fact, your body will host a veritable carnival of the sensual in your tummy, as your microbiota will engage in an orgy of sex and swinger’s parties — where they’ll be swapping genes instead of keys.

A medical illustration of drug-resistant Neisseria gonorrhoeae bacteria. Image: Public Health Image Library, Centers for Disease Control and Prevention (public domain).

The salacious gene

Imagine you have a severe disease with a very unusual cure: you can treat it by making love with someone who then passes on the genes needed to cure your ailment. It is, as they say, sexual healing. Using sex to protect or heal themselves is precisely what bacteria can do, and it’s a crucial defense mechanism.

In the past, the research community thought bacterial sex (or conjugation, as scientists call it) was purely a threat to humans, as this ancient process can spread genes conferring antibiotic resistance between neighboring bacteria. Antibiotic resistance is one of the most pressing health challenges the world faces, projected to cause 10 million deaths a year by 2050.

But there’s more to this bacterial sex than meets the eye. Recently, scientists from the University of Illinois at Urbana-Champaign and the University of California Riverside witnessed gut microbes sharing the ability to acquire a life-saving nutrient with one another through bacterial sex. UCR microbiologist and study lead Patrick Degnan says:

“We’re excited about this study because it shows that this process isn’t only for antibiotic resistance. The horizontal gene exchange among microbes is likely used for anything that increases their ability to survive, including sharing vitamin B12.”

For well over 200 years, researchers have known that bacteria reproduce by binary fission, in which one cell divides to produce two genetically identical daughter cells. However, in 1946, Joshua Lederberg and Edward Tatum discovered that bacteria could also exchange genes through conjugation, an act entirely separate from reproduction.

Conjugation occurs when a donor and a recipient bacterium sidle up to each other, whereupon the donor builds a tube called a pilus, which attaches to the recipient and pulls the two cells together. A small parcel of DNA then passes from the donor to the recipient, providing new genetic information through horizontal transfer.

Ironically, it wasn’t until Lederberg met and fell in love with his wife, Esther Lederberg, that they made progress regarding bacterial sex.

Widely acknowledged as a pioneer of bacterial genetics, Esther still struggled for recognition despite identifying both the horizontal transfer of antibiotic resistance and viruses that kill bacteria, known as bacteriophages. She discovered these phages after noticing small objects nibbling at the edges of her bacterial colonies. Investigating how they got there, she found the viral interlopers lying dormant among bacterial chromosomes, having been transferred by microbes during sex.

Later work found that environmental stresses such as illness activated these viruses to replicate within their hosts and kill them. Still, scientists assumed that bacterial sex was purely a defense mechanism.

Esther Lederberg in her Stanford lab. Image credits: Esther Lederberg.

Promiscuity means longevity

The newly published study builds on Esther’s work, as the authors suspected this bacterial process extended beyond antibiotic resistance. They started by investigating how vitamin B12 was getting into gut microbial cells that lacked the machinery to extract the vitamin from their environment. This was puzzling because, without vitamin B12, most types of living cells cannot function, so many questions remained about how these organisms survived in the intestine.

The new study, in Cell Reports, focuses on Bacteroidetes, a group of bacteria that can comprise up to 80% of the human gut microbiome, where they break down complex carbohydrates for energy.

“The big, long molecules from sweet potatoes, beans, whole grains, and vegetables would pass through our bodies entirely without these bacteria. They break those down so we can get energy from them,” the team explained.

The bacteria were placed in lab dishes, mixing strains that could extract B12 from their environment with strains that couldn’t. The team then watched in awe as the bacteria formed their sex pili to transfer the genes enabling B12 extraction. After the experiment, the researchers examined the total genetic material of the recipient microbes and found they had incorporated an extra band of DNA from the donor.

Something similar happens in living mice. When the group administered two different subgroups of Bacteroidetes to a mouse – one that possessed the genes for transporting B12 and another that didn’t – they found the genes had ‘jumped’ to the recipient after five to nine days.

“In a given organism, we can see bands of DNA that are like fingerprints. The recipients of the B12 transporters had an extra band showing the new DNA they got from a donor,” Degnan said.

Remarkably, the team also noted that different species of phages were transferred during conjugation, in some cases showing bacterial subgroup specificity. These viruses can also alter the genomic sequence of their bacterial hosts and, when activated, have the power to help or harm their microbial vessel.

Sexual activity in our intestines keeps us healthy

Interestingly, the authors note that they could not observe conjugation in all subgroups of Bacteroidetes, suggesting that growth factors in the intestine, or a possible barrier between subgroups within this large group, may slow the process down.

Despite this, Degnan states, “We’re excited about this study because it shows that this process isn’t only for antibiotic resistance.” And that “The horizontal gene exchange among microbes is likely used for anything that increases their ability to survive, including sharing [genes for the transport of] vitamin B12.”

This means that bacterial sex doesn’t just occur when microbes are under attack; it happens all the time. And it’s probably part of what keeps the microbiome, and by extension ourselves, fit and healthy.

Stop feeling dizzy after suddenly standing up with these two simple movements

Credit: Dysautonomia Today.

Abruptly standing up after sitting or lying down can induce a form of low blood pressure that can make us feel light-headed or woozy. It’s a common phenomenon with an uncommon name: orthostatic hypotension.

Although it’s generally harmless and short-lived, orthostatic hypotension can sometimes cause people to faint and some individuals experience it routinely, which affects their daily function. If severe, the dizzying phenomenon can also be a sign of an underlying health condition.

Satish Raj, a heart rhythm cardiologist at the University of Calgary in Canada, sees severe cases of orthostatic hypotension almost daily at his clinic. Apart from telling patients to drink more water or switch medications, there wasn’t much he could do to improve their symptoms, so he wondered whether he could devise a new intervention.

When we stand up after sitting or lying down for a while, blood rushes towards our legs because of gravity. But the body also has to work to push blood upward to supply the brain with oxygen. The sudden activation of the leg muscles causes blood vessels to open wider for a few moments in order to compensate for the abrupt uptick in demand, which can cause a rapid drop in blood pressure and accompanying dizziness.

Raj and colleagues believed this orthostatic hypotension could be avoided if the muscle reflex was activated early. They put this idea to the test in an experiment involving 22 volunteers with severe orthostatic hypotension who performed two simple types of movements.

One method involved raising the knees one at a time for up to 30 seconds while seated before standing up. The other involved tensing the lower limbs, by crossing the legs and clenching the thighs and butt.

Credit: Heart Rhythm.

Compared with standing up with no intervention, the two methods improved the volunteers’ blood circulation and their self-reported orthostatic hypotension symptoms.

“It’s free, it doesn’t have any drug side effects, and it’s totally within their control, which I think a lot of patients like,” Raj told Gizmodo.

As a caveat, the trial involved a small sample size of fewer than two dozen participants — all of whom were women. The volunteers were selected on a first-come-first-served basis and the fact that women jumped on the opportunity so quickly may suggest they are disproportionately affected by orthostatic hypotension.

This is why the researchers would like to conduct large-scale trials to determine the efficacy of the new therapy and prompt government and health-related organizations to endorse the two techniques. In the meantime, Raj has introduced the techniques to his patients at the clinic, most of whom have informally reported similar success to the study participants.

The findings were reported in the journal Heart Rhythm.

Doctors overlook a curable cause of high blood pressure

Credit: Pixabay.

In early 2013, after Erin Consuegra gave birth to her second child at age 28, her health nosedived. She developed worrying symptoms, including extreme fatigue, a fluttering heartbeat, and high blood pressure. She said her doctor prescribed blood pressure medication and chalked it up to stress.

But Consuegra, an elementary school teacher by training, didn’t buy it. “It’s like, you think staying home all day with two kids is causing these real medical issues?” she said. “It was offensive to just write it all off to stress and anxiety.”

Researching her symptoms online and through family members in the medical field, Consuegra learned of a little-known syndrome called primary aldosteronism, in which one or both adrenal glands, small structures that sit atop the kidneys, overproduce a hormone called aldosterone. Aldosterone increases blood pressure by sending sodium and water into the bloodstream, increasing blood volume. It also lowers potassium, a mineral that Consuegra was deficient in.

Her primary care physician agreed to run a blood test to screen for the condition but insisted that the result was normal and balked at Consuegra’s request to see a specialist. “She took it as me questioning her,” Consuegra said. Getting a referral, she added, “took a lot of fighting, a lot of tears, a lot of advocacy on my part.”

Consuegra’s story has a relatively happy ending. Doctors at Vanderbilt University Medical Center eventually diagnosed her with primary aldosteronism and found a small noncancerous tumor, or adenoma, in one of her adrenal glands — known to often be a cause of the condition. After doctors removed the gland in July 2014, her symptoms disappeared.

Millions of other patients are not so lucky. More than six decades after primary aldosteronism was first described in the medical literature, less than 1 percent of cases are diagnosed and treated despite evidence that it is a common cause of high blood pressure, or hypertension.

The syndrome shows up in people with mild, moderate, and severe hypertension — and even in those with normal blood pressure — according to a comprehensive 2020 study. “The prevalence of primary aldosteronism is high and largely unrecognized,” the study authors wrote in the Annals of Internal Medicine, adding that it may account for high blood pressure that has no identifiable cause and is typically attributed to genetics, poor diet, lack of exercise, and obesity.

Closing the diagnosis and treatment gap poses a series of challenges, experts say. Many physicians haven’t gotten the message that primary aldosteronism is common, so they don’t look for it. Screening tests can be tricky to interpret and miss a lot of cases. Complicating matters, primary care groups, whose members treat the bulk of hypertension, have so far declined to help develop relevant guidelines. Research on the syndrome lags behind other diseases, and only a few health systems have a cadre of knowledgeable specialists who provide coordinated care.

Clinicians may dismiss telltale symptoms, leaving patients to turn to Google, bounce from doctor to doctor, or go undiagnosed for years. “Unfortunately, I think my story is super-typical,” said Consuegra, whose frustrations led her to start a patient Facebook group. “I don’t think anyone has had an easy road to diagnosis.”

As a result, patients take standard blood pressure medications that do little or no good and miss out on effective treatments that include not only surgery but low-salt diets and targeted drugs. Missed diagnoses pose additional dangers: Excess aldosterone is toxic to the heart, blood vessels, kidneys, and other organs. Compared to patients with garden-variety hypertension, those with primary aldosteronism have greater risk of kidney disease, heart failure, coronary artery disease, and stroke.

With nearly half of U.S. adults, or 116 million people, classified as having high blood pressure, some experts have warned of a public health crisis hidden in plain sight — one that will demand widespread changes in hypertension treatment. They’ve called on clinicians to increase their vigilance and more readily prescribe drugs that block aldosterone’s effects.

“My personal frustration is seeing patients who’ve clearly had primary aldosteronism for more than a decade and now have irreversible kidney damage,” which may require dialysis, said endocrinologist William Young Jr. of the Mayo Clinic. Young treats about 250 primary aldosteronism patients a year but “compared to what’s going on out there,” he said, “that’s minuscule.”

The push for greater recognition of primary aldosteronism isn’t new. Since 2008, the Endocrine Society, a medical organization dedicated to the advancement of hormone science and public health, has recommended screening patients who have red flags such as low potassium, an adrenal mass that shows up on a scan, or drug-resistant hypertension — defined as blood pressure that is uncontrolled despite the patient taking three different kinds of antihypertensive medications at their maximally tolerated doses. Other red flags include a family history of early-onset hypertension or a stroke before age 40. In 2017, the American College of Cardiology and the American Heart Association incorporated the directive into a hypertension treatment guideline.

Screening usually entails a roughly $150 blood test called the aldosterone-to-renin ratio, or ARR. Renin is an enzyme produced by the kidneys that triggers a chain reaction that leads to aldosterone production. When renin is low, aldosterone should be low. But in people with primary aldosteronism, aldosterone can be elevated even when renin is low.
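Since the ratio itself is simple arithmetic, a small sketch may make the screening logic concrete. Everything below is illustrative only: the units are one common convention, and the cutoff is a placeholder rather than a clinical threshold, since labs use different assays and reference ranges.

```python
# Illustrative only: computes an aldosterone-to-renin ratio (ARR).
# Units and the cutoff are placeholders, not clinical guidance.

def aldosterone_renin_ratio(aldosterone_ng_dl: float, renin_ng_ml_hr: float) -> float:
    """Return the ARR given plasma aldosterone (ng/dL) and plasma renin activity (ng/mL/hr)."""
    if renin_ng_ml_hr <= 0:
        raise ValueError("renin activity must be positive")
    return aldosterone_ng_dl / renin_ng_ml_hr

HYPOTHETICAL_CUTOFF = 30.0  # placeholder value for demonstration

# A low renin with a normal-looking aldosterone still yields a high ratio:
ratio = aldosterone_renin_ratio(aldosterone_ng_dl=21.0, renin_ng_ml_hr=0.6)
print(round(ratio, 1))               # 35.0
print(ratio > HYPOTHETICAL_CUTOFF)   # True
```

The point the sketch captures is the one in the text: when renin is suppressed, even a modest aldosterone level produces an elevated ratio.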

A positive ARR can be followed by additional tests to confirm the diagnosis and determine whether surgery is an option. If one gland is secreting excess aldosterone, removing that gland may cure or improve the disease. Usually both glands are affected, in which case surgery isn’t recommended and patients take one of two drugs that block aldosterone.

But physicians haven’t followed the guidelines. Recent U.S. studies found ARR screening rates for high-risk patients ranging from 1.3 percent in an urban health system to 3.3 percent at an academic medical center. The largest analysis, which was published in 2021 and involved 269,010 patients with drug-resistant hypertension treated in the U.S. Veterans Health Administration, revealed that just 1.6 percent were tested.

The data show primary aldosteronism is “not top of mind for gatekeepers of hypertension,” said Vivek Bhalla, a kidney specialist who directs the Stanford Hypertension Center. Bhalla said he was astounded when a 2020 analysis he led revealed that just 2.1 percent of patients with drug-resistant hypertension were screened.

Yet even those tiny percentages may downplay the problem because they don’t account for people without recognizable risk factors, who may nonetheless be on a path to developing severe disease. Some experts suggest studying the cost-effectiveness of expanding the population of patients who should be screened, a point underscored by the 2020 Annals study, which estimated that the syndrome affects one in six people with mild hypertension and one in five with moderate hypertension.

More troubling, the study showed that ARR fails to detect a large fraction of cases, yielding a positive result in people who have the condition as little as 22 percent of the time. False positive results, on the other hand, are uncommon. The authors wrote that ARR “can be a simple and useful screening method” but cautioned against overreliance, noting that arbitrarily high cutoff values and aldosterone’s tendency to fluctuate likely contribute to underdiagnosis.

Those revelations “really changed the whole landscape,” said Sandra Taler, a Mayo Clinic kidney and hypertension specialist who was not involved in the research. She added that she’s become “more meticulous” in looking for primary aldosteronism as a result. “The point of this study is there may not be any clues and it could still be present,” she said. “And if you don’t look for it you won’t find it.”

Experts have put forth various explanations for the lack of screening, including the complexity of the process and concerns over expensive follow-up procedures. Given the sheer volume of hypertension patients, physicians typically don’t focus on finding root causes. “The temptation for a physician seeing a new patient with hypertension is to say — ‘Let’s just start off with getting your blood pressure down, and then take it from there,’” Australian medical researcher John Funder, who led the Endocrine Society’s most recent guideline effort, wrote in a 2020 editorial in Hypertension.

There are also historic misperceptions that primary aldosteronism is rare and characterized by symptoms such as potassium deficiency. University of Michigan physician Jerome Conn is credited with first describing the syndrome in medical literature in 1956 based on a woman with extreme symptoms that included temporary and occasional paralysis from the hips down. Although Conn and others postulated that rogue aldosterone production is a common cause of hypertension, it took until the 1980s for diagnostic advances to confirm their hunch.

In his editorial, Funder cited “residual ignorance” from the days when medical schools taught that primary aldosteronism was a mild and rare form of hypertension affecting less than 1 percent of patients. Others cite ongoing gaps in educating physicians who think it is too complicated or don’t know they should be testing people for primary aldosteronism. Specialty societies have not paired screening recommendations with aggressive efforts to educate physicians about the disorder’s prevalence, acknowledged Robert Carey, a professor of medicine at the University of Virginia School of Medicine and Endocrine Society past president, who helped develop the guidelines.

At some institutions, that’s changing. Varun Sharma, an associate professor of general internal medicine at Georgetown University, said he wasn’t taught how or when to diagnose primary aldosteronism during his medical training. A few years ago he began testing some patients with hypertension and was surprised by frequent positive results. “That was what made me push and also made me feel comfortable telling residents that we ought to be screening more,” he said.

Similarly, Bradley Changstrom, an assistant professor of medicine at the University of Colorado School of Medicine, doesn’t recall learning about primary aldosteronism as a common cause of hypertension when he was a resident. But he said, “Once I started looking for it I started finding it all the time, practically speaking once a month or so.”

“I think if physicians realize how common this truly is,” he added, “they would start to look for it more often.”

To increase detection, experts have suggested removing a requirement that patients take a hiatus from blood pressure medications prior to screening, liberalizing cutoffs for a positive ARR result, and bypassing ARR for urine excretion tests, which are more reliable but cost more. Some have suggested wider prescribing of drugs to treat primary aldosteronism, even as a first-line hypertension therapy.

Carey said it will be critical to involve primary care societies — including the American Academy of Family Physicians and the American College of Physicians — in developing the next guideline, which he said will take at least two years. He said their endorsement would provide “the strongest message regarding the validity of the recommendations” but such collaboration can be challenging because societies “want to keep their guidelines under their control.”

Primary care groups declined to participate in a multi-society task force that developed the 2017 ACC/AHA hypertension guideline, which famously expanded the definition of hypertension to include about 30 million more U.S. adults and endorsed primary aldosteronism screening.

The ACP and the AAFP declined interview requests from Undark. In an email, the AAFP said it updates its members on research and “would welcome the Endocrine Society to reach out to us directly to discuss guideline opportunities.”

Greater focus on excess aldosterone could advance national progress on blood pressure control, which has stalled, according to the 2020 U.S. Surgeon General’s Call to Action to Control Hypertension. Although that 48-page document, like much public health messaging, doesn’t mention primary aldosteronism or aldosterone, it notes that only about one in four U.S. adults with hypertension has it under control. Hypertension is a leading risk factor for heart disease and contributes to half a million U.S. deaths annually. Primary aldosteronism, Taler said, “opens up a whole area of research in terms of looking for the cause of high blood pressure.”

More detection won’t be a silver bullet. No health care system is prepared for a glut of newly diagnosed primary aldosteronism patients, said Carey. Only a handful of U.S. medical centers have a cadre of relevant experts — particularly scarce are radiologists adept at a procedure to determine whether surgery is feasible. Care is also often uncoordinated. Bhalla said he created Stanford’s hypertension center in 2015 because “it was clear that there was no expert that had taken these people under their wing,” referring to patients with primary aldosteronism, but the problem isn’t unique to his institution. “We practice in these silos in medicine,” he said. “And that is not healthy for patient care.”

Sweeping improvements are needed in diagnosis and treatment, said Marianne Leenaerts, co-founder of the Primary Aldosteronism Foundation, a patient group launched in 2019. The only drug approved by the U.S. Food and Drug Administration to treat primary aldosteronism, spironolactone, was developed in the late 1950s. It usually lowers blood pressure but has nasty side effects that include erectile dysfunction and painful breast growth in men, and irregular menstrual cycles in women. Another drug of the same class, eplerenone, is prescribed off-label. The Endocrine Society’s guideline notes that eplerenone has fewer side effects but is less potent than spironolactone and must be taken more often.

Two new classes of drugs are in testing. Clinical trials are underway for new scanning techniques and procedures that could spare the adrenal gland.

Yet for millions of patients, advances are slow in coming.

Leenaerts, who lives in Canada, believes she had primary aldosteronism for 25 years before it was diagnosed in 2017. Both of her adrenal glands produce excess aldosterone, which means she is not a candidate for surgery, and she does not tolerate either available drug. Instead, she tries to manage her disease with a standard blood pressure-lowering drug and a strict low-sodium, high-potassium diet. At age 58, her liver and kidney functions are declining, and she has insomnia, difficulties with memory and focus, and painful inflammation. Primary aldosteronism has cut short her productive years, she said. While new drugs might help, she added, “At the speed at which I’m declining, it may be too late for me.”

Mary Chris Jaklevic is a veteran health care journalist based in the Midwest.

This article was originally published on Undark. Read the original article.

Pap tests could one day tell women if they have breast or ovarian cancer

Experts have identified changes in cervical cells that can help detect tumors elsewhere in the body. Pap tests involve scraping cells from the cervix to detect any abnormalities that could lead to cervical cancer. But researchers from the University of Innsbruck and the gynecological cancer research charity The Eve Appeal found the cells from this test can also give clues and alerts for other types of cancer. With further development, they say, the method could one day predict the risk of developing ovarian, breast, womb, and cervical cancers from a straightforward pap smear test.

They developed their system by reading DNA methylation, epigenetic modifications to DNA that don’t alter the genetic sequence but do influence whether a gene is switched on or off: in this case, promoting or preventing cancer in the body. These modifications leave ‘methylation markers’, or signatures, on genomic regions that scientists can read to determine what has occurred within a person’s body throughout their lifetime. Akin to the rings of a tree, this method can provide chronological clues as to what has happened in our biological life.

Researchers created the test, dubbed WID (Women’s Risk Identification), to analyze markers left by cancerous activity in the DNA of cervical cells. By calculating a woman’s WID, they hope to identify those with a high risk of developing ovarian, breast, womb, or cervical cancers: providing an early-warning system for medical teams to increase treatment outcomes.

The team was able to spot these modifications because they matched DNA markers found in diseased cervical, breast, ovarian, and womb biopsy tissue (obtained through a highly invasive procedure) to those found in the easier-to-access cells of the cervix, whose similar biological structures undergo the same hormonal changes as the tissues these cancers flourish in.

Finding cancer through the cervix

The first study examined cervical cell samples collected from 242 women with ovarian cancer and 869 healthy controls. To develop the WID risk scale, the scientists measured 14,000 epigenetic changes to identify ovarian cancer’s unique DNA signature to spot the presence of the disease in epithelial tissue scraped from the cervix.

They then validated the signature in an additional cohort of 47 women who had ovarian cancer and 227 healthy subjects. The test detected the disease in 71% of affected women under 50 and roughly 55% of affected volunteers older than 50, with an overall specificity of 75%. A test’s specificity is its ability to correctly identify people without the disease.
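To make the two measures concrete, here is a toy confusion-matrix calculation. The counts below are invented purely for illustration; they are not the study’s data.

```python
# Toy example of sensitivity and specificity from a confusion matrix.
# All counts are invented for illustration, not taken from the study.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of people WITH the disease the test correctly flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of people WITHOUT the disease the test correctly clears."""
    return true_neg / (true_neg + false_pos)

# Suppose 40 of 60 patients test positive, and 150 of 200 healthy people test negative:
print(round(sensitivity(true_pos=40, false_neg=20), 2))   # 0.67
print(round(specificity(true_neg=150, false_pos=50), 2))  # 0.75
```

Sensitivity and specificity are independent: a test can clear most healthy people (high specificity) while still missing a sizable share of true cases.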

Professor Martin Widschwendter of the University of Innsbruck and UCL, heading up the research, said the findings suggest their WID index is picking up cancer predisposition, adding that the results were similar to a study on women with cancer of the womb. He is adamant their test cannot yet predict ovarian cancer, with more studies needed.

A possible screening method for an undetectable cancer 

In the second study, the same team analyzed epigenetic changes in cervical cell samples provided by 329 women with breast cancer against those from the same 869 healthy volunteers in the first study. Using the WID index, they were able to identify women with breast cancer based on a unique epigenetic signature. The group once again confirmed these markers in a smaller cohort of 113 breast cancer patients and 225 women without this condition.

The researchers also used the patterns to predict whether patients had breast cancer, but they didn’t say exactly how accurate the tests were. Instead, they stressed that further trials are needed, in the hope that clinicians could one day use the WID as a regular test for women, specifically those under fifty who do not have access to screening for this disease.

“This research is incredibly exciting,” said Liz O’Riordan, a breast cancer surgeon who was also diagnosed with this disease. “At the moment, there is no screening test for breast cancer in women under the age of 50. If this test can help pick up women with a high risk of developing breast, ovarian, cervical, and uterine cancer at a younger age, it could be a game-changer.”

The team adds that these findings are also crucial for ovarian cancer, whose symptoms can be as benign as a bloated abdomen. The biggest killer among gynecological tumors, the disease is diagnosed late in an alarming three out of four cases.

But for now, Widschwendter says, the findings suggest that the molecular signatures in cervical cells may detect the predisposition to other women-specific cancers rather than providing a solid prediction of the disease.

Because of the pandemic, women have stopped taking pap tests

A pap smear test detects abnormal cells on the cervix, which is the entrance to the uterus from the vagina. Removing these cells can prevent cervical cancer, which most commonly affects sexually active women aged between 30 and 45. In most cases, the human papillomavirus causes this cancer after being acquired through unprotected sex or skin-to-skin contact. In short, the whole point of these tests is to detect women at risk of developing cancer and encourage them to carry out further health check-ups, not to find those already displaying cancer symptoms.

Around the world, the number of women taking smear tests dropped substantially during the pandemic. In England, for instance, one of the countries with the highest testing rates, just 7 out of 10 eligible women got a cervical check-up. Conditions are expected to worsen due to a new policy introduced by the UK government at the start of 2022, which increased the interval between tests for all eligible women in Wales from three to five years. The government expects to roll out the policy in England this year, the pandemic having delayed its initial release. Experts insisted the move was safe, but campaigners hit back at the plans, arguing it would cause preventable deaths by delaying the detection of cancer or pre-cancerous conditions.

In a statement to the Guardian, the UK’s Secretary for Patient Safety and Primary Care said it’s “great to see how this new research could help alert women who are at higher risk to help prevent breast, ovarian, womb, and cervical cancer before it starts.” In the meantime, she added, cancer screening remains vital, and she urged all women aged 25 and above to attend their appointments when invited. The secretary did not remark on the new government policy.

An ovarian cancer specialist urged caution in interpreting the data: they show a “moderate association” between the methylation signature and ovarian cancer, said Dr. Rebecca Stone, director of the Kelly Gynecologic Oncology Service at Johns Hopkins Hospital. “They are not showing that it’s predictive or diagnostic,” Stone stressed, clarifying that to see whether the cervical cell signature predicts cancer, a study would have to follow a large group of women over a long period.

Filling the gap in screening options for women

In contrast, Athena Lamnisos, CEO of the Eve Appeal, emphasizes the importance of a new screening tool:

“Creating a new screening tool for the four most prevalent cancers that affect women and people with gynae organs, particularly the ones which are currently most difficult to detect at an early stage, from a single test could be revolutionary.”

The Eve Appeal adds that, in the future, women could get separate risk scores for each of the four cancers, and medical teams could offer those with high scores more active monitoring, regular mammograms, risk-reducing surgery, or therapeutics.

Ultimately, it’s better to prevent than to treat, and this method could offer women worldwide access to proper screening services that could save lives through the application of early intervention and preventative medicine.

Don’t drink milk? Here’s how to get enough calcium and other nutrients

Cow’s milk is an excellent source of calcium which, along with vitamin D, is needed to build strong, dense bones.

Milk also contains protein, the minerals phosphorus, potassium, zinc and iodine, and vitamins A, B2 (riboflavin) and B12 (cobalamin).

As a child, I drank a lot of milk. It was delivered in pint bottles to our front steps each morning. I also drank a third of a pint before marching into class as part of the free school milk program. I still love milk, which makes getting enough calcium easy.

Of course, many people don’t drink milk for a number of reasons. The good news is you can get all the calcium and other nutrients you need from other foods.

What foods contain calcium?

Dairy products such as cheese and yoghurt are rich in calcium, while non-dairy foods including tofu, canned fish with bones, green leafy vegetables, nuts and seeds contain varying amounts.

Some foods are fortified with added calcium, including some breakfast cereals and soy, rice, oat and nut “milks”. Check their food label nutrition information panels to see how much calcium they contain.

Tofu is an excellent source of calcium. Image credits: Anh Nguyen.

However, it’s harder for your body to absorb calcium from non-dairy foods. Your body does get better at absorbing calcium from plant foods, and also when your total calcium intake is low, but the overall effect means that if you don’t eat dairy foods, you may need to eat more calcium-containing foods to maximize your bone health.

How much calcium do you need?

Depending on your age and sex, the daily calcium requirements vary from 360 milligrams per day to more than 1,000 mg for teens and older women.

One 250ml cup of cow’s milk contains about 300mg of calcium, which is equivalent to one standard serve. This same amount is found in:

  • 200 grams of yoghurt
  • 250 ml of calcium-fortified plant milks
  • 100 grams of canned pink salmon with bones
  • 100 grams of firm tofu
  • 115 grams of almonds.
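Using the 300 mg-per-serve convention above, tallying a day’s intake is simple arithmetic. A minimal sketch, using only the per-portion equivalences listed in this article:

```python
# Convert a day's calcium-containing foods into "serves",
# where 300 mg of calcium counts as one serve (the article's convention).
# Per-portion figures below are the article's listed equivalences.

MG_PER_SERVE = 300

# mg of calcium per portion eaten today (each listed portion ≈ one serve)
portions_eaten = {
    "cow's milk, 250 ml": 300,
    "yoghurt, 200 g": 300,
    "firm tofu, 100 g": 300,
}

total_mg = sum(portions_eaten.values())
print(total_mg)                  # 900
print(total_mg / MG_PER_SERVE)   # 3.0
```

Three such portions add up to 3 serves, which sits within the recommended ranges quoted below for most adults.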

The recommended number of daily serves of dairy and non-dairy alternatives varies:

  • children should have between 1 and 3.5 serves a day, depending on their age and sex
  • women aged 19 to 50 should have 2.5 serves a day, then 4 serves when aged over 50
  • men aged 19 to 70 should have 2.5 serves a day, then 3.5 serves when aged over 70.

However, the average Australian intake is just 1.5 serves per day, with only one in ten achieving the recommendations.

What other nutrients do you need?

If you don’t drink milk, the challenge is getting enough nutrients to have a balanced diet. Here’s what you need and why.


Protein

Food sources: meat, poultry, fish, eggs, nuts, seeds, legumes, dried beans and tofu.

Needed for growth and repair of cells, and to make antibodies, enzymes and specific transport proteins that carry chemical messages throughout the body.


Phosphorus

Food sources: meat, poultry, seafood, nuts, seeds, wholegrains, dried beans and lentils.

Builds bone and teeth, supports growth and repair of cells, and is needed for energy production.

Whole grains are also a good source of calcium. Image credits: Gabriella Clare Marino.


Potassium

Food sources: leafy green vegetables (spinach, silverbeet, kale), carrots, potatoes, sweet potatoes, pumpkin, tomatoes, cucumbers, zucchini, eggplant, beans and peas, avocados, apples, oranges and bananas.

Needed to activate cells and nerves. Maintains fluid balance and helps with muscle contraction and regulation of blood pressure.


Zinc

Food sources: lean meat, chicken, fish, oysters, legumes, nuts, wholemeal and wholegrain products.

Helps with wound healing and the development of the immune system and other essential functions in the body, including taste and smell.


Iodine

Food sources: fish, prawns, other seafood, iodised salt and commercial breads.

Needed for normal growth and brain development, and used by the thyroid gland to make the hormone thyroxine, which is needed for growth and metabolism.

Vitamin A

Food sources: eggs, oily fish, nuts, seeds. (The body can also make vitamin A from beta-carotene in orange and yellow vegetables and green leafy vegetables.)

Needed for antibody production, maintenance of healthy lungs and gut, and for good vision.

Vitamin B2 (riboflavin)

Food sources: wholegrain breads and cereals, egg white, leafy green vegetables, mushrooms, yeast spreads, meat.

Needed to release energy from food. Also supports healthy eyesight and skin.

Vitamin B12 (cobalamin)

Food sources: meat, eggs and most foods of animal origin, some fortified plant milks and fortified yeast spreads (check the label).

Needed to make red blood cells, DNA (your genetic code), myelin (which insulates nerves) and some neurotransmitters needed for brain function.

When might you need to avoid milk?

Reasons why people don’t drink milk range from taste and personal preference to animal welfare and environmental concerns. Or it could be due to health conditions, or concerns about intolerance, allergy and acne.

Lactose intolerance

Lactose is the main carbohydrate in milk. It’s broken down into simple sugars by an enzyme in the small intestine called lactase.

Some people are born without the lactase enzyme or their lactase levels decrease as they age. For these people, consuming foods containing a lot of lactose means it passes undigested along the gut and can trigger symptoms such as bloating, pain and diarrhoea.

Research shows small amounts of lactose – up to 15 grams daily – can be tolerated without symptoms, especially if spread out over the day. A cup of cow’s milk contains about 16 grams of lactose, while a 200g tub of yoghurt contains 10g, and 40g of cheddar cheese contains less than 1g.
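The figures above lend themselves to a quick tally against the roughly 15-gram daily tolerance. A minimal sketch using this article’s numbers (individual tolerance varies, and this is not dietary advice):

```python
# Tally daily lactose intake against the ~15 g/day tolerance cited above.
# Per-portion figures come from the article; tolerance varies by person.

TOLERANCE_G = 15

intake_g = {
    "yoghurt, 200 g tub": 10,
    "cheddar cheese, 40 g": 1,  # "less than 1 g" in the article, rounded up here
}

total = sum(intake_g.values())
print(total)                  # 11
print(total <= TOLERANCE_G)   # True, within the tolerated amount if spread over the day
```

Swap the yoghurt for a full cup of milk (about 16 g) and the same tally already exceeds the threshold on its own, which is why spreading smaller portions across the day matters.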

Cow’s milk allergy

Cow’s milk allergy occurs in about 0.5-3% of one year olds. By age five, about half are reported to have grown out of it, and 75% by adolescence. However, one survey found 9% of pre-school children had severe allergy with anaphylaxis.

Symptoms of cow’s milk allergy include hives, rash, cough, wheeze, vomiting, diarrhoea or swelling of the face.

Symptom severity varies, and can happen immediately or take a few days to develop. If a reaction is severe, call 000, as it can be a medical emergency.


Acne

The whey protein in cow’s milk products, aside from cheese, triggers an increase in insulin, a hormone released into the bloodstream that regulates blood sugar.

Meanwhile, milk’s casein protein triggers an increase in another hormone, called insulin-like growth factor (IGF), which influences growth.

These two reactions promote the production of hormones called androgens, which can lead to a worsening of acne.

If this happens to you, then avoid milk, but keep eating hard cheese, and eat other foods rich in calcium regularly instead.

While milk can be problematic for some people, for most of us, drinking milk in moderation, in line with the recommendations, is the way to go.

Clare Collins, Laureate Professor in Nutrition and Dietetics, University of Newcastle

This article is republished from The Conversation under a Creative Commons license. Read the original article.

UK man becomes first patient to receive experimental cancer vaccine

Graham Booth, a 54-year-old man from the UK who has long battled head and neck cancer, has now received the first dose of what will be a year-long vaccine treatment against his recurring cancer. The injections are tailor-designed to match his DNA, and it is hoped they will trigger Booth’s immune system to eliminate the cancer permanently.

Image credit: Pixabay.

Cancer is a major health problem in the UK, as in most parts of the world. In the UK, a country of approximately 68 million people, there are around 375,000 cases every year, or roughly 1,000 per day. Over half of the cases are breast, prostate, lung, and bowel cancers. On a global scale, 18 million cases of cancer were reported in 2020, and cancer is among the leading causes of death worldwide.

But we’re getting better and better at dealing with it. Now, researchers are increasingly looking at ways to get the patient’s body to fight cancer.

The immune system keeps track of the substances normally present in the body; when an unfamiliar one appears, the immune system attacks it. Cancer cells, however, pose a particular challenge: because cancer starts when normal cells become altered and grow out of control, malignant cells still closely resemble healthy ones, making them difficult for the immune system to identify.

Researchers have become increasingly optimistic about immunotherapy, a treatment that uses parts of a person’s own immune system to fight the disease. Innovative approaches are being tested and approved, and new ways of working with the immune system are being discovered at a fast pace, including the use of new vaccines.

Cancer treatment vaccines are quite different from the ones that work against viruses. Instead of preventing a disease, they try to get the immune system to attack cancer cells in the body. Some vaccines are made up of cancer cells or pure antigens. Sometimes the immune cells from a patient are used in the lab to create the vaccine. This is exactly the case in the new study.

Trying out a new approach

In the UK, Graham Booth was first diagnosed with head and neck cancer in 2011. The disease then returned four times, each recurrence requiring difficult treatment, including facial surgery and radiotherapy. Now, Booth is optimistic about the new treatment. He’ll undergo a year-long course of injections at the UK’s Clatterbridge Cancer Centre.

“This clinical trial has opened new doorways and gives me a bit of hope that my cancer won’t come back. And this could open doorways for other people. I’m hopefully looking at a brighter future. A bit of hope that it never returns again – which would mean the world to my family and everyone around me,” he said in a statement. 

Researchers behind the trial expect the immunotherapy injections to produce fewer negative side effects than traditional cancer therapies such as radiotherapy and chemotherapy. This is because healthy tissue and cells won’t be damaged, Christian Ottensmeier, chief investigator for the trial, said in a statement.

For Ottensmeier, the research could be game-changing, especially considering that receiving this treatment a few years ago was seen as “science fiction.” He said the treatment will make “a real difference” for the people at Clatterbridge and beyond, highlighting its “meaningful benefits” as opposed to the “meaningful side-effects” of other treatments.

It’s not the first time the idea of using vaccines to treat cancer has been floated. There are already some vaccines used to treat some forms of cancer, but this is a nascent field with plenty of room for progress still left.

Facebook ads can be used to gauge cultural similarity between countries

The cultural similarity between countries and international migration patterns can be measured quite reliably using Facebook data, a new study reports.

Image via Pixabay.

“Cultural hotspot” isn’t the first thing that pops into mind when thinking about social media for most of us. However, new research from the Max Planck Institute for Demographic Research in Rostock, Germany shows that data from Facebook can be used to gauge cultural closeness between countries, and overall migration trends.

And the way to do it is to track ads for food and drink on the platform.

We are what we eat

“[A] few years ago, after reading a work of a colleague using data from the Facebook Advertising Platform, I was surprised to find how much information we share online and how much these social media platforms know about us,” said Carolina Coimbra Vieira, a Ph.D. student in the Laboratory of Digital and Computational Demography at the Max Planck institute and lead author of the research, in an email for ZME Science.

“After that, I decided to work with this social media data to propose new ways of answering old questions related to society. In this specific case, I wanted to propose a measure of cultural similarity between countries using data regarding Facebook users’ food and drink preferences.”

For the study, the team developed a new approach that uses Facebook data to gauge cultural similarity between countries, by making associations between immigration patterns and the overall preference for food and drink across various locations.

They employed this approach as migrants have a very important role to play in shaping cultural similarities between countries. However, they explain, it’s hard to study their influence directly, in part because it is hard to ‘measure’ culture reliably. The traditional way of gauging culture comes in the form of surveys, but these have several drawbacks such as cost, the chances of bias in question construction, and difficulties in applying them to a large sample of countries.

The team chose to draw on previous findings that show food and drink preferences may be a proxy for cultural similarities between countries, and build a new analytical method based on this knowledge. They drew on Facebook’s top 50 food and drink preferences in various countries — as captured by the Facebook Advertising Platform — in order to see what people in different areas liked to dine on.

“This platform allows marketers and researchers to obtain an estimate of the number of Facebook monthly active users for a proposed advertisement that matches the given input criteria based on a list of demographic attributes, such as age, gender, home location, and interests, that can be customized by the advertiser,” Vieira explained for ZME Science. “Because we focus on food and drink as cultural markers, we selected the interests classified by Facebook as related to food and drink. We selected the top 50 most popular foods and drinks in each one of the sixteen countries we analyzed to construct a vector indicator of each country in terms of these foods and drinks to finally measure the cultural similarity between them.”
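The comparison described above — a vector per country over a shared vocabulary of food and drink interests, then a similarity score between vectors — can be sketched in a few lines. The snippet below is a minimal illustration of that idea, not the authors’ actual pipeline; the country names, interest lists, and audience counts are hypothetical placeholders, and cosine similarity stands in for whatever measure the study used.

```python
from math import sqrt

def interest_vector(audience_counts, vocabulary):
    """Build a numeric vector of audience counts over a shared interest vocabulary."""
    return [audience_counts.get(interest, 0) for interest in vocabulary]

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length numeric vectors (0 = unrelated, 1 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical monthly-active-user counts per food/drink interest, per country.
spain = {"paella": 900, "tortilla": 700, "mate": 150, "asado": 120}
argentina = {"mate": 950, "asado": 800, "paella": 200, "tortilla": 180}
uk = {"tea": 990, "fish and chips": 850, "paella": 100}

# Shared vocabulary: the union of every country's interests.
vocab = sorted(set(spain) | set(argentina) | set(uk))

sim_es_ar = cosine_similarity(interest_vector(spain, vocab), interest_vector(argentina, vocab))
sim_es_uk = cosine_similarity(interest_vector(spain, vocab), interest_vector(uk, vocab))

# Overlapping preferences (mate, asado) make Spain look culturally closer to Argentina.
print(sim_es_ar > sim_es_uk)
```

With data like this, the Spain–Argentina score comes out higher than the Spain–UK score, mirroring the migration-driven overlap the study describes.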

In order to validate their findings, the team applied the method to 16 countries. They report that food and drink interests, as reflected by Facebook ads, generally align with documented immigration patterns. Preferences for foreign food and drink align with domestic preferences in the countries from which most immigrants came. Conversely, countries with few immigrants showed lower preference for foreign foods and drinks, and their interest was concentrated in a narrower range of such products.

The team cites the asymmetry between Mexico and the U.S. as evidence of the model’s validity. The top 50 foods and drinks from Mexico are more popular in the U.S. than the top 50 U.S. foods and drinks are in Mexico, they explain, aligning well with the greater flow of immigration from Mexico into the U.S. than the other way around.

All in all, the findings strongly suggest that immigrants help shape the culture of various countries. In the future, the team hopes to expand their methodology to include other areas of preference beyond food and drink, and see whether these align with known immigration patterns.

“The food and drink preferences shared by Facebook users from two different countries might indicate a high immigrant population from one country living in the other. In our results we observed that immigration is associated with higher cultural similarity between countries. For example, there are a lot of immigrants from Argentina living in Spain and our measure showed that one of the most similar countries to Spain is Argentina. This means that foods and drinks popular between Facebook users in Argentina are also really popular in Spain,” she adds.

“The most surprising aspect of this study is the methodology and more precisely, the data we used to study culture. Differently from surveys, our methodology is timely, [cost-effective], and easily scalable because it uses passively-collected information internationally available on Facebook.”

Overall, the researchers say, this study suggests that immigrants indeed help shape the culture of their destination country. Future research could refine the new method outlined in this study or repurpose it to examine and compare other interests beyond food and drink.

“I would like to see our proposed measure of cultural similarity being used in different contexts, such as to predict migration. For instance, it would be interesting to use our measure of cultural similarity to answer the question: Do the migrants prefer to migrate to a country culturally similar to their origin country?” Vieira concludes in her email. “More generally, I hope our work contributes to increasing the development of research using social media data as an alternative to complement more traditional data sources to study society.”

The paper “The interplay of migration and cultural similarity between countries: Evidence from Facebook data on food and drink interests” has been published in the journal PLoS ONE.

Shifting to a healthier diet can increase your lifespan by up to a decade

New research is showcasing how a healthier, more balanced diet — including more legumes, whole grains, and nuts, while cutting down on red and processed meat — can lead to longer lives.

Image via Pixabay.

“You are what you eat” is an age-old saying, but a new study from the University of Bergen says that we also live as long as what we eat. The healthier and more diverse our diets, the healthier and longer our life expectancy (LE) becomes, it reports.

The paper estimates the effect of such changes in the typical Western diets for the two sexes at various ages; the earlier these guidelines are incorporated into our eating habits, the larger the improvements in LE, but older people stand to benefit from significant (if smaller) gains as well.

Change your meals, enjoy more meals

“Our modeling methodology used data from [the] most comprehensive meta-analyses, data from the Global Burden of Disease study, life-table methodology, and added analyses on [the] delay of effects and combination of effects including potential effect overlap”, says Lars Fadnes, a Professor at the Department of Global Public Health at the University of Bergen who led the research, in an email for ZME Science.

“The methodology provides population estimates under given assumptions and is not meant as individualized forecasting, with uncertainty that includes time to achieve full effects, the effect of eggs, white meat, and oils, individual variation in protective and risk factors, uncertainties for future development of medical treatments, and changes in lifestyle.”

Dietary habits are estimated to contribute to 11 million deaths annually worldwide, and to 255 million disability-adjusted life-years (DALYs). One DALY, according to the World Health Organization “represents the loss of the equivalent of one year of full health”. In other words, there’s a lot of room for good in changing what we eat.

The team drew on existing databases to develop a computerized model to estimate how a range of dietary changes would impact life expectancy. The model is publicly available as the online Food4HealthyLife calculator, which you can use to get a better idea of how changing what you eat can benefit your lifespan. The team envisions that their calculator would also help physicians and policy-makers to understand the impact of dietary choices on their patients and the public.

For your typical young adult (20 years old) in the United States, the team reports that switching from the typical diet to an optimal one (as described by their model) could increase LE by roughly 10.7 years for women and 13 years for men. There is considerable uncertainty in these results — the estimated gains range between 5.9 and 14.1 years for women, and between 6.9 and 17.3 years for men — due to factors the model doesn’t account for, such as preexisting health conditions, socioeconomic class, and so on. Changing diets at age 60 would still yield an LE increase of 8 years for women and 8.8 years for men.

“The differences in life expectancy estimates between men and women are mainly due to differences in background mortality (and particularly cardiovascular disease such as coronary heart disease, where men generally are at higher risk at an earlier age compared to women),” prof. Fadnes explained for ZME Science.

The largest gains in LE would be made by eating more legumes, more whole grains, more nuts, less red meat, and less processed meat.

So far, the research focused on the impact of diet on LE, but such changes could be beneficial in other ways, as well. Many of the suggestions the team makes are also more environmentally sustainable and less costly, financially. The team is now hard at work incorporating these factors into their online calculator, in order to help people get a better understanding of just how changes in diet can improve their lives, on all levels involved.

“We are working to include sustainability aspects in Food4HealthyLife too. Based on former studies, the optimal diets are likely to have substantial benefits compared to a typical Western diet also in terms of reduction in greenhouse gas emissions, land use, and other sustainability facets,” he added for ZME Science. “We have not systematically investigated financial aspects yet, but several of the healthy options, such as legumes and whole grains, could also be cheap.”

The paper “Estimating the Impact of Food Choices on Life Expectancy: A Modeling Study” has been published in the journal PLoS Medicine.

Nursing home violence among dementia patients is a problem of neglect, not mental illness

Conflict in nursing homes is a surprisingly common occurrence among patients, especially those suffering from dementia. Resident-to-resident incidents are typically defined as negative, aggressive, and intrusive verbal, physical, material, and sexual interactions between patients, which can lead to psychological distress and physical harm. In some extreme and unfortunate cases, they can end in tragedy.

Such was the case of Frank Piccolo, a retired high school chemistry professor from Canada who in his later years developed dementia and was moved to a Toronto nursing home by his family. One evening in 2012, Frank was attacked by another resident, a woman who also had dementia. The woman hit Frank on the head and face with a wooden activity board, badly injuring him. Frank died three months later.

Although such extreme cases are rare, even minor conflicts between residents can have a significant psychological and physical toll on patients. But despite the serious nature of these incidents, they remain understudied and largely unaddressed across the more than 15,000 nursing homes found in the United States.

Until now, both government reports and media coverage have been content to chalk these incidents up to inevitable conflicts arising from interactions between mentally ill people. Residents involved in violent incidents are often called “perpetrators” or “aggressors”.

However, gerontologist and dementia behavior specialist Eilon Caspi argues otherwise. In a recent article, Caspi contends that care home violence among dementia patients is typically the result of unmet human needs, paired with residents’ cognitive limitations. As such, the real problem is inadequate care and neglect on the part of the nursing homes themselves.

“[A] growing body of evidence suggests the true cause of these injuries and deaths is inadequate care and neglect on the part of care homes. Specifically, there is a lack of the specialized care that people with dementia require,” Caspi argued.

Some of the research supporting these assertions includes a 2004 study by the Harvard School of Public Health, which found the rate of violent incidents between residents was nearly three times higher in dementia care homes than in other long-term care homes. Meanwhile, another study supports this observation, finding higher rates of injurious or even fatal interactions between residents in dementia care homes than in other care homes.

These findings may startle many family members currently considering a care home for relatives showing signs of dementia. However, there are good care homes where conflict is minimal thanks to proper staffing with trained professionals and adequate care.

Violent conflict occurs mostly when patients’ emotional, medical, and other needs are not properly met. All that pent-up frustration boils over at a breaking point, which may result in a patient pushing or hitting another resident.

In 2018, Caspi undertook a study in which he used publicly available information (primarily newspaper articles and death review reports) to identify patterns in the circumstances surrounding the deaths of 105 elders as a result of these incidents. He found that nearly half of fatal incidents were associated with frustrating psychological triggers. These triggers are much more common in dementia care homes, since those with advanced dementia are more likely to inadvertently say or do things that anger other residents, another U.S. study found.

The strongest triggers included those pertaining to violations of personal space and possessions. These include taking or touching a resident’s belongings or food, or unwanted entries into their bedroom. Crowded spaces may also lead to two residents claiming the same space, such as a dining room seat.

A stale and uneventful care home can also trigger violent outbursts among dementia residents just as well as crowded and busy institutions. Research shows that lack of meaningful activity and generally feeling bored can raise the pressure among residents, contributing to harmful interactions. Evenings and weekends are particularly dangerous.

“In most of these situations, the person with dementia does not intend to injure or kill another resident. Individuals with dementia live with a serious cognitive disability. And they often must do it while being forced to share small living spaces with many other residents,” Caspi said.

The researcher calls for raising the standards of dementia care homes in order to reduce the number of preventable incidents that can end in tragedies like Frank’s. This includes increasing staffing and improving training so that patients receive the care and attention they rightly deserve and need to live a dignified life.

“Understanding the role of dementia is important. But seeing a resident’s brain disease as the main cause of incidents is inaccurate and unhelpful. That view ignores external factors that can lead to these incidents but are outside of the residents’ control,” he concluded.

Just one extra hour of sleep can help overweight people eat less

Credit: Pixabay.

Research conducted over the years has increasingly linked poor sleep (particularly sleeping less than the minimally recommended 7 hours per night) to the risk of weight gain over time. Not sleeping enough may result in hormonal imbalances that affect appetite, leading some to eat more than they normally would on a healthy sleep regimen.

To investigate in more detail how sleep affects calorie intake, researchers from the University of Chicago and the University of Wisconsin-Madison conducted a randomized clinical trial involving 80 young, overweight adults who habitually sleep less than 6.5 hours a night.

“Over the years, we and others have shown that sleep restriction has an effect on appetite regulation that leads to increased food intake, and thus puts you at risk for weight gain over time,” said lead investigator Esra Tasali, director of the UChicago Sleep Center at the University of Chicago Medicine. “More recently, the question that everyone was asking was, ‘Well, if this is what happens with sleep loss, can we extend sleep and reverse some of these adverse outcomes?’”

The volunteers were randomly split into two groups. One received personalized sleep hygiene counseling, which involved changing one’s routine to avoid things that hinder sleep (caffeine in the evening, heavy meals close to bedtime, an excessively warm bedroom, etc.) and introducing habits that aid sleep (going to bed at the same time each night, using the bed only for sleep or sex, etc.). The other group received no intervention at all and acted as a control.

In the first two weeks, the researchers just gathered baseline information about sleep and calorie intake. Sleep patterns were measured using wearable devices, while calorie intake was quantified using the “doubly labeled water” method, a tried-and-tested urine-based technique for objectively tracking energy use. Participants drink water in which some hydrogen and oxygen atoms have been replaced with stable, easily traceable isotopes; from the rate at which these isotopes are eliminated, researchers can measure every calorie a person burned over a one- to two-week interval, without having to hawkishly record everything a person eats.
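In energy-balance terms, doubly labeled water yields total energy expenditure; intake can then be inferred by adding the change in the body’s energy stores over the same window. The sketch below illustrates that bookkeeping only — the numbers and the 7,700 kcal/kg energy density of body tissue are common illustrative assumptions, not values taken from this study.

```python
def estimate_intake(total_expenditure_kcal, weight_change_kg, kcal_per_kg=7700):
    """Energy intake = energy expended + change in body energy stores.

    A weight gain means energy was stored (intake exceeded expenditure);
    a weight loss means stored energy was burned (intake fell short).
    """
    return total_expenditure_kcal + weight_change_kg * kcal_per_kg

# Hypothetical 14-day measurement window:
tee = 14 * 2600  # total energy expended (kcal), as estimated from isotope elimination
intake = estimate_intake(tee, weight_change_kg=-0.5)  # participant lost 0.5 kg
print(intake)
```

Here the participant expended 36,400 kcal but lost 0.5 kg, so estimated intake is 36,400 − 3,850 = 32,550 kcal over the two weeks — the kind of objective intake figure the trial compared between groups.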

“This is considered the gold standard for objectively measuring daily energy expenditure in a non-laboratory, real-world setting and it has changed the way human obesity is studied,” said the study’s senior author Dale Schoeller, professor emeritus of nutritional sciences at UW–Madison.

A month after the study started, the researchers found that participants in the sleep intervention group managed to extend their sleep duration by an average of 1.2 hours. Compared to the control group, the sleep intervention reduced the participants’ daily calorie intake by 270 calories, the equivalent of a small meal.

Of important note is that this examination was performed in a real-world setting. Each volunteer slept in their own beds, ate what they wished, wasn’t prompted to exercise, and generally went about their day as they pleased and normally would. That’s in stark contrast to most weight loss studies that are generally short-lived and diligently measure calorie intake by making sure participants only consume a particular offered diet.

The only factor manipulated in the study was sleep duration, and this single aspect proved to have a significant impact on participants’ calorie intake. If the average reduction of 270 calories per day were maintained over the long term, it would translate to roughly 12 kg (26 pounds) of weight loss over three years. That’s on average; some participants consumed as many as 500 fewer calories per day.

“This was not a weight-loss study,” said Tasali. “But even within just two weeks, we have quantified evidence showing a decrease in caloric intake and a negative energy balance — caloric intake is less than calories burned. If healthy sleep habits are maintained over a longer duration, this would lead to clinically important weight loss over time. Many people are working hard to find ways to decrease their caloric intake to lose weight — well, just by sleeping more, you may be able to reduce it substantially.”

In the future, the researchers plan on studying the underlying mechanisms that may explain why more sleep can lead to weight loss. Previous research by Tasali and colleagues suggests that sleep is important for appetite regulation. Limited sleep may drive changes in appetite-regulating hormones and reward centers in the brain that could lead to overeating.

If you struggle with both your sleep and weight, these findings suggest a simple intervention could do wonders: just sleep more. That’s harder than it sounds, but with some hard work, it is possible. According to the researchers, limiting the use of electronic devices before bedtime was a key intervention.

Here are a few tips that may help you clock in more hours of sleep:

  1. Go to sleep at the same time each night, and get up at the same time each morning, even on the weekends.
  2. Don’t take naps after 3 p.m., and don’t nap longer than 20 minutes.
  3. Stay away from caffeine and alcohol late in the day.
  4. Avoid nicotine completely.
  5. Get regular exercise, but not within 2-3 hours of bedtime.
  6. Don’t eat a heavy meal late in the day. A light snack before bedtime is OK.
  7. Make your bedroom comfortable, dark, quiet, and not too warm or cold.
  8. Follow a routine to help you relax before sleep (for example, reading or listening to music). Turn off the TV and other screens at least an hour before bedtime.
  9. Don’t lie in bed awake. If you can’t fall asleep after 20 minutes, do something calming until you feel sleepy, like reading or listening to soft music.
  10. Talk with a doctor if you continue to have trouble sleeping.

The findings of the new study appeared in the journal JAMA Internal Medicine.

These spinal cord implants allow paralyzed patients to stand, walk, and even swim and cycle

Credit: EPFL.

In 2018, Swiss researchers Grégoire Courtine and Jocelyne Bloch made headlines with an implant they devised that sends electrical pulses to the spinal cord of paralyzed patients. The stimulation of the spinal nerves triggers plasticity in the cells, which seems to regenerate nerve connections, allowing test subjects paralyzed from the waist down to stand and walk, something that doctors told them they were unlikely to do again in their lifetimes. Now, the same team from the Swiss Federal Institute of Technology (EPFL) and Lausanne University Hospital have showcased an upgraded version of this spinal cord electrical stimulation — and the improvements speak for themselves.

The personalized spinal cord electrode implants were shown to restore motor movements within a few hours of the therapy’s onset in three paralyzed patients. The volunteers could not only stand and walk, but also perform motor movements that are an order of magnitude more complex, such as cycling, swimming, and canoeing.

Credit: EPFL.

Furthermore, the newly designed electrode paddle configuration can work with patients with more severe spinal cord injuries. For instance, the two 2018 patients who first tested the system retained some residual control over their legs following injury — too little for them to walk or even stand but just enough for the external electrical stimulation to allow them to regain motor function. In the upgraded version, the three patients who underwent spinal cord stimulation are completely paralyzed and were hence unable to voluntarily contract any of their leg muscles.

One of these patients is Michel Roccati, an Italian man who became completely paralyzed after a motorcycle accident four years prior to his enrollment in the EPFL spinal cord stimulation therapy. Bloch, a professor and neurosurgeon at Lausanne University Hospital, surgically implanted the new electrode lead in his spinal cord, and after recovery Roccati was ready to put the whole thing to the test.


During one particularly windy day in downtown Lausanne, the researchers and Roccati gathered outdoors with an array of hardware. Roccati’s walker had been fitted with two small remote controls that connect wirelessly to a pacemaker in his abdomen, which in turn relays the signals to the spinal implants. There, the signal is converted into targeted trains of electrical pulses that stimulate specific neurons, allowing Roccati to move his lower limbs.

For the entire duration of the test, Roccati was in full control. The patient grasped the walker and pressed the remote control buttons when he intended to move. For instance, he would press the right side button when he intended to move his left leg. Pressing the buttons almost magically caused his legs to spring forward. He was walking — and getting better and stronger with each therapy session.

“The first few steps were incredible – a dream come true!” he says. “I’ve been through some pretty intense training in the past few months, and I’ve set myself a series of goals. For instance, I can now go up and down stairs, and I hope to be able to walk one kilometer by this spring.”

The updated system employs more sophisticated electrode paddle implants that target the dorsal roots in the lumbosacral region of the spinal cord, including the lowest nerve root in the spine, which is responsible for trunk stability. The implants are controlled by an artificial intelligence system whose stimulation algorithms are designed to imitate nature, activating the spinal cord the way the brain normally would to allow us to stand, walk, swim, or ride a bike.

“Our therapy restored independent walking within a few hours after its onset, in addition to many other motor activities that are critical for rehabilitation and daily life,” Robin Demesmaeker, a researcher at EPFL and the Department of Clinical Neurosciences, University Hospital Lausanne, told ZME Science.

“Central to this remarkably more effective and ultrafast therapeutic efficacy was a series of disruptive technological innovations driven by our understanding of the mechanisms through which electrical spinal cord stimulation restores movement after paralysis,” added Demesmaeker, who is also the first author of the new study that appeared today in the journal Nature Medicine.

The two other patients who’ve tested the new system also made dramatic improvements in their quality of life. In each case, the therapy — both the electrode placement on the spinal cord and the activities involved in the therapy — was personalized. After several months of intensive training, the three patients were able to regain muscle mass, move about more independently, and take part in social activities that were previously out of reach, like having a drink standing at a bar. Millions of other patients in similar conditions stand to benefit from the same therapy.

“The main requirement is that the region of the spinal cord where the spinal implant is placed should still be intact and that the lesion should be higher,” Demesmaeker wrote in an email. “The major difficulties with more severe cervical injuries are due to injury-induced blood-pressure instability that leads to severe orthostatic hypotension, making upright locomotor training impossible, as well as highly impaired arm and hand function impeding the use of assistive devices such as crutches and a walker.”

The researchers in Switzerland are still in the middle of an ongoing clinical trial, in which they’re trying to find the optimal path toward enabling brain-controlled spinal cord stimulation in real time.

“We are also assessing the ability of spinal cord stimulation to alleviate other problems such as hemodynamic instability in patients with spinal cord injury and gait deficits in patients with Parkinson’s disease,” Demesmaeker said.

The fascinating science behind the first human HIV mRNA vaccine trial – what exactly does it entail?

In a moment described as a “potential first step forward” in protecting people against one of the world’s most devastating pandemics, Moderna, International AIDS Vaccine Initiative (IAVI), and the Bill and Melinda Gates Foundation have joined forces to begin a landmark trial — the first human trials of an HIV vaccine based on messenger ribonucleic acid (mRNA) technology. The collaboration between these organizations, a mixture of non-profits and a company, will bring plenty of experience and technology to the table, which is absolutely necessary when taking on this type of mammoth challenge.

The goal is more than worth it: helping the estimated 37.7 million people currently living with HIV (including 1.7 million children) and protecting those who will be exposed to the virus in the future. Sadly, around 16% of the infected population (6.1 million people) are unaware they are carriers.

Despite progress, HIV remains lethal: in 2020, 680,000 people died of AIDS-related illnesses, even with the inroads made in therapies that dampen the disease’s effects on the immune system. One of these, antiretroviral therapy (ART), has proven highly effective in preventing HIV transmission, clinical progression, and death. Still, even with the success of this lifelong therapy, the number of HIV-infected individuals continues to grow.

There is no cure for this disease. Therefore, the development of vaccines to either treat HIV or prevent the acquisition of the disease would be crucial in turning the tables on the virus.

However, it's not so easy to make an HIV vaccine because the virus mutates very quickly, creating multiple variants within the body and presenting too many targets for a single therapy to hit. Worse, as a retrovirus, HIV integrates into the host's genome a mere 72 hours after transmission, meaning that high levels of neutralizing antibodies must already be present at the time of transmission to prevent infection.

Because the virus is so tricky, researchers generally consider that a therapeutic vaccine (administered after infection) is unfeasible. Instead, researchers are concentrating on a preventative or ‘prophylactic’ mRNA vaccine similar to those used by Pfizer/BioNTech and Moderna to fight COVID-19.

What is the science behind the vaccine?

The groundwork research was made possible by the discovery of broadly neutralizing HIV-1 antibodies (bnAbs) in 1990. They are the most potent human antibodies ever identified and are extremely rare, only developing in some patients with chronic HIV after years of infection.

Significantly, bnAbs can neutralize not just the particular viral strain infecting a patient but other variants of HIV as well, hence the 'broad' in broadly neutralizing antibodies. They achieve this by using unusual extensions, not seen in other antibodies, to penetrate the HIV envelope glycoprotein (Env). The Env is the virus's outer shell, formed from the cell membrane of the host cell it has invaded, which makes it extremely difficult to attack; still, bnAbs can target vulnerable sites on this shell to neutralize the virus and eliminate infected cells.

Unfortunately, the antibodies do little to help chronic patients because there's already too much virus in their systems. However, researchers theorize that if an HIV-negative person could produce bnAbs, they might be protected from infection.

Last year, the same organizations tested a vaccine based on this idea in extensive animal tests and a small human trial that didn’t employ mRNA technology. It showed that specific immunogens—substances that can provoke an immune response—triggered the desired antibodies in dozens of people participating in the research. “This study demonstrates proof of principle for a new vaccine concept for HIV,” said Professor William Schief, Department of Immunology and Microbiology at Scripps Research, who worked on the previous trial.

These bnAbs are the desired endgame of the potential HIV mRNA vaccine and the fundamental basis of its action. "The induction of bnAbs is widely considered to be a goal of HIV vaccination, and this is the first step in that process," Moderna and IAVI said in a statement.

So how exactly does the mRNA vaccine work?

The experimental HIV vaccine delivers mRNA instructions for two HIV proteins, the immunogens Env and Gag, into the host's cells; together, these proteins make up roughly 50% of the total virus particle. This triggers an immune response, allowing the body to build the necessary defenses: antibodies and numerous white blood cells such as B cells and T cells, which can then protect against actual infection.

Later, the participants will also receive a booster immunogen containing Gag and Env mRNA from two other HIV strains to broaden the immune response, hopefully inducing bnAbs.

Karie Youngdahl, a spokesperson for IAVI, clarified that the main aim of the vaccines is to stimulate “B cells that have the potential to produce bnAbs.” These then target the virus’s envelope—its outermost layer that protects its genetic material—to keep it from entering cells and infecting them.  

Pulling back, the team is adamant that the trial is still in the very early stages, with the volunteers possibly needing an unknown number of boosters.

“Further immunogens will be needed to guide the immune system on this path, but this prime-boost combination could be the first key element of an eventual HIV immunization regimen,” said Professor David Diemert, clinical director at George Washington University and a lead investigator in the trials.

What will happen in the Moderna HIV vaccine trial?

The Phase 1 trial will enroll 56 healthy, HIV-negative adults to evaluate the safety and immunogenicity of vaccine candidates mRNA-1644 and mRNA-1644v2-Core. With the aid of its non-profit partners, Moderna will explore how to deliver its proprietary eOD-GT8 60mer immunogen with mRNA technology and investigate how to use it to direct B cells to make proteins that elicit bnAbs. To appreciate how slim the odds are: only about one in every 300,000 B cells in the human body can produce them.

Sensibly, the trial isn't 'blind,' which means everyone who receives the vaccine will know what they're getting at this early stage. That's because the scientists aren't trying to work out how well the vaccine works in this first phase, which will last approximately ten months; they want to make sure it's safe and capable of mounting the desired immune response.

And even though there is much hype around this trial, experts urge caution. "Moderna are testing a complicated concept which starts the immune response against HIV," Robin Shattock, an immunologist at Imperial College London, told the Independent. "It gets you to first base, but it's not a home run. Essentially, we recognize that you need a series of vaccines to induce a response that gives you the breadth needed to neutralize HIV. The mRNA technology may be key to solving the HIV vaccine issue, but it's going to be a multi-year process."

And if, after this long period, the vaccine is found to be safe and shows signs of producing an immune response, it will progress to larger real-world studies, bringing a possible solution one step closer for a virus that is still devastating whole communities.

Still, this hybrid collaboration offers hope that clinical trials can prioritize people over financial gain; after all, most people living with HIV are in the developing world.

As IAVI president Mark Feinberg wrote in June at the 40th anniversary of the HIV epidemic: “The only real hope we have of ending the HIV/AIDS pandemic is through the deployment of an effective HIV vaccine, one that is achieved through the work of partners, advocates, and community members joining hands to do together what no one individual or group can do on its own.”

Whatever the outcome, profit is not the driving force here, and with luck, we may see more trials based on this premise very soon.

Artificial enamel is even stronger than real teeth

Credit: Pixabay.

Enamel, the hard mineralized surface of teeth, is the hardest substance in the human body. Pound for pound, enamel is tougher and harder than steel. Its unique mix of minerals, water, and organic material makes it hard enough not to dent and durable enough to withstand decades of grinding and wear. Depending on your diet and how well you take care of your teeth, you can stave off tooth decay, but only for so long. The problem is that once teeth lose their enamel, it never comes back, and tooth decay is right around the corner.

Despite many attempts to replicate the wondrous properties of enamel, most efforts have been in vain. A new study, however, has reignited hopes that such a thing is actually possible, after researchers at the University of Michigan devised a way to make artificial enamel. It goes without saying that this would be a huge leap for dentistry, which still uses decades-old filling technology to repair cavities.

Mimicking enamel in the lab is incredibly challenging due to its complex structure of interwoven hydroxyapatite nanocrystals, which are one-thousandth the thickness of a human hair. These crystals are arranged in wires, which enamel-producing cells coat in magnesium and weave together into a very strong mesh, which is further organized into twists and bunches.

Researchers have struggled while attempting to reconstruct the complex and multi-layered organization of enamel. But where others failed, the authors of the new study finally succeeded. They encased wires of hydroxyapatite in a malleable metal-based coating, resulting in a structure that has a soft layer that can absorb the powerful shock of a bite but is strong enough to take a lot of pressure without denting.

In fact, the artificial enamel is stronger than the natural version because the researchers swapped the magnesium-rich coating for the much stronger (and non-toxic) zirconium oxide. To test the material's strength and elasticity, they cut a piece with a diamond-bladed saw, then used a mechanical press to apply pressure steadily until it started to crack. The artificial enamel surpassed natural enamel in six different measures, including hardness, elasticity, and shock absorption.

Now, the artificial enamel doesn't mimic natural enamel to a tee. It lacks the complex 3D woven patterns of the real thing, but its parallel wire structure is the closest scientists have come to true enamel thus far.

The research could drastically improve the construction of artificial teeth, as well as significantly reduce tooth decay through new and improved fillings that last much longer. That said, the best dental treatment is still prevention, which is why doctors recommend keeping up a good dental routine.

But beyond dentistry, the hard artificial enamel could prove highly useful when incorporated into implantable electronics and biosensors, such as pacemakers and blood pressure monitors.

“This method of making artificial enamel lends itself to commercial production and it can be produced for the manufacture of artificial teeth,” Nicholas A. Kotov, of the University of Michigan, told the i newspaper.

It's still too early to predict when this product might reach the market, but since all the components of the material are biocompatible, researchers hope to soon begin trials on both animals and humans. The artificial enamel hasn't been bonded to natural enamel yet, a crucial step in tooth repair, so this will be one of the many tests the material needs to pass before we can finally enter a new age of dentistry.

The findings appeared in the journal Science.

What is vitamin K?

Vitamin K plays a key role in our blood’s ability to form clots. It’s one of the less glamorous vitamins, more rarely discussed than its peers and, although it’s usually referred to as a single substance, it comes in two natural varieties — K1 and K2 — and one synthetic one, K3. People typically cover their requirements of vitamin K through diet, so it’s rarely seen in supplement form, but we’ll also look at some situations that might require an extra input of vitamin K.

A molecule of menatetrenone, one of the forms of vitamin K2. Image via Wikimedia.

The ‘K’ in vitamin K stands for Koagulations-vitamin, Danish for ‘coagulation vitamin’. This is a pretty big hint as to what these vitamers — the term used to denote the various chemically-related forms of a vitamin — help our bodies do. Vitamin K is involved in modification processes that proteins undergo after they have been synthesized, and these proteins then go on to perform clotting wherever it is needed in our blood. Apart from this, vitamin K is also involved in calcium-binding processes for tissues throughout our bodies, for example in bones.

Although we don’t need very high amounts of vitamin K to be healthy (relative to other vitamins), a deficiency of it is in no way a pretty sight. Without enough vitamin K, blood clotting is severely impaired, and uncontrollable bleeding starts occurring throughout our whole bodies. Some research suggests that a deficiency of this vitamin can also cause bones to weaken, leading to osteoporosis, or to the calcification of soft tissues.

The different forms of vitamin K

Chemically speaking, vitamin K1 is known as phytomenadione or phylloquinone, while K2 is known as menaquinone. They’re quite similar from a structural point of view, being made up of two aromatic rings (rings of carbon atoms) with a long chain of carbon atoms tied to one side. K2 has two subtypes, one of which is longer than the other, but they perform the same role in our bodies. The K1 variety is the most often seen one in supplements.

Vitamin K3 is known as menadione. It used to be prescribed as a treatment for vitamin K deficiency, but it was later discovered that it interfered with the function of glutathione, an important antioxidant and key metabolic molecule. As such, it is no longer in use for this role in humans.

All forms of vitamin K are fat-soluble and tend to degrade rapidly when exposed to sunlight. The vitamin also breaks down and is excreted quickly in the body, so it's exceedingly rare for it to reach toxic concentrations in humans. Vitamin K is concentrated in the liver, brain, heart, pancreas, and bones.


Vitamin K is abundant in green, leafy vegetables, where it is involved in photosynthesis. Image credits Local Food Initiative / Flickr.

As previously mentioned, people tend to get enough vitamin K from a regular diet.

Plants are a key synthesizer of vitamin K1, especially their tissues which are directly involved in photosynthesis; as such, mixing leafy or green vegetables into your diet is a good way to access high levels of the vitamin. Spinach, asparagus, broccoli, or legumes such as soybeans are all good sources. Strawberries also contain this vitamin, to a somewhat lesser extent.

Animals also rely on this vitamin for the same processes human bodies do, so animal products can also be a good source of it. Animals tend to convert the vitamin K1 they get from eating plants into one of the K2 varieties (MK-4). Eggs and organ meats such as liver, heart, or brain are high in K2.

All other forms of vitamin K2 are produced by bacteria during anaerobic respiration. As such, fermented foods can also be a good source of this vitamin.

Some of the most common signs of deficiency include:

  • Slow rates of blood clotting;
  • Long prothrombin times (prothrombin is a key clotting factor measured by doctors);
  • Spontaneous or random bleeding;
  • Hemorrhaging;
  • Osteoporosis (loss of bone mass) or osteopenia (loss of bone mineral density).

Do I need vitamin K supplements?

Cases of deficiency are rare. However, certain factors can promote them. Most commonly, this involves medication that blocks vitamin K metabolism as a side effect (some antibiotics do this) or medical conditions that prevent the proper absorption of nutrients from food. Some newborns can also experience vitamin K deficiency, as the compound doesn't cross the placenta and breast milk only contains low levels of it. Due to this, infants are often given vitamin K supplements.

Although it is rare to see toxicity caused by vitamin K overdoses, it is still advised that supplements only be taken when prescribed by a doctor. Symptoms indicative of vitamin K toxicity are jaundice, hyperbilirubinemia, hemolytic anemia, and kernicterus in infants.

Vitamin K deficiencies are virtually always caused by malnourishment, poor diets, or by the action of certain drugs that impact the uptake of vitamin K or its role in the body. People who use antacids, blood thinners, antibiotics, aspirin, and drugs for cancer, seizures, or high cholesterol are sometimes prescribed supplements — again, by a trained physician.

How was it discovered?

The compound was first identified by Danish biochemist Henrik Dam in the early 1930s. Dam was studying another topic entirely: cholesterol metabolism in chickens. However, he observed that chicks fed a diet low in fat and with no sterols had a high chance of developing subcutaneous and intramuscular hemorrhages (severe bleeding under the skin and within their muscles).

Further studies with different types of food led to the identification of the vitamin, which Dam referred to as the “Koagulations-Vitamin”.

Some other things to know

Some of the bacteria in our gut synthesize vitamin K for us, helping cover our necessary intake. Because of this, antibiotic use can lead to a decrease in vitamin K levels in our blood, as it decimates the populations of bacteria in our intestines. If you're experiencing unusual bruising or bleeding following a lengthy or particularly strong course of antibiotics, it could be due to such a deficiency. Contact your physician and tell them about your symptoms if you think you may need vitamin K supplements in this situation; it's not always the case that you do, but it doesn't hurt to ask.

Another step you can take to ensure you’re getting enough vitamin K is to combine foods that contain a lot of it with fats — as this vitamin is fat-soluble. A salad of leafy greens with olive oil and avocado is a very good way of providing your body with vitamin K and helping it absorb as much of it as possible.

Solving crosswords and number puzzles may make your brain sharper at old age

Regular use of word and number puzzles may help keep our brains working better for longer. According to a pair of studies, adults aged 50 and over who are in the habit of solving crosswords and Sudoku scored much higher on cognitive tests, such as those that assess problem-solving and memory, than those who didn't. In some instances, the differences were quite dramatic: on average, people who regularly do puzzles had the cognitive abilities of people eight years younger.

Credit: Pixabay.

Researchers led by Dr. Anne Corbett of the University of Exeter Medical School surveyed participants in the PROTECT study, a large online cohort of over 22,000 adults between the ages of 50 and 96, about how frequently they engage in word and number puzzles. The participants then undertook a battery of cognitive tests designed to measure age-related changes in brain function, including tasks that assess attention, reasoning, and memory. The results were striking.

Those who engage in crosswords had brain function equivalent to people ten years younger than their biological age on tests assessing grammatical reasoning, and eight years younger on tests measuring short-term memory.

“The improvements are particularly clear in the speed and accuracy of their performance. In some areas the improvement was quite dramatic — on measures of problem-solving, people who regularly do these puzzles performed equivalent to an average of eight years younger compared to those who don’t. We can’t say that playing these puzzles necessarily reduces the risk of dementia in later life but this research supports previous findings that indicate regular use of word and number puzzles helps keep our brains working better for longer,” Corbett said in a statement.

PROTECT is designed as a 25-year study, and participants are followed up yearly to assess how their brains age and what lifestyle choices might influence the risk of dementia later in life. Despite tremendous progress, we still know little about how the brain ages or what causes debilitating neurodegenerative diseases like Alzheimer's or Parkinson's. PROTECT may offer exciting research opportunities in the years to come.

The two studies, published in the International Journal of Geriatric Psychiatry, don't conclude that solving puzzles will necessarily reduce the risk of dementia and keep your brain sharper. The findings are observational: it could simply be that people with a natural ability to preserve their brain function with age also tend to enjoy word and number puzzles. In other words, the studies established a correlation but did not demonstrate causation.

However, the findings are consistent with some previous studies. A 2011 experiment with participants from the Bronx Aging Study found that regularly solving crosswords is associated with a delay in the onset of cognitive decline. Other studies came to different conclusions: when Scottish researchers tested nearly 500 participants, all born in 1936, they found that a tricky crossword or a challenging puzzle will not fend off age-related mental decline. However, they did note that although brain games like jigsaw puzzles may not prevent dementia, regularly challenging yourself mentally seems to improve the brain's ability to cope with neurodegenerative disease.

“We know that what is good for the heart is good for the head, and there are other ways we can reduce our risk of developing dementia,” James Pickett, head of research at the UK’s Alzheimer’s Society, told CNN, “by taking steps towards a healthy lifestyle, eating a balanced diet, avoiding smoking and heavy drinking, and exercising regularly.”

If you want to keep your brain healthy, paying attention to your diet is clearly shown to help, and the occasional puzzle can't hurt either.

Polluted air can reduce cognitive abilities — but improvements in air quality can help

The more researchers look into air pollution, the more problems it seems to cause.

Exposure to pollution has been linked to a number of major health problems, including cardiovascular and lung disease. It has also been linked to dementia, and some studies have found that it can impair cognitive ability. For other conditions, pollution is a modifiable risk: when the pollution is eliminated, the risk also drops. For dementia, however, this hadn't yet been demonstrated.

So a team of researchers led by Diana Younan, of the University of Southern California, carried out a study on 2,232 older women who were free of dementia when they entered the study. They chose to focus on women because older women are disproportionately affected by Alzheimer’s disease, Younan told ZME Science. The researchers then followed the women for 20 years, giving them two different cognitive tests every year. They also analyzed local changes in air quality for all of the women and used statistical analysis to see if a reduction in air pollution was associated with slower cognitive decline.

It was. Women living in areas with greater improvements in air quality tended to have a much slower decline, as indicated by cognitive tests. Basically, the reduced rate of decline in areas with greater air improvement was equivalent to being 0.9-1.6 years younger, depending on the test.

The findings strengthen the link between pollution and cognitive decline. In order to give context on how much air pollution can affect cognitive ability, the researchers compared the magnitude of their results with other known predictors of cognitive decline, such as age.

“We found that reducing air pollution exposure can promote healthier brain aging in older women by slowing cognitive decline. These benefits were seen in older women of all ages, levels of education, geographic regions of residence, and cardiovascular histories,” Younan says.

“Based on our results, we saw that an interquartile range increment of reduced PM2.5 (1.79 µg/m³) and of reduced NO2 (3.92 ppb) was associated with slower decline in cognition,” Younan told ZME Science. “This potential benefit was equivalent to the slower decline rate observed in women who were 1-1.5 years younger at baseline.”

The good news is that environmental policies can help reduce pollutants, and consequently, help reduce the burden on people’s cognitive abilities.

“The health benefits seen in our study were a result of decreasing levels of both PM2.5 and NO2 across the U.S., which were likely due to national policies and strategies aimed at regulating pollution from stationary (power plants; factories) and mobile (vehicles) sources.”

Cleaner air is already known to improve heart and respiratory health, but in addition to the health component, there’s also an economic component to the study. Dementia is estimated to cost the U.S. economy $159–$215 billion annually, and reducing pollution could be an efficient way of reducing this financial burden.

The researchers were surprised to see that the benefits of reducing pollution levels were seen across older women of all ages — which is all the more reason to take measures to reduce atmospheric pollution, they say.

“Studies have shown that seniors, people with lower levels of education, people living in certain areas across the US, and people with preexisting heart disease are affected more by air pollution,” Younan concludes. “What surprised us and was the most important finding was that these benefits were seen in older women of all ages, levels of education, geographic regions of residence, and cardiovascular histories. The Clean Air Act mandates that the Environmental Protection Agency sets air quality standards to provide a safe margin for sensitive populations, and these results suggest that the benefits may be universal in older women. I think these findings show that it is worth the continuing efforts to enforce air quality standards and provide cleaner air to all.”