Category Archives: Nutrition

Just two glasses of wine could exceed a whole day’s sugar intake

Yes, wine is good, but here’s the thing: there’s sugar in all wines, from whites to reds to cooking wine and everything in between. But how much sugar are we talking about? Kind of a lot, according to a new study. Researchers analyzed 30 bottles of different types of wine sold in the UK and found that just two glasses of some of them can exceed the recommended daily sugar limit for adults.

Image credit: Flickr / David.

The Alcohol Health Alliance, a group of over 60 non-profit organizations from the UK, commissioned a laboratory to analyze 30 bottles of red, white, fruit, rosé, and sparkling wine from the top 10 leading wine brands in the UK. The results showed wide variation in sugar and calorie content between products – information missing from most alcohol labels.

“Alcohol’s current exemption from food and drink labeling rules [in the UK] is absurd. Shoppers who buy milk or orange juice have sugar content and nutritional information right at their fingertips. But this information is not required when it comes to alcohol,” Professor Ian Gilmore, Chair of the Alcohol Health Alliance UK, said in a statement.

Wine and sugar

As you likely know, wine is made from grapes, which naturally contain sugar. To produce wine, the grapes have to be fermented – a process through which yeast is added and the sugars are transformed into alcohol. Any sugars that aren’t converted in the process are called residual sugars. So basically, wine does contain sugar, but it’s technically less than if you ate the grapes.

But the story is a bit more complicated. Every wine type is unique in terms of sugar content. Aged wine, for example, has less sugar since it’s fermented for a longer time. Winemakers can also add more sugar after fermentation, depending on the desired sweetness. In the US, for example, consumers tend to prefer sweeter wines, so more sugar is added.

The problem is most of the bottles lack nutritional information on labels. In the UK, like in many countries, this isn’t currently required by law, so campaigners are calling for a change to better inform wine drinkers about the number of calories and sugars they are consuming. It’s also something consumers want, according to recent surveys.

The National Health Service (NHS) in the UK recommends adults consume a maximum of 30 grams of free sugars per day (sugars added to food and drink, plus those naturally present in honey, syrups, and juice). A lot of foods have more sugar than you think, and the analysis by the Alcohol Health Alliance UK shows it’s possible to reach that level by drinking two medium-sized glasses of some wines. Lower-strength wines have the most sugar, according to the research.

Alcohol accounts for about 10% of the daily calorie intake of adults who drink in the UK, with over three million adults consuming an extra day’s worth of calories each week. That adds up to two months’ worth of food each year, and it’s basically just empty calories. The issue goes far beyond wine: one study found up to 59 grams of sugar in some ready-to-drink cocktails on the market.

“The alcohol industry has dragged their feet for long enough – unless labeling requirements are set out in law, we will continue to be kept in the dark about what is in our drinks. People want and need reliable information directly on bottles and cans, where it can usefully inform their decisions,” Alison Douglas from Alcohol Focus Scotland said in a statement.

The full report on sugar and wine can be accessed here.

Don’t drink milk? Here’s how to get enough calcium and other nutrients

Cow’s milk is an excellent source of calcium, which, along with vitamin D, is needed to build strong, dense bones.

Milk also contains protein, the minerals phosphorus, potassium, zinc and iodine, and vitamins A, B2 (riboflavin) and B12 (cobalamin).

As a child, I drank a lot of milk. It was delivered in pint bottles to our front steps each morning. I also drank a third of a pint before marching into class as part of the free school milk program. I still love milk, which makes getting enough calcium easy.

Of course, many people don’t drink milk for a number of reasons. The good news is you can get all the calcium and other nutrients you need from other foods.

What foods contain calcium?

Dairy products such as cheese and yoghurt are rich in calcium, while non-dairy foods including tofu, canned fish with bones, green leafy vegetables, nuts and seeds contain varying amounts.

Some foods are fortified with added calcium, including some breakfast cereals and soy, rice, oat and nut “milks”. Check their food label nutrition information panels to see how much calcium they contain.

Tofu is an excellent source of calcium. Image credits: Anh Nguyen.

However, it’s harder for your body to absorb calcium from non-dairy foods. Your body does get better at absorbing calcium from plant foods when your total calcium intake is low, but the overall effect means that if you don’t eat dairy, you may need to eat more calcium-containing foods to maximize your bone health.

How much calcium do you need?

Depending on your age and sex, daily calcium requirements vary from 360 milligrams to more than 1,000 mg for teens and older women.

One 250ml cup of cow’s milk contains about 300mg of calcium, which is equivalent to one standard serve. This same amount is found in:

  • 200 grams of yoghurt
  • 250 ml of calcium-fortified plant milks
  • 100 grams of canned pink salmon with bones
  • 100 grams of firm tofu
  • 115 grams of almonds.

The recommended number of daily serves of dairy and non-dairy alternatives varies:

  • children should have between 1 and 3.5 serves a day, depending on their age and sex
  • women aged 19 to 50 should have 2.5 serves a day, then 4 serves when aged over 50
  • men aged 19 to 70 should have 2.5 serves a day, then 3.5 serves when aged over 70.

However, the average Australian intake is just 1.5 serves per day, with only one in ten achieving the recommendations.

What other nutrients do you need?

If you don’t drink milk, the challenge is getting enough nutrients to have a balanced diet. Here’s what you need and why.

Protein

Food sources: meat, poultry, fish, eggs, nuts, seeds, legumes, dried beans and tofu.

Needed for growth and repair of cells, and to make antibodies, enzymes, and the specific transport proteins that carry chemical messages throughout the body.

Phosphorus

Food sources: meat, poultry, seafood, nuts, seeds, wholegrains, dried beans and lentils.

Builds bone and teeth, supports growth and repair of cells, and is needed for energy production.

Whole grains are also a good source of phosphorus. Image credits: Gabriella Clare Marino.

Potassium

Food sources: leafy green vegetables (spinach, silverbeet, kale), carrots, potatoes, sweet potatoes, pumpkin, tomatoes, cucumbers, zucchini, eggplant, beans and peas, avocados, apples, oranges and bananas.

Needed to activate cells and nerves. Maintains fluid balance and helps with muscle contraction and regulation of blood pressure.

Zinc

Food sources: lean meat, chicken, fish, oysters, legumes, nuts, wholemeal and wholegrain products.

Helps with wound healing and the development of the immune system and other essential functions in the body, including taste and smell.

Iodine

Food sources: fish, prawns, other seafood, iodised salt and commercial breads.

Needed for normal growth, brain development and used by the thyroid gland to make the hormone thyroxine, which is needed for growth and metabolism.

Vitamin A

Food sources: eggs, oily fish, nuts, seeds. (The body can also make vitamin A from beta-carotene in orange and yellow vegetables and green leafy vegetables.)

Needed for antibody production, maintenance of healthy lungs and gut, and for good vision.

Vitamin B2 (riboflavin)

Food sources: wholegrain breads and cereals, egg white, leafy green vegetables, mushrooms, yeast spreads, meat.

Needed to release energy from food. Also supports healthy eyesight and skin.

Vitamin B12 (cobalamin)

Food sources: meat, eggs and most foods of animal origin, some fortified plant milks and fortified yeast spreads (check the label).

Needed to make red blood cells, DNA (your genetic code), myelin (which insulates nerves) and some neurotransmitters needed for brain function.

When might you need to avoid milk?

Reasons why people don’t drink milk range from taste and personal preference to animal welfare and environmental concerns. It could also be due to health conditions or concerns about intolerance, allergy, and acne.

Lactose intolerance

Lactose is the main carbohydrate in milk. It’s broken down into simple sugars by an enzyme in the small intestine called lactase.

Some people are born without the lactase enzyme or their lactase levels decrease as they age. For these people, consuming foods containing a lot of lactose means it passes undigested along the gut and can trigger symptoms such as bloating, pain and diarrhoea.

Research shows small amounts of lactose – up to 15 grams daily – can be tolerated without symptoms, especially if spread out over the day. A cup of cow’s milk contains about 16 grams of lactose, while a 200g tub of yoghurt contains 10g, and 40g of cheddar cheese contains less than 1g.

Cow’s milk allergy

Cow’s milk allergy occurs in about 0.5-3% of one-year-olds. By age five, about half are reported to have grown out of it, and 75% by adolescence. However, one survey found 9% of pre-school children had severe allergy with anaphylaxis.

Symptoms of cow’s milk allergy include hives, rash, cough, wheeze, vomiting, diarrhoea or swelling of the face.

Symptom severity varies, and can happen immediately or take a few days to develop. If a reaction is severe, call 000, as it can be a medical emergency.

Acne

Aside from cheese, the whey protein in cow’s milk products triggers the release of insulin, a hormone that helps move sugar from the bloodstream into cells.

Meanwhile, milk’s casein protein triggers an increase in another hormone, called insulin-like growth factor (IGF), which influences growth.

These two reactions promote the production of hormones called androgens, which can lead to a worsening of acne.

If this happens to you, then avoid milk, but keep eating hard cheese, and eat other foods rich in calcium regularly instead.

While milk can be problematic for some people, for most of us, drinking milk in moderation, in line with recommendations, is the way to go.


Clare Collins, Laureate Professor in Nutrition and Dietetics, University of Newcastle

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Facebook ads can be used to gauge cultural similarity between countries

The cultural similarity between countries and international migration patterns can be measured quite reliably using Facebook data, a new study reports.

Image via Pixabay.

“Cultural hotspot” isn’t the first thing that pops into mind when thinking about social media for most of us. However, new research from the Max Planck Institute for Demographic Research in Rostock, Germany shows that data from Facebook can be used to gauge cultural closeness between countries, and overall migration trends.

And the way to do it is to track ads for food and drink on the platform.

We are what we eat

“[A] few years ago, after reading a work of a colleague using data from the Facebook Advertising Platform, I was surprised to find how much information we share online and how much these social media platforms know about us,” said Carolina Coimbra Vieira, a Ph.D. student in the Laboratory of Digital and Computational Demography at the Max Planck institute and lead author of the research, in an email for ZME Science.

“After that, I decided to work with this social media data to propose new ways of answering old questions related to society. In this specific case, I wanted to propose a measure of cultural similarity between countries using data regarding Facebook users’ food and drink preferences.”

For the study, the team developed a new approach that uses Facebook data to gauge cultural similarity between countries, by making associations between immigration patterns and the overall preference for food and drink across various locations.

They employed this approach as migrants have a very important role to play in shaping cultural similarities between countries. However, they explain, it’s hard to study their influence directly, in part because it is hard to ‘measure’ culture reliably. The traditional way of gauging culture comes in the form of surveys, but these have several drawbacks such as cost, the chances of bias in question construction, and difficulties in applying them to a large sample of countries.

The team chose to draw on previous findings that show food and drink preferences may be a proxy for cultural similarities between countries, and build a new analytical method based on this knowledge. They drew on Facebook’s top 50 food and drink preferences in various countries — as captured by the Facebook Advertising Platform — in order to see what people in different areas liked to dine on.

“This platform allows marketers and researchers to obtain an estimate of the number of Facebook monthly active users for a proposed advertisement that matches the given input criteria based on a list of demographic attributes, such as age, gender, home location, and interests, that can be customized by the advertiser,” Vieira explained for ZME Science. “Because we focus on food and drink as cultural markers, we selected the interests classified by Facebook as related to food and drink. We selected the top 50 most popular foods and drinks in each one of the sixteen countries we analyzed to construct a vector indicator of each country in terms of these foods and drinks to finally measure the cultural similarity between them.”
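To make the vector comparison concrete, here is a minimal sketch of how such a similarity score could be computed. This is an illustration, not the authors’ code: it assumes each country is summarized as a vector of estimated audience shares per interest and uses cosine similarity (the paper’s exact metric may differ), and the interests and numbers below are invented.

```python
from math import sqrt

def cosine_similarity(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse interest vectors,
    keyed by interest name and valued by estimated audience share."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    if norm_a == 0.0 or norm_b == 0.0:
        return 0.0
    return dot / (norm_a * norm_b)

# Invented audience shares (fraction of monthly active users matched
# by each food or drink interest on the advertising platform).
spain = {"tapas": 0.31, "paella": 0.28, "mate": 0.12, "asado": 0.10}
argentina = {"mate": 0.35, "asado": 0.30, "paella": 0.08, "tapas": 0.07}

print(f"Spain vs. Argentina: {cosine_similarity(spain, argentina):.2f}")
```

Taking the union of each country’s interests keeps the comparison well-defined even when the two top-50 lists only partially overlap.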

In order to validate their findings, the team applied the method to 16 countries. They report that food and drink interests, as reflected by Facebook ads, generally align with documented immigration patterns. Preferences for foreign food and drink align with domestic preferences in the countries from which most immigrants came. On the other hand, countries with few immigrants showed lower preferences for foreign foods and drinks, and were consistently interested in a narrower range of such products.

The team cites the example of the asymmetry between Mexico and the U.S. as an example of the validity of their model. The top 50 foods and drinks from Mexico are more popular in the U.S. than the top 50 U.S. foods and drinks are in Mexico, they explain, aligning well with the greater degree of immigration coming from Mexico into the U.S. than the other way around.

All in all, the findings strongly suggest that immigrants help shape the culture of various countries. In the future, the team hopes to expand their methodology to include other areas of preference beyond food and drink, and see whether these align with known immigration patterns.

“The food and drink preferences shared by Facebook users from two different countries might indicate a high immigrant population from one country living in the other. In our results we observed that immigration is associated with higher cultural similarity between countries. For example, there are a lot of immigrants from Argentina living in Spain and our measure showed that one of the most similar countries to Spain is Argentina. This means that foods and drinks popular between Facebook users in Argentina are also really popular in Spain,” she adds.

“The most surprising aspect of this study is the methodology and more precisely, the data we used to study culture. Differently from surveys, our methodology is timely, [cost-effective], and easily scalable because it uses passively-collected information internationally available on Facebook.”

Overall, the researchers say, this study suggests that immigrants indeed help shape the culture of their destination country. Future research could refine the new method outlined in this study or repurpose it to examine and compare other interests beyond food and drink.

“I would like to see our proposed measure of cultural similarity being used in different contexts, such as to predict migration. For instance, it would be interesting to use our measure of cultural similarity to answer the question: Do the migrants prefer to migrate to a country culturally similar to their origin country?” Vieira concludes in her email. “More generally, I hope our work contributes to increasing the development of research using social media data as an alternative to complement more traditional data sources to study society.”

The paper “The interplay of migration and cultural similarity between countries: Evidence from Facebook data on food and drink interests” has been published in the journal PLoS ONE.

Shifting to a healthier diet can increase your lifespan by up to a decade

New research is showcasing how a healthier, more balanced diet — including more legumes, whole grains, and nuts, while cutting down on red and processed meat — can lead to longer lives.

Image via Pixabay.

“You are what you eat” is an age-old saying, but a new study from the University of Bergen says that we also live as long as what we eat. The healthier and more diverse our diets, the healthier and longer our life expectancy (LE) becomes, it reports.

The paper estimates the effect of such changes in the typical Western diets for the two sexes at various ages; the earlier these guidelines are incorporated into our eating habits, the larger the improvements in LE, but older people stand to benefit from significant (if smaller) gains as well.

Change your meals, enjoy more meals

“Our modeling methodology used data from [the] most comprehensive meta-analyses, data from the Global Burden of Disease study, life-table methodology, and added analyses on [the] delay of effects and combination of effects including potential effect overlap”, says Lars Fadnes, a Professor at the Department of Global Public Health at the University of Bergen who led the research, in an email for ZME Science.

“The methodology provides population estimates under given assumptions and is not meant as individualized forecasting, with uncertainty that includes time to achieve full effects, the effect of eggs, white meat, and oils, individual variation in protective and risk factors, uncertainties for future development of medical treatments; and changes in lifestyle.”

Dietary habits are estimated to contribute to 11 million deaths annually worldwide, and to 255 million disability-adjusted life-years (DALYs). One DALY, according to the World Health Organization “represents the loss of the equivalent of one year of full health”. In other words, there’s a lot of room for good in changing what we eat.
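For reference, the WHO computes this metric as the sum of years of life lost to premature mortality (YLL) and years lived with disability (YLD). In the standard formulation, N is the number of deaths, L_std the standard life expectancy at the age of death, I the number of incident cases, DW the disability weight, and L_dis the average duration of the condition:

```latex
\mathrm{DALY} = \mathrm{YLL} + \mathrm{YLD}, \qquad
\mathrm{YLL} = N \cdot L_{\mathrm{std}}, \qquad
\mathrm{YLD} = I \cdot DW \cdot L_{\mathrm{dis}}
```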

The team drew on existing databases to develop a computerized model to estimate how a range of dietary changes would impact life expectancy. The model is publicly available as the online Food4HealthyLife calculator, which you can use to get a better idea of how changing what you eat can benefit your lifespan. The team envisions that their calculator would also help physicians and policy-makers to understand the impact of dietary choices on their patients and the public.
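To give a flavor of the life-table side of such models, here is a toy sketch in Python. This is an illustration under strong simplifying assumptions, not the Food4HealthyLife model itself: it applies a single relative risk uniformly across all ages and uses an invented mortality schedule, whereas the real model works with cause-specific risks, effect delays, and overlap corrections.

```python
def life_expectancy(mortality_rates, rr=1.0):
    """Period life expectancy at birth from annual age-specific
    mortality rates (list index = age in years), with an optional
    relative risk 'rr' applied uniformly across ages."""
    survival, expected_years = 1.0, 0.0
    for q in mortality_rates:
        q_adj = min(q * rr, 1.0)                        # adjusted chance of dying this year
        expected_years += survival * (1.0 - q_adj / 2)  # half-year credit in year of death
        survival *= 1.0 - q_adj
    return expected_years

# Invented mortality schedule that rises exponentially with age.
rates = [0.0005 * 1.09 ** age for age in range(110)]

baseline = life_expectancy(rates)
improved = life_expectancy(rates, rr=0.85)  # assume a diet change cuts mortality by 15%
print(f"Baseline: {baseline:.1f} years; with diet change: {improved:.1f} years")
```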

For your typical young adult (20 years old) in the United States, the team reports that changing from the typical diet to an optimal one (as described by their model) could increase LE by roughly 10.7 years for women and 13 years for men. There is quite some uncertainty in these results (increases for women range between 5.9 and 14.1 years, and for men between 6.9 and 17.3), owing to factors the model doesn’t account for, such as preexisting health conditions and socioeconomic status. Changing diets at age 60 would still yield an increase in LE of 8 years for women and 8.8 years for men.

“The differences in life expectancy estimates between men and women are mainly due to differences in background mortality (and particularly cardiovascular disease such as coronary heart disease, where men generally are at higher risk at an earlier age compared to women),” prof. Fadnes explained for ZME Science.

The largest gains in LE would be made by eating more legumes, more whole grains, more nuts, less red meat, and less processed meat.

So far, the research focused on the impact of diet on LE, but such changes could be beneficial in other ways, as well. Many of the suggestions the team makes are also more environmentally sustainable and less costly, financially. The team is now hard at work incorporating these factors into their online calculator, in order to help people get a better understanding of just how changes in diet can improve their lives, on all levels involved.

“We are working to include sustainability aspects in Food4HealthyLife too. Based on former studies, the optimal diets are likely to have substantial benefits compared to a typical Western diet also in terms of reduction in greenhouse gas emissions, land use, and other sustainability facets,” he added for ZME Science. “We have not systematically investigated financial aspects yet, but several of the healthy options could also be cheap, such as legumes and whole grains.”

The paper “Estimating the Impact of Food Choices on Life Expectancy: A Modeling Study” has been published in the journal PLoS Medicine.

Just one extra hour of sleep can help overweight people eat less

Credit: Pixabay.

Research conducted over the years has increasingly linked poor sleep (particularly sleeping less than the minimally recommended 7 hours per night) to the risk of weight gain over time. Not sleeping enough may result in hormonal imbalances that affect appetite, leading some to eat more than they normally would on a healthy sleep regimen.

To investigate in more detail how sleep affects calorie intake, researchers from the University of Chicago and the University of Wisconsin-Madison conducted a randomized clinical trial involving 80 young, overweight adults who habitually sleep less than 6.5 hours a night.

“Over the years, we and others have shown that sleep restriction has an effect on appetite regulation that leads to increased food intake, and thus puts you at risk for weight gain over time,” said lead investigator Esra Tasali, director of the UChicago Sleep Center at the University of Chicago Medicine. “More recently, the question that everyone was asking was, ‘Well, if this is what happens with sleep loss, can we extend sleep and reverse some of these adverse outcomes?’”

The volunteers were randomly split into two groups. One received personalized sleep hygiene counseling, which involved changing one’s routine to avoid the things that hinder sleep (caffeine in the evening, heavy meals close to bedtime, an excessively warm bedroom, etc.) and introduce activities that aid sleep (going to bed at the same time every night, using your bed only for sleep or sex, etc.). The other group received no intervention at all and acted as a control.

In the first two weeks, the researchers just gathered baseline information about sleep and calorie intake. Sleep patterns were measured using wearable devices, while calorie intake was quantified using the “doubly labeled water” method, a tried-and-tested urine-based test for objectively tracking energy use in which a participant drinks water whose hydrogen and oxygen atoms have been partly replaced with stable isotopes that are easy to trace. With this technique, it is possible to measure every calorie a person burns over a one-to-two-week interval without having to hawkishly record everything they put in their mouth.
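For the curious, the core logic of the method can be sketched in a few lines. Deuterium leaves the body only as water, while oxygen-18 leaves as both water and carbon dioxide, so the gap between the two elimination rates reflects CO2 production, which in turn tracks energy expenditure. The snippet below is a simplified illustration based on the classic Lifson relation and the Weir equation with an assumed respiratory quotient; published protocols add isotope-fractionation and pool-size corrections, and all input numbers here are invented.

```python
def co2_production(body_water_mol, k_oxygen, k_deuterium):
    """Simplified Lifson relation: rCO2 = (N / 2) * (kO - kD).
    The deuterium rate measures water turnover alone; the oxygen-18
    rate measures water plus CO2 turnover."""
    return (body_water_mol / 2.0) * (k_oxygen - k_deuterium)  # mol CO2 per day

def energy_expenditure_kcal(rco2_mol_per_day, rq=0.85):
    """Weir equation, assuming a typical respiratory quotient (RQ)."""
    vco2_liters = rco2_mol_per_day * 22.4  # moles of gas -> liters (approx., at STP)
    vo2_liters = vco2_liters / rq
    return 3.9 * vo2_liters + 1.1 * vco2_liters

# Invented inputs: ~2,200 mol of body water (~40 liters) and per-day
# isotope elimination rates fitted from the urine samples.
rco2 = co2_production(2200, k_oxygen=0.120, k_deuterium=0.105)
print(f"~{rco2:.0f} mol CO2/day -> ~{energy_expenditure_kcal(rco2):.0f} kcal/day")
```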

“This is considered the gold standard for objectively measuring daily energy expenditure in a non-laboratory, real-world setting and it has changed the way human obesity is studied,” said the study’s senior author Dale Schoeller, professor emeritus of nutritional sciences at UW–Madison.

A month after the study started, the researchers found that participants in the sleep intervention group managed to extend their sleep duration by an average of 1.2 hours. Compared to the control group, the sleep intervention reduced the participants’ daily calorie intake by 270 calories, the equivalent of a small meal.

Of important note is that this examination was performed in a real-world setting. Each volunteer slept in their own bed, ate what they wished, wasn’t prompted to exercise, and generally went about their days as they pleased. That’s in stark contrast to most weight-loss studies, which are generally short-lived and measure calorie intake diligently by making sure participants only consume a particular offered diet.

The only factor that was manipulated in the study was sleep duration, and this single aspect proved to have a significant impact on the participants’ calorie intake. If the average reduction in calorie intake of 270 calories per day is maintained over the long term, this would translate to roughly 12 kg (26 pounds) of weight loss over a three-year period. That’s on average; some participants consumed as many as 500 fewer calories per day.
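That three-year figure is consistent with a simple dynamic energy-balance calculation. The sketch below is a back-of-the-envelope model, not the authors’ method, and its constants are rule-of-thumb values from the weight-modeling literature: each kilogram already lost trims daily energy expenditure by roughly 24 kcal, so the effective deficit shrinks and weight loss plateaus rather than accumulating forever.

```python
def simulate_weight_loss(deficit_kcal=270.0, days=3 * 365,
                         kcal_per_kg_tissue=7700.0, slope_kcal_per_kg=24.0):
    """First-order energy-balance model: the initial calorie deficit
    is eroded as falling body weight lowers daily expenditure."""
    lost_kg = 0.0
    for _ in range(days):
        effective_deficit = deficit_kcal - slope_kcal_per_kg * lost_kg
        lost_kg += effective_deficit / kcal_per_kg_tissue
    return lost_kg

kg = simulate_weight_loss()
print(f"Predicted loss after 3 years: {kg:.1f} kg ({kg * 2.2:.0f} lb)")
```

Under these assumptions, the model predicts roughly 11 kg (24 lb) over three years, in the same ballpark as the study’s estimate.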

“This was not a weight-loss study,” said Tasali. “But even within just two weeks, we have quantified evidence showing a decrease in caloric intake and a negative energy balance — caloric intake is less than calories burned. If healthy sleep habits are maintained over a longer duration, this would lead to clinically important weight loss over time. Many people are working hard to find ways to decrease their caloric intake to lose weight — well, just by sleeping more, you may be able to reduce it substantially.”

In the future, the researchers plan on studying the underlying mechanisms that may explain why more sleep can lead to weight loss. Previous research by Tasali and colleagues suggests that sleep is important for appetite regulation. Limited sleep may drive changes in appetite-regulating hormones and reward centers in the brain that could lead to overeating.

If you struggle with both your sleep and weight, these findings suggest a simple intervention could do wonders: just sleep more. That’s harder than it sounds, but with some hard work, it is possible. According to the researchers, limiting the use of electronic devices before bedtime was a key intervention.

Here are a few tips that may help you clock in more hours of sleep:

  1. Go to sleep at the same time each night, and get up at the same time each morning, even on the weekends.
  2. Don’t take naps after 3 p.m., and don’t nap longer than 20 minutes.
  3. Stay away from caffeine and alcohol late in the day.
  4. Avoid nicotine completely.
  5. Get regular exercise, but not within 2-3 hours of bedtime.
  6. Don’t eat a heavy meal late in the day. A light snack before bedtime is OK.
  7. Make your bedroom comfortable, dark, quiet, and not too warm or cold.
  8. Follow a routine to help you relax before sleep (for example, reading or listening to music). Turn off the TV and other screens at least an hour before bedtime.
  9. Don’t lie in bed awake. If you can’t fall asleep after 20 minutes, do something calming until you feel sleepy, like reading or listening to soft music.
  10. Talk with a doctor if you continue to have trouble sleeping.

The findings of the new study appeared in the journal JAMA Internal Medicine.

What is vitamin K?

Vitamin K plays a key role in our blood’s ability to form clots. It’s one of the less glamorous vitamins, more rarely discussed than its peers and, although it’s usually referred to as a single substance, it comes in two natural varieties — K1 and K2 — and one synthetic one, K3. People typically cover their requirements of vitamin K through diet, so it’s rarely seen in supplement form, but we’ll also look at some situations that might require an extra input of vitamin K.

A molecule of menatetrenone, one of the forms of vitamin K2. Image via Wikimedia.

The ‘K’ in vitamin K stands for Koagulations-vitamin, Danish for ‘coagulation vitamin’. This is a pretty big hint as to what these vitamers — the term used to denote the various chemically-related forms of a vitamin — help our bodies do. Vitamin K is involved in modification processes that proteins undergo after they have been synthesized, and these proteins then go on to perform clotting wherever it is needed in our blood. Apart from this, vitamin K is also involved in calcium-binding processes for tissues throughout our bodies, for example in bones.

Although we don’t need very high amounts of vitamin K to be healthy (relative to other vitamins), a deficiency of it is in no way a pretty sight. Without enough vitamin K, blood clotting is severely impaired, and uncontrollable bleeding starts occurring throughout our whole bodies. Some research suggests that a deficiency of this vitamin can also cause bones to weaken, leading to osteoporosis, or to the calcification of soft tissues.

Types of vitamin K

Chemically speaking, vitamin K1 is known as phytomenadione or phylloquinone, while K2 is known as menaquinone. They’re quite similar from a structural point of view, being made up of two aromatic rings (rings of carbon atoms) with a long chain of carbon atoms tied to one side. K2 has two subtypes, one of which is longer than the other, but they perform the same role in our bodies. The K1 variety is the most often seen one in supplements.

Vitamin K3 is known as menadione. It used to be prescribed as a treatment for vitamin K deficiency, but it was later discovered that it interfered with the function of glutathione, an important antioxidant and key metabolic molecule. As such, it is no longer in use for this role in humans.

All forms of vitamin K are fat-soluble and tend to degrade rapidly when exposed to sunlight. The vitamin is also broken down and excreted quickly, so it’s exceedingly rare for it to reach toxic concentrations in humans. Vitamin K is concentrated in the liver, brain, heart, pancreas, and bones.

Sources

Vitamin K is abundant in green, leafy vegetables, where it is involved in photosynthesis. Image credits Local Food Initiative / Flickr.

As previously mentioned, people tend to get enough vitamin K from a regular diet.

Plants synthesize vitamin K1, especially in tissues directly involved in photosynthesis; as such, mixing leafy or green vegetables into your diet is a good way to access high levels of the vitamin. Spinach, asparagus, broccoli, or legumes such as soybeans are all good sources. Strawberries also contain this vitamin, to a somewhat lesser extent.

Animals also rely on this vitamin for the same processes human bodies do, so animal products can be a good source of it as well. Animals tend to convert the vitamin K1 they get from eating plants into one of the K2 varieties (MK-4). Eggs and organ meats such as liver, heart, or brain are high in K2.

All other forms of vitamin K2 are produced by bacteria during anaerobic respiration. As such, fermented foods can also be a good source of this vitamin.

Some of the most common signs of deficiency include:

  • Slow rates of blood clotting;
  • Long prothrombin times (prothrombin is a key clotting factor measured by doctors);
  • Spontaneous or random bleeding;
  • Hemorrhaging;
  • Osteoporosis (loss of bone mass) or osteopenia (loss of bone mineral density).

Do I need vitamin K supplements?

Cases of deficiency are rare. However, certain factors can promote such deficiencies. Most commonly, this involves medication that blocks vitamin K metabolism as a side-effect (some antibiotics do this) or medical conditions that prevent the proper absorption of nutrients from food. Some newborns can also experience vitamin K deficiencies as this compound doesn’t cross through the placenta from the mother, and breast milk only contains low levels of it. Due to this, infants are often given vitamin K supplements.

Although it is rare to see toxicity caused by vitamin K overdoses, it is still advised that supplements only be taken when prescribed by a doctor. Symptoms indicative of vitamin K toxicity are jaundice, hyperbilirubinemia, hemolytic anemia, and kernicterus in infants.

Vitamin K deficiencies are virtually always caused by malnourishment, poor diets, or by the action of certain drugs that impact the uptake of vitamin K or its role in the body. People who use antacids, blood thinners, antibiotics, aspirin, and drugs for cancer, seizures, or high cholesterol are sometimes prescribed supplements — again, by a trained physician.

How was it discovered?

The compound was first identified by Danish biochemist Henrik Dam in the early 1930s. Dam was studying another topic entirely: cholesterol metabolism in chickens. However, he observed that chicks fed with a diet low in fat and with no sterols had a high chance of developing subcutaneous and intramuscular hemorrhages (strong bleeding under the skin and within their muscles).

Further studies with different types of food led to the identification of the vitamin, which Dam referred to as the “Koagulations-Vitamin”.

Some other things to know

Some of the bacteria in our gut help provide us with our necessary intake of vitamin K — they synthesize it for us. Because of this, antibiotic use can lead to a decrease in vitamin K levels in our blood, as they decimate the populations of bacteria in our intestines. If you’re experiencing poor appetite following a lengthy or particularly strong course of antibiotics, it could be due to such a deficiency. Contact your physician and tell them about your symptoms if you think you may need vitamin K supplements in this situation; it’s not always the case that you do, but it doesn’t hurt to ask.

Another step you can take to ensure you’re getting enough vitamin K is to combine foods that contain a lot of it with fats — as this vitamin is fat-soluble. A salad of leafy greens with olive oil and avocado is a very good way of providing your body with vitamin K and helping it absorb as much of it as possible.

Childhood obesity is a growing problem. Soda picture warnings could help fight it

Childhood obesity is on the rise. In the US, the country most affected by it, 1 in 5 children and teenagers are obese — even among the 2 to 5 age group, the obesity incidence in the US is 13.4%. But a new strategy could help counteract that: labeling sodas like cigarettes, with health warnings, could reduce children’s sugar uptake.

As most parents will tell you, kids love their sugary drinks. The problem is that these drinks have no real nutritional value, and contain a lot of sugar.

Marissa G. Hall, assistant professor in Gillings’ Department of Health Behavior and lead author of the new study, says sugary drinks have become a major problem for kids. They’re contributing to type 2 diabetes, cavities, and a number of other health problems. Hall and colleagues wanted to see whether label warnings could convince parents to make healthier choices for their children and reduce the consumption of sugary drinks.

“Health warning labels with images (pictorial warnings) are a proven strategy for reducing smoking, but haven’t been extensively tested for sugary drinks,” Hall told ZME Science. “We wanted to conduct a study to understand whether pictorial warnings could shift parents to make healthier drink choices for their kids. We found that the warnings worked – parents who saw the warnings were more likely to choose a healthy beverage for their child than parents who did not see the warning.”

They carried out the study in a unique laboratory — the “UNC Mini Mart.” The space is essentially a replica of a convenience store, mimicking the experience of shoppers. This then allows researchers to see how different tweaks and changes would affect shopper behavior.

In particular, they wanted to see if visual labels, warning buyers of the health problems associated with sugar intake, would reduce the consumption of sodas. They did.

“In our study, we used images visually depicting type 2 diabetes and heart damage,” Hall explained. “These images performed well when we pre-tested them. We wanted to use large, attention-grabbing images following best practices for warning label design. Our studies have found that images are most effective when they attract attention and make people think about the harms of the product.”

In the study, 326 parents (25% of whom were Black and 20% of whom were Latino) participated in a trial in which drink labels displayed either health warnings (depicting diabetes or heart damage) or a barcode. Participants were instructed to choose one drink and one snack for their child, as well as a household good. Afterwards, they completed a survey about their selections and were allowed to keep their drink of choice (as well as a small cash reward).

Overall, there was a 17-percentage-point reduction in purchases of sugary drinks: 45% of parents in the control group bought a sugary drink for their child, compared to 28% in the pictorial warning group.

Hall says authorities could take note of this and implement policies to reduce the consumption of sugary drinks.

“Health warning labels for sugary drinks could be mandated at the local or state level, or by the federal government. So far, 7 US states have proposed health warnings for sugary drink containers. And 7 countries currently require similar warning labels on unhealthy foods and beverages. So warning labels policies are gaining momentum here in the US and worldwide. Our research shows that implementing warning labels on sugary drinks encourages parents to select healthier beverages for their children, the first time this had been shown in a randomized trial,” the researcher concludes.

The study was published in PLOS Medicine.

Regular Vitamin D and Omega-3 intake reduces risk of autoimmune disease

Autoimmune diseases, in which the immune system attacks healthy cells by mistake, are a growing problem. In the industrialized world, they are the third leading cause of overall morbidity and the leading cause of morbidity among women. Often, there is no effective cure, and the best available approach aims only at reducing symptoms.

Previous studies have suggested that vitamin D could have an effect on autoimmune disease. To put that to the test, researchers from Harvard and Brigham and Women’s Hospital in Boston, led by Jill Hahn, carried out a study on almost 26,000 participants (mean age 67.1 years), analyzing the effect of vitamin D and omega-3 supplements on autoimmune disease.

They randomly assigned participants to receive vitamin D supplements, omega-3 supplements, both, or placebos, and then compared outcomes between the groups. It was a randomized, double-blind trial: neither the participants nor the researchers knew who was in which group, removing potential bias and influence from the study.

The team then tracked participants for five years.

Overall, a dose of 2000 international units (IU) of vitamin D per day reduced the risk of developing an autoimmune disease by 22% compared to the placebo group. Surprisingly, when researchers looked at only the last three years of the intervention, the vitamin D group had 39% fewer participants who developed autoimmune diseases.

This dose is larger than the standard 400-800 IU recommended by most health organizations. However, consuming 1,000 IU or even more has been linked to various health benefits, and there seem to be no negative side effects up to 4,000 IU.

We get vitamin D from two sources: food and the Sun. Foods such as fatty fish, cheese, and mushrooms are rich in vitamin D. Your body also produces vitamin D when exposed to sunlight, which is why, especially in places with less sunlight, vitamin D-rich foods or supplements are worth considering.

Previous studies have suggested that vitamin D levels are correlated with the risk and severity of autoimmune diseases, as well as several other diseases such as MS, heart disease, and influenza. A Danish study even found a 49% reduction in rheumatoid arthritis risk for each 30 g increase in daily fatty fish intake. However, the mechanisms through which vitamin D exhibits these protective effects are unclear, and researchers warn that you shouldn’t rush to take as much Vitamin D as you can.

“I would say everybody should talk to their doctor first before taking 2000 international units of vitamin D on top of whatever else you’re taking,” study author Dr. Karen Costenbader, a professor of medicine at Harvard Medical School in the division of Rheumatology, Inflammation and Immunity and the director of the lupus program at Brigham and Women’s Hospital in Boston, told CNN. “And there are certain health problems such as kidney stones and hyperparathyroidism (a rise in calcium levels), where you really shouldn’t be taking extra vitamin D.” Speaking to NewScientist, Costenbader added that there are several known potential mechanisms, such as vitamin D helping the immune system tell the difference between healthy cells and disease-causing microbes.

The study also reported a possible link between omega-3 supplements and a reduction in autoimmune disorders, though this was not immediately apparent. It was only when researchers also considered the possible cases of autoimmune disease (and not just the confirmed cases) that the association emerged.

Now, the researchers are working on another survey that also includes younger participants to assess the impact of vitamin D on autoimmune diseases and see how long the benefits last.

The study was published in the British Medical Journal.

Ethiopian ‘false bananas’ could be the new supercrop we’ve been waiting for against climate change

Bananas versus enset. Credit: RBG KEW.

Enset is a very close relative of the banana that’s grown and consumed in some parts of Ethiopia. Outside the Horn of Africa, especially in the West, virtually no one has heard of this crop, which locals have been using for centuries to make porridge and bread. Pay attention though: enset could become a new staple across the world. Scientists claim that enset is highly resilient to climate change and could help feed more than 100 million people, boosting food security in regions where conventional crops are threatened by rising temperatures and extreme weather.

The tree against hunger

According to the Intergovernmental Panel on Climate Change, global temperature is expected to reach or exceed 1.5°C of heating, averaged over the next 20 years. As temperatures increase, crop yields for the world’s most essential crops, which provide over 66% of the calories people across the globe consume, are expected to decrease. Maize yields, for instance, could plummet by 24% as early as 2030 under a high greenhouse gas emissions scenario.

Climate change disproportionately affects sub-Saharan African countries because their economies are highly dependent on rain-fed agriculture. It is therefore likely that the agriculture sector, which provides essential food for human consumption and feed for livestock, will undergo an important transformation in order to withstand the impacts of climate change and protect the livelihoods of farmers. Such transformation may involve introducing new crops that are currently not being rotated — and this is where enset may come in.

A farm in the southern Ethiopian highlands. Credit: Richard Buggs, RBG Kew.

Enset (Ensete ventricosum) is a perennial crop that fruits only once in its 10-year-long life cycle. It is known as the Ethiopian banana, Abyssinian banana, or false banana due to its morphological resemblance with the banana. The crop, which was domesticated some 8,000 years ago, is widely cultivated in the south and southwestern parts of Ethiopia, representing a traditional staple for about 20 million people. A multipurpose crop, enset is also utilized to feed animals, make clothes and mats from its fibers, and build dwellings.

However, unlike sweet bananas, which are widely farmed for their fruits, people in Ethiopia disregard the enset fruit and use its starchy stems and roots instead, from which they make porridge and bread.

There are a number of reasons why enset may boost food security. It grows over a relatively wide range of conditions, is somewhat drought-tolerant, and can be harvested at any time of the year, over several years. It provides an important dietary starch source, as well as fibers, medicines, animal fodder, roofing, and packaging. The crop also stabilizes soils and microclimates. These attractive qualities have earned it the nickname the ‘tree against hunger’.

Although enset is a hugely underrated food crop, not much research regarding its potential to feed a wider population had been conducted until recently. Dr. James Borrell, of the Royal Botanic Gardens, Kew, has run agricultural surveys and modeling work to investigate what the potential range of enset could look like over the next four decades, and the findings are very encouraging.

The researchers found that the crop could feed at least 100 million people in the coming decades, boosting food security not just in Ethiopia but other vulnerable African countries, such as Kenya, Uganda, and Rwanda. Writing in the journal Environmental Research Letters, the authors believe enset could supplement our diets and offset expected yield losses of rice, wheat, and maize due to climate change.

“We need to diversify the plants we use globally as a species because all our eggs are in a very small basket at the moment,” said Dr. Borrell.

Non-vegetarians more likely to opt for plant-based options when the menu is 75% vegetarian

Meat production is taxing on the environment, and if we want to reduce our carbon footprint and tackle climate change, we need as many people to cut down on their meat consumption as possible. Restaurants and cafeterias can play a role in this — firstly, by offering plant-based alternatives.

Many restaurants and even fast food places already offer at least one vegetarian or vegan option, which is a good start, especially for those who regularly opt for such options. But could more plant-based options push more meat-eaters to go for a veggie option, at least once in a while?

With this question in mind, a team of researchers from the University of Westminster carried out an experiment in which 468 participants were given menus where 75%, 50%, or 25% of the items were vegetarian. The menus looked like this:

Participants were given a menu where A) 75% of the dishes were meat-based and 25% vegetarian, B) 50% of the dishes were meat-based and 50% vegetarian, or C) 25% of the dishes were meat-based and 75% vegetarian. Credits: Parkin & Atwood (2021).

Researchers wanted to see whether having access to more vegetarian options makes a significant difference. Apparently it does, but only when 75% of the options are vegetarian.

“We show that meat eaters were significantly more likely to choose a vegetarian meal when presented with a menu with 75% vegetarian items, but not when half (50%) were vegetarian,” the study notes.

There are significant shortcomings to the study: the sample size is small and may not be representative of the wider population, and the type of menu may also play a role. Still, the researchers say the study shows that interventions offering more vegetarian options can push consumers towards more sustainable, low-meat, low-carbon choices.

Dr. Beth Parkin, lead author of the study from The University of Westminster, said:

“This intervention shows the potential that the food service sector has in creating large-scale shifts to encourage meat eaters to change their preferences. The findings provide practical instruction on what percentage of their food offerings should be vegetarian if they are to succeed in encouraging sustainable eating behaviors. If the food service industry is to decrease its carbon footprint, they need to act by providing far more plant-based items than currently on offer.”

The meat and dairy industries account for nearly 60% of agricultural emissions, or 15-20% of total planetary greenhouse gas emissions, and cutting back on meat is one of the most impactful changes we, as individual consumers, can make. A growing body of scientific evidence shows that diet changes are paramount to avoiding catastrophic climate change, and this type of menu intervention can help reduce that negative impact, the researchers conclude.

The study has been published in the Journal of Environmental Psychology.

Cases of eating disorders have doubled in the US during the pandemic

The number of hospitalizations for eating disorders across the US doubled during the pandemic, according to new research covering January 2018 through December 2020. The largest part of this increase was represented by cases of anorexia or bulimia.

Image via Pixabay.

Meanwhile, other common behavioral health conditions, such as depression, alcohol use, or opioid use disorder, registered no meaningful changes during this period.

Eating issues

“This pandemic era is going to have some long-term impacts on the course of disease and the course of weight over the lifespan,” says Kelly Alison, Ph.D., Director of the Center for Weight and Eating Disorders at the University of Pennsylvania, co-author of the paper. “What that does for eating disorders? We just don’t know.”

Although the team can’t yet tell what caused this increase, they believe we’re looking at the combined effect of several factors, ranging from the toll the pandemic has taken on our mental health to an outsized focus on weight gain (amplified by constantly viewing ourselves on video calls) and even symptoms of COVID-19 itself. There is also very little data on how this trend will affect public health in the long run.

The study included data from over 3.2 million individuals across the U.S., with a mean age of 37.7 years old. According to the findings, the number of inpatient care cases for eating disorders remained pretty stable over time, at approximately 0.3 cases per 100,000 people per month, until May 2020. At that date, the number of cases doubled, to 0.6 per 100,000. This increase was registered across anorexia nervosa, bulimia nervosa, and other and unspecified eating disorders.

The average length of inpatient stays for such cases has also increased. This was on average 9 days and 8 days between June to December of 2018 and 2019, respectively, going up to 12 days between June and December of 2020. A similar increase was not seen for the 3 behavioral health conditions used as controls over the same timeframe.

Outpatient care cases for eating disorders increased from around 25 per 100,000 people per month to 29 per 100,000. The age range of inpatients spanned 12 to 20 before the pandemic, shifting to 18 to 28 after its onset.


Stress caused by the pandemic and the changes it caused in our lives could be one of the drivers of this increase, the team reports. Additionally, the shift towards video calls for conferences at work gives us ample opportunity to look at ourselves, which can create a further drive towards the development of eating disorders.

“During the pandemic, having a lack of routine and structure primed us in terms of our behaviors around food,” says Ariana Chao, Ph.D., from Penn’s School of Nursing.

Social media reflects this increase in self-scrutiny and concerns regarding weight, the authors report. As far as eating disorders are concerned, discussions about weight can be “very triggering”, Allison explains, so social media can create a lot of stress in patients at risk. Different people handle this stress differently, the team adds, with some binge eating, while others didn’t eat enough.

For now, it’s not clear whether the rising trend in eating disorder cases will continue after the pandemic. The present study is based on data up to December 2020, so it’s missing the latest part of the picture. The team is now hard at work analyzing data recorded well into 2021 to see how these trends are evolving.

“We really need more research,” says Chao. “Adversity can be a long-term predictor of developing eating disorders. Even the transition back to ‘normal’ can exacerbate eating disorders. Everything is changing so rapidly. Then again, people are also resilient. It’s hard to say what the long-term implications will be.”

The paper “Trends in US Patients Receiving Care for Eating Disorders and Other Common Behavioral Health Conditions Before and During the COVID-19 Pandemic” has been published in the journal JAMA Network Open.

There are pesticides inside your body — but an organic diet can flush them out

A study following four American families for two weeks found pesticide residues in every participant’s body, and showed that an organic diet can flush them out.

Image in Creative Commons

To feed the almost 8 billion people around the world (and furthermore, ensure rich and diverse diets for the richer countries), we’re spraying our crops with an impressive amount of pesticides. In one way, this is extremely helpful, drastically reducing the negative impact of pests and diseases that have plagued our crops for millennia. But there’s a downside to pesticides as well: we may end up ingesting them, and they could be harmful to our health.

Roundup, the world’s most widely used weedkiller, contains a compound called glyphosate. There’s a lot of scientific debate regarding the actual perils of glyphosate, but the compound was flagged as a potential carcinogen as far back as 1983 by the US Environmental Protection Agency (EPA). Although there is contradictory evidence regarding the health dangers posed by this chemical, public debate has been shaped by the corporate lobby just as much as (if not more than) scientific evidence. As a result, the EPA has raised the accepted level of glyphosate substantially (in some cases by a factor of 300) above levels considered safe in 1990. Unsurprisingly, the share of Americans with detectable levels of glyphosate in their bodies has also skyrocketed, from 12% in the mid-1970s to 70% by 2014.

The new study paints an even more concerning picture, as all members of the four families tracked contained glyphosate. The family members were tracked for six days on their regular diet, and for six days, they were asked to switch to a completely organic diet (various places have various definitions for what organic really means, but in this case, it was pesticide-free food). In just six days, the level of glyphosate in their bodies dropped by 70%.

We’ve written before about organic diets and how their alleged benefits are often exaggerated, but this study seems to show a tangible benefit. There is still debate over whether these pesticide levels cause actual damage, but the authors of the study believe they can be hazardous, especially in children.

“While food residues often fall within levels that regulators consider safe, even government scientists have made it clear that US regulations have not kept pace with the latest science,” one of the study authors writes in an article for The Guardian. “For one, they ignore the compounding effects of our daily exposures to a toxic soup of pesticides and other industrial chemicals. Nor do they reflect that we can have higher risks at different times in our lives and in different conditions: a developing fetus, for instance, is particularly vulnerable to toxic exposures, as are children and the immunocompromised. Instead, US regulators set one “safe” level for all of us. New research also shows that chemicals called “endocrine disruptors” can increase the risk of cancers, learning disabilities, birth defects, obesity, diabetes, and reproductive disorders, even at incredibly small levels. (Think the equivalent of one drop in 20 Olympic-sized swimming pools.).”
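For scale, that swimming-pool comparison checks out with quick arithmetic (a back-of-the-envelope calculation with assumed values: a drop of about 0.05 mL and an Olympic pool of about 2.5 million liters):

```python
POOL_LITERS = 2_500_000   # one Olympic pool: roughly 50 m x 25 m x 2 m
DROP_LITERS = 0.05e-3     # a typical drop is about 0.05 mL

ratio = DROP_LITERS / (20 * POOL_LITERS)
print(f"One drop in 20 pools = {ratio:.0e}")  # ~1e-12, i.e. one part per trillion
```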

Other areas have implemented stricter controls on how much glyphosate and other pesticides are used. In the EU, while the use of glyphosate isn’t banned, it’s limited to a much lower amount than in the US, and it’s not just this particular pesticide.

The US allows 70 pesticides currently banned in the EU, pesticides that can harm not only humans but also bees and other pollinators, triggering a chain reaction that can lead to ecosystem collapse. Pesticides also have many other negative environmental effects, including contributing to climate change: a report from the Intergovernmental Panel on Climate Change finds that about 30% of the global emissions driving climate change are attributable to agricultural activities, including pesticide use.

The big problem with organic food is that it’s expensive, but this could be addressed by shifting subsidies from pesticide-based to organic agriculture, the researchers believe. The US spends billions of dollars to support pesticide-based agriculture, while organic agriculture remains woefully underfunded despite growing demand.

However, organic farming also tends to use more land and water and comes with its own set of challenges. The relationship between health, organic agriculture, and the environment is complex, but if nothing else, an organic diet seems able to reduce our internal pesticide load. The problem is that for many, organic food is a luxury or, at best, a preference, when it could be seen as a public good. Just like the Green Revolution of the 1970s that brought pesticides, a new revolution could make organic food available to more and more people, boosting public health and saving money in the long run by preventing cancers and other health problems associated with pesticides (which can be very expensive to treat).

Ultimately, agriculture is bound to be complex in our modern world, and the line between “good” and “bad” is not always as clear as we’d want it to be. This new study offers important information regarding the real benefits of organic foods, and while results will need to be confirmed in larger studies, it’s still important to consider these advantages.

Healthier, more nutritious diets have a lower environmental impact — at least in the UK

More nutritious and healthy diet options can also help the climate, says a new analysis from the University of Leeds.

Image via Pixabay.

Our combined dietary habits can be a significant source of greenhouse gas emissions. Worldwide, food production accounts for roughly one-third of all emissions. This isn’t very surprising, since everybody needs to eat; but there are little tweaks we can apply to our lives which, added up, can lead to significant benefits for the climate.

New research at the University of Leeds reports that more nutritious, less processed, and less energy-dense diets can be much more sustainable from an environmental point of view than more common alternatives. While “less energy-dense” might sound like a bad thing, calorie content doesn’t translate into nutrient content. In other words, many energy-rich foods may actually just leave us fatter and malnourished.

Clean dining

“We all want to do our bit to help save the planet. Working out how to modify our diets is one way we can do that,” the authors explain. “There are broad-brush concepts like reducing our meat intake, particularly red meat, but our work also shows that big gains can be made from small changes, like cutting out sweets, or potentially just by switching brands.”

Similar analyses of the impacts of dietary options on the environment have been performed in the past. While their findings align well with the conclusions of the study we’re discussing today, they focused on broad categories of food instead of specific items. The team wanted to improve the accuracy of our data on this topic.

For the study, they pooled together published research on greenhouse gas emissions associated with food production to estimate the environmental impact of 3,233 specific food items. These items were selected from the UK Composition Of Foods Integrated Dataset (COFID). This dataset contains nutritional data regarding every item on the list and is commonly used to gauge the nutritional qualities of individuals’ diets.

The team used this data to evaluate the diets of 212 participants, who were asked to report what foods they ate during three 24-hour periods. In the end, this provided a snapshot of each participant’s usual nutritional intake and the greenhouse emissions generated during the production phase of all the items they consumed.
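Conceptually, that last step is simple bookkeeping: look up an emissions factor for each reported food item and sum over everything a participant ate. Here is a minimal sketch of the idea; the item names and factors are illustrative placeholders, not the actual COFID data or the team’s code.

```python
# Illustrative sketch of the study's bookkeeping step: map each food
# item from a 24-hour recall to an emissions factor and sum them.
# All names and numbers below are made-up placeholders.

EMISSION_FACTORS = {  # kg CO2-equivalent per kg of food
    "beef mince": 27.0,
    "cheddar cheese": 10.0,
    "white bread": 1.1,
    "lentils": 0.9,
}

def diet_footprint(food_diary):
    """food_diary: list of (item, grams eaten) from one 24-hour recall."""
    total_kg_co2e = 0.0
    for item, grams in food_diary:
        total_kg_co2e += EMISSION_FACTORS[item] * grams / 1000  # g -> kg
    return total_kg_co2e

one_day = [("beef mince", 150), ("white bread", 80), ("cheddar cheese", 30)]
print(f"{diet_footprint(one_day):.2f} kg CO2e")  # 4.44 kg CO2e
```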

What the results show, in broad strokes, is the environmental burden of different types of diets, broken down by their constituent elements.

According to the findings, non-vegetarian diets had an overall 59% higher level of greenhouse gas emissions compared to vegetarian diets. This finding isn’t particularly surprising; industrial livestock farming is a big consumer of resources such as food and water and produces quite a sizeable amount of emissions from the animals themselves, the production of fodder, and through the processing and storage of meat and other goods.

Overall, men’s diets tended to be associated with higher emissions (41% more on average than women’s diets), mainly due to higher meat consumption.

People who exceeded the recommended sodium (salt), saturated fat, and carbohydrate intake as set out by World Health Organization guidelines generated more emissions through their diets than those who did not.

Based on these findings, the authors offer their support for policies aimed at encouraging sustainable diets, especially heavily plant-based ones. They also support policies that promote replacing coffee, tea, and alcohol with more sustainable alternatives.

The current study offers a much higher-resolution view of the environmental impact of different food items, but it is not as in-depth as it could be. In the future, the authors hope to be able to expand their research to include elements such as brand or country of origin to help customers better understand what choices they’re making. They also plan to include broader measures of environmental impact in their analyses, not just greenhouse gas emissions.

For now, the findings are based only on data from the UK, so they may not translate perfectly to other areas of the globe.

The paper “Variations in greenhouse gas emissions of individual diets: Associations between the greenhouse gas emissions and nutrient intake in the United Kingdom” has been published in the journal PLOS ONE.

Ruby chocolate: not just color, but actually a different type of chocolate

If you haven’t been paying attention, you may have missed the appearance of a new type of chocolate: ruby chocolate. Developed in 2017 by a Swiss chocolatier, this variety officially became the fourth type of chocolate, after white, dark, and milk chocolate.

The main ingredients of ruby chocolate are cocoa butter, cocoa mass, milk, sugar, and citric acid.

Chocolate fans, rejoice

Swiss chocolatier Barry Callebaut is a household name in the chocolate sphere. The company supplies chocolate to the likes of Nestle, Hershey, Unilever, and Mondelez, and many top restaurants and professionals also use Callebaut for its classic taste, which has remained unchanged for decades (the company still uses whole-bean roasting, instead of roasting cocoa kernels, just as it did 100 years ago). But Callebaut also likes to try out new things.

The variety has been under study for over 10 years, ever since chocolate experts at Barry Callebaut noticed that components of certain cocoa beans could produce chocolate with an unusual color and flavor. Like grapes for fine wine, these cocoa beans had to be selected and cared for under special climate conditions. According to Callebaut, these conditions can be found in Ecuador, Brazil, or the Ivory Coast.

Although the exact production method of ruby chocolate is a trade secret, some publications believe they’ve homed in on the source of ruby cocoa: beans from a variety called Brazil Lavados, which can have a natural pinkish color and a delicate, sour taste. The beans take on a more intense color after being treated with an acid, and after they are defatted (a standard process in the chocolate industry), they end up a truly pink color.

It’s the first new type of chocolate developed in over 80 years, since white chocolate was introduced in 1936. Ruby chocolate also has a unique taste: it’s completely unlike dark or milk chocolate, and bears only a slight resemblance to white chocolate, with a berry-type flavor and a slightly tart aftertaste. It’s not overly sweet and carries a light overall aroma.

“Ruby chocolate is an intense sensorial delight. A tension between berry-fruitiness and luscious smoothness,” they write in a press release. “Ruby chocolate is made from the Ruby cocoa bean; through a unique processing, Barry Callebaut unlocks the flavor and color tone naturally present in the Ruby bean. No berries or berry flavor is added. No color is added.”

A 35 gram broken Ruby chocolate bar containing caramelized almonds and pistachios, produced by Confiserie Bachmann in Lucerne, Switzerland. Image credits: Zacharie Grossen.

Of course, developing the first new type of chocolate in almost a century can bring in a lot of money, in addition to bragging rights. So many, including ourselves, were skeptical, suspecting this could be little more than marketing. However, several studies analyzing ruby chocolate have noted its particularities.

Ruby chocolate science

A study from 2019 compared ruby chocolate to its dark, white, and milk counterparts. The researchers noted that ruby chocolate does exhibit a different phenolic content than all the other types of chocolate (phenols are mildly acidic aromatic compounds), ranging somewhere between milk and white chocolate. The researchers also added that ruby chocolate has a higher content of specific compounds (such as flavan-3-ols and proanthocyanidins).

However, when the researchers subjected the chocolates to a sensory assessment, they found that ruby chocolate was the least liked of the types tested; it even fared worse than white chocolate with added berries.

“The panel was formed of 20 trained personnel, 15 female and 5 male members, who had previous experience in the assessment of confectionery products,” the study reads.

“Semisweet and dark chocolate obtained the highest score in chocolate distinctive odour, while for the same attribute, Ruby chocolate was estimated as least preferable chocolate. White chocolate with strawberry was used because of similar sensory characteristics as Ruby chocolate, regarding taste and fruity odour, and was rated with a higher score compared to Ruby. The highest intensity of acidity was determined in Ruby chocolate, which is its main characteristic. All estimated sensory attributes were scored the best for the semisweet chocolate, while Ruby chocolate was least acceptable chocolate.”

Another study from 2021 confirmed the distinctive chemical components of ruby chocolate, analyzing its chemistry in unprecedented detail.

“The data show that a wide range of phytochemicals, present in the “conventional” dark and milk chocolates are present in ruby chocolate too. Most interesting is the finding that proanthocyanidins [a class of polyphenols found in many plants, such as cranberry, blueberry, and grape seeds] of the A-type appear to be characteristic for ruby chocolate, while B-type proanthocyanidins were found mainly in the dark chocolate,” the study read.

Researchers essentially confirmed that ruby chocolate is indeed a distinct type of chocolate.

“Ruby chocolate contained higher levels of epicatechin and procyanidin B2, compared to milk chocolate, which may be the result of a shorter, or no fermentation of the cocoa beans starting material used for the production of ruby chocolate. Moreover, the ruby chocolate was the only chocolate in which caffeic acid could be quantified,” the team noted.

However, the team made no claims regarding the quality or overall appeal of ruby chocolate. All in all, although the cocoa beans used to produce ruby chocolate are not genetically different from those used to create other types of chocolate, the way they are selected and processed leads to a product that is indeed chemically distinct.

The future of ruby chocolate

Ruby chocolate is already present in different types of products.

So where does this leave ruby chocolate? The product is still relatively new, but it has already penetrated quite a few markets. The first mass-market release was in January 2018, when it was introduced as a new Kit Kat flavor in Japan and South Korea. Nestlé, the manufacturer of Kit Kat, had an exclusive six-month deal for the use of pink chocolate, but that has since expired, and several companies in different countries have started selling pink chocolate products. It’s not just straight chocolate, either: Magnum sells ice cream bars dipped in ruby chocolate, while Costa and Starbucks each sell ruby chocolate-based drinks. It’s not exactly common yet, and the relatively low supply of suitable beans still limits production and distribution, but ruby chocolate seems to be catching on.

Regulators are also taking it seriously. The US Food and Drug Administration, for instance, set a standard for ruby chocolate: it must contain a minimum of 1.5% nonfat cacao solids and a minimum of 20% cacao fat by weight, and it cannot contain flavors that mimic milk, butter, or fruit, or any added coloring.
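Put as a rule, the standard is easy to check against a product’s composition. The thresholds below come straight from the figures above; the function itself is just an illustrative sketch, not anything published by the FDA.

```python
# Illustrative check against the FDA figures quoted above.
def meets_ruby_standard(nonfat_cacao_solids_pct, cacao_fat_pct,
                        mimicking_flavors, added_coloring):
    """True if a formulation satisfies the quoted standard."""
    return (nonfat_cacao_solids_pct >= 1.5   # min 1.5% nonfat cacao solids
            and cacao_fat_pct >= 20.0        # min 20% cacao fat by weight
            and not mimicking_flavors        # no milk/butter/fruit mimics
            and not added_coloring)          # no added color

# Hypothetical formulation: 2.5% nonfat cacao solids, 29% cacao fat
print(meets_ruby_standard(2.5, 29.0, False, False))  # True
```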

Whether ruby chocolate truly becomes a staple remains to be seen, but so far, the future looks promising. It will likely retain its novelty or delicacy status for some time, but it could well end up as diversified as white chocolate.

However, it will likely be plagued by what many experts see as a chocolate crisis on the horizon. Cocoa beans require very specific conditions (and ruby beans even more so), and climate change is reducing their habitat more and more, basically pushing producers into a corner. If you ever needed another reason for fighting climate change, here it is: it’s coming for our chocolate.

Scientists figure out a way to add fat to lab-grown meat

A research team has simultaneously engineered both muscle tissue and fat from sampled cattle cells, an achievement that could eventually bring higher quality cultured meat to dinner tables.

Image credits: Naraoka et al.

As people become more aware of the environmental and ethical problems associated with meat consumption, the alternative meat industry is booming. Veggie burgers have become commonplace in many places, and meat alternatives are only getting more diverse. Until now, these alternatives only mimicked the properties of meat. But soon, meat alternatives could be actual meat.

Lab-grown meat, meat grown from animal cells without actually killing animals, is not only more ethical but perhaps also more environmentally friendly, producing lower CO2 emissions and using less water and land than traditional meat production. Since the industry is just starting out, we don’t know exactly how eco-friendly it will be, but there are already reasons for optimism.

“The current process of meat production using livestock has significant effects on the global environment, including high emissions of greenhouse gases. In recent years, cultured meat has attracted attention as a way to acquire animal proteins,” write the authors of a new study.

Whether or not the lab-grown meat industry will succeed, though, will likely depend on two things: price and taste/texture.

The price is already looking pretty decent. Although it’s not yet at parity with regular meat, lab-grown meat has gone from $325,000 a burger in 2013 to around $10 in 2020. In Singapore, the only place that has so far regulated and allowed the sale of lab-grown meat, a serving of chicken nuggets goes for $23: still expensive, but not extremely far from parity, and as production scales and matures, costs will undoubtedly continue to fall.
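To put that price drop in perspective, here is our own rough compound-rate arithmetic (not from any cited study):

```python
# Rough compound-rate arithmetic on the quoted price drop.
start_price, end_price = 325_000, 10  # dollars per burger
years = 2020 - 2013                   # 7 years

shrink_factor = (start_price / end_price) ** (1 / years)
print(f"Cost fell roughly {shrink_factor:.1f}x per year, "
      f"i.e. about {1 - 1/shrink_factor:.0%} cheaper each year")
# Cost fell roughly 4.4x per year, i.e. about 77% cheaper each year
```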

Which leaves us with how the meat actually tastes. Part of what makes lab-grown meat so attractive (other than the fact that it’s better for animals and the environment) is that you can grow any type of meat. Sure, $10 for a burger or a steak sounds like a lot, but you don’t have to grow regular steaks; you can grow luxury, expensive steaks. For instance, wagyu steaks can cost up to $200 per pound, and by comparison, $10 doesn’t sound as bad. But to engineer different types of meat, researchers need to be able to produce not just the meat, but also the fat around it. Now, researchers working in Japan have found a way to produce both muscle tissue and fat from sampled cattle cells, which could enable scientists to engineer higher-quality meat.

For most lab-grown meat, muscle cells are cultivated to produce fibers, while fat is injected afterward to resemble the “real” thing. With the new approach, however, muscle and fat can be grown at the same time, using cells from an animal’s skeletal muscle. This type of cell is easy to grow, the researchers explain.

Currently, the researchers can use small chunks of meat, 0.5 millimeters in diameter, to grow pieces of up to 1.5 centimeters in diameter. That’s not enough for a full steak, but this is still just the first study describing the method, and it takes around 21 days to grow beef this way.

What makes this even more exciting is that different types of oil and fat can be added to the product this way, making the resulting lab-grown meat healthier and richer in nutrients.

It’s still early days, but this type of study shows just how quickly the field of lab-grown meat can progress. It went from little more than a pipe dream ten years ago to a budding reality in 2021: in several countries, including the US and Israel, the factories are already built; it’s just the regulatory approval that’s lacking. So, would you go for a lab-grown steak?

The study “Isolation and Characterization of Tissue Resident CD29-Positive Progenitor Cells in Livestock to Generate a Three-Dimensional Meat Bud” has been published in the journal Cells.

Eating animal fat increases stroke risk — while vegetable fat may decrease it

Eating high amounts of red meat and animal fat can be bad for your heart, but vegetable fat may not be as bad. According to a new study, vegetable fat may actually decrease the risk of stroke. However, the findings are still preliminary, and the study has one big caveat: almost all participants were white.

It’s the first study to comprehensively analyze the impact on stroke risk from different types of fat, the study authors say, and the findings are intriguing.

“Our findings indicate the type of fat and different food sources of fat are more important than the total amount of dietary fat in the prevention of cardiovascular disease including stroke,” said Fenglei Wang, Ph.D., lead author of the study and a postdoctoral fellow in the department of nutrition at Harvard’s T.H. Chan School of Public Health in Boston.

Researchers investigated data gathered over 27 years from over 117,000 participants in the Nurses’ Health Study (1984-2016) and the Health Professionals Follow-up Study. At the beginning of the study, and every four years after, participants were asked to complete food frequency questionnaires covering the amount and type of fat in their diets over the previous year. The researchers then split the participants into five groups (quintiles) based on how much animal and vegetable fat they consumed. Although self-reporting is not entirely reliable, it’s one of the best options researchers have for tracking the eating habits of a large number of participants.

“Among those who consumed the most non-dairy animal fat (in the highest quintile of non-dairy animal fat), the non-dairy animal fat intake is ~17% of total energy and vegetable fat intake is ~ 13%; among those who consumed the most vegetable fat (in the highest quintile of vegetable fat), the non-dairy animal fat intake is 10% and vegetable fat intake is ~ 20%,” Wang told ZME Science.

Out of all these participants, 6,189 had a stroke. Those in the highest quintile of non-dairy animal fat were 16% more likely to experience a stroke than those in the lowest quintile. Dairy products (such as cheese, butter, or milk) did not appear to influence the risk of stroke. Meanwhile, participants in the quintile that ate the most vegetable fat were 12% less likely to experience a stroke compared to those who ate the least.
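The quintile comparison itself is a standard epidemiological technique: rank participants by intake, cut the ranking into five equal-sized groups, and compare outcome rates between the top and bottom groups. A minimal sketch of that step, using synthetic data rather than the actual cohort:

```python
import numpy as np
import pandas as pd

# Synthetic illustration of quintile assignment, the standard
# epidemiological step used in the study (not the cohort's data).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "animal_fat_pct_energy": rng.uniform(2, 20, size=1000),
})

# pd.qcut splits participants into five equal-sized intake groups
df["quintile"] = pd.qcut(df["animal_fat_pct_energy"], q=5,
                         labels=["Q1", "Q2", "Q3", "Q4", "Q5"])

# Stroke rates would then be compared between Q5 (highest intake)
# and Q1 (lowest), usually in models that adjust for confounders.
print(df["quintile"].value_counts().sort_index())
```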

“Our interpretation is that higher intake of non-dairy animal fat is associated with higher stroke risk, whereas higher vegetable fat intake is associated with lower stroke risk,” Wang explained for ZME Science.

However, the participants may not be representative of the general population: 63% were women, all were free of heart disease and cancer at enrollment, and, most notably, 97% were white.

“Our findings might not be generalizable to other populations. Further studies are needed to investigate these associations in other demographics,” Wang explains.

There’s also a problem of not knowing what types of animal or vegetable fat participants consume. Researchers didn’t have access to this detailed information, which would be useful in evaluating this association, says Wang.

“For example, we did not observe associations between saturated fat and stroke risk. But the associations might differ for saturated fat from vegetable, dairy, or non-dairy animal foods. For future steps, finer categories will help us better understand how types and sources of fat are associated with the disease risk.”

Although these are still preliminary findings, the researchers do offer a suggestion: that we eat less animal fat, especially fat associated with red meat.

“We would recommend the general public to reduce consumption of red and processed meat, minimize fatty parts of unprocessed meat if consumed, and replace lard or tallow (beef fat) with non-tropical vegetable oils such as olive oil, corn or soybean oils in cooking to lower their stroke risk,” Wang concludes.

“Many processed meats are high in salt and saturated fat, and low in vegetable fat. Research shows that replacing processed meat with other protein sources, particularly plant sources, is associated with lower death rates,” said Alice H. Lichtenstein, D.Sc., FAHA, the Stanley N. Gershoff professor of nutrition science and policy at Tufts University in Boston and lead author of the American Heart Association’s 2021 scientific statement, Dietary Guidance to Improve Cardiovascular Health. The new study was presented at an American Heart Association conference.

California cultured meat plant is ready to produce 50,000 pounds of meat per year

In a residential neighborhood in Emeryville, California, a rather unusual facility has taken shape. The factory, which almost looks like a brewery, is actually a meat factory — but rather than slaughtering animals, it uses bioreactors to “grow” meat. According to the company that built it, it can already produce 50,000 pounds of meat per year, and has room to expand production to 400,000 pounds.

UPSIDE Chicken Salad

Upside Foods (previously called Memphis Meats) started out in 2015 as one of the pioneers of the nascent cultivated meat industry. Now, just six years later, there are over 80 companies working to bring lab-grown meat to the public, including one in Singapore that is already selling cultured chicken.

The fact that such a factory can be built at all is striking, given that regulatory approval is still pending and Upside can’t yet legally sell its products. Upside’s new facility is located in an area known more for its restaurants than its factories, but with $200 million in funding and ever-growing consumer interest, the company seems to be sending a strong message.

Cultivating meat

The new facility is a testament to how much the technology in this field has matured. The company can produce not only ground meat but whole cuts as well. Chicken breast is the first planned product, and the company says it can produce many types of meat, from duck to lobster.

“When we founded UPSIDE in 2015, it was the only cultivated meat company in a world full of skeptics,” says Uma Valeti, CEO and Founder of UPSIDE Foods. “When we talked about our dream of scaling up production, it was just that — a dream. Today, that dream becomes a reality. The journey from tiny cells to EPIC has been an incredible one, and we are just getting started.”

There’s still no word yet on how much these products will cost, but they probably won’t be the cheapest meat on the market. Although lab-grown meat is nearing cost-competitiveness with slaughter meat, it’s not quite there yet. Upside has also announced that its chicken products will be served by three-Michelin-starred chef Dominique Crenn. Crenn is the only female chef in the US to hold three Michelin stars, and she famously removed meat from her menus in 2018 to make a statement against the negative impact of animal agriculture on the global environment and the climate crisis.

Not for sale yet

Upside isn’t the only company to recently receive a lot of money in funding. Their San Francisco rival Eat Just, which became the first company in the world to sell lab-grown meat, received more than $450 million in funding. A 2021 McKinsey & Company report estimates that the cultivated meat industry will surge to $25 billion by 2030. However, in the US (and almost every country on the globe) cultured meat isn’t approved for sale yet.

The FDA has largely been silent on lab-grown meat since 2019, and while many expect a verdict soon, there’s no guarantee of a timeline. Even if the FDA allows the sale and consumption of lab-grown meat in the US, it will likely do so on a product-by-product basis rather than opening the floodgates to lab-grown meat as a whole. In the EU, things will likely move even slower.

Meanwhile, pressure is mounting. In addition to the obvious ethical advantages of lab-grown meat, its environmental impact may also be less severe than that of slaughter meat. However, this has yet to be confirmed, since we don’t have a large-scale production facility, and the few available studies don’t offer definitive conclusions.

This is why having a working factory is so exciting, because it could offer the first glimpses of how sustainable the practice actually is. Upside says the facility uses 100% renewable energy and has expressed its desire to have a third party verify the facility’s sustainability by mid-2022.

Of course, all of this depends on the regulatory approval that may or may not come anytime soon. In the meantime, the factory is ready and good to go.

American diets consist of even more ultra-processed foods than thought

Heart disease is one of the largest killers in the United States. (Photo: Pixabay)

Let’s face it: Americans have never been famous for their healthy diets and slender physiques. Now, a new study out of New York University, published in the American Journal of Clinical Nutrition, has found that the diet of the average United States citizen includes more ultra-processed foods than ever.

Ultra-processed foods are defined as industrially manufactured, ready-to-eat or ready-to-heat foods that include additives and are largely devoid of whole foods. That combination is a recipe for obesity and heart disease.

“The overall composition of the average U.S. diet has shifted towards a more processed diet. This is concerning, as eating more ultra-processed foods is associated with poor diet quality and higher risk of several chronic diseases,” said Filippa Juul, an assistant professor and postdoctoral fellow at NYU School of Public Health and the study’s lead author. “The high and increasing consumption of ultra-processed foods in the 21st century may be a key driver of the obesity epidemic.”

The study looked at 41,000 adults who took part in the Centers for Disease Control and Prevention’s National Health and Nutrition Examination Survey from 2001 to 2018. The survey asked participants about their diet in the previous 24 hours. Despite movements to cut down on processed foods and transition to diets richer in whole foods, the results showed no such trend toward healthier eating.

Ultra-processed food consumption grew from 53.5% of calories at the beginning of the period studied (2001-2002) to 57% at the end (2017-2018). The intake of ready-to-eat or heat meals, like frozen dinners, increased the most, while the intake of some sugary foods and drinks declined. In contrast, the consumption of whole foods decreased from 32.7% to 27.4% of calories, mostly due to people eating less meat and dairy.

Processing changes food from its natural state. Processed foods, for the most part, have only two or three ingredients and are essentially made by adding substances such as salt, oil, or sugar to whole foods. Examples include canned fish or vegetables, fruits packaged in syrup, and freshly made bread.

Some foods go a step further in their unhealthiness: highly processed or ultra-processed foods. These typically have many added ingredients such as sugar, salt, fat, and artificial colors or preservatives, as well as substances extracted from foods, starches, and hydrogenated fats. They may also contain additives like artificial flavors or stabilizers. Think frozen meals, soft drinks, hot dogs and cold cuts, fast food, packaged cookies, cakes, and salty snacks.

Juul says that one of the best, and maybe only, ways to improve diets is to implement policies that reduce ultra-processed food intake, such as revised dietary guidelines, marketing restrictions, package labeling changes, and taxes on soda. The political landscape being what it is, however, implementing any of those changes would be a very curvy and pothole-filled road.

“In the current industrial food environment, most of the foods that are marketed to us are in fact industrial formulations that are far removed from whole foods,” said Juul. “Nevertheless, nutritional science tends to focus on the nutrient content of foods and has historically ignored the health implications of industrial food processing.”

The study didn’t find any significant differences by income or ethnicity. The one outlier was Hispanic adults, who ate significantly fewer ultra-processed foods and more whole foods compared with non-Hispanic white and Black adults.

The study only covered diets pre-COVID-19, and Juul says that diets probably got even worse during the pandemic.

“In the early days of the pandemic, people changed their purchasing behaviors to shop less frequently, and sales of ultra-processed foods such as boxed macaroni and cheese, canned soups and snack foods increased substantially. People may have also eaten more packaged ‘comfort foods’ as a way of coping with the uncertainty of the pandemic. We look forward to examining dietary changes during this period as data become available.”

Why kids hate broccoli: a foul combination with oral bacteria

Credit: Pixabay.

Your first memory of eating Brussels sprouts and broccoli is likely not a very happy one. Many children dislike these sorts of vegetables, known as Brassica, and some may even find them disgusting. There are a couple of reasons why broccoli can taste really bad, especially for children who are more sensitive, including bitter-taste compounds and gene variants.

Now, scientists have found yet another factor that makes these plants unpalatable: enzymes in broccoli can combine with bacteria in our saliva to produce very unpleasant sulfurous odors. The higher the levels of these compounds, the more likely children were to say they dislike the vegetables. Furthermore, the levels of these volatile compounds were found to be similar in parent-child pairs, which suggests the oral biome is shared.

In the mouth, broccoli can produce putrid odors in some people

Broccoli, cauliflower, and Brussels sprouts all contain a glucosinolate compound that makes them taste bitter. But to some people, their taste can be especially foul. For some time, scientists have known that the TAS2R38 gene is responsible for regulating how humans sense bitterness in food, with huge evolutionary implications.

The bitter taste, along with sourness, is thought to be protective, an early sign that is supposed to communicate ‘be careful, this food may be toxic’. This warning system is quite robust, being capable of identifying thousands of different compounds, some of which could poison and even kill us.

Sensitivity to bitter compounds is a little bit higher in very young humans. Children have around twice as many taste buds as adults, for instance. Also, there’s quite a bit of genetic variance in how people express TAS2R38.

Of course, broccoli isn’t toxic. On the contrary, it’s a ‘superfood’: rich in nutrients and antioxidants while being low in calories. It just so happens that our bodies mistake it for something potentially toxic, and this sensitivity falls on a spectrum, with significant variation among people. To some, broccoli and vegetables like it are perfectly palatable; to others, they’re simply off-putting.

Normally, the glucosinolates get all the attention, but Damian Frank, a research fellow in food chemistry and sensory food scientist at the University of Sydney, found that another compound, called S-methyl-ʟ-cysteine sulfoxide, shouldn’t be overlooked when it comes to the off-putting taste of Brassica vegetables. When this compound combines with enzymes in the plant’s tissue and in people’s saliva, it produces sulfurous odors.

Frank and colleagues investigated differences in sulfur volatile production in saliva from 98 child/parent pairs. Using gas chromatography-olfactometry-mass spectrometry, the researchers first measured the main odor-active compounds in raw and steamed cauliflower and broccoli. They then mixed saliva samples from each participant with raw cauliflower powder and analyzed the produced volatile compounds. Each sample was then associated with taste ratings self-reported by the parent or child.

Unsurprisingly, dimethyl trisulfide, which smells rotten, sulfurous, and putrid, was the least liked odor by both children and adults. More intriguingly, while there were large differences in sulfur volatile production between families, children produced sulfurous odors at levels very similar to their own parents. This makes sense, since people tend to have similar microbiomes when they share the same diet, household, and ancestry.

“There were big differences between the amount of volatiles formed between individuals. But there was a significant correlation between children and adults; the parents of children with high enzyme activity tended to also have high activity. This suggested similarity in the amount and type of bacteria present,” Frank told ZME Science.

Although children whose saliva produced the highest amount of sulfur volatiles predictably disliked raw Brassica vegetables the most, this relationship wasn’t as strong for their parents. This is perhaps due to less taste sensitivity with age and an acquired tolerance of the flavor with repeated exposure through life. That being said, many parents likely hate broccoli as much as their kids do.

“Sometimes the parent has to overcome their own dislike to give their child “healthy” food like brassicas. They want to be a good parent and do the right thing, but it goes against the grain!” said Frank.

The researchers also measured common genetic differences in bitter sensing receptor genes among the participants, the results of which will be published soon. These will likely help explain why some people like Brassica vegetables and others, well, not so much.

“Not sure whether I will be doing further work in this interesting area.  But a better characterization of the type of bacteria present in individual oral microbiomes is a worthwhile research area. Also more research on how bacteria in the mouth affect taste and perception is super fascinating,” Frank said.

The findings were reported in the Journal of Agricultural and Food Chemistry.

How to tell if eggs are bad — according to science

Every once in a while, you come across an egg and you’re not exactly sure if it’s still good. Maybe it smells a bit funny, maybe it’s past its due date, or maybe it’s just been in the fridge for a while and you want to be sure. But due dates can be misleading, and smell alone is not a reliable indicator. So how can you tell if your eggs are still good?

We’ve looked at some of the methods and found what really works.

Why it matters

Every year, the average person on the globe consumes 197 eggs. In many countries (like the US), that figure is much higher, at almost 300 per year. But many eggs are also thrown away. In the UK alone, 720 million eggs are wasted every year; worldwide, global estimates are scarce, but wasted eggs probably number in the billions.

Granted, some of this waste happens due to restaurants or producers, but consumers can also play their part and not throw away eggs unless they have gone bad. At the same time, you really don’t want to consume bad eggs, as this would increase the risk of Salmonella or E. coli infection — which can cause diarrhea, fever, and vomiting.

To reduce the risk of bacterial infection from eggs, keep them refrigerated (which keeps them fresh for longer) and cook them thoroughly. A 2011 research project found that keeping eggs at steady, low temperatures helps preserve their natural defenses against bacteria.

Generally speaking, you shouldn’t eat eggs past their expiration date. However, some eggs have sell-by dates while others have eat-by (or expiration) dates, which can get confusing. Those dates aren’t absolute, either: most health and food organizations note that eggs are usually good for several weeks past the stamped date, but they can also go bad sooner if stored improperly.

This is why it’s important to have a reliable method to check if your eggs are still good.

How to tell if eggs are bad: the floating test

The most common (and probably most reliable) test to check if an egg is bad is the floating test. You take a glass (or a pot, or any container really), and fill it with room temperature water. Place your egg (or eggs, just one at a time) in the water. If the egg floats, it’s not good anymore — simple as that.


This common method is not a myth: it actually works, and there’s some interesting science behind why.

Why it works

There’s a common misconception about the egg float test. The reason why bad eggs float has to do with pockets of air forming, but that’s only half of the story.

Good eggs are heavier than water, which is why they sink. When an egg starts to decompose, it gives off gases, which creates pockets of air, especially at the blunt end of the egg. But if the egg were a perfectly isolated system, it wouldn’t float: even when solid or liquid matter turns into gas, its mass stays the same.

However, eggs aren’t perfectly isolated, they have pores and gases can escape. These gases, light as they may be, still have mass, and when they escape, they make the egg lighter. At some point, when the egg becomes lighter than water, it floats — and it’s not good for consumption anymore. This is probably the best test to employ to see if eggs are still good.
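The whole test boils down to a density comparison: the egg floats once its average density drops below that of water (about 1.00 g/cm³). A toy sketch of that logic, with a mass and volume that are merely in the right ballpark for a large hen’s egg:

```python
# Toy model of the float test. Numbers are illustrative, roughly in
# the ballpark of a large hen's egg; the shell keeps volume constant.
WATER_DENSITY = 1.00  # g/cm^3
EGG_VOLUME = 55.0     # cm^3

def floats(egg_mass_g):
    """An egg floats once its average density is below water's."""
    return egg_mass_g / EGG_VOLUME < WATER_DENSITY

print(floats(60.0))  # False: a fresh egg (~1.09 g/cm^3) sinks
print(floats(52.0))  # True: after enough gas escapes through the
                     # pores, the lighter egg (~0.95 g/cm^3) floats
```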

The shake test

A less reliable but still useful test is to hold an egg next to your ear and shake it gently. Listen carefully: is there a sloshing sound or feel? If not, you’ve probably got a fresh egg. If you do hear it, you may be dealing with an egg that has gone bad.

Keep in mind that if you shake them hard enough, even fresh eggs can make a sloshing sound, so shake them gently.

Why it works

As an egg gets older, the yolk becomes more alkaline and runny. It’s hard to say where exactly the point of no return is, but as a rule of thumb, if the yolk seems too runny, it’s a bad egg.

There’s a bit of art to this test, and it’s best to complement it with another.

The good old smell test

A good cracked egg.

We’ve mentioned before that smell alone is not a reliable indicator — and it’s not. But if you crack an egg and it just smells bad, you should throw it away (they don’t call them rotten eggs for nothing). There’s a good chance the egg may actually be bad, but even if it’s not, you probably won’t be able to enjoy it, so better not to take any risks.

Why it works

The smell from bad eggs is a mixture of things, but a key component is hydrogen sulfide (H2S), a heavy and pungent gas. If you notice any sulfurous smell coming from the eggs, that’s a sign of decomposition.

Fresh eggs don’t emit a smell, but keep in mind that eggs can “suck” up smells from your fridge (which is why you should keep them covered and in a carton that can absorb any unwanted smells).

The visual test

If you crack open an egg and you see a discolored yolk, it’s likely bad. The same thing goes for eggs with white parts that are cloudy. But if you’ve reached that point, the odds are the egg is stinky already.

Why it works

It’s not just decomposition and bacteria; there’s also some chemistry that changes the color (and smell) of the eggs. Eggs contain carbonic acid, an acid that forms when carbon dioxide reacts with water. Carbonic acid slowly breaks back down into water and carbon dioxide (H2CO3 → H2O + CO2), and the CO2 escapes the egg; this is also why the floating egg test works. At the same time, losing that acidity makes the remaining egg contents more alkaline, raising their pH.

This changing chemistry is a big part of why the inside of an old egg looks, and smells, different.

The “not sure” test

Are you unsure if an egg is safe to eat? Just don’t eat it — that’s the “not sure” test.

We all want to play our part and fight food waste, and that’s a very noble goal. But if you’ve done the test and still have doubts about it, it’s best to just play it safe and not take any risks.

Egg tips

Boiling eggs and then storing them in the fridge for a few days can be useful for salads, sandwiches, etc.

Always cook eggs properly. Cooking isn’t just something we do to make food edible or tastier, it’s also something we do to kill off pathogens.

If you’ve got eggs and want to cook them but not eat them right away, the best thing to do is boil them. Boiled eggs don’t last as long as fresh eggs in the fridge, but hard-boiling eggs that are about to expire is a good way of buying them a couple of extra days. Boiled eggs can last up to a week when stored in the fridge, so if you’ve got a bunch of eggs you need to use up, boil them for salads, sandwiches, or whatever else you like.

To give your eggs the most fridge life, store them in the coldest part of the fridge where they won’t freeze. It’s common to store eggs on the door, but that’s actually the least cold part of the fridge. Go deep and put them where it’s cold.

If you take eggs out, either put them back in quickly or cook them. When you take cold eggs out of the fridge, they “sweat” as the water condenses, creating an environment well-suited for bacterial growth. Avoid leaving eggs out for more than an hour, and if you do, it’s safest to cook them.

You can also freeze eggs (after cracking them), but if you don’t know what you’re doing, it’s best to avoid this.

As mentioned, leave eggs in their original carton. If you don’t have one, store them in something covered. Eggs can absorb smells and pick up unpleasant odors from your fridge.

Some countries (most notably in Western Europe) don’t store supermarket eggs in the fridge — but the fridge is still the best place to store them at home.