The number of hospitalizations for eating disorders doubled across the US during the pandemic, according to new research covering January 2018 through December 2020. Cases of anorexia or bulimia made up the largest part of this increase.
Meanwhile, other common behavioral health conditions, such as depression, alcohol use disorder, or opioid use disorder, didn't register any meaningful changes over the same period.
“This pandemic era is going to have some long-term impacts on the course of disease and the course of weight over the lifespan,” says Kelly Allison, Ph.D., Director of the Center for Weight and Eating Disorders at the University of Pennsylvania, co-author of the paper. “What that does for eating disorders? We just don’t know.”
Although the team can’t yet tell what caused this increase, they believe we’re looking at the combined effect of several factors, ranging from the toll the pandemic has taken on our mental health to an outsized focus on weight gain (fed by constantly viewing ourselves on video calls), and even symptoms of COVID-19 itself. There is also very little data on how this trend will affect public health in the long run.
The study included data from over 3.2 million individuals across the U.S., with a mean age of 37.7 years old. According to the findings, the number of inpatient care cases for eating disorders remained pretty stable over time, at approximately 0.3 cases per 100,000 people per month, until May 2020. At that date, the number of cases doubled, to 0.6 per 100,000. This increase was registered across anorexia nervosa, bulimia nervosa, and other and unspecified eating disorders.
The average length of inpatient stays for such cases has also increased: from an average of 9 and 8 days between June and December of 2018 and 2019, respectively, to 12 days between June and December of 2020. A similar increase was not seen for the three behavioral health conditions used as controls over the same timeframe.
Outpatient care cases for eating disorders, meanwhile, increased from around 25 per 100,000 people per month to 29 per 100,000. The age range of inpatients went from 12 to 20 years pre-pandemic to 18 to 28 after its onset.
Stress caused by the pandemic and the changes it caused in our lives could be one of the drivers of this increase, the team reports. Additionally, the shift towards video calls for conferences at work gives us ample opportunity to look at ourselves, which can create a further drive towards the development of eating disorders.
“During the pandemic, having a lack of routine and structure primed us in terms of our behaviors around food,” says Ariana Chao, Ph.D., from Penn’s School of Nursing.
Social media reflects this increase in self-scrutiny and concerns regarding weight, the authors report. As far as eating disorders are concerned, discussions about weight can be “very triggering”, Allison explains, so social media can create a lot of stress in patients at risk. Different people handle this stress differently, the team adds: some binge eat, while others don’t eat enough.
For now, it’s not clear whether the rising trend in eating disorder cases will continue after the pandemic. The present study is based on data up to December 2020, so it’s missing the latest part of the picture. The team is now hard at work analyzing data recorded well into 2021 to see how these trends are evolving.
“We really need more research,” says Chao. “Adversity can be a long-term predictor of developing eating disorders. Even the transition back to ‘normal’ can exacerbate eating disorders. Everything is changing so rapidly. Then again, people are also resilient. It’s hard to say what the long-term implications will be.”
The paper “Trends in US Patients Receiving Care for Eating Disorders and Other Common Behavioral Health Conditions Before and During the COVID-19 Pandemic” has been published in the journal JAMA Network Open.
Spicy foods are meant to discourage us from eating them. However, humans stand apart from other animals in that we sometimes seek these items to eat specifically because they’re spicy. Exactly why we do this is unclear, but it’s likely a combination of factors ranging from potential health benefits to cultural norms and personal preference.
Do you enjoy doing things that hurt your tongue? Have you ever thought “man, I’d like to feel my mouth on fire!”? Do you get excited at the prospect of hot wings so hot that they make your very soul tremble? If yes, let me just say that I cannot, for the life of me, sympathize with you. I like my meals like I like my car: not burning.
But that’s not a universal preference among people, which raises an interesting point — why do some people like spicy food? On the face of it, it doesn’t make any sense. We know certain plants use chemical defenses against pests and pathogens, chemicals that also give them unique qualities like flavor or taste. Some are milder, like onions, garlic, or pepper. Others will have you in tears, gagging for life, hoping for death. And yet, we keep coming back for seconds. Sometimes we even go to events to see who can withstand the spiciest foods.
In short, although these plants produce these substances specifically to make us not want them, we seek them out anyway. We don’t really know why, but we do have some ideas, and we’re going to talk about those today.
What makes a spice, what makes it spicy?
We’ve talked about spices before here on ZME Science, but mostly from a historical standpoint. In more practical terms, spices are plant products other than leaves, stems, and flowers (those are referred to as ‘herbs’) that can impart taste, flavor, or color to a meal.
They aren’t very common, all things considered. Their special properties were most likely formed because these plants had to contend with environmental pressures such as parasites, predators, or diseases. They became spices through chemical warfare. Since each species had its own issues to contend with, there is a very wide range of substances they employ. We collectively know these plants as spices, but we also make a distinction between them and things that are ‘spicy’.
A good example is peppers. Bell peppers are a spice (they’re the main ingredient in paprika), but they’re not spicy. Jalapeño peppers can be a spice, but they’re definitely very spicy. The difference between the two terms is largely a subjective one: things that ‘are spicy’ contain substances that are particularly irritating or unpleasant to us as humans. They’re tailored to offend our bodies in particular.
In the case of spicy peppers, that substance is capsaicin. It will make your eyes water, but it wouldn’t have much effect on a bird. We think it comes down to the fact that pepper seeds can’t survive the strong acids in the mammalian gut, but they can make it through a bird’s intestines unscathed. In a bid to help spread their seeds, the theory goes, peppers developed capsaicin to keep mammals away while letting birds peck freely. We were the intended target of their chemical war effort.
Why people like spices, in general, isn’t very hard to wrap your head around: the flavors they contain are interesting and make meals more enjoyable. Why people like things that are spicy, on the other hand, is a bit more nebulous. Especially so because their spiciness was designed specifically to make us not like them.
Maybe it’s because they make food safe
Evolutionary biologists like to view the traits and behaviors of individual species as elements that help them navigate their environments — like skills that you acquire in time to fulfill your needs. On the one hand, this means that certain plants had a reason to become spices, and we’ve talked about that just now. But on the other hand, it would also mean that we have an evolutionary need to consume spices, or else we wouldn’t.
A paper (Sherman, Billing) published back in 1999 sums up that idea quite nicely in its headline: “Darwinian Gastronomy: Why We Use Spices: Spices taste good because they are good for us“. The authors looked at the use of spices in traditional cuisines across the world from “traditional cookbooks”, comparing this to the natural conditions these cultures developed in.
Their theory was that the use of spice is, at least in part, a pragmatic thing. In warmer climates, they hypothesized, food (meat especially) would spoil quicker and contain more pathogens than in colder climates. The use of spice may well be a subconscious effort to protect ourselves from these, which grew into a cultural preference over time. As we’ve seen before, spices are essentially plant species that use powerful chemicals to protect themselves. The theory, then, is that people mixed these into their foods, in relatively small quantities, to fight off any pathogens in the food — which are a much bigger risk than the chemicals contained in the spices.
Essentially, it’s taking a gamble that the small dose of poison in our food will do more damage to any bacteria or viruses therein than it would do to our bodies.
The authors did find some evidence in support of their hypothesis. Cookbooks from warmer areas mentioned more types of spices overall, and called for more of them in every dish, than those from colder climates. Looking only at meat dishes (spoiled meat harbors more pathogens than spoiled plant matter), recipes called for an average of 4 spices, and 93% of them called for at least one type of spice. Norwegian cookbooks, for example, mentioned only 10 different spices and called for 1.6 spices per dish on average, while Hungarian cookbooks mentioned up to 21 different spices and called for 3 spices per dish, on average.
“But wait!” you cry out, wise to the fact that correlation doesn’t imply causation, “so it doesn’t mean one causes the other just because they occur together”. And, as always, you’re right. This one paper can’t prove that people employ spices against pathogens in foods. It also just happens that most spices today are endemic (native) to warmer areas, as these generally harbor more diverse communities of plants, animals, and the like. So it could simply be a matter of availability. Spices also tended to be extremely expensive or simply unavailable in many colder regions in the past, so it would make sense that their traditional cookbooks wouldn’t mention them, or would only do so sparingly.
At the same time, this doesn’t necessarily mean that the hypothesis is wrong, it just means we can’t know for sure. The authors further note that vegetable dishes called for far fewer spices across the board, which would fit well with their hypothesis — since spoiled meat contains more bacteria than spoiled vegetables, it makes sense to use more spices when cooking meats. Furthermore, there is data to support the fact that many spices do have an antimicrobial or antifungal effect. At the same time, many of the most widely-used spices, like pepper, aren’t that great at the job; salt, for example, is a better bacteria-killer than black pepper. A lot is also unknown about how effective these spices are at killing pathogens in the concentrations and conditions seen during cooking.
Another point that might help support this view is that predators, even obligate carnivores, will eat small amounts of plant matter. While we don’t exactly understand why (it could be simply to get more fibers and assist in digestion) it is possible that the instinct formed to help these animals destroy some of the bacteria in their food with the chemicals contained in the plants. Kind of like the theory proposes people do with spices.
Maybe it’s because your folks served spicy food
While evolutionary biologists like to treat everything in a very clean, cause-and-effect way, when talking about people’s preferences, there’s always an element of subjectivity. Our tastes, wants, and desires are — at least in part — shaped by what we’ve experienced so far. A food item can be our favorite not through the virtue of its taste alone, but also due to intangibles such as nostalgia, social mores, or our personal experiences.
If you’re sensitive to spice, you can train yourself to become desensitized to it. Through repeated exposure to low doses of spicy compounds in childhood, then, we can acquire both a preference for and a resilience to spicy foods.
This cultural hypothesis has two major limitations. First off, it’s kind of a self-fulfilling prophecy — we like spicy food because we eat spicy food, so we eat more of it. While it may well be true that we acquire a taste for spiciness with exposure to it, it doesn’t explain why or when this behavior started. If eating spicy food is what makes us like spice, why did we start in the first place? This hypothesis doesn’t offer a starting point.
Secondly, it doesn’t offer an explanation for why people seek increasingly higher levels of spice. Even if we accept, for the sake of the argument, that repeated exposure to spiciness makes us tolerate it better, the fact remains that people often seek out spiciness, especially in cultures that already feature it heavily in their cuisine — such as Mexican or Chinese traditions. More to the point, they seek levels of spiciness in excess of what they can already tolerate. If the point is to make the sensation bearable, why do people keep seeking ever stronger burns? It would suggest that their goal isn’t to become accustomed to the spiciness, but rather to chase the sensation itself, or something associated with it. So, after all…
Maybe it’s because we like the burn
Capsaicin can make your mouth hurt a lot. In fact, if you’ve ever bitten into a mean pepper, you know it can make your whole body ache and tremble. You get sweaty, your eyes sting, some crying might be involved. This effect can stay with you for the whole length of your digestive tract (let’s put it that way).
It’s undeniable, then, that the effect this substance has on us is profoundly unpleasant and temporarily debilitating. And yet, barring truly extreme doses (you can, in principle, die from eating too much capsaicin), it doesn’t actually harm you in any way. What it does, instead, is trick your body into thinking it’s in danger.
Capsaicin binds to TRPV1: the transient receptor potential cation channel subfamily V member 1, more memorably known as the vanilloid receptor 1. Despite the name, it’s a receptor that’s quite widespread in your body, and its main function is to keep tabs on and help regulate your body temperature. Capsaicin wreaks havoc on TRPV1 by binding to it and activating it. While there’s literally no physical damage to your body when you munch on a pepper, to your nervous system it looks like your mouth is suddenly, and violently, aflame. This effect is so powerful that our bodies’ response to the illusion — mostly in the form of inflammation and changes in heart rate — can, in extreme cases, kill us.
It is, after all, a substance designed to keep mammals away.
And yet, we hold chili-eating contests with a food containing a lot of capsaicin. We know for a fact that even people who say they particularly like chili are not immune to the burning sensation it produces. A paper published in 1980 (Rozin, Schiller), “The nature and acquisition of a preference for chili pepper by humans”, notes that these individuals “come to like the same burning sensation that deters animals and humans that dislike chili; there is a clear hedonic shift [in their preferences]”, which could come down to “association with positive events, including enhancement of the taste of bland foods, postingestional effects, or social rewards”.
Another point they raise, however, one that I find much more entertaining, is that eating spicy foods is a way to toy with danger. Much like a roller coaster, that danger is (pretty much) contained. While we do understand that, on an intellectual level, our bodies don’t make the distinction. The physiological effects of being in danger and/or on fire, such as the rush produced by adrenaline or the feel-good sensation produced by the release of endorphins in our system, are still genuine.
In this light, spicy food can be seen as a facet of human thrill seeking — or what the authors refer to as “enjoyment of ‘constrained risks'”.
That bit about endorphins is also pretty interesting. They are a family of compounds that our bodies use to clamp down on stress and pain when needed. They’re not really a chemical family, more of a pharmacological convention, as several different substances with different structures are endorphins. But function-wise, they work very much like opioid drugs, causing euphoria and a host of other delightful effects, including, as mentioned, pain relief. They’re one of a group of molecules the Internet gleefully dubs ‘happiness molecules’, alongside serotonin, dopamine, and oxytocin. It’s a pretty wide group because the Internet, overall, is not a very capable pharmacologist, but there is a kernel of truth at the core of the meme.
Eating spicy foods is a reliable and non-threatening way of squeezing out some of this happy juice from your brain. This would also explain why some people would seek ever-spicier foods to torture themselves with. As they become desensitized to a certain level of spicy, an ever higher threshold is needed to obtain the same endorphin reward.
By itself, this doesn’t really explain why some people are aficionados of spice — if eating spicy food is a painful way of enjoying some pleasure, why isn’t everyone doing it? We don’t know. There is some evidence (Byrnes, Hayes, 2012) that personality traits, especially ones such as thrill-seeking, as well as differences in our individual abilities to perceive substances like capsaicin, have a role to play. Someone who’s psychologically predisposed to taking risks, and has a lower abundance of TRPV1 receptors in their mouth, I’d imagine, is more likely to engage in such behavior.
At the end of the day, the truth is we don’t know. If I had to take a wager, I’d say that all the hypotheses we’ve talked about today play a part. They’re not mutually exclusive. How much influence they have is, very likely, dependent on who you’re talking with. For some it’s the thrill, and the bragging rights. For others, it’s grandma’s cooking. People are complex, and so are the forces that drive us, so we probably won’t ever be able to tell for sure why any of us — nevermind all of us, as a species — would engage in such a behavior.
But the thing we do know is that, apart from a few species that have evolved specifically to be less sensitive to certain irritants, we are the only ones that seek out food that hurts to eat. Could that be a sign of how far we’ve come, that we’d want to seek a semblance of danger just to feel excited? Or is it the other way around, and such a predisposition for risky behavior is what set us on the path to success? Very interesting questions to ponder the next time you’re praying for salvation over a bowl of chili.
Traditional wisdom says that you shouldn’t eat before going to sleep, but a new study casts doubt on that belief.
Late night snacks might not be all that bad, a new study concludes. Image in public domain.
A lot of changes happen in our bodies when we sleep. Among other things, our metabolism and digestion slow down. It seems quite logical, therefore, that you shouldn’t eat before going to sleep — otherwise, your body just doesn’t have enough time to process it all and your blood sugar increases.
This translates into a common piece of advice which has been adopted into many cultures: “Don’t eat two hours before bedtime.” However, a new study casts new doubt on this advice, finding no clear connection between eating before bedtime and blood sugar levels.
Su Su Maw and Chiyori Haga, two researchers from Okayama University, assessed the effect of pre-sleep eating. They analyzed 1,573 healthy middle-aged and older adults with no underlying conditions, looking at levels of HbA1c — the most common marker of average blood glucose (sugar) levels over the previous two to three months. A high HbA1c means you have too much sugar in your blood, which in turn means you’re more likely to develop serious health issues like diabetes.
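To put HbA1c numbers in context: clinicians often convert the percentage into an estimated average glucose (eAG) using the linear formula from the ADAG study, eAG (mg/dL) ≈ 28.7 × HbA1c − 46.7. A minimal sketch in Python — the conversion formula and the cut-off values are the standard ADA ones, included here purely for illustration, not taken from the study above:

```python
def hba1c_to_eag(hba1c_percent: float) -> float:
    """Convert HbA1c (%) to estimated average glucose in mg/dL (ADAG formula)."""
    return 28.7 * hba1c_percent - 46.7

def classify(hba1c_percent: float) -> str:
    """Standard ADA cut-offs: <5.7% normal, 5.7-6.4% prediabetes, >=6.5% diabetes."""
    if hba1c_percent < 5.7:
        return "normal"
    elif hba1c_percent < 6.5:
        return "prediabetes"
    return "diabetes"

print(hba1c_to_eag(7.0))  # an HbA1c of 7% corresponds to roughly 154 mg/dL
print(classify(6.0))
```

So an HbA1c just above 6.5% already corresponds to an average blood glucose well above the healthy fasting range, which is why it works as a long-term marker.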
In all, 83 (16%) of the men and 70 (7.5%) of the women fell asleep within two hours of eating dinner. However, after correcting for other factors (such as smoking, overall weight, blood pressure, etc.), the researchers did not find any connection between pre-sleep eating and HbA1c levels.
Instead, the researchers found that lifestyle choices like drinking or smoking are the leading factors when it comes to blood sugar levels. They also emphasize that eating nutritious foods and maintaining a healthy lifestyle matter more than meal timing. Simply put, what you eat is much more important than when you eat.
“Contrary to general belief, ensuring a short interval between the last meal of the day and bedtime did not significantly affect HbA1c levels,” researchers write.
“More attention should be paid to healthy portions and food components, getting adequate sleep and avoiding smoking, alcohol consumption, and overweight, as these variables had a more profound influence on the metabolic process.”
However, it should be noted that the study was carried out in Japan, where dinner portions are relatively small and often include healthy foods such as soup or vegetables, so the findings might not translate to other diets and countries.
At any rate, this doesn’t mean you should go crazy with the late snacks. Ice cream and French fries are still probably not okay.
Journal Reference: Su Su Maw and Chiyori Haga. Effect of a 2-hour interval between dinner and bedtime on glycated haemoglobin levels in middle-aged and elderly Japanese people: a longitudinal analysis of 3-year health check-up data. BMJ Nutrition, Prevention & Health, 2019 DOI: 10.1136/bmjnph-2018-000011
Ever noticed how after a while, the things you enjoy can lose some of their appeal? Thankfully, researchers have figured out how to prevent that from happening: do the same things in a slightly different way.
In a new study, Ohio researchers found new enjoyment in popcorn, videos, and even water, when consuming them in an unconventional way.
Like eating popcorn for the first time
The best example is probably popcorn — if you’re like most people, you enjoy it, especially while watching a movie or something that draws your attention. But the enjoyment certainly fades to an extent. After all, you already know the taste and you’re very familiar with the experience. But what if you changed the experience a bit, like for instance, eating popcorn with chopsticks?
“When you eat popcorn with chopsticks, you pay more attention and you are more immersed in the experience,” said study co-author Robert Smith. “It’s like eating popcorn for the first time.”
This would explain novelty restaurants (such as pitch-black restaurants) — they change the experience and help us re-learn to enjoy food.
“It may not be anything special about darkness that makes us enjoy food more. It may be the mere fact that dining in the dark is unusual,” Smith said.
Even something as insipid as drinking water can be more enjoyable when done in a creative, unconventional way, the study showed (like, for instance, drinking water from a martini glass, or lapping it like a cat). Lastly, the researchers also showed that it’s not just about food: study participants were more likely to enjoy watching videos if they did something unusual, such as using their hands as goggles.
“They actually thought the video was better because the hand-goggles got them to pay more attention to what they were watching than they would have otherwise,” he said. “They were more immersed in the video.”
It may seem like a pretty prosaic study, but it could make a cheeky difference in our day to day lives. After all, everyone wants to enjoy the daily routine, and there are small things we can do that can accomplish that. For instance, when you’re eating pizza, you could eat the first slice normally, then roll the second one, then eat the third one with a fork and knife.
Don’t like your furniture anymore? Try moving it around in the house rather than throwing it out. You never know how much you’ll like it.
“It may be easier to make it feel new than you might think. It is also a lot less wasteful to find new ways to enjoy the things we have rather than buying new things,” Smith said.
Personally — I’m drinking water out of a jar for the rest of the day. After all, I have the science to back it up.
Journal Reference: Ed O’Brien, Robert W. Smith. Unconventional Consumption Methods and Enjoying Things Consumed: Recapturing the “First-Time” Experience. Personality and Social Psychology Bulletin, 2018; 014616721877982 DOI: 10.1177/0146167218779823
Sleeping during the day and staying up all night will impact the concentration and activity of over 100 proteins in the blood — even if you only do it for a short while.
Image credits: picsessionarts / Flickr.
Staying awake and eating during the night throws a wrench in the activity of blood-borne proteins, according to new research from the University of Colorado Boulder. The proteins identified by the team impact processes involved in a wide array of metabolic functions, from blood sugar levels to immune function. The study is the first to examine how protein levels in human blood, also known as the plasma proteome, vary over a 24-hour period and how altered sleep and meal timing affects them.
“This tells us that when we experience things like jet lag or a couple of nights of shift work, we very rapidly alter our normal physiology in a way that if sustained can be detrimental to our health,” said senior author Kenneth Wright, director of the Sleep and Chronobiology Laboratory and Professor in the Department of Integrative Physiology.
The team enlisted the help of six healthy male subjects in their 20s for the study. The participants were asked to spend six days at the university’s clinical translational research center. While there, their meals, sleeping hours, active periods, and the hours they were exposed to light were tightly controlled and recorded.
On the first two days, the men were kept on a normal schedule: active hours and light exposure during the day, sleeping hours at night. They were then gradually transitioned to a night-shift work pattern — they could get an eight-hour sleep if they wanted, but only during the day, and stayed up and ate at night. The team collected blood samples every four hours, which they analyzed for the concentrations and time-of-day-patterns of 1,129 proteins.
They report that 129 of these proteins’ patterns were thrown off by the simulated night shift. The effect was already noticeable by the second day of the night-shift waking pattern, adds first author Christopher Depner.
One of the affected proteins was glucagon — which tells the liver to release sugar into the bloodstream. Glucagon levels in the blood peaked during waking hours, the team found, meaning they shifted to night hours as the participants started staying awake at night. It also peaked at higher concentrations than before, the team adds. They think that this effect could, in the long term, form the root cause of the higher diabetes rates seen in night-shift workers.
Night-shift wakefulness patterns also decreased blood levels of fibroblast growth factor 19. Previous research with animal models has shown this protein to boost calorie-burning and energy expenditure, the team adds. The participants in this study burned 10% fewer calories per minute when their schedule was misaligned.
Overall, thirty proteins showed a clear 24-hour-cycle, most showing a peak between 2 p.m. and 9 p.m.
“The takeaway: When it comes to diagnostic blood tests—which are relied upon more often in the age of precision medicine—timing matters,” said senior author Kenneth Wright.
The authors note that all the participants were kept in dim light conditions, to eliminate the effect of light-exposure (which can also strongly affect the circadian system) on the results. Even without the glow of electronics at night, changes in protein patterns were rapid and widespread.
“This shows that the problem is not just light at night,” Wright said. “When people eat at the wrong time or are awake at the wrong time that can have consequences too.”
The findings could lead to new treatment options for night shift workers, who are at a higher risk for diabetes and cancer. It could also enable doctors to precisely time administration of drugs, vaccines and diagnostic tests around the circadian clock.
The paper “Mistimed food intake and sleep alters 24-hour time-of-day patterns of the human plasma proteome” has been published in the journal PNAS.
The sounds you make while chewing have a significant effect on the amount of food you eat, a new study has found. The results suggest that people are likely to consume less if they can hear themselves eating.
Image via tclw.das.ohio.gov
Researchers at Brigham Young University and Colorado State University have found that your TV, radio, and computer are making you fat. Not by bombarding you with food ads (though they totally are) but by blocking the sounds of your chewing. In a recent study, they found that the noise your food makes while you’re eating can have a significant effect on how much food you eat.
“Sound is typically labeled as the forgotten food sense,” says Ryan Elder, assistant professor of marketing at BYU’s Marriott School of Management. “But if people are more focused on the sound the food makes, it could reduce consumption.”
“For the most part, consumers and researchers have overlooked food sound as an important sensory cue in the eating experience,” said study coauthor Gina Mohr, an assistant professor of marketing at CSU.
The team carried out three separate experiments to quantify the effects of “food sound salience” on the quantity of food consumed during a meal. In one experiment, participants were given snacks to eat while wearing headphones playing either loud or quiet noise. Noise loud enough to mask the sound of chewing made subjects eat more: 4 pretzels on average, compared to 2.75 for the “quiet” group.
In another of their experiments they found that just having people hear chewing sounds through an advertisement can decrease the amount they eat.
Elder and Mohr call this the “Crunch Effect.” The main takeaway of their work should be the idea of mindfulness, they said. Being more mindful of not just the taste and physical appearance of food, but also of the sound it makes, can help consumers eat less.
“When you mask the sound of consumption, like when you watch TV while eating, you take away one of those senses and it may cause you to eat more than you would normally,” Elder said.
“The effects may not seem huge — one less pretzel — but over the course of a week, month, or year, it could really add up.”
So the next time you sit down for a meal, take your headphones off and mute the TV. Or find a movie where there’s a lot of very audible chewing.
The full paper has been published online in the journal Food Quality and Preference.
Throughout our hunter-forager days, humans developed a subconscious urge to overeat, and became less and less psychologically equipped to avoid obesity, especially during the winter months, a University of Exeter study recently found. Evolving in an environment where food security was only a pipe dream, the researchers state, our lack of an evolutionary mechanism to resist the temptation of sweet, fatty, unhealthy food is understandable.
People ultimately are animals themselves, and like all animals we’ve evolved and adapted to living in the wild, tailoring our biology to the rigors of an often harsh and unforgiving environment. In the wild, from a survival point of view, being overweight brings much to the table for relatively little cost, but being underweight could be life-threatening. So we’ve developed an urge to eat in order to maintain body fat — an urge that only grows stronger in winter, when food grew scarce in the natural world.
Ahaha, way ahead of you dawg! Image via funnyjunk
This, scientists believe, explains why our winter holidays traditionally revolve around bountiful meals and why our New Year’s resolutions to lose all the extra weight fail so utterly. We don’t live in the wild anymore, though, and we know that being overweight is detrimental to our health in the modern world, so…
Why don’t we put the fork down?
“You would expect evolution to have given us the ability to realise when we have eaten enough, but instead we show little control when faced with artificial food,” said Dr Andrew Higginson, from the College of Life and Environmental Sciences at the University of Exeter, lead author of the study.
Higginson’s team used computer modelling to predict the optimal amount of fat that animals (including humans) should store, assuming evolution has given them physiological and psychological tools to maintain their healthiest weight. Their results show a strong dependence on food availability and predation risk; in other words, when food is scarce, animals should build up their fat reserves to have a better chance of surviving if they can’t find anything to eat, and shed the extra pounds when food is readily available to have a better chance of escaping predators (and looking less tasty).
Overall, the model shows that there is a sort of tipping point: a target body weight above which the animal should try to lose weight and below which it should try to gain fat. But the simulations also showed that exceeding the optimal point usually carries only a small cost in energy terms (i.e. carrying those love-handles around); evolution “understands” this really well, so any subconscious mechanisms working against becoming overweight are a feeble defense against the immediate physical reward of eating tasty food. In modern society, where food is really tasty and readily available, the urge to eat becomes much more powerful than our internal weight-o-meters.
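The tipping-point logic can be illustrated with a toy sketch in Python. This is not the Exeter team’s actual model — the risk functions and all parameters below are invented for illustration. It simply balances a starvation risk that falls as fat reserves grow against a predation risk that rises with body mass, and finds the fat level minimizing total risk:

```python
# Toy model (hypothetical, for illustration only): total mortality risk is
# starvation risk (falls with fat reserves, rises when food is scarce) plus
# predation risk (rises with fat, since heavier animals escape less easily).

def total_risk(fat, food_availability, predation=0.05):
    """Combined mortality risk for a given fat reserve (arbitrary units)."""
    starvation = (1.0 - food_availability) / (1.0 + fat)  # scarce food -> risky to be lean
    capture = predation * fat                             # more fat -> easier prey
    return starvation + capture

def optimal_fat(food_availability):
    """Grid-search the fat level (0 to 20, step 0.1) that minimizes total risk."""
    levels = [x / 10 for x in range(0, 201)]
    return min(levels, key=lambda f: total_risk(f, food_availability))

# Scarce food ("winter") favors larger reserves than abundant food ("summer"):
winter = optimal_fat(food_availability=0.2)
summer = optimal_fat(food_availability=0.8)
print(winter, summer)  # the winter optimum is higher than the summer one
```

Note how the optimum shifts upward as food availability drops — the same qualitative “gain weight when food is harder to find” prediction the quote below describes, even in this crude stand-in.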
“Because modern food today has so much sugar and flavour the urge humans have to eat it is greater than any weak evolutionary mechanism which would tell us not to,” Higginson goes on to say.
And during winter, our survival instincts kick in big time, making us much more likely to overeat just so that we’ll survive the season, and dooming New Year’s weight-loss resolutions throughout the world before they even begin.
“The model also predicts animals should gain weight when food is harder to find. All animals, including humans, should show seasonal effects on the urge to gain weight. Storing fat is an insurance against the risk of failing to find food, which for pre-industrial humans was most likely in winter. This suggests that New Year’s Day is the worst possible time to start a new diet.”
The evolutionary model also shows that there is no evidence to support the “drifty gene” hypothesis, which some researchers have previously suggested would explain why some people become overweight and others do not.
We all know that men like to impress the fairer members of our species, and this permeates almost everything we do: we want to drive the shiniest car on the block, crack the funniest jokes 24/7, and write for ZMEScience so we can impress the ladies at parties (works every time). In essence, no matter how unlikely it is to actually impress, if a man has a choice between doing something and doing that something over the top so he can show off to women, you can bet your right arm he’s gonna do the latter.
Don’t believe me? Well, a recently published study discovered that men will actually eat more food when they dine with a woman than they do in the company of other males, just to show off.
Men who were paired up with a woman tended to eat more to impress the fairer sex. Image via wikimedia
Netflix and eat?
The study observed over 150 adults having lunch at an all-you-can-eat Italian buffet over a two-week period. Researchers from Cornell University, working with Cornell’s Food and Brand Lab, took note of how many pizza slices and bowls of salad each subject consumed. Men who came to the buffet with a woman packed their plates with pizza slices and left the buffet line with bowls overflowing with salad. On average, they ate 93 percent more pizza and 86 percent more greens than the men who ate alone or with other men.
‘These findings suggest that men tend to overeat to show off – you can also see this tendency in eating competitions which almost always have mostly male participants,’ explains lead author Kevin Kniffin, PhD, of Cornell University in a recent press release.
The researchers waited for the diners to finish their meal and asked them to complete a short survey indicating how full they felt after eating, and how hurried and comfortable they felt while eating. Women ate the same amount regardless of their dining companion’s gender, but those who dined with men reported feeling that they had overeaten and rushed through the meal — however, the team said that their observations disproved this.
So the next time you’re out eating with a guy friend, just try to relax and enjoy your meal; it’s just your brain trying to impress him — his brain is busy doing the same.
After identifying precise groups of cells in the mouse brain that induce eating and others that curb it, a team of researchers caused sated mice to keep eating and hungry mice to stop eating simply by stimulating one of these areas. Their findings could aid the development of novel drugs that target eating disorders such as anorexia, or obesity.
“This is a really important missing piece of the puzzle,” says neuroscientist Seth Blackshaw of Johns Hopkins University in Baltimore. “These are cell types that weren’t even predicted to exist.”
Scientists led by Joshua Jennings and Garret Stuber of the University of North Carolina at Chapel Hill first genetically engineered mice so that some of their neurons responded to light. This allowed them to apply optogenetic techniques to manipulate the behavior of these neurons. Optogenetics is a hot technique in neuroscience research right now: a light-activated gene (called a channelrhodopsin) is inserted into the genome of, say, a mouse, targeted to a single neuron type. Whenever light is shone into the mouse’s brain, the channelrhodopsin responds and the neurons expressing it fire. Basically, simply by shining light you can switch certain neuron types on or off (there are also channels that inhibit neuron firing).
An on/off switch for eating
The neurons the researchers targeted reside in a brain region called the bed nucleus of the stria terminalis, or BNST. Some of the message-sending arms of these neurons reach into the lateral hypothalamus, a brain region known to play a big role in feeding. When a laser was used to activate the BNST neurons, the mice began to eat insatiably.
“As soon as you turn it on, they start eating and they don’t stop until you turn it off,” Stuber says.
The opposite behavior happened when a laser silenced the BNST neurons’ messages to the lateral hypothalamus: the mice would not eat, even when hungry. These two important observations back up the idea that these lateral hypothalamus neurons normally restrict feeding.
Is there a physiological feeding limit, though? For instance, if these neurons were stimulated indefinitely, would the mice die from overeating or starvation? The researchers don’t know, since for this particular study they only used short, 20-minute bursts of laser light. Longer-term manipulations of these neural connections — perhaps using a drug — might cause lasting changes in appetite and, as a result, body mass, Stuber says.
What the findings, which were reported in a paper published in the journal Science, once again show is that the brain is the master. “We think of feeding in terms of metabolism and body stuff,” Stuber says. “But at the end of the day, it’s controlled by the brain.”