Category Archives: Anatomy

New bionic arm can communicate with the brain of the wearer

Researchers from the US have created a revolutionary bionic arm for people with upper-limb amputations. It’s one of the first prosthetic arms to deploy all functions of the hand at the same time, allowing users to control it intuitively through a brain-computer interface. 

Image credit: The researchers

The prosthetics and orthotics market is expected to reach $8.6 billion by 2028, according to a recent report, with a growing geriatric population and sports injuries driving market growth. But despite a series of early innovations, artificial limbs haven’t changed that much in recent years, and the ones currently on the market don’t give wearers intuitive sensations.  

With this in mind, researchers working at the Cleveland Clinic developed a first-of-its-kind bionic arm that combines three important functions. It gives those who use it the intuitive feeling of opening and closing the hand, as well as intuitive motor control and touch and grip kinesthesia, as the researchers explain in their paper. 

“We modified a standard-of-care prosthetic with this complex bionic system which enables wearers to move their prosthetic arm more intuitively and feel sensations of touch and movement,” researcher Paul Marasco said in a statement. “It’s an important step towards providing people with amputation with complete restoration of natural arm function.”

Combining touch, grip, and motor control can trick the senses and brain of the prosthetic wearer into thinking it’s real, Marasco explained. Patients control the prosthetic by sending impulses from the brain when they want to use it or move it. The arm also works with a set of sensors that gather information from the environment and send it back to the brain, mimicking the biological mechanism. 

The researchers tested their new bionic arm in real life with two individuals who had upper limb amputations. One of the test subjects, Claudia Mitchell, told Daily Mail that the arm made a “huge difference” in her life, allowing her to do everyday activities that she couldn’t before, such as cutting a peach and picking up a make-up bag with her thumb and forefinger. 

This is a big step forward, as conventional prosthetic arms can’t recreate such fine movements. Amputees instead have to keep a close eye on things that you or I would do without a second thought. Now, with the new bionic arm, the study participants reverted to reflexive behaviors they had before the amputation, such as intuitive grip. 

“Perhaps what we were most excited to learn was that they made judgments, decisions and calculated and corrected for their mistakes like a person without an amputation,” Marasco said. “With the new bionic limb, people behaved like they had a natural hand. Normally, these brain behaviors are very different between people with and without upper limb prosthetics.”

Attention to detail

The newly developed bionic arm relies on three main components. Apart from the arm itself, it involves rerouting nerve endings and fitting small robots that help control the arm. It all starts with a surgical procedure that takes the amputee’s unused nerve endings within the healthy part of the arm and plugs them into the site of amputation.

The arm is then placed onto the amputation site, with very small robots fitted into the socket. The robots stimulate the newly attached nerve endings by pressing on relevant areas of the site. As Marasco puts it, “you can buzz the muscles” and generate “perceptual illusions of complex hand movement” in the people who use the new bionic arm. 

Image credit: The researchers

Instead of starting from scratch, the researchers worked with prosthetic limbs currently available on the market and altered them, adding advanced computing, touch, and movement sensors. By doing so, they hope the new bionic arm will reach rehabilitation clinics much faster and also be more cost-effective. “It looks like any other prosthetic,” Marasco explained. 

“Over the last decade or two, advancements in prosthetics have helped wearers to achieve better functionality and manage daily living on their own. For the first time, people with upper limb amputations are now able to again ‘think’ like an able-bodied person, which stands to offer prosthesis wearers new levels of seamless reintegration back into daily life,” he added.

The research behind the arm was published in the journal Science Robotics. 

What is hemoglobin, and should you worry that it’s in your blood?

If you like breathing and being alive, you should be very fond of hemoglobin. This iron-bearing compound is biology’s designated oxygen carrier.

Image credits Murtada al Mousawy / Flickr.

With very few exceptions, blood is a distinctive, intense crimson. The color comes from the high levels of hemoglobin (also spelled ‘haemoglobin’) in erythrocytes, red blood cells. Although that sounds like a fancy type of goblin, it’s actually a metalloprotein. And we should be very thankful that it’s there! Animals of our size would arguably not be possible without hemoglobin.

Important delivery

Your red blood cells are roughly 96% hemoglobin by dry weight, and around 35% when hydrated. This extremely high content should be our first indication of how critical the protein is to our bodies. So what exactly does it do? Well, in essence, it works as the body’s fuel supplier. Hemoglobin-laden cells make sure that there’s enough oxygen reaching tissues in your body for them to be able to generate energy (respiration).

Hemoglobin is the body’s designated oxygen carrier. Each molecule of it — at least, of the version the human body uses — can securely bind to four oxygen molecules and quickly let them go when needed. While blood can naturally carry some oxygen dissolved in its plasma, the hemoglobin in our red blood cells increases its ability to carry oxygen seventy-fold. Which is very good for us.
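To see where a figure like that comes from, here’s a back-of-the-envelope sketch using the standard clinical oxygen-content formula; the hemoglobin concentration, saturation, and partial pressure are typical textbook assumptions, not values from this article:

```python
# Arterial oxygen content, using the standard clinical formula:
#   CaO2 = 1.34 * Hb * SaO2 + 0.003 * PaO2
# 1.34 mL O2 per gram of fully saturated hemoglobin;
# 0.003 mL O2 per dL of blood per mmHg of dissolved O2.
hb = 15.0     # hemoglobin concentration, g/dL (typical adult, assumed)
sao2 = 0.98   # fractional hemoglobin O2 saturation (assumed)
pao2 = 100.0  # arterial O2 partial pressure, mmHg (assumed)

bound = 1.34 * hb * sao2   # O2 carried on hemoglobin, mL O2/dL blood
dissolved = 0.003 * pao2   # O2 dissolved in plasma, mL O2/dL blood

print(f"hemoglobin-bound: {bound:.1f} mL/dL, dissolved: {dissolved:.1f} mL/dL")
print(f"hemoglobin boosts carrying capacity roughly {bound / dissolved:.0f}-fold")
```

With these numbers, the bound fraction works out to roughly 65 times the dissolved one, in the same ballpark as the seventy-fold figure quoted above.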

Although its main job is to carry oxygen to and fro, that’s not the only gas it can carry. Hemoglobin is also involved in scrubbing the ‘exhaust’ from cells, carrying around 25% of the CO2 produced by our cells, and shuttles nitric oxide (NO) around the body.

What exactly is it?

Like all other proteins, hemoglobin is a 3D structure created from multiple amino acids that are bound together, then folded around themselves. The ‘globin’ part of the name nods to the molecule’s shape, from the Latin root ‘globus’ (meaning ball). While different lineages can employ heme/globin proteins (that’s the name of the wider family of these proteins) with different structures, the one humans and other mammals use is made up of four globular protein subunits. The particular way these link up together is known as a globin fold pattern and is widely seen in the heme/globin family.

A hemoglobin molecule. Image via Pixabay.

The thing that makes hemoglobin so important is an iron ion sitting in the middle of each of these four sub-sections. This is the specific site where O2 molecules reversibly bind to the protein, to be carried away. The iron is, in turn, anchored to the protein through four bonds with nitrogen atoms. Whether the protein can bind oxygen depends on the oxidation state of this iron atom: ‘ferrous’ iron, or iron(II), can bind it, while ‘ferric’ iron, iron(III), cannot. When not in use, this iron atom binds to a water molecule as a placeholder.

Carbon dioxide, on the other hand, binds to other areas of the heme proteins that make up the hemoglobin molecule.

What happens when it doesn’t work?

Hemoglobin’s reactivity and ability to bind to a long list of compounds is usually its key strength, largely because it can just as easily unbind these molecules. One gas that throws a wrench in that approach is carbon monoxide: a colorless, odorless gas that’s very deadly.

The issue with carbon monoxide (CO) is that once it binds to hemoglobin, it forms carboxyhemoglobin. CO binds far more tightly than oxygen does, making the bond much more stable and, as such, much harder to break later on. For starters, this means that any red blood cell that has encountered a carbon monoxide molecule has its ability to transport oxygen dramatically (if not completely) reduced. Secondly, the reaction between the two triggers a release of mitochondrial free radicals. These cause oxidative stress, which could be a main driver of aging, and also attract leucocytes (white blood cells) to the area.

A few of the effects of carbon monoxide poisoning, on a biological level, include damage to endothelial cells (the cells that line blood vessels), most notably in the vasculature of the brain, and lipid peroxidation (chemical cell damage) of brain membranes. Because carboxyhemoglobin is a very bright red, the skin of CO poisoning victims can take on a pink or cherry-red hue. Ambient concentrations of just around 0.1% can cause unconsciousness and possibly death.

Carbon monoxide is a product of incomplete combustion. It’s most commonly found in smoke from low-burning or smoldering fires. Any fire that doesn’t have access to as much oxygen as it ideally wants will produce some amount of CO. Carbon monoxide is also produced by cigarettes and is one of the main causes of feeling out of breath after a smoke. Up to 20% of all oxygen-binding sites can be blocked by CO in heavy smokers.
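As a rough illustration of what that 20% figure means for oxygen transport, here is a simple sketch; the hemoglobin level and carboxyhemoglobin fractions are illustrative assumptions, and the real penalty is even worse, because CO also makes the remaining sites hold their oxygen more tightly:

```python
# Usable O2-carrying capacity when a fraction of binding sites is
# blocked as carboxyhemoglobin (COHb). Linear scaling is a lower bound
# on the damage: CO also shifts the O2 dissociation curve, so the
# remaining sites release oxygen less readily.
full_capacity = 1.34 * 15.0  # mL O2/dL at 15 g/dL hemoglobin, fully saturated

for label, cohb in [("nonsmoker", 0.01), ("heavy smoker", 0.20)]:
    usable = full_capacity * (1.0 - cohb)
    print(f"{label}: COHb {cohb:.0%} -> usable capacity at most {usable:.1f} mL/dL")
```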

Image credits ZEISS Microscopy / Flickr.

A bit ironically, the process through which hemoglobin is recycled in our spleen is the only natural source of carbon monoxide in the human body, and it accounts for the baseline levels of this gas found in our bloodstream. Each healthy red blood cell lives for around 120 days before being recycled.

Cyanide (CN-), sulfur monoxide (SO), sulfide (S2-), and hydrogen sulfide (H2S) groups also like binding to hemoglobin and not letting go, making them very toxic to us. Avoid breathing them in at all costs.

A too-low number of red blood cells — and thus, insufficient hemoglobin to carry gases around — is known as ‘anemia’. Anemia is the most common blood condition in the US, affecting around 6% of the population. In very broad terms, it is caused by either loss of blood, the inability to produce enough red blood cells, or conditions that lead to a rapid loss of such cells, although some cases are caused by genetics.

Women, older individuals, and those with long-lasting conditions are more likely to have anemia. Old age and chronic medical conditions can cause anemia by damaging the body’s ability to produce and recycle hemoglobin; women are more likely to develop iron-deficiency anemia due to the blood loss related to their menstrual cycle.

Symptoms of anemia include dizziness, or feeling like you’re about to pass out; headaches; unusual heartbeat patterns, shortness of breath, cold hands and feet, tiredness and physical weakness; pain in the chest, belly, bones, or joints, or swelling, can also be a symptom.

Any hemalternatives?

Oxygenated hemoglobin gives the blood in our arteries its scarlet color; deoxygenated blood flowing back to the heart and lungs is a darker color, although the veins that carry it often appear a bit more purple or blueish. A related compound known as myoglobin is why our muscles, and the muscles in red meat, are a shiny red. Myoglobin works pretty much the same as hemoglobin, with the difference that it’s not meant to carry oxygen around, but rather keep it stored for use. Structurally, myoglobin has only one binding site (so it can hold less oxygen than hemoglobin), but the bond is much more stable.

But if you want to go full blue blood, you can go for hemocyanin. This is the second-most common molecule used to transport oxygen in blood after hemoglobin, and it’s seen in many mollusks and arthropods. It replaces the iron-bearing heme groups with copper-bearing ones, and turns a very rich blue when oxygenated.

If you want to go for pink or violet, instead, try hemerythrin. It’s not a very commonly seen compound, with only a handful of species, mainly marine invertebrates and a few worms (annelids), using it. It does stand out for turning clear when not oxygenated.

What annelids do like to use, however, are chlorocruorin and erythrocruorin. They’re pretty similar in structure but have significantly different heme groups. Chlorocruorin appears green in dilute solutions and light red in concentrated ones. Erythrocruorin is common in earthworms and has the distinction of being a huge molecule, containing up to hundreds of heme-protein subunits and iron-ion binding sites.

Finally, a more exotic use of heme proteins is found in leguminous plants such as beans. They employ leghemoglobin to draw oxygen away from the nitrogen-fixing bacteria around their roots; oxygen there would impair the bacteria’s reduction of nitrogen gas to ammonia, the source of fixed nitrogen, a key nutrient for all plant life and one that imposes a ceiling on plants’ ability to grow.

Gourmet pterosaurs constantly improved their flight — until they were wiped out by killer asteroid

Two giant Arambourgiania pterosaurs sharing a small theropod for dinner. Credit: Mark Witton.

Pterosaurs were the very first vertebrates to evolve powered flight, nearly 230 million years ago. Previously, only insects were capable of flying. But these first fliers were a bit clumsy, and it took a while before pterosaurs could reach their full potential. According to a new study published today in the journal Nature Communications, the ancient flying reptiles became better fliers at a constant rate until they went extinct 66 million years ago.

“Meaning that on average for 150 million years descendants were better flyers than their ancestors. This is a quite striking and unique demonstration of Darwin’s idea of descent with modification as species get better in their environment. I was certainly surprised to see such a clear demonstration of that!” Chris Venditti, Lecturer in Evolutionary Biology at the University of Reading in the UK and lead author of the new study, told ZME Science.

Bigger, better fliers

Pterosaurs were airborne animals closely related to dinosaurs. Like other flying animals, these reptiles generated lift with their wings, performing the same kinds of motions as birds and bats. They became quite good at it, traveling long distances and occupying new habitats across the world. Eventually, they branched out into an enormous array of species, including the largest winged animals ever.

But this journey wasn’t straightforward. By applying new statistical methods to the fossil record, Venditti and colleagues reconstructed the evolution of pterosaur flight dynamics across millions of years. Their findings suggest that natural selection acted to constantly increase flight efficiency in these animals, from the time they first appeared in the fossil record until their premature extinction, when an asteroid impact wiped them out along with all non-avian dinosaurs.

There was only one exception to this pattern.

“A group of pterosaurs called azhdarchoids buck this trend. There are many hints in the scientific literature that suggest that this group had more terrestrial affinities than other pterosaurs. Some had inflexible necks which are not ideal for efficient flying, others left fossil tracks indicating terrestrial proficiency, and yet others had adaptations associated with ground-dwelling generalist foraging or wading foragers that fed on hard-shelled organisms at water margins. So, it seemed that these azhdarchoid pterosaurs did not rely on flight so much and our results support that – whilst they could fly, they might have only done so when they needed to. Some of these groups of pterosaurs were enormous (as tall as a giraffe),” Venditti said.

An Arambourgiania pterosaur with a giraffe and average-sized human to scale. Credit: Mark Witton.

Besides increasingly better flight performance, the researchers also showed that pterosaurs grew in size over time, but only after birds first appeared. According to Venditti, this was expected given Cope’s Rule — the tendency for organisms in evolving lineages to increase in size over time — and the interactions between the two groups of flying animals suggest that birds outcompeted pterosaurs at the small size range.

Interestingly, pterosaurs not only got bigger, their wings grew even larger.

“Through time pterosaurs got bigger wings than we would expect for their size – their relative wing size increased through time. So, they got bigger, but the wings got even bigger! All other things equal then, in turn, this leads to increased flight performance. In Azhdarchoids that buck the trend, even though they were some of the largest pterosaurs, relatively speaking they had small wings (for their size),” Venditti said.

From beetles to fish: how pterosaur diet evolved

Another article published today in the same edition of Nature Communications also investigated pterosaur evolution, this time from a dietary perspective.

By studying the wear and tear of fossilized teeth from numerous pterosaur species, researchers at the University of Birmingham in the UK, led by Jordan Bestwick, could reconstruct what 17 different species of pterosaurs ate.

Dimorphodon, for example, ate a mix of vertebrates, Rhamphorhynchus ate fish, and Austriadactylus ate ‘hard’ invertebrates such as beetles and crustaceans, the authors concluded.

“I found large dietary diversity between pterosaurs as a group, ranging from carnivores, piscivores (fish eaters) to consumers of crunchy invertebrates and even generalists. In some instances this reaffirmed our current understanding of the diets of some species, whereas in others they provided completely new insights into diet,” Bestwick told ZME Science.

The researchers were able to infer what pterosaurs ate millions of years ago by studying the marks left on their teeth. Different kinds of foods leave different impressions, which reflect a creature’s diet.

Bestwick and colleagues used a technique called microwear texture analysis, which they had previously employed on modern reptiles such as crocodiles and monitor lizards. Since the technique could determine the foods in known modern reptile diets, the researchers were confident enough to try it out on extinct reptiles such as pterosaurs.

“We found that the earliest pterosaurs consumed mostly invertebrates and that the later species were the more obligate carnivores and piscivores. What was really interesting is that this dietary shift sped up around the 150 million year mark which is around the same time that birds were evolving. Further study is needed to know whether our finding is just a coincidence or actually represents a trend where the evolution of birds changed the trajectory of pterosaur evolution, but does provide a new voice into the fiercely debated topic on pterosaur and bird competition,” Bestwick said.

“We also found evidence of one pterosaur shifting its diet as it grew up. Rhamphorhynchus lived in Germany around 150 million years ago and its complete life-history is preserved in the fossil record, from hatchlings no bigger than a sparrow, to adults about the size of a gannet. As Rhamphorhynchus grew up it shifted its diet from invertebrates to fish. This dietary shift is observed in many crocodilians and gives a clue into how pterosaurs looked after their young. Most reptiles do not feed their young and so young individuals eat different foods to the adults. Birds, on the other hand, feed their helpless young and so the young are consuming the same foods as the adults. That Rhamphorhynchus changed its diet provides a clue that maybe pterosaurs grew up like reptiles, despite flying like birds,” he added.

The two studies complement each other nicely, enriching our understanding of nature’s pioneering fliers.

“I think our study is a striking demonstration of how natural selection sculpts biological organisms through time by natural selection to be better in their environment. It also provides a blueprint to objectively study evolution through millions of years,” Venditti said.

“Pterosaurs are one of the most famous groups of prehistoric animals in the public eye (although they are colloquially referred to as ‘pterodactyls’) and are like nothing that is alive today. Microwear analysis is truly a window into the past that enhances and even changes our understanding of how these animals lived, grew up and evolved, makes them more like real animals, rather than as monsters you see in films,” Bestwick added. 

With so many of us working from home, here’s how to prevent back aches and other issues

Twice as many people are working from home as before the pandemic. The work-from-home economy is promising to reshape how we think about our workplaces, and with no end to the pandemic in sight, many of us might be working from home for a long time.

But while this brings obvious advantages (bye-bye, commuting time), there are a couple of issues to deal with. For instance, while our workspaces were designed for work, our homes might not be. You may very well work from your couch, your bed, or even the floor, but your back won’t be happy about it. Most people thought their offices would only be closed for a few weeks, but as the work exodus continues, it’s becoming apparent that we’ll be working from home for a long time. If that’s the case, it’s more important than ever to look after yourself.

Working from home comes with countless benefits but, more often than not, your back’s health is sacrificed in this kind of setup. If you’ve been experiencing back aches and other related health issues, this article can help.

Here’s how you can prevent back aches when working from home:

Get up and stretch regularly

Too much sitting is bad for you — even if you exercise. So the first piece of advice to follow is to avoid sitting for prolonged periods of time. The more you sit, the more likely you are to develop chronic conditions down the road.

The best thing to do is to get up every half an hour or so and stretch a bit, or just walk around (get a cup of tea or some water). A good rule of thumb to follow is the ’20-20-20 rule’, which is meant for your eyes, but could be helpful in more ways than one. The idea is that every 20 minutes, you should get up and take a 20-second break to look at something that’s 20 feet away from you — that’s about 6 meters.

Aside from helping you prevent back aches, getting up and stretching while you work can also boost your productivity and keep stress at bay. This is especially important for people whose jobs require them to look at a computer or laptop for long hours.

Posture, posture, posture

You’ve heard it time and time again, but there’s a good reason for it: posture can make or break your back — almost literally. So what’s good posture?

For starters, don’t be a slouch. Slouching adds stress on the spine, and a constant slump can have devastating long-term effects not only on your spine but also on your internal organs (you’re basically adding extra pressure on them). A good way to prevent that is to stand (or sit) up tall, with your shoulders back and belly tucked in; pretend you’re having your height measured.

If you can’t sit up tall on your own, consider using pillows to support your back. Certain pillows are made specifically for back support, and using one will make it easier to prevent back aches in the long run.

It’s not just your back, either. Posture is also about your neck and head: beware of ‘text neck’! When you tilt your head down, that puts pressure on your spine, which can lead to — you’ve guessed it — back pain. This happens especially when we read or write texts and look down. To counter this, avoid looking down for prolonged periods: either bring your phone up in front of your eyes or move your eyes, not your head.

Working from home can ruin your posture, but you can also use this to your advantage. Unlike your workspace, you can tweak your home in any way you’d like, so take advantage of this and make it comfortable and pleasant.

Finding a good chair

Posture can only get you so far if your chair sucks. Thankfully, ergonomic chairs have become common nowadays, and they’re fairly accessible. It doesn’t have to be the latest or most expensive model (good ergonomic chairs are not always cheap), but any chair that is comfortable and offers solid lumbar support can make a world of difference.

Of course, once you get a chair you also have to sit in it properly. If you’re already suffering from back pain, listen to your body and try sitting in a way that isn’t painful or unpleasant, and again, avoid sitting for too long at a stretch.

Working from home has become mainstream, which means you’ll come across countless businesses coping with the demand by selling ergonomic chairs. To narrow down your options, visit the store and try out the chair yourself. This will let you determine which kind of chair truly fits your needs and suits your back.

Get a good night’s sleep

A bad back day usually starts in the morning, and that’s often because you’re not sleeping well. Get rid of that old, soft mattress and get a firm one that’s good for your back. Pay attention to your position when you sleep and when you wake up — are you in a comfortable, healthy position?

If you sleep on your side, it can help to bend your knees, but don’t hug them. If you sleep on your back, don’t use a thick pillow — use a slim one that keeps your head at the same level as your body.

Aside from investing in a high-quality mattress, filling your bedroom with comfortable pillows can also help. Pillows can provide the necessary support to your body and relieve pressure. This way, you’ll be able to sleep soundly throughout the night.

Working out helps

A strong back is a healthy back, and simple exercises can go a long way. Harvard University has some excellent tips for working out with back pain, but in general, all workouts that don’t cause pain are good. Whether it’s swimming or stretching, it will likely help your back.

Yoga is also excellent for back pain and there are a million programs you can find tailored for a specific pain or ache. Here too, it pays to customize your home for a workout because our homes aren’t just becoming our offices, they’re also becoming our gyms.

There are actually many workout routines that can help relieve your back pain. However, if you want them to succeed, you should be willing to put in the time and effort. You won’t get rid of any back aches if you do yoga for a week and then return to your unhealthy lifestyle. If you want to see results from your workouts, consistency is key.

If pain persists, have it checked out

We all hate going to the doctor’s, especially in this period. But if you’re suffering from back pain that doesn’t go away, ignoring it won’t make it better. As unpleasant as it may be, go to the doctor’s and have it checked out.

Depending on the gravity of the problem, your doctor might recommend medication or procedures such as inversion therapy. Regardless, to ensure the problem is resolved in the safest way possible, it’s always best to seek professional advice if your back ache persists.

In Conclusion

In time, back pain can cause major problems, and it’s always a good time to start taking more care of your back. If you sit at a desk all day long, you’re at great risk of developing back pain, and that pain can turn into something much more troublesome along the way.

Human fetuses have lizard-like limb muscles but lose them before birth

Scan of a developing fetus’ left hand, showing the dorsometacarpales. Credit: Rui Diogo.

We might not have tails anymore, but our genomes still contain the genetic code for growing one. Each of us has an embryonic bony tail buried in our lower backs—the coccyx or tailbone—stunted by a loss of molecular signals that would otherwise cause it to grow out like an arm or leg. 

Scientists call these features vestigial body parts — a vestige of our evolutionary heritage. That’s because evolution is not a straightforward process but rather a series of tweaks as beneficial mutations are promoted in populations only to become useless in later generations as the environment changes (other examples include wisdom teeth or the appendix).

Often, these vestigial structures performed some important function in the organism at one point in the past. However, as populations changed under natural selection, they became increasingly unnecessary, to the point of being rendered pretty much useless. When this happens, these features tend to disappear over the generations — but sometimes they stay because they do no harm.

In other situations, vestigial body parts form during an organism’s early development, only to be discarded as the creature matures. In a striking new study, researchers have discovered nearly 30 such muscles in the legs and arms of developing human fetuses. By week 13 of gestation, a third of these muscles vanish or fuse together.

Some of these muscles, such as the dorsometacarpales, were last seen in our adult ancestors over 250 million years ago, but they can still be found in lizards and salamanders today.

The muscles that turned all your digits into thumbs

In order to capture these embryonic developments for the very first time, researchers led by Dr. Rui Diogo from Howard University in the U.S. employed advanced 3D imaging techniques on 15 developing fetuses.

Sometimes, a child is born with a few extra finger and hand muscles, which may explain some of these fetal developments. However, a baby is never born with all the dorsometacarpales that the biologists saw in their 3D scans.

It’s unclear why the body temporarily grows these limb muscles only to delete them at a later time, but the process has been described in unprecedented detail in this study.

Diogo says that our thumbs have a lot more muscles than other digits, which allows for very precise thumb movements. Perhaps our other digits also had more muscles but lost them because we didn’t need them as much — however, it’s not so easy to lose a feature once it’s been expressed by the genome.

The study is striking because it adds a new level of complexity to human development. It also raises important questions. What else are we missing, for instance? Biologists hope to answer this question, at least partly, by performing detailed scans of other body parts in the future.

The findings appeared in the journal Development.

How much are Americans sitting?

The short story: a lot. The long story: about 8.2 hours for adolescents and 6.4 hours for adults.

“Prolonged sitting, particularly watching television or videos, has been associated with increased risk of multiple diseases and mortality,” the new study starts off. Indeed, research has linked sitting to issues such as increased blood pressure, high blood sugar, excess body fat and abnormal cholesterol levels — to name just a few. The science is extremely clear on that, but although we know we should sit less, we really don’t. I’m sitting as I’m writing this, and you’re probably reading this sitting. Sitting has become so ingrained in our day to day life that not doing it almost feels like a chore.

In an attempt to see just how much Americans sit, researchers working in Canada and the US analyzed data from nearly 52,000 children, adolescents, and adults collected between 2001 and 2016.

A majority of all investigated age groups spend an unhealthy amount of time sitting. The team found that 62% of children, 59% of adolescents and 65% of adults sat watching television or videos at least 2 hours a day in 2015-2016. But while the number of people sitting for more than 2 hours has remained relatively stable, the overall sitting time has increased. We’re sitting more than ever before, researchers found.

Overall sitting time has increased from 7.0 to 8.2 h/d among adolescents and from 5.5 to 6.4 h/d among adults, and the main reason for that is the computer. Computer use outside school or work for at least 1 h/d increased from 2001 to 2016. Surprisingly, this growth has been most pronounced among adults.
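
Those figures work out to roughly a 17% rise for adolescents and 16% for adults over the 15-year span — a quick sketch of the arithmetic (the helper function is just for illustration):

```python
def pct_increase(before: float, after: float) -> float:
    """Percentage increase from `before` to `after`."""
    return (after - before) / before * 100

# Daily sitting time (hours/day) reported in the study, 2001 vs. 2016
adolescents = pct_increase(7.0, 8.2)
adults = pct_increase(5.5, 6.4)
print(f"adolescents: +{adolescents:.1f}%, adults: +{adults:.1f}%")
# prints: adolescents: +17.1%, adults: +16.4%
```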

“In this nationally representative survey of the US population from 2001 through 2016, the estimated prevalence of sitting watching television or videos for at least 2 hours per day generally remained high and stable. The estimated prevalence of computer use during leisure-time increased among all age groups, and the estimated total sitting time increased among adolescents and adults,” the study reads.

Some groups tend to watch TV more than others, researchers also found.

“For all ages, a substantially higher prevalence of sitting watching television or videos was observed among male, non-Hispanic black, obese, or physically inactive individuals,” the study continues, adding that black adults were at higher risk of all-cause mortality associated with prolonged television viewing than were white adults.

While having an active lifestyle can offset some of the damage done by sitting, it can’t eliminate the risk completely. Public health programs typically focus on increasing physical activity rather than reducing sitting time, and sitting time rarely plays a central role in health interventions.

The US isn’t an isolated case. The world is experiencing a sitting epidemic, with previous research finding that sitting occupies up to half of an adult’s workday in developed countries. However, the world still doesn’t have a clear solution, particularly as alternatives such as standing desks are rather unpopular, and the health results associated with them have been mixed.

The study has been published in JAMA.

Researchers 3D print a miniature heart — using a patient’s own cells

The heart is still too small to be useful, but this represents an important proof of concept, researchers say.

We’ve all seen how 3D printers can be used to produce a wide variety of materials, but human body parts weren’t exactly on the expectation list — and a heart is probably the last thing you’d expect. But in a new study, researchers from Tel Aviv University have done just that: they’ve 3D printed a miniature version of a human heart, using material from a patient.

“It’s completely biocompatible and matches the patient,” said Tal Dvir, the professor who directed the project. “This greatly reduces the chances of rejection inside the body.”

Dvir and colleagues harvested fatty tissue from a patient, then separated it into cellular and non-cellular components. The cellular components were then reprogrammed into stem cells and subsequently turned into heart tissue cells. The non-cellular material was also processed and turned into a gel that served as the bio-ink for printing.

The process was lengthy. A 3D printer deposited the bio-ink in a fine stream, and the printed cells were then left to mature for another month. For now, the heart is very small and doesn’t “work” — but this is still an important breakthrough. Previously, only simple tissues had been printed.

A simplified diagram of the heart-printing process. Image credits: Tel Aviv University.

“We need to develop the printed heart further,” Dvir said. “The cells need to form a pumping ability; they can currently contract, but we need them to work together. Our hope is that we will succeed and prove our method’s efficacy and usefulness.”

The potential for this invention is tremendous. Cardiovascular diseases are the number one cause of death in industrialized nations, and heart transplants face a number of hurdles, ranging from the lack of donors to challenging surgery and potential rejection. This would not only ensure that there is always a donor (the patient himself) but also eliminate the risk of rejection.

A human-sized heart might take a whole day to print and would require billions of cells, compared to the millions used to print these mini-hearts, Dvir said. This is just the first stage of the project, but it’s a promising one. Even though it will be a long time before functional hearts can be produced this way, researchers are also considering printing “patches” to address localized heart problems.

“Perhaps by printing patches we can improve or take out diseased areas in the heart and replace them with something that works,” Dvir concluded.

The study “3D Printing of Personalized Thick and Perfusable Cardiac Patches and Hearts” has been published in Advanced Science.

New study shows how muscle memory works — you never really lose it

No such thing as ‘use it or lose it’, researchers find.

Image in public domain.

Quite often, our muscles do things without us truly thinking about it. Whether it’s playing the guitar, riding a bike, or simply typing your password at the ATM, we’re all familiar with our muscles doing things “without us”. These and other forms of muscle memory have been a puzzling matter of debate for decades, and researchers are still discovering more and more about how this actually works.

But if you don’t use them regularly, do you lose that ability? In other words, is it “use it or lose it”?

[panel style=”panel-default” title=”Muscle memory” footer=””]Muscle memory is a term used to describe tasks that seem easier to perform after previous practice — even if the practice happened a very long time ago. It’s as if the muscles “remember” what happened and are quicker to return to their previous capacity.

This applies not only to the things you do, but also to muscle mass and training, which helps explain why strength-trained athletes experience a rapid return of muscle mass and strength even after long periods of inactivity.

[/panel]

We’ve known for a while that if you stop using your muscles, they’ll shrink and, in time, you’ll lose muscle mass. Until recently, scientists also thought that the nuclei (the cell “headquarters” that build and maintain muscle fibers) are also lost. However, according to a new review, this isn’t the case — and muscles are able to “bank” muscle growth potential.

The key lies in a structure called a syncytium — a special type of tissue in which cells are fused so closely they almost behave as a single cell.

“Heart, bone and even placenta are built on these networks of cells,” says Lawrence Schwartz, an author of the new study. “But by far our biggest cells – and biggest syncytia – are our muscles.”

“Muscle growth is accompanied by the addition of new nuclei from stem cells to help meet the enhanced synthetic demands of larger muscle cells,” explains Schwartz. “This led to the assumption that a given nucleus controls a defined volume of cytoplasm – so that when a muscle shrinks or ‘atrophies’ due to disuse or disease, the number of myonuclei decreases.”

The assumption seemed valid for a long time, but it no longer seems to be the case.

Previously, studies have found evidence of nuclei degradation caused by atrophy or paralysis. However, more recent research involving genetic markers found that the decaying nuclei did not belong to muscles — but rather belonged to inflammatory and other cells recruited to atrophic muscle.

In other words, you never really lose these nuclei, and you never really lose your muscle memory.

“Muscles get damaged during extreme exercise, and often have to weather changes in food availability and other environmental factors that lead to atrophy. They wouldn’t last very long giving up their nuclei in response to every one of these insults,” Schwartz explains.

He goes on to cite two recent studies, one in rodents and one in insects, which demonstrated that muscle nuclei are not lost to atrophy and even remain after muscle death has been initiated. This makes a lot of evolutionary sense: muscles get torn during extreme exercise and have to weather changes in nutrient availability and activity variation. Giving up the nuclei every time an unfortunate change happens would not be very productive, Schwartz says.

“It is well documented in the field of exercise physiology that it is far easier to reacquire a certain level of muscle fitness through exercise than it was to achieve it in the first place, even if there has been a long intervening period of detraining. In other words, the phrase ‘use it or lose it’ might be more accurately articulated as ‘use it or lose it, until you work at it again’.”

The study has been published in the open-access journal Frontiers in Physiology.

Most people who think they have a penicillin allergy don’t — and it can be a problem

A new study has found that a surprisingly high number of people wrongly believe they are allergic to penicillin, and that might end up costing them somewhere down the line.

Within the US, some 10% of all patients believe they are allergic to penicillin — but in reality, 90% of these people aren’t. This means that every year, millions of people can end up taking alternative antibiotics, which are more expensive and can destroy the body’s healthy bacterial flora. Furthermore, according to the study, which was carried out in the UK and published in the British Medical Journal, people with a penicillin allergy are 70% more likely to acquire a methicillin-resistant Staphylococcus aureus (MRSA) infection and have a 26% increased risk of Clostridium difficile-related colitis (C. diff.). This means that people who wrongly believe they are allergic to penicillin are needlessly subjecting themselves to additional infection risks.
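
Taken together, those two figures imply that roughly 9% of the population carries an inaccurate penicillin-allergy label — a quick back-of-the-envelope check (percentages from the article; the variable names are mine):

```python
labeled_allergic = 0.10    # share of US patients with a penicillin allergy on record
not_truly_allergic = 0.90  # share of those labels that are inaccurate
mislabeled = labeled_allergic * not_truly_allergic
print(f"{mislabeled:.0%} of the population is wrongly labeled allergic")
# prints: 9% of the population is wrongly labeled allergic
```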

The good news, researchers say, is that there’s a simple allergy test which can be carried out, but it needs to be more widely implemented.

People with an alleged penicillin allergy are typically given a prescription of broad-spectrum antibiotics which, as the name implies, cover a broad number of microorganisms, and can end up killing more things than they should — particularly, the body’s useful bacteria. Any imbalance in the body’s bacterial flora can weaken the immune system and make it easier for other infections to take over — especially drug-resistant bacteria like MRSA.

Meanwhile, penicillin is a very targeted and potent drug, ideal for treating a particular set of infections.

“Penicillin-related drugs, that whole class … they’re very effective at killing, and they’re very targeted. So for some bacteria they’re still the best. Oldie but goody,” Kim Blumenthal, lead author of the new study and assistant professor of medicine at Harvard Medical School, told the Washington Post. “I have seen so many terrible, terrible outcomes” from C. diff. infections, Blumenthal said, including serious diarrhea, sepsis and death.

[panel style=”panel-default” title=”Penicillin” footer=””]Penicillin was discovered in 1928 by Scottish scientist Alexander Fleming, and started being used to treat infections in 1942.

While the number of penicillin-resistant bacteria is increasing, penicillin can still be used to treat a wide range of infections caused by certain susceptible bacteria, including Streptococci, Staphylococci, and Clostridium.

An estimated 0.03% of the population have serious allergies to penicillin.[/panel]

Things can get even worse when targeted penicillin isn’t used. Using non-targeted, broad-spectrum antibiotics can breed the next generation of drug-resistant bacteria. According to the CDC, some 2 million people get infected with these pathogens every year. Among these, over 23,000 will go on to lose their lives as a result of the bacterial infection, which often leads to other complications. The World Health Organization has also identified drug-resistant pathogens as one of the main threats to human society, and already, some infections are becoming nigh impossible to treat — including gonorrhea.


It’s still not clear exactly why so many people wrongly believe they are allergic to penicillin, but doctors have a few good ideas. For starters, many are diagnosed with the allergy as children and later grow out of it — something which can happen but isn’t a guarantee by any means. Then, “allergy” means different things to different people. Essentially, an allergy is simply an exaggerated response of the immune system, but that can range from a minor rash to life-threatening issues. If a patient comes into the hospital suffering from a serious, potentially life-threatening condition, and his file says “allergic to penicillin,” doctors simply won’t give him the drug. But quite often, the drug could end up saving his life, at the cost of only a minor side effect. In most cases, researchers say, penicillin should only be avoided if the side effect is serious.

If you’ve been previously diagnosed with such a penicillin allergy, but more than 10 years have passed, doctors suggest getting re-tested.

Journal Reference: Blumenthal et al. “Risk of meticillin resistant Staphylococcus aureus and Clostridium difficile in patients with a documented penicillin allergy: population based matched cohort study.” doi: https://doi.org/10.1136/bmj.k2400

MRI study shows how beatboxing really works — and it’s crazy

Beatboxing is an art form in which performers create percussive sounds using nothing but their vocal tract. Now, a team of scientists is using a real-time MRI machine to see how beatboxers create their magic.

Beatboxing techniques have been used as early as the 19th century, but true beatboxing is derived from the mimicry of early drum machines. Nowadays, beatboxing is mostly associated with hip-hop, though it is not limited to it.

Several studies have been carried out on beatboxers, but in the past, they’ve consisted of only one beatboxer with a particular native language. The new study looked at several beatboxers of different ages and genders and with different native languages.

The team used real-time MRI to observe the vocal tracts of beatboxers just before they make a sound to see how those movements differ from the movements associated with speech. Using real-time data offers a dynamic view of the entire vocal tract, at a high enough resolution to observe the movement and coordination of the different biological elements.

“Beatboxers may learn something different in preparing to make a sound than they do when they’re talking,” said Timothy Greer, a doctoral candidate at the University of Southern California. “Using real-time MRI allows us to investigate the difference in the production of music and language and to see how the mind parses these different modalities.”

Three different snare drum effects were demonstrated by the subject, each produced with different articulatory and airstream mechanisms. The technical names are: a click, an ejective affricate, and a pulmonic egressive dorsal stop-fricative sequence. Image credits: Timothy Greer.

The results surprised even Greer: beatboxers use movements not present in any known languages to produce a wide variety of sounds. Essentially, it’s a completely different way of moving the vocal tract.

“We found that beatboxers can create sounds that are not seen in any language. They have an acrobatic ability to put together all these different sounds,” said Greer. “They can hear a sound like a snare drum and they can figure out what they need to do with their mouth to recreate it.”

“As far as we know, some of the articulations that beatboxers can use are not attested in any language,” he added for ZME Science.

However, this type of study remains challenging, because existing algorithms to analyze the vocal tract movement are based on existing languages — and since beatboxing doesn’t seem to resemble any of them, different and new algorithms are needed.

“The vocal tract is amazing but it’s also incredibly complex. We need to keep creating better computer algorithms to understand how it all works together,” said Greer.

This is only the start, however — the group that acquired the data is already working on algorithms to analyze and better understand this unusual form of art.

“The same group that collected the real-time MRI beatboxing videos–the Speech Production and kNowledge (SPAN) group at USC–has developed a set of region-of-interest (ROI) and segmentation algorithms that can be used on rtMRI data to determine how the different components of the vocal tract move in relation to each other. We are using these tools on our rtMRI data now to get more quantitative observations about beatboxing.”

However, this field of research is not only about beatboxing itself (though it will be a valuable resource for the community) — it can teach us a lot about speech patterns, and even shed some light on our vocal tract anatomy.

“This research has practical and theoretical benefits. Practically, this is one of the first looks at how the vocal tract moves during beatboxing; these videos offer the beatboxing community a tool to use in their art for teaching, exploration, and innovation. This work also benefits linguistic theory because it shows what the vocal tract can do when stretched to its limits. It addresses questions like “why do some sounds exist in speech, but not others?” and “which speech patterns exist only in language, and which speech patterns are grounded in broader cognitive capacities?”.”

Greer will present his findings at the Acoustical Society of America’s 176th Meeting.


Amazing video shows how white blood cells find pathogens — and points to a cure against cancer

Using cutting-edge microscopy imaging, researchers discovered — and filmed — the ‘sensors’ macrophage cells use to detect pathogens. The research might also yield one of the most powerful tools to date in the fight against cancer.


Image credits University of Queensland / Youtube.

Macrophages form the first line of defense in our immune systems, patrolling tissues throughout our bodies and guarding the bits susceptible to infection. Once a macrophage encounters something that doesn’t wear the protein tags of healthy human cells — such as cellular debris, pathogens, cancer cells, or foreign substances — the cell wraps around it and proceeds to digest it.

Still, despite decades of research, we still barely understand how macrophages — and their other white cell relatives — work. In an effort to improve our grasp of these mechanisms, a team from the University of Queensland (UQ) used cutting-edge microscopy techniques to film macrophage cells.

Their research led to the discovery of structures known as “tent-pole ruffles”, which underpin the cells’ functions. The same structures, the team writes, may help us find a new and very powerful tool against cancer.

If you can’t beat them, eat them

“It’s really exciting to be able to see cell behaviour at unprecedented levels of resolution,” says co-author Adam Wall, a researcher in molecular bioscience at UQ.

“This is discovery science at the cutting edge of microscopy and reveals how much we still have to learn about how cells function”.

The ruffles are located on the surface of macrophages, a specific type of white blood cell that directly engages pathogens and other undesirables in our bodies. Tent-pole ruffles underpin their function, the team writes, by allowing the cells to sample their surrounding fluids for potential threats.


How tent-pole ruffles work — video below.
Image credits Nicholas D. Condon et al., 2018, JCB.

They take the name from their shape and work similarly to our sense of taste or smell: the ruffles extend from the cell’s body and — using a special membrane strung between the poles — gather relatively large volumes of fluid that are then sampled for chemical markers. This process is known as ‘macropinocytosis’. If any molecules from a foreign entity are detected, the cells move towards the source and prepare to engage.

Tent-pole ruffles are exceedingly small. Their discovery was only made possible by a new imaging technology known as ‘lattice light sheet microscopy’. The technique can capture tiny structures in a matter of seconds, generating stunning 3D renditions with very high precision.

“This imaging will give us phenomenal power to reveal how cell behaviour is affected in disease, to test the effects of drugs on cells, and to give us insights that will be important for devising new treatments,” says study supervisor Jenny Stow, a deputy director of research in molecular bioscience at UQ.

It’s a very fortunate development. The research helps us better understand how our immune systems scrub the body clean of pathogens, but it also points to a way to cripple cancer cells. These latter cells use the process of macropinocytosis to capture nutrients, not to probe their environment like the macrophages. Apart from that, the process works largely the same — tent-pole ruffles extend, the membranes capture fluid, and nutrients are absorbed.

In theory, then, if researchers can figure out how to destroy or inactivate the tent-pole ruffles of cancer cells, we could simply starve them out.

The team plans to continue using lattice light sheet microscopy to probe the nature of other human immune system cells.

The paper “Macropinosome formation by tent pole ruffling in macrophages” has been published in the Journal of Cell Biology.

Here’s why you should never stare into a laser pointer

We all know the old warning — you should never stare into a laser pointer. But is it really dangerous, or is it just another old wives’ tale from a more modern age? Spoiler alert: it’s not. You really shouldn’t stare into a laser pointer.

A cautionary tale

Image credits: Androudi and Papageorgiou/NEJM.

The parents of a 9-year-old boy in Greece were concerned after the boy could no longer see properly with one of his eyes. They took him to a clinic, and tests showed that while the boy could see excellently with his right eye, the same couldn’t be said about his left eye. When the doctors examined him more thoroughly, they found a large hole in the macula — the central area of the retina, where incoming light is focused at the back of the eye. The boy had managed to burn a hole in his retina after repeatedly staring into a laser pointer.

Holes in the macula aren’t that uncommon in the elderly, but there’s no reason for such a problem to occur in a child. However, retinal injuries due to laser pointers have become so common that there are actually diagnostic criteria for determining if the problem was indeed caused by one of these devices.

It doesn’t take long for the damage to take place, either. While it’s not clear how much exposure the boy in Greece received, doctors reported a different case in 2012 where a 5mW laser pointer caused significant damage to a 13-year-old boy’s eyes after just one minute of exposure. His eyes healed, which is often, but not always, the case.

In 2015, another 13-year-old boy was unlucky enough to have a 50 mW laser shined into his eye for just one second, permanently damaging his retina.

In the US, lasers are limited to 5 mW, and in most places in Europe, legislation is even more stringent, limiting the devices to just 1 mW. You need a permit for anything stronger. But the legislation isn’t strongly enforced and it’s easy to buy a strong laser on the internet. Many lasers are also purchased via street vendors, and they can be mislabeled or not have any label at all.
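
The 1 mW and 5 mW limits mentioned above line up with the standard laser safety classes (Class 2 and Class 3R under IEC 60825). A rough sketch of how visible, continuous-wave lasers are binned by output power — the class boundaries come from the safety standard, not from the article:

```python
def laser_class(power_mw: float) -> str:
    """Rough classification of a visible, continuous-wave laser by output power,
    following the IEC 60825 class boundaries."""
    if power_mw <= 1:
        return "Class 2: the blink reflex normally protects the eye"
    if power_mw <= 5:
        return "Class 3R: low risk; the US limit for pointers"
    if power_mw <= 500:
        return "Class 3B: direct exposure is an eye hazard"
    return "Class 4: hazardous, even via diffuse reflections"

# The two lasers from the case reports above:
print(laser_class(5))   # Class 3R
print(laser_class(50))  # Class 3B
```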

In the case of the boy in Greece, it’s unclear if the damage is permanent. After 18 months, his vision was still damaged and it may never recover. The takeaway is clear: you should never ever stare into a laser pointer — your mom was right.

The study was published in The New England Journal of Medicine.

Meet your new organ: the interstitium

Doctors have identified a previously unknown feature of human anatomy with many implications for the functions of most organs and tissues, and for the mechanisms of most major diseases.

Structural evaluation of the interstitial space. (A) Transmission electron microscopy shows collagen bundles (asterisks) that are composed of well-organized collagen fibrils. Some collagen bundles have a single flat cell along one side (arrowheads). Scale bar, 1 μm. (B) Higher magnification shows that cells (arrowhead) lack features of endothelium or other types of cells and have no basement membrane. Scale bar, 1 μm. (C) Second harmonics generation imaging shows that the bundles are fibrillar collagen (dark blue). Cyan-colored fibers are from autofluorescence and are likely elastin, as shown by similar autofluorescence in the elastic lamina of a nearby artery (inset) (40×). (D) Elastic van Gieson stain shows elastin fibers (black) running along collagen bundles (pink) (40×).

A new paper published on March 27th in Scientific Reports shows that layers of the body long thought to be dense connective tissue — below the skin’s surface, lining the digestive tract, lungs, and urinary systems, and surrounding arteries, veins, and the fascia between muscles — are instead interconnected, fluid-filled spaces.

Scientists named this layer the interstitium — a network of strong (collagen) and flexible (elastin) connective tissue fibers filled with fluids, that acts like a shock absorber to keep tissues from rupturing while organs, muscles, and vessels constantly pump and squeeze throughout the day.

This fluid layer that surrounds most organs may explain why cancer spreads so easily. Scientists think this fluid is the source of lymph, the highway of the immune system.

In addition, cells that reside in the interstitium, and the collagen bundles they line, change with age and may contribute to the wrinkling of skin, the stiffening of limbs, and the progression of fibrotic, sclerotic and inflammatory diseases.

Scientists have long known that more than half the fluid in the body resides within cells, and about a seventh inside the heart, blood vessels, lymph nodes, and lymph vessels. The remaining fluid is “interstitial,” and the current paper is the first to define the interstitium as an organ in its own right — and, the authors write, one of the largest of the body.
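
To put those fractions in perspective: taking the common textbook estimate of roughly 40 liters of total body water (an assumption of mine, not a figure from the paper), and reading “more than half” as about 55% purely for illustration, the interstitial compartment comes out to over 10 liters:

```python
from fractions import Fraction

TOTAL_BODY_WATER_L = 40            # textbook ballpark; an assumption, not from the paper
intracellular = Fraction(55, 100)  # "more than half" -- taken as ~55% for illustration
vascular_lymph = Fraction(1, 7)    # heart, blood vessels, lymph nodes and vessels
interstitial = 1 - intracellular - vascular_lymph  # whatever fluid remains

print(f"interstitial fraction: {float(interstitial):.2f}")
print(f"interstitial volume:   {float(interstitial) * TOTAL_BODY_WATER_L:.1f} L")
# prints: interstitial fraction: 0.31
#         interstitial volume:   12.3 L
```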

A team of pathologists from NYU School of Medicine thinks that no one saw these spaces before because of the medical field’s dependence on the examination of fixed tissue on microscope slides. Doctors examine the tissue after treating it with chemicals, slicing it thinly, and dyeing it in various colorations. The “fixing” process allows doctors to observe vivid details of cells and structures but drains away all fluid. The team found that the removal of fluid as slides are made causes the connective protein meshwork surrounding once fluid-filled compartments to collapse and appear denser.

“This fixation artifact of collapse has made a fluid-filled tissue type throughout the body appear solid in biopsy slides for decades, and our results correct for this to expand the anatomy of most tissues,” says co-senior author Neil Theise, MD, professor in the Department of Pathology at NYU Langone Health. “This finding has potential to drive dramatic advances in medicine, including the possibility that the direct sampling of interstitial fluid may become a powerful diagnostic tool.”

Researchers discovered the interstitium by using a novel medical technology — probe-based confocal laser endomicroscopy. This new technology combines the benefits of endoscopy with those of laser microscopy. The laser lights up the tissues, and sensors analyze the reflected fluorescent patterns, offering a microscopic, real-time view of living tissue.

When probing a patient’s bile duct for cancer spread, endoscopists and study co-authors Dr. David Carr-Locke and Dr. Petros Benias observed something peculiar — a series of interconnected spaces at the submucosa level that had never been described in the medical literature.

Baffled by their findings, they asked Dr. Neil Theise, professor in the Department of Pathology at NYU Langone Health and co-author of the paper for help in resolving the mystery. When Theise made biopsy slides out of the same tissue, the reticular pattern found by endomicroscopy vanished. The pathology team would later discover that the spaces seen in biopsy slides, traditionally dismissed as tears in the tissue, were instead the remnants of collapsed, previously fluid-filled, compartments.

Researchers collected tissue samples of bile ducts from 12 cancer patients during surgery. Before the pancreas and the bile duct were removed, patients underwent confocal microscopy for live tissue imaging. After recognizing this new space in images of bile ducts, the team was able to quickly spot it throughout the body.

Theise believes that the protein bundles seen in the space are likely to generate electrical current as they bend with the movements of organs and muscles, and may play a role in techniques like acupuncture.

Another scientist involved in the study was first author Rebecca Wells of the Perelman School of Medicine at the University of Pennsylvania, who determined that the skeleton of the newfound structure was composed of collagen and elastin bundles.

What is the belly button, does it serve any purpose, and why is it an “innie” or an “outie”?

The belly button (or navel) is typically the body’s first scar, caused by the detachment of the umbilical cord after birth. All placental mammals have a belly button.

What is the belly button

A pierced belly button.

Mammals are split into three groups: placentals, monotremes, and marsupials. Placental mammals are by far the largest and most diverse group of the three, carrying this name because they nourish their offspring through a placenta (the name is somewhat of a misnomer since marsupials also have a placenta).

The placenta is essentially an organ that connects the developing fetus to the uterine wall. As the fetus is carried in the uterus of its mother to a relatively late stage of development, it gets all of its necessary nutrients through the placenta. The placenta also provides oxygen and removes waste products from the fetus’ blood.

The placenta attaches itself to the wall of the uterus, and the fetus’s umbilical cord develops from the placenta. After birth, when the placenta is separated from the baby, the resulting scar tissue (a hollowed or sometimes raised area) is clinically called the umbilicus. Colloquially, it’s called the navel or the belly button.

So there you have it — your belly button is probably your first scar.

Humans aren’t the only ones to have belly buttons — most mammals do. However, it’s often hidden beneath fur and is much less visible.

Innie vs Outie

Among the most common misconceptions about the belly button is the innie (or inny) versus outie debate. Contrary to popular belief, this isn’t decided by where the umbilical cord is cut, but rather by how the scar tissue forms and then dries (remember, the belly button is basically scar tissue). As far as anyone can tell, this process is random, but the innie is much more common than the outie.

An “innie”. Credits: Stinkie Pinkie / Wikipedia.

Umbilical hernias (when the baby’s abdominal wall layers don’t join completely) can cause the belly button to push outward, even though they are often painless and don’t cause any discomfort.

Sometimes, the inner pressure can turn a pregnant woman’s navel from an innie to an outie. Extreme weight gain can do the same thing.

Is the belly button useful?

Da Vinci’s Vitruvian man has the navel at its center.

The belly button doesn’t have any biological uses, though it is used in some medical procedures. For instance, if a transfusion is necessary for a newborn, the umbilical cord stump is preferred.

The umbilicus is used to visually separate the abdomen into quadrants and is often regarded as the body’s center of balance, serving as an important anatomical landmark. It’s also used for introducing laparoscopic ports during laparoscopic surgery and can be a tell-tale sign of intra-abdominal pathologies.

Abdominal regions are used for example to localize pain, and the navel serves as an anatomic landmark.

An abnormal belly button shape can also be an indicator of pregnancy problems.

Lastly, it can also serve an aesthetic role — although some people regard the navel as aesthetically unpleasant, others pierce and tattoo it, and there’s even a navel fetish: in 2012, it was the second most popular fetish search on Google.

Belly button facts

You never knew you wanted to learn about the belly button, did you? Well, in 2012, a biological study found that the “fauna” in your navel is much like that of a tropical forest — the bacterial fauna, that is. According to researchers, thousands of bacterial types (some new to science) can be found in your belly button, but there’s no reason to worry: they’re quite harmless.

If you thought that was disturbing, wait ’til you hear this: you can actually make cheese using bacteria from the navel. As part of a collaborative project named “Selfmade”, biologist Christina Agapakis and odor artist Sissel Tolaas made 11 new types of cheese from the bacteria found in armpits, mouths, toes, and belly buttons.

In an article published in 2000 in the journal Plastic and Reconstructive Surgery, scientists set out to determine what the perfect belly button looks like. The article, entitled In search of the ideal female umbilicus, had participants rate different types of belly buttons. They found that the T- or vertically shaped umbilicus with superior hooding consistently scored the highest in aesthetic appeal, while the “outie” was almost universally displeasing. Considering that several thousand people have navel plastic surgery each year (and the trend is increasing), that can be quite useful to know. Interestingly, breast implant surgeries can also be done through the navel in order to avoid scarring; the procedure is called trans-umbilical breast augmentation.

We can’t grow new neurons in adulthood after all, new study says

Previous research has suggested that neurogenesis — the birth of new neurons — can take place in the adult human brain, but a controversial new study published in the journal Nature challenges this idea.

a. Toluidine-blue-counterstained semi-thin sections of the human Granule Cell Layer (GCL) from fetal to adult ages. Note that a discrete cellular layer does not form next to the GCL and the small dark cells characteristic of neural precursors are not present.

Scientists have been struggling to settle the matter of human neurogenesis for quite some time. The first study to challenge the old theory that humans cannot grow new neurons after birth was published in 1998, but scientists had been questioning this entrenched idea since the 1960s, when emerging techniques for labeling dividing cells revealed the birth of new neurons in rats. Another neurogenesis study was published in 2013, reinforcing the validity of the 1998 results.

Arturo Alvarez-Buylla, a neuroscientist at the University of California, San Francisco, and his team tested the neurogenesis theory using immunohistochemistry — a process that applies various fluorescent antibodies to brain samples. The antibodies signal whether young neurons and dividing cells are present. The researchers were shocked by the findings.

“We went into the hippocampus expecting to see many young neurons,” says senior author Arturo Alvarez-Buylla. “We were surprised when we couldn’t find them.”

In the new study, scientists analyzed brain samples from 59 patients of various ages, ranging from fetal stages to the age of 77. The brain tissue samples came from deceased donors or from tissue removed during unrelated brain surgery. Scientists found new neurons forming in prenatal and neonatal samples, but they did not find any substantial evidence of neurogenesis in humans older than 13. The research also indicates that the rate of neurogenesis drops 23-fold between the ages of one and seven.

But some other uninvolved scientists say that the study left much room for error. The way the brain slices were handled, the deceased patients’ psychiatric history, or whether they had brain inflammation could all explain why the researchers failed to confirm earlier findings.

The 1998 study was performed on brains of dead cancer patients who had received injections of a chemical called bromodeoxyuridine while they were still alive. The imaging molecule — which was used as a cancer treatment — became integrated into the DNA of actively dividing cells. Fred Gage, a neuroscientist involved in the 1998 study, says that this new paper does not really measure neurogenesis.

“Neurogenesis is a process, not an event. They just took dead tissue and looked at it at that moment in time,” he adds.

Gage also thinks that the authors used overly restrictive criteria for counting neural progenitor cells, thus lowering the chances of seeing them in adult humans.

But some neuroscientists agree with the findings. “I feel vindicated,” Pasko Rakic, a longtime outspoken skeptic of neurogenesis in human adults, told Scientific American. He believes the lack of new neurons in adult primates and humans helps preserve complex neural circuits. If new neurons were constantly born throughout adulthood, they could interfere with the precious preexisting circuits, causing chaos in the central nervous system.

“This paper not only shows very convincing evidence of a lack of neurogenesis in the adult human hippocampus but also shows that some of the evidence presented by other studies was not conclusive,” he says.

Dividing neural progenitors in the granule cell layer (GCL) are rare at 17 gestational weeks (orthogonal views, inset) but were abundant in the ganglionic eminence at the same age (data not shown). Dividing neural progenitors were absent in the GCL from 22 gestational weeks to 55 years.

Steven Goldman, a neurologist at the University of Rochester Medical Center and the University of Copenhagen, said, “It’s by far the best database that has ever been put together on cell turnover in the adult human hippocampus. The jury is still out about whether there are any new neurons being produced.” He added that if there is neurogenesis, “it’s just not at the levels that have been presumed by many.”

The debate still goes on. No one really seems to know the answer yet, but I think that’s a positive — the controversy will generate a new wave of research on the subject.

Women undoubtedly prefer strong, muscular men, study shows

Psychologists have confirmed something most women deep down already know about male physical attractiveness: strong men are, by far, preferred to weaker-looking men.

The study was based on interviews with 160 women. The female participants had to rate the physical attractiveness of individual men from two categories: one composed of 130 psychology students and one composed of 60 gym-going university students who worked out a few times per week.

Aaron Sell, a psychology lecturer at the School of Criminology and Criminal Justice, Griffith University, Australia, and his co-author Aaron Lukaszewski, an evolutionary psychologist at California State University, Fullerton, measured the men’s strength via weightlifting machines, grip strength tests, and other methods.

Source: Pexels/Pixabay

The male recruits all came from the University of California at Santa Barbara. The assessors, students from Oklahoma State University and Australia’s Griffith University, rated both strength and physical attractiveness on a scale from 1 to 7. Interestingly, the scores the women gave for strength were fairly accurate compared to the actual physical performances of the students.

“The rated strength of a male body accounts for a full 70 percent of the variance in attractiveness,” Sell said.
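For readers unfamiliar with the statistic: “variance accounted for” is the R² of a simple linear regression of attractiveness ratings on strength ratings. Here is a minimal sketch with made-up numbers (the ratings below are hypothetical illustrations, not the study’s data):

```python
# Illustrative only: toy ratings, not the study's data.
# "Variance accounted for" is the R^2 of a simple linear regression
# of mean attractiveness ratings on mean rated strength.

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Least-squares slope and intercept
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    # R^2 = 1 - (residual sum of squares / total sum of squares)
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1 - ss_res / ss_tot

# Hypothetical mean ratings on the study's 1-7 scale, for eight men
strength = [2.1, 3.0, 3.4, 4.2, 4.8, 5.5, 6.0, 6.4]
attractiveness = [2.0, 2.8, 3.1, 3.6, 4.5, 5.2, 5.1, 6.0]
print(round(r_squared(strength, attractiveness), 2))
```

An R² of 0.70, as Sell reports, would mean that the regression line through the real ratings explains 70 percent of the spread in attractiveness scores.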

None of the surveyed women showed a statistically significant preference for weaker-looking guys.

“No one will be surprised by the idea that strong men are more attractive,” study author Aaron Lukaszewski told The Washington Post. “It’s no secret that women like strong, muscular guys.”

“That is so obvious, people are going to wonder why scientists needed to study it,” said Holly Dunsworth, an anthropologist at the University of Rhode Island, also to The Post. “And the answer would be because they want to know how these preferences evolved.”

Dunsworth also raised questions about the reliability of the paper, because the study involved only 20-year-olds, who, she adds, may not have much experience with what attractiveness means to them.

Source: Geralt/Pixabay

Lisa Wade, a sociologist at Occidental College in Los Angeles, also criticized the study’s interpretation: “It’s my opinion that the authors are too quick to ascribe a causal role to evolution,” she told The Post.

According to Wade, culture has a bigger impact on male torso aesthetics.

“We value tall, lean men with strong upper bodies in American society. We’re too quick to assume that it requires an evolutionary explanation,” she said. “We know what kind of bodies are valorized and idealized,” Wade added. “It tends to be the bodies that are the most difficult to obtain.”

In her opinion, a few centuries ago women would have preferred larger torsos, due to the scarcity of calorie-rich food and the demands of heavy physical labour; a preference for leaner male upper bodies was not universal at that time.

The paper published in Proceedings of the Royal Society B surely has many scientists arguing over it, but the team led by Sell and Lukaszewski plans to examine physical attractiveness on a larger scale, with a cross-cultural study on the way.

Child’s brain rewires following double hand transplant 

This is the incredible story of Zion, the first quadruple amputee child in whom researchers observed massive brain reorganization before and after the hand transplant.

Zion was only two years old when he lost both his hands and feet to a severe generalized infection. At the age of four, his mother donated one of her kidneys to him; because he was already on immunosuppressant drugs as a result, doctors could consider him a candidate for a bilateral hand transplant.

Hand transplants in children are rare and difficult to perform, due to the extremely small size of the vessels and nerves that need to be reconstructed. Zion’s surgery was the first pediatric hand transplant in the world. The medical team included twelve surgeons, divided into four smaller teams that had to find and label all the structures that were to be sewn together. The whole procedure lasted 11 hours.

Now, Zion can do all the things he always wanted to: feed himself, scratch his nose, wave goodbye, shake hands, even play baseball. One special thing he is really eager to do is to write, with his own hands, a letter to the parents who donated their child’s hands to him.

How it all began

The researchers recorded the child’s brain activity two years before the surgery, and then monitored the way his brain rewired after the amputation — a process called massive cortical reorganization (MCR).

“We had hoped to see MCR in our patient, and indeed, we were the first to observe MCR in a child. We were even more excited to observe what happened next — when the patient’s new hands started to recover function. For our patient, we found that the process is reversible.” said Gaetz.

For each part of the body that transmits sensory information to the brain, there is a specific region of the cerebral cortex that is activated. This biological phenomenon is known as somatosensory representation.

“We know from research in nonhuman primates and from brain imaging studies in adult patients that, following amputation, the brain remaps itself when it no longer receives input from the hands,” said first author William Gaetz, PhD. “The brain area representing sensations from the lips shifts as much as 2 centimeters to the area formerly representing the hands.”

Zion, age 10. Credit: Children’s Hospital of Philadelphia.

Magnetoencephalography (MEG) is a neuroimaging technique that measures the magnetic fields produced by activity in selected areas of the brain. Using MEG, scientists studied the location, timing, and strength of Zion’s responses to sensory stimuli (differently sized monofilaments) applied to his fingers and lips. Four such MEG scans were performed in the year following the transplant, with five children the same age as Zion serving as controls.

“At visits 1 and 2, index fingertips were insensitive to tactile stimulation with even the largest monofilaments. At visits 3 and 4, the patient was able to sense light touch on the fingertips,” the authors wrote in the paper published in the journal Annals of Clinical and Translational Neurology. 

The first two times, researchers found that the signal transmitted from Zion’s lips was recorded in the hand area of the cortex, with a 20-millisecond delay compared to controls. At the latter two MEG scans, the lip stimuli had returned to the lip-designated area, indicating that the brain map was regaining its previous configuration, albeit with higher-than-normal signal strength.

“The sensory signals are arriving in the correct location in the brain, but may not yet be getting fully integrated into the somatosensory network,” said Gaetz. “We expect that over time, these sensory responses will become more age-typical.”

Gaetz added, “These results have raised many new questions and generated excitement about brain plasticity, particularly in children. Some of those new questions include, what is the best age to get a hand transplant? Does MCR always occur after amputation? How does brain mapping look in people born without hands? Would we see MCR reverse in an adult, as we did in this patient? We are planning new research to investigate some of these questions.”

The teams from the Children’s Hospital of Philadelphia and the Perelman School of Medicine at the University of Pennsylvania published their findings in the Annals of Clinical and Translational Neurology on December 6th, 2017.

What your pupil says about your language

A simple word is enough to trigger a reaction in your pupil.

The pupillary dilation. Image credits: Greyson Orlando.

A surprising new study found that when we hear words associated with strong luminosity (e.g., “sun” or “shine”), our pupils contract as if we were actually exposed to bright light. The same thing happens with words we associate with darkness — our pupils dilate. Pupillary responses can have a variety of causes, from involuntary reflex reactions to feelings of arousal to exposure to light; the latter is the most common and often the most pronounced.

This mechanism is mediated by the optic and oculomotor cranial nerves. Many creatures, humans included, exhibit a pupillary response. It is basically a mechanism through which the brain adapts the body to new conditions, but it’s not totally clear why the reaction is also associated with psychological responses. This study opens new avenues of research, showing that the dilation and contraction mechanism might be more complex than we thought.

“Theories about embodiment of language hold that when you process a word’s meaning, you automatically simulate associated sensory input (e.g., perception of brightness when you process lamp) and prepare associated actions (e.g., finger movements when you process typing),” the study reads. “To test this latter prediction, we measured pupillary responses to single words that conveyed a sense of brightness (e.g., day) or darkness (e.g., night) or were neutral (e.g., house).”

When confronted with a word, the pupils begin by dilating (0–0.5 s), following the general activation of the brain. When this initial activation has passed, the pupils constrict (0.5–2 s). But the size of the pupil is also determined by the luminosity evoked by the word: when we read a luminance-associated word, the pupils become smaller than when we read a word associated with darkness (1–3 s). Image credits: Sebastiaan Mathot, University of Groningen.

Not all responses were alike. The brighter (or darker) the word people heard, the stronger the response, which in itself seems to raise more questions than it answers.
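The time course described above can be caricatured in a toy model. Everything here is an illustrative assumption: the baseline size, amplitudes, and window boundaries are taken loosely from the figure caption, not from the paper’s fitted values.

```python
# Toy model of the pupillary time course described above.
# All constants are illustrative assumptions, not measured values.

BASELINE = 3.0  # resting pupil diameter in mm (assumed)

def pupil_size(t, brightness):
    """Pupil diameter at time t (seconds) after word onset.

    brightness: -1.0 (darkest word) .. +1.0 (brightest word), 0 = neutral.
    """
    size = BASELINE
    if 0.0 <= t < 0.5:
        # Initial dilation driven by general brain activation
        size += 0.2 * (t / 0.5)
    elif 0.5 <= t <= 2.0:
        # Constriction back toward baseline
        size += 0.2 * (2.0 - t) / 1.5
    if 1.0 <= t <= 3.0:
        # Luminance-evoked modulation: brighter words shrink the pupil,
        # darker words enlarge it, scaled by the word's evoked brightness
        size -= 0.15 * brightness
    return size

# Bright word ("sun") vs neutral vs dark word ("night") at t = 2 s
print(pupil_size(2.0, +1.0), pupil_size(2.0, 0.0), pupil_size(2.0, -1.0))
```

At t = 2 s the model gives bright word < neutral < dark word, which is the qualitative ordering the study reports; the graded effect of word brightness is captured by the scaling factor.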

It seems to fit with a theory called the ‘embodiment of language’. Basically, the theory says that whenever we hear a word or a group of words, we mentally simulate it. If someone says ‘keyboard’, your brain projects an image of the keyboard, as well as the gesture of typing on one, even if you don’t realize it. The same thing happens with ‘sun’ — you visualize a big ball of fire, and your pupil adapts. However, the researchers note, behavioral studies have so far not directly tested one of the central predictions of embodied language: that word meaning by itself can trigger, at least in some cases, associated involuntary actions. This is why this particular study is so important: it may provide direct evidence for a long-standing but still contested theory.

Journal Reference: Sebastiaan Mathôt, Jonathan Grainger, Kristof Strijkers. Pupillary Responses to Words That Convey a Sense of Brightness or Darkness. Psychological Science, 2017. DOI: 10.1177/0956797617702699

 

Oxford student creates the first synthetic retina from soft, biological materials

A synthetic retina developed from soft materials offers new hope for the visually impaired.

Vanessa Restrepo-Schild. Image credits: University of Oxford.

Vanessa Restrepo-Schild, a 24-year-old student and researcher at Oxford University, is the first to successfully develop a soft retina made from biological, synthetic tissues. The double-layered retina consists of soft water droplets (hydrogels) and biological cell membrane proteins. This isn’t the first artificial retina, but previous efforts used only hard, rigid materials. Restrepo-Schild explains why this is a big deal:

“The human eye is incredibly sensitive, which is why foreign bodies like metal retinal implants can be so damaging, leading to inflammation and/or scarring. But a biological synthetic implant is soft and water based, so much more friendly to the eye environment.” Furthermore, unlike existing artificial retinal implants, the new technology doesn’t contain any foreign bodies, which makes it less invasive and far less likely to provoke an adverse reaction in the body.

The retina is the third and inner coat of the eye. The physics of the eye generates an image of the visual world on the retina (through the cornea and lens), much like a film in a camera. The retina reacts to light, converting it into electrical signals which are passed through the nervous system to the brain, where the image is processed. This synthetic retina is also designed like a camera, with cells acting as pixels, detecting and reacting to create an image.

The retina replica consists of soft water droplets (hydrogels) and biological cell membrane proteins. Designed like a camera, the cells act as pixels, detecting and reacting to light to create a grey scale image. Image credits: Oxford University.
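The “cells as pixels” idea can be sketched in a few lines: each droplet converts the light intensity falling on it into a current, and the array of currents forms a grayscale image. This is purely conceptual; the saturating response curve and all numbers below are assumptions, not measurements from the paper.

```python
# Conceptual sketch of "cells as pixels": each droplet converts local
# light intensity into a photo-current, and the grid of currents
# forms a grayscale image. Purely illustrative; the response curve
# and constants are assumptions.

def droplet_current(intensity):
    """Map light intensity (0..1) to a photo-current (arbitrary units).

    A saturating response is assumed, loosely modeled on how
    biological photoreceptors compress bright input.
    """
    return intensity / (intensity + 0.5)

def sense_image(light_field):
    """Apply the droplet response at every 'pixel' of a 2D light field."""
    return [[droplet_current(v) for v in row] for row in light_field]

light = [
    [0.0, 0.5, 1.0],
    [1.0, 0.5, 0.0],
]
image = sense_image(light)
print(image)
```

The saturating curve means brighter regions still map to larger currents, so relative contrast survives, which is what a grayscale readout needs; the real droplet bilayers’ response would have to be measured.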

Restrepo-Schild says she wanted to see how human tissues can be integrated with or replaced by artificial structures.

‘I have always been fascinated by the human body, and want to prove that current technology can be used to replicate the function of human tissues, without having to actually use living cells. I want to take the principles behind vital bodily functions, e.g. our sense of hearing, touch and the ability to detect light, and replicate them in a laboratory environment with natural, synthetic components. I hope my research is a first step in a journey towards building technology that is soft and biodegradable instead of hard and wasteful.’

So far, the technology has only been trialed in the lab, so there’s still a long way to go before it can be used in humans, and there’s also a long way to go before generating a full-color image — but there’s a lot of promise. Restrepo-Schild will soon start work on animal testing.

Journal Reference: Vanessa Restrepo Schild, Michael J. Booth, Stuart J. Box, Sam N. Olof, Kozhinjampara R. Mahendran & Hagan Bayley. Light-Patterned Current Generation in a Droplet Bilayer Array. Scientific Reports. DOI: 10.1038/srep46585

Human cartilage has been successfully 3D printed

3D printers have been causing revolutions in many different fields, working with materials as different as food, mud, plastic, and plants. The game-changer is that you can create very precise, complex shapes that couldn’t be made before. Another use of 3D printing is a potentially life-saving one: 3D bioprinters are being developed that can print out tissues and organs, with skin cells, bone, heart tissue, and now cartilage being tested. A team of researchers at Sahlgrenska Academy has created cartilage tissue by printing stem cells with a 3D bioprinter. It appears to be just like human cartilage and could be used to replace damaged cartilage.

“In nature, the differentiation of stem cells into cartilage is a simple process, but it’s much more complicated to accomplish in a test tube. We’re the first to succeed with it, and we did so without any animal testing whatsoever,” says Stina Simonsson, Associate Professor of Cell Biology, who led the research.

The lead researcher, Stina Simonsson, holding some 3D-printer cartilage. Image credits: Elin Lindström Claessen.

The researchers took cartilage cells from patients who had recently undergone knee surgery and manipulated them to become “pluripotent”, meaning they can develop into many different types of cells. Next, they created a scaffold to print the cells on. The stem cells were coated with nanocellulose so they could survive the printing process. Once printed, the stem cells multiplied and, supplied with growth factors, differentiated into cartilage tissue, forming cartilage cells on the printed structure. After a few weeks, the cells lost their ability to change into other cell types, which is good because pluripotency increases the risk of tumour formation.

“We investigated various methods and combined different growth factors. Each individual stem cell is encased in nanocellulose, which allows it to survive the process of being printed into a 3D structure. We also harvested medium from other cells that contains the signals that stem cells use to communicate with each other, a so-called conditioned medium. In layman’s terms, our theory is that we managed to trick the cells into thinking that they aren’t alone,” says Stina Simonsson.

Cartilage can be 3D-printed. Image credits: United States NIH National Institute of Arthritis and Musculoskeletal and Skin Diseases.

The 3D bio-printed structure is very similar to human cartilage. Experienced surgeons did not see a difference between natural and bio-printed cartilage. The cells appear well-formed under the microscope and similar to the patients’ own cartilage.

When this method is perfected, cartilage could be 3D printed from a patient’s own stem cells to repair damaged cartilage or treat osteoarthritis (cartilage decay in the joints). The method can create a lot of cartilage, making it very useful for cartilage replacement. Right now, it isn’t known how compatible the material is with the human body. The structural material needs to break down and be absorbed safely by the body so that only cartilage is left. Further development and testing need to be conducted. Bioprinting has a lot of potential; in the near future, tissues and organs could be printed on demand.

Journal reference: Nguyen, D. et al. 2017. Cartilage Tissue Engineering by the 3D Bioprinting of iPS Cells in a Nanocellulose/Alginate Bioink. Scientific Reports.