
Gut bacteriophages associated with improved cognitive function and memory in both animals and humans

A growing body of evidence has implicated gut bacteria in regulating neurological processes such as neurodegeneration and cognition. Now, a study from Spanish researchers shows that viruses present in the gut microbiota can also improve mental functions in flies, mice, and humans.

Credit: CDC.

Viruses easily assimilate into their human hosts: 8% of our DNA consists of ancient viruses, and another 40% contains genetic code thought to be viral in origin. Yet the gut virome (the combined genome of all viruses housed within the intestines) remains a crucial but commonly overlooked component of the gut microbiome.

But we’re not entirely sure what it does.

This viral community consists chiefly of bacteriophages, viruses that infect bacteria and can transfer genetic code to their bacterial hosts. Remarkably, the integration of bacteriophages, or phages, into their hosts is so stable that over 80% of all bacterial genomes on Earth now contain prophages, permanent phage DNA incorporated into their own, including the bacteria inside us humans. Now, researchers are inching closer to understanding the effects of this phenomenon.

Gut and brain

In their paper published in the journal Cell Host &amp; Microbe, a multi-institutional team of scientists describes the impact of phages on executive function, a set of cognitive processes and skills that help an individual plan, monitor, and successfully execute their goals. These fundamental skills include adaptable thinking, planning, self-monitoring, self-control, working memory, time management, and organization, and their regulation is thought to be controlled in part by the gut microbiota.

The study focuses on Caudovirales and Microviridae, the two groups of bacteriophages that dominate the human gut virome, together comprising over 2,800 species of phages.

“The complex bacteriophage communities represent one of the biggest gaps in our understanding of the human microbiome. In fact, most studies have focused on the dysbiotic process only in bacterial populations,” write the authors of the new study.

Specifically, the scientists showed that volunteers with increased Caudovirales levels in the gut microbiome performed better in executive processes and verbal memory. In comparison, the data showed that increased Microviridae levels were associated with impaired executive abilities. Simply put, there seems to be an association between the composition of the gut virome and higher cognitive function.

Levels of these two prevalent bacteriophages run parallel to human host cognition, the researchers write, and they may exert this influence by hijacking the metabolism of their bacterial hosts.

To reach this conclusion, the researchers first tested fecal samples from 114 volunteers and then validated the results in another 942 participants, measuring levels of both types of bacteriophage. They also gave each volunteer memory and cognitive tests to identify a possible correlation between the levels of each species present in the gut virome and skill levels.

The researchers then studied which foods may transport these two kinds of phage into the human gut; the results indicated that the most common route appeared to be through dairy products.

They then transplanted fecal samples from the human volunteers into the guts of fruit flies and mice, after which they compared the animals' executive function with that of control groups. As with the human participants, animals transplanted with high levels of Caudovirales tended to do better on the tests, showing increased scores in object recognition in mice and upregulated memory-promoting genes in the prefrontal cortex. Improved memory scores and upregulation of memory-involved genes were also observed in fruit flies harboring higher levels of these phages.

Conversely, higher Microviridae levels (correlated with increased fat levels in humans) downregulated these memory-promoting genes in all animals, stunting their performance in the cognition tests. Therefore, the group surmised that bacteriophages warrant consideration as a novel dietary intervention in the microbiome-brain axis.

Regarding this intervention, Arthur C. Ouwehand, Technical Fellow, Health and Nutrition Sciences, DuPont, who was not involved in the study, told Metafact.io:

“Most dietary fibres are one way or another fermentable and provide an energy source for the intestinal microbiota,” leading “to the formation of beneficial metabolites such as acetic, propionic and butyric acid.”

He goes on to add that “These so-called short-chain fatty acids may also lower the pH of the colonic content, which may contribute to an increased absorption of certain minerals such as calcium and magnesium from the colon. The fibre fermenting members of the colonic microbiota are in general considered beneficial while the protein fermenting members are considered potentially detrimental.”

It would certainly be interesting to identify which foods are acting on bacteriophages contained within our gut bacteria to influence cognition.

Still, the researchers acknowledge that their work does not conclusively prove that phages in the gut can impact cognition; the test scores could instead reflect differences in gut bacteria levels, though they suggest a phage effect does seem likely. They close by stating that more work is required to prove the case.

Where are memories stored in the brain? They may be hiding in the connections between your brain cells

In the nervous system, a synapse is a structure that permits a neuron (or nerve cell) to pass an electrical or chemical signal to another neuron. Credit: NIH Image Gallery.

All memory storage devices, from your brain to the RAM in your computer, store information by changing their physical qualities. Over 130 years ago, pioneering neuroscientist Santiago Ramón y Cajal first suggested that the brain stores information by rearranging the connections, or synapses, between neurons.

Since then, neuroscientists have attempted to understand the physical changes associated with memory formation. But visualizing and mapping synapses is challenging. For one, synapses are very small and tightly packed together: roughly 10 billion times smaller than the smallest object a standard clinical MRI can visualize. Furthermore, there are approximately 1 billion synapses in the mouse brains researchers often use to study brain function, and they are all the same opaque-to-translucent color as the tissue surrounding them.

A new imaging technique my colleagues and I developed, however, has allowed us to map synapses during memory formation. We found that the process of forming new memories changes how brain cells are connected to one another. While some areas of the brain create more connections, others lose them.

Mapping new memories in fish

Previously, researchers focused on recording the electrical signals produced by neurons. While these studies have confirmed that neurons change their response to particular stimuli after a memory is formed, they couldn’t pinpoint what drives those changes.

To study how the brain physically changes when it forms a new memory, we created 3D maps of the synapses of zebrafish before and after memory formation. We chose zebrafish as our test subjects because they are large enough to have brains that function like those of people, but small and transparent enough to offer a window into the living brain.

Zebrafish are particularly fitting models for neuroscience research. Zhuowei Du and Don B. Arnold, CC BY-NC-ND

To induce a new memory in the fish, we used a type of learning process called classical conditioning. This involves exposing an animal to two different types of stimuli simultaneously: a neutral one that doesn’t provoke a reaction and an unpleasant one that the animal tries to avoid. When these two stimuli are paired together enough times, the animal responds to the neutral stimulus as if it were the unpleasant stimulus, indicating that it has made an associative memory tying these stimuli together.

As an unpleasant stimulus, we gently heated the fish’s head with an infrared laser. When the fish flicked its tail, we took that as an indication that it wanted to escape. When the fish was later exposed to a neutral stimulus, a light turning on, tail flicking meant that it was recalling what happened when it previously encountered the unpleasant stimulus.
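The pairing dynamic described above can be sketched with a toy learning rule, a Rescorla-Wagner-style update (purely illustrative; this is not the model used in the study, and the learning rate is an arbitrary assumption):

```python
# Toy sketch of classical conditioning: repeated pairing of a neutral
# cue (light) with an unpleasant stimulus (heat) builds an association
# that eventually triggers the escape response (tail flick) to the cue
# alone. The 0.3 learning rate is an arbitrary illustrative value.

def condition(pairings, learning_rate=0.3, max_strength=1.0):
    """Association strength after repeated cue-stimulus pairings."""
    strength = 0.0
    for _ in range(pairings):
        # Each pairing closes part of the remaining gap to the ceiling.
        strength += learning_rate * (max_strength - strength)
    return strength

print(round(condition(1), 2))   # weak association after one pairing
print(round(condition(10), 2))  # near asymptote after many pairings
```

The qualitative point matches the text: the association is weak after a single pairing and approaches a ceiling only once the two stimuli have been paired together many times.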

Pavlov’s dog is the most well-known example of classical conditioning, in which a dog salivates in response to a ringing bell because it has formed an associative memory between the bell and food. Lili Chin/Flickr, CC BY-NC-ND.

To create the maps, we genetically engineered zebrafish with neurons that produce fluorescent proteins that bind to synapses and make them visible. We then imaged the synapses with a custom-built microscope that uses a much lower dose of laser light than standard devices that also use fluorescence to generate images. Because our microscope caused less damage to the neurons, we were able to image the synapses without losing their structure and function.

When we compared the 3D synapse maps before and after memory formation, we found that neurons in one brain region, the anterolateral dorsal pallium, developed new synapses, while neurons predominantly in a second region, the anteromedial dorsal pallium, lost them. This means that new connections formed between some neurons while others dismantled theirs. Previous experiments have suggested that the dorsal pallium of fish may be analogous to the amygdala of mammals, where fear memories are stored.

Surprisingly, changes in the strength of existing connections between neurons that occurred with memory formation were small and indistinguishable from changes in control fish that did not form new memories. This meant that forming an associative memory involves synapse formation and loss, but not necessarily changes in the strength of existing synapses, as previously thought.

Could removing synapses remove memories?

Our new method of observing brain cell function could open the door not just to a deeper understanding of how memory actually works, but also to potential avenues for treatment of neuropsychiatric conditions like PTSD and addiction.

Associative memories tend to be much stronger than other types of memories, such as conscious memories about what you had for lunch yesterday. Associative memories induced by classical conditioning, moreover, are thought to be analogous to traumatic memories that cause PTSD. Otherwise harmless stimuli similar to what someone experienced at the time of the trauma can trigger recall of painful memories. For instance, a bright light or a loud noise could bring back memories of combat. Our study reveals the role that synaptic connections may play in memory, and could explain why associative memories can last longer and be remembered more vividly than other types of memories.

Currently the most common treatment for PTSD, exposure therapy, involves repeatedly exposing the patient to a harmless but triggering stimulus in order to suppress recall of the traumatic event. In theory, this indirectly remodels the synapses of the brain to make the memory less painful. Although there has been some success with exposure therapy, patients are prone to relapse. This suggests that the underlying memory causing the traumatic response has not been eliminated.

It’s still unknown whether synapse generation and loss actually drive memory formation. My laboratory has developed technology that can quickly and precisely remove synapses without damaging neurons. We plan to use similar methods to remove synapses in zebrafish or mice to see whether this alters associative memories.

It might be possible to physically erase the associative memories that underlie devastating conditions like PTSD and addiction with these methods. Before such a treatment can even be contemplated, however, the synaptic changes encoding associative memories need to be more precisely defined. And there are obviously serious ethical and technical hurdles that would need to be addressed. Nevertheless, it’s tempting to imagine a distant future in which synaptic surgery could remove bad memories.

Don Arnold, Professor of Biological Sciences and Biomedical Engineering, USC Dornsife College of Letters, Arts and Sciences

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Your first memory is probably older than you think

What’s your earliest memory? Statistically speaking, it’s likely from when you were two-and-a-half years old, according to a new study.

Image credits Ryan McGuire.

Up to now, it was believed that people generally form their earliest long-term memories around the age of three-and-a-half. This initial “childhood amnesia” is, to the best of our knowledge, caused by an overload of the infant hippocampus, an area heavily involved in the formation and retention of long-term memory.

However, new research is pushing that timeline back by a whole year — it’s just that we don’t usually realize we have these memories, for the most part.

There, but fuzzy

“When one’s earliest memory occurs, it is a moving target rather than being a single static memory,” explains lead author and childhood amnesia expert Dr. Carole Peterson, from the Memorial University of Newfoundland.

“Thus, what many people provide when asked for their earliest memory is not a boundary or watershed beginning, before which there are no memories. Rather, there seems to be a pool of potential memories from which both adults and children sample. And, we believe people remember a lot from age two that they don’t realize they do.”

Dr. Peterson explains that retrieving early memories is like “priming a pump”: asking individuals for their earliest memory and then pressing for more generally lets them recall even earlier events than they first offered, sometimes things that happened a year before their ‘first’ memory. She adds that the team has also documented a tendency among people to “systematically misdate” their memories, typically by believing they were older during certain events than they really were.

For this study, she reviewed 10 of her research articles on childhood amnesia along with both published and unpublished data from her lab gathered since 1999. All in all, this included 992 participants, with the memories of 697 of them also being compared to the recollections of their parents. This dataset heavily suggests that people tend to overestimate how old they were at the time of their first memories — as confirmed by their parents.

This isn’t to say that our memories are unreliable. Peterson found evidence that, for example, children interviewed again two and then eight years after describing their earliest memory could still recall the events reliably, but tended to report a later age for them in the subsequent interviews. This, she believes, comes down to a phenomenon called ‘telescoping’.

“Eight years later many believed they were a full year older. So, the children, as they age, keep moving how old they thought they were at the time of those early memories,” says Dr. Peterson. “When you look at things that happened long ago, it’s like looking through a lens. The more remote a memory is, the telescoping effect makes you see it as closer. It turns out they move their earliest memory forward a year to about three and a half years of age. But we found that when the child or adult is remembering events from age four and up, this doesn’t happen.”

By comparing the information provided by participants with that provided by their parents, Dr. Peterson found that people likely remember much earlier into their childhood than they think they do, and those memories are generally accessible with a little help. “When you look at one study, sometimes things don’t become clear, but when you start putting together study after study and they all come up with the same conclusions, it becomes pretty convincing,” she adds, while admitting that the lack of hard, verifiable data remains a serious limitation on her work.

According to her, all research in this field suffers from the same lack of hard, verifiable data. Going forward, she recommends that research into childhood amnesia needs verifiable proof — either in the shape of independently confirmed memories or through documented external dates against which memories can be compared — as this would prevent errors from both participants and their parents, thus improving the reliability of the results.

The paper “What is your earliest memory? It depends” has been published in the journal Memory.

Taking short breaks while practicing lets our brains review what we’re doing — and get better at it

When cultivating a new skill, taking a short break can go a long way. This gives our brains time to replay what we just practiced, helping to cement our skills.

Image via Pixabay.

A study from the National Institutes of Health has been looking into best practices when learning a new skill such as playing a new song on the piano. The research involved monitoring participants’ brain activity while practicing, and revealed that taking short breaks during this time is a great way to help speed the process along.

Although taking time off seems counterproductive when practicing, the authors explain that our brains rapidly and repeatedly go through the activity we’re learning during these breaks, reviewing it faster and faster. The more time it gets to do this, the better a participant’s performance during subsequent practice sessions, the team adds, which suggests that these breaks actually helped strengthen their memory of the task.

Festina lente

“Our results support the idea that wakeful rest plays just as important a role as practice in learning a new skill. It appears to be the period when our brains compress and consolidate memories of what we just practiced,” said Leonardo G. Cohen, M.D., senior investigator at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS) and the senior author of the study published in Cell Reports. 

“Understanding this role of neural replay may not only help shape how we learn new skills but also how we help patients recover skills lost after neurological injury like stroke.”

The study was carried out at the NIH’s Clinical Center in Bethesda, Maryland, using a technique known as magnetoencephalography. This allowed the team to record the brain activity of 33 healthy, right-handed volunteers as they learned to type a five-digit test code (41234) with their left hands. Seated in a chair and wearing a long, cone-shaped scanner cap, each participant was asked to type the code as many times as possible for 10 seconds and then take a 10-second break, a cycle they repeated 35 times.

During the first trials, participants improved markedly at typing the code, with gains continuing up to around the 11th cycle. Previous research done at the NIH shows that the largest part of this improvement happens during the short rest periods, not while the subjects are actually typing. More significantly, the improvements seen during these trials were greater than those seen after a night’s sleep (when memories are strengthened naturally).

As the participants improved at the task, the authors also saw a decrease in the size of brain waves, called beta rhythms.

“We wanted to explore the mechanisms behind memory strengthening seen during wakeful rest. Several forms of memory appear to rely on the replaying of neural activity, so we decided to test this idea out for procedural skill learning,” said Ethan R. Buch, Ph.D., a staff scientist on Dr. Cohen’s team and leader of the study.

So the team developed software to interpret the brain wave patterns recorded while each participant typed the test code. This showed that a much faster version of these waves, around 20 times faster, was replaying in the participants’ brains during the rest periods. Over the first eleven cycles, these ‘compressed’ replays occurred around 25 times per rest period; beyond that, their number fell two- to threefold during the final cycles compared with the first eleven.
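As a back-of-envelope illustration of these numbers (the one-second sequence duration below is an assumption for the sake of the arithmetic, not a figure from the study):

```python
# If typing the five-key sequence takes about one second, a replay
# compressed 20-fold lasts ~50 ms, so even ~25 replays fill only a
# small fraction of each 10-second rest period.

def replay_time_fraction(sequence_s, compression, replays, rest_s):
    """Fraction of a rest period occupied by compressed replays."""
    return (sequence_s / compression) * replays / rest_s

frac = replay_time_fraction(sequence_s=1.0, compression=20, replays=25, rest_s=10)
print(f"{frac:.1%}")  # 12.5% of the rest period
```

The compression is what makes this feasible: at normal speed, 25 full rehearsals could not fit into a 10-second break.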

Participants whose brains replayed the typing the most showed the greatest improvements in performance following each cycle, the authors note. This strongly suggests that the replaying has a direct impact on the efficiency of our practice sessions, likely through memory strengthening.

“During the early part of the learning curve we saw that wakeful rest replay was compressed in time, frequent, and a good predictor of variability in learning a new skill across individuals,” said Dr. Buch. “This suggests that during wakeful rest the brain binds together the memories required to learn a new skill.”

As for where in the brain this process takes place, the paper reports that it ‘often’ took place in sensorimotor regions of the brain — i.e. regions involved in movement and sensory processing. However, other areas of the brain were involved as well, most notably the hippocampus and entorhinal cortex.

“We were a bit surprised by these last results. Traditionally, it was thought that the hippocampus and entorhinal cortex may not play such a substantive role in procedural memory. In contrast, our results suggest that these regions are rapidly chattering with the sensorimotor cortex when learning these types of skills,” said Dr. Cohen.

The paper “Consolidation of human skill linked to waking hippocampo-neocortical replay” has been published in the journal Cell Reports.

That ‘memory palace’ thing? It actually works, a new study finds

In the hazy times of the pandemic, days may seem to blend in with each other. But if you want to keep your memory sharp and reliable, there are ways to do so. According to a new study, the ‘memory palace’ technique really works — and not just for memory athletes, but for regular people as well.

Building a memory palace really does help. Image credits: Diogo Nunes.

Ancient hacks, new evidence

The idea of a ‘memory palace’ sounds complex and weird, but it’s actually quite straightforward: you associate the things you want to remember with visualizations of familiar places to enhance your ability to recall them. A common variation, the “memory palace” proper, involves creating an imaginary location (a palace) and “storing” information in its rooms.
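The bookkeeping behind the technique is simple enough to sketch in a few lines (the locations and shopping-list items here are invented for illustration):

```python
# Loci technique, reduced to its data structure: pair each item with a
# fixed, familiar location, then "walk" the route in order to recall.
# The fixed ordering of the loci is what preserves the item sequence.

loci = ["front door", "hallway mirror", "kitchen table", "window sill"]
items = ["eggs", "milk", "stamps", "batteries"]

palace = dict(zip(loci, items))               # store one vivid image per locus
recalled = [palace[locus] for locus in loci]  # recall by walking the route

print(recalled)  # ['eggs', 'milk', 'stamps', 'batteries']
```

Of course, the part the code cannot capture is the hard one: the vivid mental imagery that binds each item to its locus.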

The technique (properly called the ‘loci technique’) was recently popularized by series such as Sherlock and The Mentalist, but it was developed far earlier. The Roman statesman Cicero described it in one of his works more than two thousand years ago. In more modern times, it has been discussed by psychologists, with one seminal study noting that the hippocampus can serve as a sort of cognitive map.

But does it really work? For those who truly have an exceptional memory, it clearly does. Many memory contest champions advocate this technique to recall faces, digits, and lists of words. But does it work for regular people? A new study says so.

In the study, researchers led by Isabella Wagner, a cognitive neuroscientist at the University of Vienna, carried out two trials. The first one was conducted with 17 memory athletes and a control group of 16 people. The control group didn’t use the loci technique but had good memory, as gauged by initial IQ scores. Meanwhile, the second trial was based on 17 people who learned the loci technique over 20 hours, and two control groups.

In both cases, those who used the memory palace approach scored better than the control groups. Before training, the control group of regular people actually performed better than the memory palace group, recalling 30 words on average compared to 25. But after the control group received general memory training and the other group received memory palace training, things shifted: the former improved to 41.7 words, whereas the latter improved to a whopping 56 words, more than doubling its initial performance. Although these are small groups, the results strongly indicate that memory palace training can improve people’s memory.
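A quick arithmetic check of the reported word counts (numbers as summarized above) makes the contrast explicit:

```python
# Improvement ratios for the two groups: both improved, but only the
# memory-palace group more than doubled its baseline score.

baseline = {"general training": 30.0, "memory palace": 25.0}
after = {"general training": 41.7, "memory palace": 56.0}

improvement = {group: after[group] / baseline[group] for group in baseline}
for group, ratio in improvement.items():
    print(f"{group}: x{ratio:.2f}")  # general training: x1.39, memory palace: x2.24
```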

Participants in all groups also had fMRI images taken of their brains. Remarkably, after these brief memory training sessions, normal participants’ brains started to look much like those of memory athletes, suggesting that the technique is fairly easy to grasp.

It’s still not entirely clear why the technique works, but Wagner suspects that the memory palace serves as a sort of solid scaffolding on which it’s easier to build memories, both short-term and long-term. Speaking to Inverse, she said she likes to imagine chickens running around her memory palace when she needs to buy eggs, and it works.

The scans also reveal that those who practice this technique show reduced activity in the left lateral prefrontal cortex, suggesting that the approach could help the brain use resources more efficiently when storing information. At the same time, they exhibited higher levels of connectivity between the hippocampus and the cortex, which hints at long-term memory formation.

“Behaviorally, memory training enhanced durable, longer-lasting memories,” the study authors note.

The results are consistent with a 2017 study which found that memory training can reshape brain networks to support memory formation. In that study, it also took a relatively short period (four months) to make participants’ brain connections resemble those of memory athletes.

While the study focused on word memorization, researchers say it can be used on pretty much any type of memory.

Remarkably, this ancient memory enhancement technique really seems to work. For those of us who sometimes struggle to recall things, it’s definitely worth looking into.

The study has been published in Science Advances.

Our brains tend to judge whole experiences by how they ended, which can lead to poor decisions in the future

How we remember our enjoyment of past experiences isn’t always reliable, according to new research. The study explains that humans tend to remember those that end well as more enjoyable and those that end poorly as less enjoyable, even if the two were equally pleasant.

Image credits Dariusz Sankowski.

The findings showcase why we shouldn’t blindly trust our past experiences to inform decisions in the present. If we keep in mind that the last bits of any experience have a disproportionately high effect on our memory of it, we’ll be able to make better choices, the authors hope.

Happy ending?

“When you’re deciding where to go for dinner, for example, you think about where you’ve had a good meal in the past. But your memory of whether that meal was good isn’t always reliable — our brain values the final few moments of the experience more highly than the rest of it,” said Dr Martin Vestergaard, a researcher in the University of Cambridge’s Department of Physiology, Development, and Neuroscience, who led the study.

This preference seems to be built into humans, Vestergaard explains, though its effect dampens over time: our memory of something we did long ago factors less into our decision-making than a more recent one does.

The process has its roots in two brain areas that are activated whenever we try to make a decision based on remembered experiences. Because the two compete with each other, we can either overvalue experiences that started badly but ended well, or undervalue experiences that started well but ended poorly.

One of them, the amygdala, uses our memories to determine the ‘objective value’ of an experience (such as how tasty a meal was). The other, the anterior insula, makes older memories progressively less important in our decision-making. This holds true even among our most recent memories: the further back in time a memory lies, the less it factors into our decisions.

For the study, the team enlisted 27 healthy male volunteers and asked them to estimate which one of two pots of coins on a screen had the greatest total value (these pots were shown one at a time, not side-by-side). They were also shown how coins of varying sizes fell from the pots in quick succession. A functional magnetic resonance imaging (fMRI) machine was used to see what the participants’ brains were doing during the experiment. The task was repeated several times with different sequences of coins.

Participants routinely chose the wrong pot when the coins shown decreased in size by the end of the sequence, the team explains. This suggests that their brains were using this more modest end as a cue to estimate a lower total value. Although the intensity of this effect seemed to vary between participants, only a handful were able to completely bypass it and make rational estimations, according to the authors.
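One simple way to see how an end-weighted valuation produces this error is a recency-weighted sum (an illustrative model only, not the one fitted in the paper; the decay constant is an arbitrary assumption):

```python
# A recency-weighted sum over-weights the last coins in a sequence, so
# a pot whose coins shrink toward the end looks "worth less" than an
# equal-total pot whose coins grow. The 0.7 decay is an arbitrary
# illustrative value.

def recency_weighted_value(coins, decay=0.7):
    """Later coins count more: earlier coins' weights decay each step."""
    value = 0.0
    for coin in coins:
        value = decay * value + coin
    return value

growing = [1, 2, 3, 4, 5]    # total 15, large coins at the end
shrinking = [5, 4, 3, 2, 1]  # total 15, large coins at the start

print(recency_weighted_value(growing) > recency_weighted_value(shrinking))  # True
```

Both pots hold the same total, yet the end-weighted observer judges the shrinking sequence as worth less, exactly the mistake the participants made.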

Such results suggest that our current theoretical models of decision-making (chiefly, that suboptimal decisions are handled by the amygdala, with higher brain areas handling more complex ones) are correct. On a personal level, they show each of us how the final moments of an experience influence our perception of the whole, especially when judging from memory.

“Our attraction to the quality of the final moment of an experience is exploited by politicians seeking re-election; they will always try to appear strong and successful towards the end of their time in office,” said Vestergaard.

“If you fall for this trick, and disregard historical incompetence and failure, then you might end up re-electing an unfit politician. Sometimes it’s worth taking the time to stop and think. Taking a more analytical approach to complement your intuitive judgement can help ensure you’re making a rational decision.”

The paper “Retrospective valuation of experienced outcome encoded in distinct reward representations in the anterior insula and amygdala” has been published in the Journal of Neuroscience.

The Mandela effect: how groups of people can all remember the wrong thing

Our memory is imperfect. We can recall some things differently from how they happened, even remember things that never happened. Sometimes, however, larger groups of people can misremember something the same way.

Image credits Eric Smart.

Psychologists call these collective false memories — or just ‘false memories’ for individuals. It’s also commonly known as the ‘Mandela effect’, so christened by “paranormal consultant” Fiona Broome around 2010.

Needless to say, these have enough of a ‘spooky factor’ to capture public interest. Examples of, and explanations for, false memories abound on the internet. It has even been proposed that the people affected are remembering alternate universes in which they lived before somehow switching to our own.

It’s definitely an interesting story, but the phenomenon is more likely produced by an interplay between how our memories are formed, how they are stored, and our innate drive to fit in with the group.

How it got the name

In 2009, Broome attended a conference and talked with other people about how she remembered Nelson Mandela dying in a South African prison in the 1980s. They seemed to agree with her.

His death in the 1980s makes Mandela’s term as President of South Africa between 1994 and 1999 all the more impressive.

Nelson Mandela (right) and President Bill Clinton (left) in Philadelphia, 1993.
Image via Wikimedia.

Broome eventually realized her mistake and shared the story with her friends for a laugh, but soon found that they, too, misremembered the dates. They even shared memories of news coverage of Mandela’s death and a speech by his widow. Others she asked also said they remembered his death in the ’80s.

Encouraged by her book publisher, Broome would launch a website to discuss this Mandela Effect and other similar incidents. Presumably of the paranormal kind.

What we know about it so far

Whether Mrs. Broome was being genuine or just working on establishing her new audience, we can’t know. But she is right in pointing out that collective false memories are a real phenomenon.

If you’re a fan of sci-fi or movies, you probably know this phrase: “___, I am your father”. It’s one of Darth Vader’s lines from The Empire Strikes Back and one of the most iconic phrases to come out of cinema. But it’s probably not how you remember it: the actual line is “No, I am your father” (Youtube link).

The line stuck around as “Luke, I am your father” in public memory. It’s not clear exactly why. Vader’s previous line starts with “Luke”, so people may have conflated the two. It’s arguably cooler than the original quote — maybe someone in marketing figured that out and tweaked it for appeal.

It worked.
Image credits Flickr / Bryan Ledgard.

There are a few factors that could lead to the creation of such memories. First of all is suggestibility, our inclination to take information from others as true. Memory can shift to better suit information we’re presented with, especially if we’re repeatedly exposed to it. Secondly, the way our memories are encoded and recalled can alter them over time, changing either the contents of a memory or its source (misattribution of memory).

Memory formation and recall

Our memories reside in groups of neurons. The particular network of neurons where a memory is physically housed is that memory’s ‘engram’. As we grow and learn, experience helps create a framework in which engrams of similar memories are housed close to each other — this structure is the ‘schema’.

There are a few key steps memories go through that can disrupt them, leading to false memories. First, information must coalesce into short-term memory. Our perception of events, along with phenomena such as priming, will shape what we remember.

This data must then be transcribed into long-term memory as we sleep. It can change before it is fully imprinted, as it makes its way through the brain’s different storage sites — especially under states of heightened emotion.

A diagram of memory formation.
Image via Wikimedia.

Furthermore, whenever we access and recall a memory, it temporarily becomes unstable in the brain while it’s being read. New connections form between the neurons. It does re-consolidate afterward, and repetition leads to better memories due to this process. However, there’s also a chance something can go wrong and the memory changes as its engram activates.

If two memories are close together in the brain and activated at the same time, they can even start blending together — this could be why we remember Vader’s phrase the way we do.

Peer pressure

Psychological priming is a process through which our perception of an event is influenced by the events or stimuli leading up to it. This process largely works in the subconscious and can alter our memories to fit in with our priming. It largely overlaps with suggestibility.

Asking someone “how fast was the car speeding?” can prime them to remember a higher value than asking “what speed was the car moving at?”.

Eyewitness testimonies are notoriously unreliable because of phenomena such as priming. Memory can change — without a witness being aware of this — to suit new information, the questions they receive and their wording, or simply due to their emotional state at the time and while testifying.

Our brains will fill in gaps in our information to make it make sense, in a process called confabulation. Through this, we can remember details that never happened simply because they make the memory fit together better. Combined with our inborn desire to be part of the group and/or priming, our memories can thus shift to suit the collective narrative we’re presented with. Any new information we receive that’s tied to a memory also alters it to some extent.

Memory inaccuracy can also come from “source monitoring errors”, when people fail to distinguish between real and imagined events. And, naturally, how old a memory is, and how often we access it, further impacts its quality.

Image via Pxfuel.

Our memories are a large part of who we are. It can be quite unsettling to learn that they change without our knowledge, often to a great extent. Or, even worse, that whole groups of people can live with the same false memory.

Human memory isn’t perfect, but it was never meant to be perfect. It was meant to keep us alive. We’re still around, so it seems to be doing its job.

Modern life places very different pressures on our minds and bodies than the environments where they evolved. In a way, the Mandela effect is a by-product of our brains’ efforts to be more efficient. Keeping every memory ever in perfect shape isn’t efficient, or particularly useful. If something is important then you’ll probably interact with and think about it repeatedly, and the memory will always be refreshed and reinforced in your brain. That’s why calendars are helpful.

You don’t need to remember your fridge perfectly the first time you see it, just roughly where it is. And your brain knows that. The Mandela effect lives in the memories of things we don’t check often. Something we kind of half-heard someone say once, maybe. A line in a movie 40 years ago.

Through a combination of our innate drive to fit in with the pack, the way we prime each other when we interact, and our brain’s tendency to fill in memories, false memories can spread among a group — as long as nobody there bothers to check on Wikipedia.

Western junk-food diet can slow down your brain and make you eat even more junk

Switching from a healthy diet to a Western diet (high in fat and added sugar) for as little as one week can significantly impair cognitive function and encourage people to eat more even when they’re full.

Disruption in the hippocampus, a region that is known to have a major role in learning and memory, seems to be the likely cause.

Credit: Pixabay.

It’s not the first time something like this has been suggested. Research in the past found that when animals are fed a Western-style diet (rich in saturated fat and added sugar), they show impairment in memory and learning tests. There is a growing body of evidence suggesting that the same conclusion applies to humans and that hippocampal lesions can deregulate a person’s appetite.

Psychologists at Macquarie University in Sydney wanted to put this to the test and enlisted 110 young, lean students, aged 20 to 23, who generally ate a healthy diet.

Half of the students were randomly assigned to a junk food diet for an entire week, while the other half carried on with their normal diet.

The participants in the Western-style diet group had to have a breakfast of either a toasted sandwich and a milkshake, high in saturated fat and added sugar, or Belgian waffles, as well as one main meal and a dessert from a popular fast-food chain. These changes aside, the students were asked to otherwise maintain their normal diet and lifestyle.

At the end of the study, the researchers found that those on the Western-style diet had an appetite for palatable food such as snacks and chocolate even when they were full. They also scored worse on memory tests.

“When we see cake, chocolate or crisps, for example, we remember how nice they are to eat. When we are full the hippocampus normally suppresses these memories, reducing our desire to eat. We found that lean healthy young people exposed to one week of a junk food diet developed impaired hippocampal function and relatively greater desire to eat junk food when full. Junk food may then act to undermine self-control by increasing desire,” the researchers stated in a press release.

These results seem to indicate that junk food might cause disruption in the hippocampus, impairing memory and making it harder to resist the temptation to eat even more junk food, which in turn generates more damage to the hippocampus and triggers a vicious cycle of overeating. The more people craved palatable food when full, the more impaired their hippocampal function was, judging from the memory tests.

“More broadly, this experiment, alongside those from the other animal and human studies cited here, suggests that a WS-diet causes neurocognitive impairments following short-term exposure,” the authors concluded.

Western-style diets, characterized by foods high in sugar, salt, and fat, as well as protein from red meat (i.e. burgers, processed meat, ready meals, fries, etc), have been previously associated with the development of obesity-related diseases such as type 2 diabetes, cardiovascular disease, and high blood pressure.

The authors of the new study, which was published in Royal Society Open Science, think that there will come a time when authorities will be pressured to impose restrictions on processed food, similarly to how some policies in place today deter smoking and drinking alcohol.

Another study published last month showed how sugar can trigger changes in the brain similarly to an addiction. After just 12 days of being on a high sugar diet, participants suffered major changes in the brain’s dopamine and opioid systems.

We create ‘fake news’ when facts don’t match our biases

If you also dislike fake news, you should probably find a mirror and put on a stern look. A new study found that people unconsciously twist information on controversial topics to better fit widely held beliefs.

Image credits Roland Schwerdhöfer.

In one study, people were shown figures indicating that the number of Mexican immigrants has been declining for a few years now — which is true, but runs contrary to what the general public believes — and tended to remember the exact opposite when asked later on. Furthermore, such distortions of the facts tended to get progressively worse as people passed the (wrong) information along.

Don’t believe everything you think

“People can self-generate their own misinformation. It doesn’t all come from external sources,” said Jason Coronel, lead author of the study and assistant professor of communication at Ohio State University.

“They may not be doing it purposely, but their own biases can lead them astray. And the problem becomes larger when they share their self-generated misinformation with others.”

The team conducted two studies for their research. In the first one, they had 110 participants read short descriptions of four societal issues that could be quantified numerically. General consensus on these issues was established with pre-tests. Data for two of them fit in with the broad societal view on these issues: for example, many people generally expect more Americans to be in support of same-sex marriage than against it, and public opinion polls seem to indicate that this is true.

However, the team also used two topics where the facts don’t match up to the public’s perception. For example, the number of Mexican immigrants to the U.S. fell from 12.8 million to 11.7 million between 2007 and 2014, but most people in the U.S. believe the number kept growing.

Image credits Pew Research Center.

After reading the descriptions, the participants were asked to write down the numbers given (they weren’t informed of this step at the beginning of the test). For the first two issues (those consistent with public perception), the participants kept the relationship true, even if they didn’t remember the exact numbers. For example, they wrote a larger number for the percentage of people supporting same-sex marriage than for those that oppose it.

For the other two topics, however, they flipped the relationship around to make the facts align to their “probable biases” (i.e. popular perception on the issue). The team used eye-tracking technology to track participants’ attention when reading the descriptions.

“We had instances where participants got the numbers exactly correct—11.7 and 12.8—but they would flip them around,” Coronel said. “They weren’t guessing—they got the numbers right. But their biases were leading them to misremember the direction they were going.”

“We could tell when participants got to numbers that didn’t fit their expectations. Their eyes went back and forth between the numbers, as if they were asking ‘what’s going on.’ They generally didn’t do that when the numbers confirmed their expectations,” Coronel said.

For the second study, participants were asked to play a version of the ‘telephone’ game. The first person in a chain would see the accurate statistics about the number of Mexican immigrants living in the United States. They then had to write those numbers down from memory and pass them along to the second person in the chain, and so on. The team reports that the first person tended to flip the numbers, stating that the number of Mexican immigrants increased by 900,000 from 2007 to 2014 (it actually decreased by about 1.1 million). By the end of the chain, the average participant said the number of Mexican immigrants had increased over those 7 years by about 4.6 million.
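The compounding the researchers describe can be pictured with a toy simulation. Everything below — the per-teller drift figure, the zero-noise default, the chain length — is a hypothetical illustration, not data or a model from the study:

```python
# Toy model of bias-driven error growth along a "telephone" chain:
# each teller re-remembers the number with a systematic pull toward
# the popular belief (bias) plus optional random noise.
import random

def telephone_chain(true_value, bias_per_teller, tellers, noise=0.0, seed=0):
    """Return the value remembered after each teller in the chain."""
    rng = random.Random(seed)
    value = true_value
    history = [value]
    for _ in range(tellers):
        value += bias_per_teller + rng.gauss(0, noise)
        history.append(value)
    return history

# With a made-up pull of +1.0 (million) per teller and no noise, a true
# change of -1.1 million flips sign immediately and keeps drifting upward.
drift = telephone_chain(-1.1, 1.0, 6)
```

With these invented parameters the chain ends at +4.9 million after six tellers — the same direction of drift the study reports, though the actual numbers here are fabricated for the sketch.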

“These memory errors tended to get bigger and bigger as they were transmitted between people,” said Matthew Sweitzer, a doctoral student in communication at Ohio State and co-author of the study.

Coronel said the study did have limitations. It’s possible that the participants would have better remembered the numbers if the team explained why they didn’t match their expectations. Furthermore, they didn’t measure each participant’s biases going into the tests. Finally, the telephone game study did not capture important features of real-life conversations that may have limited the spread of misinformation. However, it does showcase the mechanisms in our own minds that can spread misinformation.

“We need to realize that internal sources of misinformation can possibly be as significant as or more significant than external sources,” said Shannon Poulsen, also a doctoral student in communication at Ohio State and co-author of the study. “We live with our biases all day, but we only come into contact with false information occasionally.”

The paper “Investigating the generation and spread of numerical misinformation: A combined eye movement monitoring and social transmission approach” has been published in the journal Human Communication Research.

Is photographic memory real? Not quite, but there’s something that comes close

Credit: Pxsphere.

Some people are able to remember intricate visual details such as the architectural features of a landmark building or entire pages from books, which they can later reproduce from memory without error. This impressive memorization ability is often described as “photographic memory.”

When we think of “photographic memory”, there’s this impression that people who have this ability can record visual snapshots just like a photograph. They can then retrieve the snapshot from memory, zooming in and out on different parts. However, no study has ever been able to prove that true photographic memory exists — at least in this sense.

Memory is more a jigsaw puzzle than a photograph

Our eyes might work, to some extent, like a lens, but our memory isn’t like some camera that captures every detail — we’d all probably go mad, if that were the case. Instead, the things we’re likely to remember are those things that we pay close attention to. This is why you’re very unlikely to remember what you had for breakfast a month ago, unless it was something particularly eventful. This selective attention allows us to focus and record only the important bits. Later, upon recalling, the mind fills in the blanks.

Alright, that’s the case for most people. But, surely there are exceptional people out there who can remember things in such vivid and excruciating detail, one might remark. You’ve read about them in the papers and you’ve seen them in movies. However, while there are people in the world with phenomenal memorizing abilities — whether ingrained thanks to genetics or acquired through intense training — their memory doesn’t function like a camera.

That’s not to say that there aren’t remarkable people with gifted memories. Theodore Roosevelt could recite entire newspaper pages — not just articles — as if they were sitting in front of him. Kim Peek — the real person Dustin Hoffman’s character was based on in the Oscar-winning movie Rain Man — memorized every word of every book he had ever read, estimated at around 9,000 books. Arturo Toscanini conducted operas from memory after his eyesight became too poor to read the music. And Lu Chao from China recited the first 67,890 digits of pi by employing memorization techniques.

But even people who claim to have a true ‘photographic memory’ haven’t stood up to scientific scrutiny. For instance, while they may be able to recite pages upon pages from a book without error, they often fail to do the same in reverse. If their memories were like photos, they should have been able to easily reproduce the text in reverse order.

Instead, what’s often called “photographic memory” can be more accurately described as “eidetic memory.” People with eidetic memory can form a mental image of what they just saw for up to several minutes, after which it is gone. They can describe the image with an unusual level of accuracy and detail.

Eidetic memory is controlled primarily by the posterior parietal cortex in the brain. This is the part of the brain through which visual stimuli are processed, and images retained. For most people, these images are only stored for a few short seconds before being discarded or transferred to short-term memory.

Between 2% and 10% of children have an eidetic memory, but the ability gradually fades with age, to the point that virtually no adult retains it.

But, even if eidetikers have phenomenal memories, they still can’t capture all the details. What’s more, like all people, eidetikers also invent details that were never really there in the image — so-called false memories.

How to train your memory

People like Lu Chao, who holds the record for the longest string of pi digits ever recited from memory, use mnemonic techniques to help them record information. Although he could remember more than 67,000 digits in the right order, Chao is no genius. In 2009, researchers gave Lu, along with several other people matched for age and education, a ‘digit span’ test — a measure of how well someone can remember a sequence of random digits presented at a rate of one digit per second. Lu had a digit span of 8.83, while the average for the rest of the group was 9.27.
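For illustration, here is one simple way such a test could be scored. This is a minimal sketch of the scoring idea only, not the researchers’ actual protocol (their spans are averaged across many trials, which is why a score like 8.83 can be fractional):

```python
def digit_span(trials):
    """Score a simplified digit-span test.

    trials: list of (presented, recalled) digit strings.
    Returns the length of the longest sequence recalled exactly,
    or 0 if nothing was recalled correctly.
    """
    return max((len(shown) for shown, recalled in trials if shown == recalled),
               default=0)

# Three trials: the 7-digit sequence is recalled with two digits
# swapped, so the span here is 6.
span = digit_span([("52814", "52814"),
                   ("528147", "528147"),
                   ("5281473", "5281437")])
```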

Lu doesn’t have an innate ability to encode vast amounts of information. He knows a good trick to make up for it, though. In order to remember thousands of digits, Lu used a memory technique called loci, which is Latin for ‘places’. The method, also known as the memory palace method, employs spatial or environmental cues to help learning and memory.

The method works something like this: you use a familiar environment, such as your home, and walk through the environment associating information (like words or digits) you want to remember with various objects or scenery. In order to recall the digits in the right order, you simply have to do a mental walk through your mind palace. In Lu’s case, he devised an intricate story, and assigned images such as a chair, a king or a horse to two-digit combinations of numbers ranging from “00” to “99.” 
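The chunk-and-associate step described above can be sketched in a few lines of Python. The image table and walking route below are hypothetical stand-ins (Lu’s actual 00–99 mappings aren’t public), but the mechanics are the same: split the digits into two-digit chunks, swap each chunk for its image, and pin each image to a stop on a familiar walk:

```python
def chunk_digits(digits, size=2):
    """Split a digit string into fixed-size chunks: "314159" -> ["31", "41", "59"]."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

def memory_palace(digits, images, loci):
    """Pair each chunk's image with successive stops on the mental walk."""
    return list(zip(loci, (images[chunk] for chunk in chunk_digits(digits))))

# Hypothetical image table and route, for illustration only.
IMAGES = {"31": "chair", "41": "king", "59": "horse"}
ROUTE = ["front door", "hallway", "kitchen"]

stops = memory_palace("314159", IMAGES, ROUTE)
```

Recalling the walk in order — a chair at the front door, a king in the hallway, a horse in the kitchen — recovers the digits 3-1-4-1-5-9.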

So, even people who can barely remember where they put their car keys can perform seemingly superhuman memory tasks — if given the proper training.

Our brains can actively forget during REM sleep — and that’s why you don’t remember dreams

Rapid eye movement (REM) sleep is when we dream — but new research shows it may also be when we forget.

Image via Pixabay.

In a mouse study, a team of Japanese and U.S. researchers has found that brains can employ REM sleep to actively forget excess information. The authors also point to a group of neurons deep inside the brain that control this process of forgetting during sleep.


“Ever wonder why we forget many of our dreams?” asks Thomas Kilduff, Ph.D., a senior author of the study.

“Our results suggest that the firing of a particular group of neurons during REM sleep controls whether the brain remembers new information after a good night’s sleep.”

REM is one of the several sleep stages our bodies cycle through every night. It generally starts around 90 minutes after we fall asleep and gets its name from the rapid, darting movements of our eyes during this phase. It’s also characterized by increased heart rates, immobile limbs, dreaming, and brain wave patterns reminiscent of wakeful states.

The role of sleep in memory storage has been studied in the past — especially its role in helping our brains form new memories. However, researchers haven’t looked into whether it can also help the brain cut out excess information stored throughout the day. Recent studies in mice have shown that during sleep, REM sleep included, certain synaptic connections involved in learning are selectively ‘pruned’ — which effectively destroys the memories they store.

The present study is the first to investigate how such a process could take place.

“Understanding the role of sleep in forgetting may help researchers better understand a wide range of memory-related diseases like post-traumatic stress disorder and Alzheimer’s,” said Janet He, Ph.D., program director at the National Institute of Neurological Disorders and Stroke (NINDS).

“This study provides the most direct evidence that REM sleep may play a role in how the brain decides which memories to store.”

Dr. Kilduff’s team, together with that of Akihiro Yamanaka, Ph.D., at Nagoya University in Japan, have spent years looking into the role of a hormone called hypocretin/orexin in controlling sleep and narcolepsy. Narcolepsy is a disorder that makes people feel excessively sleepy during the day and sometimes experience changes reminiscent of REM sleep, including loss of muscle tone in the limbs and hallucinations. Narcolepsy could be linked to the loss of the neurons that produce this hormone in the hypothalamus, a peanut-sized area found deep inside the brain.

For the present study, Dr. Kilduff collaborated with researchers from Hokkaido University in Sapporo, Japan to look at neighboring cells that secrete melanin-concentrating hormone (MCH), which is involved in the control of appetite and sleep. They found that a majority (52.8%) of hypothalamic MCH cells fired when mice underwent REM sleep, about 35% fired only when the mice were awake, and about 12% fired at both times — this is consistent with previous findings on the subject.

Electrical brain recordings and tracing experiments further revealed that many of the hypothalamic MCH cells sent inhibitory messages to the hippocampus, the brain’s memory center, through long axons.

“From previous studies done in other labs, we already knew that MCH cells were active during REM sleep. After discovering this new circuit, we thought these cells might help the brain store memories,” said Dr. Kilduff.

To test this idea, the researchers turned MCH neurons in mice on and off during memory tests. They were specifically interested in the role these cells play in retention, which is the period between learning something and it being stored (consolidated) into long term memory — a sort of memory “limbo”.

The team reports that activating MCH cells during retention worsened long-term memory consolidation while turning them off improved it. For example, activating the cells reduced the time mice spent sniffing around new objects compared to familiar ones, but turning the cells off had the opposite effect.

Further experiments suggested that the MCH neurons perform this task during REM sleep. Mice performed better on memory tests when MCH neurons were turned off during REM sleep, and turning the neurons off while the mice were awake or in other sleep states had no effect on memory.

“These results suggest that MCH neurons help the brain actively forget new, possibly unimportant information,” said Dr. Kilduff.

“Since dreams are thought to primarily occur during REM sleep, the sleep stage when the MCH cells turn on, activation of these cells may prevent the content of a dream from being stored in the hippocampus — consequently, the dream is quickly forgotten.”

The paper “REM sleep–active MCH neurons are involved in forgetting hippocampus-dependent memories” has been published in the journal Science.


The hunger hormone is involved in episodic memory in rats, new research finds

Ghrelin, the hormone that induces hunger, also seems to play a role in memory control.


Image credits Christine Sponchia.

If you’re sitting in a restaurant keenly anticipating a delicious meal that will be served shortly, chances are you’ll feel hungry. That sensation is created by ghrelin, a hormone secreted in the stomach as you anticipate food. Ghrelin has been linked with the mediation of hunger signals between our gut and our brain, but new research presented at the Society for the Study of Ingestive Behavior suggests that the molecule might also play an important part in memory control.

Food for memory

“We recently discovered that in addition to influencing the amount of food consumed during a meal, the vagus nerve also influences memory function,” said Dr. Scott Kanoski, senior author of the study.

After it’s secreted, ghrelin binds to specialized receptors on the vagus nerve, which transmits signals between the gut and the brain. The team’s hypothesis was that ghrelin might also help the vagus nerve support memory formation.

Using a method called RNA interference, the team artificially reduced the amount of ghrelin receptor in the vagus nerve for a group of lab rats. The animals were then put through a series of memory tasks. The rats with reduced ghrelin signaling in the vagus nerve showed impaired performance in an episodic memory test compared to the control group. Episodic memory is the type of memory involved in remembering what, when, and where something occurred. For the rats, the test required remembering a specific object in a specific location.

A second part of the study looked at whether ghrelin signaling in the vagus nerve influences feeding behavior. The researchers report that rats whose vagus nerve can’t receive signals from ghrelin ate more frequently than unaltered animals but consumed less food at each meal. The team says this might come down to deficits in episodic memory associated with impaired ghrelin signaling rather than feelings of hunger.

“Deciding to eat or not to eat is influenced by the memory of the previous meal,” says Dr. Elizabeth Davis, lead author on the study. “Ghrelin signaling to the vagus nerve may be a shared molecular link between remembering a past meal and the hunger signals that are generated in anticipation of the next meal.”

The team plans to expand their research to see if they can improve memory capacity in humans by manipulating ghrelin signaling between the gut and the brain.

The findings, “Vagal afferent ghrelin signaling promotes episodic memory and influences meal patterns in rats” have been presented at the 2019 Annual Meeting of the Society for the Study of Ingestive Behavior in Utrecht, Netherlands, in July.

Ignoring distractions or temptation is harder when you’re tired, stressed, or trying to remember something

Stress, tiredness, and general cognitive strain make it much harder for us to ignore signals in the environment for something rewarding — such as bright neon signs for fast food joints.

Neon lights and ads are such tempting cues.
Image via Pixabay.

We all have impulses we’d like to have a better handle on. Some of you might be trying to diet, quit smoking, or kick some other habit; good luck. New research says that tiredness, stress, or any other drain on your mental resources can make it harder to resist tempting cues and follow through on your decision. The team says that trying to hold information in our memory also produces this effect, the first time this link has been demonstrated.


“We knew already that participants find it hard to ignore cues that signal a large reward,” says study lead Dr. Poppy Watson at UNSW.

“We have a set of control resources that are guiding us and helping us suppress these unwanted signals of reward. But when those resources are taxed, these become more and more difficult to ignore.”

Researchers refer to the cognitive processes that allow us to pay attention, organize our life, focus, or regulate our emotions as ‘executive control’. It wasn’t yet clear whether our ability or inability to ignore reward cues (i.e. temptation) was related to executive control or a separate ability, but the present research suggests that the former is true: executive control processes are employed to keep us from distractions or temptations. However, the findings also show that these resources are limited.

“Now that we have evidence that executive control processes are playing an important role in suppressing attention towards unwanted signals of reward, we can begin to look at the possibility of strengthening executive control as a possible treatment avenue for situations like addiction,” says Dr. Watson.

For the study, the team had participants look at a screen on which various shapes — including a colorful circle — were being displayed. Their task was to locate and look at a diamond shape on the screen, and if successful, they’d be given money. However, if they looked at the colored circle — which played the part of the distraction/temptation — they wouldn’t receive money. To make things even harder, participants were told that the presence of a blue circle on-screen meant that they’d be paid more if they successfully completed the diamond task than if an orange circle was shown.

The team tracked where each participant was looking using eye-tracking technology, and ran a low-memory-load and a high-memory-load version of the experiment. In the high-memory-load version, participants were also asked to memorize a sequence of numbers while performing the larger task. This set-up was used to further draw on the participants’ cognitive resources and to see how this impacted their ability to perform the diamond task.

Image via Pixabay.

“Study participants found it really difficult to stop themselves from looking at cues that represented the level of reward — the coloured circles — even though they were paid to try and ignore them,” Dr. Watson says.

“Crucially, the circles became harder to ignore when people were asked to also memorize numbers: under high memory load, participants looked at the coloured circle associated with the high reward around 50% of the time, even though this was entirely counterproductive.”

The findings suggest that people need access to either full or at least a sizeable chunk of their cognitive control processes to successfully block distractions or temptations from the environment. This mechanism, ironically, seems to make it harder to ignore cues regarding habits or behaviors you want to change — because you’re paying attention to changing them specifically. This might also explain why people find it harder to focus on dieting or beating an addiction if they are under a lot of stress.

“There’s this strong known link between where your attention is and what you eventually do, so if you find it hard to focus your attention away from reward cues, it’s even harder to act accordingly,” says Dr. Watson. “Constant worrying or stress is the equivalent to the high-memory load scenario of our experiment, impacting on people’s ability to use their executive control resources in a way that’s helping them manage unwanted cues in the environment.”

The team wants to see if executive control can be strengthened and if that can be used in the context of drug rehabilitation.

The paper “Capture and Control: Working Memory Modulates Attentional Capture by Reward-Related Stimuli” has been published in the journal Psychological Science.


Unpleasant smells make for more powerful memories, a new study finds

Stinky smells make for stronger memories, it seems.


Literally unforgettable.
Image via Pixabay.

New research from the New York University’s Department of Psychology suggests that memories are stronger when the original experience was accompanied by unpleasant odors. The findings broaden our understanding of the mechanisms that underpin memory and of how negative experiences help shape our ability to recall past events.

Smells… familiar

“These results demonstrate that bad smells are capable of producing memory enhancements in both adolescents and adults, pointing to new ways to study how we learn from and remember positive and negative experiences,” explains Catherine Hartley, an assistant professor in New York University’s Department of Psychology and the senior author of the study.

“Because our findings spanned different age groups, this study suggests that aversive odors might be used in the future to examine emotional learning and memory processes across development,” adds lead author Alexandra Cohen, an NYU postdoctoral fellow.

Negative experiences are known to impact our memory. If you get bitten by a dog, for example, you can develop a negative memory of that particular animal — and that negative association may eventually generalize to all dogs. You’re also much more likely to have a vivid, powerful memory of that particular interaction than of your other past experiences with dogs due to the trauma associated with the event.

“The generalization and persistence in memory of learned negative associations are core features of anxiety disorders, which often emerge during adolescence,” notes Hartley.

In order to get a better idea of how these learned negative associations shape the way our memories form at this age, the team designed and administered a Pavlovian learning task to individuals aged 13 to 25. Such tasks usually employ mild electrical shocks; the researchers used bad smells instead, as these can be ethically administered to minors.

The task included viewing a series of images belonging to one of two conceptual categories: objects (e.g., a chair) and scenes (e.g., a snow-capped mountain). Participants wore a nasal mask connected to an olfactometer (an instrument used to detect and measure odor dilution) as they viewed the images. When images from one category were shown, participants were given unscented air. While participants viewed images from the other category, unpleasant smells were sometimes circulated through the device to the mask.

In order to determine which odors the participants found unpleasant, the researchers had the subjects breathe in a variety of odors and indicate which ones they thought were unpleasant prior to the study. The odors were blends of chemical compounds provided by a local perfumer and included scents such as rotting fish and manure.

This allowed the team to quantify the effect of a bad smell on individual memories, as well as its generalization to related images. In other words, they could measure whether the memory of an image (say, a chair) was stronger when it was paired with a bad smell, and whether this happened only for that image or generalized to related ones. The team also measured perspiration on the participants’ hands as a proxy for arousal levels. One day after the task, the researchers tested participants’ memory for the images.

Their findings showed that both adolescents and adults had better memory, 24 hours after the task, specifically for images paired with the bad smell. They also found that individuals with higher arousal levels while viewing images that might be followed by an unpleasant smell had better memory of those images a day later, regardless of whether a smell was actually delivered. This suggests that the unpredictability or surprise associated with the outcome leads to better memory.

The paper “Aversive learning strengthens episodic memory in both adolescents and adults” has been published in the journal Learning & Memory.


Taking short breaks to reinforce memories is key to learning new skills or re-learning old ones

Taking a break is a key part of learning anything, new research suggests.

Brain scan.

Some of the brain areas that saw increased activity during the trials.
Image courtesy of Cohen lab, NIH/NINDS.

A new study from the National Institutes of Health says that our brains solidify the memory of a skill we practiced just seconds earlier by taking a short rest. The team hopes the findings will help guide skill-relearning therapies for patients recovering from the paralyzing effects of strokes or other brain injuries. However, they should be broadly applicable to anybody trying to learn a new skill that involves physical movement.

Slow and steady wins the race

“Everyone thinks you need to ‘practice, practice, practice’ when learning something new. Instead, we found that resting, early and often, may be just as critical to learning as practice,” said Leonardo G. Cohen, M.D., Ph.D., senior investigator at NIH’s National Institute of Neurological Disorders and Stroke and a senior author of the paper.

“Our ultimate hope is that the results of our experiments will help patients recover from the paralyzing effects caused by strokes and other neurological injuries by informing the strategies they use to ‘relearn’ lost skills.”

Lead researcher Marlene Bönstrup, M.D., a postdoctoral fellow in Dr. Cohen’s lab, says she had believed, like many of her colleagues, that our brains needed long periods of rest (i.e. sleep) to strengthen new memories. This included memories associated with learning a new skill. However, after seeing brain wave recordings of healthy volunteers in ongoing learning and memory experiments at the NIH Clinical Center, she started questioning that view.

These brain waves were recorded in right-handed volunteers with magnetoencephalography, a very sensitive scanning technique. Each participant was seated in a chair facing a computer screen under a long, cone-shaped brain scanning cap. Volunteers were shown a series of numbers on the screen then asked to type the numbers as many times as possible in 10 seconds using their left hand. Then, they took a 10-second break, and started typing again; each participant repeated this cycle of practice and rest 35 times.

Volunteers’ performance improved dramatically over the course of the trial, leveling off around the 11th cycle, the team reports. An important question, however, was when this improvement actually took place in the brain.

“I noticed that participants’ brain waves seemed to change much more during the rest periods than during the typing sessions,” said Dr. Bönstrup. “This gave me the idea to look much more closely for when learning was actually happening. Was it during practice or rest?”

The team explains that the data show participants’ performance increased primarily during the short rest periods, not while they were typing. The improvements made while resting added up to create the overall gains each volunteer saw during the trial. Furthermore, the cumulative improvement accrued during these short breaks was much greater than the gains volunteers made overnight between the trial’s two days, which suggests that the short breaks played as critical a role in learning as the practice itself.
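The study’s actual analysis is more involved, but the basic bookkeeping behind this decomposition is simple. If you log typing speed at the start and end of each practice period, the gain made during a rest (a ‘micro-offline’ gain) is the speed at the start of one period minus the speed at the end of the previous one, while the gain made while actively typing (‘micro-online’) is the within-period change. A toy sketch, with speed values (keypresses per second) invented purely for illustration — not the study’s data:

```python
# Each tuple: (speed at start of a practice period, speed at its end).
# All values are invented for illustration.
trials = [(2.0, 2.1), (2.6, 2.6), (3.0, 2.9), (3.2, 3.3)]

# 'Micro-online' gains: improvement made while actively typing.
micro_online = sum(end - start for start, end in trials)

# 'Micro-offline' gains: improvement appearing across the rests between periods.
micro_offline = sum(trials[i + 1][0] - trials[i][1] for i in range(len(trials) - 1))

print(f"online: {micro_online:+.1f}, offline: {micro_offline:+.1f}")
```

In this toy example the offline gains (+1.2) dwarf the online gains (+0.1), mirroring the pattern the team describes: most of the improvement shows up across the rests, not within the typing bouts.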

By looking at the brain waves, Dr. Bönstrup found that the participants’ brains were busy consolidating memories during these short rest periods. The team reports finding changes in the participants’ beta rhythms that correlated with the improvements the volunteers made during the rests. Further analysis revealed that these changes in beta oscillations primarily took place in the right hemisphere and along neural networks connecting the frontal and parietal lobes, structures associated with the planning and control of movements. These changes only happened during the breaks, and were the only brain wave patterns that correlated with performance.

“Our results suggest that it may be important to optimize the timing and configuration of rest intervals when implementing rehabilitative treatments in stroke patients or when learning to play the piano in normal volunteers,” said Dr. Cohen.

“Whether these results apply to other forms of learning and memory formation remains an open question.”

Dr. Cohen’s team plans to explore, in greater detail, the role of these early resting periods in learning and memory.

The paper “A Rapid Form of Offline Consolidation in Skill Learning” has been published in the journal Current Biology.


The hippocampus is the curator of our memories, new study suggests

A new study is looking into how our minds ‘auto-complete’ memories. The mastermind behind it all, it turns out, is the hippocampus.

Old photographs.

Image credits Karolina Grabowska.

Our brains split memories into a bundle of different types of data: what happened, where it happened, what we were feeling at the time. These disparate elements tie back together into a coherent whole as a memory pops back up into our minds. Look at a picture you’ve taken at the beach, for example, and you can almost smell the sea spray and sunscreen. We don’t actively work to recall these bits, but our minds supply them anyway — akin to a memory ‘auto-complete’ function.

Needs more context

A new collaborative study between the Universities of Birmingham and Bonn analyzed the mechanisms underpinning this automated recall feature in a bid to help us better understand memory.

For the study, the team worked with 16 patients at the University Clinic of Epileptology in Bonn — one of Europe’s biggest epilepsy centers. The clinic specializes in the treatment of severe forms of temporal lobe epilepsy. As part of their treatment, some patients have electrodes implanted into their brains so that doctors can pinpoint exactly which areas cause the seizures and remove them. Electrodes are rarely implanted in human brains for any other reason, so this gave the researchers a rare opportunity to record directly from living gray matter.

They showed the participants a range of scene images. Each was paired with one of two different objects — either a raspberry or a scorpion. Participants were allowed 3 seconds to memorize each image-object combination. Following a short break, they were shown the images again and asked to recall the objects they were paired with in the initial phase. The control group only had to remember the scene images, not their associated objects. All the while, the team monitored participants’ patterns of brain activity.

“We focused on two brain regions — the hippocampus and the neighbouring entorhinal cortex,” explains Prof. Florian Mormann, who heads the Cognitive and Clinical Neurophysiology group at the University of Bonn Medical Centre.

The hippocampus has been documented to play a role in associative memory (which is what this study focused on), but we still have very little idea of how it fulfills this task. The team found that neurons in the hippocampus start firing dramatically during memory recall. The same was observed in the control group. However, this activity persisted much longer in the experimental group than it did in the controls, and the former also showed activation in the entorhinal cortex in parallel with the hippocampus.


The hippocampus is highlighted.
Image credits Anatomography / Life Science Databases.

The team further reports that this similarity (between recall and learning) was so striking that a computer algorithm they devised could tell, from brain activity alone, whether a participant was remembering the raspberry or the scorpion.

“The pattern of activation in the entorhinal cortex during successful recall strongly resembled the pattern of activation during the initial learning of the objects,” explains Dr. Bernhard Staresina from the University of Birmingham.

“We call this process reinstatement. The act of remembering puts neurons in a state that strongly resembles their activation during initial learning.”

This reinstatement process, the team believes, is governed by neurons in the hippocampus. In effect, they act like hyperlinks in a Wikipedia article, guiding the rest of the brain to wherever a particular memory is stored.

The paper “Recollection in the human hippocampal-entorhinal cell circuitry” has been published in the journal Nature Communications.


Researchers identify clump of neurons that block, or allow, frightful memories into our minds

New research is looking into the cells that block, or allow, frightening memories to pop up into our minds.


Image credits Henry Gray / Anatomy of the Human Body (1918) via Wikimedia.

Researchers at The University of Texas at Austin have identified the group of neurons that handle scary, recurrent memories. The findings could help us better tailor therapy for the treatment of anxiety, phobias, and post-traumatic stress disorder (PTSD).

Frightful relapse

“There is frequently a relapse of the original fear, but we knew very little about the mechanisms,” said Michael Drew, associate professor of neuroscience and the senior author of the study. “These kinds of studies can help us understand the potential cause of disorders, like anxiety and PTSD, and they can also help us understand potential treatments.”

Drew and his team worked with a group of lab mice, which they trained to associate a distinctive box with fear. Each mouse was repeatedly placed inside the box and given a harmless electrical shock until they started associating this box with feelings of pain. Needless to say, this rendered the mice quite scared of having to go inside said box.

In the second step of the experiment, the same mice were placed inside the box without receiving the shock. Initially, they kept displaying fear, the team reports. However, as exposure to the box continued without the shock being administered, the association weakened; eventually, the mice stopped showing signs of fear. The authors explain that repeated exposure without the painful shock created ‘extinction memories’ in the mice’s minds in place of the earlier, fear-inducing ones.

This is a glimpse of how our brain stores and handles conditioned responses, a process which has been heavily studied and documented ever since Pavlov and his drooling dogs. However, there are still things we don’t understand. Among these, and something the team wanted to understand, is how and why memories or responses we thought were behind us can still pop up in our minds, triggering spontaneous recovery (think of it as a form of traumatic-memory relapse).

In order to find out, they artificially activated fear responses and suppressed extinction trace memories through the use of optogenetics (a technique that uses light to turn neurons on or off).

“Artificially suppressing these so-called extinction neurons causes fear to relapse, whereas stimulating them prevents fear relapse,” Drew said. “These experiments reveal potential avenues for suppressing maladaptive fear and preventing relapse.”

Drew’s team was surprised to find that the brain cells responsible for suppressing or allowing fear memories to surface are nestled in the hippocampus. The traditional view is that fear is born of the amygdala, the primitive ‘lizard’ level of our brains. The hippocampus is actually heavily involved in aspects of memory, but generally in the process of linking memory with spatial navigation. The team’s hypothesis is that the hippocampus’ job is to provide spatial context for memories, i.e. where something happened or how you got there.

Their findings could, therefore, explain why exposure therapy — one of the most common treatment avenues for fear-based disorders — sometimes simply stops working. Exposure therapy works by creating safe (extinction) memories to override the initial, traumatic one. For example, someone who’s scared of spiders after being bitten by one can undertake exposure therapy by letting a harmless spider crawl on his hand.

While the approach is sound, the team reports, it hinges on our hippocampus’ willingness to play ball.

“Extinction does not erase the original fear memory but instead creates a new memory that inhibits or competes with the original fear,” Drew said.

“Our paper demonstrates that the hippocampus generates memory traces of both fear and extinction, and competition between these hippocampal traces determines whether fear is expressed or suppressed.”

The findings suggest we should revisit how we time exposure therapy, and how frequently patients should undergo exposure sessions, according to the authors.

Paper DOI http://dx.doi.org/10.1038/s41593-019-0361-z


A new study estimates English only takes about 1.5 megabytes of your brainspace

New research says all the language in your head takes up as much space as a picture would on a hard drive — about 1.5 megabytes.

Floppy disk.

All the English in your brain would probably fit on one of these, it seems.
Image via Pixabay.

A team of researchers from the University of Rochester and the University of California estimates that all the data your brain needs to encode language — at least in the case of English — only adds up to around 1.5 megabytes. The team reached this figure by applying information theory to add up the amount of data needed to store the various parts of the English language.

Quick download

We learn how to speak as infants by listening to those around us. We don’t yet have a clear idea of how this process takes place, but we do know that it’s not a simple case of storing words alongside their definitions as you’d see in a dictionary. This is suggested by the way our minds handle words and concepts — for example, by forming associative links between the concept of flight and the words “bird,” “wing,” or even “robin.” Our brains also store the pronunciation of words, how to physically create the sounds as we speak, and how words interact with and are used alongside other words.

In an effort to map out how much ‘space’ this information takes up in our brain, the authors worked to convert all of the ways our brain might store a language into data amounts. To do so, they turned to information theory, a branch of mathematics that deals with how information is encoded via sequences of symbols.

The researchers assigned a quantifiable size estimate to each aspect of English. They began with phonemes — the sounds that make up spoken words — noting that humans use approximately 50 phonemes. Each phoneme, they estimate, would take around 15 bits to store.

Next came vocabulary. They used 40,000 as an average number of words an average person would know, which would translate into 400,000 bits of data. Word frequency is also an important element in speech, one which the team estimated would take around 80,000 bits to ‘code’ in our brains. Syntax rules were allocated another 700 bits.

Semantics for those 40,000 words was the single largest contributor the team factored in: roughly 12 million bits. Semantics, boiled down, is the link between a word or symbol and its meaning. The sounds that make up the words themselves were logged under ‘vocabulary’; this category represents the brain’s database of the meanings those sounds convey.

“It’s lexical semantics, which is the full meaning of a word. If I say ‘turkey’ to you, there’s information you know about a turkey. You can answer whether or not it can fly, whether it can walk,” says first author Frank Mollica at the University of Rochester in New York.

Adding it all up came to approximately 1.56 megabytes, which is surprisingly little: roughly the capacity of a floppy disk (the ‘save’ icon).
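The tally itself is simple arithmetic over the estimates reported above:

```python
# Back-of-the-envelope tally of the study's reported estimates, in bits.
estimates_bits = {
    "phonemes": 50 * 15,        # ~50 phonemes at ~15 bits each
    "vocabulary": 400_000,      # ~40,000 word forms
    "word frequency": 80_000,
    "syntax": 700,
    "semantics": 12_000_000,    # lexical semantics dominates the total
}

total_bits = sum(estimates_bits.values())
total_megabytes = total_bits / 8 / 1e6  # 8 bits per byte, 10^6 bytes per MB

print(f"{total_bits} bits ≈ {total_megabytes:.2f} MB")  # → 12481450 bits ≈ 1.56 MB
```

As the breakdown makes obvious, semantics accounts for over 96% of the total; every other component is a rounding error by comparison.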

“I thought it would be much more,” Mollica agrees.

Keep in mind that these results are estimations. Furthermore, the team applied their estimation using only English as a subject language. The result should be useful as a ballpark idea of how much space language acquisition takes up in our brains, however. Mollica says that these numbers are broad enough estimates that they might carry over to other languages as well.

The paper “Humans store about 1.5 megabytes of information during language acquisition” has been published in the journal Royal Society Open Science.

Forgetting takes more effort than remembering

We automatically remember and forget things. According to the latest memory research, human beings forget primarily due to retrieval failure, interference, and failure to store memories, but also through intentional forgetting. Now, a new study suggests that voluntarily letting go of a memory takes more work than holding on to it.

Credit: Wang et al., JNeurosci (2019).

Our memories are not static, like photographs. Instead, the brain rebuilds a memory every time it is recalled — and this creates a window of opportunity for changing it. For decades, scientists have known that it is possible to intentionally discard a memory by redirecting attention away from the unwanted experience during memory formation, thus suppressing its retrieval once the memory has been formed. For instance, a study by researchers at Uppsala University in Sweden found that a fear memory could be neutralized if the reconsolidation process is disrupted before the memory can solidify. Another recent study, this time carried out by scientists at the University of St. Andrews in Scotland, found that even when a memory can’t be erased, it can be made to feel less personal or painful.

“We may want to discard memories that trigger maladaptive responses, such as traumatic memories, so that we can respond to new experiences in more adaptive ways,” Jarrod Lewis-Peacock, the study’s senior author and a researcher at the University of Texas at Austin, said in a statement.

“Decades of research has shown that we have the ability to voluntarily forget something, but how our brains do that is still being questioned. Once we can figure out how memories are weakened and devise ways to control this, we can design treatment to help people rid themselves of unwanted memories.”

The researchers at the University of Texas worked with healthy young adults who were instructed to remember or forget images of scenes and neutral faces. Throughout the study, the researchers used neuroimaging to identify patterns of brain activity. The results showed that forgotten images were associated with stronger activation of the ventral temporal cortex than remembered images. To be effective, though, this activation shouldn’t be too extreme: forgetting was most successful when this region was moderately activated.

“A moderate level of brain activity is critical to this forgetting mechanism. Too strong, and it will strengthen the memory; too weak, and you won’t modify it,” Tracy Wang, a researcher at the University of Texas at Austin and the study’s lead author, said in a news release.

“Importantly, it’s the intention to forget that increases the activation of the memory, and when this activation hits the ‘moderate level’ sweet spot, that’s when it leads to later forgetting of that experience.”

The new study suggests that activation, rather than suppression, of unwanted information can also be a viable forgetting strategy. The authors also identified a new link between the voluntary control of visual attention and the long-term storage of memories.

“We’re learning how these mechanisms in our brain respond to different types of information, and it will take a lot of further research and replication of this work before we understand how to harness our ability to forget,” Lewis-Peacock said. “We’re looking not at the source of attention in the brain, but the site of it.”

The findings appeared in the Journal of Neuroscience.


The brain replays memories as you sleep to make sure you don’t lose them

The same brain mechanisms that record memory while awake make an appearance during sleep.

Sleeping feet.

Image via Pixabay.

A new study published by Dutch and British researchers reports that being awake and forming memories is just half the story — you also need to sleep and consolidate them. Surprisingly, however, the same brain networks and mechanisms that formed the memory are reactivated during its consolidation.


“Understanding how memories are reactivated in different states also provides insight into how these memories could be altered — which might, for example, be interesting in therapeutic settings,” says co-lead author Dr. Tobias Staudigl of the Donders Institute in the Netherlands.

Sleep is very important for the formation of stable, long-lasting memories. Researchers — as well as students during finals — have been aware of this for some time now, but we’ve never really had a good look at how our brains process memories during sleep.

The team set out to record and study these processes using a technique called Targeted Memory Reactivation, which is known to enhance memory. Boiled down, the technique involves playing back previously learned information (in this case, words in a foreign language) to a person while they are sleeping.

The researchers used electroencephalography to monitor the brain activity of each participant while they were learning and remembering the foreign vocabulary before sleep. Participants’ brain activity was recorded as they were sleeping, allowing the team to see what brain pathways activated upon them hearing the words.

Comparing neural signals fired by the brain in each state, the researchers were able to show clear similarities in brain activity.

“Although sleep and wakefulness might seem to have little in common, this study shows that brain activity in each of these states might be more similar than we previously thought,” says lead author Dr. Thomas Schreiner, of the University of Birmingham’s School of Psychology.

“The neural activity we recorded provides further evidence for how important sleep is to memory and, ultimately, for our well-being.”

In the future, the team plans to follow up on their research by investigating spontaneous memory activation during sleep.

So, students the world over, science has spoken: pulling an all-nighter to cram just isn’t a very good way to study. If it’s daytime, glue yourself to the book; at night, that head needs to go on a pillow.

The paper “Theta Phase-Coordinated Memory Reactivation Reoccurs in a Slow-Oscillatory Rhythm during NREM Sleep” has been published in the journal Cell Reports.