Tag Archives: reading

Fiction readers, rejoice: you probably have better language skills, study shows

When it comes to reading, non-fiction is often regarded as more useful, widening our horizons and improving our knowledge as well as our language ability. But new research contradicts that idea and brings fiction into the spotlight.

The study from Concordia University researchers found that those who read fiction (yes, even the accessible, popular stuff) have better language skills, scoring higher in language tests compared to those who read just to access specific information.

Image credit: Flickr / Paul Bence

“It’s always very positive and heartening to give people permission to delve into the series that they like,” Sandra Martin-Chang, lead author, said in a statement. “I liken it to research that says chocolate is good for you: the guilty pleasure of reading fiction is associated with positive cognitive benefits and verbal outcomes.”

Martin-Chang and her team used a scale called Predictors of Leisure Reading (PoLR) to investigate reading behavior, looking at readers’ interests, obstacles, attitudes, and motivations. Then, they looked at how well the PoLR predicted the language skills of 200 undergraduate students from York University.

The researchers chose to focus specifically on undergraduate students because they are in a crucial period of their reading life. Early adulthood is when reading becomes self-directed rather than imposed by others, making it a key period for developing one's own reading habits. It's also a relatively understudied group, as previous studies have focused on children.

For the study, the volunteers first completed a 48-question survey that measured various reading factors. They were then given language tests and a measure of reading habits called the Author Recognition Test – which asks respondents to pick names of fiction and non-fiction authors they are familiar with from a long list of real and fake names.
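For context, the Author Recognition Test is typically scored by counting the real authors a respondent checks and subtracting the fake "foil" names they also check. Here is a small illustrative sketch of that scoring logic; the names and exact scoring details used in this particular study may differ.

```python
# Toy illustration of Author Recognition Test scoring: hits minus false alarms.
def art_score(checked, real_authors, foils):
    hits = len(checked & real_authors)    # genuine authors the respondent recognized
    false_alarms = len(checked & foils)   # fake names mistakenly checked
    return hits - false_alarms

real = {"Margaret Atwood", "Stephen King", "Zadie Smith"}     # hypothetical item list
fake = {"Avery Collins", "Dana Whitfield"}                    # hypothetical foils
print(art_score({"Stephen King", "Avery Collins"}, real, fake))  # 1 hit - 1 false alarm = 0
```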

After looking at the data, the researchers found that reading enjoyment, positive attitudes, and deeply established interests predict better verbal abilities — and these traits were more strongly associated with exposure to fiction than non-fiction. For Martin-Chang, “wanting to read something over and over again and feeling connected to characters and authors are all good things.”

Previous studies have highlighted the benefits of reading, particularly of fiction, helping people develop empathy, theory of mind, and critical thinking. When we read, we strengthen several different “cognitive muscles,” which essentially makes reading the equivalent of a hardcore empathy workout.

Research also suggests that reading fiction is an effective way to enhance the brain’s ability to keep an open mind while processing information, a necessary skill for effective decision-making. A 2013 study found individuals who read short stories instead of essays had a lower need for cognitive closure – the desire to reach a quick conclusion in decision-making. Ultimately, reading fiction is also fun, reducing stress and all the pressure accumulated through the day.

Some high-level business leaders have long touted the virtues of reading fiction, while others focus more on non-fiction. Warren Buffett, CEO of Berkshire Hathaway, spends most of his day reading and recommends books every year, including fiction. SpaceX CEO Elon Musk says he learned to build rockets by reading books. Of the 94 books recommended by Bill Gates from 2012 to 2020, only nine were fiction.

Still, many adults don’t read fiction because at some point they came to believe that fiction is just a waste of valuable time that could be spent on something more productive. But as many studies show, that simply isn’t true. So finish this article, grab one of your Harry Potter books and a warm cocoa, and get reading!

The study was published in the journal Reading and Writing.

Our ability to read and write is housed in a ‘recycled’ part of the brain

New research is homing in on the mechanisms our brains use to process written language. 

A detail of the cuneiform script carved in basalt at the Van museum.
Image credits Verity Cridland / Flickr.

Given my profession, I’m quite happy that people can read and write. From an evolutionary standpoint, however, it’s surprising that we do. There’s no need for it in the wild, so our brains didn’t need to develop specific areas to handle the task, like they did with sight or hearing.

A new study looked into which areas of the brain handle this task, finding that we use a “recycled” brain area for reading. These structures were repurposed from the visual system and were originally involved in pattern recognition.

A change of career

“This work has opened up a potential linkage between our understanding of the neural mechanisms of visual processing and […] human reading,” says James DiCarlo, the head of MIT’s Department of Brain and Cognitive Sciences and the senior author of the study.

The findings suggest that even nonhuman primates have the ability to distinguish words from gibberish, or to pick out specific letters in a word, through a part of the brain called the inferotemporal (IT) cortex.

Previous research used functional magnetic resonance imaging (fMRI) to identify the brain region that activates when we read a word. Christened the visual word form area (VWFA), it handles the first step involved in reading: recognizing words in strings of letters, even in an unfamiliar script. This area is located in the IT cortex, which is also responsible for distinguishing individual objects from visual data. The team also cites a 2012 study from France that showed baboons can learn to identify words within bunches of random letters.

DiCarlo and co-author Stanislas Dehaene wanted to see if this ability to process text is a natural part of the primate brain. They recorded neural activity patterns from four macaques as the animals were shown around 300 words and 300 ‘nonwords’ each. Data was recorded at over 500 sites across the macaques’ IT cortices using surgically-implanted electrodes, and then fed through an algorithm that tried to determine whether the activity was produced by a word or a nonword.
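The article doesn't spell out the decoding pipeline, but the general approach (training a simple classifier to tell words from nonwords based on recorded activity) can be sketched along the following lines. The data shapes and the choice of logistic regression are assumptions for illustration, not the study's actual method.

```python
# Toy sketch: decode "word vs. nonword" from per-site firing rates with a linear classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_sites = 600, 500                 # ~300 words + ~300 nonwords, ~500 recording sites
X = rng.normal(size=(n_trials, n_sites))     # stand-in for firing rates per trial and site
y = rng.integers(0, 2, size=n_trials)        # 1 = word, 0 = nonword (random labels here)

decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, X, y, cv=5).mean()
print(f"cross-validated decoding accuracy: {accuracy:.2f}")  # ~0.5 on this random data
```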

“The efficiency of this methodology is that you don’t need to train animals to do anything,” says lead author Rishi Rajalingham. “What you do is just record these patterns of neural activity as you flash an image in front of the animal.”

Naturally good with letters

This model was 76% accurate at telling whether the animal was looking at a word or not, which is similar to the results of the baboons in the 2012 study.

As a control, the team ran the same analysis on recordings from a different brain area, one in the visual system that connects to the IT cortex. This control model was noticeably less accurate (57% vs. 76%), suggesting that the IT cortex, where the VWFA is located, is particularly suited to handle the processes involved in letter and word recognition.

All in all, the findings support the hypothesis that the IT cortex could have been repurposed to enable reading, and that reading and writing themselves are an expression of our innate object recognition abilities.

Of course, whether reading and writing arose naturally from the way our brains work, or whether our brains had to shift to accommodate them, is a very interesting question — one that, for now, remains unanswered. The insight gained in this study, however, could help to guide us towards an answer there as well.

“These findings inspired us to ask if nonhuman primates could provide a unique opportunity to investigate the neuronal mechanisms underlying orthographic processing,” says Dehaene.

The next step, according to the researchers, is to train animals to read and see how their patterns of neural activity change as they learn.

The paper “The inferior temporal cortex is a potential cortical precursor of orthographic processing in untrained monkeys” has been published in the journal Nature Communications.

Computers can now read handwriting with 98% accuracy

New research in Tunisia is teaching computers how to read your handwriting.

Image via Pixabay.

Researchers at the University of Sfax in Tunisia have developed a new method for computers to recognize handwritten characters and symbols in online scripts. The technique has already achieved ‘remarkable performance’ on texts written in the Latin and Arabic alphabets.

iRead

“Our paper handles the problem of online handwritten script recognition based on an extraction features system and deep approach system for sequence classification,” the researchers wrote in their paper. “We used an existent method combined with new classifiers in order to attain a flexible system.”

Handwriting recognition systems are, unsurprisingly, computer tools designed to recognize characters and hand-written symbols in a similar way to our brains. They’re similar in form and function to the neural networks that we’ve designed for image classification, face recognition, and natural language processing (NLP).

As humans, we innately begin developing the ability to understand different types of handwriting in our youth. This ability revolves around the identification and understanding of specific characters, both individually and when grouped together, the team explains. Several attempts have been made to replicate this ability in a computer over the last decade in a bid to enable more advanced and automatic analyses of handwritten texts.

The new paper presents two systems based on deep neural networks: an online handwriting segmentation and recognition system that uses a long short-term memory network (OnHSR-LSTM) and an online handwriting recognition system composed of a convolutional long short-term memory network (OnHR-covLSTM).

The first is based on the theory that our own brains work to transform language from the graphical marks on a piece of paper into symbolic representations. The OnHSR-LSTM works by detecting common properties of symbols or characters and then arranging them according to specific perceptual laws, for instance proximity or similarity. Essentially, it breaks the script down into a series of strokes, which is then turned into code; that code is what the program actually ‘reads’.

“Finally, [the model] attempts to build a representation of the handwritten form based on the assumption that the perception of form is the identification of basic features that are arranged until we identify an object,” the researchers explained in their paper.

“Therefore, the representation of handwriting is a combination of primitive strokes. Handwriting is a sequence of basic codes that are grouped together to define a character or a shape.”

The second system, the convolutional long short-term memory network, is trained to predict both characters and words based on what it reads. It is particularly well-suited to processing and classifying long sequences of characters and symbols.
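To make the general idea of LSTM-based sequence classification more concrete, here is a minimal sketch in PyTorch. It is not the authors' architecture (their systems add segmentation, perceptual feature extraction, and a convolutional LSTM); it simply shows how a recurrent network can map a sequence of pen-stroke features to a character class, with the feature layout and class count chosen for illustration.

```python
# Minimal sketch: classify a sequence of pen-stroke points, each an (x, y, pen-up) triplet.
import torch
import torch.nn as nn

class StrokeClassifier(nn.Module):
    def __init__(self, n_features=3, hidden=128, n_classes=62):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, strokes):             # strokes: (batch, time, n_features)
        _, (h_n, _) = self.lstm(strokes)    # h_n holds the final hidden state
        return self.head(h_n[-1])           # one score per character class

model = StrokeClassifier()
dummy = torch.randn(8, 120, 3)              # 8 sequences of 120 stroke points each
print(model(dummy).shape)                   # torch.Size([8, 62])
```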

Both neural networks were trained and then evaluated using five different databases of handwritten scripts in the Arabic and Latin alphabets. Both achieved recognition rates of over 98%, which the team describes as ‘remarkable’. Both systems, they explain, performed comparably to human subjects at the task.

“We now plan to build on and test our proposed recognition systems on a large-scale database and other scripts,” the researchers wrote.

The paper “Neural architecture based on fuzzy perceptual representation for online multilingual handwriting recognition” has been published on the preprint server arXiv.


Researchers are looking into giving AI the power of reading soldiers’ minds — to help them in battle

The US Army is planning to equip its soldiers with an AI helper. A mind-reading, behavior-predicting AI helper that should make operational teams run more smoothly.

Soldier-AI integration.

The Army hopes that giving AI the ability to interpret the brain activity of soldiers will help it better respond to and support their activity in battle.
Image credits US Army.

We’re all painfully familiar with the autocomplete features in our smartphones or on the Google page — but what if we could autocomplete our soldiers’ thoughts? That’s what the US Army hopes to achieve. Towards that end, researchers at the Army Research Laboratory (ARL), the Army’s corporate research laboratory, have been collaborating with members from the University of Buffalo.

A new study published as part of this collaboration looks at how soldiers’ brain activity can be monitored during specific tasks to allow better AI-integration with the team’s activities.

Army men

“In military operations, Soldiers perform multiple tasks at once. They’re analyzing information from multiple sources, navigating environments while simultaneously assessing threats, sharing situational awareness, and communicating with a distributed team. This requires Soldiers to constantly switch among these tasks, which means that the brain is also rapidly shifting among the different brain regions needed for these different tasks,” said Dr. Jean Vettel, a senior neuroscientist at the Combat Capabilities Development Command at the ARL and co-author of this current paper.

“If we can use brain data in the moment to indicate what task they’re doing, AI could dynamically respond and adapt to assist the Soldier in completing the task.”

The Army envisions the battlefield of the future as a mesh between human soldiers and autonomous systems. One big part of such an approach’s success rests on these systems being able to intuit what each trooper is thinking, feeling, and planning on doing. As part of the ARL-University of Buffalo collaboration, the present study looks at the architecture of the human brain, its functionality, and how to dynamically coordinate or predict behaviors based on these two.

So far, the researchers have focused on a single person, but the purpose is to apply such systems “for a teaming environment, both for teams with Soldiers as well as teams with Autonomy,” said Vettel.

The first step was to understand how the brain coordinates its various regions when executing a task. The team mapped how key regions connect to the rest of the brain (via bundles of white matter) in 30 people. Each individual has a specific connectivity pattern between brain regions, the team reports. They then used computer models to see whether activity levels in these regions could be used to predict behavior.

Each participant’s ‘brain map’ was converted into a computational model that was then simulated on a computer. What the team wanted to see was what would happen when a single region of a person’s brain was stimulated. A mathematical framework the team developed themselves was used to measure how brain activity became synchronized across the various cognitive systems in the simulations.
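The study used its own mathematical framework, which isn't detailed here. As a rough illustration of the general idea (simulating coupled brain regions on a connectivity matrix, nudging one region, and measuring how synchronized the network becomes), here is a toy Kuramoto-style phase-oscillator sketch; the oscillator model and all numbers are stand-ins, not the team's actual model.

```python
# Toy sketch: coupled phase oscillators on a connectivity matrix, with one region "stimulated".
import numpy as np

rng = np.random.default_rng(1)
n = 30                                    # number of brain regions
A = rng.random((n, n))
A = (A + A.T) / 2                         # symmetric stand-in connectivity matrix
omega = rng.normal(1.0, 0.1, n)           # natural frequencies of each region
omega[0] += 0.5                           # "stimulate" region 0 by shifting its frequency

theta = rng.uniform(0, 2 * np.pi, n)      # initial phases
dt, K = 0.01, 0.05
for _ in range(5000):                     # Euler integration of Kuramoto dynamics
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + K * coupling)

order = np.abs(np.exp(1j * theta).mean()) # 0 = incoherent, 1 = fully synchronized
print(f"synchronization (order parameter): {order:.2f}")
```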

Sounds like Terminator

“The brain is very dynamic,” Dr. Kanika Bansal, lead author on the work, says. “Connections between different regions of the brain can change with learning or deteriorate with age or neurological disease.”

“Connectivity also varies between people. Our research helps us understand this variability and assess how small changes in the organization of the brain can affect large-scale patterns of brain activity related to various cognitive systems.”

Bansal says that this study looks into the foundational, very basic principles of brain coordination. However, with enough work and refinement, we may reach a point where these fundamentals can be extended outside of the brain — to create dynamic soldier-AI teams, for example.

“While the work has been deployed on individual brains of a finite brain structure, it would be very interesting to see if coordination of Soldiers and autonomous systems may also be described with this method, too,” Dr. Javier Garcia, ARL neuroscientist and study co-author points out.

“Much how the brain coordinates regions that carry out specific functions, you can think of how this method may describe coordinated teams of individuals and autonomous systems of varied skills work together to complete a mission.”

Do I think this is a good thing? Both yes and no. I think it’s a cool idea. But if I’ve learned anything during my years as a massive sci-fi geek, it’s that AI should not be weaponized. Using such systems to glue combat teams closer together and help them operate more efficiently isn’t weaponizing them per se — but it’s uncomfortably close. Time will tell what such systems will be used for, if we develop them at all.

Hopefully, it will be for something peaceful.

The paper “Cognitive chimera states in human brain networks” has been published in the journal Science Advances.

Haven’t read a book lately? Blame Netflix, researchers say

You’re all up to date with the latest series, but the book on your nightstand is gathering dust — a situation more and more people are finding themselves in. A new study decries the drop in book readership, as more and more time is spent online and watching TV shows.

The old saying that every second a German buys a book no longer seems to hold. People are spending more time online and less time reading, researchers report.

The new study analyzed reading trends in Germany, finding that the number of people who buy books is shrinking. Last year, just 44% of Germans over the age of 10 (29.6 million people) bought a book. That number dropped by nearly 18% between 2013 and 2017, and among people aged 20 to 50, the drop was even steeper (between 24% and 37%).

Among the main reasons for this drop is competition. Reading books is an enjoyable pastime, but people are increasingly spending their time online and, notably, watching TV series — it’s no coincidence that companies like Netflix or Amazon are enjoying such tremendous success with their shows.

Watching things is often regarded as an “easier” way to spend your time, requiring less effort and often featuring less intricacy than books. There’s also social pressure — if your friends are watching the latest series, you also want to catch up and be up to date.

“There’s growing social pressure to constantly react and be tuned in so you don’t get left behind,” said Alexander Skipis, head of the Börsenverein, Germany’s publishers’ and booksellers’ association, in a statement accompanying the study, titled “Book buyers, where are you going?”.

However, this presents the book industry with an opportunity: life is already hectic, and the web and TV shows only make it even more so. Reading a book should be presented as a relaxing activity, a sort of time-out from daily life.

“People are yearning for a time-out,” said Skipis, stressing that all age groups reportedly have a “very positive” attitude towards books.

However, we shouldn’t interpret this as an overall decrease in book reading. Perhaps surprisingly, while fewer people are buying books, those who do are buying more than ever. The average buyer purchased 12 books last year, up from 11 in 2013, and the average amount spent jumped from around 117 euros ($138) to 137 euros.

So while the non-readers group is getting larger, the readers group is getting more passionate. A similar evolution was experienced by e-books: customer numbers went down, but overall purchases per person went up.

People are also finding more creative and time-efficient ways to incorporate reading into their lives. Some are using personalized apps for book recommendations, while others are taking books to rather unexpected places, like the gym.

An interesting takeaway, and perhaps an important lesson (although this wasn’t the focus of the study), is that the gap between the two groups (readers and non-readers) is growing larger and larger. We often talk about two different worlds, two societies hidden within one; here, too, the same trend is noticeable.


Learning to read changes your brain from stem to cortex, study finds

A new study found that the human brain has to patch together a network that handles reading by re-purposing areas deep inside the brain into visual-language interfaces. The team reports that the brain can undergo this process with surprising ease.

German text.

Evolutionarily speaking, reading is a very novel skill for humans — not to mention widespread, everyday reading and writing. Because of this, we didn’t develop a dedicated brain region to handle the process.

So what do you do if you’re a brain and you have to learn to make sense of these scribbles and marks for your human? You improvise, of course! Working within the bounds of the skull means this ‘improvisation’ is more of a ‘re-qualification’, as some areas of the visual cortex — usually handling complex shape recognition — get bent to the task, while some of the earliest areas of the brain take on a mediating role between the language and visual systems.

Old brain, new tricks

The fact that learning to read will cause physical changes in the brain, such as the creation of new pathways, isn’t exactly news. But until now, we’ve believed that the changes literacy brings about are confined to the cortex, the outer layer of the brain which handles higher functions and can adapt quickly to master new skills and overcome challenges.

However, a team led by Falk Huettig from the Max Planck Institute for Psycholinguistics found that the brain does a lot more heavy lifting to master literacy. The Max Planck team worked together with scientists from the Centre of Bio-Medical Research (CBMR) in Lucknow, India, and the University of Hyderabad to uncover how the brains of completely illiterate people change when they learn to read and write.

For the study, the researchers worked with people in India, where illiteracy is quite high (roughly 38% of the population), mostly due to poverty, and strongly skewed towards women. The team worked with an all-women group of participants, almost all of them in their thirties. They were recruited from the same social class in two villages in Northern India to take social factors out of the final results. Participants were further matched for handedness, income, and number of literate family members, and took two initial measures of literacy (letter identification and word-reading ability). Lastly, they each had their brains scanned in the city of Lucknow.

After the team had enough data to form a baseline, the women were given six months of reading training in their native tongue, Hindi, one of the official languages of India. Hindi is written in Devanagari, the distinctive flowing script of India and Nepal, a style of writing known as an alphasyllabary. In an alphasyllabary, you don’t write with single letters but with whole syllables, typically consonant-vowel pairings (written in that order).

At the start of their training, the vast majority of the participants couldn’t read a single word in Hindi. But after only six months, they reached roughly the same reading proficiency as a first-grader, quite an impressive result.

Deep re-purposing

Devanagari.

I don’t know what this says but I know it’s written in Devanagari.

“While it is quite difficult for us to learn a new language, it appears to be much easier for us to learn to read,” says Huettig. “The adult brain proves to be astonishingly flexible.”

The team reports that the functional reorganization we’ve talked about before extends all the way to the deep, early brain structures of the thalamus and the brainstem. These are very old brain areas, evolutionarily speaking, and are found universally in mammalian brains as well as in other species.

“We observed that the so-called colliculi superiores, a part of the brainstem, and the pulvinar, located in the thalamus, adapt the timing of their activity patterns to those of the visual cortex,” says Michael Skeide, scientific researcher at the Max Planck Institute for Human Cognitive and Brain Sciences (MPI CBS) in Leipzig and first author of the study.

These areas take on a sort of interface role, helping the visual cortex filter relevant visual stimuli — in this case, writing — from the wealth of information our eyes supply even before we become consciously aware of it. Skeide notes that “the more the signal timings between the two brain regions are aligned, the better the reading capabilities.” This would, of course, happen with practice, explaining why experienced readers can easily take in a text which would leave an aspiring reader scrambling for help.

The findings could help uncover the causes of reading disorders such as dyslexia. The condition has previously been linked to abnormal activity in the thalamus, an avenue of research Skeide says the team has “to scrutinize” considering they showed “that only a few months of reading training can modify the thalamus fundamentally.”

Finally, the findings should come as a boon to anyone currently struggling with illiteracy, especially in the West where it’s such a taboo subject and the object of social stigma.

The full article “Learning to read alters cortico-subcortical cross-talk in the visual system of illiterates” has been published in the journal Science Advances.


Does speed-reading really work? Not if you want to understand anything

What if you could read a book three times faster? That definitely sounds appealing, which is why speed-reading training and, most recently, apps are very popular. Research suggests, however, that for the most part speed-reading hurts comprehension. The best thing you can do to read faster, and still understand something, is to improve your language and vocabulary, scientists say.

speed reading

Image: Iris Reading

According to Keith Rayner from the University of California, San Diego and colleagues, who carried out a meta-analysis, there “is a trade-off between speed and accuracy”, and “it is unlikely that readers will be able to double or triple their reading speeds (e.g. from around 250 to 500–750 words per minute) while still being able to understand the text as well as if they read at normal speed”.

Some courses advertise that you’ll be able to read books as if you were flipping through a phone book, at more than 2,000 words per minute, which is anatomically impossible, Rayner says. Anything beyond 500 words per minute is improbable without drastically degrading comprehension. This limitation arises from the foveal viewing area — a small cone of only about 1 degree of visual angle, roughly the size of your thumb held at arm’s length.
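To get a sense of where that ceiling comes from, here's a rough back-of-the-envelope calculation using typical eye-movement figures from the reading literature; the exact numbers below are illustrative assumptions, not values taken from this paper.

```python
# Rough estimate of reading speed from typical fixation and saccade durations.
fixation_s = 0.225        # a typical fixation lasts roughly 200-250 ms
saccade_s = 0.030         # a typical saccade takes roughly 20-40 ms
words_per_fixation = 1.2  # skilled readers take in a bit more than one word per fixation

words_per_minute = 60 / (fixation_s + saccade_s) * words_per_fixation
print(f"approximate reading speed: {words_per_minute:.0f} wpm")   # ~280 wpm
```

Even with shorter fixations and more words taken in per fixation, this kind of arithmetic tops out around a few hundred words per minute unless the reader starts skipping text entirely.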

fovea

The fovea is the small depression located in the exact centre of the macula that contains a high concentration of cones but no rods, and this is where our vision is most sharp. While the normal field of vision for each eye is about 135 degrees vertically and about 160 degrees horizontally, only the fovea has the ability to perceive and send clear, sharply focused visual images to the brain. In other words, anything outside the foveal viewing area is the peripheral vision.

This simple biological fact bluntly contradicts speed-reading courses that advertise you’ll be able to read in zig-zag or even vertically by the time you finish.

“[…] processing words out of order from the sensible sequence of the sentence … or when some of the words are removed … as would happen when a speed reader uses a zigzag movement – impairs the ability to process and understand the words,” the researchers write in their paper.

Expert speed readers claim they can read 2,000 words per minute, but that has to come at a huge loss of comprehension for a novel text, researchers say.

Another biological constraint deals with working memory or RAM, for you computer geeks. Scientists say that the brain can hold around 3 to 5 chunks of information at a time. This makes parsing multiple lines of text simultaneously impossible — for most people at least.

Most speed reading courses teach people to read the words off the page without imagining the corresponding sounds in their minds (called subvocalization). The researchers note, however, that  “research on normal reading challenges this claim that the use of inner speech in silent reading is a bad habit”, because “there is evidence that inner speech plays an important role in word identification and comprehension during silent reading”.

The paper also examined the efficacy of speed-reading apps like Spritz. Unlike courses that try to retrain your eye movements, Spritz works by feeding you text one word at a time at a set rate, which increases with your confidence and training. Each word appears in the same place on the screen, so your eyes can stay fixed on that point while words flip by faster than you could hunt them down on a page. At first glance, it sounds like this should help you read faster, and it will, but not without sacrificing comprehension. It’s exactly the time “lost” moving between words that is essential to understanding a text: in the fractions of a second we spend jumping to the next word or line, the brain is hard at work processing the meaning of all those words.
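For illustration, the core of the rapid serial visual presentation (RSVP) idea behind apps like Spritz fits in a few lines. This is just a toy terminal sketch of the technique, not the app's actual code.

```python
# Toy RSVP reader: flash one word at a time at a fixed words-per-minute rate.
import sys
import time

def rsvp(text, wpm=300):
    delay = 60.0 / wpm                    # seconds each word stays on screen
    for word in text.split():
        sys.stdout.write(f"\r{word:<20}") # overwrite the previous word in place
        sys.stdout.flush()
        time.sleep(delay)
    sys.stdout.write("\n")

rsvp("Reading one word at a time feels fast, but it leaves no room to look back.", wpm=300)
```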

Apps like these don’t let you skip back, so essentially you might be turning a good book or article into a sprint with little to show but sore eyes at the end.

The average reader snails through prose at a rate of about 250-300 words per minute, which roughly equates to one page per minute. Previously, researchers discovered that as readers crested 600 words per minute, comprehension invariably dove below 75%. That’s not to say that people who read more than 600 wpm and still retain what they’ve read aren’t for real — they’re just a lot fewer than you’d think.

In the World Championship Speed Reading Competition, the top contestants typically read around 1,000 to 2,000 words per minute. Six-time World Speed Reading Champion Anne Jones, who last summer read Harper Lee’s Go Set a Watchman in 25 minutes and 31 seconds, is not convinced by the findings.

“For events, I train just like an athlete does. Of course, PR people want me to read as fast as I can for events such as last July’s Go Set a Watchman. That was an outstanding reading performance. I read the book, understood it, loved it and talked about it in depth to lots of journalists immediately afterwards. Because it looked effortless does not mean it was easy to do. I have 20 years’ experience of speed reading and I know exactly how to approach a reading task such as that,” she told The Guardian.

You could double your reading speed up to that upper bound (600 wpm), and some speed-reading techniques might help, like holding a pen or some other pointer as a cursor so your eyes don’t wander off. Depending on what you have to read, skimming and zig-zagging might work too. If the text is a bit more complex, though, it doesn’t seem to be worth it. The biggest jump in reading speed might be no secret at all: pure hard work. Read a lot, practice your vocabulary, and you might learn to read a lot faster than you currently do. “Slow and steady wins the race.”


Are you the only one who “hears” what you read? Science says no!

A new paper from New York University researchers suggests that most people do hear an internal voice while they’re reading. The insights from this analysis lend some support to theories that say auditory hallucinations are inner voices that are incorrectly identified as not belonging to the self.

So when you read something do you “hear” the words in your head, as if someone was talking to you from inside your brain? Have you ever wondered if you’re the only one who does? Or doesn’t?

Well, Ruvanee Vilhauer at New York University did. She got on Yahoo! Answers and combed through posts between 2006 and 2014 for relevant questions. She ended up with 24 questions pertaining to this phenomenon and 136 answers between them, in which people described their own experience when reading.

*Indecipherable cactus reading voice.*
Image via pixabay

Vilhauer analysed all the relevant content and looked for recurring themes and insights. In total, 82.5 percent of contributors said that they do hear an inner voice (or IRV – inner reading voice) when reading to themselves, and 10.6 percent said they didn’t. Out of the ones who reported hearing the voice when reading, 13 percent only do so sometimes. They said that various factors increased the likelihood of this happening, most often their interest in the text.

The experiences of the remaining 6.9 percent of contributors were unclear from their posts.

“We all hear our voices in our heads at times – even those of others we know – especially while reading,” said one Yahoo contributor.

Another thing Vilhauer was interested in was which voice, or voices, readers hear. About half of the contributors reported always hearing the same internal voice, usually their own but different in some way from the one they use to speak — for example in terms of pitch or emotional tone. Some of them described or implied that their IRV was the same as the inner voice they use for their thoughts.

Those that reported hearing different voices tended to switch between them depending on what character was speaking in a story. If the text was an email or text message from someone they knew, they heard the sender’s voice.

It’s not just forming the words in our minds as we read them, either. Answerers refer to their IRVs as being “audible” in some way and speak of them as having volume, depth, or accent. The issue of control over IRVs also came up, with some finding them distracting, unsettling, or even scary, while others found they could deliberately alter theirs or choose another one if they wanted.

But after reading all of this, you must be asking yourself: almost all of us internally hear the words as we read them, so why hasn’t anyone researched this before? Well, I can’t remember whether I started hearing this voice before I learned how to read or after, but it’s something I got so accustomed to that I take it for granted. Indeed, Vilhauer found that many people just assume that their experience of reading is the same as everyone else’s and just… kinda leave it at that.

On the other end of the spectrum, those that don’t hear anything when reading believe that it’s the norm.

“Nooo. You should get that checked out,” one Yahoo! user answered a question about this voice.

“NO, I’M NOT A FREAK,” another adds.

The usage of all-caps tells me otherwise, though.

Vilhauer thinks that psychologists have failed to study this phenomenon because, just like everyone else, they’ve assumed that there’s no variability here and that everyone experiences reading the same way. Since this is the first study to ever look at the issue, and given its somewhat unconventional methodology, further research is needed to confirm the findings. That being said, it’s awesome that someone thought of researching this.

The full paper, titled “Inner reading voices: An overlooked form of inner speech,” has been published online in the journal Psychosis and can be read here.



Overwhelming majority of college students prefer paper books to digital copies

Although ebooks and their corresponding electronic reading devices have become extremely popular, most young adults and children surprisingly prefer reading in print rather than digitally. Moreover, this trend seems to be on the rise after a momentary preference for ebook readers. For the publishing industry and ebook reader manufacturers this most certainly means a stalemate as far as sales go, yet the big picture is uncertain. A decade ago, most experts would have agreed that conventional paper books were set to become obsolete in favor of ereaders, which can store thousands of books. This has proven false. Books as we know them are here to stay and will most likely remain the main choice for reading, as far as serious literature goes at least.

ebook-vs-book

Credit: Citadinul

It’s a bit fuzzy who invented the first e-book. Some credit the first electronic document as being the Index Thomisticus, a heavily annotated electronic index to the works of Thomas Aquinas, prepared by Roberto Busa beginning in the late 1940s. However, this is sometimes omitted, perhaps because the digitized text was (at least initially) a means of developing an index and concordance, rather than a published edition in its own right. Though opinions are polarized, most people seem to agree that the first ebook – at least something similar to what we’d call one today – was created by Michael S. Hart, then at the University of Illinois. Using a Xerox Sigma V mainframe, Hart created the first electronic document by typing the United States Declaration of Independence into a computer in 1971. Project Gutenberg was subsequently launched to create electronic copies of more texts – especially books. As of January 2015, Project Gutenberg had over 47,975 items in its collection, most of which are public domain books, digitized and available for free.

By 2014, 28% of American adults had read an e-book, up from 23% in 2013, and 50% owned a dedicated reading device, either an e-reader or a tablet, compared to 30% at the end of 2013. Judging from these figures, one would say that ebooks have finally taken off, and they’d be right – but only half so. Printed books are still the norm, it seems, judging by the research of Naomi Baron, a professor of linguistics at American University. In her latest book, Words Onscreen: The Fate of Reading in a Digital World, Baron describes her findings after she and colleagues surveyed over 300 university students in the U.S., Japan, Germany, and Slovakia. When students were given a choice of various media (including hard copy, cell phone, tablet, e-reader, and laptop), a staggering 92 percent said they could concentrate best in hard copy.

“The group we assumed would gobble this up were teenagers and young adults,” says Baron. “But they talked about things I didn’t think 18 to 26-year-olds cared about anymore.”

Speaking to the New Republic when asked about some possible explanations for this, Baron said:

“There are two big issues. The first was they say they get distracted, pulled away to other things. The second had to do with eye strain and headaches and physical discomfort.

When I asked what they don’t like about reading on a screen – they like to know how far they’ve gone in the book. You can read at the bottom of the screen what percent you’ve finished, but it’s a totally different feel to know you’ve read an inch worth and you have another inch and a half to go. Or students will tell you about their visual memory of where something was on the page; that makes no sense on a screen. One student said, “I keep forgetting who the author is. In a print book all I have to do is flip back and I see it.” There are all kinds of reasons students will give: “I have a sense of accomplishment when I finish a book and I want to see it on the shelf.” They care about the smell of a book. In the Slovakian data, when I asked what do you like most about reading in hard copy, one out of ten talked about the smell of books. There really is a physical, tactile, kinesthetic component to reading.”

This mirrors a conversation I had with a friend only a couple of days ago. I was evangelizing the benefits of owning a Kindle and how superior I find it to conventional reading – though I still read paper books from time to time – but was constantly hit by a brick wall. Like Baron’s respondents, my friend loathed reading on an electronic device and found great pleasure in the simple act of turning pages, the smell of a book, and so on. There’s still a deeply ingrained association between reading and books – “real books”, as they’re called. It’s yet another example of the tension between romantic and classical thinking, the parts set against the whole.

In 2014, 65 percent of 6 to 17-year-old children said they would always want to read books in print – up from 60 percent two years earlier. So not only do most people prefer physical books, but this preference is on the rise. And these are young people, people who have been surrounded by computers, mobile phones, and various gadgets all their lives.

ZME Readers, physical books or ebooks? Why?