Tag Archives: eye

Our eyes have a focal point — but images don’t seem to focus on it, weirdly

New research says that if you want to see something better, you shouldn’t look directly at it. At least, that’s what our eyes seem to believe.

Image via Pixabay.

Researchers at the University of Bonn, Germany, report that when we look directly at something, we’re not using our eyes to their full potential. When we do this, they explain, light doesn’t hit the center of our foveas, where photoreceptors (light-sensitive cells) are most densely packed. Instead, the light (and thus the area where the image is perceived) is shifted slightly upwards and towards the nose relative to this central, highly sensitive spot.

While this shift doesn’t seem to really impair our perception in any meaningful way, the findings will help improve our understanding of how our eyes work and how we can fix them when they don’t.

I spy with my little eye

“In humans, cone packing varies within the fovea itself, with a sharp peak in its center. When we focus on an object, we align our eyes so that its image falls exactly on that spot — that, at least, was the general assumption so far,” says Dr. Wolf Harmening, head of the adaptive optics and visual psychophysics group at the Department of Ophthalmology at the University Hospital Bonn and corresponding author of the paper.

The team worked with 20 healthy subjects from Germany, who were asked to fixate on (look directly at) different objects while monitoring how light hit their retinas using “adaptive optics in vivo imaging and micro-stimulation”. An offset between the point of highest photoreceptor density and where the image formed on the retina was observed in all 20 participants, the authors explain. They hypothesize that this shift is a natural adaptation that helps to improve the overall quality of our vision.

Our eyes function similarly to a camera, but they’re not really the same. In a digital camera, light-sensitive elements are distributed evenly across the surface of the sensor, all with the same size, properties, and operating principles. Our eyes use two types of cells to pick up on light: rod and cone photoreceptors. Rods are useful for seeing motion in dim light, while cones are suited to picking out colors and fine detail in good lighting conditions.

Unlike in a camera, however, the photosensitive cells in our retinas aren’t evenly distributed. They vary quite significantly in density, size, and spacing. The fovea, a specialized central area of our retinas that produces the sharpest vision, has around 200,000 cone cells per square millimeter. At the edges of the retina, this can fall to around 5,000 per square millimeter, which is 40 times less dense. In essence, our eyes produce high-definition images in the middle of our field of view and progressively less-defined images towards the edges. Our brains kind of fill in the missing information around the edges to make it all seem seamless — but if you try to pay attention to something at the edge of your vision, you’ll notice how little detail you can actually make out there.

It would, then, seem very counterproductive to have the image of whatever we’re looking at directly form away from the fovea. Wouldn’t we want to have the best view of whatever we’re, you know, viewing? The team explains that this is likely an adaptation to the way human sight works: both eyes, side by side, peering out in the same direction.

All 20 participants in the study showed the same shift, slightly upwards and towards the nose relative to the center of the fovea. The offset was larger in some and smaller in others, but its direction was the same in every participant, and it was symmetrical between each person’s two eyes. Follow-up examinations carried out one year after the initial trials showed that these focal points had not moved in the meantime.

“When we look at horizontal surfaces, such as the floor, objects above fixation are farther away,” explains Jenny Lorén Reiniger, a co-author of the paper. “This is true for most parts of our natural surrounds. Objects located higher appear a little smaller. Shifting our gaze in that fashion might enlarge the area of the visual field that is seen sharply.”

“The fact that we were able to detect [this offset] at all is based on technical and methodological advances of the last two decades,” says Harmening.

One other interesting conclusion the authors draw is that, despite the huge number of light-sensitive cells our retinas contain, we only use a small fraction of them — around a few dozen — when focusing on a single point. What’s more, it’s probably the same cells throughout our lives, as this preferred spot doesn’t seem to move over time. Beyond being an interesting bit of trivia, this is also valuable for researchers trying to determine how best to repair eyes and restore vision following damage or disease.

The paper “Human gaze is systematically offset from the center of cone topography” has been published in the journal Current Biology.

A window to the brain? Pupil size linked to intelligence

The pupil is the central opening of the iris on the inside of the eye through which light passes before reaching the lens and being focused onto the retina. Credit: Pixabay.

Humans are hardwired to read emotional cues in a person’s facial expressions, including micro-expressions of the eye. In fact, research suggests that if you want to read a person’s true emotional state, stay away from the mouth (fake smiles, anyone?) and pay attention to the eyes, whose sensitive involuntary muscle contractions are much more difficult to conceal. Now, researchers at the Georgia Institute of Technology claim that it may be possible to even gauge a person’s intellect from the eyes, after finding a correlation between pupil size and differences in intelligence between individuals.

The larger the pupil, the higher a person’s fluid intelligence may be

According to the study, which involved more than 500 people aged 18 to 35 from Atlanta, larger pupils were associated with higher intelligence, as measured by standard tests meant to gauge reasoning, memory, and attention.

This relationship is so pronounced that a person should be able to predict with relative confidence who scored the highest or the lowest on an intelligence test just by looking at their pupils with the naked eye, no additional instruments required.

Each subject’s pupil size was assessed using eye trackers that detect light reflecting from the pupil and cornea. By recording over longer stretches of time, the researchers were able to compute each participant’s average pupil size.

The human pupil is between two and eight millimeters in diameter, but it’s never fixed. Pupils get bigger or smaller depending on the amount of light they receive. In low light, your pupils open up, or dilate, to let in more light. When it’s bright, they get smaller, or constrict, to let in less light.

To normalize pupil measurements, the researchers made sure to assess the pupil at rest when the participants were staring at a blank screen for a couple of minutes. Each participant also went through a barrage of tests that scored them on their ability to solve new problems, remember things over time, and keep focus even when distracted. These combined abilities are often referred to as fluid intelligence.
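
As a rough illustration of that baseline measurement, here is a minimal sketch of averaging a resting-state eye-tracker recording. The sampling rate, the two-minute window, and the blink handling are illustrative assumptions, not details taken from the study.

```python
import numpy as np

def baseline_pupil_size(samples_mm, sampling_rate_hz=60, window_s=120):
    """Average pupil diameter (mm) over a resting-state window.

    `samples_mm` is a 1-D array of pupil diameters from an eye tracker;
    invalid samples (e.g. blinks, encoded here as NaN) are ignored.
    """
    n_samples = int(sampling_rate_hz * window_s)
    window = np.asarray(samples_mm[:n_samples], dtype=float)
    return np.nanmean(window)

# Hypothetical recording: a ~4 mm pupil with some noise and a brief blink.
rng = np.random.default_rng(0)
recording = 4.0 + 0.2 * rng.standard_normal(60 * 120)
recording[1000:1010] = np.nan  # blink: the tracker loses the pupil
print(f"Baseline pupil size: {baseline_pupil_size(recording):.2f} mm")
```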

Pupil size is also known to diminish with age. But after the researchers corrected for age, the pupil size and intelligence link still held up.

The researchers are careful to stress that their association is a correlation and they do not have evidence of a causal link between pupil size and differences in intelligence. That being said, it wouldn’t be that crazy if pupil size did indeed reliably indicate a propensity for scoring high on metrics for intelligence.

Previously, researchers noticed that the pupil is influenced by the locus coeruleus (from the Latin for ‘blue spot’), which communicates closely with the amygdala. Neurons in this region are the main source of the neurotransmitter noradrenaline (norepinephrine), an excitatory chemical that is released in response to pain or stress, stimulating what is referred to as the ‘fight-or-flight’ mechanism.

According to the authors of the new study published in the journal Cognition, the locus coeruleus is heavily involved in organizing brain activity and coordinating distant regions of the brain to work together and accomplish different tasks. Loss of function in this critical brain region is linked to  Alzheimer’s disease, Parkinson’s, and attention deficit hyperactivity disorder (ADHD).

The researchers at the Georgia Institute of Technology speculate that a person’s pupils may be larger due to greater regulation of activity by the locus coeruleus, which may lead to better cognitive performance.

“Additional research is needed to explore this possibility and determine why larger pupils are associated with higher fluid intelligence and attention control. But it’s clear that there is more happening than meets the eye,” Jason S. Tsukahara, Ph.D. student at the Georgia Institute of Technology and lead-author of the study, wrote in an article for Scientific American.

But since pupil size varies so much with the time of day, it might not be a good idea to stare someone dead in the eyes simply to assess whether they’re worth talking to. 

Artificial eye paves the way for cyborg vision

Researchers have devised an artificial eye that mimics the structure of the human eye, which has important applications in robotics, scientific measurements, as well as cyborg-like prosthetics that restore vision.

Artist impression of an artificial eye. Credit: Yaying Xu.

The proof-of-concept, which was recently described in the journal Nature by a team led by Zhiyong Fan from the Hong Kong University of Science and Technology, is about as sensitive to light as its natural counterpart. What’s more, it even has a faster reaction time than the real thing (30 to 40 milliseconds, rather than 40 to 150 milliseconds).

The human eye is nothing short of spectacular — and much of what it’s capable of doing is owed to the dome-shaped retina, an area at the back of the eyeball that is covered in light-detecting cells.

There are around ten million photoreceptor cells per square centimeter, enabling a wide field of view and excellent resolution that has yet to be replicated by any man-made technology.

For many years, scientists have sought to replicate these characteristics in synthetic eyeballs. However, such efforts proved extremely challenging due to the inherent difficulties in mimicking the shape and composition of the human retina.

Fan and colleagues devised a hemispherical artificial retina, measuring only two centimeters in diameter and containing densely packed, light-sensitive nanowires made from a perovskite — a promising material that is very popular in solar cell manufacturing. The purpose of these nanowires is to mimic the photoreceptors of the human eye.

Schematic of the artificial eye. Credit: H. JIANG/NATURE 2020.

The artificial eye’s hollow center is filled with a conductive fluid, whereas the human eye is filled with a clear gel called vitreous humour.

In an experiment, the artificial eye was hooked up to a computer and could “see” by reconstructing the letters ‘E’, ‘I’, and ‘Y’.

However, this is a far cry from the capabilities of the biological eye. The array consists of just 100 pixels, where each pixel corresponds to three nanowires.

This is a proof of concept, though, and Fan is confident that his design can be scaled up so that the artificial eye obtains a resolution even higher than that of the human eye. According to Fan and colleagues, the density of nanowires can be increased to cover ten times the number of photoreceptors in the human eye.

Each nanowire could theoretically function as a small solar cell, which means that artificial eyes might not require an external power source as the researchers’ device currently requires.

The researchers envision applications in scientific measurements and advanced robotics. But, theoretically, the artificial eye could also be connected to an optic nerve, enabling the brain to process information received from the device like it would with a real eye. That prospect, however, is years and years away — but it’s still incredibly exciting.

Gold-infused contact lenses that treat red-green color blindness could hit the market soon

New research is aiming to bring color back into the lives of the color-blind.

Image credits n4i Photo / Flickr.

Color blindness can manifest itself in several ways, from people seeing certain colors in muted shades to not perceiving some at all. Needless to say, this is not the most enjoyable way to live your life, and it can cause real issues with color cues, such as difficulties reading a traffic light. Some of our fixes so far include tinted glasses and dyed contact lenses, but they all have their shortcomings. The glasses can’t also correct vision (so some people need to pick which condition to fix), and the lenses can be unstable and potentially harmful if not used properly.

A new paper, however, reports on a new approach that can help address this issue: infusing contact lenses with gold particles.

Blingvision

Color blindness is a genetic disorder, so for now, our best approach is to treat its symptoms. The main issue with contact lenses employed for this purpose is that, although they are effective in improving red-green color perception, clinical trials have shown that they can leach the pigments they’re dyed with, potentially harming users’ eyes.

The current paper describes how the authors used gold nanocomposite materials to produce lenses with the same effect, but no dye. This process has been used for centuries already to produce ‘cranberry’ glass, they explain, and comes down to how the gold scatters light going through the glass.

In order to produce them, the team put together an even mix of gold nanoparticles and a hydrogel polymer. The end result was a rose-tinted gel that filters light within the 520-580 nm range, the band where red and green wavelengths overlap. Several types of nanoparticles were tested, and those around 40 nm in diameter proved the most effective. During lab testing, lenses built with nanoparticles of this size did not clump, nor did they over-filter the color.

The lenses have the same water-retention properties as commercial lenses, and were non-toxic to cell cultures in the lab.

The team then compared their lenses’ efficiency against two commercially-available pairs of tinted glasses and the pink-dyed contact lenses. The gold-infused lenses blocked a narrower band of the visible spectrum, and a similar amount of light within it to the dyed contact lenses. This suggests that the gold nanocomposite lenses would be effective for people with red-green colorblindness, but without the health concerns.

The lenses will now undergo clinical trials to assess their efficiency, safety, comfort, and practicality with human patients in real-life situations. If they pass, we could see them available commercially.

The paper “Gold Nanocomposite Contact Lenses for Color Blindness Management” has been published in the journal ACS Nano.

Bigger boost in robot’s field of view

Oregon State University’s team has earned some serious bragging rights: they’ve come up with an optical sensor that can mimic the human eye. Think of robots, ones that are built to track moving objects. Roboticists dealing with such machines wouldn’t have to play with complex image processing anymore — they could rely on this optical sensor to do the job.

The human eye, while not nearly as capable as some of its counterparts from the animal kingdom, is still a magnificent structure. Replicating its functionality in robots has proven immensely challenging, but the OSU team’s work brings us one step closer, as their robot eye closely matches the human eye’s ability to perceive changes in its visual field.

Due to the way the team’s sensor works, a static item in the robot’s field of view would draw no response. A moving object, however, would register a high voltage. Science Focus summed up the importance of their work thusly:

“Currently, computers receive information in a step-by-step way, processing inputs as a series of data points, whereas this technology helps build a more integrated system. For artificial intelligence, researchers are attempting to build on human brains which contain a network of neurons, communicating cells, able to process information in parallel.”

For example, the OSU team proceeded to simulate an array of “retinomorphic” (human eye-type) sensors that predict how a retina-like video camera would respond to visual stimuli. The idea was to input videos into one of these arrays and process that information in the same way a human eye would. For instance, one such simulation shows a bird flying into view, then all but disappearing as it stops at an invisible bird feeder. The bird reappears as it takes off. The feeder, swaying, becomes visible only as it starts to move. But you don’t just need the eye, you also need the processing power — which in the case of humans, is provided by the brain. The OSU team also tried to replicate that.

The team’s paper appears in Applied Physics Letters, explaining that “neuromorphic computation is the principle whereby certain aspects of the human brain are replicated in hardware. While great progress has been made in this field in recent years, almost all input signals provided to neuromorphic processors are still designed for traditional (von Neumann) computer architectures.”

You may have already read about researchers exploring devices that behave like eyes, especially retinomorphic devices. But previous attempts to build a human-eye type of device relied on software or complex hardware, said John Labram, Assistant Professor of Electrical and Computing Engineering.

The Science Focus piece describes why he stepped up to this kind of research effort. Labram was “initially inspired by a biology lecture he played in the background, which detailed how the human brain and eyes work.” Our eyes are very sensitive to changes in light, the piece explains, but less responsive to constant illumination. This marked the core of a new approach for devices that mimic photo-receptors in our eyes.

The innovation in this work lies mostly in the materials and the technique they used. The authors discuss how “a simple photosensitive capacitor will inherently reproduce certain aspects of biological retinas.” Their design involves using ultrathin layers of perovskite semiconductors — perovskite being a mineral also used for solar panels, among others. The perovskite is a few hundred nanometers thick and works as a capacitor that varies capacitance under illumination.

These change from strong electrical insulators to strong conductors when exposed to light. “You can think of it as a single pixel doing something that would currently require a microprocessor,” said Labram, for the university’s news site.
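
To build some intuition for that change-sensitive behaviour, here is a minimal numerical sketch of a pixel whose output spikes when illumination changes and decays back toward zero under constant light. This is an illustration only, not the team’s device model; the decay constant and scaling are made-up values.

```python
import numpy as np

def retinomorphic_response(light, tau=5.0):
    """Toy model of a change-sensitive pixel.

    The output spikes when illumination changes and decays back toward zero
    while the light level stays constant. `tau` (in time steps) is an
    illustrative decay constant, not a measured device parameter.
    """
    out = np.zeros_like(light, dtype=float)
    for t in range(1, len(light)):
        # Respond to the change in light, then let the response leak away.
        out[t] = out[t - 1] * np.exp(-1.0 / tau) + (light[t] - light[t - 1])
    return out

# A step of light: dark, then constant bright light, then dark again.
light = np.concatenate([np.zeros(20), np.ones(40), np.zeros(20)])
response = retinomorphic_response(light)

print(response[19:24])  # spike when the light turns on
print(response[40:45])  # near zero under steady illumination
print(response[59:64])  # opposite-sign spike when the light turns off
```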

Their human eye-like sensor would not just be useful for object-tracking robots, though. Consider that “neuromorphic computers” belong to the next generation of artificial intelligence, in applications like self-driving cars. Traditional computers process information sequentially as a series of instructions; neuromorphic computers emulate the human brain’s massively parallel networks, said the OSU report.

The human eye can tell day from night with three types of cells

The circadian rhythm — our biological internal clock that regulates the sleep-wake cycle and resets every 24 hours — plays a major role in health. How exactly our bodies are able to synchronize with day-night cycles has been a matter of debate. Now, a new study discovered that human eyes have three types of specialized cells that sense light with important applications in preventing circadian rhythm disruptions.

Credit: Pixabay.

Researchers at the Salk Institute developed a new method that can keep retina samples healthy and functional well after the donor’s death. Such samples were placed on an electrode grid that allowed the researchers to study how the retina reacted to light.

Several colors of light were tested, which showed that a small group of cells in the retina — known as intrinsically photosensitive retinal ganglion cells (ipRGCs) — started firing about 30 seconds after they interacted with a pulse of light. After the light was turned off, the cells took several seconds to stop firing.

The cells were the most sensitive to blue light, which is the type of light used in LCD screens employed by most smartphones and laptops. Blue light also inhibits the production of melatonin, keeping us awake and messing with our natural sleep cycles.

Follow-up experiments revealed that there are, in fact, three types of ipRGCs.

  • Type 1 responds to light relatively quickly but takes a long time to turn off;
  • Type 2 is slower both to turn on and to turn off;
  • Type 3 responds only when the light is very bright, but turns on faster than types 1 and 2 and switches off as soon as the light source is gone.

These cells may explain some very peculiar findings reported by other studies. For instance, blind people are able to align their sleep-wake cycles and circadian rhythms to the day-night cycle despite not being able to see. The new study may explain how they were able to sense light despite their visual impairment.

“We have become mostly an indoor species, and we are removed from the natural cycle of daylight during the day and near-complete darkness at night,” said Satchidananda Panda, senior author of the study and a professor at the Salk Institute.

“Understanding how ipRGCs respond to the quality, quantity, duration, and sequence of light will help us design better lighting for neonatal ICUs, ICUs, childcare centers, schools, factories, offices, hospitals, retirement homes and even the space station,” he added.

Although ipRGCs are responsible for sending light signals to the brain, they also work closely with rods and cones. The researchers believe that the ipRGCs may combine their light sensitivity with the light detected by purely visual cells to enhance brightness and add contrast.

“This adds another dimension to designing better televisions, computer monitors and smartphone screens in which changing the proportion of blue light can trick the brain into seeing an image as bright or dim,” says Panda.

In the future, the researchers plan on conducting more experiments on ipRGCs under different conditions of light color, intensity, and duration. The authors are also interested in how the cells will react to sequences of light (blue that turns into orange or vice-versa, for instance).

By understanding how each of these specialized light-sensing cells functions in the eye, the researchers say it is possible to unlock an entirely new spectrum of applications. For example, the insights could be used to design indoor lights that offer better day-night synchronization or which — why not — improve our moods.

“It’s also going to open a number of avenues to try new drugs or work on particular diseases that are specific to humans,” says Ludovic Mure, a postdoctoral researcher in the Panda lab and first author of the new study.

The findings were reported in the journal Science.

Just thinking about an object’s brightness is enough to change pupil size

Our pupils get bigger in response to dark conditions in order to allow more light into our eyes. Conversely, in bright conditions the pupils contract. Interestingly, a new study found that simply imagining a dark or bright light source is enough to change pupil size, even in the absence of visual stimuli.

Credit: Pixabay.

Nahid Zokaei, a researcher at the University of Oxford, along with colleagues, performed a series of experiments in which 22 men and women were shown dark and light patches. Each patch was associated with a specific sound.

After they learned to associate the sounds with the patches, during one experiment the patches flashed on a screen and would disappear after being displayed for only two seconds. The participants then had to imagine the correct corresponding patch when they heard a certain sound.

Amazingly, the study revealed that simply by thinking of a dark patch, the participants’ pupils would enlarge. The opposite happened with imagined bright patches, which prompted the pupils to contract.

“The results provide surprising and consistent evidence that pupil responses are under top-down control by cognitive factors, even when there is no direct adaptive gain for such modulation, since no visual stimuli were presented or anticipated. The results also strengthen the view of sensory recruitment during working memory, suggesting even activation of sensory receptors,” the researchers wrote in the journal PNAS.

These are exactly the same results you would expect to see when physically looking at bright or dark objects — another testament to the power of our minds.

“The thought-provoking corollary to our findings is that the pupils provide a reliable measure of what is in the focus of mind, thus giving a different meaning to old proverbs about the eyes being a window to the mind,” the authors remarked.


Scientists are now able to bio-print corneas

This research could usher in corneas-on-demand, offering hope for the millions of patients awaiting transplant.

Eye macro photography.

Image via Publicdomainpictures.

Researchers at Newcastle University, UK, have successfully 3D-printed human corneas — a world first. Their technique could eventually lead to a cornea mass-production system that could help the millions of people waiting for a transplant.

A feast for the eye

The cornea is the outer layer of the human eye and plays a central role in focusing our vision. It’s also a part of the eye that doesn’t always age gracefully and is susceptible to damage from infections or disease. As such, there are over 10 million people worldwide who risk corneal blindness from diseases such as trachoma (an infectious eye disease), and almost 5 million who are completely blind due to burns, lacerations or abrasion of the cornea.

Most of them are awaiting a transplant, but there are very few donors.

The team’s work aims to address this shortage. They used a mix of human corneal stromal (stem) cells harvested from donated healthy corneas, alginate, and collagen to create a firm but printable bio-ink. This material is based on previous work, in which the team developed a similar hydrogel that could keep cells alive for weeks at a time.

They fed this substance through a simple, low-cost 3D bio-printer into concentric circles roughly the shape of a human cornea. According to their scientific paper, it took under 10 minutes to print their proof-of-concept cornea. The final step is allowing this structure to grow into a cornea on a culture dish.

“Many teams across the world have been chasing the ideal bio-ink to make this process feasible,” says lead researcher Che Connon, a Professor of Tissue Engineering at Newcastle University.

“[The gel] keeps the stem cells alive whilst producing a material which is stiff enough to hold its shape but soft enough to be squeezed out the nozzle of a 3D printer.”

The team also showed they can build corneas to match a patient’s unique needs and specifications. The dimensions required for this were originally taken from an actual cornea, the team writes. In the future, a simple scan of a patient’s eye will enable doctors to print a cornea that perfectly matches the size and shape of their eyeballs.

The 3D-printed corneas will have to undergo a lot of testing, probably over the span of a few years, before they’ll even be considered for use in transplants, the team explains. However, the ability to produce enough of them to treat everyone awaiting a transplant, as well as the precision with which they can be crafted, is a game-changing prospect — one that’s bound to spur on further development.

The paper “3D bioprinting of a corneal stroma equivalent” has been published in the journal Experimental Eye Research.

The incredibly mobile and efficient eyes of the mantis shrimp

The mantis shrimp may measure mere inches long, but it packs the fastest punch in the animal world. They use their forelimbs like clubs, striking with such speed that pockets of seawater vaporize and implode, wreaking havoc on their prey — as well as aquarium walls and even human thumbs. Coordinating such an extremely quick punch requires exceptional eyesight.

Now, researchers have finally figured out what makes the mantis shrimp’s eyes so amazing.

The extraordinary eyes of the stomatopod Odontodactylus scyllarus are capable of independent rotation in all three axes of rotation. Image credits: Michael Bok.

Mantis shrimp vision is extraordinary — we already knew that. Not only do they have extremely keen color vision (humans have just three types of color photoreceptors, whereas mantis shrimp use around a dozen), but they also have the ability to see the polarisation of light. Now, researchers have shown that they have extremely mobile eyes that never stop moving — in direct contrast to most other creatures, which try to limit eye movement as much as possible to avoid blurring.

Out of the 450 or so species of mantis shrimp, one of the best studied is the peacock mantis shrimp, Odontodactylus scyllarus. The compound eyes of this shrimp are perched on the ends of supportive stalks and can move independently in all three axes of rotation: pitch (up-down), yaw (side-to-side) and roll (twisting about the eye-stalk). Amazingly, they always know which way is up, regardless of what their eyes are doing.

A Bristol-led team of researchers based at the University’s Ecology of Vision Laboratory wanted to test the limits of this mobility, seeing at what point they steady their gaze and stop their eyes from moving. The results were surprising.

Closeup of a mantis shrimp showing the structure of the eyes. Image credits: Alexander Vasenin.

Researchers found that while mantis shrimp do make stabilizing side-to-side movements that help keep their vision steady, they don’t really stop rolling their eyes. For any other creature, the purpose of stopping your eyes is to stabilize vision and prevent blurring, but the mantis shrimp continues to roll its eyes, which means that ‘up’ becomes ‘sideways’ — and yet somehow, through all this bizarre eye movement, the creature still knows which way is up.

Ilse Daly from Bristol’s School of Biological Sciences and lead author of the study, explains:

“It would be like you tipping your head on its side, then back to normal and all angles in between all while trying to follow the motion of a target.”

“Just to make things even more confusing, the left and right eyes can move completely independently of one another, such that one eye could be oriented horizontally, while the other could be twisted completely through 90 degrees to be on its side.”

In order to figure this out, researchers got a bit creative — and a bit cruel: they made the world rotate around the mantis shrimp, deliberately inducing severe vertigo, as you’d experience on some of the wilder carnival rides. However, this didn’t seem to faze the shrimp at all. Researchers expected the shrimp to move their eyes to counteract the spinning world around them, but they didn’t. Daly added:

“We expected that, in response to the world around them apparently rolling, mantis shrimp should roll their eyes to follow their surroundings. They did not.”

“The mantis shrimp visual system seems entirely immune from any negative effects of rolling their eyes. Indeed, it appears as though rolling has absolutely no effect on their perception of space at all: up is still up, even when their eyes have rolled completely sideways. This is unprecedented in the animal kingdom.”

All of this suggests that the bearing of the shrimps’ eyes doesn’t have anything to do with their perception of space, which would make them unique in the animal world.

Journal Reference: Ilse M. Daly, Martin J. How, Julian C. Partridge, Nicholas W. Roberts. Complex gaze stabilization in mantis shrimp. Proceedings of the Royal Society B: Biological Sciences, DOI: 10.1098/rspb.2018.0594


6 Reasons Why Your Eye is Twitching

lobster

Credit: Giphy.

There you are, minding your own business, when your eye starts to spasm out of control. It’s a very annoying feeling which can persist for hours. But is there any cause for alarm? Trinidadians have quite a rich collection of superstitions concerning jumping eyes. For instance, if your right eye jumps, you are going to hear good news, and if your left eye jumps, you are going to hear bad news. Superstitions aside, the short answer is that a twitching eye is totally fine, although in some exceptional cases it may be a sign of an underlying neurological condition.

Eye spasms are also called eyelid spasms or eyelid tics. Myokymia, as doctors call it, is characterized by spontaneous, fine fascicular contractions of muscle without muscular atrophy or weakness. You might have experienced an involuntary muscle spasm in your knee or elbow. The same can happen to the orbicularis oculi muscle (the muscle in the face that closes the eyelids).

Eye spasms are typically unilateral, occurring in one of the lower eyelids. Sometimes, both eyelids may be involved, but the fascicular contractions of each eyelid are independent of each other. The transient and intermittent twitching is semi-rhythmic, at a rate of 3-8 Hz.

In the majority of cases, eyelid spasms are no cause for concern. These are usually self-limiting and benign, so medical intervention is typically unnecessary. Very rarely, eyelid myokymia may occur as a precursor of hemifacial spasm, blepharospasm, Meige syndrome, spastic-paretic facial contracture, and multiple sclerosis.

Here are some of the most common reasons why your eye might be twitching.

Stress

This is the number-one reason for a twitching eye, according to ophthalmologists. When we’re under a lot of pressure, the body releases stress hormones such as cortisol, which trigger a "fight or flight" response. One immediate consequence is heightened muscle arousal, which may affect the eyelid muscles as well. This may be a good time to see your friends, unwind, or meditate in order to reduce the stress that might be causing the twitching eye.

Fatigue

Not getting enough sleep or working too much overtime may cause your eye to complain. Scientists aren’t sure why, but they’ve found that getting more rest will make the symptoms subside.

If you’re chronically tired and have eye twitches — this is probably the first thing you should check out.

Dry eyes

More and more people work office jobs that involve a lot of sitting and staring at a computer screen. This sort of lifestyle can lead to dry eye syndrome (DES), also known as keratoconjunctivitis sicca (KCS). Dry eye syndrome is caused by a chronic lack of sufficient lubrication and moisture on the surface of the eye. When our eyes are too dry, they might involuntarily start twitching to keep them moisturized. This repeated blinking can trick the brain into making one or both eyes twitch even more. To avoid DES, use eye drops if you use the computer for more than seven hours a day and make sure to look away from the screen at least once every 20 minutes. Learn more about the 20:20:20 rule to avoid eye strain.
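
If you want a nudge to actually follow that rule, here is a minimal sketch of a 20-20-20 reminder that runs in a terminal. It is just an illustration; any phone or calendar timer does the same job.

```python
import time

WORK_MINUTES = 20   # look at the screen for 20 minutes...
BREAK_SECONDS = 20  # ...then at something about 20 feet away for 20 seconds

while True:
    time.sleep(WORK_MINUTES * 60)
    print("\a20-20-20 break: look at something about 20 feet away.")  # \a rings the terminal bell
    time.sleep(BREAK_SECONDS)
    print("Back to work.")
```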

Caffeine and alcohol

Caffeine and alcohol have seemingly polar effects on the body. One is a stimulant while the other is a depressant — but both can bring on a twitching eye when used in excess. It’s important to stay hydrated and to avoid real or artificial sugars when this happens.

While there is some contradiction in the scientific literature about caffeine, it’s pretty clear that alcohol isn’t really good for you. Eye twitching is only one of the many health issues potentially caused by alcohol.

Mineral deficiencies

A jumping eye might be triggered by magnesium deficiency. If the twitch persists, it’s a good idea to get your magnesium levels checked with a simple blood test. If you really are magnesium deficient, you should focus on eating more foods like almonds, oatmeal, or spinach. You can also take magnesium supplements to meet daily Mg needs. Overall, less than 30% of U.S. adults consume the Recommended Daily Allowance (RDA) of magnesium. And nearly 20% get only half of the magnesium they need daily to remain healthy.

Eyelid infection

Blepharitis occurs when bacteria infect your eyelids, causing inflammation and redness. This makes the muscles around the eye twitchy. Other symptoms include burning or stinging eyes, dandruff-like flakes at the base of the eyelashes, and grittiness. Treatment of blepharitis should begin with a visit to your eye doctor to determine the cause of your sore, red, itchy eyelids. Eyelid hygiene is very helpful for treating and controlling blepharitis, and a good place to begin is applying a clean, warm, wet compress to melt any blocked residue around the eyelid. Repeat this several times a day.

 

Artificial Intelligence can tell you your blood pressure, age, and smoking status — just by looking at your eye

Eyes are said to be the window to the soul, but according to Google engineers, they’re also the window to your health.

The engineers wanted to see if they could determine some cardiovascular risks simply by looking at a picture of someone’s retina. They developed a convolutional neural network — a feed-forward algorithm inspired by biological processes, particularly the connection patterns between neurons, and commonly used in image analysis.

This type of artificial intelligence (AI) analyzes images holistically, without splitting them into smaller pieces, based on their shared similarities and symmetrical parts.

The approach became quite popular in recent years, especially as Facebook and other tech giants began developing their face-recognition software. Scientists have long proposed that this type of network can be used in other fields, but due to the innate processing complexity, progress has been slow. The fact that such algorithms can be applied to biology (and human biology, at that) is astonishing.

“It was unrealistic to apply machine learning to many areas of biology before,” says Philip Nelson, a director of engineering at Google Research in Mountain View, California. “Now you can — but even more exciting, machines can now see things that humans might not have seen before.”

Observing and quantifying associations in images can be difficult because of the wide variety of features, patterns, colors, values, and shapes in real data. In this case, Ryan Poplin, Machine Learning Technical Lead at Google, used AI trained on data from 284,335 patients. He and his colleagues then tested their neural network on two independent datasets of 12,026 and 999 photos, respectively. They were able to predict age (to within 3.26 years) and, within acceptable margins, gender, smoking status, systolic blood pressure, and major adverse cardiac events. The researchers say the results were similar to those of the European SCORE system, which relies on a blood test.
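
To make the idea more concrete, here is a minimal sketch of the kind of convolutional network involved, not the team’s actual architecture. The layer sizes, input resolution, and the single-output regression head (predicting age from a fundus photo) are illustrative assumptions only.

```python
import tensorflow as tf

def build_fundus_regressor(input_shape=(224, 224, 3)):
    # Stacked convolution + pooling blocks extract image features;
    # a single linear output predicts a continuous target such as age.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=input_shape),
        tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(128, 3, activation="relu", padding="same"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mae")  # mean absolute error, e.g. in years
    return model

model = build_fundus_regressor()
model.summary()
```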

To make things even more interesting, the algorithm uses distinct aspects of the anatomy to generate each prediction, such as the optic disc or blood vessels. This means that, in time, each individual detection pattern can be improved and tailored for a specific purpose. Also, a data set of almost 300,000 images is relatively small for a neural network, so feeding more data into the algorithm can almost certainly improve it.

Doctors today rely heavily on blood tests to determine cardiovascular risks, so having a non-invasive alternative could save a lot of costs and time, while making visits to the doctor less unpleasant. Of course, for Google (or rather Google’s parent company, Alphabet), developing such an algorithm would be a significant development and a potentially profitable one at that.

It’s not the first time Google engineers have dipped their toes into this type of technology — one of the authors, Lily Peng, published another paper last year in which she used AI to detect blindness associated with diabetes.

Journal Reference: Ryan Poplin et al. Predicting Cardiovascular Risk Factors from Retinal Fundus Photographs using Deep Learning.  arXiv:1708.09843

Drinking tea might help reduce glaucoma risk, new study concludes

An intriguing study has found a correlation between drinking tea and a reduced risk of glaucoma. However, don’t start brewing an extra cup just yet, because no causation has been established.

Image credits: Conger Design.

Glaucoma is an eye condition in which fluid pressure inside the eye causes systematic damage to the optic nerve. Most types of glaucoma have no symptoms and can creep in and lead to blindness if they are not detected and treated early. Writing in the British Journal of Ophthalmology, researchers in the US describe the association between consumption of coffee, tea or soft drinks and this condition, which affects over 2.7 million people in America alone, and almost 60 million worldwide.

They surveyed 1,678 participants of the 2005–2006 National Health and Nutrition Examination Survey (NHANES). Across them, the prevalence of glaucoma was 5.1% (84 people). There was no significant association between glaucoma and consumption of caffeinated or decaffeinated coffee, iced tea, or soft drinks. But when it came to hot tea, things were quite different. As it turns out, tea drinkers are at a significantly lower risk of glaucoma.

Participants who consumed at least six cups a week were 74% less likely to have glaucoma compared with those who did not consume hot tea. To make things even more interesting, no such association was found for decaffeinated hot tea.

“In summary, individuals who consumed hot tea were less likely to have a diagnosis of glaucoma compared with those who did not consume hot tea,” the authors write.

Of course, this is just a correlation and no causation has been properly explored, but the fact that the same results didn’t carry over to decaf tea seems to suggest that certain plant chemicals — such as flavonoids and other antioxidants found in tea — provide a protective effect to the eyes. But this still doesn’t prove anything, and it doesn’t really explain why the same effect wasn’t reported in iced tea.

There are also other limitations to the study, including a lack of information on the tea that was drunk, a limited sample size for both people with glaucoma and tea drinkers, and possible errors in diagnosis. Still, this is good news if you’re a tea drinker.

“Tea drinkers should feel comfortable about drinking tea but should realise that the results are preliminary and drinking tea may not prevent glaucoma,” said Anne Coleman, co-author of the research from the University of California, Los Angeles.

But if you want to reduce your risk of glaucoma, there are other things you should focus on. The biggest risk factors for glaucoma are still an unhealthy diet and a lack of physical activity. So while tea is still probably good for you (especially without sugar), staying fit and eating healthy food is still the best thing you can do.

Journal Reference: Connie W. Mu et al. Frequency of a diagnosis of glaucoma in individuals who consume coffee, tea and/or soft drinks. http://dx.doi.org/10.1136/bjophthalmol-2017-310924


Woman burns her retinas looking at the eclipse and doctors take the first-ever pictures of such damage

Novel medical imaging technology has allowed doctors to peer into the eye and see, for the first time, the cellular damage incurred when looking directly at the sun during an eclipse.

Fundus.

Image modified after Chris Y. Wu et al., 2017, JAMA Opht.

The patient is a woman in her 20s, who damaged her eyes during the total solar eclipse in August, the paper reports. She told doctors that she looked at the sun for approximately 6 seconds several times during the eclipse, without using any type of protective eyewear. She later gazed at the star again, for 15 to 20 seconds, using a pair of eclipse glasses. The woman also said she viewed the solar eclipse with both eyes open.

Four hours later, she had blurred vision, metamorphopsia (a type of vision distortion), and couldn’t perceive color very well. The symptoms were worse in her left eye. Three days later she went to the doctor and was told she had a rare condition called solar retinopathy — her retinas had been damaged by direct sungazing.

Looking at a nuclear explosion

The sun is, basically, one humongous nuclear explosion in the sky, only held together by the fact that it’s so massive it can’t escape its own gravitational pull. If that sounds really metal, it’s only because it is.

Like any nuclear explosion worth its salt, the sun spews out massive amounts of energy across the electromagnetic spectrum. Although it is safe to look at the sun without eye protection during the totality phase of an eclipse (when the sun is fully obscured and most incoming radiation is blocked), the woman was not in the path of totality. The sun was only 70% obscured during the eclipse’s peak in the area where she was watching the event. This meant incoming radiation wasn’t blocked entirely and carried enough energy to be hazardous to the naked eye.

Looking directly at the sun (or straight at the partially-obscured sun) can leave you with a very nasty case of solar retinopathy — a condition which develops when incoming radiation (most of which you perceive as light and warmth) from the sun damages the retina. Among its symptoms are blurry vision, and the formation of blind spots in one or both eyes. However, the damage is often painless and a person generally will not experience these symptoms immediately.

Because people are told not to view partial eclipses without protective gear, and because total solar eclipses are rare, doctors don’t often see patients with solar retinopathy. Also, in the past, they didn’t have the imaging tools currently available, so its effects on the eye aren’t well documented.

“We have never seen the cellular damage from an eclipse because this event rarely happens and we haven’t had this type of advanced technology to examine solar retinopathy until recently,” co-author Dr. Avnish Deobhakta, an assistant professor of ophthalmology at the Icahn School of Medicine at Mount Sinai, said in a statement.

Using a new technique dubbed adaptive optics, the doctors took “an exact look at this retinal damage on such a precise level [which] will help clinicians better understand the condition,” Deobhakta explains.

After examination, doctors determined that the woman had developed solar retinopathy — in effect, the intense radiation burned holes in both of her retinas. The report also states that she had photochemical burns in her eyes — light-induced cell death. Using adaptive optics, the doctors could examine the microscopic structures and damage in minute detail in real time, the paper explains, allowing them to capture high-resolution images of the damaged rods and cones on the woman’s retinas.

Right Eye. A small circular area of high reflectivity with central low reflectivity on the outer retina. Corresponding hyporeflectivity also seen in the choriocapillaris.
Image credits Chris Y. Wu et al., 2017, JAMA Opht.

Left Eye.
Image credits Chris Y. Wu et al., 2017, JAMA Opht.

 

The images reveal no significant structural damage in the right eye. They do, however, reveal a yellow-white spot in the left eye. Multiple areas with decreased sensitivity were identified in both eyes, and a central scotoma (blind spot) was further identified in the left eye. The team writes that they hope such images will help further our understanding of solar retinopathy, which currently remains untreatable.

Perhaps most importantly for the next eclipse, which will darken the skies in 2024, lead author Dr. Chris Wu hopes the report will help “prepare doctors and patients […] and make them more informed of the risks of directly viewing the sun without protective eyewear.”

Sunny side up.

As in, “don’t be this guy, fam.”
Image via YouTube.

The paper “Acute Solar Retinopathy Imaged With Adaptive Optics, Optical Coherence Tomography Angiography, and En Face Optical Coherence Tomography” has been published in the journal JAMA Ophthalmology.


World’s oldest eye found in a fossil in Estonia is very similar to today’s eyes

A newly-discovered fossil may contain the oldest eye ever found, a new paper reports.

Trilobite and eye.

S. reetae and its compound eye. (B) Head region. (C) Fields of view. (D) Abraded part of the right eye. Arrowheads indicate the ommatidial columns. (E) Lateral view of the right eye with schematic (F).
Image credits Euan Clarkson et al., 2017, PNAS.

An “exceptional” find offers us a unique glimpse into the evolutionary history of the eye. A 530-million-year-old fossil discovered by an international team of researchers could contain the oldest such sensory organ we’ve ever seen, they report. The trilobite (species Schmidtiellus reetae) sports an early model of the eyes still used by many animals today, including crabs, bees, and dragonflies.

Between 541 and 251 million years ago, during the Paleozoic era, the ancestors of today’s spiders and crabs were enjoying the time’s oceans and seas. They used an early form of compound eye to help them navigate — an organ consisting of tiny visual cells laid out in an array. Judging from the one found in the fossil from Estonia, their eyes were quite similar to what you might see on a modern bee. The team behind the discovery said their findings suggest that compound eyes have changed little over 500 million years.

“This exceptional fossil shows us how early animals saw the world around them hundreds of millions of years ago,” said lead author Professor Euan Clarkson from the University of Edinburgh.

“Remarkably, it also reveals that the structure and function of compound eyes has barely changed in half a billion years.”

Only the animal’s right eye was found, and it was partly worn away, which gave the researchers a clear view of its interior. This made it easy for the team to record the details of its internal structure and function, as well as facilitating its comparison with modern compound eyes.

Internal structures of the eye.
Image credits Euan Clarkson et al., 2017, PNAS.

The team believes that the animal had poor vision compared to many animals today, but the eye was very likely more than enough for the trilobite to see predators and obstacles. It consisted of approximately 100 cells (called ommatidia), spaced farther apart than those in contemporary compound eyes.

Unlike most eyes, however, the fossil eye doesn’t have a lens. The team thinks this is because of the species’ relative primitiveness, as it lacked parts of the shell needed for lens formation.

“This may be the earliest example of an eye that it is possible to find,” said Professor Brigitte Schoenemann of Cologne University, the paper’s co-author. “Older specimens in sediment layers below this fossil contain only traces of the original animals, which were too soft to be fossilised and have disintegrated over time.”

According to the team, a more advanced compound eye capable of capturing images in a higher resolution developed in another trilobite species from the present-day Baltic region a few million years later.

The paper “Structure and function of a compound eye, more than half a billion years old” has been published in the journal PNAS.

You can’t keep eye contact during conversation because your brain can’t handle it, study finds

A new study suggests that we may struggle to maintain eye contact while having a conversation with someone because our brains just can’t handle doing both at the same time.

Image credits Madeinitaly / Pixabay.

It’s not (just) shyness, it seems. Scientists from Kyoto University, Japan, tested 26 volunteers on their ability to play word association games while keeping eye contact with computer-generated faces. Their results suggest that people just can’t handle thinking of the right words while keeping their attention on an interlocutor’s face. The effect, they found, became more noticeable when the participants had to think up less familiar words — implying that this process uses the same mental resources as maintaining eye contact.

“Although eye contact and verbal processing appear independent, people frequently avert their eyes from interlocutors during conversation,” write the researchers.

“This suggests that there is interference between these processes.”

The participants were asked to think of word associations for terms with various difficulty levels. Thinking of a verb for ‘spoon’, for example, is pretty easy — you can eat with it. Thinking of a verb associated with the word ‘paper’ is harder since you can write on it, fold it, cut it, and so on. Participants were tested on their ability to associate while looking at animations of faces maintaining eye contact and animations of faces looking away. And in the first case, they fared worse.

It took them longer to think of answers when maintaining eye contact, but only when they had to associate a more difficult word. The researchers believe that this happens because the brain uses the same resources for both actions — so in a way, talking while maintaining eye contact overloads it.

The team suspects that participants may be experiencing some kind of neural adaptation, a process in which the brain alters its response to a constant stimulus — take, for example, the way you stop feeling your wallet in the back pocket you usually keep it in, yet find it uncomfortable in the other one. The sample size this team worked with is pretty small, so further research is needed to prove or disprove the findings.

The paper “When we cannot speak: Eye contact disrupts resources available to cognitive control processes during verb generation” has been published in the journal Cognition.


Why goats have really weird rectangular pupils

Ever taken a moment to stare a goat in the eyes? If you have, you might have noticed something really weird: their pupils are horizontal, or rectangular. It’s one of those things that baffles the mind once it hits you, because we’re so used to circular pupils or even vertical slit ones, on account of cats and snakes.

goat eyes

Credit: Pinterest user Leta Sparks

It’s always about survival

UC Berkeley and Durham University researchers were also intrigued by this somewhat atypical shape. Being scientists, they decided to investigate and analyzed the pupil shapes of no fewer than 214 land species.

What they eventually found was that pupil shape is linked to the ecological niche or role of the animal. The general pattern is predators have vertical slit pupils because these help them judge distance better, making it easier to pounce on prey. Meanwhile, herbivores — which are the target of carnivores — have rectangular slit pupils as a line of defense, offering them a broader field of vision.

As a herbivore, apart from some antlers and hooves, there’s not that much you can do to fend off a predator. The best thing these animals can do is run away, which is why many herbivores are also fast. Before you can run, though, you need to know when it’s time to make an exit, which is where the goat’s rectangular pupils come in. These enable a panoramic vision which can detect intruders approaching from various directions.

The horizontal pupils also enhance the image quality of objects directly ahead of the animal. This clear front-image helps guide rapid locomotion over potentially rough terrain, the researchers noted in Science Advances.

Grazing animals like goats also rotate their eyes when they bow their heads down so their eye slits stay parallel to the ground at all times. They can rotate each eye by more than 50 degrees, or 10 times more than the human eye. This way, even when they’re grazing, goats can always keep a good eye on the world and lurking predators.

Rectangular pupils are typically employed by equines and ruminants, such as sheep, deer, and horses.

The eye of the Sahara

A topographic reconstruction (scaled 6:1 on the vertical axis) from satellite photos. False coloring as follows: bedrock=brown, sand=yellow/white, vegetation=green, salty sediments=blue. Credit: NASA

This has got to be one of the strangest places on Earth — but you couldn’t make much of it if you were just walking by.

It’s located in a rather remote area, and the few people who noticed something odd about it didn’t know just how odd it really was. That’s why the 50 km formation didn’t receive much attention until astronauts reported it from orbit.

Photo by NASA.

Located in Mauritania, the Eye of the Sahara is not really what you would call a structure, but rather a huge circular formation. It was originally thought to be an impact crater, but the more recent and widely accepted explanation is that it is, in fact, a product of erosion acting over geological time.

Also known as the Richat Structure, the Eye of the Sahara has been studied by numerous geologists.

“The Richat structure (Sahara, Mauritania) appears as a large dome at least 40 km in diameter within a Late Proterozoic to Ordovician sequence. Erosion has created circular cuestas represented by three nested rings dipping outward from the structure. The center of the structure consists of a limestone-dolomite shelf that encloses a kilometer-scale siliceous breccia and is intruded by basaltic ring dikes, kimberlitic intrusions, and alkaline volcanic rocks” – small excerpt from a paper.

You can also see it on Google Maps; it’s really a brilliant view, and you can zoom in and out to get a sense of its scale (coordinates: 21.124217, -11.395569).



Pupil shape reveals what kind of animal you are

Your eyes are a window to your soul, or so the saying goes – but new research suggests that pupil shape and size have a lot to do with an animal’s nature. Hunters like cats tend to have vertical pupils, while plant-eaters generally have horizontally elongated ones.


Image via Wiki Commons.

Pupils are the eyes’ aperture – they’re black because light rays entering the pupil are absorbed. Humans have circular pupils, but that’s rather rare in the animal kingdom. Creatures like crocodiles, vipers, cats and foxes have vertical pupils, while for horses, rays, deer, sheep and many others, pupils are horizontal. But why? Why is there such a large variability between different species?

An analysis of 214 species of land animals shows that a creature’s ecological niche is a strong predictor of its pupil shape. The study, led by vision scientist Martin Banks, a UC Berkeley professor of optometry, found that creatures with vertical slit pupils are more likely to be ambush predators. Among the 65 frontal-eyed ambush predators in the study, 44 had vertical pupils, and 36 of them had shoulder heights of less than 42 centimeters (16.5 inches) – so they were close to the ground.

They also have stronger muscles in the eyes, which allow them to greatly constrict or dilate the pupil, letting more or less light into the eye. For example, the vertical slits of domestic cats and geckos undergo a 135- and 300-fold change in area, respectively, between constricted and dilated states. We humans exhibit only a roughly 15-fold change.
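To put those fold changes in perspective, here is a small, purely illustrative Python calculation. It assumes an idealized circular pupil and the commonly cited human pupil diameters of roughly 2 mm constricted and 8 mm dilated; the 135- and 300-fold figures come from the study itself.

```python
import math

def circle_area(diameter_mm: float) -> float:
    """Area of an idealized circular pupil, in mm^2."""
    return math.pi * (diameter_mm / 2) ** 2

# Human pupils range from roughly 2 mm (constricted) to 8 mm (dilated).
human_fold = circle_area(8.0) / circle_area(2.0)
print(f"Round human pupil: ~{human_fold:.0f}-fold area change")  # ~16x

# For a round pupil to match a cat's ~135-fold or a gecko's ~300-fold area
# change, its diameter would have to scale by the square root of that factor.
# Slit pupils sidestep this by collapsing to a thin line when constricted.
for fold in (135, 300):
    print(f"{fold}-fold area change would need a {math.sqrt(fold):.1f}-fold "
          f"diameter change for a round pupil")
```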

Meanwhile, grazers need horizontal pupils so they can better detect predators.

“The first key visual requirement for these animals is to detect approaching predators, which usually come from the ground, so they need to see panoramically on the ground with minimal blind spots,” said Banks. “The second critical requirement is that once they do detect a predator, they need to see where they are running. They have to see well enough out of the corner of their eye to run quickly and jump over things.”


Meanwhile, those with round pupils, like humans, are more likely to be active hunters, chasing down their prey. This raises an interesting question: we have vertical, horizontal, circular… why not diagonal?

“For species that are active both night and day, like domestic cats, slit pupils provide the dynamic range needed to help them see in dim light yet not get blinded by the midday sun,” said Banks. “However, this hypothesis does not explain why slits are either vertical or horizontal. Why don’t we see diagonal slits? This study is the first attempt to explain why orientation matters.”

Well, herbivores need a broad field of view to be able to spot incoming predators. For ambush predators, vertical slits help with accurately gauging the distance needed to pounce on prey, an advantage that matters most for animals close to the ground (which is why house cats have vertical pupils, while bigger cats like lions and tigers don’t). Diagonal pupils simply wouldn’t provide any advantage for either lifestyle.

So far, the study has only looked at terrestrial animals. It would definitely be interesting to see how the findings hold up for flying and aquatic creatures.


New ligament discovered in the human knee

The human body is a complex biological entity in seemingly perfect harmony, with thousands of components playing their parts in tandem. Discovering, describing, and understanding how each of these body parts functions and works with the others is the primary role of human anatomy. Some of you might be surprised to find that human anatomy is far from being exhaustively described, as new body parts are discovered every so often. For instance, doctors at the University of Leuven, Belgium, recently reported that they’ve discovered a new ligament in the human knee. Moreover, they have also worked out its function.

The anterolateral ligament (ALL). (c) University of Leuven

The ligament, termed the anterolateral ligament (ALL) and located in the human knee, was first proposed to exist in 1879 by a French surgeon, but its existence couldn’t be confirmed until recently. Detailed anatomy studies are made on cadavers, and death has the nasty habit of spoiling bodies and making subtle structures difficult to observe. The researchers, led by Dr. Steven Claes, an orthopedic surgeon at the University of Leuven, Belgium, and co-author of the study, performed an in-depth analysis of 41 cadaver knees and found the ligament in 40 of them.

“The anatomy we describe is the first precise characterization with pictures and so on, and differs in crucial points from the rather vague descriptions from the past,” Claes said. “The uniqueness about our work is not only the fact that we identified this enigmatic structure for once and for all, but we are also the first to identify its function.”

So what’s the ALL good for? One common knee injury involves another ligament, the anterior cruciate ligament (ACL), and causes what’s known as a “pivot shift”: under intense stress, the knee stays in position while the rest of the leg moves, causing severe complications, not to mention excruciating pain. The study suggests that one type of pivot shift might actually be caused by injury to the ALL, which helps control the rotation of the tibia, one of the two bones in the lower leg, Claes said.

Like I said earlier, new body parts are discovered fairly often. In June scientists found a new eye layer, named Dua’s layer after its discoverer, that sits at the back of the cornea.


This is not SciFi: software update slated for bionic eye will grant higher resolution and colour vision

The Argus II is the first bionic eye implant designed to restore vision to the blind to be approved by the FDA in the US. The wearer of such an implant is capable of distinguishing objects and living an almost independent life, which is absolutely remarkable in itself; however, its performance is light years away from that of a natural eye. Technology has always been on an upward trend, and it’s natural to expect the implant to get better in the coming years, but think about it for a moment. It’s not like buying a new hard drive for your PC: to upgrade the hardware, you’d have to go through surgery again, have your implant taken out, and then have a new one put in; the hassle is simply too great.

Zoom in on the future

Thing is, hardware isn’t the only thing you can upgrade to increase performance; often a software update can do wonders, and the bionic eye is no different. Recently, Second Sight – the company that developed the Argus II – announced that they’ll soon be rolling out a firmware update that will give users better resolution, focus, and image zooming. A planned second update will even add colour recognition, even though the initial product offers only black-and-white imaging.

There are many causes that can lead to blindness – cataracts, glaucoma, macular degeneration, and various other diseases. At the root of the problem, the diseased eye is incapable of converting the light that hits the rods and cones of the retina into electrical signals which, in a healthy eye, would be transmitted down the optic nerve to the brain, which processes them into images, granting sight.

Granting sight

Argus II is essentially a retinal prosthesis: 60 electrodes are implanted into the patient’s macula — the central region of the retina that provides central, high-resolution vision. Since the diseased eye can no longer receive light, the essential input it needs to convert into electrical signals, the actual “eye” is replaced by a pair of spectacles with a mounted camera that records whatever the wearer is pointing at. The camera converts the captured images into electrical signals and sends them to a tiny antenna connected to the electrodes implanted in the retina. The signals stimulate the electrodes so that they, in turn, produce electrical impulses that can be read and understood by the brain. Finally, out of pitch black, light enters.

It’s a rough, pixelated world, however. Since the implant only has 60 electrodes (a 6×10 grid), you can imagine the resolution is extremely low, but it’s a lot better than being completely blind, and for some this means the chance of living a normal life, or at least of taking care of themselves.
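To get a feel for just how coarse a 6×10 electrode grid is, here is a toy sketch in Python. It is not Second Sight’s actual processing pipeline, and the frame size and block-averaging scheme are assumptions purely for illustration; real devices apply calibrated, per-electrode stimulation parameters.

```python
import numpy as np

# Argus II electrode layout: 6 rows x 10 columns = 60 "pixels".
GRID_ROWS, GRID_COLS = 6, 10

def frame_to_electrode_levels(frame: np.ndarray) -> np.ndarray:
    """Reduce a grayscale camera frame to one brightness value per electrode
    by averaging the block of image pixels that falls over each electrode."""
    h, w = frame.shape
    levels = np.zeros((GRID_ROWS, GRID_COLS))
    for r in range(GRID_ROWS):
        for c in range(GRID_COLS):
            block = frame[r * h // GRID_ROWS:(r + 1) * h // GRID_ROWS,
                          c * w // GRID_COLS:(c + 1) * w // GRID_COLS]
            levels[r, c] = block.mean()
    return levels

# Example: a 480x640 "camera frame" of random noise collapses to a 6x10 grid.
frame = np.random.randint(0, 256, size=(480, 640)).astype(float)
print(frame_to_electrode_levels(frame).round(1))
```

However you slice it, an entire camera frame boils down to 60 brightness values, which is why even a firmware tweak that sharpens focus or adds zoom can make a noticeable difference to the wearer.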

“There’s a new firmware out for my bionic eye. Cool!”

Even so, once the update, called Acuboost, rolls out, Argus II users will be able to improve their implants’ performance significantly. While Second Sight needed years of back-and-forth discussions with the FDA for their product to become the first implantable bionic eye ever approved, apparently they require no such approval for firmware updates. That will most likely change soon enough, as policies become just as demanding for software as they are for hardware.

The ‘see in colour’ update, scheduled to come after Acuboost, is the most anticipated release, and the most fascinating yet, since technically the implant’s users don’t have any colour vision capabilities, as those cells were destroyed by the disease. Instead, colour can be granted by ingeniously reading and correlating specific frequencies and delays in electrode stimulation with particular colours.

There’s a European counterpart to the Argus II that is arguably even more interesting. In Germany, the Alpha IMS bionic eye recently received European regulatory approval, and it works in much the same way as the Argus II except in one major respect: instead of relying on an auxiliary camera to feed it imaging signals, the Alpha IMS is a self-contained bionic eye that grants vision using the light that actually enters the eye. Moreover, instead of 60 electrodes, the Alpha IMS boasts 1,500, greatly enhancing resolution. We reported on the retina implant in a previous ZME Science piece from February. Check out the presentation video for Alpha IMS below.