Bigger boost in robot’s field of view

Oregon State University’s team has earned some serious bragging rights: they’ve come up with an optical sensor that mimics the human eye. Think of robots built to track moving objects. Roboticists working on such machines would no longer have to wrestle with complex image processing; they could rely on this optical sensor to do the job.

The human eye, while not nearly as capable as some of its counterparts in the animal kingdom, is still a magnificent structure. Replicating its functionality in robots has proven immensely challenging, but the OSU team’s work brings us a step closer: their robot eye closely matches the human eye’s ability to perceive changes in its visual field.

Due to the way the team’s sensor works, a static item in the robot’s field of view draws no response; a moving object, by contrast, registers as a spike in voltage. Science Focus summed up the importance of the work as follows:
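To make the idea concrete, here is a minimal sketch (not the team’s actual device model) of a change-sensitive pixel: it responds to deviations from an adapted light level and gradually re-adapts, so constant illumination produces no output while a sudden change produces a decaying spike. The function name and the adaptation constant `alpha` are illustrative assumptions.

```python
# Hypothetical model of a change-sensitive ("retinomorphic") pixel:
# output tracks the *difference* between current brightness and an
# adapted baseline, and the baseline slowly catches up (adaptation).

def pixel_response(brightness, alpha=0.5):
    """Return per-step output voltages (arbitrary units)."""
    outputs = []
    baseline = brightness[0]  # start fully adapted to the scene
    for b in brightness:
        outputs.append(b - baseline)        # respond only to change
        baseline += alpha * (b - baseline)  # adapt toward current light
    return outputs

# Static scene: constant brightness, so the output stays at zero.
static = pixel_response([5, 5, 5, 5])
# A brightness jump (object moving into view): a transient spike
# that decays as the pixel re-adapts.
moving = pixel_response([5, 5, 9, 9, 9])
```

The design choice here mirrors the behavior described above: the pixel itself performs the "has anything changed?" computation, rather than handing raw frames to a processor.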

“Currently, computers receive information in a step-by-step way, processing inputs as a series of data points, whereas this technology helps build a more integrated system. For artificial intelligence, researchers are attempting to build on human brains which contain a network of neurons, communicating cells, able to process information in parallel.”

To demonstrate, the OSU team simulated an array of “retinomorphic” (human eye-like) sensors to predict how a retina-like video camera would respond to visual stimuli. The idea was to feed videos into one of these arrays and process the information the way a human eye would. One such simulation shows a bird flying into view, then all but disappearing as it stops at an invisible bird feeder; the bird reappears as it takes off, and the swaying feeder becomes visible only once it starts to move. But the eye alone is not enough: you also need processing power, which in humans the brain provides. The OSU team set out to replicate that as well.
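The bird-feeder simulation can be approximated with simple frame differencing, shown below as a toy sketch (my own illustration, not the team’s simulation code): each output frame is the pixel-wise change from the previous frame, so a perched bird vanishes from the output and reappears only when it moves.

```python
# Toy array version of the change-sensitive idea: apply per-pixel
# differencing across the frames of a tiny "video". A stationary
# object fades from the output; it shows up only while it moves.

def retinomorphic_array(frames):
    """Yield per-frame difference maps (frame_t minus frame_t-1)."""
    prev = frames[0]
    for frame in frames[1:]:
        yield [[cur - old for cur, old in zip(row_c, row_p)]
               for row_c, row_p in zip(frame, prev)]
        prev = frame

# A 1x4 strip of pixels: a bright "bird" (value 9) enters, moves,
# perches (no change), then leaves.
video = [
    [[0, 0, 0, 0]],  # empty scene
    [[9, 0, 0, 0]],  # bird enters on the left
    [[0, 9, 0, 0]],  # bird moves right
    [[0, 9, 0, 0]],  # bird perches: nothing changes
    [[0, 0, 0, 0]],  # bird takes off
]
diffs = list(retinomorphic_array(video))
```

In the output, the frame where the bird perches is all zeros, just as the perched bird "all but disappears" in the team’s simulation.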

The team’s paper appears in Applied Physics Letters, explaining that “neuromorphic computation is the principle whereby certain aspects of the human brain are replicated in hardware. While great progress has been made in this field in recent years, almost all input signals provided to neuromorphic processors are still designed for traditional (von Neumann) computer architectures.”

You may have already read about researchers exploring devices that behave like eyes, especially retinomorphic devices. But previous attempts to build a human-eye type of device relied on software or complex hardware, said John Labram, Assistant Professor of Electrical and Computing Engineering.

The Science Focus piece describes what drew him to this line of research. Labram was “initially inspired by a biology lecture he played in the background, which detailed how the human brain and eyes work.” Our eyes are very sensitive to changes in light, the piece explains, but far less responsive to constant illumination. That insight became the core of a new approach to devices that mimic the photoreceptors in our eyes.

The innovation in this work lies mostly in the materials and the technique used. The authors discuss how “a simple photosensitive capacitor will inherently reproduce certain aspects of biological retinas.” Their design uses ultrathin layers of perovskite semiconductors; perovskites are a class of materials also used in solar cells, among other applications. The perovskite layer, a few hundred nanometers thick, acts as a capacitor whose capacitance varies under illumination.

These layers change from strong electrical insulators to strong conductors when exposed to light. “You can think of it as a single pixel doing something that would currently require a microprocessor,” Labram told the university’s news site.
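The paper’s point that a photosensitive capacitor “inherently” behaves like a retina can be seen with basic circuit arithmetic. At a fixed bias voltage, the stored charge is Q = C·V, so current flows only while the capacitance is changing, i.e. only while the light level is changing. A minimal numerical sketch (illustrative values, not device parameters from the paper):

```python
# Why a light-dependent capacitor is inherently change-sensitive:
# at fixed bias V, charge Q = C * V, so the transient current is
# i = V * dC/dt. Constant light means constant C and zero current.

V_BIAS = 1.0  # bias voltage in volts (illustrative value)

def transient_current(capacitances, dt=1.0):
    """Current i = V * dC/dt for a sequence of capacitance samples."""
    return [V_BIAS * (c1 - c0) / dt
            for c0, c1 in zip(capacitances, capacitances[1:])]

# Dark (low C), then the light switches on (C jumps), then stays on:
caps = [1.0, 1.0, 5.0, 5.0, 5.0]
currents = transient_current(caps)
# One current pulse at the moment the illumination changes;
# zero before and after, with no processor involved.
```

This is the sense in which a single pixel replaces a computation: the spike-on-change behavior falls out of the device physics rather than being calculated in software.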

Their human eye-like sensor would be useful well beyond object-tracking robots, though. Consider that “neuromorphic computers” belong to a next generation of artificial intelligence in applications like self-driving cars. Traditional computers process information sequentially as a series of instructions; neuromorphic computers emulate the human brain’s massively parallel networks, the OSU report said.
