Tag Archives: Motion

Fish motions could help us identify their personalities

New research says that you can, in fact, judge a fish by its cover. Or at least by the way it swims.

The three-spined stickleback (Gasterosteus aculeatus). Image credits Gilles San Martin.

A research team with members from Swansea University and the University of Essex reports that the subtle differences in how each fish moves around can be used to determine its overall personality. We still don’t know if the findings translate to humans, or if such an approach can be reliable over the long term, but it’s an interesting place to start from.

A bat of the fins

“These micropersonalities [motions] in fish are like signatures — different and unique to an individual,” explains Dr Ines Fürtbauer, a co-author of the study from Swansea University. “We found the fish’s signatures were the same when we made simple changes to the fish tanks, such as adding additional plants.”

We know that the animal kingdom is quite rich with personality types, in species ranging from ants to apes. Much as you'd see in humans, animals can be shy, energetic, bold, or sedentary. The current paper shows that the same is true of fish.

However, something new that the researchers have found is that we can look at the tiny idiosyncrasies of how fish swim around to learn more about their personality. They recorded 15 three-spined stickleback fish as they went about their day in a tank with two, three, or five plastic plants in fixed positions. Later, high-resolution tracking was used to chart their movements. The team used this to measure different parameters for every individual, including how often they turned and how often they stopped and started moving.
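To give a rough sense of what such movement parameters look like in practice, here is a minimal Python sketch that computes a few of them from a tracked trajectory. The data format, thresholds, and descriptor choices are illustrative assumptions, not the team's actual analysis code.

```python
import numpy as np

def movement_parameters(xy, fps=30.0, stop_speed=0.5, turn_angle=np.radians(30)):
    """Rough movement descriptors from a tracked trajectory.

    xy: (N, 2) array of positions, one row per video frame -- a hypothetical
    input format; thresholds are made-up values for illustration.
    """
    xy = np.asarray(xy, dtype=float)
    steps = np.diff(xy, axis=0)                    # displacement per frame
    speed = np.linalg.norm(steps, axis=1) * fps    # position units per second

    # Fraction of time spent moving vs. "stopped"
    moving = speed > stop_speed
    frac_moving = moving.mean()

    # Number of stop-to-start transitions
    stop_starts = np.count_nonzero(np.diff(moving.astype(int)) == 1)

    # Heading changes between consecutive steps -> how often the fish turns
    headings = np.arctan2(steps[:, 1], steps[:, 0])
    dtheta = np.abs(np.angle(np.exp(1j * np.diff(headings))))  # wrapped to [0, pi]
    turns = np.count_nonzero(dtheta > turn_angle)

    # Path straightness: net displacement over total path length
    straightness = np.linalg.norm(xy[-1] - xy[0]) / (speed.sum() / fps + 1e-9)

    return {"frac_moving": frac_moving, "stop_starts": stop_starts,
            "turns": turns, "straightness": straightness}
```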

Each fish had distinct and highly repeatable movement patterns, so much so that the team could reliably identify individuals just from how they moved. The authors also note a correlation between movement patterns and behavior: fish that spent more time moving, followed more direct paths, and rarely "burst traveled" tended to explore more of the tank and were more likely to spend time in open water.

“Our work suggests that simple movement parameters can be viewed as micropersonality traits that give rise to extensive consistent individual differences in behaviors,” says Dr Andrew King from Swansea University, lead author.

“This is significant because it suggests we might be able to quantify personality differences in wild animals as long as we can get fine-scale information on how they are moving; and these types of data are becoming more common with advances in animal tracking technologies.”

That's not to say these patterns necessarily remain constant, however. The team only observed the fish for a short time, in tanks with a fixed layout, so it remains to be seen whether bigger changes in the environment would alter their movement patterns or behavior.

"However, it is possible these signatures change gradually over an animal's lifetime, or abruptly if an animal encounters something new or unexpected in its environment. Tracking animals' motion over longer periods — in the lab and in the wild — will give us this sort of insight and help us better understand not only personality but also how flexible an animal's behavior is," adds Dr Fürtbauer.

The paper "'Micropersonality' traits and their implications for behavioral and movement ecology research" has been published in the journal Ecology and Evolution.


Researchers design carbon yarn that generates energy from motion or waste heat

An international research effort has produced high-tech yarns that pump out electricity when stretched or twisted.

Carbon Yarn.

Image credits Shi Hyeong Kim et al., 2017.

The "Twistron" yarns are the product of a research team led by scientists from the University of Texas at Dallas and Hanyang University in South Korea. Because the twistrons convert motion into electricity so readily, they could be used to tap into ambient energy sources. Ocean waves, waste heat, even the motion of our breathing could be harvested for usable power.

Twist and turn on

“The easiest way to think of twistron harvesters is, you have a piece of yarn, you stretch it, and out comes electricity,” said Dr. Carter Haines, co-lead author of the paper and an associate research professor in the Alan G. MacDiarmid NanoTech Institute at UT Dallas.

The yarns are built from carbon nanotubes, hollow carbon cylinders about 10,000 times thinner than a human hair. The team spun these into high-strength yarns, then twisted them until they coiled to make them elastic. The last step was to coat the strands with an electrolyte (an ionically conductive material), which can be something as mundane as a salt-and-water solution. The end result, the twistron, acts much like a supercapacitor. A capacitor is a device that stores electrical charge, and it usually needs an external source of current, such as a battery, to charge it. These yarns, however, are capacitors that generate their own charge.

The fundamental working principle is that when the nanotube yarns are immersed in the electrolyte, they become electrically charged. Because of the twist imparted to the strand, whenever the twistron is twisted or stretched its volume decreases, bringing the electric charges on the yarn closer together and increasing the overall energy of the strand, Haines said. This increase translates into a higher voltage across the yarn. In essence, mechanical motion at our scale produces changes at a very small scale, and those changes yield energy we can harvest.

According to the study's corresponding author, Dr. Ray Baughman, director of the NanoTech Institute, stretching the yarn 30 times a second generated a peak electrical power of 250 watts per kilogram of harvester weight (a normalized, calculated value). That is quite a lot of power from such a simple motion.
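To put those figures in perspective, here is a back-of-envelope Python sketch. Only the 250 W/kg figure comes from the study; the yarn mass, charge, and capacitance values are made up purely for illustration.

```python
# Back-of-envelope numbers for a twistron-style harvester.
# The 250 W/kg peak figure is from the study; everything else here
# (yarn mass, charge, capacitance values) is an illustrative assumption.

peak_power_per_kg = 250.0               # W/kg, normalized to harvester weight
yarn_mass_kg = 1e-6                     # a ~1 mg yarn, like the ocean-wave demo
print(peak_power_per_kg * yarn_mass_kg) # -> 0.00025 W, i.e. about 0.25 mW peak

# Why stretching raises the voltage: for a charged capacitor, V = Q / C and
# E = Q**2 / (2 * C). Squeezing the coiled yarn packs the charge closer
# together, lowering C at roughly constant Q, so both V and the stored
# energy go up.
Q = 1.0            # coulombs, made-up value
C_rest = 2.0       # farads, made-up value
C_stretched = 1.5  # farads, made-up value
for C in (C_rest, C_stretched):
    print(f"C={C} F: V={Q / C:.2f} V, E={Q**2 / (2 * C):.3f} J")
```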

“Although numerous alternative harvesters have been investigated for many decades, no other reported harvester provides such high electrical power or energy output per cycle as ours for stretching rates between a few cycles per second and 600 cycles per second,” he explains.

Experimentally, a twistron yarn weighing just a few milligrams (for comparison, a typical male housefly weighs around 11.5 mg) could light up a small LED with every stretch. When sewn into a shirt, the yarn generated an electrical signal on each breath, showcasing its potential as a self-powered breathing biomonitor. To let it tap into waste thermal energy, the team connected the twistron to an artificial polymer muscle that contracts and expands with temperature.

All of this suggests that twistron technology is ideally suited to power wearable electronics and to supply energy for very small devices where batteries would simply be impractical.

“There is a lot of interest in using waste energy to power the Internet of Things, such as arrays of distributed sensors,” said Dr. Na Li, a research scientist at the NanoTech Institute and co-lead author of the study.

Bigger fish to fry

It's not all about the very small, however. The team also wanted to show that their strands can tap into currently under-exploited sources of power, such as sea waves. As a proof-of-concept demonstration of both this and the strands' ability to work in chemically complex environments such as ocean water, the team deployed a twistron off the east coast of South Korea. A 1-milligram, 10-centimeter (4-inch) strand was tied to a balloon at one end and, at the other, to a weight resting on the seabed. With every wave, the balloon would rise and stretch the yarn by up to 25%, generating electricity.

Although it only produced very small amounts of power in this attempt, the team showed that the technology's output can be scaled up, either by increasing the yarn's diameter or by bundling strands in parallel. The main barrier to wave-energy harvesting at that scale is cost, as building enough strands for the application would be quite pricey. For applications requiring relatively little power, such as sensors or sensor communications, however, only small twistrons are needed — and those don't cost very much. The team reports that "just 31 milligrams of carbon nanotube yarn harvester could provide the electrical energy needed to transmit a 2-kilobyte packet of data over a 100-meter radius every 10 seconds for the Internet of Things."

The paper “Harvesting electrical energy from carbon nanotube yarn twist” has been published in the journal Science.


Teaching smart cars how humans move could help make them safer and better

Computers today can't make heads or tails of how our bodies usually move, so one team of scientists is trying to teach them using synthetic images of people in motion.

Google driverless car.

Image credits Becky Stern / Flickr.

AIs and computers can be hard to wrap your head around. It's easy to forget that the reverse also holds: we are hard for them to understand. That can become a problem, because we ask them to perform a lot of tasks that would go a lot more smoothly if they actually understood us a tad better.

This is how we roll

Case in point: driverless cars. The software navigating these vehicles can see us moving all around them through various sensors and can pick out the motion easily enough, but it doesn't understand it. So it can't predict how that motion will continue, even for something as simple as walking in a straight line. To address that issue, a team of researchers has taken to teaching computers what human motion looks like.

When you think about it, you've literally had a lifetime to acquaint yourself with how people and objects behave. Based on that experience, your brain can tell if someone is about to take a step or fall over, or where they will land after a jump. Computers don't have that store of experience to draw on. The team's idea was to use images and videos of computer-generated bodies walking, dancing, or going through a myriad of other motions to help computers learn which cues they can use to successfully predict how we act.

Dancing.

Hard to predict these wicked moves, though.

“Recognising what’s going on in images is natural for humans. Getting computers to do the same requires a lot more effort,” says Javier Romero at the Max Planck Institute for Intelligent Systems in Tübingen, Germany.

The best algorithms today are trained on up to thousands of pre-labeled images that highlight important characteristics. This lets them tell an eye apart from an arm, or a hammer from a chair, with consistent accuracy — but there's a limit to how much data can realistically be labeled that way. Doing this for video of even a single type of motion would take millions of labels, which is "just not possible," the team adds.

Training videos

So they armed themselves with human figure templates and real-life motion data, then turned to the 3D rendering software Blender to create synthetic humans in motion. The animations were generated using random body shapes and clothing, as well as random poses. Backgrounds, lighting, and viewpoints were also randomly selected. In total, the team created more than 65,000 clips and 6.5 million frames of data for the computers to analyze.
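As a rough illustration of that randomization loop, here is a heavily simplified Python sketch. The helper names, scene fields, and libraries of shapes are placeholders, not the authors' actual pipeline or Blender API calls.

```python
import random

BODY_SHAPES = range(1000)  # indices into a hypothetical body-template library
CLOTHING    = ["tshirt", "dress", "jacket", "shorts"]
BACKGROUNDS = ["street", "park", "indoor", "plain"]

def render_frame(scene, pose):
    # Stand-in for the real rendering step (the study used Blender). Because
    # the scene is synthetic, per-pixel body-part labels and depth come for free.
    return {"scene": scene, "pose": pose}

def make_clip(motion_sequence, n_frames=100):
    # One clip = one random body/clothing/background/lighting/viewpoint combo,
    # animated with real recorded motion data.
    scene = {
        "body":       random.choice(BODY_SHAPES),
        "clothing":   random.choice(CLOTHING),
        "background": random.choice(BACKGROUNDS),
        "light_dir":  [random.uniform(-1, 1) for _ in range(3)],
        "camera_yaw": random.uniform(0, 360),
    }
    return [render_frame(scene, motion_sequence[t % len(motion_sequence)])
            for t in range(n_frames)]

# Example: a dummy walking sequence of 24 joint-angle vectors
clip = make_clip([{"joint_angles": [0.0] * 24} for _ in range(24)])
```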

“With synthetic images you can create more unusual body shapes and actions, and you don’t have to label the data, so it’s very appealing,” says Mykhaylo Andriluka at Max Planck Institute for Informatics in Saarbrücken, Germany.

Starting from this material, computer systems can learn how the patterns of pixels changing from frame to frame relate to the motion of a human body. This could help a driverless car tell whether a person is walking close by or about to step into the road, for example. And, since the animations are all in 3D, the material can also be used to teach systems to gauge depth — obviously desirable in a smart car, but also useful in pretty much any robotic application.

The paper "Learning from Synthetic Humans" will be presented at the Conference on Computer Vision and Pattern Recognition (CVPR) in July.

For the first time in history, researchers restore voluntary finger movement for a paralyzed man

Using two sets of electrodes, scientists have successfully restored finger movement in a paralyzed patient for the first time in history. The results could be the starting point for developing methods that would allow paralyzed people around the world to regain limb mobility.


Four years ago, Ian Burkhart lost the ability to move his arms and legs. Now, thanks to a neural implant and electrodes on his forearm, he's able to move his wrist, hand, and fingers.
Image credits Ohio State University Wexner Medical Center / Battelle

There are roughly 250,000 people living with severe spinal cord injuries in America alone — people who have to go through life with little or no mobility. One of them is 24-year-old Ian Burkhart, who lost the ability to move or feel anything from the shoulders down in a diving accident four years ago. But, thanks to a team at Ohio State University, Ian became the first paralyzed patient to regain this kind of voluntary control over his own limb. By using electrodes to bypass his damaged nerve pathways, the researchers allowed him to move his right fingers, hand, and wrist.

"My immediate response was I want to do this," said Ian after the four-hour procedure to implant the electrodes in his brain. "If someone else was in my place, with the possibility of changing my life and people like me in future, I would hope they would agree."

The first step was to implant an array of electrodes into Burkhart's left primary motor cortex, an area of the brain that handles planning and directing movements. Signals generated in his brain were recorded and fed through a machine learning algorithm. It took almost 15 months of training, at three sessions a week, to teach his brain to use the device. But finally, the software started to correctly interpret which brain signals corresponded to which movements.
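To make the decoding step a bit more concrete, here is a generic, self-contained Python sketch of the idea: feature vectors extracted from short windows of cortical activity are mapped to intended-movement labels with an off-the-shelf classifier. The data below is synthetic and the classifier choice is an assumption for illustration; it is not the algorithm actually used in the study.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# Each training sample is a feature vector computed from a short window of
# multi-channel cortical recordings, labeled with the movement the patient
# was asked to attempt. Channel count, sample count, and features are
# made-up stand-ins.
rng = np.random.default_rng(0)
n_channels, n_samples, n_movements = 96, 600, 6

X = rng.normal(size=(n_samples, n_channels))      # stand-in neural features
y = rng.integers(0, n_movements, size=n_samples)  # intended-movement labels

decoder = make_pipeline(StandardScaler(), LinearSVC())
decoder.fit(X[:500], y[:500])

# At run time, each new window of brain activity becomes the same kind of
# feature vector; the predicted label then selects which stimulation pattern
# the electrode sleeve delivers to the forearm muscles.
predicted_movement = decoder.predict(X[500:501])
print(predicted_movement)
```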

Now, when Burkhart's brain emits the right signals, the implant sends the corresponding impulses through wires to a flexible sleeve — lined with electrodes — placed around his forearm, stimulating his muscles accordingly.


The researchers tested Burkhart's ability to perform six different hand, wrist, and finger movements. An algorithm determined that Burkhart's first movements were about 90 percent accurate on average. After his muscles got some exercise and regained strength, Ian successfully poured water from a bottle into a jar and then stirred it. He was also able to swipe a credit card and even play some Guitar Hero.

The team notes that in its current form the technique is highly invasive, meaning it might not be suitable for patients who are already in poor health or have compromised immune systems. They also point out that the device used in this study allows for a greater range of movement than typical neural bypass devices. Finally, the implant doesn't restore a patient's ability to feel — prosthetics might be able to solve that problem, though. Maybe someday the two technologies could be merged to give patients both the ability to move and the ability to feel.

Still, the study is a huge stepping stone. The way Burkhart was able to move his hand is simply mind-blowing, something considered impossible until now. That could have big implications if the technology comes into wide use.

“Our goal was to use this technology so that these patients like Ian can be more in charge of their lives and can be more independent,” Ali Rezai, one of the researchers involved in the study, said in a statement. “This really provides hope, we believe, for many patients in the future.”

The team hopes to have their system refined and ready for wide-scale implementation in a few years.

The full paper, titled "Restoring cortical control of functional movement in a human with quadriplegia," has been published in the journal Nature.