
Ancient Predators Inspire Visionary Research

Insects are inspiring University of South Australia researchers to create new technology based on their extraordinary vision.

The visual processing skills of the dragonfly are the envy not only of the animal kingdom but of humans as well. Dragonflies can hover in the air under very tight control while waiting for potential mates, prey or predators. Using their nearly 360-degree field of view, they can discern targets against cluttered backgrounds and then take the appropriate action.

Dr Russell Brinkworth, a UniSA neuroscientist, mechatronic engineer and robotics expert, and Professor Anthony Finn, Director of the Defence and Systems Institute at UniSA and an expert on sensor processing and autonomous systems, are using insect brains as inspiration for vision systems in computers.

For six years Dr Brinkworth worked with a small team that painstakingly measured and modelled the neurology of the early vision system of the hoverfly and dragonfly. Over the past eight years, he, Prof Finn, and a growing team of scientists at UniSA’s Mawson Lakes Campus have replicated the visual functionality of these insects and are using those models as a basis to improve detection systems in cameras.

Their bio-inspired research has a range of applications: developing bionic eyes, improving the navigation systems of driverless cars, spotting drones in complicated environments, scanning forests to capture detailed information about individual trees, improving facial recognition techniques, and even monitoring wildlife in densely camouflaged areas.

By replicating the dragonfly’s visual algorithms in a computer model, the researchers are building sensor systems that can find objects in scenes that are very bright or very dark, have high or low contrast, or sit in complex, obscured landscapes – something that computers currently do poorly.
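The core of this light-and-dark adaptation can be illustrated with a short sketch. The code below is not the UniSA team’s actual algorithm; it is a simplified, illustrative model of divisive gain control, in which each pixel behaves like a photoreceptor that normalises its input against a slowly updated estimate of local brightness, so detail survives in both very bright and very dark regions.

```python
import numpy as np

def adaptive_normalise(frame, prev_state=None, alpha=0.7, eps=1e-6):
    """Illustrative biologically inspired light adaptation (a sketch,
    not the published model).

    Each pixel divides its input by a slowly updated running estimate
    of local brightness, compressing the scene's dynamic range so that
    relative changes matter more than absolute intensity.
    """
    frame = frame.astype(float)
    if prev_state is None:
        # On the first frame, adapt instantly to the current scene.
        prev_state = frame.copy()
    # Temporal low-pass: the adaptation state tracks local brightness.
    state = alpha * prev_state + (1 - alpha) * frame
    # Divisive gain control: bright and dark regions end up comparable.
    adapted = frame / (state + eps)
    return adapted, state
```

With this scheme, a pixel that doubles in brightness produces the same response whether it sits in a dim or a brilliantly lit part of the image, which is the property the article attributes to dragonfly vision.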

“Dragonflies have the same ability as humans, animals and other insects to adjust to dark and light surroundings,” Dr Brinkworth says. “They also have superior tracking and detection skills. All these visual processes can be mapped to help us build systems that can operate in complex environments.

“Some fatal accidents involving driverless cars have occurred because more progress needs to be made in the visual processing field. Current camera systems struggle to differentiate between light and dark and between different objects. Our research is helping to address this.

“Biologically, the human eye bears little resemblance to insect eyes, and the two perceive things very differently. However, the way that insects process visual information is remarkably similar to the way humans do.

“We take the algorithms that insects use, and we modify them to suit our purposes, whether it’s to improve security camera footage or to improve facial recognition.”

The same biologically-inspired algorithms can also be applied to sound, making it easier to listen for objects in noisy environments.

This means that small, quiet, slow-moving targets, such as drones, can be tracked based on both their visual and acoustic signatures.
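One simple way to combine two such detectors is late fusion of their per-frame confidences. The function below is a hypothetical illustration (the article does not describe the team’s fusion method): a noisy-OR rule that declares a target if either the visual or the acoustic channel fires.

```python
def fuse_detections(p_visual, p_acoustic):
    """Hypothetical noisy-OR fusion of two detector confidences.

    Inputs are per-frame detection probabilities in [0, 1] from a
    visual and an acoustic detector; the fused probability is high
    if either modality is confident, so a drone that is hard to see
    can still be caught by its sound, and vice versa.
    """
    return 1.0 - (1.0 - p_visual) * (1.0 - p_acoustic)
```

For example, a weak visual cue (0.6) and a weak acoustic cue (0.5) together yield a fused confidence of 0.8, which is the practical advantage of tracking on both signatures at once.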

Using their image processing skills and sensing expertise, Prof Finn and Dr Brinkworth are also leading a UniSA project to help combat the growing global threat posed by IED-carrying drones.

Improvised explosive devices are among the deadliest weapons in modern warfare, killing or injuring more than 3000 soldiers in Afghanistan in 2017.

This weaponisation of drones by terror groups has led to the Defence Science and Technology (DST) Group inviting researchers and experts from industry and academia to come up with technological solutions.

Prof Finn and Dr Brinkworth’s project is one of only 14 successful Grand Challenge proposals (out of more than 200 applications) to win funding from the $730 million Next Generation Technologies Fund.

Using the algorithm inspired by insect neurology and physiology, their research team has developed electro-optic, infra-red and acoustic sensor technologies which can detect remotely-piloted aircraft at impressive distances.

“What we have done is to transition the model of the hoverfly beyond the biology and simulation and put it on to embedded computers,” Prof Finn says. “These are small, portable systems that allow us to process images and data at around 100 frames a second, identifying targets in very complex settings in real time, even when they occupy less than one image pixel or are practically inaudible.”
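Detecting a sub-pixel target in clutter typically relies on temporal filtering of the kind insect vision performs. The sketch below is illustrative only (it is not the hoverfly model the team deployed): subtracting a slow temporal average from a fast one leaves a band-pass response in which a tiny moving target makes a pixel flicker and stand out, while static clutter cancels to roughly zero.

```python
import numpy as np

def small_target_response(frames, tau_fast=1.0, tau_slow=4.0):
    """Illustrative small-target motion filter (a sketch, not the
    team's actual embedded model).

    Two leaky temporal averages with different time constants are
    subtracted; static background cancels, transient changes from a
    small moving target remain.
    """
    frames = np.asarray(frames, dtype=float)
    a_fast = 1.0 / tau_fast   # fast path follows the input closely
    a_slow = 1.0 / tau_slow   # slow path lags behind
    fast = frames[0].copy()
    slow = frames[0].copy()
    responses = []
    for f in frames:
        fast += a_fast * (f - fast)            # quick low-pass
        slow += a_slow * (f - slow)            # sluggish low-pass
        responses.append(np.abs(fast - slow))  # band-pass magnitude
    return np.stack(responses)
```

Because each frame update is a handful of per-pixel operations, a filter of this shape is cheap enough to run on the kind of small embedded hardware Prof Finn describes.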

“If somebody wants to stop a major airport from operating, all they have to do is fly a small drone in the vicinity. We saw this in December 2018 when hundreds of flights were cancelled at Gatwick Airport near London following drone sightings close to one of the runways.”

Working in conjunction with a Sydney-based company, Midspar Systems, and the DST Group, Prof Finn and Dr Brinkworth have been able to significantly extend the range at which drones can be detected while simultaneously “massively reducing” the false alarm rates in cluttered environments.

The drone detection project is expected to be completed by the end of 2020.

A video explaining the research is available at https://youtu.be/qEnI_Hd1mZ0
