Humans get a real buzz from the virtual worlds of gaming and augmented reality, and now scientists have trialled these new-age technologies on small animals, testing the reactions of tiny hoverflies and even crabs.
In a bid to understand the aerial prowess of flying insects and other little-understood animal behaviours, the Flinders University-led study offers new perspectives on how invertebrates respond to, interact with and navigate virtual ‘worlds’ created by advanced entertainment technology.
Published in the journal Methods in Ecology and Evolution, the new gaming software was developed by experts at Flinders University, led by coauthor Professor Karin Nordström, who heads the Hoverfly Motion Vision Lab, together with experts from Western Australia and Germany.
The novel study aims to augment ongoing research into new technologies, including aviation and other precision devices, and provides researchers around the world with access to the specially designed software platform.
The new research brought together biologists, neuroscientists and software experts, including Flinders University researchers Dr Yuri Ogawa, Dr Richard Leibbrandt and Raymond Aoukar, as well as Jake Manger and colleagues from The University of Western Australia.
“We developed computer programs that create a virtual reality experience for the animals to move through,” says Dr Ogawa, a Research Fellow in Neuroscience at the Flinders Health and Medical Research Institute.
“Using machine learning and computer vision algorithms, we were able to observe the animals and work out what they are doing, whether that is a hoverfly attempting to turn to the left in its flight, or a fiddler crab avoiding a virtual bird flying overhead.
“The software then adapts the visual scenery to match the movements that the animal has made.”
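The closed-loop idea can be illustrated with a minimal sketch: estimate the animal's movement from tracked positions, then update the virtual camera so the scenery matches that movement. The Python below is a hypothetical toy, not the published Unity/CAVE implementation; the simulated frames, blob "tracking" and VirtualScene class are stand-ins for the real machine-vision pipeline and renderer.

    # Hypothetical toy loop: track a bright blob in simulated frames, estimate how
    # much it has turned around the arena centre, and update a virtual camera so
    # the displayed scenery matches that movement. All names here are illustrative.
    import numpy as np


    def centroid(frame: np.ndarray) -> np.ndarray:
        """Stand-in for real machine-vision tracking: centroid of bright pixels."""
        ys, xs = np.nonzero(frame > 0.5)
        return np.array([xs.mean(), ys.mean()])


    def heading_change(prev_pt: np.ndarray, curr_pt: np.ndarray, centre: np.ndarray) -> float:
        """Angular change (degrees) of the tracked point around the arena centre."""
        a0 = np.arctan2(*(prev_pt - centre)[::-1])
        a1 = np.arctan2(*(curr_pt - centre)[::-1])
        return float(np.degrees(a1 - a0))


    class VirtualScene:
        """Minimal placeholder for the renderer: one state variable, the camera yaw."""

        def __init__(self) -> None:
            self.camera_yaw_deg = 0.0

        def rotate(self, delta_deg: float) -> None:
            # Rotate the virtual camera by the estimated turn so the scenery
            # moves the way it would if the animal had really turned.
            self.camera_yaw_deg = (self.camera_yaw_deg + delta_deg) % 360.0


    scene = VirtualScene()
    centre = np.array([50.0, 50.0])
    prev = None
    for t in range(5):
        # Fake a 100x100 camera frame with one bright pixel moving on a circle.
        frame = np.zeros((100, 100))
        angle = np.radians(10 * t)
        frame[int(50 + 30 * np.sin(angle)), int(50 + 30 * np.cos(angle))] = 1.0

        pt = centroid(frame)
        if prev is not None:
            scene.rotate(heading_change(prev, pt, centre))  # closed-loop update
        prev = pt
        print(f"frame {t}: camera yaw = {scene.camera_yaw_deg:.1f} deg")

In the actual system the tracking runs on live camera frames and the scene is rendered in Unity at low latency; the sketch only shows the control flow of sensing a movement and feeding it back into the display.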
Study coauthor Dr Richard Leibbrandt, a lecturer at Flinders University's College of Science and Engineering, says the machine learning technologies used in the experiments are already revolutionising industries such as agriculture, for example in automatically monitoring crops and livestock, and in the development of agricultural robots.
“Virtual and augmented reality are also instrumental in industries ranging from healthcare to architecture and transport,” says Dr Leibbrandt.
“This new virtual world for invertebrates is starting to unlock new ways to study animal behaviour in greater detail than ever before,” adds Mr Aoukar, a Flinders University computer science graduate.
“The last two decades have seen very rapid advances in algorithms and computer technology, such as virtual reality, gaming, artificial intelligence, and high-speed calculation using specialised computer hardware in graphics cards,” says Mr Aoukar.
“These technologies are now mature and accessible enough to run on consumer computer equipment, which opens up the chance to study animal behaviour in an environment that is systematically controlled, but still more natural than a typical lab experiment.”
Beyond observing and quantifying behaviour, the new technique allows researchers to identify the visual triggers of that behaviour.
Professor Nordström says other research groups are already taking an interest in using the new platform, which is described in the new article and freely available for download.
“This has truly been a team effort where every author on the paper has been instrumental in making the VR work.
“We look forward to using the VR to investigate the mechanisms underlying decision-making in insects,” says Professor Nordström.
CAVE, an open source project developed by the Hoverfly Motion Vision Lab, is designed to streamline the process of setting up a tethered flight arena. Its user-friendly Unity Editor interface simplifies experimental design and data storage without the need for coding.
The article, ‘Combining Unity with machine vision to create low latency, flexible, and simple virtual realities’ (2024) by Yuri Ogawa, Raymond Aoukar, Richard Leibbrandt, Jake S Manger, Zahra M Bagheri, Luke Turnbull, Chris Johnston, Pavan K Kaushik (Max Planck Institute of Animal Behavior, Konstanz, Germany), Jan M Hemmi and Karin Nordström, has been published in Methods in Ecology and Evolution. DOI: 10.1111/2041-210X.14449.
Caption: Dr Richard Leibbrandt, Professor Nordström and Flinders University colleagues Raymond Aoukar and Dr Yuri Ogawa demonstrate the new program.
Acknowledgements: This research was funded by the US Air Force Office of Scientific Research (AFOSR, FA9550-19-1-0294 and FA9550-23-1-0473), the Australian Research Council (ARC, DP180100491, FT180100289, DP200102642, DP210100740, and DP230100006) and the Flinders Foundation.
Y Ogawa, R Aoukar, R Leibbrandt, and J Manger contributed equally to the new study. Researchers thank the Biomedical Engineering team at SA Local Health Network (SALHN) and the South Australian Botanic Gardens for their help.
Journal: Methods in Ecology and Evolution
Method of Research: Experimental study
Subject of Research: Animals
Article Title: Combining Unity with machine vision to create low latency, flexible, and simple virtual realities
Article Publication Date: 26-Nov-2024