Category: NCBS Research

Tags: insects, NICE lab, ecology, virtual reality

Date: Tuesday, May 19, 2020

If you’ve ever tried to swat a fly, you know how hard it is to follow its movements as it ducks and weaves to escape. It is easy to appreciate, then, that a scientist trying to observe and understand the behavior of insects in the natural world has their work cut out for them. Rather than attempt the (nearly) impossible task of following flies around, scientists from Shannon Olsson’s lab at the National Centre for Biological Sciences, Bangalore, brought them indoors, into a world created just for them.

The researchers used virtual reality (VR) tools to simulate a complex, naturalistic 3D environment for insects to navigate. They showed that insects can respond to three-dimensional objects in a virtual world and can also use odor and wind cues to make choices. Olsson has long wondered how tiny insects are so good at finding things that are several kilometers away: how a mosquito seeks out a human, or how an apple fly locates an apple tree. “With the VR that we built we can begin to get at this question, of what it is that causes them to make certain choices,” says Olsson.

The use of VR to study insect behavior is not new. For several decades, scientists have presented insects with simple sensory cues, like stripes, that simulate a sense of motion. “Stripes are a very neat, simple structure to understand the physiology or mechanism of vision. But they don’t help us understand behavior,” explains Pavan Kumar Kaushik, graduate student and chief architect of the lab’s virtual fly world, in which a tethered fly is presented with a panoramic natural setting of sky, grass and trees. The fly responds by beating its wings as it would in free flight, and the screen updates in response to the wingbeats, giving the fly the illusion of actually navigating the world. It’s the fly’s version of a video game. Mosquitoes, apple flies and other insects sought out objects of interest in the VR world just as they would in nature, the researchers report in their PNAS paper published earlier today.
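The closed loop between wingbeats and screen can be pictured as a simple pose update. The sketch below is illustrative only: the function name, gain and speed constants are invented for this example, and the real rig's calibration and rendering are far more sophisticated.

```python
import math

def update_pose(x, y, heading, left_amp, right_amp, dt,
                turn_gain=0.5, speed=1.0):
    """One closed-loop step: the difference between left and right
    wingbeat amplitudes is read as intended yaw, and the virtual
    camera pose is updated before the scene is redrawn.
    turn_gain and speed are illustrative constants, not values
    from the actual apparatus."""
    heading += turn_gain * (left_amp - right_amp) * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

# With symmetric wingbeats the tethered fly "flies" straight ahead;
# any asymmetry turns the whole virtual world around it.
x, y, heading = 0.0, 0.0, 0.0
for _ in range(100):
    x, y, heading = update_pose(x, y, heading,
                                left_amp=1.0, right_amp=1.0, dt=0.01)
```

Because the fly's own motor output drives the camera, the insect experiences the visual consequences of its "flight" even though it never leaves its tether.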

The final touches of reality come in the form of airflow to mimic wind, with puffs of odor added to the mix. These were by far the most challenging aspects to recreate, says Kaushik, who spent more than three years perfecting them. “When you have visual systems you can simply look to see if something is wrong,” he says. For the other cues, he had to use the behavior of the insect itself to troubleshoot and tune the system until it mimicked reality as closely as possible.

“What is really interesting about [this] setup is that it attempts to bridge the gap between traditional VR—simple shapes, simple environments—and the natural world of insects,” says Vivek Jayaraman, a cognitive neuroscientist from the Janelia Research Campus at the Howard Hughes Medical Institute in the US.

For their studies of behavior, the researchers chose the apple fly, Rhagoletis pomonella. “It’s a specialist—it only likes apple trees—so we don’t have to worry about whether we’re giving it the right stimuli,” explains Olsson. What’s more, a series of studies conducted over 50 years in apple orchards and elsewhere has uncovered the shape of trees the flies like, the kind of fruit they like and the kind of smells they like, giving Kaushik and Olsson ground truth on which to base the VR.

“[The VR arena] is pretty well-controlled, so they don’t lose the experimental advantages of VR. What they gain is the ability to glean insights into some of the trickier aspects of insect navigation—long-range localization of an odor source, visual algorithms that the insects use to make decisions about whether or not to approach something, and how olfactory, wind and visual cues are combined to get to a food source,” says Jayaraman, who got a first-hand look at the setup on a visit to the lab.

To begin with, the researchers measured the distance beyond which the apple fly no longer flies toward an apple tree. In one of their most fascinating experiments, they showed that flies can use a phenomenon called motion parallax to perceive the depth of an object against a complex background. They presented the fly with two trees that appear equal in size, but as the fly moves closer, one tree expands much faster, because it is actually twice as close. By choosing to maneuver toward this nearer tree instead of the farther one, the flies showed that they can discern depth from motion and use this information to locate food sources.
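The geometry behind this trick can be seen with a little arithmetic. In the sketch below (illustrative sizes and distances, not the paper's stimulus parameters), two trees subtend the same visual angle at the start, yet the nearer one looms far faster as the observer approaches.

```python
import math

def angular_size(radius, distance):
    """Visual angle (radians) subtended by an object of a given
    radius viewed from a given distance."""
    return 2 * math.atan(radius / distance)

# A far tree that is twice as distant but also twice as large
# subtends exactly the same visual angle as the near tree.
near = angular_size(1.0, 10.0)
far = angular_size(2.0, 20.0)
print(near, far)  # equal at the start

# After the observer advances 5 units, the near tree's image has
# grown much faster than the far tree's: motion parallax reveals
# which tree is actually closer.
print(angular_size(1.0, 5.0), angular_size(2.0, 15.0))
```

This differential expansion rate is the cue the flies appear to exploit: identical static images, but very different dynamics under self-motion.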

Airflow cues also help the flies orient themselves, and more so, the researchers find, when visual cues are absent. This is akin to how we rely on sound for orientation when we cannot see an object, as when we are trying to locate a buzzing cellphone that is out of sight. When orienting to smell, on the other hand, the flies need a world they can “see”: without visual and wind cues, they cannot locate the odors they detect. “Earlier, there was no way to really isolate these cues. Our VR allows us to do that and show how they use these cues in combination,” says Olsson.

Having demonstrated that flies use perspective and motion parallax in flight, and how they combine different sensory inputs to locate objects, the researchers can now use their arena to explore which key features of the world are necessary for these behaviors. They can manipulate individual features that are impossible to change in the real world (a world turned upside down, perhaps?) in ways that will help them unravel the core aspects of insect navigation. The insights they glean can inform ecological models, robotics, search algorithms and a variety of other applications.

Cover image information: A tethered apple fly (Rhagoletis pomonella) responding to airflow and odor stimuli in a multimodal virtual reality arena. Flies can distinguish the size and distance of virtual objects amidst a complex background and incorporate directional airflow and odor to navigate towards distant targets. Photograph courtesy of Shoot for Science: Deepak Kakara, Dinesh Yadav, Sukanya Olkar, and Parijat Sil.