Multimedia Gallery
Sensory Navigation of the Fruit Fly (Image 3)
A side view of a fruit fly of the species Drosophila melanogaster, tethered to a thin steel rod, where it adopts a flight posture, tucking in its legs (retracting the landing gear) and beating its wings. With the fly suspended in this "virtual reality" arena, its wing motions are tracked optically and used to control an electronic wrap-around display, so that the fly can effectively steer through its visual surroundings as if in free flight.
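The closed-loop principle described above can be illustrated with a toy feedback loop: the difference between left and right wingbeat amplitudes is read out and used to rotate the panorama in the opposite direction, so the tethered fly's attempted turns move the scene as free flight would. This is only a minimal sketch; the function names, gains and simple proportional model are illustrative assumptions, not the researchers' actual software.

```python
# Toy model of a closed-loop tethered-flight arena. The fly's steering
# effort (delta_wba, the left-minus-right wingbeat amplitude difference)
# rotates the visual panorama, closing the loop between wing motion and
# the display. All parameters here are illustrative assumptions.

def update_panorama(panorama_angle, delta_wba, gain=5.0, dt=0.01):
    """Rotate the display opposite to the fly's attempted turn (degrees)."""
    return (panorama_angle - gain * delta_wba * dt) % 360.0

def fly_steering_response(stripe_angle):
    """Toy fly controller: beat harder on one side to turn toward a
    stripe at stripe_angle (degrees; 0 = straight ahead)."""
    error = (stripe_angle + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return 0.1 * error  # steering effort proportional to bearing error

# Closed loop: a stripe starts 90 degrees off-axis; the feedback brings it
# toward the front of the arena, as if the fly had turned to face it.
angle = 90.0
for _ in range(1000):
    delta_wba = fly_steering_response(angle)
    angle = update_panorama(angle, delta_wba)
print(round(angle, 1))  # stripe ends up close to straight ahead
```

In the real apparatus the wingbeat signal comes from optical sensors rather than a modeled controller, but the feedback structure (wing asymmetry in, display rotation out) is the same idea.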
The experiment was performed by researchers at the University of California, Los Angeles (UCLA), whose research found that Drosophila, with a brain smaller than a poppy seed, combines massive amounts of information from its senses of smell and vision, then transforms these sensory signals into stable yet flexible flight behavior that leads it to a food source. Understanding the integration of these sensory cues could be relevant to developing smarter robotic drones. (Date of Image: 2003)
More about this Image
Mark Frye, an assistant professor in the UCLA Department of Physiological Science and a member of the UCLA Brain Research Institute, and Brian Duistermars, a doctoral candidate and member of Frye's lab, performed research to see how vision allows flies to navigate odors.
To learn how flies stay on the odor course by combining sensory information, Frye and Duistermars designed a "virtual plume simulator" in which a fly is tethered but free to steer into and out of a plume of vinegar odor (the equivalent of nectar to a fruit fly). At the same time, a cylinder around the fly displayed a variety of background images meant to roughly mimic what the fly might encounter in the real world, such as blades of grass or twigs on a tree.
In free flight, a fly's path is normally characterized by segments of straight flight interspersed with transient "spikes" called saccades, in which the fly veers left and right. On the cylinder, the researchers displayed a visual backdrop of equally spaced, high-contrast vertical stripes, and then periodically switched the vinegar plume between the 0- and 180-degree positions within the circular arena while tracking the fly's heading. Under these conditions, the fly periodically encountered the plume by steering into it. Upon plume contact, the fly performed fewer saccades, presumably because it was homing in on and following the plume. Yet the saccades were not altogether absent, Frye noted, an apparent attempt on the fly's part to "constantly seek the sweet spot--the strongest scent--of the plume."
To examine the visual influence on odor-tracking accuracy, the researchers changed the visual backdrop, alternating a sequence of three visual treatments: high-contrast stripes, uniform grayscale and a second high-contrast treatment. Each fly started within the vinegar plume and was exposed to the three visual stimuli at 20-second intervals. When the high-contrast striped panoramas appeared, the flies were able to maintain their heading into the plume. But when the flat, featureless grayscale was displayed, the flies steered out of the plume and began generating saccades. Even though they occasionally reencountered the plume within the grayscale panorama, they were unable to remain there until the high-contrast pattern reappeared, at which point accurate plume tracking resumed.
Frye said this shows the flies are using the backdrop as a physical marker to keep track, along with their olfactory sense, of where the plume is. Such crossmodal integration at the behavioral and cellular level, Frye said, represents a functional adaptation for distinguishing and responding to critically important features of a complex sensory environment.
This research was part of a National Science Foundation grant (IOS 07-18325) titled "Cross-modal Sensory Fusion for Flight Control in Drosophila." The work is aimed at understanding how visual and olfactory sensory signals are integrated for the control of complex search behavior.
Credit: Mark Frye, UCLA
Images and other media in the National Science Foundation Multimedia Gallery are available for use in print and electronic material by NSF employees, members of the media, university staff, teachers and the general public. All media in the gallery are intended for personal, educational and nonprofit/non-commercial use only.
Images credited to the National Science Foundation, a federal agency, are in the public domain. The images were created by employees of the United States Government as part of their official duties or prepared by contractors as "works for hire" for NSF. You may freely use NSF-credited images and, at your discretion, credit NSF with a "Courtesy: National Science Foundation" notation.