How do apple flies find apples?
Understanding the mechanistic bases of targeted search behaviour
Pavan Kumar Kaushik
Shannon Olsson
Naturalist Inspired Chemical Ecology Lab
National Centre for Biological Sciences
Finding objects is necessary for all motile organisms, but it is not easy. Most animals on this planet are flying insects, and most of these are solitary, short-lived, and live in an ever-changing environment. They must therefore rely on their sensory cues, primarily vision, olfaction and mechanosensation. These individual cues are, by themselves, ambiguous and insufficient for finding objects in a complex world. For example, there are many red and round objects which are not apples, and many fruits with similar volatile compositions, which makes a unimodal targeted search strategy ineffective. A combination, however, such as a red, round object emitting fruity volatiles, is far more likely to be an apple; the combination makes the search object more distinct, suggesting multimodal sensory integration. Yet, because of complex visual scenery and the turbulent nature of odour, the sensory input is often partially missing, occluded, or confounding. To understand how organisms cope with such stochastic inputs and make decisions on the "fly", we need three things: precise knowledge of what the animal perceives (the input), knowledge of what it does in response (the output), and the ability to manipulate the ground rules of the environment (the control) in real time to observe changes in input/output relationships. To address these three demands, we have built a virtual reality arena which can present arbitrary and realistic visual, olfactory and mechanosensory input to a tethered insect. What the insect perceives depends on its instantaneous behavioural output, viz., wingbeat amplitude difference, and the rules of the environment are controlled by the experimenter. Using this arena, we will assess the search algorithms used by flying insects during decision making to create generalizable concepts for targeted search behaviour that can be applied to, and tested in, other systems.
Locating objects in a complex environment is hard. A large number of search algorithms have been proposed for both biological and non-biological entities such as UAVs (Ramirez et al. 2011; Vergassola et al. 2007). In all cases, the searcher must not only integrate multiple sensory inputs, but must also contend with stochastic, obscured, or missing information to locate its target. As such, one might hypothesize that a search algorithm accounting for such variables could be widely applicable across systems. However, before such a generalized algorithm can be created, we need to understand what its components might be. For simplicity, we ideally need a model system for which we know the input variables concerning the object, as well as the extent of their generalization or specialization.
Rhagoletis pomonella, the apple maggot fly, has been an ideal model system not only for sympatric speciation (Bush 1969) but also for its targeted search behaviour in locating apples. The adults first mate, and then the females oviposit in ripe apples. Subsequently, the larvae feed on the fruit and emerge as adults to locate new host trees. The host search behaviour is innate and well understood. Rhagoletis has been studied from ecological, ethological and physiological standpoints using field trap assays (Prokopy 1968), wind tunnel flights (Zhang et al. 1999) and neuronal recordings (Olsson et al. 2006). This has led to the identification of key visual (Prokopy 1968; Prokopy & Bush 1973; Moericke et al. 1975) and olfactory (Zhang et al. 1999) features for host search. In fact, the entire host search has been modelled as a hierarchical process (Roitberg 1985): at long distances the flies perform optomotor anemotaxis following the odour plume, and as they approach, they use the shape, size, luminance and hue of visual features to initiate landing.
The search behaviour is the output of a continuous evaluation of current and past sensory inputs (vision, olfaction, mechanosensation, etc.) (Goyret et al. 2007; Goyret et al. 2008) and internal states (hunger, sexual maturity, etc.). Recent studies (van Breugel & Dickinson 2014; van Breugel et al. 2015) have shown that a brief past exposure to odour gates visual preference, turning an initially unattractive visual cue into a strongly attractive one. Olfaction thus gives context to visual inputs, underscoring the importance of time and sensory history in decision making.
Another fundamental problem with multimodal sensory integration is confounding cues (Green & Angelaki 2010). More often than not, the directions of the target inferred from vision and from odour do not align, owing to the turbulent nature of odour dispersal (Cardé & Willis 2008), posing fundamental challenges for sensory integration. All of this suggests that, if we wish to understand the complex decision making underlying host search, we must be able to independently manipulate the sensory stimuli across time and space while simultaneously monitoring the motor output in closed loop.
Field studies performed in natural settings have little, if any, data about the insect's sensory input. Physiological studies have precise control over sensory inputs, but are mostly performed in open loop and can focus on only small parts of the processing system at a time. Wind tunnel studies can measure both sensory input and behavioural output, but lack precise manipulation of stimuli in real time (van Breugel et al. 2015). Instead of relying on multiple snapshots of behaviour and sensory information and filling the gaps with informed guesses, we can now measure them with the help of virtual reality (VR). With the advancement of technology, VR addresses many of these issues by providing dynamic control over input and output in behaving animals.
Current virtual reality arenas are primarily used in walking assays (Aronov & Tank 2014; Chiappe et al. 2010), and those that study flight behaviour close the loop primarily in the visual modality (Fry et al. 2008). The few that use odour in VR treat it as a categorical variable, with no spatial structure to the odour and no control of wind direction; such control is critical if one wants to understand behaviours such as plume following. To our knowledge, no closed-loop olfactory-visual arena exists in which the sensory inputs can be modulated on the fly, owing to technical limitations. Here, we have developed a virtual reality arena which provides olfactory, visual and airflow stimuli in closed loop and in which the inputs can be manipulated in real time.
Using this VR arena, the aim is to understand, essentially, how apple flies find apples, and then to establish a generalizable search algorithm that can be tested across many systems. Specific questions include the following:
We hope this study will create a platform to trigger more research into the conservation and diversity of search algorithms across taxa.
Before we test any hypotheses, we need to make sure all VR systems work as expected.
A one-to-one mapping between VR and real-world coordinate space. Visual objects and wide-field motion should have the same angular size and expansion rate as in the real world.
Precise control of odour onsets and offsets. Stable odour concentration levels over experimental timescales with minimal bleed and latency.
A rational and systematic way to find the closed-loop gain and the DC offset of the fly.
Flies should be able to counteract imposed turns, respond to looming stimuli, and show landing reflexes at high looming rates.
Flies should be able to discriminate foreground from background and fly towards the objects in VR.
Flies should be able to orient upwind, cast and surge on odour exposure, and follow the virtual plume.
Flies should find objects sooner, more often, and more reliably when given multimodal sensory input than with a single modality.
Observe fly behaviour under various combinations of confounding input and test different hypotheses of integration.
Use existing data to model the fly search behaviour with testable predictions.
Field studies have shown that both visual and olfactory preferences are in play. Briefly:
These data lead to the premise that there exists a differential preference among objects. A simple hypothesis stemming from this is one of hierarchical preference: different cues lie at different levels of a hierarchy, and the local maximum in the current sensory space is the preferred cue. For example, given a choice between an apple and anything else, the apple is always preferred.
An alternative is a context-based approach supported by van Breugel et al. (2015). The premise here is that there exists an "ideal" sequence of events in which decisions made in response to current cues are affected by past data. Different cues lie at different points in a sequence, and the cue succeeding the current point in the sequence is the preferred one. For example, one searches for tree-like objects only after prior exposure to apple volatiles, and only after a successful tree search is the search for an apple executed.
To test these hypotheses, I shall check for history dependence in choice assays: is attraction to red spheres gated by the history of sensory input? The hierarchical hypothesis predicts that the attraction is invariant with time, whereas the context-based hypothesis suggests that prior exposure to apple volatiles or tree-like objects would increase the attraction to red spheres. Both simple hypotheses may prove too naive to solve the hard search problem, and many more hypotheses may need to be culled before we arrive at one that survives, which might well be a mixture of the current two.
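As an illustration only, the two hypotheses can be written as toy decision rules. The cue names, the hierarchy and the sequence below are hypothetical placeholders, not measured preferences or the actual analysis code.

```python
# Hypothetical sketch contrasting the two hypotheses as decision rules.
# Cue names, the hierarchy and the sequence are illustrative assumptions.

HIERARCHY = ["apple", "tree", "grass"]           # assumed fixed preference order

def hierarchical_choice(visible_cues):
    """Pick the highest-ranked cue currently in view, regardless of history."""
    for cue in HIERARCHY:
        if cue in visible_cues:
            return cue
    return None

SEQUENCE = ["apple_volatiles", "tree", "apple"]  # assumed "ideal" sequence of events

def context_choice(visible_cues, history):
    """Pick the cue that succeeds the last completed step of the sequence."""
    last = max((SEQUENCE.index(h) for h in history if h in SEQUENCE), default=-1)
    target = SEQUENCE[min(last + 1, len(SEQUENCE) - 1)]
    return target if target in visible_cues else None

# Hierarchical rule: a red sphere (apple-like) is attractive whenever visible.
# Context rule: it becomes attractive only after apple volatiles and a tree.
```

The two rules make opposite predictions for the history-dependence assay described above, which is what the choice experiments are designed to separate.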
Three high-refresh-rate gaming monitors (165 Hz) arranged in a prism geometry provide the visual input to the flies in the custom-built VR arena.
A high-speed valve controls the odour input: when the valve is turned on, odour-infused air is pushed through a capillary that ends millimetres from the fly's antennae, minimizing latency.
A new revolver design ("rasen shuriken") whose alignment with the airflow outlets determines the wind direction.
The code integrates all the sensors and actuators along with the display outputs seamlessly using multiple open source libraries. The code can be found at https://github.com/pvnkmrksk/world
The visual stimuli are generated by a custom game built using the Panda3D game engine.
The valves and the servo arm are controlled through an Arduino serial port and interfaced on the ROS network. The valve is switched on and off based on the fly's virtual position within the predefined odour field, and the revolver angle is set based on the current pose (position and orientation) with respect to the local wind field.
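A minimal sketch of this logic is shown below. The circular odour patches, the function names and the default values are illustrative assumptions (the 16 outlets match the parameter table); in the actual setup the resulting valve state and servo angle are sent to the Arduino over ROS.

```python
import math

def valve_state(fly_xy, odour_sources, radius=0.5):
    """True (valve open) when the fly's virtual position lies inside the
    predefined odour field, modelled here as circular patches around sources."""
    x, y = fly_xy
    return any(math.hypot(x - sx, y - sy) < radius for sx, sy in odour_sources)

def revolver_angle(fly_heading_deg, wind_from_deg, n_channels=16):
    """Servo angle for the revolver so that the active outlet reproduces the
    local wind direction in the fly's frame (16 outlets -> 22.5 deg resolution)."""
    relative = (wind_from_deg - fly_heading_deg) % 360.0
    step = 360.0 / n_channels
    return (round(relative / step) * step) % 360.0
```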
The wing beat amplitude difference (WBAD) is used as a proxy for the fly's intended turning direction and is measured by tracking the wing edges in real time.
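A minimal sketch of one closed-loop update step under these definitions is given below; the fixed time step and the use of the parameter table's gain and flight-speed values as constants are assumptions for illustration, not the arena's actual game loop.

```python
import math

def update_pose(x, y, heading_deg, wbad, dt, gain=1320.0, speed=1.0):
    """One closed-loop step: WBAD (left minus right wing amplitude, radians)
    sets the yaw rate through the gain (degrees per radian per second, as in
    the parameter table); forward speed is held at the table's ~1 m/s."""
    heading_deg = (heading_deg + wbad * gain * dt) % 360.0
    heading_rad = math.radians(heading_deg)
    x += speed * math.cos(heading_rad) * dt
    y += speed * math.sin(heading_rad) * dt
    return x, y, heading_deg
```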
All the parameters were optimized; the values and their rationales are listed in the parameters table below.
We present a spherical view onto planar screens. An object spanning a fixed number of pixels subtends different visual angles depending on where it is on the screen, because the centre of the screen is closer to the fly than the periphery. To counteract this, the images are distorted and rendered such that objects subtend the same visual angle regardless of where they appear on the screen.
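A toy calculation of this correction, assuming a single flat screen perpendicular to the fly's view axis at distance d, is sketched below. It only illustrates why equal pixel steps do not give equal visual angles and how the tangent mapping fixes this; it is not the actual three-monitor rendering code.

```python
import math

def screen_x_for_azimuth(azimuth_deg, screen_distance):
    """Horizontal screen position at which a given viewing angle should be
    drawn: equal visual angles map to x = d * tan(theta), not equal pixels."""
    return screen_distance * math.tan(math.radians(azimuth_deg))

def angle_between_points(x1, x2, screen_distance):
    """Visual angle (degrees) subtended between two screen positions."""
    return math.degrees(math.atan2(x2, screen_distance)
                        - math.atan2(x1, screen_distance))

# e.g. with the screen 10 cm away, a 1 cm step at the screen centre spans
# ~5.7 deg, while the same 1 cm step starting 10 cm off-centre spans ~2.7 deg.
```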
Insects use motion parallax to estimate sizes, so it is critical that the rate of image expansion matches the real world at typical flight speeds. The coordinate system was defined to meet this requirement by scaling visual features such as grass and trees appropriately.
Screenshot of the visual input showing the rich scenery of grass, sky, trees and fruits. The distortion is applied so that the scene appears undistorted from the fly's viewpoint.
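As a back-of-the-envelope check of the scaling, the angular size and expansion rate of an object can be computed as below. The 10 cm fruit and 1 m/s approach speed are the values from the parameter table; the formulas are standard geometry rather than the arena's code.

```python
import math

def angular_size_deg(size, distance):
    """Visual angle subtended by an object of the given size at a distance."""
    return math.degrees(2.0 * math.atan(size / (2.0 * distance)))

def expansion_rate_deg_per_s(size, distance, approach_speed):
    """Image expansion rate for a straight, constant-speed approach:
    d(theta)/dt = size * v / (distance**2 + size**2 / 4)."""
    return math.degrees(size * approach_speed /
                        (distance ** 2 + size ** 2 / 4.0))

# e.g. a 10 cm fruit seen from 1 m subtends ~5.7 deg and, approached at 1 m/s,
# expands at ~5.7 deg/s; the VR scene should reproduce both numbers.
```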
The odour lines were designed for minimal latency and balanced airflow, and can deliver both attractive and repulsive odours with arbitrary pulse widths.
PID trace of the odour stimulus. Arbitrary pulse widths can be delivered with a small but finite latency (~50 ms).
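One way to quantify such a latency from the recorded traces is a cross-correlation between the valve command and the PID signal. The sketch below illustrates that idea; it is an assumption, not necessarily how the ~50 ms figure was obtained.

```python
import numpy as np

def odour_latency_ms(valve_cmd, pid_trace, fs_hz):
    """Estimate the delay between the valve command and the PID-detected odour
    as the lag (in milliseconds) that maximizes their cross-correlation."""
    v = np.asarray(valve_cmd, float) - np.mean(valve_cmd)
    p = np.asarray(pid_trace, float) - np.mean(pid_trace)
    xcorr = np.correlate(p, v, mode="full")          # lags of PID relative to valve
    lag_samples = np.argmax(xcorr) - (len(v) - 1)    # positive: PID lags valve
    return 1000.0 * lag_samples / fs_hz
```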
Flies counteract imposed turns and also respond to fast looming stimuli by showing landing reflexes.
Imposed response = wing beat amplitude difference (WBAD) × gain
Compensation = imposed response − imposed stimulus
Externally imposed visual stimuli elicit a compensatory response; the difference between the response and the imposed stimulus is the compensation. The green lines are the average compensation during each impose pulse. The visual latency and the behavioural saturation due to the mechanical constraint of the wings can be seen.
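Following the two definitions above, the per-sample compensation can be computed as sketched below; the array names are placeholders.

```python
import numpy as np

def compensation(impose, wbad, gain):
    """Per-sample compensation following the definitions above:
    imposed response = WBAD * gain, compensation = imposed response - impose."""
    response = np.asarray(wbad, float) * gain
    return response - np.asarray(impose, float)

# The mean of the compensation within an impose pulse is the uncompensated
# signal; its standard deviation is taken as the noise (used below to pick the gain).
```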
Flies were subjected to externally imposed yaw turns at varying angular velocities and at different gains. The gain at which they were both stable and still able to turn freely was chosen.
Finding the right gain. With increasing gain there is more noise (S.D. of the compensation), but also less uncompensated signal (mean of the compensation). At the intersection, stability and manoeuvrability are comparable, making that region of gain most appropriate.
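A sketch of how that intersection might be located numerically is shown below; normalizing the noise and uncompensated-signal curves to their maxima and taking their closest point as the "intersection" are assumptions, not the exact procedure used.

```python
import numpy as np

def choose_gain(gains, comp_by_gain):
    """comp_by_gain[g] holds the compensation samples recorded at gain g.
    The uncompensated signal (|mean|) falls with gain while the noise (S.D.)
    rises; pick the gain where the two normalized curves come closest."""
    mean_c = np.array([abs(np.mean(comp_by_gain[g])) for g in gains])
    sd_c = np.array([np.std(comp_by_gain[g]) for g in gains])
    mean_n, sd_n = mean_c / mean_c.max(), sd_c / sd_c.max()
    return gains[int(np.argmin(np.abs(mean_n - sd_n)))]
```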
An elaborate GUI was built to keep track of a large number of parameters and to modify, save and reload them, ensuring repeatability and ease of use. It live-updates trajectories, histograms, heading angles and critical parameters for monitoring the experiment and adjusting the DC offset.
Screenshot of the GUI showing the live status of the experiment (left) and the manipulation of run parameters from saved templates (right).
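Saving and reloading parameter templates can be as simple as the YAML round-trip below. This is only an illustration of the idea; the GUI in the repository may store its templates differently.

```python
import yaml  # PyYAML

def save_params(params, path):
    """Write the current run parameters to a reusable template file."""
    with open(path, "w") as f:
        yaml.safe_dump(params, f, default_flow_style=False)

def load_params(path):
    """Reload a saved template so a run can be repeated with identical settings."""
    with open(path) as f:
        return yaml.safe_load(f)
```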
Flies are able to discriminate foreground from background and fly towards objects in the VR. They approach trees and other objects and show characteristic stabilization responses. More distant objects are less likely to be visited, suggesting that the implementation of size and scaling is reasonable.
Effect of distance on object homing. As objects are placed further away, the probability of finding them decreases.
The tortuosity of the flight trajectory increases on exposure to odour. This response persists even after the odour is turned off, suggesting working memory.
Effect of odour packet frequency on virtual plume tracking. As the packet frequency (pf) increases from 0 Hz through 1 Hz and 8 Hz to 16 Hz, tortuosity is enhanced. However, the lack of directional cues from both visual side-slip and mechanical airflow may have prevented any headway in following the virtual plume.
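Tortuosity here can be taken as path length divided by net displacement, one common definition; the sketch below assumes that definition and x, y trajectory arrays sampled at a fixed rate, so the exact metric used in the analysis may differ.

```python
import numpy as np

def tortuosity(x, y):
    """Path length divided by net displacement for a trajectory segment:
    1 for a straight path, larger for more convoluted (casting-like) flight."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    path = np.sum(np.hypot(np.diff(x), np.diff(y)))
    net = np.hypot(x[-1] - x[0], y[-1] - y[0])
    return path / net if net > 0 else np.inf
```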
To achieve realistic behaviour in VR, many critical experimental parameters had to be derived experimentally or theoretically, set, and then tested. Most were set based on prior literature and the mechanical limits of the setup. A few had to be determined experimentally, viz., the closed-loop gain and the DC offset. The table below lists each value and the rationale for choosing it.
Parameter | Value | Rationale |
Vision | ||
Spatial scale | Identical to real world | The images were distorted such that the spatial scales of the world and the expansion rates matched the real world. The expected angular size of different objects at different distances in the real world was calculated, and the angle subtended by each object in the VR at the fly's position was measured and compared with the real-world values; they are identical. |
Contrast | High enough for optomotor response | To make sure the models have reasonable contrast, it was checked whether imposed turns caused a near-complete compensatory optomotor response. With no contrast, the compensatory response is abolished. |
Colors | RGB | Insects have a different colour model; this is a definite limitation of the setup with no well-defined way around it. Despite the RGB display, the flies perform many behaviours similar to those in the real world. |
Olfaction | ||
Speed | ~1 m/s (typical flight speed) | The olfactory channel is fixed right in front of the fly and simulates the self-induced airflow, so it is set to typical forward flight speeds (Aluja et al. 1993). |
Pressure | ~1 bar | If the pressure is too high, the airflow velocity becomes large; if it is too low, the ball stop is never dislodged. The pressure is set just high enough to prevent the ball stop from rattling or bleeding, giving a clean transition, while keeping the air velocity in a moderate range, since flies stop flying at high airspeeds. |
Latency | ~50 ms (as short as possible) | The tubing and the design were made to minimize path length and path residence time. Short, narrow-diameter tubes give the smallest residence times and hence the lowest latency. The length was minimized until the flexural rigidity of the tube and geometrical constraints prevented any further reduction. |
Decay rates | ~10 s for trapped volatiles in vapour pressure equilibrium; ~10 min for total contents | Currently the stimulus decays on the order of several minutes. Refilling often is the current solution; bubbling through a larger volume would be better. |
Concentration | 10^-3 mg/ml | Empirically found. A sudden depression of the wing beat amplitude sum at the onset of odour, increased number of turning events after odour exposure are some characteristic responses. The lowest concentration where the fly shows any characteristic response in their wing dynamics on exposure to odour was chosen. |
Frequency | Unclear | The trajectories are more tortuous with increasing packet frequency of odour. More experiments need to be done to understand the relationship of packet frequency and behaviour |
Wind | ||
Speed | 0.4 m/s | Set to typical wind tunnel speeds used in prior studies with Rhagoletis flies (Zhang et al. 1999). |
Channels | 16 | Maximum possible with the available geometry |
Dimensions | Refer to 3D models | Humidified air and tiny paper flags were used to visualize the flows. Based on these, multiple designs were iterated to maximize transfer of airflow through the revolver, achieve high angular acuity, and minimize off-angle bleed. |
Object | ||
Size | Tree ~ 4m Fruit ~10cm | Scaled to average sizes of real world objects. Images from the field sites and (Moericke et al. 1975) |
Symmetry | No observable bias | Models were placed such that they were mirror images and the lighting was adjusted to reflect the same |
Species | ||
Z height | 1m | Typical height of flight based on field observations(Aluja & Prokopy 1992) |
Flight Speed | 1 m/s | Typical speeds of free-flying houseflies; free-flight trajectory data provided by Vardhanam (unpublished data). |
Visual Gain | 1320 degrees/rad/s | Flies were subjected to externally imposed turns at different speeds and gains, and the gain at which the stability and manoeuvrability curves intersect was chosen, as it gives a good balance of the two, as shown in Fig. 4. |
Individual | ||
DC Offset | Initialized at 0 | Depends on the tether, orientation and tracking-marker offsets. Manually adjusted on the fly to center the distribution on zero (a sketch of an equivalent automated estimate follows this table). |
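Although the DC offset is adjusted manually, an equivalent automated estimate would be the median WBAD during straight, undisturbed flight, as sketched below; this is an assumption for illustration, not the procedure used in the experiments.

```python
import numpy as np

def dc_offset(wbad_baseline):
    """Median WBAD during straight, undisturbed flight; subtracting this value
    centres the turning distribution on zero for a given fly and tether."""
    return float(np.median(np.asarray(wbad_baseline, float)))
```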
Project timeline, 2015-2020 (the original Gantt chart showed quarterly scheduling; only the task list and completion status are reproduced here):

S/N | Task title | Completion |
1 | Pilot experiments for VR | |
1.1 | Tethering flies | 100% |
1.2 | Testing optomotor responses | 100% |
1.3 | Coding pilot VR | 100% |
1.4 | Testing choice assays | 100% |
1.5 | Designing wind and odour hardware | 100% |
1.6 | Testing pilot wind and odour inputs | 100% |
2 | Standardizing VR | |
2.1 | Build the new VR arena | 100% |
2.2 | Calibrate visual input | 100% |
2.3 | Calibrate odour input | 100% |
2.4 | Optimize optomotor gain | 100% |
2.5 | Design GUI and 3D models | 100% |
3 | Testing VR | |
3.1 | Visual choice assay | 100% |
3.2 | Distance assay | 100% |
3.3 | Plume following assay | 20% |
3.4 | Finding a distant apple tree | 0% |
3.5 | Methods paper | 0% |
4 | Using VR | |
4.1 | Developing testable algorithms | 0% |
4.2 | Testing confounding input | 0% |
4.3 | Identifying behavioural units | 0% |
4.4 | Decision making paper | 0% |
5 | Developing generalizable models for targeted search | |
5.1 | Probabilistic modeling of behavioural units and optimal stimuli | 0% |
5.2 | Incorporation of state estimation and history dependence | 0% |
5.3 | Test conservation of model in related races (e.g. hawthorn, blueberry) | 0% |
5.4 | Search algorithms paper | 0% |
Fly pupae: 3.5 lakh Rs/year x 3 years.
Consumables: 1.5 lakh Rs/year x 3 years.