Figure 1
A) Schematic diagram of the multi-camera tracking system. B) A trajectory of a fly (Drosophila melanogaster) near a dark, vertical post. Arrow indicates direction of flight at onset of tracking.

Source publication
Article
Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in realtime - with minimal latency - opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behavior. ...

Contexts in source publication

Context 1
... the tracking results are generated and saved online, in realtime as the experiment is performed, raw image sequences can also be saved for verification purposes as well as for other types of analyses. Finally, reconstructed flight trajectories, such as that of Figure 1B, may then be subjected to further analysis, as in ...
Context 2
...Secondly, the 3D coordinates and distances between coordinates measured through triangulation were verified against values measured physically. For two such measurements in the system shown in Figure 1, these values were within 4 percent. ...

Citations

... Early automated tools for tracking insects, like "Ethovision" and "Trackit", relied on color cameras and chrominance differences to segment foreground from background [25,26]. "Flydra" achieved real-time tracking and allowed the use of monochrome cameras, without the need for chrominance-based segmentation, by assuming a fixed background scene and using intensity-based segmentation [27]. This technology then began to move to other schooling animals, while parallel improvements in outdoor starling tracking, applying trifocal cameras and treating the homogeneous sky as a fixed background, brought the technology outside the lab [28]. ...
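As a rough illustration of the fixed-background, intensity-based segmentation attributed to Flydra in the excerpt above: the sketch below differences each monochrome frame against a stored background image and takes the centroid of above-threshold pixels as the 2D target location. It is a minimal NumPy sketch; the function name and threshold value are illustrative assumptions, not Flydra's actual code.

```python
import numpy as np

def segment_moving_target(frame, background, threshold=25):
    """Difference a monochrome frame against a fixed background model,
    threshold the absolute difference, and return the centroid of the
    changed pixels as a (u, v) image coordinate, or None if nothing
    moved."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```

In a multi-camera system of this kind, a detection like this from each 2D view would then feed a triangulation and tracking stage.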
Article
Insects are model systems for swarming robotic agents, yet engineered descriptions do not fully explain the mechanisms by which they provide onboard sensing and feedback to support such motions; in particular, the exact value and population distribution of visuomotor processing delays are not yet quantified, nor the effect of such delays on a visually-interconnected swarm. This study measures untethered insects performing a solo in-flight visual tracking task and applies system identification techniques to build an experimentally-consistent model of the visual tracking behaviors, and then integrates the measured experimental delay and its variation into a visually interconnected swarm model to develop theoretical and simulated solutions and stability limits. The experimental techniques include the development of a moving visual stimulus and a real-time multi-camera tracking system called VISIONS (Visual Input System Identification from Outputs of Naturalistic Swarms), providing the capability to recognize and simultaneously track both a visual stimulus (input) and an insect at a frame rate of 60-120 Hz. A frequency domain analysis of honeybee tracking trajectories is conducted via fast Fourier and Chirp Z transforms, identifying a coherent linear region and its model structure. The model output is compared in time and frequency domain simulations. The experimentally measured delays are then related to probability density functions, and both the measured delays and their distribution are incorporated as inter-agent interaction delays in a second order swarming dynamics model. Linear stability and bifurcation analysis on the long range asymptotic behavior is used to identify delay distributions leading to a family of solutions with stable and unstable swarm center of mass (barycenter) locations. Numerical simulations are used to verify these results with both continuous and measured distributions. The results of this experiment quantify a model structure and temporal lag (transport delay) in the closed loop dynamics, and show that this delay varies across 50 individuals from 5 to 110 ms, with an average delay of 22 ms and a standard deviation of 40 ms. When analyzed within the swarm model, the measured delays support a diversity of solutions and indicate an unstable barycenter.
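As a hedged aside on how such a transport delay can be read out of paired trajectories: the study above works in the frequency domain (fast Fourier and Chirp Z transforms), but a simple time-domain stand-in is the lag of peak cross-correlation between stimulus and response. The sketch below is a minimal NumPy version under that assumption; names and the sampling convention are illustrative.

```python
import numpy as np

def estimate_transport_delay(stimulus, response, fs):
    """Estimate a visuomotor transport delay (seconds) as the lag of
    peak cross-correlation between a stimulus trajectory and the
    insect's tracking response, both sampled at fs Hz. A positive
    value means the response trails the stimulus."""
    s = np.asarray(stimulus, dtype=float)
    r = np.asarray(response, dtype=float)
    s -= s.mean()
    r -= r.mean()
    xcorr = np.correlate(r, s, mode="full")
    lag = int(np.argmax(xcorr)) - (len(s) - 1)
    return lag / fs
```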
... However, advances in modeling approaches have been, in large part, uncoupled from empirical observations, which in turn creates a gap between theory and experiment that has only recently begun to narrow. Indeed, recent studies relying on new methods and tools for tracking animals in groups have shed light on empirical questions that would have been considered unworkable a few years ago [10][11][12][13][14][15]. Examples include the identification of influential neighbors of fish in moving groups 16, effects of predation on shoaling fish interactions 17 and on collective escape of pigeons 18, emergence of swirling motion 19, and the role of social interactions on developmental trajectories of honey bees 20. ...
Preprint
Despite significant efforts devoted to understanding the underlying complexity and emergence of collective movement in animal groups, the role of different external settings on this type of movement remains largely unexplored. Here, by combining time series analysis and complex network tools, we present an extensive investigation of the effects of shady environments on the behavior of a fish species (Silver Carp Hypophthalmichthys molitrix) within earthen ponds. We find that shade encourages fish residence during daylight hours, but the degree of preference for shade varies substantially among trials and ponds. Silver Carp are much slower and exhibit lower persistence in their speeds when under shade than out of it during daytime and nighttime, with fish displaying the highest degree of persistence and the highest speeds at night. Furthermore, our research shows that shade affects fish schooling behavior by reducing their polarization, the number of interactions among individuals, and the stability among local neighbors; however, fish keep a higher local degree of order when under shade compared to nighttime positions.
... communications analysis and implemented in a simulated swarm environment (Swarm Model). The analysis and simulation indicate that three different specific patterns may be generated by a collection of insects visually navigating relative to each other with such delays, and identify the Gaussian delay distribution range needed to generate a stable swarm (Results and Discussion). Experiments examining insect flight behavior and motions during visually-dominated behaviors, like obstacle avoidance, landing on a wall or proboscis, and flower tracking, have previously focused mostly on the role of ambient and external illumination levels [9,10,11]. ...
Preprint
... Although not demonstrated, the authors proposed the interesting idea of acquiring images from multiple targets by rapidly shifting gaze between them. Shortly after the Okumura paper, a similar concept was employed by another group to make a high-resolution video of a flying Muscina stabulans fly, with a simpler optical system in which two cameras shared an optical path via a beam splitter: one camera was used for tracking while the other recorded high-resolution videos (Sakakibara et al. 2012). Okumura and colleagues also later extended their "pupil shift" optical system to incorporate an additional camera sharing the optical path, and this system was also capable of recording videos using hardware other than that used for tracking (Okumura et al. 2015). ...
... This was the approach used by Nourizonoz and colleagues to track mice and mouse lemurs in their EthoLoop system that commands servo motors and an electronic tunable lens to control focus also based on low-latency 3D tracking data (Nourizonoz et al. 2020). Earlier work adjusted focus of a high-magnification, high frame rate camera to record a high-resolution video of a landing fly (van Breugel and Dickinson 2012) based on input commands from a low-latency 3D tracking system (Straw et al. 2011). ...
... Most of this could be adapted to moving cameras by considering camera calibrations as dynamic functions of time. In addition to providing additional information regarding the 3D position of animals, multiple cameras allow tracking despite occlusions in single-camera views and increase the volume over which the combined system functions (Straw et al. 2011). With appropriate software, similar benefits could be expected with moving viewpoints as well. ...
Article
Synopsis: Digital photography and videography provide rich data for the study of animal behavior and are consequently widely used techniques. For fixed, unmoving cameras there is a resolution versus field-of-view tradeoff, and motion blur smears the subject on the sensor during exposure. While these fundamental tradeoffs with stationary cameras can be sidestepped by employing multiple cameras and providing additional illumination, this may not always be desirable. An alternative that overcomes these issues of stationary cameras is to direct a high-magnification camera at an animal continually as it moves. Here, we review systems in which automatic tracking is used to maintain an animal in the working volume of a moving optical path. Such methods provide an opportunity to escape the tradeoff between resolution and field of view and also to reduce motion blur while still enabling automated image acquisition. We argue that further development will be useful and outline potential innovations that may improve the technology and lead to more widespread use.
... To further obtain vehicle trajectories in 3D space, multi-scene cameras should be calibrated in advance and the topological relationship between cameras should be determined, in order to convert multi-camera perspectives into point sets in a 3D coordinate system. Straw et al. [27] proposed a method of cross-camera tracking which uses DLT and triangulation for camera calibration and a Kalman filter for state estimation. ...
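The DLT-and-triangulation step mentioned here has a compact linear form: given calibrated 3x4 projection matrices for two cameras and matched pixel coordinates of the same point, the 3D position is the null vector of a small stacked system. A textbook sketch in NumPy, not the cited paper's code:

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Linear (DLT-style) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel
    coordinates of the same point in each view. Returns the point in
    world coordinates."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],   # u1 * (row 3) - (row 1) = 0
        x1[1] * P1[2] - P1[1],   # v1 * (row 3) - (row 2) = 0
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector associated
    # with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With more than two cameras, the same construction simply stacks two rows per view, which is also how occlusions in individual views can be tolerated.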
Article
The three-dimensional trajectory data of vehicles have important practical meaning for traffic behavior analysis. To solve the problems of the narrow visual angle of single-camera scenes and the lack of continuous trajectories in 3D space in current cross-camera trajectory extraction methods, we propose an algorithm for vehicle spatial distribution and 3D trajectory extraction in this paper. First, a panoramic image of a road with spatial information is generated based on camera calibration, which is used to convert cross-camera perspectives into 3D physical space. Then, we choose YOLOv4 to obtain 2D bounding boxes of vehicles in cross-camera scenes. Based on the above information, 3D bounding boxes around vehicles are built with geometric constraints, which are used to obtain the projection centroids of vehicles. Finally, by calculating the spatial distribution of projection centroids in the panoramic image, 3D trajectories of vehicles are extracted. The experimental results indicate that our algorithm can effectively complete vehicle spatial distribution and 3D trajectory extraction in various traffic scenes, outperforming the comparison algorithms.
... tracking system was used to determine the body position and orientation of the birds in each frame [36]. The output of the tracking system was smoothed using a forward-reverse Kalman filter and ground-truthed by re-projecting the smoothed data onto the videos, to verify that it matched the movements of the animals [28,34,36]. ...
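A "forward-reverse Kalman filter" is commonly realized as a forward filtering pass followed by a backward Rauch-Tung-Striebel (RTS) smoothing pass over the stored estimates. A minimal sketch under that assumption (NumPy, linear model; names illustrative):

```python
import numpy as np

def rts_smooth(xs, Ps, F, Q):
    """Backward (RTS) smoothing pass over forward Kalman-filter output.
    xs: filtered state means, shape (T, n); Ps: filtered covariances,
    shape (T, n, n); F: state-transition matrix; Q: process noise."""
    xs_s, Ps_s = xs.copy(), Ps.copy()
    for t in range(len(xs) - 2, -1, -1):
        P_pred = F @ Ps[t] @ F.T + Q                # one-step prediction
        G = Ps[t] @ F.T @ np.linalg.inv(P_pred)     # smoother gain
        xs_s[t] = xs[t] + G @ (xs_s[t + 1] - F @ xs[t])
        Ps_s[t] = Ps[t] + G @ (Ps_s[t + 1] - P_pred) @ G.T
    return xs_s, Ps_s
```

Re-projecting the smoothed 3D states through each camera's calibration, as the authors describe, is then a direct way to ground-truth the result against the raw videos.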
Preprint
Unpredictable movement can provide an advantage when animals avoid predators and other threats. Previous studies have examined how varying environments can elicit unpredictable movement, but the intrinsic causes of complex, unpredictable behavior are not yet known. We addressed this question by analyzing >200 hours of flight performed by hummingbirds, a group of aerial specialists noted for their extreme agility and escape performance. We used information theory to calculate unpredictability based on the positional entropy of short flight sequences during 30-min and 2-hour trials. We show that a bird's entropy is repeatable, with stable differences among individuals that are negatively correlated with wing loading: birds with lower wing loading are less predictable. Unpredictability is also positively correlated with a bird's overall acceleration and rotational performance, and yet we find that moment-to-moment changes in acceleration and rotational velocities do not directly influence entropy. This indicates that biomechanical performance must share an underlying basis with a bird's ability to combine maneuvers into unpredictable sequences. Contrary to expectations, hummingbirds achieve their highest entropy at relatively slow speeds, pointing to a fundamental trade-off whereby individuals must choose to be either fast or unpredictable.
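Positional entropy of short flight sequences, as used above, can be made concrete by discretizing positions into spatial bins, forming overlapping k-step sequences, and taking the Shannon entropy of their empirical distribution. The sketch below is one plausible reading; the bin size and sequence length are assumptions, not the study's parameters.

```python
import numpy as np
from collections import Counter

def positional_entropy(positions, bin_size=0.05, k=3):
    """Shannon entropy (bits) of k-step movement sequences: discretize
    positions (T x 3 array, meters) into spatial bins and compute the
    entropy of the empirical distribution of binned sub-sequences."""
    cells = [tuple(c) for c in
             np.floor(np.asarray(positions) / bin_size).astype(int)]
    seqs = [tuple(cells[i:i + k]) for i in range(len(cells) - k + 1)]
    counts = np.array(list(Counter(seqs).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))
```

Higher values indicate that many distinct short sequences occur with similar frequency, i.e., the trajectory is harder to predict.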
... Classical approaches may miss many important details of the real dynamic patterns, since they rely completely on both an expert user and an invasive tracking strategy. Recently, some approaches have automatically estimated the dynamics of flight trajectories, most of them using machine-learning methods that require a considerable number of videos for training and for finding global flight patterns [14]. Overall, these kinematic methods model the wing as a stiff airfoil [24], but in reality the wing is a very flexible structure, able to asymmetrically produce non-uniform movements [24]. ...
Article
A new method for automatic analysis and characterization of recorded hummingbird wing motion is proposed. The method starts by computing a multiscale dense optical flow field, which is used to segment the wings, i.e., the pixels with larger velocities. Then, the kinematics and deformation of the wings are characterized as a temporal set of global and local measures: a global angular acceleration as a time function of each wing, and a local acceleration profile that approximates the dynamics of the different wing segments. Additionally, the variance of the apparent velocity orientation estimates those wing foci with larger deformation. Finally, a local measure of the orientation highlights those regions with maximal deformation. The approach was evaluated in a total of 91 flight cycles, captured using three different setups. The proposed measures follow the yaw turn hummingbird flight dynamics, with a strong correlation of all computed paths, reporting a standard deviation of [Formula: see text] and [Formula: see text] for the global angular acceleration and the global wing deformation respectively.
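Segmenting wings as "pixels with larger velocities" maps directly onto a dense optical flow computation followed by a speed threshold. Below is a single-scale sketch of that first stage using OpenCV's Farneback flow; the threshold and parameter values are illustrative, not the paper's.

```python
import cv2
import numpy as np

def wing_mask(prev_gray, gray, speed_thresh=5.0):
    """Return a boolean mask of fast-moving (wing) pixels by computing
    dense optical flow between consecutive grayscale frames and
    thresholding the per-pixel velocity magnitude."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    speed = np.linalg.norm(flow, axis=2)   # pixels per frame
    return speed > speed_thresh
```

The flow field itself, not just the mask, is what the paper's later measures, such as the variance of the apparent velocity orientation, would be computed from.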
... The arena was backlit by near-infrared light-emitting diodes, and the cameras were fitted with infrared pass filters. To record the 3D position of flies in free flight, we used a custom-built real-time fly tracker (for details, see ref. 29). The tracker consisted of five Basler A602f digital video cameras capturing images at 100 frames per second and an array of computers performing image analysis to locate moving flies in each 2D view. ...
Article
Significance: Insects are widely appreciated for their aerial agility, but the organization of their control system is not well understood. In particular, it is not known how they rapidly integrate information from different sensory systems—such as their eyes and antennae—to regulate flight speed. Although vision may provide an estimate of the true groundspeed in the presence of wind, delays inherent in visual processing compromise the performance of the flight speed regulator and make the animal unstable. Mechanoreceptors on the antennae of flies cannot measure groundspeed directly, but can detect changes in airspeed more quickly. By integrating information from both senses, flies achieve stable regulation of flight speed that is robust to perturbations such as gusts of wind.
... The raw data from the color-based tracker were post-processed by a projective transformation mapping the oblique view to the top view, and a Kalman filter with linear models was used for both the dynamics of the system and the observation process. As with other Bayesian tracking algorithms [12], the parameters of the filter were carefully chosen by an iterated trial-and-error procedure, comparing the filtered and real trajectories by eye, and we found that we could obtain good results with covariances of Q = 10⁻³ and R = 0.1. The whole system is described in Figure 1. ...
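For concreteness, here is a minimal linear Kalman filter of the kind described, with a constant-velocity state model and the covariance magnitudes quoted above (Q = 10⁻³, R = 0.1); the state layout and time step are assumptions, not details from the paper.

```python
import numpy as np

dt = 1 / 30.0                                    # assumed frame interval
F = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])    # constant-velocity model
H = np.hstack([np.eye(2), np.zeros((2, 2))])     # observe position only
Q = 1e-3 * np.eye(4)                             # process noise
R = 0.1 * np.eye(2)                              # measurement noise

def kalman_step(x, P, z):
    """One predict/update cycle for a position measurement z = (u, v)."""
    x, P = F @ x, F @ P @ F.T + Q                # predict
    S = H @ P @ H.T + R                          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ (z - H @ x)                      # update state
    P = (np.eye(4) - K @ H) @ P                  # update covariance
    return x, P
```

Tuning Q and R by comparing filtered and raw trajectories, as the authors did, trades responsiveness against smoothness.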
Article
Recently, several studies have been carried out on the direct control of behavior in insects and other lower animals in order to apply these behaviors to the performance of specialized tasks in an attempt to find more efficient means of carrying out these tasks than artificial intelligence agents. While most of the current methods cause involuntary behavior in animals by electronically stimulating the corresponding brain area or muscle, we show that, in turtles, it is also possible to control certain types of behavior, such as movement trajectory, by evoking an appropriate voluntary instinctive behavior. We have found that causing a particular behavior, such as obstacle avoidance, by providing a specific visual stimulus results in effective control of the turtle's movement. We propose that this principle may be adapted and expanded into a general framework to control any animal behavior as an alternative to robotic probes.
... A detailed description of our tracking system may be found in [59]. Briefly, we used 11 cameras (6 monochrome Pt. ...
Article
As animals move through the world in search of resources, they change course in reaction to both external sensory cues and internally-generated programs. Elucidating the functional logic of complex search algorithms is challenging because the observable actions of the animal cannot be unambiguously assigned to externally- or internally-triggered events. We present a technique that addresses this challenge by assessing quantitatively the contributions of external stimuli and internal processes. We apply this technique to the analysis of rapid turns ("saccades") of freely flying Drosophila. We show that a single scalar feature computed from the visual stimulus experienced by the animal is sufficient to explain a majority (93%) of the turning decisions. We automatically estimate this scalar value from the observable trajectory, without any assumption regarding the sensory processing. A posteriori, we show that the estimated feature field is consistent with previous results measured in other experimental conditions. The remaining turning decisions, not explained by this feature of the visual input, may be attributed to a combination of deterministic processes based on unobservable internal states and purely stochastic behavior. We cannot distinguish these contributions using external observations alone, but we are able to provide a quantitative bound of their relative importance with respect to stimulus-triggered decisions. Our results suggest that comparatively few saccades in free-flying conditions are a result of an intrinsic spontaneous process, contrary to previous suggestions. We discuss how this technique could be generalized for use in other systems and employed as a tool for classifying effects into sensory, decision, and motor categories when used to analyze data from genetic behavioral screens.