Electronic Supplementary Material
Capture and housing
We captured 23 starlings in modified decoy traps in Tippecanoe (Indiana, N 40.4417, W
86.9300) and Hamilton (Ohio, N 39.2833, W 84.5947) counties in May 2012 and December
2013, respectively. The USDA APHIS captured the Ohio birds. We sexed and aged starlings following Kessel [1] and Pyle [2]. We used only birds that were at least in their second year, so that they had some prior exposure to social activities (e.g., a first mating season). We housed the birds in outdoor aviaries (2.5 x 2.5 x 3.5 m) in mixed-sex groups at the
Ross Biological Reserve (Tippecanoe County, IN, N 40.4167, W 87.0693) and provided food (cat
food and game bird maintenance chow) and water ad libitum. About 4 days before the trials, we moved birds to indoor enclosures to habituate them to the lighting conditions of the experimental arena, where they were housed in groups of 2-4 in the Lilly small animal building at Purdue University.
The Purdue Institutional Animal Care and Use Committee (protocol 1306000876) approved all
animal handling procedures.
Robotic birds
We constructed robotic birds (hereafter, robots) from the skins of deceased European starlings
(1 male, 1 female) (Figure S1). We designed the robots to simulate three types of movements: turning the head to mimic head-up scanning, lowering the head to mimic pecking, and turning the body to mimic locomotion. Based on video and field observations (following [3]), we programmed the robot to engage in head movement patterns characteristic of live starlings, with an average head movement rate of 0.97 s⁻¹. When the head moved, it always moved side to
side, with the beak parallel to the ground. The head and body could rotate independently of
one another, about the transverse axis of the robot. We used a laptop running Maestro servo controller software (from Pololu) to control the behavior of the robot, which we used to manipulate its apparent direction of attention.
We followed Randolet et al. [4] to prepare the robots. In brief, we made a vertical taxidermy incision along the torso of the bird. We then removed the skin completely from the
carcass using blunt probes. We made cuts at the shoulder, above the knee, and between two of
the cervical vertebrae to remove the main body mass (i.e., abdomen and thorax) of the bird
from the skin, leaving the bony structure of the wings and lower legs for support. We removed
excess muscle from the skin and bones, as well as the eyes, tongue, and brain tissue, to prevent rotting.
We dried the skin on a carcass-shaped mold for two days to harden it. Afterwards, we removed
the head by carefully cutting around the circumference of the neck, cutting as few feathers as
possible. We then removed the skin from the mold and stretched it to mitigate any shrinking
that occurred during drying. We allowed it to dry for an additional 1-2 days.
We attached two servos (Hi-Tec HS-55 feather) to small pieces of foam board to make a
structure about the size of the bird carcass. This foam board not only provided structural
support for the skin, but also helped to dampen the sound made by the servos. We then
attached the skin to the two servos and foam as shown in Figure S1 using superglue. We used a
needle and thread to sew the wings in place. We attached the head to the upper servo and
added false eyes made from synthetic clay (Sculpey II) painted with clear nail polish to create a glossy finish. The female's eyes had a light-colored iris, while the male's eyes were a solid dark brown, characteristic of live starlings [1,2]. We used two robots for our visual playbacks to
reduce the bias of using a single stimulus repeatedly [5].
We attached the servo that was at the hip of the robot to a piece of copper tubing,
which was bent at a right angle and connected to a servo that was outside of the robot (Figure
S1). We colored the copper tubing and all wires visible to the live bird (hereafter, focal) with a
black permanent marker to minimize any areas of high contrast that might distract the focal.
This external servo allowed the robot to change its body orientation and was mounted in a small block of wood for stability.
Experimental arena
We constructed a three-compartment enclosure out of hardware cloth, wood, and UV-transmitting Plexiglas (Loop acrylics, 92% transmittance over all wavelengths, at least 70% in
the UV). The enclosure was placed on a table 1 m above the floor for easy accessibility and
surrounded by black curtains to homogenize the visual environment. The focal compartment
contained seven windows (six small, 0.04 m x 0.10 m, and one large, 0.3 m x 0.3 m) and a door
(0.2 m x 0.3 m) used to introduce the animal to the arena. The focal had visual (but not
physical) access to the robot through the large window. The focal was also surrounded by six small windows, two of which connected to the empty compartment and four of which connected to the outside of the enclosure. The purpose of the windows connecting into the empty compartment
was to allow the focal to follow the orientation of the robot’s attention when the robot would
orient its attention around the barrier. The purpose of the windows connected to the outside of
the enclosure was to provide the focal with other opportunities to interact with windows not
associated with the empty compartment. To standardize the visual environment that the focal
experienced when it oriented its attention to each of the windows, we hung black curtains
surrounding the outside of the enclosure as well as in the empty compartment. Since the
enclosure was placed on a table, we covered the table with thin cardboard, which also lined the
floor of the empty compartment. The robot was able to orient its attention into the empty
compartment through an additional small window (0.04 m x 0.10 m; Figure 1a).
We conducted trials indoors in the basement of Lilly Hall at Purdue University in West Lafayette, Indiana, from September 2013 through January 2014. Since starlings can see ultraviolet wavelengths [6], we hung broad-spectrum lights over the experimental arena (T-Bay II 4-Light T5HO/T8 from Kirby Risk Electrical Supply, Lafayette, Indiana). We measured the light intensity at nine evenly distributed points on the floor of the enclosure and
found the average light intensity to be 36,300 ± 812 lux, a light level comparable to lighting
conditions in an open habitat within 1 h of sunrise.
Experimental setup
We recorded the trials with 4 cameras: one EverFocus EZ700W-001 security camera overhead, one PelikanCam CRM-36DW camera recording the orientation of the robot, one EverFocus 1/3” Color PIR CHC-37P wide-angle security camera inside the enclosure recording the right side of the enclosure, and one PelikanCam SCB619PHW bullet camera viewing the right side of the enclosure. The cameras were time-synced with a quad splitter and recorded on a DVR (Ganz DigiMaster Digital Video Recorder Model HDC-0912).
Experimental procedures
The evening before trials, we food-deprived the focals for 14-19 h to increase their motivation
to engage in the experiment. The experiment consisted of three phases commonly implemented in other experiments that manipulate visual attention [7,8]: (1) habituation, (2) attention getting, and (3) treatment. The habituation phase began after the focal was
introduced into the enclosure. During the habituation phase, the robot’s body was oriented
towards the barrier (i.e., the sagittal axis of the robot was parallel to the barrier), scanning
through left and right head movements and pecking. This phase lasted at least 2 min and ended only after the focal had begun walking and/or searching for food and had stopped flying up, plus an additional 30 s or more to allow the focal to explore the enclosure. In the attention
getting phase, the robot oriented its body towards the focal and began scanning and foraging
until the focal appeared to be attending to the robot (i.e., the focal approached the robot,
paused from foraging, rotated the front of its body towards the robot, or increased scanning
towards the robot). Next, we exposed the focal to the treatment. The treatment phase included
two treatments that were given on different days in random order, once with each sex of robot.
In the around barrier treatment, the robot would orient its body away from the focal, to direct
its visual attention to a window in the robot compartment, which was connected to the empty
compartment. In the towards focal treatment, the robot oriented its body towards the focal
compartment. During both treatments, the robot would engage in a series of head movements
by moving its head side to side (i.e., scanning) for 20 s, remaining still for 5 s, scanning for
another 20 s and finally remaining still for 5 s. The robot did not peck during the treatment
phase. Although we intended the treatment to take 50 s, the total duration was 51 s due to the time it took the script to run on the computer controlling the robot. After the treatment phase,
the robot would resume scanning towards the barrier for about two minutes. In manipulating
the orientation behavior of the robot, we intended to manipulate where its visual attention was
directed based on the configuration of the starling visual system [9, 15]. However, we did not manipulate the robot's behavior with respect to the orientation of attention in other sensory modalities (e.g., hearing, olfaction).
Each focal experienced a maximum of one treatment per trial per day, with at least one full day between trials. Twenty-three birds participated as focals in this experiment (12 males, 13 females). In some trials, we either could not get the attention of the focal or it would not settle down; after 15 min of attempts, we cancelled the trial. Each failed trial was attempted one additional time on a different day, but sometimes this was also unsuccessful. In
total, we collected data for 83 trials: 4 trials from 16 focals, 3 trials from 5 focals, and 2 trials
from 2 focals.
Video coding
We measured whether or not the focal oriented its attention to one of the two windows that viewed the empty compartment during the treatment phase. Starlings can see 296° around their head [9], which can lead to ambiguity about where they are orienting their visual attention [10]. We addressed these limitations using two approaches.
First, we used windows to limit where the focal had visual access to the empty
compartment, which required the bird to move into certain regions of the enclosure for the
observer to score a response indicative of orienting attention to the window (i.e., made the
behavior more conspicuous to the observer). This technique is commonly used in birds [11-13].
For example, Grodzinski et al. [11] were interested in the amount of time western scrub-jays spent watching
conspecifics that were engaging in caching versus non-caching behaviors. Jays were allowed to
view conspecifics engaging in these behaviors through small peep-holes that were lower than
the standing height of the bird on a perch, thus requiring the bird to bend down to look through
the peep-hole.
Second, we developed a set of sensory criteria to code whether the focal was orienting its attention to the window (1) or not (0). Not all regions of the retina and its projections into
the visual field (i.e., volume of space around the animal’s head from which it can see) provide
the same quality of information because of differences in the density of photoreceptors across
the retina [14]. Retinal areas with the highest density of photoreceptors are called centers of
acute vision (e.g., fovea) because they provide high quality visual information (i.e., high spatial
resolution or acuity). Other retinal areas (e.g., retinal periphery) provide lower spatial
resolution due to a lower density of photoreceptors. In starlings, the centers of acute vision (one per retina) project into the right and left lateral visual fields at 60.5° from the bill, close to the edge of the binocular field [15] (Figure S2). Additionally, the binocular field is subtended by
the overlap of the peripheral areas of the right and left retinas, which can provide enhanced
contrast sensitivity [16] compared to other peripheral areas of the retinas that do not overlap in
their projection into the visual field. In starlings, the binocular field with the eyes at rest has
been reported to be 26° [17]. Based on this visual sensory information, we established an area of 121° around the bill (i.e., 60.5° to either side of the bill, spanning both foveal projections and, consequently, the binocular field) that would indicate the direction of attention in starlings (Figure S2). We
considered the focal to be directing its attention to the window only if this area of 121° was
aligned with a window (Figure S3). Additionally, we imposed four other criteria. First, after the treatment began, the focal had to move in a way that directed it toward the window, by rotating its body towards the window, elongating its neck in the direction of the window, or walking closer to the window. Second, the focal had to be in the front half of the
enclosure (i.e., the half closest to the barrier) in a position where the window takes up at least
two degrees of visual space (Figure S3). We chose this narrow angle of visual space to limit the regions of the enclosure in which the bird could be considered to be orienting attention to the window, thus making the criteria more conservative. Third, the focal had to pause from walking for at least 0.2 s and engage in scanning (e.g., head movement), which excluded birds that were walking across the enclosure from being counted as orienting attention to the window. We
chose 0.2 s as the minimum pause duration because in a previous study [17], we observed that
this was the minimum period of time starlings would keep their eyes in a fixed position,
suggesting that this may be a minimum amount of time a starling would need to gather visual
information. Fourth, the focal had to be head-up scanning and not head-down searching for food.
We used SAS (version 9.3) to conduct all analyses in this study. We analyzed these data with a generalized linear model (proc Glimmix) using a binomial error
distribution, logit link function, and an autoregressive variance-covariance matrix with subject
incorporated as a random factor into the residual matrix. Our independent variable was
treatment and our dependent variable was whether or not the bird oriented its attention to the
window. Results are in the main document.
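For concreteness, a minimal sketch of how such a model might be specified in SAS is shown below. The dataset and variable names (trials, subject, treatment, oriented) are hypothetical stand-ins rather than the authors' actual code, and the sketch assumes trials are sorted chronologically within subject.

/* Hypothetical dataset 'trials': one row per trial, with subject (bird ID),
   treatment (around barrier vs. towards focal), and oriented (1/0). */
proc glimmix data=trials;
  class subject treatment;
  model oriented(event='1') = treatment / dist=binomial link=logit;
  /* Autoregressive variance-covariance matrix with subject incorporated
     into the residual (R-side) matrix; relies on the chronological
     ordering of trials within each subject */
  random _residual_ / subject=subject type=ar(1);
run;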
Latency to orient attention to the window
We determined the latency for the focal to reorient its attention to the window as the time elapsed between the beginning of the treatment and the first time the focal reoriented its attention to the window according to the criteria listed above. For focals that did not reorient attention to the window, we assigned a capped value of 51 s (following Carter et al. [18]). We used a Friedman two-way analysis of variance (SAS version 9.3, proc Freq, Mantel-Haenszel method), with subject as our row factor, treatment as our column factor, and latency for the focal to orient to the window as our dependent variable.
Results are in the main document.
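As an illustration, the Friedman test can be computed in proc Freq as the Cochran-Mantel-Haenszel row mean scores statistic with rank scores; the sketch below assumes a hypothetical dataset (latency, with variable latency_s), not the authors' actual code.

/* Hypothetical dataset 'latency': subject (row factor), treatment
   (column factor), and latency_s (latency in s, capped at 51 s). */
proc freq data=latency;
  /* With rank scores, the CMH 'row mean scores differ' statistic
     is Friedman's two-way analysis of variance by ranks */
  tables subject*treatment*latency_s / cmh2 scores=rank noprint;
run;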
Head movement rate
We calculated the instantaneous head movement rate while the focal was near the robot, head
up, and not flying or preening during the treatment. We used JWatcher (Version 0.9, 2000) [19]
to count the number of head movements and the time spent head-up scanning. We were
interested in the head movement rate during the period of time when the focal was gathering information about the orientation of the robot's attention. For focals that reoriented their
attention to the window, we classified this period of time as from the beginning of the
treatment until the focal reoriented its attention to the window. This period of time was 22 s on
average. Therefore, we also analyzed the head movement rate for the first 22 s of the
treatment for trials where the focal did not orient its attention to the window. We performed
three separate analyses, with head movement rate as the dependent variable in all of them: one for focals that oriented attention to the window, where the independent variable was treatment; one for focals that did not orient their attention to the window, where the independent variable was also treatment; and one for all focals in the around barrier treatment, where the independent variable was whether or not they oriented their attention to the window. We
analyzed these data in PROC Mixed using an autoregressive variance-covariance matrix
repeated on subject. Results are in the main document.
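A minimal sketch of the first of these analyses in SAS, assuming a hypothetical dataset 'headrate' with one row per trial; the other two analyses would follow the same template with the independent variable swapped as described above.

/* Hypothetical dataset 'headrate': subject, treatment, and hm_rate
   (head movements per s while head-up near the robot). */
proc mixed data=headrate;
  class subject treatment;
  model hm_rate = treatment;
  /* Autoregressive variance-covariance matrix repeated on subject */
  repeated / subject=subject type=ar(1);
run;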
Testing the reflexive co-orienting hypothesis
In examining the possibility that the focals were reflexively co-orienting their head and body
orientation with that of the robots (i.e., automatically responding to cues without the goal of
sharing attention with the robot [20]), we made two predictions. First, if birds were reflexively co-orienting their attention with the direction of the robot, we would expect them to align their head and body angles with those of the robot. Second, we
would predict that the bird would not walk towards the window, but rather just rotate its body
wherever it was in the cage.
To test this first prediction, we measured the angles of the focal's head and body relative to those of the robot. Because the robot's body was always oriented at approximately 45° counterclockwise from the orientation of the barrier, and its head pivoted left and right around
the center position at approximately 45°, we measured the angle of the head and body of the
focal during the frame of the video when it first oriented its attention to the window. We
recorded the absolute value of this angle relative to 45°, which ranged between 0-180°. We
then constructed a one-tailed 95% confidence interval of the head and body angles, accounting for the repeated measures of our experimental design using a repeated statement in proc Mixed (SAS version 9.3, n = 27). If the birds were reflexively co-orienting, we would predict that 0° (i.e., perfect alignment between the focal and robot orientation) would be contained within this confidence interval. The differences in the head and body orientation angles between the robot and the focal were significantly different from 0° (95% confidence interval: head: <38.07°, 65.63°>; body: <65.88°, 96.56°>), suggesting that the birds were not reflexively co-orienting their body and head with those of the robot.
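A minimal sketch of how this interval might be obtained in SAS, assuming a hypothetical dataset 'angles' with one row per first-orientation event; the compound-symmetry covariance structure is our assumption, since the original specification is not stated.

/* Hypothetical dataset 'angles': subject (bird ID) and angle_dev
   (absolute angular deviation of the focal's head or body from the
   robot's orientation, in degrees). */
proc mixed data=angles;
  class subject;
  /* Intercept-only model; SOLUTION and CL return the mean deviation
     and its confidence limits (alpha=0.10 gives one-tailed 95% bounds) */
  model angle_dev = / solution cl alpha=0.10;
  repeated / subject=subject type=cs;
run;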
To test this second prediction, we recorded whether the bird either walked towards the
window, making translational movements, or just rotated its body or elongated its neck. We
then constructed a 95% confidence interval of the probability that the bird walked, accounting
for the repeated measures of our experimental design using proc Glimmix (SAS version 9.3, n =
27). We used a null probability of walking towards the window of 25%. If the birds were
reflexively co-orienting, we would predict that 25% would be contained in this confidence
interval. We found that the live bird walked towards the window significantly more often than
predicted by chance (the 95% confidence interval <42.9%, 84.3%> did not contain p_null = 25%). Taken together, these results suggest that the focal was not reflexively co-orienting its head and body with those of the robot.
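For illustration, such an interval might be obtained in SAS as sketched below. The dataset and variable names (walk, walked) are hypothetical, and back-transforming the intercept with an ESTIMATE statement and the ILINK option is one reasonable way, under our assumptions, to get the interval on the probability scale.

/* Hypothetical dataset 'walk': subject (bird ID) and walked (1 if the
   focal walked towards the window, 0 if it only rotated or elongated). */
proc glimmix data=walk;
  class subject;
  model walked(event='1') = / dist=binomial link=logit;
  /* Account for repeated measures within subject */
  random _residual_ / subject=subject type=cs;
  /* Back-transform the intercept to the probability scale with
     95% confidence limits, for comparison against p_null = 25% */
  estimate 'Pr(walked)' intercept 1 / cl ilink;
run;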
Body and head orientation while orienting to windows
Our approach to manipulating direction of attention in a species with laterally-placed eyes was
to modify simultaneously the head and body orientation to reduce the ambiguity of the cue
(see Main Document). To validate this approach, we examined the movements that starlings
used when orienting their attention to the window post hoc. Specifically, we measured the angle between the head and body (i.e., between the projection of the beak and the midsagittal axis of the bird; green shaded angle in Figures S4 and S5). If birds orient their attention with their heads and bodies in a way that keeps the angles similar, we would expect to find acute angles between the head and body infrequently.
We first found all of the occurrences when the focal approached windows not associated with the direction of the robot's attention (i.e., windows 4, 5, and 6 in Main Document
Figure 1a). We chose these windows because it would be unlikely that the bird would be
interacting with the robot in these locations of the experimental arena. Second, we measured
the angle between the head and body only when the bird was orienting its attention to the
window (i.e., fixating), as determined by the criteria described above. Third, since distance
can affect the fixation strategy [21, 22], we limited our analysis to when the bird was at a distance from the window similar to the robot's distance from the barrier (within 6 cm). Overall, we found 79 events in 21 trials from 7 birds. To get a representative sample, we
picked one random event per trial and measured the angle of the head and body during the
first 5 head movements that the bird engaged in before moving away from the window. We
measured 69 head and body positions from 21 window-approaching events from 7 different birds and used a general linear model that controlled for repeatedly measuring multiple fixations per bird (proc Mixed, SAS version 9.3) to find the least squares means and construct a 95% confidence interval of these angles. We found that the average magnitude of this angle was 157.3° ± 2.22° (95% confidence interval: <152.8°, 161.7°>), which suggests that birds tended not to engage in very extreme head movements when fixating.
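A minimal sketch of this model in SAS, assuming a hypothetical dataset 'fixations' with one row per measured head position; the compound-symmetry covariance structure is our assumption.

/* Hypothetical dataset 'fixations': bird (ID) and hb_angle (angle
   between the head and body, in degrees). */
proc mixed data=fixations;
  class bird;
  /* Intercept-only model; the SOLUTION estimate is the mean head-body
     angle, with 95% confidence limits from CL */
  model hb_angle = / solution cl;
  /* Control for multiple fixations measured per bird */
  repeated / subject=bird type=cs;
run;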
The implication of this finding is that when orienting their attention to windows, starlings
tend to keep their head and body in a similar orientation, which supports our manipulations of
attention in this species with laterally placed eyes, where we kept the head within ± 90° of the
resting position. If we were to manipulate head orientation alone, there would be substantial overlap between treatments in the direction of visual attention. Figure S6a illustrates how reorienting the head and keeping
body orientation constant changes the direction of visual attention, but not by as much as also
reorienting the body in conjunction with the head, as in Figure S6b. Reorienting the body in
conjunction with the head allowed us to use two distinct treatments with little overlap in the
direction of visual attention of the robot, without creating unnatural movement patterns (i.e.,
extreme angles of head to body).
References
1. Kessel, B. 1951 Criteria for sexing and aging European starlings (Sturnus vulgaris). Bird-Banding 22, 16-23.
2. Pyle, P. 1997 Identification Guide to North American Birds. Bolinas, CA: Slate Creek Press.
3. Fernández-Juricic, E. & Kowalski, V. 2011 Where does a flock end from an information perspective? A comparative experiment with live and robotic birds. Behav. Ecol. 22, 1304-1311.
4. Randolet, J., Lucas, J. & Fernández-Juricic, E. 2014 Non-redundant social information use in avian flocks with multisensory stimuli. Ethology 120, 375-387.
5. Anderson, R.C., DuBois, A.L., Piech, D.K., Searcy, W.A. & Nowicki, S. 2013 Male response to an aggressive visual signal, the wing wave display, in swamp sparrows. Behav. Ecol. Sociobiol. 67, 593-600.
6. Hart, N., Partridge, J. & Cuthill, I. 2000 Retinal asymmetry in birds. Curr. Biol. 10, 115-117.
7. Range, F. & Virányi, Z. 2011 Development of gaze following abilities in wolves (Canis lupus). PLoS ONE 6, e16888.
8. Giret, N., Miklósi, Á., Kreutzer, M. & Bovet, D. 2009 Use of experimenter-given cues by African gray parrots (Psittacus erithacus). Anim. Cogn. 12, 1-10.
9. Martin, G.R. 1986 The eye of a passeriform bird, the European starling (Sturnus vulgaris): eye movement amplitude, visual fields, and schematic optics. J. Comp. Physiol. A 159, 545-557.
10. Davidson, G., Butler, S., Thornton, A., Fernández-Juricic, E. & Clayton, N. 2014 Gaze sensitivity: function and mechanisms from sensory and cognitive perspectives. Anim. Behav. 87, 3-15.
11. Grodzinski, U., Watanabe, A. & Clayton, N. 2012 Peep to pilfer: what scrub-jays like to watch when observing others. Anim. Behav. 83, 1253-1260.
12. Kondo, N., Izawa, E. & Watanabe, S. 2012 Crows cross-modally recognize group members but not non-group members. Proc. R. Soc. B 279, 1937-1942.
13. Templeton, J., McCracken, B.G., Sher, M. & Mountjoy, D.J. 2014 An eye for beauty: lateralized visual stimulation of courtship behavior and mate preferences in male zebra finches, Taeniopygia guttata. Behav. Proc. 102, 33-39.
14. Fernández-Juricic, E. 2012 Sensory basis of vigilance behavior in birds: synthesis and future prospects. Behav. Proc. 89, 143-152.
15. Dolan, T. & Fernández-Juricic, E. 2010 Retinal ganglion cell topography of five species of ground-foraging birds. Brain Behav. Evol. 75, 111-121.
16. Campbell, F.W. & Green, D.G. 1965 Monocular versus binocular visual acuity. Nature 208, 191-192.
17. Tyrrell, L., Butler, S., Yorzinski, J. & Fernández-Juricic, E. In prep. A novel system for bi-ocular eye-tracking in vertebrates with laterally placed eyes.
18. Carter, J., Lyons, N., Cole, H. & Goldsmith, A. 2008 Subtle cues of predation risk: starlings respond to a predator's direction of eye-gaze. Proc. R. Soc. B 275, 1709-1715. DOI: 10.1098/rspb.2008.0095.
19. Blumstein, D. & Daniel, J.C. 2007 Quantifying Behavior the JWatcher Way. Sunderland, MA: Sinauer Associates.
20. Driver, J., Davis, G., Ricciardelli, P., Kidd, P., Maxwell, E. & Baron-Cohen, S. 1999 Gaze perception triggers reflexive visuospatial orienting. Vis. Cogn. 6, 509-540.
21. Maldonado, E. 1988 Frontal and lateral visual system in birds: frontal and lateral gaze. Brain Behav. Evol. 32, 57-62.
22. Dawkins, M. 2002 What are birds looking at? Head movements and eye use in chickens. Anim. Behav. 63, 991-998.
Figure legends
Figure S1. Schematic diagram of robotic starling made of 3 servos. The robot can turn its head,
peck, and turn its body.
Figure S2. Top-view schematic representation of the visual field of starlings with the projections
of the centers of acute vision (e.g., foveae) based on Martin [9] and Dolan & Fernández-Juricic
[15]. We defined an area of 121° around the bill (delimited with green dots) that included the
projections of the foveae and the binocular field as the sector that would indicate the direction
of attention in starlings.
Figure S3. Criteria for coding videos. Shaded regions illustrate where the bird could be in the
front half of the enclosure and have the window take up at least 2° of visual space.
Figure S4. A bird with laterally placed eyes using an alternating fixation strategy with body
oriented towards the window. In this scenario, the bird can use both foveae in an alternating
manner while keeping the head relatively centered around the body, not making extreme
angles between its head and body (i.e., green shaded angle).
Figure S5. A bird with laterally placed eyes using an alternating fixation strategy with body
oriented parallel to the window. In this scenario, the bird must make extreme (i.e., acute)
angles with its head relative to its body (i.e., green shaded angle) to use both foveae.
Figure S6. Reorienting the head and keeping body orientation constant (a) changes the
direction of attention, but not by as much as also reorienting the body in conjunction with the
head, as in (b). In (a), there is a substantial amount of overlap in the projection of the sector of the visual field indicating attention between the red and blue birds, but in (b), reorienting the body has substantially reduced this overlap.