Animal Communication
Part 2: Functional Issues
III. Function, b. Information & Decisions
Question posed by female: Is this male healthy or sick?
Signals assigned to the same question are a signal set
(e.g. in this example, both song & dance signal health)
Alternative answers (male is healthy; male is sick) are called the
conditions
Sender has a code that correlates the signal with the conditions:
• Songs are fast in healthy males, slow in sick males
• Dance is vigorous in healthy males, not in sick males
All signals pooled across all questions and signal sets constitute the
signal repertoire; its size depends on the number of questions
asked and the number of possible answers
III. Function, b. Information & Decisions
Example: Territorial defense
Question: Will opponent attack?
  Condition: Unlikely  →  Signal: Crest fully raised
  Condition: 50:50     →  Signal: Crest partially raised
  Condition: Likely    →  Signal: Crest down
Question: Will this opponent escalate?
  Condition: Yes  →  Signal: Loud call
  Condition: No   →  Signal: Soft call
III. Function, b. Information & Decisions
There are two basic kinds of information that can be
transferred by signals:
Novel Facts: Signals are used to
share some fact unsuspected by
the Receiver
Confirmation of Conditions:
Signals confirm which of several
alternatives suspected by the
Receiver is currently true
III. Function, b. Information & Decisions
Two kinds of information can be transferred by signals:
• Communication by most animals is of the second kind:
receivers “know” the likely alternatives,
either through learning or genetic biases or both, and
signals largely serve to confirm which among these
alternatives is currently true
• Where novel alternatives do turn up, these are usually
assigned to one of the existing alternatives (e.g. one
more type of predator)
III. Function, b. Info & Decisions
1. Information as Probabilities
Information as Probabilities:
• If the alternatives to some question are already
known, then what must change with the provision of
information are the relative probabilities that each
alternative might be true
• We say the probabilities are updated with the
provision of information
For example, you may start your day estimating a
1×10⁻⁵% chance of being killed by a terrorist attack,
but increase that to 1.1×10⁻⁵% after seeing the
“orange” DHS alert
III. Function, b. Info & Decisions
1. Information as Probabilities
Information as Probabilities:
• So today we’ll be talking about how receivers use
information encoded in signals to update their
estimated probabilities that certain conditions are
true, and thereby make decisions
• These decisions determine the benefits that both
receivers and senders gain by communicating
III. Function, b. Info & Decisions
1. Information as Probabilities
Example: The receiver (you) want to know whether you are going
to like the new movie by a favorite director
Your priors are the percentage of movies by that director that you
previously liked (72%)
When you read a positive movie review in the paper (the signal),
you update your probability estimates with this new
information… How do you update your estimate?
[Probability gauges: C1 = Love it, C2 = Hate it; Pprior = 0.72, Pupdated = ??]
III. Function, b. Info & Decisions
1. Information as Probabilities
The reliability of signals can be mapped on a coding matrix
A perfect signal is never wrong: when it is received, one needle
goes to 1.0 probability and the other to 0

Coding matrix (perfect signal):
             Condition:  C1    C2
Signal: S1               1.0   0
        S2               0     1.0

[Probability gauges: C1 = Love it, C2 = Hate it; Pupdated = 1.0]
III. Function, b. Info & Decisions
1. Information as Probabilities
An imperfect signal is only correct some fraction X% of the time
This is much more common in animal communication
Each signal moves the needles only part way towards 0 or 1

Coding matrix (imperfect signal):
             Condition:  C1    C2
Signal: S1               0.7   0.1
        S2               0.3   0.9

[Probability gauges: C1 = Love it, C2 = Hate it]
III. Function, b. Info & Decisions
1. Information as Probabilities
Imperfect Signals: If you read more movie reviews, each gives
new information, but each has a smaller effect on your
opinion (diminishing returns to continued reading of reviews)
III. Function, b. Info & Decisions
2. The Amount of Information
Since the change in probabilities with receipt of a given signal
depends on the prior probability, we need to take prior
probabilities into account when we measure the amount of
information the signal provides to the Receiver
So we use the base 2 logarithm of the ratio of updated to
prior probability estimates. Why?
We use logarithms because they give the same absolute value
regardless of which direction the estimate changes, with a
sign indicating the direction:
e.g. log2(0.9/0.1) = 3.17 and log2(0.1/0.9) = –3.17
Why do we use base 2 logs?
III. Function, b. Info & Decisions
2. The Amount of Information
It is easiest to think of information as the answers to specific
questions
Binary Questions: The simplest type of question has only two
possible answers: yes or no, A or B, male or female, etc.
Complex Questions: Any complex question with a finite
number of possible answers can be broken down into a
series of binary questions
III. Function, b. Info & Decisions
2. The Amount of Information
E.g. Cuckoo Eggs: One egg in the nest is a cuckoo egg. How
many binary questions do you have to ask to find it?
[Table: number of alternatives vs. number of binary questions needed to find the cuckoo egg]
III. Function, b. Info & Decisions
2. The Amount of Information
General Case: If M is the number of alternative answers to a
complex question, and H is the number of binary questions
that need to be asked to find which alternative is true, then
M = 2^H
One bit is the information required to answer a binary question
It takes H bits to answer a question with M alternative answers

Number of Alternatives (M):      2   4   8   16
Number of Binary Questions (H):  1   2   3   4
III. Function, b. Info & Decisions
2. The Amount of Information
General Case: If M = 2^H, then it follows that
H = log2(M)
where H is the number of bits to answer a question, and M is
the number of alternative answers (conditions)
• ...you can compute log2(M) on your calculator without using
base 2 logs by calculating ln(M)/ln(2) or log10(M)/log10(2)
• ...a value of M that is not an integer power of 2 is OK: thus
log2(3) = 1.58
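The change-of-base trick above is easy to verify; a quick sketch in Python (not part of the original slides):

```python
import math

def log2(m):
    """Base-2 log via the change-of-base rule: log2(M) = ln(M) / ln(2)."""
    return math.log(m) / math.log(2)

# Integer powers of 2 give whole numbers of binary questions...
print(round(log2(8), 2))                         # 3.0
# ...and a non-integer power of 2 is fine too:
print(round(log2(3), 2))                         # 1.58
# The same value via base-10 logs:
print(round(math.log10(3) / math.log10(2), 2))   # 1.58
```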
III. Function, b. Info & Decisions
2. The Amount of Information
So that’s how we quantify information; how does this relate to
probability estimates?
As you saw before, we measure the amount of information the
signal provides to the Receiver as the base 2 logarithm of the
ratio of updated to prior probability estimates
Now, we formalize that in an equation:

HT = log2 [ P(A)updated / P(A)prior ]   (i.e. log2 of the updated probability over the prior probability)

This is the amount of information in bits transferred by a signal about the
likelihood of a particular answer A to a question
III. Function, b. Info & Decisions
2. The Amount of Information
Perfect Signals: After receipt of a perfect signal, the numerator
in the amount-of-information expression, P(A)updated, goes to
either 0 or 1
If A is the answer that is now known to be true, the amount of
information provided by the signal about A is
HT = log2(1/P(A)prior) = – log2(P(A)prior)
III. Function, b. Info & Decisions
2. The Amount of Information
If you have these priors (P0) for the 5 possible moods of your dog:
• Wants to play: 0.2
• Wants food: 0.4
• Amorous: 0.2
• Fearful: 0.1
• Aggressive: 0.1
If you receive a “play signal”, which is a perfect signal, how much
information does it give you?
HT(play signal) =
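The quiz above can be answered with the perfect-signal formula from the previous slide, HT = –log2(P(A)prior); a Python check (my worked answer, not shown on the slide):

```python
import math

# Priors for the five dog moods, from the slide
priors = {"play": 0.2, "food": 0.4, "amorous": 0.2,
          "fearful": 0.1, "aggressive": 0.1}

# Perfect signal: H_T = -log2(P(A)_prior), the bits needed to remove
# all prior uncertainty about that one answer
h_play = -math.log2(priors["play"])
print(round(h_play, 2))   # 2.32 bits

# A rarer condition carries more information when confirmed:
h_fear = -math.log2(priors["fearful"])
print(round(h_fear, 2))   # 3.32 bits
```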
III. Function, b. Info & Decisions
2. The Amount of Information
Imperfect Signals: P(A)updated never goes to 1.0 after receipt of
an imperfect signal. Instead, we are left with
HT = log2 [ P(A)updated / P(A)prior ]
which can be rewritten as
HT = (– log2(P(A)prior)) – (– log2(P(A)updated))
which is the same as
HT = Hprior – Hupdated
III. Function, b. Info & Decisions
2. The Amount of Information
H as uncertainty: HT = Hprior – Hupdated
The first term, Hprior = –log2(P(A)prior), is the amount of information
in bits required to remove ALL of our prior uncertainty about
whether A was true before the signal
The 2nd term, Hupdated = –log2(P(A)updated), is the amount of
information in bits required to remove ALL uncertainty about
whether A is true after receipt of the signal
The difference between these two terms, HT, is the amount of
initial uncertainty about whether A was true that was removed
by the signal
It is the amount of information transferred through communication
III. Function, b. Info & Decisions
2. The Amount of Information
Example: Suppose P(A)prior = 0.5
how does HT change with different values of P(A)updated?
P(A)updated:  0.6   0.7   0.8   0.9   1.0
HT:            ?     ?     ?     ?     ?
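The HT values for this exercise follow directly from HT = log2(P(A)updated / P(A)prior); a short sketch:

```python
import math

p_prior = 0.5
for p_upd in (0.6, 0.7, 0.8, 0.9, 1.0):
    h_t = math.log2(p_upd / p_prior)
    print(f"P(A)updated = {p_upd}: H_T = {h_t:.2f} bits")
# H_T climbs: 0.26, 0.49, 0.68, 0.85, 1.00 bits.
# Confirming one of two equally likely alternatives transfers
# exactly 1 bit, and each step toward certainty adds less.
```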
III. Function, b. Info & Decisions
2. The Amount of Information
Information theory was developed by Claude Shannon to find fundamental
limits on compressing and reliably storing and communicating data
Shannon entropy is a measure of the uncertainty associated with a random
variable; quantifies the info contained in a message (in bits).
How is information theory used? We aren’t usually privy to what animals
are trying to say... how many options are available, what their priors are, etc.
So while conceptually useful, the classic information theory we just learned is
difficult to apply directly in animal communication. Many statistics have
been built on this foundation, however, which are also conceptually
useful, and more tractable to use
Markov Chain Models of syntax are one example. Signal Detection
Theory is an information-theoretic framework, which uses a statistical
approach, similar to Type I and Type II errors in statistics (Wiley, Adv.
Study Behav. 2006).
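The Shannon entropy just mentioned (the expected bits over all alternatives, H = –Σ p·log2 p) is easy to compute; as an illustration, here it is applied to the dog-mood priors from the earlier slide:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2 p): average uncertainty of a random variable, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Dog-mood priors: play, food, amorous, fearful, aggressive
dog_priors = [0.2, 0.4, 0.2, 0.1, 0.1]
print(round(shannon_entropy(dog_priors), 2))   # 2.12 bits

# Uniform priors maximize uncertainty: log2(5) bits
print(round(shannon_entropy([0.2] * 5), 2))    # 2.32 bits
```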
III. Function, b. Info & Decisions
2. The Amount of Information
Signal Detection Theory (Acoustics)
Each curve is a probability density
function (PDF) of the outputs of a
perceptual channel with and without
the signal; the threshold is the output
level above which a response occurs
Signal detection theory predicts how
animals should increase the
separation of the background vs.
signal+background PDFs (i.e. the
signal-to-noise ratio), e.g.:
• Increase repetition rate
• Use diff freqs than background
• Use diff amplitude modulation
• Use longer signals
III. Function, b. Info & Decisions
2. The Amount of Information
Another example of how information theory is used:
Zipf’s statistic evaluates the signal composition or ‘structure’ of a
repertoire by examining the frequency of use of signals in relation
to their ranks (i.e. ranked first, second, third, from most to least
frequent) (McCowan et al. 1999)
• Measures the potential capacity for info transfer at the repertoire level by
examining the ‘optimal’ amount of diversity and redundancy necessary
for communication transfer across a ‘noisy’ channel (i.e. all complex
audio signals will require some redundancy)

[Figure: log10 frequency vs. log10 rank of use, for <1 mo. old vs. adult dolphin whistles]
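A minimal sketch of the Zipf statistic: fit a least-squares line to log10(frequency) against log10(rank) and read off the slope (a slope near –1 is the classic Zipf pattern). The whistle-type counts below are invented for illustration, not McCowan et al.'s data:

```python
import math

def zipf_slope(counts):
    """Least-squares slope of log10(frequency) vs. log10(rank),
    with signal types ranked most-to-least frequent."""
    freqs = sorted(counts, reverse=True)
    xs = [math.log10(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log10(f) for f in freqs]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Hypothetical whistle-type counts (roughly 200 / rank)
counts = [200, 100, 66, 50, 40, 33, 28, 25]
print(round(zipf_slope(counts), 2))   # close to -1 for this Zipf-like series
```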
III. Function, b. Info & Decisions
3. Encoding Information
Now we’ll discuss how the ideal receiver uses information to
update his/her probability estimates (i.e. calculate pupdated)
Example: Suppose the Receiver needs to know whether condition
C1 or condition C2 is currently true
There are two signals, S1 and S2 that can be used to provide
information about this question
We can summarize the coding rules for this system by constructing a
coding matrix with the conditional probabilities in the cells
E.g. the conditional probability P(S1|C1) is the probability that
signal 1 occurs when condition 1 is true

             Condition:  C1          C2
Signal: S1               P(S1|C1)    P(S1|C2)
        S2               P(S2|C1)    P(S2|C2)
III. Function, b. Info & Decisions
4. Making Decisions
A Receiver’s task is to combine prior probabilities, knowledge
of the coding matrix, and receipt of a particular signal to
produce new updated probabilities of the alternatives
There are many ways to update, but no mechanism of
updating can be more accurate than Bayesian updating;
it’s the theoretical upper limit
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Basic Logic: First, assemble the priors and coding matrix
Bayes’ Theorem states that the updated probability that C1 is
true after receipt of signal S1 is:

p(C1|S1) = p(C1)·p(S1|C1) / [ p(C1)·p(S1|C1) + p(C2)·p(S1|C2) ]

             Condition:  C1          C2
Signal: S1               p(S1|C1)    p(S1|C2)
        S2               p(S2|C1)    p(S2|C2)
Priors:                  p(C1)       p(C2)

Note that all the numbers we need to solve this are in our
coding matrix
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
p(C1|S1) = p(C1)·p(S1|C1) / [ p(C1)·p(S1|C1) + p(C2)·p(S1|C2) ]

The numerator is the probability that we would see C1 and S1 together
The denominator is the probability that we would see C1 and S1
together plus the probability we would see S1 and C2 together; thus the
denominator is the overall fraction of time we might see an S1 signal
The best estimate of the updated probability is thus the fraction of
those S1 observations in which C1 is also true
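Bayes' Theorem as written above translates directly into code; a minimal two-condition sketch, here applied to the movie-review example from the earlier slides (prior 0.72, with P(S1|C1) = 0.7 and P(S1|C2) = 0.1 from that coding matrix):

```python
def bayes_update(prior_c1, p_sig_given_c1, p_sig_given_c2):
    """Updated p(C1 | signal), from the prior and the coding-matrix
    row for the signal that was actually received."""
    prior_c2 = 1.0 - prior_c1
    numerator = prior_c1 * p_sig_given_c1                  # p(C1 and S1)
    denominator = numerator + prior_c2 * p_sig_given_c2    # overall p(S1)
    return numerator / denominator

# One positive review moves "love it" from 0.72 to about 0.95:
print(round(bayes_update(0.72, 0.7, 0.1), 3))   # 0.947
```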
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example:
• Suppose females of a bird species use the rate of
male songs to assess the health of potential mates
• Healthy males tend to sing Fast songs and sick ones
tend to sing Slow songs
• Suppose the two types of males are almost equally
common (52% healthy, 48% sick)
• Suppose also that coding is not perfect: Good males
sing Fast songs 70% of the time, whereas Bad males
sing Slow songs 60% of the time
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example: We first assemble the information available before
receipt of a signal
A female assumes any male has a 52% chance of being Good
before she hears any songs
             Condition:  Good   Bad
Signal: Fast             0.70   0.40
        Slow             0.30   0.60
Priors:                  0.52   0.48
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example: After receipt of a Fast song, that estimate goes to:
p(Good | Fast) 
…If that song had been Slow, her estimate would have been:
p(Good | Slow) 
             Condition:  Good   Bad
Signal: Fast             0.70   0.40
        Slow             0.30   0.60
Priors:                  0.52   0.48
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example: After receipt of a Fast song, that estimate goes to:
p(Good | Fast) 
Sequential updating: If the female is finished listening, then
0.655 is her final estimate. But if she’s going to keep listening,
she now updates her priors to the new values she has obtained
             Condition:  Good   Bad
Signal: Fast             0.70   0.40
        Slow             0.30   0.60
Priors:                  0.52   0.48

[Chart: p(Good) vs. # songs sampled]
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example: Suppose 2nd song is also Fast:
p(Good | Fast) 
And she again updates her priors by replacing them with the
most recent updated probabilities
             Condition:  Good   Bad
Signal: Fast             0.70   0.40
        Slow             0.30   0.60
Priors:                  0.655  0.345

[Chart: p(Good) vs. # songs sampled]
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example: Suppose 3rd song is Slow:
p(Good | Slow) 
She updates to the new probabilities and uses these as the next
prior probabilities…
             Condition:  Good   Bad
Signal: Fast             0.70   0.40
        Slow             0.30   0.60
Priors:                  0.766  0.234

[Chart: p(Good) vs. # songs sampled; the estimate drops from 0.766 to 0.621]
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example: Suppose 4th song is Fast:
p(Good | Fast) 
And so on…
             Condition:  Good   Bad
Signal: Fast             0.70   0.40
        Slow             0.30   0.60
Priors:                  0.621  0.379

[Chart: p(Good) vs. # songs sampled]
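The four-song sequence in these slides (Fast, Fast, Slow, Fast) can be replayed with a short loop that recycles each updated probability as the next prior. Note that the slides round to three digits at every step, so later values here differ slightly from theirs:

```python
# Coding matrix from the slides: p(signal | condition)
P_FAST = {"good": 0.70, "bad": 0.40}
P_SLOW = {"good": 0.30, "bad": 0.60}

def update(p_good, song):
    """One Bayesian step: new p(Good) after hearing one song."""
    col = P_FAST if song == "fast" else P_SLOW
    num = p_good * col["good"]
    return num / (num + (1.0 - p_good) * col["bad"])

p_good = 0.52   # prior before any songs
for song in ["fast", "fast", "slow", "fast"]:
    p_good = update(p_good, song)
    print(song, round(p_good, 3))
# First value 0.655 matches the slides exactly; without intermediate
# rounding, the later steps come out 0.768, 0.624, 0.744 rather than
# the slides' 0.766 and 0.621, but the trajectory is the same.
```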
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example: Sequential Sampling: Although the trajectory is
jagged (and different every time), the general trend if a male is
truly Good will be up, and if he is Bad, down. The Truth will
come out….
[Chart: p(Good) vs. # songs sampled; a Good male's trajectory climbs toward 1.0 with Fast songs, a Bad male's falls toward 0.0 with Slow songs]
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example: Sequential Sampling: Note that in general, the change
in probabilities, Δp, for each successive song is smaller than for
earlier songs
What does this mean for the amount of information transmitted?
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Example: Sequential Sampling: Finally, note that less accurate
coding matrices will cause the cumulative estimate to take
longer to asymptote to the extreme:
What does this mean for the amount of information transmitted?
[Chart: p(Good) vs. # songs sampled for a more accurate vs. a less accurate code]
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Do Animals use Bayesian updating?
Sequential assessment of signals and cues is very common,
from primates to honeybees; Bayesian updating is an optimal
strategy for sequential updating if:
• Animals have reasonable prior probabilities about likelihood of
alternative conditions, and accuracy of coding scheme
• Animals have time to assess signals and cues sequentially
• Animals have the neural capacity to store the information
Some animals use short cuts and rules of thumb for updating
which may be quite good. Bayesian updating is the best
possible, but that’s not always the optimal thing to do… Even
so, understanding BU is important because it defines the
upper limit of what’s possible for comparison!
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Mate searching by female satin bowerbirds:
Females visit males at their bowers to
assess their signals (bower, decorations)
Males mate with multiple females
How should females find the best male?
Alternative hypotheses:
Threshold vs. Best-of-n models
Al Uy found that females visit multiple males at their bowers during the
mating season and they visit each male multiple times (sequential
updating) before mating with one male
Fits predictions of a Best-of-n model with Bayesian updating
(Luttbeg 1996)
III. Function, b. Info & Decisions
4. Making Decisions, i. Bayesian Updating
Mate searching by female satin bowerbirds:
When females find a high-quality male, they
shop less the next year and often mate with
him again (they re-affirm prior estimates
and re-mate if he’s still good)
Females who mated with a bad male will
avoid him the following year and find a
better mate
Older (more experienced) females often went straight back to the best
male in the population each year without shopping
When he died, they were all forced to start shopping again…
(Uy et al. 2000, 2001)
III. Function, b. Info & Decisions
5. Take Home Messages
Benefits of Communication:
• Animals know the potential answers to most questions but
may be unsure which answer is currently true
• Senders can provide information that helps Receivers
improve their probability estimates for each alternative
• Receivers can improve estimates further by sampling
successively and/or only attending to accurate signals
Costs of Communication:
• Providing more accurate signals or sampling successively
increases the costs of communication for both parties
• How far does an imperfect signal have to change a prior
probability before it is worth the costs of sending and
receiving it? This is an optimization problem
III. Function, b. Info & Decisions
5. Take Home Messages
Optimal Information:
• It never really pays to try to send or seek perfect
information through signals
• Instead, animals are likely to establish some intermediate
compromise in which they sometimes err
• Errors in communication are not evidence of faulty
evolution but the reasonable application of good economics
• Optimality ≠ perfection!
III. Function, c. Honesty in Advertising
Early ethological approach: Because signals evolve from intentions,
preparatory movements, physiological precursors, etc… they
reliably predict what sender will do next because sender can’t help
it (they are constrained to be honest). Often ignored conflict
entirely, and viewed communication as an altruistic exchange of
information
Dawkins/Krebs arms race and early game models: Senders should
try to trick, mislead, and manipulate receivers into giving responses
benefiting sender, and receivers should become mind-readers
trying to discount false signals
Zahavi Handicaps: Receivers only pay attention to signals that
impose a cost (handicap) on senders, which makes it costly to
send dishonest or exaggerated signals
III. Function, c. Honesty in Advertising,
2. Current Thinking
There are several dozen game-theoretic models of communication
when there is a conflict of interest between the sender and
receiver, each depicting a different signaling context
Common theme: There must be some type of cost or constraint
imposed on senders to guarantee honesty, but this cost is
different for each model or context
We’ll discuss 3 categories of costs in communication:
A. Necessary costs
B. Incidental costs
C. Constraints
Both senders and receiver may pay these costs, but it is the cost to
the senders which we use to categorize the signals. Costs to
receivers are also important, because they select for “mind-readers”
who only respond to honest signals.
III. Function, c. Honesty in Advertising,
1. Costs
A. Necessary Costs: Costs paid up front, do not depend
on receiver response, includes:
• Prior investment by sender in special structures, coloration,
organs, brain circuitry, etc.
• Immediate costs sustained by sender while communicating,
such as time lost, energetic expenditure, and predation risk
• Receivers also pay some necessary costs (assessment costs,
possible brain and sensory costs, etc.), which favors receivers
who only pay attention to honest signals (i.e. “mind-readers”).
III. Function, c. Honesty in Advertising,
1. Costs
B. Incidental Costs: Decreases in magnitude of payoffs to
either the sender or receiver; does depend on receiver
response
• Costs to the sender: if the receiver punishes the sender for
sending the signal (e.g. badges of status), this selects for honest
signals.
• Receivers can also pay incidental costs: if the sender deceives the
receiver into acting against the receiver’s interests (sender
deceit, bluff, exaggeration, withholding information). These costs
select for receivers who only pay attention to honest signals (i.e.
“mind-readers”).
III. Function, c. Honesty in Advertising,
1. Costs
C. Constraints: limits on communication imposed by
environment, phylogenetic history and physics
Examples: Frequency and amplitude are limited by body size,
brain size limits learning of songs, etc.
These aren’t always costly to signalers, but they prevent
cheating because overcoming the constraints (if that’s even
possible) would require costs too large to bear
III. Function, c. Honesty in Advertising,
3. Types of Signals
Type of cost affects the signal form, i.e. whether the signal is
arbitrary or linked to the signal “message” (e.g. “I am big”)
Approach: classify and name signals by the type of cost
that guarantees honesty
Doing so, we end up with three types of signals:
A. Quality handicap signals
B. Index signals
C. Costly Conventional signals
Note: this is yet another area with many different terms and frameworks!
III. Function, c. Honesty in Advertising,
3. Types of Signals, A. Quality Handicaps
Zahavi (1975, 1977) proposed that signals need costs to maintain
honesty, and that we should see that animals pay for their
ornaments with fitness costs (e.g. they use up some of what
they’re advertising)
Idea not given much credence until 1990, when Grafen created a
plausible game theory model showing that it works. Recent work
by Getty makes the story much more complicated!
Maynard Smith and Harper (1995) discuss how there is a minimum
“efficacy cost” that must be paid to send the signal; handicap signals
are “Cost-added signals”, where there
is extra cost paid beyond efficacy cost
to ensure honesty (this is often forgotten
in measures of handicap costs)
III. Function, c. Honesty in Advertising,
3. Types of Signals, A. Quality Handicaps
Cost: necessary costs (signal production costs, predation)
Key feature: Poor quality individuals pay a higher cost to
produce a given level or intensity of display compared to high
quality individuals (condition-dependent handicap model)
Signal form: graded and linked to that aspect of quality that the
receiver wants to know (signal "uses up" the quality feature of
interest).
Information: Condition, health, vigor, fighting ability
Contexts: mate attraction, some agonistic interactions
III. Function, c. Honesty in Advertising,
3. Types of Signals, A. Quality Handicaps
Grafen 1990 model of mate quality
asymmetric continuous-strategy scramble
Assumptions:
1. Displaying is costly, reducing survival of displayer
2. High-quality male pays lower cost than low-quality male
3. Females more likely to mate with high-investing male
4. Female preference for a given display level is not different for
high and low quality males

[Figure: cost or benefit vs. display intensity]

Solution: At ESS, high-quality males display at a higher intensity than
low-quality males, so display intensity is an indicator of male quality or
condition. (Males in better condition may be better parents, or have better
genes, etc.)
III. Function, c. Honesty in Advertising,
3. Types of Signals, A. Quality Handicaps
Widowbirds (Malte Andersson, 1982): costly graduated tail
[Figure: nesting females; from BBC Life of Birds]
III. Function, c. Honesty in Advertising,
3. Types of Signals, A. Quality Handicaps
Carotenoid plumage color in house finches: costly to collect
and/or costly to use for plumage
(Geoffrey Hill 1990, 1991)
[Figure: feeding rate by male]
• Redder males provide more
• Coloration may be heritable
III. Function, c. Honesty in Advertising,
3. Types of Signals, B. Index Signals
Stabilizing cost: physical or physiological constraints
Key feature: Signal is physically constrained to be unbluffable
and honest
Signal form: inextricably linked to information revealed by signal
Information: body size, age, pointing
Context: agonistic, mate attraction, predator-prey
III. Function, c. Honesty in Advertising,
3. Types of Signals, B. Index Signals
Graphical representation of an index signal with continuous state
and signal size:

[Figure: signal size or intensity vs. sender attribute]

Assumptions:
1. Form of signal should be linked or associated with some type of
physical or physiological constraint
2. Higher intensity signal variants should be more effective but not
more costly to produce
III. Function, c. Honesty in Advertising,
3. Types of Signals, B. Index Signals
Examples: body size indicators, pointers, amplifiers
• Call frequency in toads is an index of body size
[Figure: fundamental call frequency (kHz) vs. snout-vent length (mm)]
• Gaze direction and pointing signals (e.g. pointing at nest site)
• Abdomen size is a condition index in the jumping spider; the
triangle is an amplifier
III. Function, c. Honesty in Advertising,
3. Types of Signals, B. Index Signals
Ritualized pushing/pulling contests: bison, Dempsey fish, young
bull elephants, elephant seal pups
III. Function, c. Honesty in Advertising,
3. Types of Signals, B. Index Signals
Nestlings increase begging as hunger increases (visual and vocal signal)
In some species, young nestlings exhibit a red mouth flush; hue and
saturation of the color increase with hunger
• Likely explanation: physiologically constrained (Kilner 1997)
Parents respond by increasing their provisioning rate
[Figures: % saturation and hue rank vs. time after feeding, before and after flush]
III. Function, c. Honesty in Advertising,
3. Types of Signals, B. Index Signals
Notification of Detection Signals: California Ground
squirrels perform tail-flagging displays to predatory snakes
Adults are not in danger from snakes, but
young in burrows are in serious danger
Parents tail-flag to drive snakes away
Rattlers have IR-sensitive pit organs;
GS’s produce an IR signal to
emphasize tail movement
Gopher snakes are not IR-sensitive,
and GS’s don’t use IR
Aaron Rundus, UCD grad student in
Animal Behavior & Don Owings
III. Function, c. Honesty in Advertising,
3. Types of Signals, B. Index Signals
Notification of Detection Signals: California Ground
squirrels perform tail-flagging displays to predatory snakes
If snake is not deterred, these
bad-ass squirrels attack…
When rattlers rattle, GS’s can
determine threat level (cue?)
III. Function, c. Honesty in Advertising,
3. Types, C. Costly Conventional Signals
Cost: Stabilizing cost is incidental (receiver retaliation)
Key feature: Retaliation rule: receivers "test" senders giving a
signal of similar size to their own (they dominate senders giving
smaller signals, and retreat from senders giving larger signals);
cheaters get caught because when they’re tested, they can’t
back it up. Cost of signal is thus higher for weak individuals.
Signal form: Arbitrary (symbolic) and antithetical, discrete or graded
Information: condition, fighting ability, motivation to escalate
Context: Agonistic interactions, cannot evolve solely for mating
III. Function, c. Honesty in Advertising,
3. Types, C. Costly Conventional Signals
What kind of cost? Sender incidental costs (Møller 1987)
• Size of patch is strongly correlated with
dominance rank
• Males with experimentally enlarged
patches were attacked more often
[Figure: aggressive encounters per 15 min, controls vs. experimentally enlarged patches]
Conventional signals are not used solely
for female choice; why?
III. Function, c. Honesty in Advertising,
4. Cheating
Do animals ever lie, bluff, or cheat?
Although prior models imply that cheating is rare, there are
some clear examples of occasional dishonesty and bluffing in
otherwise “honest signals”
A mixture of honest signals and low levels of bluff may be
common. Possible reasons:
• Perceptual errors by receivers may allow some cheaters
to escape detection
• Co-evolving sender/receiver systems have not reached
an equilibrium
• It may be too costly for receivers to get perfect information
(how much cheating to allow is an optimization problem)
III. Function, c. Honesty in Advertising,
4. Cheating
Tolerating deceit: Photinus and Photuris fireflies
Male Photinus flash in species-specific
patterns; females flash back, and males
approach to mate
Photuris mimic the response flash of
females, then eat the males = deceit
Why do males respond to the female
signals?
III. Function, c. Honesty in Advertising,
4. Cheating
Tolerating deceit: Photinus and Photuris fireflies
Males can’t find females without the
signals, so the benefit of responding to
the signals is large
Photuris only eat males in 10-15% of
attempts, so cost is small on average
So it is optimal for male Photinus to
respond to female signals even though
they occasionally get eaten
Optimality ≠ Perfection!!
III. Function, c. Honesty in Advertising,
4. Cheating
Tolerating exploitation: Tungara frogs & bats (Ryan 1982)
Female Tungara frogs prefer males
that produce a “chuck” call
Bats can use the chuck to localize the
males (quick onset, broadband)
It is optimal for males to produce the
calls, despite the risk of exploitation by
bats (benefits outweigh costs)
Males chuck less when females
absent; some males let other males
chuck, then intercept females
III. Function, c. Honesty in Advertising,
4. Cheating
When there is a conflict of interests between the
sender and receiver:
Most signals seem to be relatively accurate and honest
due to receiver selection for costly signals, but a low level
of inaccuracy and cheating is probably common
When there is very little to no conflict of interests
between the sender and receiver:
Honesty is easier to maintain, since both parties benefit
from honesty
III. Function, c. Honesty in Advertising,
5. Communication with low conflict
Low-cost signals occur with low conflict of interest
If little to no conflict, no-cost conventional signals are stable, and
everyone benefits by following the rules
• Male bluehead wrasse increase the
rate of fin movement as they approach
spawning
• Spots on fins ‘amplify’ movement
• Rate of movement gives information
about the male’s ‘intention’ to spawn
• Male and female both benefit from
coordinating spawning
(Dawkins and Guilford, 1994)
III. Function, c. Honesty in Advertising,
3. Types of Signals, D. Summary
SIGNAL TYPE: Quality handicap
  COST: Necessary costs: production, time, predation
  SIGNAL DESIGN: Graded display, intensity correlated with sender quality
  INFORMATION: Health, condition, stamina, fighting ability

SIGNAL TYPE: Index
  COST: Physiological and physical constraints
  SIGNAL DESIGN: Discrete or graded, form linked to sender attributes
  INFORMATION: Body size, strength, age, natal area, pointing

SIGNAL TYPE: Costly Conventional
  COST: Incidental costs: receiver retaliation
  SIGNAL DESIGN: Arbitrary form, discrete or graded signal
  INFORMATION: Motivation, willingness to escalate or fighting ability
III. Function, c. Honesty in Advertising,
3. Types of Signals, D. Summary
What maintains honesty in human advertising?
III. Function, d. Current Areas of Research
1. Honest signaling: Getty (TREE, 2006) argues that
handicaps are (unfortunately) not as simple as Grafen says.
Bergstrom et al. (PRS, 2002) show that in some
circumstances, low-cost honest signals are possible. Also, lots
of empirical work measuring costs...
2. Sensory Drive: How does sensory drive shape signals? Can
this contribute to reproductive isolation & speciation
(Boughman, TREE 2002)?
3. Noise Impacts: How does anthropogenic noise affect animal
communication and thus fitness? (Rabin et al., J Comp Psych
2003; Warren et al., Animal Behavior 2006; Patricelli and
Blickley, Auk 2006)
III. Function, d. Current Areas of Research
4. Brood Parasitism: The arms race between hosts and
parasites. Can these arms races lead to speciation? How do
parasites dupe hosts? How do hosts evolve resistance?
(Davies, Kilner, Hauber)
5. Sexual Selection: What do male signals indicate (e.g.
parasite resistance)? How do traits and prefs evolve (Kokko
et al., ARES 2006)? Can divergent sexual selection in
isolated populations lead to speciation (Panhuis et al. 2001)?
6. Multimodal signals / multiple signals: Why multiple signals
for what seems to be the same question? Are they actually
different questions? Are there different receivers (Andersson
et al., AmNat 2002; Coleman et al., Nature 2004)? Is it driven
by efficacy needs (Hebets & Papaj, BES 2005)?
III. Function, d. Current Areas of Research
7. Bird Song: How are songs in a repertoire used in interactions
(e.g. song matching)? How are songs learned? How do local
dialects arise? Do dialects contribute to reproductive isolation
among populations?
8. Receiver Perceptual systems: How do biases in the
sensory and perceptual systems of receivers shape signals?
E.g. How does perception shape comparisons (Bateson &
Healy TREE 2005; Hebets & Papaj, BES 2005; Ten Cate et
al., Current Biol 2006)...and of course squirrels with IR tails!
9. Referential/Representational Signals: Are signals truly
referential to the external world or just the internal state of the
animal (e.g. fear)? Do they evoke mental representations in
receivers? (Evans & Evans Biol Letters 2006)