Musicæ Scientiæ
Discussion Forum 4B, 2009, 139-179
© 2009 by ESCOM European Society
for the Cognitive Sciences of Music
Primary versus secondary musical parameters
and the classification of melodic motives
Zohar Eitan* and Roni Y. Granot**
* Tel Aviv University, Tel Aviv, Israel
** The Hebrew University, Jerusalem, Israel
• Abstract
Music theorists often maintain that motivic categorization in music is determined
by “primary” musical parameters — music-specific aspects of pitch and temporal
structure, like pitch intervals or metric hierarchies, serving as bases for musical
syntax. In contrast, “secondary” parameters, including important aspects of extramusical auditory perception like loudness, pitch register and timbre, do not establish
motivic categories. We examined systematically the effects of contrasts in primary
vis-à-vis secondary musical parameters on listeners’ melodic classification. Matrices
of melodic patterns, each presenting 8 motives, were created by all interactions
of two contrasting conditions in three musical features. In Experiment 1, four
matrices manipulated pitch contour, pitch-interval class, and a compound feature
involving the secondary parameters of dynamics, pitch register and articulation. In
Experiment 2, four different matrices manipulated rhythmic structure (metrical and
durational accent), pitch intervals, and the compound feature used in Exp. 1.
Participants (95 in Exp. 1, 27 of them musically trained; 88 in Exp. 2, 23 of them
musically trained) classified stimuli in each matrix into two equal-numbered (4-4) groups of motives “belonging together.” In both experiments,
most participants used contrast in secondary parameters as a basis for classification,
while few classifications relied on differences in pitch interval (Exp. 2) or interval
class (Exp. 1). Classifications by musically trained participants also applied melodic
contour and rhythm. Results suggest a hierarchy of musical parameters that
challenges those suggested by most music theorists. We discuss the ramifications
of the results for notions of perceived thematic-motivic structure in music, and
consequentially for cognitively-informed music analysis.
Keywords: musical motives, categorization, secondary parameters, pitch, rhythm.
Introduction
The ability to organize the environment into classes and categories is a basic cognitive
faculty, supporting information processing under limitations of memory and attention.
Music capitalizes on this ability by creating structures which are based on similarities
and contrasts in diverse parameters, applied across its multiple organizational levels
(see, e.g., Deliège, 1996, 2001a, 2001b). One important aspect of such structures is
the formation of units which maintain their identity under various transformations.
In music theory and analysis, the smallest units consistent with this definition are
termed musical motives, themselves the building blocks of larger musical structures
such as musical themes (Drabkin, 2001a, 2001b). Operationally, motives may be
described as variables “having time-varying complex acoustic properties with
temporal constraints” (Bartlett, 1984), situated at the bottom of a musical grouping
hierarchy (Lerdahl & Jackendoff, 1983; Deliège, 1987).
Music scholars differ widely in their notions of the constituting parameters,
structural functions, and very identity of musical motives 1. Yet, some notions
concerning motives and their manipulation in music are almost taken for granted by
most scholars. Firstly, it is generally agreed that motivic repetition, similarity and
variance are often important aspects of musical structure, shaping both small-scale
structures (e.g., a succession of similar motives shaping a musical phrase) and
significant large-scale relationships (as those between the opening motives of a
sonata-form movement and their variants in its development section). Secondly, a
broad agreement also exists regarding the parameters defining motivic categories,
that is, the musical variables that stay constant within a group of related motivic
variants. These consist firstly of time-ordered pitch-interval or interval-class (IC)
relationships 2 (often with tonal or voice-leading aspects taken into account), and
secondly of temporal structure, specifically the relative durational proportions and
metric accentuation of the motive’s constituent notes. Other features (such as
instrumentation, dynamics, tempo, textural and rhythmic density, or pitch register)
are supposedly “secondary parameters”: they may create variants within a motivic or
thematic category, but do not define or constitute motives and themes. Hence, a
motive played louder, faster, or transferred to a higher octave, would remain “the
same” motive, provided that its pitch/time structure (as defined by intervals or
interval-classes between its constituent pitches, and by their proportional durations
and metric accentuation) remains unaltered. In contrast, motives that differ in their
pitch or pitch/time structure, but resemble each other in features such as dynamics,
pitch register or instrumentation, would rarely be identified by music theorists as
belonging to the same motivic or thematic category (see, e.g., Meyer, 1973, p. 46).

(1) Compare, for instance, Burkhart, 1978, or Cohn, 1992 (examples of Schenkerian views), with Schoenbergian approaches, as in Schoenberg, 1934/1995, 1967, or Frisch, 1987. For recent reviews and discussions of thematic-motivic theory, see, e.g., Boss, 1999; Dunsby, 2002; van den Toorn, 1996; Zbikowski, 2002.

(2) An Interval Class (IC) comprises a pitch interval n, its octave complement (12 - n), and their octave compounds. For instance, IC 2 includes the intervals 2 (major 2nd), 10 (minor 7th), 14 (major 9th), etc. Interval classes can be regarded as measuring intervals between pitch classes, disregarding pitch height. Thus, for instance, C4 and D4, as well as D4 and C5, are both separated by IC 2. Though the term is mainly used in the analysis of post-tonal music (see, e.g., Straus, 1990, Ch. 1), it is also applicable to the analysis of motivic variance in tonal music.
Thus, for instance, Figure 2 (from Liszt’s Sonata in B minor, mm. 124-130) is
regarded as a close variant of the piece’s 2nd thematic statement (Figure 1; ibid.,
mm. 8-13), despite the extreme contrasts between the two statements in many
secondary parameters (tempo, texture, dynamics, articulation), since they share
defining pitch- (similar succession of pitch intervals) and temporal structure (durational
proportions and metric hierarchy of constituent notes).

Figure 1.
Liszt, Sonata in B Minor, mm. 8-13.

Figure 2.
Liszt, Sonata in B Minor, mm. 124-130.

Figure 4 (mm. 673-676), on
the other hand, will not be acknowledged as a variant of Figure 1, despite their close
similarity in the secondary parameters of dynamics (f or ff), articulation, texture
(four-octave doubling), tempo (fast) and registration. Rather, due to similarity in
pitch structure (i.e., pitch-interval succession), it would be readily categorized as a
variant of the sonata’s opening theme (Figure 3; mm. 1-7), their vast differences in
the very same secondary parameters of dynamics, articulation, texture and tempo
notwithstanding. 3

(3) For analyses of thematic transformation in Liszt’s B minor Sonata, see Hamilton (1996), Walker (2005). Liszt’s Sonata was used in Pollard-Gott’s study of thematic perception (1983).
Figure 3.
Liszt, Sonata in B Minor, mm. 1-7.
Figure 4.
Liszt, Sonata in B Minor, mm. 673-676.
What distinguishes primary from secondary musical parameters?
Meyer (1985, 1989) suggests that the basis for distinction between primary and
secondary parameters in music is the capacity of the former to produce discrete, non-uniform relationships among distinct elements, “so that the similarities and
differences between them are definable, constant, and proportional” (1989, p. 14).
Primary parameters can thus define elements or relationships which establish
mobility and closure (e.g., up-beat and down-beat, leading-tone and tonic, dissonance
and consonance), hence enabling musical syntax. In contrast, “the relationships
created by secondary parameters involve changes in relative amount along an
unsegmented continuum; for instance, faster or slower pitch-frequency, louder or
softer dynamics, higher or lower rates of activity, etc.” Hence, “dynamics, timbre,
rates of activity, pitch-frequency, concord and discord, and so on are understood to
be secondary parameters. Because they do not stipulate relationships of mobility,
stability and closure, these parameters do not give rise to syntax.” (Meyer, 1985,
p. 46).
From the viewpoint of ecological psychology, Balzano (1986) similarly qualifies
the unique properties of pitch and time in music as presenting a selected set of
discrete relationships which reduce the potentially infinite number of possible values
to a cognitively manageable number. This provides for efficient processing, in turn
serving as a basis for higher-level cognitive structuring. In the pitch domain, such
relationships are derived from a unit interval, which creates the entire pitch space by
iterative steps within the boundaries of the octave (“generatively quantized” in
Balzano’s terms, ibid., p. 218). In Western music, three isomorphic pitch structures,
based on the semitone, the fifth, and major or minor thirds, create this space. In the
temporal domain, equivalence relationships are derived from the position of beats
within the metrical cycle, generating temporal invariants such as on- versus off-beat
or strong versus weak beat.
Note that distinctions between primary and secondary parameters are not
equivalent to a psychoacoustically-based distinction between auditory parameters
(e.g., pitch and duration as “primary”, loudness and timbre as “secondary”
parameters). Both pitch and duration can generate continuous, secondary musical
relationships (as defined by Meyer), as well as syntactic, primary ones. Thus, while
pitch chroma establishes primary parameters, defining discrete elements and
relationships like intervals, ICs, or (in tonal music) scale-degrees, pitch height is a
secondary, non-syntactic parameter, as it “involves changes in relative amount along
an unsegmented continuum” (Meyer, 1985, p. 46), rather than limited, discrete
relationships. As Balzano notes, this is because pitch chroma limits the number of
relevant pitch categories (e.g., pitch-classes, intervals or ICs), such that they can be
efficiently discriminated and processed. In contrast, distinctions regarding pitch
height are either continuous and unsegmented, or involve simple, crude binary
distinctions (“high” vs. “low”, “ascending” vs. “descending”). Similarly to pitch,
temporal relationships can generate both primary parameters (such as those defined
by musical meter and rhythm), and continuous ones, as expressed, for instance, in a
gradual change of tempo or event density (the rate of events per unit of time).
Importantly, the distinction between primary and secondary parameters in music
also demarcates the specifically-musical from the extra-musical, “natural” aspects of
auditory organization. In fact, natural, extra-musical auditory stimuli, as well as
speech (see, e.g., Cruttenden, 1997) or expressive vocalizations in humans and other
species (Hinton, Nichols, & Ohala, 1994), are characterized by changes in secondary
parameters like loudness, articulation, timbre, pitch register and pitch contour. In
contrast, primary parameters and their constituent elements, such as discrete pitch
intervals or metric hierarchies, are hardly found outside music. Thus, as Meyer notes
(1989, p. 16; also see Eitan, 1997; Hopkins, 1990), secondary parameters may
function as “natural” signs (e.g., of continuity or discontinuity), while primary,
syntactic parameters are conventional. Hence, while the perception and cognition of
secondary parameters, even in a musical context, may be chiefly based upon general
auditory experience (or perhaps even on innate tendencies), processing primary
parameters relies chiefly on music-specific exposure (Demorest & Serlin, 1997;
Dowling, 1978, 1999; Dowling & Bartlett, 1981; Edworthy, 1985), though not
necessarily on explicit music training (Bigand & Poulin-Charronnat, 2006).
As figures 1-4 demonstrate, manipulations of musical motives and themes often
involve considerable contrasts with the original theme in secondary parameters like
loudness and pitch register, while maintaining underlying structure, as established by
the primary parameters of pitch intervals, scale degrees and metric hierarchy. In
contrast (as in Figures 1 and 4), musical figures which are, according to music
theorists’ analyses, structurally unrelated, may still exhibit close similarities in their
secondary parameters. This study aims to examine systematically whether listeners’
categorizations of musical patterns that vary in primary or secondary parameters
would reflect music theorists’ deeply-held convictions, by giving preference to
similarities in primary parameters like pitch intervals over affinities in secondary
parameters like dynamics, pitch register or timbre.
Musical parameters in similarity and classification tasks
In contrast to predictions based on music theory, listeners often tend to categorize
musical events according to similarities in domains such as dynamics, density, or
pitch contour — music theorists’ “secondary parameters” — rather than through
domains specifically related to musical style or syntax, such as harmonic progression
or melodic intervals (Lamont & Dibben, 2001; Pollard-Gott, 1983). Though this
tendency is particularly prominent given little exposure to the music in question
or limited music training, even with repeated exposure and considerable training
some similarity relationships considered by music theorists essential to a proper
understanding of the piece’s structure are simply not perceived. This finding is
particularly valid for post-tonal music (Bruner, 1984; Gibson, 1986, 1993). When
music whose motivic structure is based upon post-tonal pitch transformations is
presented, listeners often hear other associative relationships instead, based upon
similarities and contrasts in secondary parameters like dynamics or register, shared
affective connotations, or cross-modal relationships, and ignore the pitch-based
transformations suggested by music theorists and composers (Dibben & Clarke,
1997; Hudgens, 1997; McAdams et al., 2004). Yet, simple perceptual cues for
categorization may prevail even in tonal music. For instance, a multi-dimensional
scaling in Lamont and Dibben (2001) revealed that the two principal dimensions of
Musical parameters and motivic classification
zohar eitan and roni y. granot
thematic similarity in a Beethoven sonata were associated with supposedly secondary
parameters such as dynamics, articulation, texture, tessitura and pitch contour, while
tonal pitch configurations, determinants of thematic identity according to most
music theorists, were not associated with the principal dimensions. Similarly, Ziv
and Eitan (2007), who used Lamont and Dibben’s stimuli in a categorization task,
found that features best corresponding to listeners’ categorizations of thematic
variants included texture, pitch contour, and dynamics, while harmonic and melodic
interval structures, preeminent in music analysis, again did not play a major role in
listeners’ thematic categorizations, whether in a tonal piece (the 1st movement of
Beethoven’s piano sonata, op. 10 no. 1) or in a dodecaphonic piece (Schoenberg’s
Klavierstück, op. 33a). 4

(4) A seeming exception is Bigand (1990), where participants successfully sorted variations to a melody according to their underlying structure. We refer to this study in the discussion (pp. 166-67).
Importantly, previous studies of motivic and thematic perception (e.g., Eerola et
al., 2001; Lamont & Dibben, 2001; McAdams et al., 2004; Pollard-Gott, 1983; Ziv
& Eitan, 2007) have mostly used actual materials from the musical repertory,
including complete pieces of music. While itself laudable (as it examines perception
in a real musical context, thus enhancing ecological validity), the use of complex
musical stimuli, in which many musical parameters may operate simultaneously and
at several levels of syntactic hierarchy, makes the relative contribution of these
parameters very hard to dissociate. This difficulty is clearly demonstrated not only in
studies applying complex post-tonal music, like McAdams et al. (2004), but even in
the simpler case of folk melodies (Eerola et al., 2001), where the authors found it
difficult to interpret the similarity ratings of participants, suggesting that the relative
contribution of each parameter can only be determined in more controlled stimuli
(p. 285). The current study applies such controlled stimuli to examine the relative
contributions of supposedly primary versus secondary musical parameters in a
classification task.
Aims and general design
In the experiments reported here, we aim to investigate directly and systematically to
what extent listeners’ preferred classifications of musical stimuli rely on primary
parameters like pitch intervals, as most music theorists suggest, or on secondary
parameters like pitch contour, dynamics or register. The experimental stimuli
(melodic figures of 1-4 seconds, comprising 3-6 notes, akin in duration to typical
musical motives) were manipulated using a 3X2 factorial design, in which three
musical variables were featured in two contrasting states each, with other musical
variables kept, as far as possible, constant. Altogether, this design created matrices of
eight different motives, each characterized by a unique combination of the three
musical variables (see Figure 5). In Experiment 1 we manipulated three variables.
The first is interval class (IC) — perhaps the most important primary parameter in
analyses of post-tonal music, and of considerable importance in analyses of tonal
pitch structure as well. In post-tonal theory, IC is a defining feature of basic pitch-class structures. In tonal music, ICs, relating an interval and its octave complement
(e.g., a major 3rd and a minor 6th), define relationships of seminal harmonic,
contrapuntal and motivic significance, such as that between a triad and its inversions,
or invertible counterpoint. The second variable is pitch contour, established as a
seminal feature in melodic memory processes (e.g., Dowling, 1978; Edworthy,
1985). The third variable, “Expression,” is a compound of secondary parameters,
including dynamics, texture (unison or octave doubling), articulation (legato or
staccato) and register. Experiment 2 (conducted with a different set of participants)
manipulated pitch intervals (rather than IC), rhythm (agogic and metrical accents),
and “Expression” — the same compound of secondary parameters used in
Experiment 1. Each experiment presented four different matrices. Experiment 2 also
examined the effect of tonality, as two of the matrices in that experiment involved tonal
figures, while the remaining two matrices involved atonal variants of those figures.
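To see the design concretely, the crossing of three two-state variables that generates each eight-motive matrix can be enumerated directly. In the minimal Python sketch below, the state labels are our illustrative placeholders, not the study’s actual materials:

```python
from itertools import product

# Three manipulated variables, each in two contrasting states
# (Experiment 1 shown; the labels are illustrative placeholders).
variables = {
    "interval_class": ("IC set 1", "IC set 2"),
    "contour":        ("contour 1", "contour 2 (mirror)"),
    "expression":     ("expression 1", "expression 2"),
}

# The full 2 x 2 x 2 crossing yields the eight motives of one matrix.
matrix = list(product(*variables.values()))
assert len(matrix) == 8
for i, motive in enumerate(matrix, start=1):
    print(f"motive {i}:", dict(zip(variables, motive)))
```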
The two experiments involve the same tasks and procedures. The main task in
both was to classify the eight motives in each matrix into 2 equal-numbered (4-4)
groups whose members seem to “belong together”. Out of 35 possible classifications
of each matrix 5, only three would maintain categorizations consistently based on one
of the three manipulated variables. This task addresses our main research question:
which (if any) of the manipulated musical variables guide listeners in classifying
musical motives. More specifically, we examine whether musical parameters regarded
as “primary” by academic music scholars (e.g., IC in Experiment 1) indeed serve a
primary role in listeners’ classifications.
After classification, subjects were asked to select the motive which represents the
“best example” of each of the two groups they had created. This task investigates
typicality gradience, one of the main prototype effects discussed by Rosch (1978).
Discovering such prototype effects may shed light on perceived relationships among
different musical parameters (e.g., would a motivic category defined by a specific IC
pattern be associated with a specific contour?).
We note at the outset that our experimental setup and task are evidently not
equivalent to “natural” music listening. Nonetheless, as proposed in the Discussion
(pp. 170-72 ), we maintain that our results are still relevant to the main question at
hand, namely, the relative weight of the primary versus secondary parameters in
motivic classification.
(5) Given 8 different stimuli, 70 different 4+4 groupings are possible; however, since each two of these groupings (e.g., <1,2,3,4> <5,6,7,8> and <5,6,7,8> <1,2,3,4>) are equivalent, 35 non-equivalent groupings are available.
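The counting argument in this footnote can be verified mechanically; a tiny Python check (ours, purely illustrative):

```python
from itertools import combinations
from math import comb

motives = set(range(1, 9))  # the eight motives of one matrix

# Choosing one group of four fixes the other, so each 4+4 split is
# counted twice among the C(8,4) = 70 four-element subsets.
splits = {frozenset({frozenset(g), frozenset(motives - set(g))})
          for g in combinations(sorted(motives), 4)}

assert comb(8, 4) == 70
assert len(splits) == 35
```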
Musical parameters and motivic classification
zohar eitan and roni y. granot
Methods
Experiment 1
• Participants: Ninety-five Tel Aviv University students (aged 18-44, M = 23.75;
SD = 3.82; 56 females, 39 males) participated in the experiment. Twenty-seven of
these (20 females, 7 males) had at least seven years of musical training (“musicians”),
while the remaining 68 (32 females, 36 males) had little or no formal musical
training.
• Experimental materials: The musical stimuli consisted of four sets or matrices of
eight brief melodic motives each (3-5 notes; overall durations: 1.3, 1.5, 2.6, and
1.9 seconds, for motives in matrices A-D, respectively). All matrices applied a similar
event rate. 6 All stimuli applied a piano-like synthesized sound, created through
Sibelius 1.2 music software, using the software’s “Grand Piano” sound, with the
software’s “expression” and rubato features turned off.

(6) Note that although the nominal tempo of Matrix A is 90 BPM, while that of all other matrices is 160 BPM, the rhythmic values of Matrix A are very short, such that the event rate of this matrix is similar to that of the other matrices.
The motives in each of the matrices were created using a 2 X 2 X 2 design: Two
sets of ordered interval classes (IC, Type 1 & Type 2 in Table 1), two contrasting
pitch contour types (hereafter contour gestures), and two contrasting “expressive
gestures,” defined by a compound of secondary parameters (see Table 1 and
Figure 5). In the two opposing IC sets in each matrix, ICs with the same ordinal position
all differed, and the two sets had at most one IC in common overall. Opposing
contour gestures in each matrix exhibited contour inversions (mirror images), such
that if in one figure the contour was up-down-up, in the contrastive figure it was
down-up-down. Note that contour inversions which also preserved ICs (e.g.,
motives 1 and 3 in Figure 5) did not preserve pitch intervals: in such cases, an
ascending interval turned into its descending octave complement and vice versa (e.g.,
an ascending minor 2nd → a descending major 7th, both representing IC 1), rather than
into an identical descending interval. “Expressive gestures” were defined by a
compound of dynamics, texture (one or more voices in octave doubling), pitch range
and articulation (various degrees of staccato vs. legato), all of which were shaped
contrastively in each matrix (Table 1). The two Expressive gestures in each matrix
were presented in different pitch transpositions.
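The octave-complement arithmetic behind these IC-preserving inversions (cf. footnote 2) can be spelled out in a few lines; a minimal sketch, ours and purely illustrative:

```python
def interval_class(semitones: int) -> int:
    """Reduce a directed pitch interval (in semitones) to its
    interval class: discard octave compounds, then fold the result
    against its octave complement, yielding a value from 0 to 6."""
    reduced = abs(semitones) % 12
    return min(reduced, 12 - reduced)

# Footnote 2's example: major 2nd (2), minor 7th (10) and major 9th (14)
# all belong to IC 2.
assert interval_class(2) == interval_class(10) == interval_class(14) == 2

# The contour inversions used here turn an ascending interval into its
# descending octave complement: an ascending minor 2nd (+1) becomes a
# descending major 7th (-11). IC is preserved; the pitch interval is not.
assert interval_class(+1) == interval_class(-11) == 1
```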
While tonality was not systematically manipulated in this experiment (e.g., tonal
vs. non-tonal motives; but see Experiment 2 below), contrasting IC sets in two of the
matrices opposed common tonal figures with non-tonal ones. In matrix A (see
Table 1), one IC configuration creates a major triad arpeggiation, while the other is
a chromatic configuration, hard to accommodate into any major or minor key. In
matrix B, one IC configuration constitutes a diminished-seventh chord, while the
other is non-tonal. In matrices C and D, none of the motives constitute segments of
diatonic major or minor keys or clearly tonal chromatic configurations.
As an illustration of the stimuli design, Figure 5 presents one of the four matrices
(Matrix B) used in Experiment 1. Stimuli 1-4 in this matrix all preserve the same IC
succession <1, 2, 6>, while stimuli 5-8 present a different one, <3, 6, 3>. Stimuli 1,
2, 5, 6 present the same pitch contour (rise-fall-rise), while stimuli 3, 4, 7, 8 present
its inverse (fall-rise-fall). Finally, stimuli 1, 3, 5, 7 and 2, 4, 6, 8 contrast with regard
to a compound of secondary parameters: the former present a single legato line in
the middle register, played softly (p followed by a diminuendo), while the latter
present a sharp staccato outburst, played very loudly (ff ), with two octave doubling.
Other musical variables, like rhythm and instrumental timbre (synthesized piano),
are kept constant throughout the matrix.
Figure 5.
Experiment 1, Matrix B.
Table 1
Description of the manipulated variables in the four matrices (A-D) in Experiment 1

Matrix A (90 BPM)
Interval class: Type 1 <3,4,5> (major triad); Type 2 <1,1,4>
Contour (Up-Down): Type 1 <U,D,U>; Type 2 <D,U,D>
Expression: Type 1: pp > pppp, Legato, Single line, Range C3-F#6; Type 2: ff < ffff, Staccato, 3 octave doublings, Range A1-F#5

Matrix B (160 BPM)
Interval class: Type 1 <1,2,6>; Type 2 <3,6,3> (diminished-seventh chord)
Contour (Up-Down): Type 1 <U,D,U>; Type 2 <D,U,D>
Expression: Type 1: p > pppp, Legato, Single line, Range C4-A5; Type 2: ff, Staccato, 3 octave doublings, Range Ab1-F#5

Matrix C (160 BPM)
Interval class: Type 1 <1,1,3,2>; Type 2 <5,5,1,4>
Contour (Up-Down): Type 1 <U,U,U,D>; Type 2 <D,D,D,U>
Expression: Type 1: f < ff > f, Legato, Octave doubling, Range F3-F5; Type 2: pppp, Staccato, 4 octave doubling, Range Bb0-A6

Matrix D (160 BPM)
Interval class: Type 1 <1,3>; Type 2 <4,6>
Contour (Up-Down): Type 1 <U,D>; Type 2 <D,U>
Expression: Type 1: f > p, Legato, Single line, Range Bb3-C#4; Type 2: pppp, Staccato, Single line, Range Ab1-C#3
• Task and procedure: Participants were asked to classify four matrices of eight
motives into two equal-numbered (4-4) groups of motives each (Task 1), and to
indicate which of the four motives in each group was the “best example” of that
group (Task 2). Motives were presented through a dedicated computer program.
Participants’ interface presented eight numbered icons, representing the eight
motives, at the bottom of the screen (the numbering of motives was randomized for
each participant and for each set of motives), and two boxes situated above them.
They could listen to each motive by clicking its icon, and could create groupings by
dragging the icons to the right or left boxes. Participants were allowed unlimited
hearings of each motive, and could experiment with different classifications with no
limit on time or number of attempts. They were tested individually, with the motives
presented over earphones (Yamaha RH-M5A). The software kept track of and timed
all participants’ activities (i.e., all clicks, indicating hearings, and all drags, indicating
classifications). In the current paper, however, we analyze only the final choices
(classifications and “best examples”) and overall duration (from initial presentation
of the stimuli to final decision) of the classification process in each block.
Experiment 2
• Participants: 88 Tel Aviv University students (mean age = 24.8, SD = 5.43;
32 females, 56 males) participated in the experiment. Twenty-three of the participants
(5 females, 18 males) had at least seven years of musical training (“musicians”), while
the remaining 65 (27 females, 38 males) had little or no formal musical training.
None of the participants took part in Experiment 1.
• Experimental materials: As in Experiment 1, musical stimuli consisted of four
matrices of eight short (3.9 - 4.3 seconds) melodic motives each, using a piano-like
synthesized sound. This experiment differed from Experiment 1 in the musical
parameters it manipulated. Here, the 2 X 2 X 2 design manipulated pitch Intervals
(rather than ICs), Rhythm (durational and metric accents), and Expressive Gesture
— a compound of the same secondary parameters used in Experiment 1, which
included dynamics, texture, pitch register, and articulation. Pitch contour, manipulated
in Experiment 1, was kept constant within matrices in Experiment 2. In contrasting
sets of ordered pitch intervals, all corresponding intervals differed; 7 in addition,
global differences were created between the two contrasting sets, such as conjunct
(stepwise) motion versus leaps or chordal arpeggiation. Contrasting rhythmic figures
featured opposite agogic and metrical accents (i.e., a note accented in one figure was
unaccented in the other) and different durational proportions. Figure 6 presents, in
musical notation, one of the four matrices used in this experiment; Table 2
summarizes the features manipulated in the four matrices.

(7) A single exception is the octave which opens motives in matrix A.
Unlike Experiment 1, Experiment 2 manipulated tonality systematically. Two of
the four matrices (A, C) presented tonal (major mode) motives only. For each of
these two tonal matrices, we created a matrix (B, D) whose motives corresponded to
it in Rhythm and “Expression,” but altered pitch intervals such that tonality was
obscured. Figure 7 presents two such pairs of motives (from matrices A and B) —
two major mode motives, and their two atonal counterparts.

Figure 6.
Experiment 2, Matrix A.
Table 2
Successions of pitch intervals, rhythmic figures, and Expression types used in Experiment 2

Matrix A1 (84 BPM)
Pitch Interval: Type 1 <+12, -2, -1, -2, +2> (Scale Degrees: 5-5-4-3-2-3; Keys: CM, F#M); Type 2 <+12, -3, -4, -3, +5> (Scale Degrees: 5-5-3-1-6-2; Keys: CM, F#M)
Rhythm: Type 1: Meter 3/4, notes 1-6, durational contour +, -, =, =, +; Type 2: Meter 6/8, notes 1-6, durational contour -, +, -, +, -
Expression: Type 1: mp < mf > mp, Legato, Single line, Range C#4-G5; Type 2: ffff, Staccato+accent, 3 octave doublings, Range C#1-G5

Matrix A2 (84 BPM)
Pitch Interval: Type 1 <+11, -1, -2, -1, +2> (Atonal); Type 2 <+9, -3, -3, -4, +4> (Atonal)
Rhythm: Type 1: Meter 3/4, notes 1-6, durational contour +, -, =, =, +; Type 2: Meter 6/8, notes 1-6, durational contour -, +, -, +, -
Expression: Type 1: mp < mf > mp, Legato, Single line, Range C#4-Gb5; Type 2: ffff, Staccato, 3 octave doublings, Range C#1-D#5

Matrix B1 (92 BPM)
Pitch Interval: Type 1 <+4, +3, +2, -2, -1> (Scale Degrees: 1-3-5-6-5-4#; Keys: BbM, BM); Type 2 <+2, +1, +9, -7, -3> (Scale Degrees: 2-3-4-2-5-3; Keys: BbM, BM)
Rhythm: Type 1: Meter 3/4, notes 1-6, durational contour =, =, +, -, =; Type 2: Meter 3/4, notes 1-6, durational contour -, +, -, +, -
Expression: Type 1: f <> f, Legato, Octave doubling, Range B3-G#6; Type 2: pppp, Staccato, Single line, Range B4-C#6

Matrix B2 (92 BPM)
Pitch Interval: Type 1 <+6, +1, +8, -2, -6> (Atonal); Type 2 <+3, +8, +3, -4, -4> (Atonal)
Rhythm: Type 1: Meter 3/4, notes 1-6, durational contour =, =, +, -, =; Type 2: Meter 3/4, notes 1-6, durational contour -, +, -, +, -
Expression: Type 1: f <> f, Legato, Octave doubling, Range Bb2-D#5; Type 2: pppp, Staccato, Single line, Range A5-B6

(Rows A1 and B1 correspond to the tonal matrices A and C; rows A2 and B2 to their atonal counterparts, matrices B and D.)

Pitch intervals are marked in semitones, with “+” or “-” designating pitch direction. For instance, +12 designates an ascending octave (12 semitones). In the tonal matrices, we also marked the succession of scale degrees and the keys used. Rhythmic figures are described by marking the meter, the rhythmic contour and the location of accented metrical beats in each figure. For instance, the Type 1 rhythmic figure in matrix A1 is in 3/4; it comprises 6 notes (1-6), of which the 1st and the 6th are on downbeats; the 2nd note is longer than the 1st and the 3rd note shorter than the 2nd, etc. (hence +, -, …). The main durational accent in each figure is marked by a larger font +.
Musical parameters and motivic classification
zohar eitan and roni y. granot
Figure 7.
Experiment 2: tonal motives (Matrix A) and their atonal counterparts (Matrix B).
• Task and procedure: Identical to those in Experiment 1.
Results
Experiment 1
• Task 1 (classification): Of the 35 classifications available to participants in each
of the four matrices (see also footnote 5), only three are completely consistent with
the three manipulated variables (i.e., one class including only motives featuring one
of the two contrasting states of a variable, the other including only motives featuring
the opposite state). If the manipulated variables indeed serve as a basis for
classification, the frequency of classifications consistent with these variables should
significantly exceed frequencies expected by chance, while the frequency of all other
possible classifications, taken together, should be smaller than chance. As Pearson
Chi-Square analysis shows (Table 3), this is clearly the case. Consistent classifications
comprise a large majority of classifications in all four matrices (ranging from N = 83
or 87.4% in matrix A to N = 76 or 80% in matrix C), though they comprise together
only 3 of the 35 possible classifications (8.6%). In particular, the single classification based on
“Expression” was selected by the majority of participants for all matrices (from
N = 60 or 63.2% in matrix A to N = 71 or 74.7% in matrix B). These huge
discrepancies between actual frequencies and those expected by chance are indicated
by the extreme Chi-Square results in Table 3 (Chi-square > 50, p ≈ 0 in all matrices).
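A sketch of the kind of comparison reported in Table 3, for Matrix A, using scipy’s chisquare; the expected frequencies follow from treating all 35 possible splits as equiprobable under chance (cf. footnote 5):

```python
from scipy.stats import chisquare

n = 95  # participants in Experiment 1

# Under chance, each of the 35 possible 4+4 splits is equally likely:
# 1/35 of n per consistent criterion, 32/35 of n for all the others.
expected = [n / 35, n / 35, n / 35, 32 * n / 35]  # IC, Contour, Expression, Others
observed = [6, 17, 60, 12]                        # Matrix A (Table 3)

chi2, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")  # chi2 far above 50; p effectively 0
```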
We next examined whether the weight given to each of the three consistent
classifications based on IC, Contour or Expression is equal. In this comparison
(Table 4), the expected frequency of each classification is calculated as 1/3 * N, where N is the number of participants whose classifications were consistent with any of the three variables.
Table 3
Experiment 1: Chi-Square analysis comparing the expected and actual frequencies of categorizations based on the three manipulated variables (Interval Class [IC], Contour and Expression) as compared to all the remaining 32 possible categorizations (Others)

Criterion for      Matrix A          Matrix B          Matrix C          Matrix D
Classification     Actual  Expected  Actual  Expected  Actual  Expected  Actual  Expected
Interval Class        6      2.7        1      2.7        2      2.7        0      2.7
Contour              17      2.7        8      2.7        6      2.7       14      2.7
Expression           60      2.7       71      2.7       68      2.7       64      2.7
Others               12     86.9       15     86.9       19     86.9       17     86.9
Total                95                95                95                95
Chi2 (df = 3)      > 50****          > 50****          > 50****          > 50****

**** p < .00001
As Table 4 shows, in all four matrices the null hypothesis that the proportions of the
three classifications are equal was strongly rejected (p < .00001). Indeed, as mentioned,
most participants applied “Expression” (secondary parameters) in their classifications,
while very few used IC as a basis for classification. In fact, the frequency of
classification according to IC differed from chance only in matrix A, where one of
the two IC sets presents a major triad while the other is an atonal configuration
(χ2 = 4.09, df = 1, p < .05).
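The equal-proportions comparison of Table 4 takes the same form, now restricted to the consistent classifications; a sketch for Matrix A (scipy’s default is equal expected frequencies):

```python
from scipy.stats import chisquare

# Matrix A, consistent classifications only (Table 4): under the null
# hypothesis, IC, Contour and Expression each attract 83/3 = 27.7.
observed = [6, 17, 60]               # IC, Contour, Expression
chi2, p = chisquare(f_obs=observed)  # default: equal expected frequencies
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")  # chi2 > 50, p < .00001
```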
We also compared the proportions of the IC, Contour, Expression, and “Other”
categories across the four matrices (see Table 3), thus inquiring whether the different
musical materials used in different matrices affected classifications. A Pearson Chi-squared test revealed a significant difference across the four matrices (χ2 = 18.97,
df = 9, p < .05), indicating that somewhat different weights were given to the three
parameters across matrices. Thus, for example, no participants applied IC as a
classifying criterion in matrix D, while 6 of the participants did in matrix A.
Similarly, while 17 (17.9%) of the participants used Contour in matrix A, only 6
(6.3%) did in matrix C. Nonetheless, it is evident that in all matrices the variable of
Expression had by far the strongest influence on participants’ suggested classifications,
while IC affected classification the least, if at all.
Musical parameters and motivic classification
zohar eitan and roni y. granot
Table 4
Experiment 1: Chi-Square analysis comparing the expected (equal proportions) and actual frequencies of categorizations consistent with the three manipulated variables (Interval Class [IC], Contour and Expression)

Criterion for      Matrix A          Matrix B          Matrix C          Matrix D
Classification     Actual  Expected  Actual  Expected  Actual  Expected  Actual  Expected
Interval Class        6     27.7        1     26.7        2     25.3        0     26
Contour              17     27.7        8     26.7        6     25.3       14     26
Expression           60     27.7       71     26.7       68     25.3       64     26
Total                83                80                76                78
Chi2 (df = 2)      > 50****          > 50****          > 50****          > 50****

**** p < .00001
Finally, we compared, collapsing across the four matrices, the classifications of
musically-trained participants with those of untrained ones. The influence of the
manipulated variables is somewhat different for musicians, as compared to
nonmusicians (χ2 = 10.53, df = 3, p < .05), with musicians relying less on Expression
(60% vs. 73%) and somewhat more on Contour (16% vs. 10%) and IC (6% vs.
1%). Note, however, that even among the musicians only 6% classified the motives
on the basis of IC — far less than music theorists would probably predict.
In addition to the classification data, we measured the time spent by each
participant in performing Task 1, from initial presentation of the stimuli to
completion (Table 5). Analyses of variance (ANOVA) were performed separately for
each of the four matrices, with classification type (Expression, IC, Contour, or
Other) as between-participant independent variable (df = 3), and timing as the
dependent variable. A significant effect of classification type on timing is shown for
all four matrices: Matrix A (F = 4.45; p < .01), Matrix B (F = 5.18; p < .01), Matrix C
(F = 6.76; p < .001) and Matrix D (F = 12.63; p < .0001). As shown in Table 5, in
all four matrices the average timing of participants who chose Expression (secondary
parameters) as the basis for classification was the shortest (overall average for all four
matrices 111.7 seconds), as compared to those of participants who classified by
Contour (overall average 206.3 seconds), IC (overall average 243.1 seconds), or
“others” (overall average 150.6 seconds).
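In outline, each of these analyses is a one-way between-participants ANOVA on the timing data; the sketch below uses scipy, with hypothetical placeholder timings rather than the study’s raw data:

```python
from scipy.stats import f_oneway

# Hypothetical per-participant classification times (seconds) for one
# matrix, grouped by the criterion each participant applied. These are
# placeholders only; the study's raw data are not reproduced here.
expression_times = [126.9, 102.3, 95.8, 143.0, 118.4]
contour_times    = [177.9, 211.5, 149.2]
ic_times         = [281.2, 250.7]
other_times      = [159.0, 170.2, 148.1]

# Classification type is the between-participant factor (df = 3);
# timing is the dependent variable.
f_stat, p_value = f_oneway(expression_times, contour_times,
                           ic_times, other_times)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```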
Table 5
Experiment 1: Mean and SD of time (in seconds) spent on the classification task as a function of the criterion for classification

Criterion        Matrix A        Matrix B        Matrix C        Matrix D        ALL
                 Mean     SD     Mean     SD     Mean     SD     Mean     SD     Mean     SD
Interval Class   281.2   232.9   135.0    NA     183.0    39.6    NA       NA    243.1   193.8
Contour          177.9   183.4   190.1    92.7   219.2   149.4   244.57  206.3   206.3   156.8
Expression       126.9    83.5    96.4    65.7   108.3    52.2   118.1    49.2   111.7    64.3
Others           159.0    71.2   133.7    86.6   153.1    66.2   156.7    68.3   150.6    68.3
All              149.8   123.2   110.6    75.8   125.8    70.6   143.6    87.3   132.5    92.5
• Task 2 (best example). In Task 2, participants were asked to select the motive that
best represents each of the two groups they had created. We examined results for this
task only where Expression served as the criterion for classification (between 63.2%
and 74.7% of all classifications, depending on the matrix), since other classifications
included too few observations for further tests. Two Chi-squared tests with Yates’
continuity correction were carried out separately for each group of selected motives
in each matrix. 8 One test examined whether IC affected participants’ choices of “best
example”, that is, whether motives comprising one IC set were chosen more
frequently than motives comprising its IC counterpart as best examples. The other
test examined, in a comparable way, whether Contour affected best example
choices.

(8) Yates’ continuity correction is often used in chi-square analyses when the number of observations in some cells, given a 2 x 2 matrix, is small. The correction subtracts 0.5 from the absolute difference between the observed and the expected frequencies for each cell.
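The correction described in footnote 8 amounts to shrinking each observed-expected difference by 0.5 before squaring; a generic illustration of the formula (ours, not a recomputation of the values reported in the tables):

```python
def yates_chi_square(observed: list[int]) -> float:
    """Chi-square against equal expected frequencies, with Yates'
    continuity correction: subtract 0.5 from each cell's absolute
    observed-expected difference before squaring."""
    expected = sum(observed) / len(observed)
    return sum((abs(o - expected) - 0.5) ** 2 / expected for o in observed)

# A 35-25 split of 60 choices against a 30-30 expectation:
print(round(yates_chi_square([35, 25]), 2))  # 1.35
```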
As an example, consider the following: 60 participants classified matrix A on the
basis of Expression, grouping all motives sharing type 1 Expression (pp > pppp,
legato, single line — see Table 1) in one group, and in the other all motives
sharing type 2 Expression (ff < ffff, staccato, 3 octave doublings). As seen in Table 6,
40 of these 60 participants selected motives comprising the major triad configuration
(IC1), as the “best example” for type 1 Expression group, while only 20 chose
motives comprising IC 2, its atonal IC counterpart (p < .01).
Significant preferences for representative IC sets (Table 6) are also shown in
matrix C, where the preferred set presents mainly conjunct motion — 2nds or their
inversions. Notably, these preferences are shown only for the “soft” Expression type
marked in both matrices by a legato, “melodious” articulation, and no doubling of
voices. Significant preferences for representative contours (Table 7) are revealed in
three of the four matrices (B, for Expression Type 2 only; C and D, for both
Expression types). Notably, all preferred contours comprise or begin with convex
(inverted U) contours, and involve congruence between pitch accent (rise) and
agogic, as well as metric accents.
Table 6
Experiment 1: Effects of Interval Class (IC) on the selection of a representative motive for each type of Expression

          Expression type 1                        Expression type 2
Matrix    IC 1    IC 2    χ2 (df = 1)       IC 1    IC 2    χ2 (df = 1)
A          40      20       6.66**            33      27       0.60
B          31      40       1.14              30      41       1.70
C          43      25       4.76*             32      36       0.23
D          33      31       0.06              26      38       2.25

* p < .05  ** p < .01  *** p < .001
Experiment 2
• Task 1 (classification). As in Experiment 1, participants’ proposed classifications
are clearly not random (Table 8), as classifications consistent with the three
manipulated variables highly exceed expected frequency (χ2 > 50, df = 3, p < .00001
in all matrices). Furthermore, as in Experiment 1, the relative weight of the three
variables is evidently not equal (Table 9): Expression was again used as a criterion for
classification more frequently than expected from equal distribution of the 3 criteria,
while Intervals were applied less frequently than expected (χ2 = 27.73, df = 2,
p < .00001; χ2 > 50, df = 2, p < .00001; χ2 = 10.87, df = 2, p < .01; χ2 > 50, df = 2,
p < .00001, in Matrices A, B, C and D respectively). Note, however, that while
Expression was participants’ primary classification criterion (with N = 46, 54, 36,
and 48, or 52.3%, 61.4%, 40.9% and 54.5% of participants applying it
consistently in matrices A, B, C, and D, respectively), a considerable proportion of
participants (from N = 30 or 34.1% in Matrix D, to N = 26 or 29.5% in Matrices A
and B) applied Rhythm consistently as their classification criterion.
Table 7
Experiment 1: Effects of contour on the selection of a representative motive for each type of Expression

          Expression type 1                              Expression type 2
Matrix    Contour 1   Contour 2   χ2 (df = 1)      Contour 1   Contour 2   χ2 (df = 1)
A            37          23         3.27              27          33         0.6
B            40          31         1.14              44          27         4.07*
C            46 (∩)      22 (∪)     8.47**            44 (∩)      24 (∪)     5.88*
D            43 (∩)      21 (∪)     7.56**            42 (∩)      22 (∪)     6.25*

* p < .05  ** p < .01  *** p < .001
As in Experiment 1, we compared the proportions of the classification categories
of Interval, Rhythm, Expression, and “Other” across different matrices (see Table 8).
Pearson’s Chi-squared test across all four matrices shows no significant difference
across matrices (χ2 = 14.9, df = 9, p = .09). We also compared the distribution of the
classification categories in the tonal matrices (A & C) with their atonal counterparts,
B & D. Between-subjects Pearson’s Chi-squared test indicates no significant
difference between matrices A & B (χ2 = 3.61, df = 3, p = .31), and a marginally
significant difference between matrices C & D (χ2 = 7.62, df = 3, p = .055). As
Table 8 shows, more participants applied Intervals in the tonal matrix C, as compared
to the atonal matrix D (11 vs. 3); in contrast, fewer participants applied Expression
in matrix C, as compared to D (36 vs. 48). An examination of within-subjects
distributions in matrices C and D reveals that only two of the 11 participants who
based their classifications on Pitch Intervals in the tonal Matrix C did so in its atonal
counterpart (Matrix D). Of the remaining nine participants, six shifted their
classification criteria from Intervals in the tonal version of the motive to Rhythm in
the atonal version. Correspondingly, 11 (38%) of the 29 participants who classified
the motives in the tonal Matrix C by Rhythm, shifted their classification criteria to
Expression in the atonal Matrix D. Thus, the weakening of tonal structure seems to
have shifted classification criteria from Intervals (mainly) to Rhythm and from
Rhythm to Expression.
Comparison (collapsed across the four matrices) of the classifications of musically-trained and untrained participants reveals a highly significant difference between the groups (χ2 = 32.4, df = 3, p < .0001).
Table 8
Experiment 2: Chi-Square analysis comparing the expected and actual frequencies of categorizations based on the three manipulated variables (Interval, Rhythm, and Expression) as compared to all the remaining 32 possible categorizations (Others)

Criterion for      Matrix A          Matrix B          Matrix C          Matrix D
Classification     Actual  Expected  Actual  Expected  Actual  Expected  Actual  Expected
Interval              6      2.5        4      2.5       11      2.5        3      2.5
Rhythm               26      2.5       26      2.5       29      2.5       30      2.5
Expression           46      2.5       54      2.5       36      2.5       48      2.5
Others               10     80.5        4     80.5       12     80.5        7     80.5
Total                88                88                88                88
Chi2 (df = 3)      > 50****          > 50****          > 50****          > 50****

**** p < .00001
Table 9
Experiment 2: Chi-Square analysis comparing the expected (equal proportions) and actual frequencies of categorizations consistent with the three manipulated variables (Interval, Rhythm and Expression)

Criterion for      Matrix A          Matrix B          Matrix C          Matrix D
Classification     Actual  Expected  Actual  Expected  Actual  Expected  Actual  Expected
Interval              6     26          4     28         11     25.3        3     27
Rhythm               26     26         26     28         29     25.3       30     27
Expression           46     26         54     28         36     25.3       48     27
Total                78                84                76                81
Chi2 (df = 2)      27.73****         > 50****          10.87**           > 50****

** p < .01  **** p < .00001
As Table 10 reveals, while
nonmusicians mostly (60.4%) used Expression as the classification criterion, only
about one third of the musicians (29.3%) used this criterion. For musicians, Rhythm
was instead the dominant criterion, applied in 44.6% of their classifications. Though
pitch intervals were not commonly used by either group, a sizeable
minority of musicians’ classifications (15.2%) were based on Intervals, in comparison
to a mere 3.8% among nonmusicians. Notably, similar proportions of musicians and
nonmusicians (10.8% vs. 8.8%, respectively) failed to apply consistent classifications
(“Other”). Thus, while a large majority of classifications in both groups applied a
single criterion consistently, musicians tended to apply primary musical parameters
(Rhythm, and to a lesser extent Intervals), while nonmusicians mostly applied
secondary parameters (Expression).
Table 10
Experiment 2: Classifications of musicians and nonmusicians collapsed across the four matrices

                    MUSICIANS                          NONMUSICIANS
Criterion for       No. of classifications             No. of classifications
Classification      across matrices     Proportion     across matrices     Proportion
Intervals                 14               .152              10               .038
Rhythm                    41               .446              70               .269
Expression                27               .293             157               .604
Others                    10               .108              23               .088
TOTAL                     92                               260
As in Experiment 1, we compared the time spent by participants who applied the
different criteria for Experiment 2, again using Analyses of variance (ANOVA) for each
of the four matrices separately, with classification type (Expression, Intervals, Rhythm,
or Other) as between-participant independent variable (df = 3), and timing as the
dependent variable. As in that experiment, significant effects of classification type on
timing are shown for all four matrices: A (F = 7.00; p < .001), B (F = 2.96; p < .05),
C (F = 5.11; p < .01) and D (F = 6.49; p < .001). As in Experiment 1 (see Table 11),
the time spent by participants who applied Expression as the classification criterion is
the shortest (overall average 115.3 seconds), as compared to timings of participants
Musical parameters and motivic classification
zohar eitan and roni y. granot
who classified by Rhythm (overall average 165.5 seconds), Intervals (the longest duration
by far: overall average 226.4 seconds), or “others” (overall average 160.9 seconds).
Table 11
Experiment 2: Mean and SD of time (in seconds) spent on the classification task as a function of the criterion for classification

Criterion        Matrix A        Matrix B        Matrix C        Matrix D        ALL
                 Mean     SD     Mean     SD     Mean     SD     Mean     SD     Mean     SD
Intervals        251.8   179.9   211.3   109.0   160.9    51.1   436.3   447.2   226.5   187.4
Rhythm           176.5    95.4   149.9    82.9   150.2    95.2   184.3   116.5   165.5    98.8
Expression       114.3    81.5   124.1    98.9    99.6    56.3   118.4    88.3   115.4    84.6
Others           175.5    91.1   202.5   148.4   133.6   111.9   162.9    86.4   160.8   103.3
Mean             149.0   102.2   139.3    98.5   128.6    82.0   155.2   132.8   143.0   105.5
• Task 2 (best example). Data for Task 2 were examined, as in Experiment 1, only
for participants who applied Expression as the criterion for classification (Tables 12
& 13). Two Chi-squared tests with Yates’ continuity correction were conducted
separately for each of the two groups of selected motives in each matrix. One test
(Table 12) examined whether Rhythm affected participants’ choices of “best
example”, that is, whether motives featuring one of the two rhythmic figures in the
matrix were chosen more frequently as “best examples” than motives comprising the
other figure. The other test (Table 13) examined, in a comparable way, whether
Intervals affected best example choices.
For two of the matrices (A, B) significant preferences for specific rhythmic figures
or interval sets as better exemplars are suggested. Significant preferences for one
representative pitch interval set over the other are indicated in matrices A (p < .05)
and B (p < .01). Notably, these preferences, favoring a figure comprising larger pitch
intervals over its stepwise counterpart, are revealed only for Expression type 2 (fff,
staccato, four-octave doubling), and not for its more melodious counterpart (p,
single line, legato; see Figure 6).
In matrix B, two contrasting rhythmic preferences are shown for the two
Expression types (Table 12). The rather pompous and jerky Expression type 2 is
associated with a rhythmic figure presenting relatively complex rhythmic proportions
and incongruencies between metric and melodic accents (p < .01; see Figure 8,
motive 2), while its smoother Expression counterpart is associated with a simpler
rhythmic figure (p < .05; Figure 8, motive 1).
Table 12
Experiment 2: Effects of Rhythm on the selection of a representative motive for each type of Expression

          Expression type 1                           Expression type 2
Matrix    Rhythm 1   Rhythm 2   χ2 (df = 1)     Rhythm 1   Rhythm 2   χ2 (df = 1)
A            27         19        1.39             29         17        3.13
B            29         15        4.45*            17         37        7.41**
C            17         19        0.11             15         21        1.0
D            25         23        0.08             24         24        0

* p < .05  ** p < .01
Table 13
Experiment 2: Effects of pitch Intervals on the selection of a representative motive for each type of Expression

          Expression type 1                               Expression type 2
Matrix    Interval 1   Interval 2   χ2 (df = 1)     Interval 1   Interval 2   χ2 (df = 1)
A             21           25         0.35              16           30         4.26*
B             26           28         0.07              16           36         7.69**
C             22           14         1.78              14           22         1.78
D             26           22         0.33              24           24         0

* p < .05  ** p < .01
Figure 8.
Experiment 2, Matrix B: matching rhythm and secondary parameters in the “best example” task.
Discussion
Main findings: questioning accepted parametric hierarchy
In the current study we examined the supposition that discrete pitch and rhythmic-metric elements and relationships — theorists’ “primary parameters” — serve as
principal criteria for motivic classification, by pitting them against invariants in
“secondary parameters,” the latter combining changes in loudness, register, doubling
and articulation. In contrast to previous work (e.g., Lamont & Dibben, 2001; Eerola
et al., 2001; McAdams et al., 2004; Pollard-Gott, 1983) we used controlled stimuli,
rather than complete musical segments or pieces. This enabled us to begin to
dissociate the relative contribution of the examined musical features.
Our main finding challenges the suppositions of traditional music theory and
analysis regarding the primacy of pitch structure (pitch intervals or IC) in motivic
classification. Rather, the study shows that motivic classification can often be related
to similarities and differences in secondary parameters — general auditory features,
not specifically related to music. The highest percentage of classifications based on
pitch intervals or ICs was 12.5% (Experiment 2, Matrix C), with proportions in
other matrices ranging from 0%! (Experiment 1, Matrix D) to 7% (Experiment 2,
Matrix A). Importantly, this slight use of pitch information in categorization was not
limited to atonal materials: it was also found in the two tonal matrices (A, C) of
Experiment 2, and in Matrix A of Experiment 1, in which the paradigmatic tonal
structure of the major triad was pitted against an atonal motive. This minimal use of
pitch information in classification stands in bold contrast to reliance on the
Expression variable, based on differences in the secondary parameters of dynamics,
articulation and register, which served as the basis for classification for 75%
(Experiment 1, Matrix B) to 41% (Experiment 2, Matrix C) of the participants.
Although a larger proportion of subjects relied on pitch contour, as compared to
interval information, only 6.3% (Experiment 1, Matrix C) to 17.9% (Experiment 1,
Matrix A) based their categorization on this variable. This result is at odds with the
demonstrated importance of contour in melodic processing of short stimuli,
particularly where tonal information is weak (e.g., Dowling, 1978, 1999; Edworthy,
1985) as indeed are most of our stimuli. One possible explanation is that the
Expression (secondary parameters) variable was stronger, capturing immediate
attention. In addition, it is possible that a mirror image of contour (e.g., rise-fall-rise
vs. fall-rise-fall), which was used in the current study, is not really perceived as
contrastive, but rather as related through a simple transformational rule. Indeed, the
above-cited studies on the role of contour in melodic
processing examined the effects of a local contour change rather than the effects of global
transformations such as the one used here.
Although most participants chose not to use Contour as a classification criterion,
their sensitivity to contour information was clearly demonstrated in the second task
(Experiment 1), where preferences for a convex over a concave contour, and for
motives in which contour was concurrent with metrical and agogic accents, were
found (Experiment 1, Matrices B, C, D). These preferences are consistent with
notions suggesting “natural” cognitive priority to convex over concave contours and
to parametric concurrence over non-concurrence (Cohen & Granot, 1995).
Preference for the convex contour is also consistent with the prevalence of this type
of contour within musical phrases in folksongs (Huron, 1996).
Unlike pitch, rhythmic information figured as a substantial criterion for
classification, with 29% (Experiment 2, Matrix A) to 34% (Experiment 2, Matrix D)
of the participants basing their classification consistently on rhythmic variance. This
was especially true of the musically trained, who relied more on Rhythm than on
Expression as a classification criterion (45% vs. 29%). In contrast, among the
nonmusicians the pattern was reversed: Expression was used twice as often as Rhythm
as a classifying criterion (60% vs. 27%). Two points, then, are
worth stressing: the first is the primacy of rhythmic information (specifically,
agogic and metric accents) as a classifying criterion, as compared to pitch (specifically,
pitch Intervals or ICs). The second is that the effect of musical training on the use
of rhythm in classification is considerably stronger than the comparable effect
concerning pitch.
An issue not examined in the present experiments is the mutual effect of pitch
contour and rhythm. In this study, rhythmic information was never pitted against
pitch contour information. Rather, contour was held constant in Experiment 2
(while rhythmic figures, as well as pitch intervals and “Expression”, varied). In
contrast, in Experiment 1 rhythm was held constant, while contour, along with ICs
and “Expression,” was varied. Further studies are required to tease out two alternative
possibilities: First, rhythm is perceptually more salient and relevant than pitch
intervals and pitch contour in motivic classification — at least in short, context-independent stimuli. This possibility is consistent with our experience with powerful
rhythmic motives, such as the opening motive of Beethoven’s Fifth Symphony.
Alternatively, it may be the specific conjunction with pitch contour that gave
rhythm its relative salience.
An important counterpart to the classification data itself was provided by
comparisons of the time spent in performing the different types of classifications.
Classification was fastest by far, in both experiments, for those who used Expression
(secondary parameters) as their criterion, and slowest — about twice as slow as
Expression — for those who used ICs (Experiment 1) or Intervals (Experiment 2).
Performance times for classifications based on Contour (Experiment 1) and Rhythm
(Experiment 2) were intermediate. Thus, the more often a criterion of classification
was used, the shorter its performance time tended to be.
In sum, our results seem to problematize theorists’ notions of the hierarchy of
musical dimensions shaping motivic identity: pitch intervals or IC, dimensions
supposedly central in determining motivic identity, were marginalized, and secondary
parameters, assumed to be marginal, were centralized. Correspondingly, the processing
of Expression-based categorization was fastest, while that of pitch-based categorization
was slowest.
Holistic vs. analytic processing
One perspective which may shed additional light on the results is that of holistically-based, as compared to analytically-based, information processing. Analytic processing
is often described as processing stimuli in terms of their constituent properties, while
holistic processing regards them as integral wholes (Garner, 1974). In terms of
classification, analytic processing is revealed when two stimuli are grouped together
on the basis of the structure of a specific dimension, whereas holistic processing is
implied in groupings based on overall similarity.
Several models of perception have proposed a two-stage process of stimulus
comparison: a fast, pre-attentive “comparator” which operates on wholes, and a
second, later comparator, which checks feature by feature (Krueger, 1973). Analytic
processing and focused attention may be useful in processing categories defined by a
criterial attribute. In contrast, family resemblance-based categories call for a broader
attention, which allows for parallel processing of the various features (Kemler Nelson,
1984). This distinction maps well onto the difference between the Interval and IC
criteria for classification (“criterial attributes”) and the compound of
Expressive features. Smith and Shapiro (1989), following Kemler Nelson, showed
that holistic processing is not limited to young children, but can serve in adults as a
“fallback” under conditions which hamper the more cognitively demanding analytical
processing, as when a categorization task is performed concurrently with another,
different task.
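Read computationally, this two-stage account might be caricatured as follows (a minimal illustrative sketch in Python, our own; the thresholds and feature coding are hypothetical and are not parameters specified by Krueger, 1973, or Kemler Nelson, 1984):

    def holistic_distance(a, b):
        # Stage 1: fast, pre-attentive comparison of stimuli as integral wholes,
        # caricatured as an overall (city-block) distance across all features.
        return sum(abs(x - y) for x, y in zip(a, b))

    def analytic_match(a, b, criterial):
        # Stage 2: slower, attention-demanding check of one criterial feature.
        return a[criterial] == b[criterial]

    def same_category(a, b, criterial=0, threshold=1.0):
        """Two-stage comparator: decide holistically when the overall distance
        is clear-cut; fall back on feature-by-feature analysis otherwise."""
        d = holistic_distance(a, b)
        if d <= threshold:          # clearly similar overall: same group
            return True
        if d >= 3 * threshold:      # clearly dissimilar overall: different groups
            return False
        return analytic_match(a, b, criterial)

    # Example: three-feature stimuli; holistically near, so stage 2 never runs.
    print(same_category((0.0, 1.0, 0.2), (0.1, 0.9, 0.3)))  # -> True

On this reading, a compound of co-varying secondary parameters inflates the holistic distance and settles classification at the first stage, whereas a single-feature (criterial) contrast leaves the comparison borderline and forces the slower second stage, consistent with the response-time pattern reported above.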
While most studies examining the analytical-holistic distinction were carried out
with visual stimuli, Peretz, Morais and Bertelson (1987) demonstrated that holistic
processing of melodies is probably the default and more automatic way of processing
(at least for nonmusicians), as measured by ear asymmetry in a dichotic listening test.
Interestingly, in their study an instruction to focus on specific pitches in order to
detect changes in the melody did shift the advantage to the right ear (left hemisphere,
presumably more analytical processing), but caused deterioration rather than
amelioration in performance, while other manipulations had no such effect. Their
interpretation was that this instruction interfered with the spontaneous orientation
towards more global processing — at least in nonmusicians.
Together, these studies highlight the importance of holistic processing, especially
in dealing with complex stimuli and family resemblance relationships, typical of
music. They point toward a cognitively-based theory which could not only explain
why the Expression variable we presented was so powerful in terms of classification,
but may also help to define the special conditions under which analytic processing
could emerge as an alternative.
Preferences or incapacities?
Given our results, an inevitable question arises: did participants who chose
Expression-based classifications prefer this dimension over other perceivable alternatives,
or were they simply unable to process other types of classifications suggested by the
matrices? Results of Task 2, in which participants were required to choose an
exemplar representing each class they created, may provide an indirect answer. In
their exemplar choices, participants whose classifications were guided by Expression
significantly preferred one set of ICs over the other (Experiment 1, Matrices A, C), and
one interval configuration over the other (Experiment 2, Matrices A, B). These
preferences suggest that participants were often able to discriminate the opposing
pitch configurations presented to them, yet they overwhelmingly chose to base their
categorizations on other, simpler dimensions. Moreover, participants often associated
the pitch structures they discriminated with specific expressive characteristics. Thus,
for instance, preferences for disjunct over conjunct intervals (Experiment 2, Task 2,
Matrices A & B; see Table 13) were limited to the more forceful, less melodic
Expression type (ffff, staccato, dense texture). Listeners thus related — and perhaps
subordinated — the primary parameter of pitch to secondary parameters such as
dynamics and density, which served as their main basis for classification.
Consequently, though pitch structures were processed and discriminated, the
relevant information in this case may not be exact interval size or scale degrees, but
rather the much more general distinction (not specific to music) between conjunct
and disjunct motion.
Such a view of the results may account for a seeming incongruity between the present
study and previous studies, which indicated that listeners, regardless of formal music
training, are able to process complex pitch and rhythmic hierarchies (see Bigand &
Poulin-Charronnat, 2006, for a recent survey). For instance, musically trained and
untrained listeners correctly identified variations derived from the same underlying
tonal pitch structure, ignoring differences in surface features like pitch contour or
rhythmic texture (Bigand, 1990). Even in post-tonal contexts, listeners were able to
discriminate stimuli employing a given pitch structure (a twelve-tone row), to which
they were previously exposed, from foils which, though similar to the target melody
in “surface” features, did not feature the same structure (Bigand & Poulin-Charronnat,
submitted, discussed in Bigand & Poulin-Charronnat, 2006, pp. 215-17).
Such findings seem to counter those of the present study, since they show that
listeners use pitch structure — even high-level structures abstracted from the musical
surface — in discrimination and classification tasks, while ignoring misleading
surface cues. Yet, in most of the above studies participants were implicitly trained to
discriminate underlying pitch structure amid irrelevant “surface” features. For
instance, in Bigand (1990) participants were exposed, prior to their discrimination
test, to several stimuli featuring the same underlying pitch structure amid contrasting
melodic and rhythmic surfaces. Thus, a bias toward the use of similar structures in
the test that followed may have been generated. In the present study no such bias was
created; consequently, most participants chose to use contrasts in dynamics,
registration and articulation as a classification criterion, ignoring pitch intervals and
ICs. Hence, it seems that while many listeners (including listeners with little formal
training in music) may be able to discriminate subtle pitch structures, when given an
unbiased option they choose not to use such structures in categorizing musical
stimuli, and instead rely on rough auditory similarities and contrasts, not musically
specific, like loud-soft and dense-sparse. This would also be consistent with the
ecological view, according to which the relevant information is that which best
associates a sound with its source. The conjunction of co-occurring
variables such as dynamics, texture, articulation and register would point very strongly
to their virtual source, as opposed to changes in pitch, which supply very little
ecological information.
Classification of tonal and atonal stimuli
Experiments 1 and 2 both involved, in different ways, tonal as well as atonal stimuli.
In Experiment 1, two matrices (A, C) contrasted a pitch configuration common in
tonal music (a major triad and a diminished triad, respectively) with an atonal one.
In Experiment 2, two of the matrices (A, C) presented tonal stimuli, while the other
two presented atonal variants of these stimuli.
As expected, criteria for classification of tonal and atonal stimuli somewhat
differed, as more participants used pitch relationships (Intervals or ICs) in classifying
the tonal, as compared to the atonal stimuli. Yet surprisingly, these differences were
small, as pitch Intervals or ICs were rarely used as classification criteria for tonal
and atonal stimuli alike. Indeed, even when pitch configurations involved a major triad
on the one hand and a blatantly atonal configuration on the other (Experiment 1,
Matrix A), only 6 of the 94 participants used this distinction as their classification
criterion, while most others used “Expression” (secondary parameters).
Given the vast empirical literature attesting to the importance of tonal pitch
information in melodic processing (see Krumhansl, 2004, for a review), these results
are intriguing. To some degree, the results may stem from the lack of a secure tonal context
in the experiments. When tonal motives were involved, tonal information was
contained only in these relatively brief stimuli, as no prior tonal context was given.
Given sufficient tonal priming, Interval and IC information might have been more
often used as classification criteria for the tonal stimuli (a hypothesis that could be
examined in follow-up research). Nevertheless, the fact that “Expression,” rather
than Intervals or ICs, played the major role in classifying even the most important
tonal configuration, a major triad, is noteworthy, and suggests that secondary
parameters may play a primary role in the processing of motivic structure in music,
tonal and atonal alike.
The effect of musicianship
Music training affected listeners’ choices of categorization criteria. It made only a
modest difference, though, with regard to pitch-driven categorization, which ranked
low for both groups. The percentages of musicians who utilized pitch Contour
(16%, Experiment 1), ICs (6%, Experiment 1) or Intervals (15.2%, Experiment 2)
are higher than those of nonmusicians (10%, 1%, and 3.8%, respectively), but still
strikingly small given the primacy accorded to pitch relationships in music
theory and analysis, and consequently in musicians’ professional education. What
musicianship did affect strongly was rhythm, rather than pitch: as mentioned, in
Experiment 2 (where it served as a variable) Rhythm became the most common
classification criterion for musicians (44.6%), surpassing Expression; not so for
nonmusicians, where most participants classified by Expression, as in Experiment 1.
In sum, nonmusicians — who presumably represent a large majority of the general
population — overwhelmingly preferred categorization by Expression (loudness,
articulation, doubling, register) over any other criterion, including varied aspects of
pitch (Intervals, ICs, Contour) and rhythm. For musicians, in contrast, Rhythm was
as important as, or more important than, secondary parameters (Expression). Yet, for both groups
(though more pronouncedly for nonmusicians) pitch-based criteria were least
influential.
These different training effects — a strong effect concerning rhythm, as opposed
to a modest one concerning pitch — may reflect a greater importance of rhythmic
skills in actual music practice. Performing musicians need to develop intricate timing
skills in order to perform notated music correctly and interpret it musically. Sensitive
musical interpretation requires a consistent application of precise microtiming (e.g.,
Honing & de Haas, 2008) and strict control of the performance of diverse accents.
Such subtle expertise, however, is not always needed with regard to pitch, particularly
on instruments like the piano (and other keyboards) or the guitar, where pitch is fixed.
While musicians’ increased sensitivity to temporal information has been
demonstrated for a variety of tasks (e.g., Gaab et al., 2005; Jones & Yee, 1993;
Rammsayer & Altenmüller, 2006), these tasks mostly involve immediate perceptual
processing of temporal information. Notably, in some tasks requiring processing
memory (as does the present task), no differences associated with musical training were
found (Rammsayer & Altenmüller, 2006). Yet, musicians have been shown to
privilege temporal information in a variety of cognitive tasks: in a similarity rating
task (Eitan & Granot, 2007), in a task involving spatio-kinetic associations (Eitan &
Granot, 2006), and in a tension rating task (Granot & Eitan, submitted). Hence,
though (as Bigand & Poulin-Charronnat, 2006, suggest) nonmusicians may share
sophisticated music cognition capacities with trained musicians, music training may
affect the way these capacities are prioritized. Musicians and nonmusicians may
possess similar music processing tools; however, they may also (as seems to be the
case in our Experiment 2) choose to process some musical structures differently,
privileging some perceivable relationships (particularly rhythmic) while downgrading
others.
Preference for “natural” criteria?
The privileged role of secondary parameters in categorization, nonmusicians’
categorization in particular, may be related to the vital functions such parameters
possess in extra-musical, natural circumstances, as opposed to dimensions like pitch
intervals or proportional rhythm, which hardly figure outside music at all. Loudness,
overall pitch register, and sound envelope (roughly equivalent to articulation in the
present stimuli) all supply environmental cues, pertinent to survival. Loudness is
related to distance (Blauert, 1997), both loudness and register imply size (Marks,
1978), and sound envelope may suggest the identity or action of the sound source.
Even the intentions of an animate sound source may be implied by loudness and
register, since a loud, low voice signifies aggression and dominance in human and nonhuman species alike (Morton, 1994; Puts, Gaulin, & Verdolini, 2006). In a recent
study, Granot & Eitan (submitted) demonstrated that listeners’ perception of
musical tension is affected by such parameters in a manner consistent with their
extra-musical ecological implications. Here we show that these parameters, secondary
perhaps in music theory but certainly not so in the experiential environment, are also
the main determinants of the categorization of melodic motives, a key operation in
the cognitive processing of music. As noted, professional music training, which
stresses the central role of pitch structures in music syntax, did not result in
upgrading these primary, syntactic parameters over the secondary, natural ones as
criteria for classification, though it expectedly increased the proportion of participants
who relied on pitch and rhythm in their classifications.
While our findings may seem at odds with most motivic-thematic theory, they
are very much in line with ecological approaches to music cognition (Lombardo,
1987; McCabe & Balzano, 1986; Reybrouck, 2005), which suggest a view of music
as representing virtual sources in the sonic world. Coping in the sonic world is a
dynamic process of tracking streams of continuous information (as found in most
secondary parameters), though these are often reduced through mediated cognitive
schemes into discrete, labeled events. Indeed, in an ecological framework, dynamics,
register, and articulation offer much more information about the possible source of
sound as compared to frequency, especially if they are amalgamated into a unitary
percept, as in our experiment (Gaver, 1993a, 1993b). It seems, then, that our
participants chose to cope with their classification tasks through parameters and
strategies related to their general experience with auditory objects, rather than by
music-specific processing of music-specific parameters.
One possible interpretation of these results would suggest that the lack of wider
musical contexts (including a strongly-established tonal context) in the present
experiments focused attention on the immediate experiential level of processing,
carried by parameters like dynamics and register, as the relevant information (Jones
& Hahn, 1986). If this interpretation holds, under different experimental tasks (in
particular in actual music-listening contexts) participants may find it more useful to
focus on discrete, music-specific pitch or time relationships. Yet, experiments
involving categorization and similarity perception of musical motives in actual
music-listening conditions, whose results complement ours (see Musical parameters
in similarity and classification tasks, pp. 144-45 above), cast doubt on such a cautionary
interpretation, suggesting instead that listeners’ favoring of strategies and parameters
central to general auditory experience over music-specific ones may also hold for
“real” music and actual music-listening contexts.
Ecological validity and relevance to music listening and analysis
It is evident that our explicit tasks may recruit processes somewhat different from the
implicit generation of motivic prototypes or “cues” (Deliège, 1996; 2001a; 2001b)
assumed to underlie “natural” music processing. First, our task is an explicit
classification task, while actual listening relies heavily on implicit cognitive processes,
including implicit categorization of musical events (Bigand & Poulin-Charronnat,
2006). Second, our tasks do not require extensive long-term memory resources, as
actual music often does, since participants could hear and re-hear motives in any
order, at will.
Note, however, that these differences between our task and actual listening
could have been expected to strengthen, rather than weaken, the role of pitch-related
parameters in classification. The explicit classification task could be expected to lead
to a stronger reliance on the primary parameters, at least for musicians, trained to
view these parameters as defining similarity relationships. The fact that even for
musicians pitch intervals or ICs were only rarely used as the main classifying criterion
thus strengthens our questioning of the supposed primacy of discrete pitch information
in motivic categorization. Furthermore, the fact that no long-term memory requirements
were imposed in the present experiments should have actually assisted participants
in comparing discrete pitch and rhythm information across motives.
An important difference between the present experiments and “ecological” music
listening is the lack of a larger musical context for the motives presented. In actual
compositions, musical context — e.g., the specific order of motives, or their
groupings into higher-order units — may affect perceived similarity and categorization;
in contrast, our task taps the preferred way of categorization when motives are
presented detached from such a broader context. Musical motives and their transformations
are embedded in a segmental hierarchy, comprising parts of larger units (musical
phrases, sentences and sections), in which they may fulfill specific syntactic functions.
When melodic motives are short and de-contextualized, as is the case here, pitch
interval information may indeed become less important than in actual musical
contexts (perhaps because musical “syntax” is limited to these short, low-level units),
and non-syntactic parameters like pitch contour are more readily processed (Cuddy,
Cohen & Mewhort, 1981; Dowling & Bartlett, 1981; Edworthy, 1985). Here we
show that contrasts in secondary parameters like loudness and register may prevail
over both pitch interval and pitch contour information when such short figures,
devoid of musical context, are processed.
How relevant, then, are our findings to actual music, where such figures are
constituents of larger structures? This problem permeates, in fact, any controlled
musical experiment, and can only be addressed by complementary studies which use
actual musical compositions and more implicit tasks. Yet such studies, as mentioned
in the introduction (pp. 8-9), have indeed been performed, and mostly concur with
our results in showing that listeners’ similarity ratings and motivic or thematic
classifications are often inconsistent with pitch-based analysis, emphasizing instead
similarities and differences in secondary parameters (e.g., Lamont & Dibben, 2001;
McAdams et al., 2004; Ziv & Eitan, 2007). The present study, pitting primary and
secondary parameters against each other in a controlled design, suggests that such
earlier findings may be based upon strong preference for the “natural” psychoacoustic
dimensions of musical motives, like loudness or overall pitch register, over music-specific pitch-based parameters, or even (for nonmusicians) rhythmic structure.
This finding suggests two implications for music analysis (insofar as it cares about
cognition), one negative and cautionary, the other positive and exploratory.
Zbikowski (2002), in an important attempt to bridge cognitive science and music
theory, suggests that motives are the “basic level” of musical categorization (Rosch,
1978), itself the most important basis of musical understanding. He thus follows and
updates seminal music theorists like Arnold Schoenberg (e.g., 1967, 1978), for whom
musical motives (by which he primarily means pitch-interval configurations) are the
key to musical coherence and comprehension. While the present results certainly
cannot invalidate established views such as Schoenberg’s or Zbikowski’s, they do
suggest some caution in concluding that pitch-based motives are the very foundation
of musical understanding, insofar as it is related to the cognitive processes of actual
musical listeners. As suggested above, caution is needed not so much because
listeners cannot process such relationships (though this may also be true in many
cases), but because they often do not care to, and instead focus their attention on other
types of “motives” and “themes”, based on similarities and differences in secondary
parameters.
Since composers are, among other roles, their own first listeners, it makes sense
to examine whether and how such relationships, rarely explored in music analysis,
are shaped in actual musical compositions (including the staples of Classic-Romantic
music) and how they affect musical structure and expression. Thus, heeding listeners’
tendencies to prefer secondary parameters in categorizing motives may not only serve
a cautionary role, but also suggest to listener-oriented music analysis new, rarely explored
paths.9
Acknowledgments
We thank Gera Ginzburg, Tal Galili, and Noa Ravid-Arazi for their valuable help in
conducting these experiments, as well as Petri Toiviainen and three anonymous referees
for Musicæ Scientiæ for helpful suggestions. Research was supported by Israel
Science Foundation Grant no. 888/02-27.0 to the first author. Findings reported here
were first presented at the 10th International Conference on Music Perception and
Cognition (ICMPC 10), Sapporo, Japan, August 2008.
Address for correspondence:
Dr. Zohar Eitan,
School of Music
Tel Aviv University
Tel Aviv 69978, Israel
e-mail: zeitan@post.tau.ac.il
zohar44@yahoo.com
(9) For musical analyses emphasizing the role of secondary parameters in motivic-thematic
structure, see Eitan, 2003.
• References
Balzano, G. J. (1986). Music perception as detection of pitch-time constraints. In V. McCabe &
G. J. Balzano (eds) Event cognition: An ecological perspective (pp. 217-34). London:
Lawrence Erlbaum.
Bartlett, J. (1984). Cognition of complex events: visual scenes and music. In W. R. Crozier, &
A. J. Chapman (eds) Cognitive processes in the perception of art (pp. 222-51). Amsterdam:
North Holland.
Bigand, E. (1990). Abstraction of two forms of underlying structure in a tonal melody. Psychology
of Music, 18, 45-60.
Bigand, E., D’Adamo, D.A., & Poulin-Charronnat, B. (submitted). The implicit learning of
twelve-tone music.
Bigand, E., & Poulin-Charronnat, B. (2006). Are we “experienced listeners”? A review of the
musical capacities that do not depend on formal musical training. Cognition, 100, 100-130.
Blauert, J. (1997). Spatial hearing: the psychophysics of human sound localization. Cambridge, MA:
MIT Press.
Boss, J. F. (1999). Schenkerian-Schoenbergian analysis and hidden repetition in the opening
movement of Beethoven’s Piano Sonata Op. 10, No. 1. Music Theory Online, 5. http://
societymusictheory.org/mto/issues/mto.99.5.1/mto.99.5.1.boss.html
Bruner, C. L. (1984). The perception of contemporary pitch structures. Music Perception, 2, 25-39.
Burkhart, C. (1978). Schenker’s “motivic parallelism”. Journal of Music Theory, 22, 145-75.
Cohn, R. (1992). The autonomy of motives in Schenkerian accounts of tonal music. Music Theory
Spectrum, 14, 150-70.
Cruttenden, A. (1997). Intonation (2nd edition). New York: Cambridge University Press.
Cuddy, L. L., Cohen, A. J., & Mewhort, D. J. K. (1981). Perception of structure in short melodic
sequences. Journal of Experimental Psychology: Human Perception and Performance, 7,
869-83.
Deliège, I. (1987). Grouping conditions in listening to music: An approach to Lerdahl and
Jackendoff’s grouping preference rules. Music Perception, 4, 325-60.
Deliège, I. (1996). Cue abstraction as a component of categorization processes in music listening.
Psychology of Music, 24/2, 131-56.
Deliège, I. (2001a). Prototype effects in music listening: an empirical approach to the notion of
imprint. Music Perception, 18, 371-407.
Deliège, I. (2001b). Similarity perception, categorization, cue abstraction. Music Perception, 18,
233-43.
Demorest, S. M., & Serlin, R. C. (1997). The integration of pitch and rhythm in musical judgment:
Testing age-related trends in novice listeners. Journal of Research in Music Education, 45,
67-79.
Dibben, N., & Clarke, E. F. (1997). The perception of associative structure in atonal music.
In Alf Gabrielsson (ed), Third annual ESCOM conference: proceedings (pp. 594-98).
Uppsala.
Dowling, W. J. (1978). Scale and contour: Two components of a theory of memory for melodies.
Psychological Review, 85, 341-54.
Dowling, W. J. (1999). The development of music perception and cognition. In Deutsch, D. (ed),
The psychology of music, second edition (pp. 603-25). San Diego: Academic Press.
Dowling, W. J., & Bartlett, J. C. (1981). The importance of interval information in long-term
memory for melodies. Psychomusicology, 1, 30-49.
Drabkin, W. (2001a). Motif. In Sadie, S. (ed), The new Grove dictionary of music and musicians,
second edition online. http://www.grovemusic.com
Drabkin, W. (2001b). Theme. In Sadie, S. (ed), The new Grove dictionary of music and musicians,
second edition online. http://www.grovemusic.com
Dunsby, J. (2002). Thematic and motivic analysis. In Christensen, T. (ed), The Cambridge History
of Western Music Theory, pp. 907-26. Cambridge, UK: Cambridge University Press.
Edworthy, J. (1985). Melodic contour and musical structure. In P. Howell, I. Cross, & R. West
(eds), Musical Structure and Cognition (pp. 169-87). London: Academic Press.
Eitan, Z. (1997). Highpoints: A study of melodic peaks. Philadelphia: University of Pennsylvania Press.
Eitan, Z. (2003). Thematic gestures: Theoretical preliminaries and an analysis. Orbis Musicæ, 13.
Eitan, Z. & Granot, R. Y. (2006). How music moves: Musical parameters and listener’s images of
motion. Music Perception, 23, 221-48.
Eitan, Z., & Granot, R. (2007). Intensity changes and perceived similarity: Inter-parametric
analogies. Musicæ Scientiæ, Discussion forum 4a, 39-75.
Epstein, D. (1979). Beyond Orpheus: studies in musical structure. Cambridge, MA: MIT Press.
Eerola, T., Järvinen, P., Louhivuori, J., & Toiviainen, P. (2001). Statistical features and perceived
similarity of folk melodies. Music Perception, 18, 275-96.
Frisch, W. (1987). Brahms and the principle of developing variation. Berkeley: University of
California Press.
Gaab, N., Tallal, P., Kim, H., Lakshminarayanan, K., Archie, J., Glover, G. H., & Gabrieli, J. D. E.
(2005). Neural correlates of rapid spectrotemporal processing in musicians and
nonmusicians. Annals of the New York Academy of Sciences, 1060, 82-8.
Garner, W. R. (1974). The processing of information and structure. Potomac, MD: Erlbaum.
Gaver, W. W. (1993a). What in the world do we hear? An ecological approach to auditory event
perception. Ecological Psychology, 5, 1-29.
Gaver, W. W. (1993b). How do we hear in the world?: Explorations in ecological acoustics.
Ecological Psychology, 5, 285-313.
Gibson, D. B., Jr. (1986). The aural perception of nontraditional chords in selected theoretical
relationships: A computer-generated experiment. Journal of Research in Music Education,
34, 5-23.
Gibson, D. B., Jr. (1993). The effects of pitch and pitch-class content on the aural perception of
dissimilarity in complementary hexachords. Psychomusicology, 12, 58-72.
Granot, R. Y., & Eitan, Z. (submitted). Musical tension and the interaction of dynamic auditory
parameters.
Hamilton, K. (1996). Liszt: Sonata in B Minor. Cambridge, UK: Cambridge University Press.
Hinton, L., Nichols, J., & Ohala, J. J. (1994). Sound symbolism. Cambridge, UK: Cambridge
University Press.
Honing, H., & de Haas, W. B. (2008). Swing once more: Relating timing and tempo in expert
jazz drumming. Music Perception, 25, 471-77.
Hopkins, R. G. (1990). Closure and Mahler’s music: The role of secondary parameters. Philadelphia:
The University of Pennsylvania Press.
Hudgens, H. A. (1997). Perceived categories in atonal music: Webern’s psychology of convention.
Ph.D. Dissertation, Northwestern University.
Huron, D. (1996). The melodic arch in Western folksongs. Computing in Musicology, 10, 3-23.
Jones, M. R., & Hahn, J. (1986). Invariants in sound. In V. McCabe, & G. J. Balzano (eds), Event
cognition: An ecological perspective (pp. 197-215). London: Lawrence Erlbaum Associates.
Jones, M. R., & Yee, W. (1993). Attending to auditory events: The role of temporal organization.
In S. McAdams & E. Bigand (eds), Thinking in sound: The cognitive psychology of human audition. Oxford:
Oxford University Press.
Kemler Nelson, D. G. (1984). The effect of intention on what concepts are acquired. Journal of
Verbal Learning and Verbal Behavior, 23, 734-59.
Krueger, L. E. (1973). Effect of irrelevant surrounding material on speed of same-different judgment
of two adjacent letters. Journal of Experimental Psychology, 98, 252-59.
Krumhansl, C. L. (2004). The cognition of tonality — as we know it today. Journal of New Music
Research, 33, 253-68.
Lamont, A., & Dibben, N. (2001). Motivic structure and the perception of similarity. Music
Perception, 18/3, 245-74.
Lerdahl, F., & Jackendoff, R. (1983). A generative theory of tonal music. Cambridge, MA: MIT
Press.
Lombardo, T. J. (1987). The reciprocity of perceiver and environment: The evolution of James
J. Gibson’s ecological psychology. Hillsdale, N.J.: Lawrence Erlbaum.
Marks, L. E. (1978). The unity of the senses. New York: Academic Press.
McAdams, S., Vieillard, S., Houix, O., & Reynolds, R. (2004). Perception of musical similarity
among contemporary thematic materials in two instrumentations. Music Perception, 22,
207-38.
McCabe, V., & Balzano, G. J. (eds) (1986). Event cognition: An ecological perspective. London: Lawrence
Erlbaum.
Meyer, L. B. (1973). Explaining music. Berkeley: University of California Press.
Meyer, L. B. (1985). Music and ideology in the nineteenth century. In S. M. McMurrin (ed), The
Tanner lectures on human values, Vol. 6 (pp. 23-52). Salt Lake City: University of Utah
Press.
Meyer, L. B. (1989). Style and music: Theory, history, and ideology. Philadelphia: University of
Pennsylvania Press.
Morton, E. S. (1994). Sound symbolism and its role in non-human vertebrate communication.
In L. Hinton, J. Nichols, & J. J. Ohala (eds), Sound symbolism (pp. 348-65). Cambridge, UK: Cambridge University Press.
Peretz, I., Morais, J., & Bertelson, P. (1987). Shifting ear differences in melody recognition
through strategy inducement. Brain and Cognition, 6, 202-15.
Pollard-Gott, L. (1983). Emergence of thematic concepts in repeated listening to music. Cognitive
Psychology, 15, 66-94.
Puts, D. A., Gaulin, S. J. C., & Verdolini, K. (2006). Dominance and the evolution of sexual dimorphism
in human voice pitch. Evolution and Human Behavior, 27, 283-96.
Rammsayer, T., & Altenmüller, E. (2006). Temporal information processing in musicians and nonmusicians.
Music Perception, 24, 37-48.
Reybrouck, M. (2005). A biosemiotic and ecological approach to music cognition: Event perception
between auditory listening and cognitive economy. Axiomathes: An International Journal
in Ontology and Cognitive Systems, 15, 229-66.
Rosch, E. (1978). Principles of categorization. In E. Rosch & B. B. Lloyd (eds), Cognition and
Categorization (pp. 27-48). London: Erlbaum.
Schoenberg, A. (1934-36/1995). The musical idea and the logic, technique, and art of its presentation.
(P. Carpenter, S. Neff, eds and Trans). New York: Columbia University Press.
Schoenberg, A. (1967). Fundamentals of musical composition. (G. Strang & L. Stein, eds). London:
Faber.
Schoenberg, A. (1978). Style and idea: selected writings of Arnold Schoenberg. L. Stein (ed). New
York: St. Martin’s Press.
Smith, J. D., & Shapiro, H. (1989). The occurrence of holistic categorization. Journal of Memory
and Language, 28, 386-99.
Straus, J. N. (1990). Introduction to post-tonal theory. Englewood Cliffs, N.J.: Prentice-Hall.
Van den Toorn, P. (1996). What’s in a motive? Schoenberg and Schenker reconsidered. Journal of
Musicology, 14, 370-99.
Walker, A. (2005). Reflections on Liszt. Ithaca: Cornell University Press.
Zbikowski, L. (1999). Musical coherence, motive, and categorization. Music Perception, 17, 5-42.
Zbikowski, L. (2002). Conceptualizing music: Cognitive structure, theory, and analysis. New York:
Oxford University Press.
Ziv, N., & Eitan, Z. (2007). Perceiving thematic structure: Similarity judgments versus categorization
tasks. Musicæ Scientiæ, Discussion forum 4a, 99-133.