Keith Johnson Colloquium

GLOSS and the Department of Linguistics Colloquium
April 17, 3:00-5:00, 245 Straub
Keith Johnson
University of California Berkeley
Adventures in phonetic neuroscience
In studying linguistic knowledge and the cognitive processing that uses this knowledge, linguists
and psycholinguists have sought ways to identify what is cognitively "real" underlying the
patterns found in language and linguistic behavior. We are generally faced with the problem of
being on the outside looking in. Each method of acquiring data from people as they speak and
listen (elicitation of forms, recording of corpora, recording behavioral responses in experiments)
contributes to a more sophisticated understanding of linguistic knowledge and processing.
In this talk I will present some results from recent investigations in phonetic neuroscience. The
data come from recordings from a dense grid of electrodes placed directly on the surface of the
brain in patients who were undergoing surgery for epilepsy. These neural imaging data have fine
resolution in both time and frequency and have low enough noise that relatively few trials are
needed to find rich phonetic information. In the first of the three studies I will describe,
we found that during speaking the motor cortex shows patterns of activity that group sounds by
articulator - labials are similar to other labials, dorsals are similar to other dorsals, etc. (Bouchard
et al., 2013, Nature 495, 327-332). In the second study, we found that during listening the
superior temporal gyrus shows patterns of activity that group sounds by manner of articulation -
stops are similar to stops, fricatives to fricatives, etc. (Mesgarani et al., 2014, Science 343,
1006-1010). The third study examines a pattern of activity in the motor areas of the cortex that
appears during listening. This pattern has been noted by prior researchers who have speculated
that motor activity during speech perception suggests the activity of mirror neurons and perhaps
that the motor theory of speech perception is supported. Our data complicate this interpretation:
we are able to decode the phonetic information in the motor area during perception, and
what we find is surprising. The pattern of activity is like the pattern found by Mesgarani et al. in
STG - sounds are grouped with each other by manner of articulation, not by articulator. I'll
discuss the implications of this particular finding, and the broader implications of neuroscience
for linguistics.