Preliminary Program - Swiss Center for Affective Sciences

Wednesday 23rd
Workshop and presentations of doctoral and post-doctoral members of the Swiss Center for
Affective Sciences and the Neuroscience of Emotion and Affective Dynamics lab (NEAD)
9:30-10:00 am: Leonardo Ceravolo
Modulation of auditory spatial attention through emotion: an fMRI auditory dot-probe study
Emotional stimuli can modulate attentional orienting through signals sent by subcortical nuclei (e.g. the amygdala), which modulate
visual perception at early stages of processing. Very few studies, however, have investigated the influence of emotional stimuli on
attentional orienting in the auditory domain. We therefore used an auditory dot-probe paradigm involving simultaneously presented
neutral and angry non-speech utterances (voices), each lateralized to one side of auditory space. The emotional utterances were
immediately followed by short, lateralized single sine-wave tones presented on the side of the preceding angry voice (valid trial) or on
the opposite side (invalid trial). We hypothesized that angry voices, compared to neutral ones, would attract attention and facilitate
processing of targets presented on the same side. Behavioral and functional results show an involvement of the right STG in the
decoding of angry emotional vocalizations, even when they are irrelevant to the participant’s task.
10:15-10:45: Andy Christen
Temporal, spectral and phase-locked dynamics of amygdalae and orbitofrontal cortex responses to emotion and spatial
attention
Detection of potentially relevant events in the environment may occur at different levels of processing, including processes
engaged relatively independently of attention and others that depend on voluntary attention and task demands. This study aimed
to shed light on the reactivity of the human amygdalae (AMG) and medial orbitofrontal cortex (OFC) during emotional prosody
processing and its potential modulation by attentional processes. Using intracranial local field potentials recorded in two humans, we
will show findings characterizing the temporal and spectral modulations of these regions by emotion and spatial attention. Furthermore,
we will present long-range cross-frequency coupling data suggesting that the anatomical connectivity between the amygdala and the
medial parts of the orbitofrontal cortex contributes to the functional integration of attended and unattended processing of emotional
prosody.
10:45-11:00: Coffee break
11:00-11:30: Valérie Milesi
How the brain integrates multimodal emotional information
Our main focus is on understanding how visual information, namely a face expressing anger or bearing a neutral
expression, modulates the cortical representation of auditory information, such as an “ah” sound pronounced in either an angry or a
neutral tone. We created our stimuli with FACSGen, using sounds from the GEMEP database. Our first fMRI study used congruent
and incongruent stimuli (sharing or not sharing the same expression) in two different tasks: an implicit task, where participants had
to discriminate the gender of the voice, and an explicit one, where they had to indicate whether the voice was angry or neutral. We
expect emotional, as opposed to neutral, information to produce stronger cortical activation and, concerning the task effect, we expect
weaker modulation effects at lower processing stages (primary cortices, amygdala) than at higher ones (pSTS, IFC, OFC), as has
already been shown in previous emotional prosody studies. In line with Kayser et al. (2010), we predict a linear increase in the impact
of visual information along the auditory processing pathway, modulated by the emotional content of the visual and auditory
information.
11:45-12:15: Sezen Cekic
Amygdala and orbitofrontal causal relationships in local field potentials in humans: an investigation through the concept of
“Granger causality”
Several functional magnetic resonance imaging (fMRI) studies have shown, among other findings, the involvement of the
amygdala and orbitofrontal regions in the decoding of emotional prosody. To what extent are these activity modulations related?
Anatomical evidence in monkeys and in studies using diffusion tensor imaging favors a possible functional coupling between
these two brain regions. However, the causal relationships between the neuronal activities of these two regions and their modulation
by emotion have never been investigated. Using local field potential recordings in humans, we are investigating to what extent brain
oscillations in specific frequency bands recorded within the amygdala have a causal influence on oscillations in specific frequency
bands in the medial orbitofrontal cortex during emotional prosody exposure. Here, the term causality, or directional coupling, can
be reformulated as the causal influence between two time series; a solution to this problem was first proposed by the economist Clive
Granger (1969). His theory is based on the principle that “the cause necessarily precedes the effect”. More precisely, considering two
time series X and Y, if knowledge of the history of series Y up to time t-1 allows a better prediction of series X at time t than prediction
from its own history alone, then series Y is said to have a causal influence on series X. The goal of this research is to extend this
concept of “Granger causality” to the spectral domain and to find a valid statistical method allowing an accurate application of
spectral Granger causality to these LFP data and, ultimately, to test the causal relationships between the amygdala and the OFC in
the context of emotion.
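The prediction criterion described above can be sketched in the standard bivariate autoregressive form (a minimal formulation in our own notation, not taken from the talk itself; p is the model order):

```latex
% Restricted model: X predicted from its own history only
X_t = \sum_{k=1}^{p} a_k \, X_{t-k} + \varepsilon_t,
\qquad \sigma^2_{\varepsilon} = \operatorname{Var}(\varepsilon_t)

% Full model: X predicted from the histories of both X and Y
X_t = \sum_{k=1}^{p} a'_k \, X_{t-k}
    + \sum_{k=1}^{p} b_k \, Y_{t-k} + \eta_t,
\qquad \sigma^2_{\eta} = \operatorname{Var}(\eta_t)

% Y Granger-causes X when adding Y's history improves the prediction,
% i.e. when the residual variance shrinks:
F_{Y \to X} = \ln \frac{\sigma^2_{\varepsilon}}{\sigma^2_{\eta}} > 0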
12:30-1:30 pm: Lunch
1:30-2:00: Sascha Fruehholz
The fronto-temporal network for the decoding of emotional prosody in healthy humans and MTL patients.
We used angry and neutral human utterances to reveal brain activations during implicit and explicit processing of vocal
emotional tones. In an fMRI study with healthy volunteers, we acquired high-resolution functional brain scans covering superior
temporal, inferior frontal and medial limbic cortices. The functional data suggest (1) several superior temporal and frontal subregions
for the explicit and implicit decoding of emotional prosody, (2) high interconnectivity in this brain network, as revealed by a
psycho-physiological interaction (PPI) analysis, and (3) a causal network with the right posterior superior temporal gyrus as the
central voice-sensitive region, as determined by a sequential dynamic causal modeling (DCM) approach. Preliminary data from an
fMRI study with epileptic patients who underwent unilateral medial temporal lobe resections revealed specific cognitive and
functional impairments in the decoding of emotional prosody.
2:15-2:45: Sona Patel
Some patterns of expressive prosody in speech and music
While there are a number of similarities in the manner in which emotions are expressed in speech and music, differences exist
for some emotions. Here we investigated the perceptual and acoustic differences in the expression of the vowel /a/ sung in the form of
a scale and in nonsense sentences by a professional singer. To capture changes in voice quality, we examined measurements from the
inverse-filtered waveform. Results of the acoustic analysis were compared with emotional expressions of the vowel /a/ portrayed by
professional actors.
3:00-3:30 Coffee break
3:30-4:00: Julie Péron
Emotional processing and basal ganglia: What can we learn from Parkinson's disease?
Parkinson’s disease (PD) provides a model for studying the neural substrates of emotional processing in two ways. First,
the striato-thalamo-cortical circuits, like the mesolimbic dopamine system that modulates their function, have been implicated in
emotional processing. As PD is histopathologically characterized by the selective, progressive and chronic degeneration of the
nigrostriatal and mesocorticolimbic dopamine systems, it can therefore serve as a model for assessing the role of these functional
circuits in humans. Second, there is growing evidence of a link between emotional impairments and deep brain stimulation (DBS) of
the subthalamic nucleus (STN), a treatment that constitutes a therapeutic advance for severely disabled PD patients. In this context,
the STN DBS PD patient model seems to represent a unique opportunity for studying the functional role of the STN in emotional
processing. In the present review, we will provide a synopsis of the emotional disturbances observed in PD in order to underline the
functional roles of (1) the dopaminergic pathways and basal ganglia, and (2) the STN in emotional processing. Finally, we will discuss
the functional roles of the striato-thalamo-cortical circuits and the STN in emotional processing. It seems reasonable to assume that
both the mesolimbic and striato-thalamo-cortical circuits are involved in emotional processing and that the STN plays a central role in
the latter.
4:15-4:45: Kim Torres-Eliard
Dynamic judgment of emotions expressed by music
In the field of music and emotions, there is an important distinction between the emotions felt while listening to a piece of
music and the emotions expressed by the music. Focusing on the latter, our work aims at understanding 1) the dynamics of emotional
judgments in music, 2) the relationships between acoustical parameters and musical
structures and their correlations with dynamic emotional judgments, 3) the dynamics of brain areas involved in musical perception and
the related emotional judgments. In this context, a new approach called “dynamic judgments” has been developed, using the nine
emotional dimensions specific to music proposed by Zentner, Grandjean & Scherer (2008). This new approach seems to be a
promising avenue of research to better understand the links between music and emotions, especially linking these dynamic judgments
to the acoustical characteristics of the signal and the musical structures, using a dynamic Brunswikian modeling of emotion
communication (Brunswik, 1956).
5:00-5:30: Carolina Labbé
Rhythm, entrainment and emotion
One of the reasons music has become an increasingly popular topic in psychological research is probably its ability to evoke
and induce strong emotions. Here, we focus on the various mechanisms that might contribute to the induction of such emotions, and on
the effects of rhythm and entrainment in particular. Entrainment, the tendency of physical and biological systems (oscillators) to
synchronize through interaction, is probably one of the most powerful candidate mechanisms for the induction of musical emotions. In
past studies it has been shown to affect the physiological, neurological and subjective components of emotion, but never
simultaneously. It has also been suggested as the basis of metrical structure perception, through listeners’ own internal rhythmic
processes, and even as the basis of pulse perception. In a first study we will therefore use EEG to explore the mental representation of
meter and tempo and their effects on brainwave and physiological entrainment.
5:45-6:15: Wiebke Trost
The temporal dynamics of felt musical emotions in terms of entrainment
We use fMRI and physiological measures (respiration and heart rate, pupil diameter) to study the temporal dynamics of
emotions evoked by classical music. By applying inter-subject correlation analyses to brain activity and peripheral physiological
processes during music listening, we aim to identify moments in the music with high concordance between participants and to verify,
through additional behavioral measures, whether these moments carry a specific emotional flavor. Moreover, we study not only
synchronization between subjects but also entrainment mechanisms between physiological processes and musical features, especially
musical tempo, and test how these processes are influenced by inter-individual differences. Preliminary results show that heart and
respiration rates correlate positively with tempo. Interestingly, the strength of entrainment between tempo and respiration could be
predicted by the self-reported frequency of classical music listening.
6:30-7:00 Discussions