Detection and Classification of ERP for Social
Cognition Evaluation
Yina Cabarcas-Mena
Department of Biomedical Engineering
Universidad Tecnológica de Bolívar
Cartagena de Indias, 130010, Colombia
ymena@utb.edu.co

Karol Gutierrez-Ruiz
Department of Psychology
Universidad Tecnológica de Bolívar
Cartagena de Indias, 130010, Colombia
kgutierrez@utb.edu.co

Kiara C. Campo-Landines
Department of Psychology
Universidad Tecnológica de Bolívar
Cartagena de Indias, 130010, Colombia
kcampo@utb.edu.co

Sonia H. Contreras-Ortiz
Department of Biomedical Engineering
Universidad Tecnológica de Bolívar
Cartagena de Indias, 130010, Colombia
scontreras@utb.edu.co
Abstract—This paper describes an approach for the generation, acquisition, and analysis of event-related potentials (ERPs) for social cognition evaluation using a commercial wireless EEG headset.
Index Terms—ERP components, EEG, wavelet filtering, social
cognition, affective computing
I. INTRODUCTION
Social cognition refers to the ability to perceive, process,
and understand social information and respond appropriately.
It also involves understanding and inferring the mental states
and beliefs of oneself and others, and being aware that they
may be different. An alteration of these abilities can generate deficits in social behavior such as aggression or social withdrawal, modify social interactions, and impair well-being and social success [1]. Social cognition determines the ability to establish satisfactory social relationships and comprises three domains. The main one is the recognition of the information conveyed by others (social perception), which includes facial expressions, tone of voice, gestures, and body language.
This information is integrated to give rise to the other two
domains: the understanding of the emotions of others (affective
empathy), and the interpretation of their behaviors in terms of
mental states and intentionality (theory of mind). As a result
of these processes, the subject regulates and adapts their own
behavior and emotional response (decision making) for social
interaction [2].
Emotions are short-lived multifactorial phenomena that appear as a result of significant stimuli and are accompanied
by subjective experiences (feelings) and physiological and
physical reactions (such as facial expressions) that help the
individual adapt to the challenges of the environment [3].
There are six primary emotions: happiness, sadness, anger,
fear, disgust, and surprise, and there can be more than twenty
secondary emotions [4]. Emotions can be classified in terms of their valence or pleasantness [5], and can be detected using
speech features, facial expressions or physiological signals.
The processing of emotional content can be studied with paradigms from cognitive psychology. For example, oddball tasks have been adapted to study the processing of deviant emotional stimulation through the emotional oddball paradigm (EOP), in which event-related potentials (ERPs) to standard and/or deviant stimuli are measured with electroencephalography (EEG) [6].
EEG has been used to study social cognition processes
by recording the brain activity of two people while they
interact [7] or in emotion recognition tests [8]. Emotion recognition tests typically present emotion-associated stimuli (visual, auditory, or both) to participants using the EOP while the evoked potentials are recorded.
II. RELATED WORK
In recent years, a number of studies have investigated emotion recognition from EEG signals [9]–[12], and from photoplethysmography (PPG) and galvanic skin response (GSR) signals [13]. These studies seek to identify and classify emotional states, typically by applying machine learning to EEG signals.
Event-related potential (ERP) analysis has also been found relevant for emotion identification and classification [14]–[17], since psychological research has shown that ERPs sensitively reflect the emotional activity of the brain, which serves as a basis for emotion recognition. For example, the amplitude of the P1 (P100) component over the occipital region has been found to be larger for negative emotions than for positive ones [18]. Similarly, other studies have shown that the late positive potential (LPP), P2, N2, and P3 components are larger for negative emotions than for positive ones [19]–[21].
J. Jiang et al. [22] used a methodology for emotion recognition by detecting single-trial ERPs at specific levels of emotion. The method was tested by classifying emotional valence on three levels: extremely negative, moderately negative, and neutral. It used the most relevant spatial and temporal features of the entire ERP waveform to recognize emotions, and SVM classification results demonstrated its accuracy.
Changes in the encoding of emotional stimuli have been observed when a subject performs attention-related tasks. A widely used source of visual scenes is the standardized International Affective Picture System (IAPS) [23], which provides visual content with normative ratings of valence and arousal for each stimulus. One study that used the IAPS found that pleasant and unpleasant stimuli affected the N2 ERP component, showing that humans are affected by emotions while performing tasks presented through the EOP [24].
In this context, we developed a methodology and an application to administer the EOP. We obtained an ERP for each type of stimulus (pleasant, unpleasant, and neutral) in order to recognize the different types of emotions.
III. METHODOLOGY
A. The emotional oddball paradigm (EOP)
The stimuli were selected from the International Affective Picture System (IAPS) [23], using the images validated in Colombia [25]; Fig. 1 shows examples of the stimuli used. The images were selected according to their valence and arousal:
• Unpleasant or negative stimuli: 25 images. Valence: 1.0–3.99. Arousal: >5.5 (high) and <5.5 (medium).
• Pleasant stimuli: 25 images. Valence: 6.01–9.0. Arousal: 4.0–6.0.
• Neutral stimuli: 25 images. Valence: 4.0–6.0. Arousal: 1.0–4.99.
The valence of the emotion is scored in the range of 1 to 9, with 1 being very unpleasant and 9 being very pleasant. Arousal represents the level of activation produced by the image and is scored on a scale from 1 (calm) to 9 (excited).
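These selection criteria can be expressed compactly. The following Python sketch shows the category assignment; the function and the example ratings are illustrative, while the normative ratings themselves come from the IAPS [23] and its Colombian validation [25].

```python
# Minimal sketch of the stimulus selection criteria above. The example
# ratings are hypothetical; real normative ratings come from the IAPS.

def categorize(valence, arousal):
    """Map an image's normative ratings to a stimulus category."""
    if 1.0 <= valence <= 3.99:                       # unpleasant: medium or high arousal
        return "unpleasant"
    if 6.01 <= valence <= 9.0 and 4.0 <= arousal <= 6.0:
        return "pleasant"
    if 4.0 <= valence <= 6.0 and 1.0 <= arousal <= 4.99:
        return "neutral"
    return None                                      # image does not fit any category

print(categorize(2.5, 6.2))  # -> "unpleasant"
print(categorize(7.1, 5.0))  # -> "pleasant"
```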
The experiment included the presentation of 375 images (300 frequent or standard and 75 infrequent or emotional). A visual emotional oddball paradigm was used in which a standard stimulus was presented in 80% of the trials, and infrequent (target) stimuli with an emotional connotation (pleasant, neutral, or unpleasant) in the remaining 20%; a sketch of how such a sequence can be constructed is shown below.
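The paper states only that the presentation order was quasi-random, so the non-adjacency constraint in the following Python sketch is an assumption; the trial counts and category proportions are taken from the text.

```python
import random

# Sketch of the 375-trial sequence: 300 standards (80%) and 75 emotional
# targets (20%), 25 per category. Targets are inserted into distinct gaps
# between standard trials so that no two targets are adjacent (assumed).
def build_sequence(seed=0):
    rng = random.Random(seed)
    targets = ["pleasant"] * 25 + ["unpleasant"] * 25 + ["neutral"] * 25
    rng.shuffle(targets)
    # choose 75 of the 301 gaps around the 300 standard trials;
    # inserting at descending indices keeps earlier insertions valid
    gaps = sorted(rng.sample(range(301), len(targets)), reverse=True)
    trials = ["standard"] * 300
    for gap, target in zip(gaps, targets):
        trials.insert(gap, target)
    return trials

seq = build_sequence()
assert len(seq) == 375 and seq.count("standard") == 300
```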
B. Hardware
The equipment used for signal acquisition was the EPOC X (Emotiv, San Francisco, CA). It has a 14-electrode system and uses a saline solution to improve contact and signal quality. It is a commercial system designed for basic use as an alternative to medical EEG devices, offering a high cost-benefit ratio, ease of use, and fast preparation, which helps expand mental health studies [26].

Fig. 1. Examples of unpleasant, pleasant, and neutral stimuli

The headset is used in conjunction with EmotivPRO software (Version 3.2.2.413, Emotiv, San Francisco, CA) to obtain the raw EEG signals. The sampling frequency was 128 Hz and the resolution 14 bits.
As the acquisition of EEG signals with consumer devices becomes more common in ERP research, it is critical that these systems provide precise and accurate timing. The event-marking timing of Emotiv's commercial EEG systems has been evaluated and found to be reliable [27].
C. Application for image generation
We developed an application in Processing (3.5.4, Processing Foundation) to implement the image-presentation protocol. The main purpose was to generate a sequence of images, with timing being one of the top priorities. The program ran on a 64-bit Windows 10 Pro system with a 1024x768-pixel screen.
The application generated the images quasi-randomly with the timing declared in the protocol, and sent the timestamps corresponding to the onset of the stimuli to the Emotiv headset through a serial port. Fig. 2 shows the final result of the application; a sketch of the event-marking step follows the figure.
Fig. 2. Sequence of images generated by the application (frequent image shown)
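The original application was written in Processing and is not reproduced in the paper. The following Python sketch (using the pyserial package) illustrates only the onset-marking step; the port name, baud rate, marker encoding, and display call are assumptions.

```python
import time
import serial  # pyserial; install with `pip install pyserial`

PORT = "COM3"        # hypothetical serial port wired to the Emotiv setup
STIMULUS_MS = 800    # hypothetical on-screen duration per image

def present(trials):
    """Send a one-byte marker at each stimulus onset (illustration only)."""
    with serial.Serial(PORT, baudrate=115200, timeout=1) as link:
        for code, label in enumerate(trials, start=1):
            # draw_stimulus(label) would render the image here (hypothetical)
            link.write(bytes([code % 256]))  # onset marker for the headset
            time.sleep(STIMULUS_MS / 1000)   # hold the image on screen
```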
D. Experimental protocol
Thirteen university students participated in the study. Exclusion criteria were neuropsychological conditions and anxiety symptoms. The trials were carried out in the Bioengineering Laboratory of the Universidad Tecnológica de Bolívar by a research assistant who is a psychology graduate and a master's student. Fig. 3 shows the final test setup, which is described below.
• When the subject arrived at the laboratory, they signed an informed consent form that described the test procedure, and their questions were answered.
• The EPOC X electrodes were soaked in saline solution and the headset was placed on the subject's head.
• A practice run of the program was given to the subject, explaining which images were to be counted.
• The subject's concentration and comfort were ensured so that movement was minimal and the recorded signals were clean.
• Each time a target image appeared, a marker was recorded along with the EEG signals.
• Finally, the subject was asked to report the number of target images counted. The recorded data were stored for review.
Fig. 3. Final assembly of the protocol on the subject

E. Signal processing
For the evocation of the ERPs, a visual paradigm was used in which three types of stimuli were presented (unpleasant, pleasant, and neutral), with 25 representative stimuli per category. The EPOC X electrodes used were F3, F4, O1, O2, P7, and P8, since the best possible representation of the evoked potentials N100 and P300 was required. The EEG signals were first processed with a high-pass Hamming-window finite impulse response (FIR) filter to remove the DC component introduced by the acquisition process. The -6 dB cutoff frequency was chosen as 0.12 Hz and the filter order as 1800, following recent research on these signals [28], [29]. Wavelet filtering was then used to divide the signal into its frequency components: a Daubechies 4 wavelet with a 6-level decomposition was applied, and the processed signal was reconstructed from the detail components D6, D5, D4, and D3, which contained most of the ERP information.

Finally, the epochs for each stimulus category were averaged over the 25 realizations: whenever an event marker recorded with Emotiv appeared, the following 100 samples, corresponding to approximately 800 ms, were analyzed. A sketch of this processing pipeline is shown below.
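A minimal sketch of this pipeline, assuming SciPy and PyWavelets, is shown below. The filter and epoching parameters (0.12 Hz cutoff, order 1800, db4 wavelet, 6 levels, 100-sample epochs at 128 Hz) are taken from the text; the zero-phase application via filtfilt is an assumption, since the paper does not specify how the filter was applied.

```python
import numpy as np
import pywt                      # PyWavelets
from scipy.signal import firwin, filtfilt

FS = 128  # Hz, EPOC X sampling frequency

def preprocess(eeg):
    """High-pass FIR filtering followed by db4 wavelet filtering."""
    # Order 1800 -> 1801 taps; pass_zero=False yields a high-pass design
    # with firwin's default Hamming window. Apply to the continuous record.
    taps = firwin(1801, 0.12, fs=FS, pass_zero=False)
    x = filtfilt(taps, [1.0], eeg)
    # 6-level decomposition returns [cA6, cD6, cD5, cD4, cD3, cD2, cD1];
    # keep only D6..D3, which carried most of the ERP information.
    coeffs = pywt.wavedec(x, "db4", level=6)
    for i in (0, 5, 6):          # zero out cA6, cD2, and cD1
        coeffs[i] = np.zeros_like(coeffs[i])
    return pywt.waverec(coeffs, "db4")[: len(x)]

def average_erp(eeg, onsets, n_samples=100):
    """Average the 100 samples (~800 ms at 128 Hz) after each event marker."""
    epochs = [eeg[k:k + n_samples] for k in onsets if k + n_samples <= len(eeg)]
    return np.mean(epochs, axis=0)
```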
IV. RESULTS AND DISCUSSION
We used ERP amplitudes and latencies to evaluate the responses to the stimuli. Fig. 4 shows the N100, which was observed at the occipital electrodes and was clearest at electrode O2. Fig. 5 shows the P300, which was observed at the parietal and frontal electrodes, with the clearest signal at parietal electrode P8. These ERPs are the averages of the signals obtained with the three types of stimuli. With the help of an expert neuropsychologist, the ERPs were identified for each subject and type of stimulus. Tables I and II contain the measurements of the parameters described above.
Fig. 4. EEG signal with pleasant stimuli (electrode O2; the N100 component is marked)

Fig. 5. EEG signal with unpleasant stimuli (electrode P8; the P300 component is marked)
The data were analyzed using box-and-whisker plots to observe their distributions and medians. Fig. 6 shows the N100 amplitudes for the three types of stimuli: the medians are similar, and the dispersion is largest for the unpleasant stimulus.

Similarly, the N100 latencies have close medians, with a much larger dispersion and a lower median for the negative stimulus, as shown in Fig. 7. The lower median for the negative stimulus denotes a tendency for the potential to appear earlier than 100 ms with these stimuli.
Fig. 8 illustrates the medians and data distributions of the P300 amplitudes for each stimulus. The dispersion for the pleasant stimulus differs from that of the other stimuli, while the medians are close for all three. The variation of the latencies is shown in Fig. 9: the medians of the unpleasant and pleasant stimuli differ from that of the neutral stimulus, which also presents the largest dispersion, denoting a slower reaction to neutral stimuli.
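The box-plot analysis can be reproduced directly from the tables below. The following matplotlib sketch uses the N100 amplitudes from Table I (subjects whose component could not be identified are omitted):

```python
import matplotlib.pyplot as plt

# N100 amplitudes (µV) from Table I, grouped by stimulus type.
unpleasant = [-0.7057, -3.408, -1.007, -1.687, -1.064, -3.455, -1.229,
              -2.251, -4.027, -2.561, -3.101]
pleasant = [-1.835, -2.443, -1.402, -3.915, -1.276, -1.187, -2.797,
            -1.186, -1.667, -1.452, -3.042, -2.692]
neutral = [-4.072, -1.87, -1.119, -1.358, -2.352, -1.562, -1.126,
           -0.7137, -3.346, -2.169]

# Box-and-whisker plots show the medians and dispersion per stimulus.
plt.boxplot([unpleasant, pleasant, neutral])
plt.xticks([1, 2, 3], ["Unpleasant", "Pleasant", "Neutral"])
plt.ylabel("Amplitude (µV)")
plt.title("Amplitude N100")
plt.show()
```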
TABLE I
MEASUREMENTS FOR ERP N100
(Amplitude in µV, latency in ms; * = component not identifiable)

                  Unpleasant                  Pleasant                    Neutral
ID       Gender   Amp       Lat     Loc       Amp      Lat     Loc        Amp       Lat     Loc
AH58-F   F        -0.7057   78.13   O2        -1.835   93.75   O2         -4.072    125.0   O1
CD14-F   F        -3.408    78.13   O2        -2.443   62.50   O2         -1.870    101.6   O2
KO45-F   F        -1.007    54.69   O2        -1.402   39.06   O2         *         *       *
MO48-F   F        -1.687    109.4   O2        -3.915   93.75   O2         -1.119    93.75   O1
NY41-F   F        -1.064    39.06   O1        -1.276   171.9   O2         *         *       *
YL47-F   F        -3.455    54.69   O2        -1.187   101.6   O1         -1.358    109.4   O1
AP17-M   M        -1.229    101.6   O1        -2.797   156.3   O2         -2.352    156.3   O1
AT11-M   M        *         *       *         -1.186   109.4   O2         -1.562    156.3   O1
CA37-M   M        *         *       *         -1.667   101.6   O2         -1.126    117.2   O2
CM23-M   M        -2.251    93.75   O2        *        *       *          -0.7137   54.69   O1
CO16-M   M        -4.027    117.2   O1        -1.452   109.4   O2         -3.346    46.88   O2
CP19-M   M        -2.561    156.3   O2        -3.042   85.94   O1         *         *       *
CP27-M   M        -3.101    109.4   O2        -2.692   101.6   O1         -2.169    93.75   O1
TABLE II
MEASUREMENTS FOR ERP P300
(Amplitude in µV, latency in ms; * = component not identifiable)

                  Unpleasant                  Pleasant                    Neutral
ID       Gender   Amp       Lat     Loc       Amp      Lat     Loc        Amp       Lat     Loc
AH58-F   F        1.247     429.7   P7        1.206    296.9   P8         0.9027    289.1   P8
CD14-F   F        *         *       *         *        *       *          2.437     343.8   P8
KO45-F   F        0.6449    351.6   P8        1.939    328.1   P7         0.7404    335.9   P8
MO48-F   F        2.655     359.4   P7        1.897    265.6   P8         2.058     398.4   P7
NY41-F   F        0.6163    257.8   P8        1.951    359.4   P7         1.152     320.3   P8
YL47-F   F        1.080     281.3   P8        4.940    281.3   P7         1.655     335.9   P7
AP17-M   M        1.335     343.8   P7        1.982    304.7   P7         2.677     453.1   P7
AT11-M   M        4.015     421.9   P8        1.463    273.4   F4         *         *       *
CA37-M   M        3.597     296.9   P7        1.638    343.8   P8         1.758     429.7   P8
CM23-M   M        *         *       *         *        *       *          *         *       *
CO16-M   M        1.271     312.5   P8        2.675    328.1   P7         4.662     359.4   P8
CP19-M   M        *         *       *         *        *       *          1.150     382.8   P7
CP27-M   M        10.05     320.3   P8        5.358    351.6   F3         1.086     343.8   P7

Fig. 6. Amplitude N100 (box plots for the unpleasant, pleasant, and neutral stimuli)

Fig. 7. Latency N100 (box plots for the unpleasant, pleasant, and neutral stimuli)

Fig. 8. Amplitude P300 (box plots for the unpleasant, pleasant, and neutral stimuli)

Fig. 9. Latency P300 (box plots for the unpleasant, pleasant, and neutral stimuli)
V. CONCLUSIONS
We developed an approach to evaluate social cognition using
the Emotiv headset. Preliminary results show that the ERPs
N100 and P300 were recognizable in most of the subjects and
some differences were detected for the three classes of stimuli:
neutral, pleasant, and unpleasant. This work contributes to the use of objective measures in the study of emotions, especially the recognition of emotions as one of the levels of study of
social cognition by measuring basic responses associated with psychological processes. Further work in this area is warranted, given the growing interest in the processing of emotional stimuli and the theoretical question of how deviant emotional stimuli are processed differently from neutral ones.
VI. ACKNOWLEDGMENTS
The authors thank Universidad Tecnológica de Bolívar for funding this project under grant No. CI2021P06 and for a postgraduate scholarship.
REFERENCES

[1] Maryline Couette, Stéphane Mouchabac, Alexis Bourla, Philippe Nuss, and Florian Ferreri. Social cognition in post-traumatic stress disorder: A systematic review. British Journal of Clinical Psychology, 59(2):117–138, 2020.
[2] Maria Arioli, Chiara Crespi, and Nicola Canessa. Social cognition through the lens of cognitive and clinical neuroscience. BioMed Research International, 2018, 2018.
[3] Johnmarshall Reeve. Understanding Motivation and Emotion, 6th edition. John Wiley & Sons, 2014.
[4] Alan S. Cowen and Dacher Keltner. What the face displays: Mapping 28 emotions conveyed by naturalistic expression. American Psychologist, 75(3):349, 2020.
[5] Peter J. Lang. The emotion probe: Studies of motivation and attention. American Psychologist, 50(5):372, 1995.
[6] Helge Schlüter and Christina Bermeitinger. Emotional oddball: A review on variants, results, and mechanisms. Review of General Psychology, 21(3):179–222, 2017.
[7] Ivana Konvalinka and Andreas Roepstorff. The two-brain approach: how can mutually interacting brains teach us something about social interaction? Frontiers in Human Neuroscience, 6:215, 2012.
[8] Chunmei Qing, Rui Qiao, Xiangmin Xu, and Yongqiang Cheng. Interpretable emotion recognition using EEG signals. IEEE Access, 7:94160–94170, 2019.
[9] Adrian R. Aguiñaga, Daniel E. Hernandez, Angeles Quezada, and Andrés Calvillo Téllez. Emotion recognition by correlating facial expressions and EEG analysis. Applied Sciences, 11(15):6987, 2021.
[10] Ya Zheng, Jing Xu, Hongning Jia, Fei Tan, Yi Chang, Li Zhou, Huijuan Shen, and Benqing Qu. Electrophysiological correlates of emotional processing in sensation seeking. Biological Psychology, 88(1):41–50, 2011.
[11] Bella Rozenkrants, Jonas K. Olofsson, and John Polich. Affective visual event-related potentials: arousal, valence, and repetition effects for normal and distorted pictures. International Journal of Psychophysiology, 67(2):114–123, 2008.
[12] Bella Rozenkrants and John Polich. Affective ERP processing in a visual oddball task: arousal, valence, and gender. Clinical Neurophysiology, 119(10):2260–2265, 2008.
[13] Juan Antonio Domínguez-Jiménez, Kiara Coralia Campo-Landines, Juan C. Martínez-Santos, Enrique J. Delahoz, and Sonia H. Contreras-Ortiz. A machine learning model for emotion recognition from physiological signals. Biomedical Signal Processing and Control, 55:101646, 2020.
[14] Moon Inder Singh and Mandeep Singh. Emotion recognition: An evaluation of ERP features acquired from frontal EEG electrodes. Applied Sciences, 11(9):4131, 2021.
[15] Li Zhu, Chongwei Su, Jianhai Zhang, Gaochao Cui, Andrzej Cichocki, Changle Zhou, and Junhua Li. EEG-based approach for recognizing human social emotion perception. Advanced Engineering Informatics, 46:101191, 2020.
[16] G. Sivaradje, R. Nakkeeran, and P. Dananjayan. Extraction of evoked potential and its applications in biomedical engineering. IETE Technical Review, 22(3):229–239, 2005.
[17] Haiyan Wu, Chunping Chen, Dazhi Cheng, Suyong Yang, Ruiwang Huang, Stephanie Cacioppo, and Yue-Jia Luo. The mediation effect of menstrual phase on negative emotion processing: Evidence from N2. Social Neuroscience, 9(3):278–288, 2014.
[18] N. Kyle Smith, John T. Cacioppo, Jeff T. Larsen, and Tanya L. Chartrand. May I have your attention, please: Electrocortical responses to positive and negative stimuli. Neuropsychologia, 41(2):171–183, 2003.
[19] Tiffany A. Ito, Jeff T. Larsen, N. Kyle Smith, and John T. Cacioppo. Negative information weighs more heavily on the brain: the negativity bias in evaluative categorizations. Journal of Personality and Social Psychology, 75(4):887, 1998.
[20] Xianxin Meng, Jiajin Yuan, and Hong Li. Automatic processing of valence differences in emotionally negative stimuli: evidence from an ERP study. Neuroscience Letters, 464(3):228–232, 2009.
[21] Moon Inder Singh and Mandeep Singh. Development of a real time emotion classifier based on evoked EEG. Biocybernetics and Biomedical Engineering, 37(3):498–509, 2017.
[22] Jingfang Jiang, Ying Zeng, Li Tong, Chi Zhang, and Bin Yan. Single-trial ERP detecting for emotion recognition. In 2016 17th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD), pages 105–108. IEEE, 2016.
[23] Peter J. Lang, Margaret M. Bradley, and Bruce N. Cuthbert. International Affective Picture System. PsycTESTS Dataset, 2005.
[24] Harald T. Schupp, Markus Junghöfer, Almut I. Weike, and Alfons O. Hamm. Emotional facilitation of sensory processing in the visual cortex. Psychological Science, 14(1):7–13, 2003.
[25] Carlos Gantiva. Inducción de estados afectivos a través de imágenes: segunda validación colombiana del Sistema Internacional de Imágenes Afectivas (IAPS). Revista Latinoamericana de Psicología, 51(2), 2019.
[26] M. Duvinage, T. Castermans, M. Petieau, T. Hoellinger, G. Cheron, and T. Dutoit. Performance of the Emotiv Epoc headset for P300-based applications. Biomedical Engineering Online, 12:56, 2013.
[27] Nikolas S. Williams, Genevieve M. McArthur, and Nicholas A. Badcock. It's all about time: precision and accuracy of Emotiv event-marking for ERP research. PeerJ, 9:e10700, 2021.
[28] Yina P. Cabarcas-Mena, Andres G. Marrugo, and Sonia H. Contreras-Ortiz. Classification of cognitive evoked potentials for ADHD detection in children using recurrence plots and CNNs. In 2021 XXIII Symposium on Image, Signal Processing and Artificial Vision (STSIVA), pages 1–6. IEEE, 2021.
[29] Isabela M. Mercado-Aguirre, Karol Gutiérrez-Ruiz, and Sonia H. Contreras-Ortiz. Acquisition and analysis of cognitive evoked potentials using an Emotiv headset for ADHD evaluation in children. In 2019 XXII Symposium on Image, Signal Processing and Artificial Vision (STSIVA), pages 1–5. IEEE, 2019.