A Tensorial Approach to Access Cognitive Workload related to Mental Arithmetic from EEG Functional Connectivity Estimates
S.I. Dimitriadis, Yu Sun, K. Kwok, N.A. Laskaris, A. Bezerianos
AIIA-Lab, Informatics Dept., Aristotle University of Thessaloniki, Greece
SINAPSE, National University of Singapore
Temasek Laboratories, National University of Singapore
Outline
Introduction
• Functional connectivity has demonstrated its potential for decoding various brain states
• The high dimensionality of pairwise functional connectivity limits its usefulness in some real-time applications
Methodology
• The brain is a complex network
• Functional Connectivity Graphs (FCGs) are usually treated as vectors
• An FCG can instead be treated as a tensor
• Tensor Subspace Analysis
Results
Conclusions
EEG & CONNECTOMICS
[Figure: from multichannel recordings to connectivity graphs via functional/effective connectivity measures]
EEG & CONNECTOMICS
[Figure: complexity in the human brain]
Motivation and problem statement
• The association of functional connectivity patterns with particular cognitive tasks has been a hot topic in neuroscience
• Studies of functional connectivity have demonstrated its potential for decoding various brain states, e.g. mind reading
• The high dimensionality of pairwise functional connectivity limits its usefulness in some real-time applications
• We used Tensor Subspace Analysis (TSA) to reduce the initial high dimensionality of the pairwise couplings in the original functional connectivity network, which:
  1. significantly decreases the computational cost, and
  2. facilitates the differentiation of brain states
Outline of our methodology
[Figure: the FCG matrix represented either as a vector or as a tensor]
The linear Dimensionality Reduction Problem in Tensor Space
Tensor Subspace Analysis (TSA) is fundamentally based on Locality Preserving Projection
• X can be thought of as a 2nd-order tensor (or 2-tensor) in the tensor space $\mathbb{R}^{n_1 \times n_2}$
• The generic problem of linear dimensionality reduction in this second-order tensor space is the following:
  Given a set of tensors (i.e. matrices) $X_1, \dots, X_m \in \mathbb{R}^{n_1 \times n_2}$, find two transformation matrices $U$ of size $n_1 \times l_1$ and $V$ of size $n_2 \times l_2$ that map these tensors to a set of tensors $Y_1, \dots, Y_m \in \mathbb{R}^{l_1 \times l_2}$ ($l_1 < n_1$, $l_2 < n_2$), such that $Y_i$ "represents" $X_i$, where $Y_i = U^T X_i V$

The method is of particular interest in the special case where $X_1, \dots, X_m \in \mathcal{M}$ and $\mathcal{M}$ is a nonlinear sub-manifold embedded in $\mathbb{R}^{n_1 \times n_2}$.
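A minimal numpy sketch of the bilinear mapping $Y_i = U^T X_i V$ is given below. The dimensions (12 sensors, 3 retained dimensions per mode) follow the numbers used later in the talk, while U and V are random placeholders standing in for the learned TSA projections; this only illustrates the shapes involved, not the learning step itself.

```python
import numpy as np

# Minimal sketch of the bilinear mapping Y_i = U^T X_i V.
# Illustrative shapes: n1 = n2 = 12 sensors, l1 = l2 = 3 retained dimensions;
# U and V are random placeholders, not learned TSA projections.
rng = np.random.default_rng(0)
m, n1, n2, l1, l2 = 5, 12, 12, 3, 3

X = rng.standard_normal((m, n1, n2))      # a set of 2nd-order tensors (FCG matrices)
U = rng.standard_normal((n1, l1))         # left transformation matrix  (n1 x l1)
V = rng.standard_normal((n2, l2))         # right transformation matrix (n2 x l2)

# Y_i = U^T X_i V for every tensor in the set
Y = np.einsum('ab,iac,cd->ibd', U, X, V)
print(Y.shape)                            # (5, 3, 3): each FCG is now a 3 x 3 tensor
```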
Optimal Linear Embedding
• The domain of FCGs is most probably a nonlinear sub-manifold embedded in the tensor space
• Adopting TSA, we estimate geometrical and topological properties of the sub-manifold from "scattered data" lying on this unknown sub-manifold
• We will consider the particular question of finding a linear subspace approximation to the sub-manifold in the sense of local isometry
• TSA is fundamentally based on Locality Preserving Projection (LPP)
1. Given $m$ data points $\mathcal{X} = \{X_1, \dots, X_m\}$ sampled from the FCG sub-manifold $\mathcal{M} \subset \mathbb{R}^{n_1 \times n_2}$, one can build a nearest-neighbor graph $G$ to model the local geometrical structure of $\mathcal{M}$.
2. Let $S$ be the weight matrix of $G$. A possible definition of $S$ is as follows:

$$S_{ij} = \begin{cases} e^{-\frac{\| X_i - X_j \|^2}{t}} & \text{if } X_j \text{ is among the } k \text{ nearest neighbors of } X_i \\ 0 & \text{otherwise} \end{cases} \qquad (1)$$

The function $e^{-\frac{\| X_i - X_j \|^2}{t}}$ is the so-called heat kernel, which is intimately related to the manifold structure, and $\| \cdot \|$ is the Frobenius norm of a matrix.
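A short Python sketch of the weight matrix of Eq. (1) follows. The symmetrization of the neighborhood graph at the end is a common convention in locality-preserving methods and is an assumption here, as are the default values of k and t.

```python
import numpy as np

def heat_kernel_weights(X, k=5, t=1.0):
    """Weight matrix S of Eq. (1).

    S_ij = exp(-||X_i - X_j||_F^2 / t) if X_j is among the k nearest
    neighbours of X_i, and 0 otherwise.  X has shape (m, n1, n2) and
    holds the FCG matrices; k and t are free parameters of the method.
    """
    m = X.shape[0]
    flat = X.reshape(m, -1)
    # squared Frobenius distances between every pair of graphs
    d2 = np.sum((flat[:, None, :] - flat[None, :, :]) ** 2, axis=-1)
    S = np.zeros((m, m))
    for i in range(m):
        nn = np.argsort(d2[i])[1:k + 1]   # k nearest neighbours of X_i (self excluded)
        S[i, nn] = np.exp(-d2[i, nn] / t)
    return np.maximum(S, S.T)             # symmetrise the neighbourhood graph
```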
Optimal Linear Embedding
In the case of supervised learning (classification labels are available), the label information can easily be incorporated into the graph as follows:

$$S_{ij} = \begin{cases} e^{-\frac{\| X_i - X_j \|^2}{t}} & \text{if } X_j \text{ and } X_i \text{ share the same label} \\ 0 & \text{otherwise} \end{cases} \qquad (2)$$

Let $U$ and $V$ be the transformation matrices. A reasonable transformation respecting the graph structure can be obtained by solving the following objective function:

$$\min_{U,V} \sum_{ij} \| U^T X_i V - U^T X_j V \|^2 S_{ij} \qquad (3)$$

The objective function incurs a heavy penalty if neighboring points $X_i$ and $X_j$ are mapped far apart.

With $D$ a diagonal matrix, $D_{ii} = \sum_j S_{ij}$, the optimization problem reduces to the following generalized eigenvector problem:

$$(D_U - S_U) V = \lambda D_U V \qquad (4)$$

Once $V$ is obtained, $U$ can be updated by solving the following generalized eigenvector problem:

$$(D_V - S_V) U = \lambda D_V U \qquad (5)$$

(Here $D_U = \sum_i D_{ii} X_i^T U U^T X_i$ and $S_U = \sum_{ij} S_{ij} X_i^T U U^T X_j$, with $D_V$ and $S_V$ defined analogously using $V$, following the original TSA formulation.)

Thus, the optimal $U$ and $V$ can be obtained by iteratively computing the generalized eigenvectors of (4) and (5).
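A rough Python sketch of this alternating scheme is shown below, with $D_U$, $S_U$, $D_V$, $S_V$ built as in the original TSA formulation of He, Cai and Niyogi. The identity initialisation, the fixed number of iterations and the small ridge added to the right-hand-side matrices are implementation assumptions for numerical stability, not part of the slides.

```python
import numpy as np
from scipy.linalg import eigh

def tsa_fit(X, S, l1=3, l2=3, n_iter=5):
    """Alternating solution of the generalized eigenproblems (4) and (5).

    X : (m, n1, n2) stack of FCG matrices.
    S : (m, m) weight matrix from Eq. (1) or Eq. (2).
    Returns the transformation matrices U (n1 x l1) and V (n2 x l2).
    """
    m, n1, n2 = X.shape
    d = S.sum(axis=1)                                   # D_ii = sum_j S_ij
    U = np.eye(n1)[:, :l1]                              # simple initialisation
    for _ in range(n_iter):
        # Fix U, solve (D_U - S_U) v = lambda D_U v for V  -- Eq. (4)
        P = np.einsum('iab,ac->ibc', X, U)              # P_i = X_i^T U, shape (m, n2, l1)
        DU = np.einsum('i,iab,icb->ac', d, P, P)        # sum_i D_ii P_i P_i^T
        SU = np.einsum('ij,iab,jcb->ac', S, P, P)       # sum_ij S_ij P_i P_j^T
        _, vecs = eigh(DU - SU, DU + 1e-9 * np.eye(n2))
        V = vecs[:, :l2]                                # smallest eigenvalues minimise (3)
        # Fix V, solve (D_V - S_V) u = lambda D_V u for U  -- Eq. (5)
        Q = np.einsum('iab,bc->iac', X, V)              # Q_i = X_i V, shape (m, n1, l2)
        DV = np.einsum('i,iab,icb->ac', d, Q, Q)        # sum_i D_ii Q_i Q_i^T
        SV = np.einsum('ij,iab,jcb->ac', S, Q, Q)       # sum_ij S_ij Q_i Q_j^T
        _, vecs = eigh(DV - SV, DV + 1e-9 * np.eye(n1))
        U = vecs[:, :l1]
    return U, V
```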
Data acquisition: Mental Task (summation)
5 different difficulty levels
e.g. 3+4 … 234 + 148
1 subject
64 EEG electrodes
Horizontal and Vertical EOG
> 40 trials for each condition
Preprocessing
• ICA (Independent Component Analysis)
• Signals were filtered within frequency range of 0.5 to 45 Hz (from δ to γ band)
• The data from level 1 (addition of 1-digit numbers, 187 trials) and level 5 (addition of 3-digit numbers, 59 trials) were used for the presentation of TSA
Initial FCG Derivation and standard network analysis
• Functional Connectivity Graphs (FCGs) were constructed using the phase locking value (PLV), a phase-coupling estimator
• We computed the local efficiency (LE), which is defined as:

$$LE = \frac{1}{N} \sum_{i \in N} \frac{\sum_{j,h \in G_i,\, j \neq h \neq i} (d_{jh})^{-1}}{k_i (k_i - 1)} \qquad (6)$$

where $k_i$ is the total number of spatial direct neighbors (first-level neighbors) of the current node, $N$ is the set of all nodes in the network, and $d_{jh}$ is the shortest absolute path length between every possible pair in the neighborhood of the current node
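The sketch below illustrates the two ingredients of this step: a standard PLV estimate from instantaneous phases and the local efficiency of Eq. (6). Computing the shortest paths on a binarised, undirected version of each node's neighborhood is a simplifying assumption for illustration; it is not claimed to reproduce the exact pipeline of the talk.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def plv_matrix(phases):
    """Phase locking value between all sensor pairs.

    phases: (n_sensors, n_samples) instantaneous phases of one trial,
    e.g. from the Hilbert transform of band-pass-filtered EEG.
    """
    z = np.exp(1j * phases)
    plv = np.abs(z @ z.conj().T) / phases.shape[1]
    np.fill_diagonal(plv, 0.0)
    return plv

def local_efficiency(A):
    """Local efficiency of Eq. (6) for a binary adjacency matrix A (N x N).

    For every node i, shortest path lengths d_jh are computed within the
    sub-graph spanned by the neighbours of i (an undirected, binary
    simplification of the weighted FCG).
    """
    N = A.shape[0]
    eff = 0.0
    for i in range(N):
        nbrs = np.flatnonzero(A[i])
        k = nbrs.size
        if k < 2:
            continue                                 # nodes with < 2 neighbours contribute 0
        sub = A[np.ix_(nbrs, nbrs)].astype(float)
        d = shortest_path(sub, directed=False, unweighted=True)
        with np.errstate(divide='ignore'):
            inv = 1.0 / d
        inv[~np.isfinite(inv)] = 0.0                 # drop the diagonal entries
        eff += inv.sum() / (k * (k - 1))
    return eff / N
```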
Brain Topographies and Statistical Differences of LE between L1 and L5
[Figure: brain topographies of nodal LE; trial-averaged nodal LE measurements (across various frequency bands) for the PO7 and PO8 sensors]
Trimmed FCGs
• We reduced the input for tensor analysis from FCGs of dimension N × N to sub-graphs of size N′ × N′, with N′ = 12 sensors corresponding to the spatial neighborhood of the PO7 and PO8 sensors (see the sketch after the figure caption below)
• The total number of connections between sensors belonging to this neighborhood is 66/2016 ≈ 3% of the total number of possible connections in the original graph
[Figure: the distribution of PLV values over the parieto-occipital (PO) brain regions and the topographically thresholded functional connections]
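Trimming the full 64 x 64 FCG to the 12-sensor PO sub-graph is plain index selection, as sketched below. The channel indices are placeholders: the actual PO7/PO8 neighborhoods depend on the montage used.

```python
import numpy as np

def trim_fcg(fcg, idx):
    """Keep only the rows/columns of the FCG belonging to the selected sensors."""
    return fcg[np.ix_(idx, idx)]

# Placeholder indices: in practice these would be the 12 channels forming
# the PO7/PO8 spatial neighbourhood in the 64-channel montage.
po_idx = np.arange(12)

full_fcg = np.random.rand(64, 64)
sub_fcg = trim_fcg(full_fcg, po_idx)      # shape (12, 12)

# Sanity check of the counts quoted above: 12*11/2 = 66 vs 64*63/2 = 2016
print(66 / 2016)                          # ~0.033, i.e. about 3 %
```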
Machine learning validation
1. The TSA algorithm, followed by a k-nearest-neighbor classifier (with k = 3), was tested on trial-based connectivity data from all frequency bands
2. A 10-fold cross-validation scheme was used (90% for training, 10% for testing)
3. The following table summarizes the average classification rates obtained after applying the above cross-validation scheme 100 times
Table 1. Average classification rates (%) per frequency band.

| Region        | Method | δ     | θ     | α1    | α2    | β1    | β2    | γ     |
|---------------|--------|-------|-------|-------|-------|-------|-------|-------|
| Bilateral PO  | TSA    | 93.59 | 96.70 | 96.13 | 95.00 | 94.40 | 97.59 | 95.86 |
|               | LDA    | 83.28 | 85.65 | 86.42 | 84.12 | 83.49 | 87.21 | 87.34 |
| Left PO       | TSA    | 87.81 | 89.86 | 91.13 | 90.31 | 87.63 | 83.86 | 84.86 |
|               | LDA    | 80.39 | 81.29 | 82.14 | 79.41 | 78.39 | 76.25 | 75.83 |
| Right PO      | TSA    | 91.68 | 90.40 | 90.27 | 87.63 | 86.54 | 85.86 | 87.54 |
|               | LDA    | 79.31 | 77.41 | 80.28 | 79.31 | 76.40 | 76.05 | 76.91 |
The selected options for TSA were:
• Weight mode = heat kernel
• Neighbor mode = 1
• Supervised learning
• Number of dimensions = 3
Table 1 summarizes the classification performance of TSA+kNN and of LDA+kNN, where Linear Discriminant Analysis (LDA) serves as the baseline feature-extraction algorithm for comparison. A sketch of the validation loop follows.
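The repeated 10-fold cross-validation of the 3-NN classifier can be sketched with scikit-learn as below. The feature matrix is assumed to hold, per trial, the flattened $Y_i = U^T X_i V$ (or the LDA-projected vectorised FCG for the baseline); for a strictly unbiased estimate the TSA projection would be fitted on the training folds only, a detail omitted here for brevity.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.neighbors import KNeighborsClassifier

def evaluate(features, labels, n_repeats=100, seed=0):
    """Repeated 10-fold cross-validation of a 3-NN classifier.

    features: (n_trials, n_features), e.g. the flattened Y_i = U^T X_i V.
    labels:   (n_trials,) condition labels (level 1 vs level 5).
    Returns the average classification rate in percent.
    """
    rng = np.random.default_rng(seed)
    scores = []
    for _ in range(n_repeats):
        skf = StratifiedKFold(n_splits=10, shuffle=True,
                              random_state=int(rng.integers(10**6)))
        for tr, te in skf.split(features, labels):
            clf = KNeighborsClassifier(n_neighbors=3)
            clf.fit(features[tr], labels[tr])
            scores.append(clf.score(features[te], labels[te]))
    return 100.0 * float(np.mean(scores))
```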
Conclusions
• We introduced a tensorial approach to brain connectivity analysis
• The introduced methodological approach can find application in many other situations where a brain state has to be decoded from the functional connectivity structure
• After extensive validation, it might prove an extremely useful tool for on-line applications of:
  Human-machine interaction
  Brain decoding
  Mind reading