2007 Pittsburgh Brain Activity Interpretation Competition
Interpreting subject-driven actions and sensory experience in a
rigorously characterized virtual world
Overview / Welcome
Speakers: Walter Schneider & Greg Siegle of the U. of Pittsburgh
Call in speakers:
• Emanuele Olivetti; ITC-IRST, Italy
• Greg Stephens; Princeton U., USA
• Alexis Battle; Stanford U., USA
This is a short audio and graphical overview of the competition.
For details, please see our web page at:
http://www.braincompetition.org
© 2007 University of Pittsburgh
Goals of Competition
• Advance the understanding of how the brain represents and manipulates information dynamically during real-world behaviors
• Show off the “power” of brain imaging
• Enable scientists from many disciplines and nations, including non-imagers, to develop new brain-interpretation methodologies from the same data
• Provide focus, data, and educational materials to expand research in brain interpretation
• Give top groups visibility
Overview of competition
• Who can compete:
– People from many disciplines, nations, and types of positions (students, faculty, scientists, engineers)
– Individuals or groups
– Cross-disciplinary teams
– Classes
(Note: only one cash prize per institution)
Overview of what to do
• You will be examining the brain activity and feature ratings of 3 people operating in a virtual-reality world
• Using fMRI, you will predict what individuals perceive, and how they act and feel, in a novel Virtual Reality world involving searching for and collecting objects, interpreting changing instructions, and avoiding a threatening dog
• Develop classifier systems such that for Run1 and Run2 you can predict the related feature-rating data
• Apply that classifier to the Run3 brain activity data to predict the feature vectors produced during Run3
Brain Activity and Eye Movement Data Collection
• Data were collected at the Brain Imaging Research Center in Pittsburgh: http://www.birc.pitt.edu/
• Brain activity data on 3 subjects during 3 VR runs
• Eye movement data were collected during the runs
Task: Anthropologist Search of VR World
An anthropologist visits a neighborhood collecting artifacts, taking pictures of people with piercings, and avoiding a dog. The interactive environment, with the subject acting in it, provides:
– Perception based on normal interaction in a complex environment
– Differential executive tasking
– Complex multiple-object search
– Interactions with objects
– Threat processing via a snarling dog
– Reward processing with money on the line
– Objective eye-movement-based feature processing
Anthropologist Search Environment
An anthropologist visits a neighborhood collecting artifacts, taking pictures of people with piercings, and avoiding a dog, while visiting multiple streets and interiors.
Anthropologist Search
Four tasks called in on a cell phone:
• 1) Collect a sequence of fruits in order (apple, grapes, banana, pineapple); ignore vegetables
• 2) Collect toy weapons (ignoring other objects such as tools)
Anthropologist Search
• 3) Take pictures of the people in the neighborhood with piercings
• 4) Stay away from the stray dog, which growls before a possible attack and costs real money every time you’re bitten
Rich Graphic Environment
VR2 world copyright Psychology Software Tools Inc.
High Quality VR Graphics
Sample Video of a Run
Task Design
• Train subjects ~4 hours in the environment before the magnet run
• Signs in the environment provide information as to where objects are
• In the magnet, 3 runs of ~20 minutes per subject
• Provide subjects money at the beginning that they can lose
• Pay subjects a bonus to visit all areas
• After each run, subjects rate continuous arousal, emotional valence (positive/negative), and discrete emotions (happy, sad, annoyed/angry, fearful/anxious, neutral)
Scanning
• Similar to last year: 3T Allegra, 30 slices, 1.75 s TR, reverse EPI, full head coverage
• Concurrent eye tracking
• 3 subjects
• Full structural data will be made available
Data Coding
• Behavioral ratings
– Three 1-hr sessions rating, sequentially: arousal, valence, discrete emotions
– Subject and expert rating data: (2 + 6) × 860 (1.75 s samples)
Objective Eye Fixation Coding
• Eye movement traces
– Replay world in object-colorized mode
– Overlay eye fixation on image
– Eye movement track (expect 80% good data) overlaid on world
– Identify fixations (200 ms minimal movement)
– Analyze fixation frame to identify object contact
[Figure: eye-object contact traces over time (s) for Face, Fruit, and Dog objects]
A_Object = Σ_pixels f(eccentricity_{x,y})
A_Feature = A_Object(Feature) / Σ_all A_Object
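A minimal MATLAB sketch of the eye-object contact computation above, assuming hypothetical variables: frame, a fixation frame whose pixels are labeled with object IDs, and w, the matching eccentricity-weight map f(eccentricity_{x,y}). This is an illustration, not the competition's coding pipeline.

objectIDs = [1 2 3];                    % e.g., face, fruit, dog (assumed labels)
A_object  = zeros(1, numel(objectIDs));
for k = 1:numel(objectIDs)
    mask = (frame == objectIDs(k));     % pixels belonging to object k
    A_object(k) = sum(w(mask));         % A_Object = sum over pixels of f(eccentricity)
end
A_feature = A_object ./ sum(A_object);  % A_Feature = A_Object / sum of all A_Object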
Motion, Location, Action Coding
• Code velocity, rotation
• Interior/exterior
• Actions (e.g., grab object)
[Figure: location & action data over time (s): motion, interior/exterior, bar, grab object]
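As one illustration of the motion coding, velocity can be recovered from a position trace. A minimal sketch, assuming a hypothetical T-by-2 matrix pos of (x, y) positions sampled once per 1.75 s TR:

TR  = 1.75;                                       % seconds per sample
vel = [0; sqrt(sum(diff(pos,1,1).^2, 2)) / TR];   % distance per second, per sample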
Required Feature Vectors

Code  Feature           Description                                            Rating Type
R1    Arousal           How much does what is going on in the scene affect     Subjective
                        how calm the subject is
R2    Valence           How positive or negative is the environment            Subjective
R3    Music             Degree to which subject heard music in the             Computed
                        environment
R4    Hits              Times when subject correctly picked up fruit or        Computed
                        weapon or took picture of a pierced person
R5    SearchPeople      Times when subject searched for pierced people         Computed
R6    SearchWeapons     Times when subject searched for weapons                Computed
R7    SearchFruit       Times when subject searched for fruits                 Computed
R8    Instructions      Times when task instructions were presented            Computed
R9    Dog               Times when dog was seen or heard by subject            Computed
R10   Faces             Times when subject looked at faces of a pierced or     Computed
                        unpierced person
R11   FruitsVegetables  Times when subject looked at fruits or vegetables      Computed
R12   WeaponsTools      Times when subject looked at weapons or tools          Computed
R13   InteriorExterior  Times when subject was inside a building               Computed
                        (1=subject was inside, 0=subject was outdoors)
R14   Velocity          Times when subject was moving but not interacting      Computed
                        with an object
Some Brief Background on fMRI
• You will be looking at fMRI data, which effectively low-frequency filters the feature ratings
fMRI Brain Activity Data
BOLD: Blood Oxygenation Level Dependent contrast
[Figure: stimulus → neural pathway → hemodynamics → MR signal; the BOLD % signal change rises and peaks several seconds after the stimulus, time in s]
Feature HRF
[Figure: HRF-convolved feature vectors (Cell, Fruit, Photo, Correct, Vel) plotted as activity over time. Left panel: full run, 0–1400 s; right panel: a 200-second window, 360–560 s.]
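To build such regressors yourself, you can convolve a raw event vector with a hemodynamic response function. A minimal sketch, assuming a 1x704 event vector feature sampled at the 1.75 s TR; the gamma shape below is a generic illustration, not the competition's exact hemodynamic model:

TR  = 1.75;                             % repetition time from the scanning slide
t   = 0:TR:30;                          % ~30 s of HRF support
hrf = t.^5 .* exp(-t);                  % generic gamma-shaped HRF (assumption)
hrf = hrf ./ sum(hrf);                  % normalize to unit area
feat_hrf = conv(feature, hrf);          % smear events through the HRF
feat_hrf = feat_hrf(1:numel(feature));  % trim back to run length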
Example Activation of Cell Phone
Diagram of VR Runs
• Brain activation data: 34 × 64 × 64 × 704 (1.75 s)
• Feature vectors (computed & subjective): 14 × 704 (1.75 s), after hemodynamic lag
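A minimal sketch of getting from such a 4D volume to a voxels-by-time matrix, assuming a hypothetical 4D array bold4d (64x64x34x704; dimension order is an assumption) plus the provided wholebrain_run1 mask; the result should resemble the provided epi_run1 matrix:

v = reshape(bold4d, [], size(bold4d,4));   % (64*64*34) x 704, voxels by time
v = v(wholebrain_run1(:) > 0, :);          % keep in-brain voxels (~35499 x 704)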
Basic Analysis Step
Load VR Run1 Sub1 4D functional data → Preprocess data (spatial & temporal filtering) → Reduce dimensionality of data → Regress 2D activation to feature hemodynamics → Post-process clean-up → Score fit (VR Run1)
Example Linear Prediction Approach
Feature(time) = Activation Table(ROI, time) × BETA
To predict each feature, calculate the betas that linearly predict feature strength from the Activation Table (e.g., the 100 highest-r voxels), where n = number of time points and k = number of brain areas. For a linear system you can solve for the betas with a (pseudo)inverse: beta = (A'A)^(-1) A' Feature. (A MATLAB sketch follows.)
Note: this approach is meant as an illustration only and can be done as an exercise to learn to work with the data.
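A minimal MATLAB sketch of this illustrative linear approach, assuming hypothetical variables A (the n-by-k activation table for the selected voxels) and f (the n-by-1 HRF-convolved feature vector):

X    = [ones(size(A,1),1) A];   % add an intercept column
beta = X \ f;                   % least-squares solution (pseudoinverse)
fhat = X * beta;                % predicted feature time course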
Developing/training techniques
• Use the data from Run1 & Run2 to develop the ability to go from the brain activation data to the feature data (see the sketch below)
[Diagram: VR Run1 and VR Run2 data feed the training step]
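For example (a hedged sketch, with hypothetical A1/A2 activation tables and f1/f2 feature vectors; A1 and A2 must contain the same voxels in the same order):

beta  = [ones(size(A1,1),1) A1] \ f1;    % fit betas on Run1
f2hat = [ones(size(A2,1),1) A2] * beta;  % predict the Run2 feature
c = corrcoef(f2hat, f2);                 % how well does it generalize?
fprintf('Run2 r = %.2f\n', c(1,2));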
Prediction of rating data for VR Run3
You do not know what VR events occurred.
[Diagram: Run3 brain activation with unknown (?) feature vectors]
• Brain activation data: 34 × 64 × 64 × 704 (1.75 s)
• Subject and expert rating data: 23 × 704 (1.75 s), after hemodynamic lag
• Use your techniques to predict the rating data
Using MATLAB to process the data
• Load our provided data files into MATLAB:

>> load sub1_run1_baseregs.mat
>> load sub1_run1_fmri.mat
>> load sub1_run1_vr_mask.mat
>> whos
  Name                 Size        Bytes      Class
  baseregs_conv_run1   1x704       5632       double array
  baseregs_run1        1x704       5632       double array
  epi_run1             35499x704   199930368  double array
  featurenames_run1    1x1         78         cell array
  wholebrain_run1      64x64x34    1114112    double array

[Plots: a feature regressor and a raw voxel time course]
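A quick look at the loaded variables, assuming (from the names) that baseregs_run1 is a raw feature vector and baseregs_conv_run1 is its HRF-convolved version:

disp(featurenames_run1)                % names of the provided regressors
figure; plot(baseregs_run1); hold on   % raw feature vector
plot(baseregs_conv_run1)               % HRF-convolved version (assumed)
figure; plot(epi_run1(1,:))            % time course of the first voxel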
Regression in MATLAB
Detrend:

for ct=1:size(epi_run1,1)
    epi_run1(ct,:)=detrend(epi_run1(ct,:));
end

[Plot: a voxel time course before and after detrending]

Regress:

% get zero-order relationships
corrs=zeros(1,size(epi_run1,1));        % preallocate
for ct=1:size(epi_run1,1)
    xy=corrcoef(baseregs_run1(1,:),epi_run1(ct,:));
    corrs(ct)=xy(1,2);
end
% find the best voxels (i.e., with |correlation| > .25)
inds=find(abs(corrs)>.25);
voxtouse=epi_run1(inds,:);
% simultaneously regress all the voxels against the feature vector
[Rsq,B,B0,Ypred]=mreg(voxtouse',baseregs_conv_run1');

[Plots: correlation by voxel #; regressor vs. prediction overlay: R² = .69!]
Plot the activations that were predictive

% make a brain image by overlaying the predictions on the wholebrain image
goodvox=find(wholebrain_run1);
predbrain=zeros(size(wholebrain_run1));
for ct=1:length(goodvox)
    predbrain(goodvox(ct))=corrs(ct).*(corrs(ct)>.25);
end
% plot a few slices to see the activations
for ct=11:15
    subplot(1,5,ct-11+1);
    pcolor(wholebrain_run1(:,:,ct)'+5.*predbrain(:,:,ct)');
    shading interp; colormap bone;
    view(-180,90); axis off;
end
Or, use the MVPA toolkit from http://www.csbmb.princeton.edu/mvpa/
• Works with our MATLAB data
• Creates a subject record in which it’s easy to reference features
• Allows detrending, whole-brain regressions, etc. from within a toolkit
• Includes the whole process the 2nd-place group used last year!
Example functional data sets
We provide multiple formats to minimize start-up time and to let those without specific brain imaging experience benefit from experts on the preprocessing stages. For example, computer-science data-mining students entering the competition may have no brain imaging experience; for them, we will make the data available preprocessed.
Formats provided: DICOM slices, Analyze volumes, or MATLAB matrices.
What we’ll score
• Top 3 scores:
– Those with the highest average correlation of predictions to Run3 feature vectors (see the sketch below)
– There are 14 extra-credit features; we’ll average in your top 5, so you can tune your exploration to the features you’re interested in
• Special Cognitive Neuroscience Prize:
– The group with the best “story” that informs cognitive neuroscience
– E.g., how functional connectivity leads to features, or how the dog interferes with search behaviors
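A minimal sketch of the stated scoring rule, assuming hypothetical feature-by-time matrices pred and actual for Run3:

nf = size(pred,1);                 % e.g., the 14 required features
r  = zeros(1,nf);
for k = 1:nf
    c = corrcoef(pred(k,:), actual(k,:));
    r(k) = c(1,2);                 % per-feature correlation
end
score = mean(r);                   % entries are ranked by average correlation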
Submission
• Submit predicted Run3 data to the competition before May 21, 2007
– Note: you are only allowed to submit the Run3 predictions 3 times
– Test out your methods on the Run1 and Run2 data sets
• Submission enabled April 16
Keep this a Collegial Competition
• You might work with multiple people at your site
• We will be providing resources (e.g., readings, additional formats, routines) that people wish to share
• If you have a processing step you would like others to try and comment on, contribute it
• We will be doing special events (web conferences) if requested
Keep in Contact
• We will provide postings on the discussion board and major notices via email
• We will be posting updates on procedures and corrections of documents
Hope To See You In Chicago
• We will present awards June 14, 2007 at the Organization for Human Brain Mapping meeting in Chicago, Illinois, USA
• http://www.humanbrainmapping.org/
• We have space for posters and an hour-long workshop
Competition Board of Scientists
G. Siegle & W. Schneider (University of Pittsburgh, coordinating site)
A. Bartels (Max Planck Institute for Biological Cybernetics)
E. Formisano & R. Goebel (Maastricht University)
J. Haxby & G. Stephens (Princeton University)
U. Hasson (New York University & Weizmann Institute)
T. Mitchell (Carnegie Mellon University)
T. Nichols (University of Michigan)
A. Battle (Stanford University)
E. Olivetti (ITC-IRST, Italy)
Credit & Contact
• For questions, e-mail ebc@pitt.edu
• Experience Based Cognition team members:
– PIs: Walter Schneider & Greg Siegle
• Pittsburgh technical staff: Kate Fissell, Lena Gemmer, Kevin Jarbo, Dan Jones, Lori Koerbel, Kyung Hwa Lee, Adrienne McGrail, Maureen McHugo, Sudhir Pathak, David Pfendt, Melissa Thomas
• Psychology Software Tools Inc. for the VR worlds: Kyle Brauch, Tom Yothers
Comments from the Winners
Diego Sona(1), Greg Siegle(2), Emanuele Olivetti(1), Sriharsha Veeramachaneni(1), Alexis Battle(3), Greg Stephens(4), Walter Schneider(2)
Temporal and Cross-Subject Probabilistic Models for fMRI Prediction Tasks
Alexis Battle, Gal Chechik, Daphne Koller
Stanford University AI, http://ai.stanford.edu
Predicting Base Features with Supervoxels
Top (left-to-right): Ken Norman, Denis Chigirev, Matt Weber, Shannon Hughes, Eugene Brevdo, Melissa Carroll. Bottom (left-to-right): Christopher Moore, Greg Detre, Greg Stephens
Center for the Study of Brain, Mind and Behavior: http://www.csbmb.princeton.edu
Gaussian Process Regression and Recurrent Neural Networks for fMRI Image Classification
Emanuele Olivetti, Sriharsha Veeramachaneni, Diego Sona
ITC/IRST (Center for Scientific and Technological Research), a public research institute located in Trento in northern Italy: http://sra.itc.it/
Answering Questions
For competition details, see http://www.BrainCompetition.org