Learning from Multiple Outlooks
Maayan Harel and Shie Mannor
ICML 2011
Presented by Minhua Chen
What You Saw is Not What You Get: Domain Adaptation Using Asymmetric Kernel Transforms
CVPR 2011
Introduction
• A learning task often involves multiple
representations, also called domains or outlooks.
• For example, in activity recognition, each user
(outlook) may use different sensors.
• There is no sample correspondence or feature
correspondence across outlooks; only the label space
(the classification task) is shared.
• The goal is to use the information in all outlooks to
improve learning performance.
• The approach is to map the data in one outlook
(source) to another (target), so that the effective
sample size is enlarged in the target domain.
[Figure: labeled data (*) and unlabeled data (squares) in the target domain; a classifier trained on the labeled target data alone; labeled source data (+) mapped into the target domain; a new classifier trained on both the labeled target data and the transferred source data.]
Problem Formulation
• The central question is how to map data from one domain
to the other, possibly with different feature dimensions.
• The authors propose an algorithm that computes an
optimal affine mapping by matching the moments of the
empirical distribution of each class.
[Figure: data mapped from the source domain to the target domain.]
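As an illustrative sketch of the moment-matching idea (not the paper's exact algorithm), the snippet below fits an affine map that aligns the empirical mean and covariance of one class in the source outlook to the corresponding class in the target; for simplicity it assumes the two outlooks share the same feature dimension:

```python
import numpy as np

def moment_matching_map(X_src, X_tgt):
    """Affine map (A, b) such that A @ x + b matches the empirical
    first and second moments of X_tgt when applied to X_src.
    Illustrative sketch only; assumes equal feature dimensions."""
    mu_s, mu_t = X_src.mean(axis=0), X_tgt.mean(axis=0)
    # Small ridge keeps the Cholesky factorization well-conditioned.
    eps = 1e-6
    cov_s = np.cov(X_src, rowvar=False) + eps * np.eye(X_src.shape[1])
    cov_t = np.cov(X_tgt, rowvar=False) + eps * np.eye(X_tgt.shape[1])
    # Whiten with the source covariance, then color with the target's.
    L_s = np.linalg.cholesky(cov_s)
    L_t = np.linalg.cholesky(cov_t)
    A = L_t @ np.linalg.inv(L_s)   # linear part of the affine map
    b = mu_t - A @ mu_s            # offset aligning the means
    return A, b
```

In practice one such map would be estimated per class and the per-class maps combined into a single transform, as in the paper's formulation.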
Mathematical Solution
• Procrustes analysis can be applied to solve for the
transform Ri.
• The formulation can be extended to multiple outlooks.
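The orthogonal-Procrustes step has a standard closed-form solution via the SVD; a minimal sketch (the paper couples this with the moment-matching objective, so this is the generic solver rather than the full algorithm):

```python
import numpy as np

def procrustes_rotation(S, T):
    """Orthogonal Procrustes: the orthogonal R minimizing ||R @ S - T||_F,
    for paired statistics S and T (columns matched). Standard SVD solution:
    R is the orthogonal polar factor of T @ S.T."""
    M = T @ S.T
    U, _, Vt = np.linalg.svd(M)
    return U @ Vt
```

If S has full row rank and T = R S for some orthogonal R, this recovers R exactly.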
Experiments
• Activity recognition task with the following human
activities: walking, running, going upstairs, going
downstairs, lingering.
• Data recorded by different users are regarded as
different outlooks (domains), since the sensors used are
different.
• Two setups are examined: domain adaptation with
shared feature space, and multiple outlooks with
different feature spaces.
• The authors test the success of the mapping
algorithm by classifying the target test data with an
SVM classifier trained on the mapped source data.
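The evaluation protocol can be sketched as follows; the paper uses an SVM, but to keep this snippet self-contained it substitutes a simple nearest-class-centroid classifier (train on the mapped source data, test on held-out target data):

```python
import numpy as np

def nearest_centroid_accuracy(X_train, y_train, X_test, y_test):
    """Stand-in for the paper's SVM evaluation: fit class centroids on
    the (mapped-source) training set, then report accuracy on the
    target test set."""
    classes = np.unique(y_train)
    centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in classes])
    # Distance from every test point to every class centroid.
    d = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
    preds = classes[np.argmin(d, axis=1)]
    return float((preds == y_test).mean())
```

A higher accuracy than training on the (small) labeled target set alone would indicate that the mapped source samples are useful, which is the effective-sample-size argument from the introduction.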
What You Saw is Not What You Get:
Domain Adaptation Using
Asymmetric Kernel Transforms
B. Kulis, K. Saenko and T. Darrell, CVPR
2011
Main Idea
Kernelization