Learning on the Fly: Rapid Adaptation to the Image
Erik Learned-Miller
with Vidit Jain, Gary Huang, Laura Sevilla Lara, Manju Narayana, Ben Mears
Computer Science Department

"Traditional" machine learning
Learning happens from large data sets:
• With labels: supervised learning
• Without labels: unsupervised learning
• Mixed labels: semi-supervised learning, transfer learning, learning from one (labeled) example, self-taught learning, domain adaptation

Learning on the Fly
Given:
• A learning machine trained with traditional methods
• A single test image (no labels)
Learn from the test image!
• Domain adaptation where the "domain" is the new image
• No covariate-shift assumption
• No new labels

An Example in Computer Vision
Parsing Images of Architectural Scenes (Berg, Grabler, and Malik, ICCV 2007):
• Detect easy or "canonical" stuff.
• Use the easily detected stuff to bootstrap models of harder stuff.

Claim
This is so easy and routine for humans that it's hard to realize we're doing it.
• Another example…

Learning on the fly…

What about traditional methods…
A hidden Markov model for text recognition:
• Appearance model for characters
• Language model for labels
• Use Viterbi to do joint inference
DOESN'T WORK! Prob(character image | Label = A) cannot be well estimated, fouling up the whole process.

Lessons
We must assess when our models are broken, and use other methods to proceed.
• Current methods of inference assume the probabilities are correct!
• "In vision, probabilities are often junk."
• Related to similarity becoming meaningless beyond a certain distance.

Two Examples
• Face detection (CVPR 2011)
• OCR (CVPR 2010)

Preview of results: finding false negatives (Viola-Jones vs. Learning on the Fly)

Eliminating false positives (Viola-Jones vs. Learning on the Fly)

Run a pre-existing detector…
Key: face, non-face, close to boundary

Gaussian Process Regression
• Learn a smooth mapping from appearance to detector score, using confident positives and negatives from the image
• Apply the mapping to the borderline patches
(a minimal code sketch of this step appears at the end of these notes)

Major Performance Gains

Comments
• No need to retrain the original detector (it wouldn't change anyway!)
• No need to access the original training data
• Still runs in real time
• GP regression is done for every new image

Noisy Document: Initial Transcription
"We fine herefore t linearly rolatcd to the when this is calculated equilibriurn. In short, on the null-hypothesis:"

Premise
We would like to find confident words to build a document-specific model, but it is difficult to estimate Prob(error). However, we can bound Prob(error). Then select words with:
• Prob(error) < epsilon.
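The selection step above can be pictured with a short sketch. This is a minimal illustration, not the paper's actual bound: it assumes a hypothetical per-character error bound (error_bound) is available, combines per-character bounds with a simple union bound, and keeps only words whose bound falls below epsilon.

```python
# Minimal sketch (not the paper's exact procedure): given a per-character
# upper bound on error probability, a union bound gives an upper bound on the
# probability that a whole word is transcribed incorrectly. Words whose bound
# falls below epsilon form the "clean set".

from dataclasses import dataclass
from typing import List


@dataclass
class Char:
    label: str            # recognizer's character hypothesis
    error_bound: float    # assumed upper bound on P(this character is wrong)


@dataclass
class Word:
    chars: List[Char]

    @property
    def text(self) -> str:
        return "".join(c.label for c in self.chars)


def word_error_bound(word: Word) -> float:
    """Union bound: P(any character wrong) <= sum of per-character bounds."""
    return min(1.0, sum(c.error_bound for c in word.chars))


def clean_set(words: List[Word], epsilon: float = 0.01) -> List[Word]:
    """Keep only words whose bounded error probability is below epsilon."""
    return [w for w in words if word_error_bound(w) < epsilon]


if __name__ == "__main__":
    words = [
        Word([Char("t", 0.001), Char("h", 0.002), Char("e", 0.001)]),
        Word([Char("e", 0.001), Char("q", 0.300), Char("u", 0.002)]),  # dubious 'q'
    ]
    for w in clean_set(words, epsilon=0.01):
        print(w.text)  # prints "the"; the second word's bound exceeds epsilon
```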
"Clean Sets"

Document-Specific OCR
• Extract clean sets (error-bounded sets)
• Build document-specific models from the clean-set characters
• Reclassify the other characters in the document (see the sketch after the summary)
• 30% error reduction on 56 documents

Summary
• Many applications of learning on the fly.
• Adaptation and bootstrapping of new models is more common in human learning than is generally believed.
• Starting to answer the question: "How can we do domain adaptation from a single image?"
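As referenced above, here is a minimal sketch of the document-specific reclassification step. It makes assumptions the slides do not spell out: each character image is reduced to a fixed-length feature vector, and the "document-specific model" is a simple nearest-mean classifier built from the clean-set characters; the actual system's character models may differ.

```python
# Sketch only: build per-character centroids from clean-set characters, then
# relabel the remaining characters by nearest centroid in feature space.

import numpy as np
from collections import defaultdict
from typing import Dict, List, Tuple


def build_document_models(
    clean_chars: List[Tuple[str, np.ndarray]]
) -> Dict[str, np.ndarray]:
    """Average the clean-set feature vectors for each character label."""
    grouped = defaultdict(list)
    for label, feats in clean_chars:
        grouped[label].append(feats)
    return {label: np.mean(vecs, axis=0) for label, vecs in grouped.items()}


def reclassify(
    models: Dict[str, np.ndarray], unknown_chars: List[np.ndarray]
) -> List[str]:
    """Relabel each remaining character with its nearest document-specific model."""
    labels = list(models.keys())
    centroids = np.stack([models[l] for l in labels])
    out = []
    for feats in unknown_chars:
        dists = np.linalg.norm(centroids - feats, axis=1)
        out.append(labels[int(np.argmin(dists))])
    return out
```

In the talk's pipeline, these document-specific labels replace the generic recognizer's output for the uncertain characters, which is the step behind the reported 30% error reduction on 56 documents.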
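Finally, the Gaussian Process Regression step from the face-detection example can be sketched the same way. The patch features, detector scores, and decision threshold here are hypothetical; the slides do not give the features or GP setup used in the CVPR 2011 system. The sketch only shows the stated idea: confident detections and confident non-faces from the current image supervise a smooth appearance-to-score map, which is then used to rescore the borderline patches.

```python
# Sketch of per-image rescoring with Gaussian Process regression, assuming
# hypothetical patch feature vectors and detector scores.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF


def rescore_borderline(
    confident_feats: np.ndarray,   # (n, d) features of confident face / non-face patches
    confident_scores: np.ndarray,  # (n,) their detector scores
    borderline_feats: np.ndarray,  # (m, d) features of near-threshold patches
    threshold: float = 0.0,
) -> np.ndarray:
    """Return a boolean mask over borderline patches: True means accept as a face."""
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(confident_feats, confident_scores)   # smooth appearance-to-score map
    new_scores = gp.predict(borderline_feats)   # rescore the ambiguous patches
    return new_scores > threshold
```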