Announcements

• HW 6: Written (not programming) assignment.
  – Assigned today; due Friday, Dec. 9. E-mail it to me.
• Quiz 4: OPTIONAL take-home quiz, open book.
  – If you're happy with your quiz grades so far, you don't have to take it. (Grades from the four quizzes will be averaged.)
  – Assigned Wednesday, Nov. 30; due Friday, Dec. 2 by 5pm. (E-mail or hand in to me.)
  – The quiz could cover any material from previous quizzes.
  – The quiz is designed to take you one hour maximum (but you can work on it for as much time as you want, until Friday at 5pm).

Topics we covered

• Turing Test
• Uninformed search
  – Methods
  – Completeness, optimality
  – Time complexity
• Informed search
  – Heuristics
  – Admissibility of heuristics
  – A* search
• Game playing
  – Notion of a game tree, ply
  – Evaluation function
  – Minimax
  – Alpha-beta pruning
• Natural-language processing
  – N-grams
  – Naïve Bayes for text classification
  – Support vector machines for text classification
  – Latent semantic analysis
  – Watson question-answering system
  – Machine translation
• Speech recognition
  – Basic components of a speech-recognition system
• Perceptrons and neural networks
  – Perceptron learning and classification
  – Multilayer perceptron learning and classification
• Genetic algorithms
  – Basic components of a GA
  – Effects of parameter settings
• Vision
  – Content-based image retrieval
  – Object recognition
• Analogy-making
  – Basic components of Copycat, as described in the slides and reading
• Robotics
  – Robotic cars (as described in the reading)
  – Social robotics (as described in the reading)

Reading for this week (links on the class website)

• S. Thrun, Toward Robotic Cars
• C. Breazeal, Toward Sociable Robotics
• R. Kurzweil, The Singularity is Near: Book Precis
• D. McDermott, Kurzweil's argument for the success of AI

Robotic Cars

• http://www.ted.com/talks/sebastian_thrun_google_s_driverless_car.html
• http://www.youtube.com/watch?v=lULl63ERek0
• http://www.youtube.com/watch?v=FLi_IQgCxbo

From S. Thrun, Toward Robotic Cars (figures)

Examples of Components of Stanley / Junior

• Localization: Where am I?
  – Establish correspondence between the car's present location and a map.
  – GPS does part of this but can have estimation error of > 1 m.
  – To get better localization, relate features visible in laser scans to map features.
• Obstacles: Where are they?
  – Static obstacles: build "occupancy grid maps".
  – Moving obstacles: identify with "temporal differencing" over sequential laser scans, then use "particle filtering" to track them.
  – "Particle filter": related to the Hidden Markov Model.

Particle Filters for Tracking Moving Objects
From http://cvlab.epfl.ch/teaching/topics/ (figures)

• Path planning:
  – "Structured navigation" (on a road with lanes):
    • "Junior used a dynamic-programming-based global shortest path planner, which calculates the expected drive time to a goal location from any point in the environment. Hill climbing in this dynamic-programming function yields paths with the shortest expected travel time."
  – "Unstructured navigation" (e.g., parking lots, U-turns):
    • Junior used a fast, modified version of the A* algorithm. This algorithm searches shortest paths relative to the vehicle's map, using search trees. (A minimal grid-based A* sketch follows below.)

From M. Montemerlo et al., Junior: The Stanford Entry in the Urban Challenge (figures)
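The slide above quotes Junior's use of a modified A* search for unstructured navigation. As a reminder of the basic algorithm (covered earlier in the course under informed search), here is a minimal grid-based A* sketch. It is only an illustration under simplifying assumptions (4-connected occupancy grid, unit step costs, Manhattan-distance heuristic), not Junior's actual planner.

```python
# Minimal grid-based A* path-planning sketch (illustrative only -- not Junior's planner).
# Assumptions: 4-connected occupancy grid, unit step costs, Manhattan-distance heuristic.
import heapq

def astar(grid, start, goal):
    """Shortest path from start to goal on a 0/1 occupancy grid (1 = blocked), or None."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):                                # admissible heuristic: Manhattan distance
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(h(start), 0, start)]           # heap of (f = g + h, g, cell)
    came_from = {start: None}
    best_g = {start: 0}

    while frontier:
        _, g, current = heapq.heappop(frontier)
        if current == goal:                     # reconstruct the path by walking back
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (current[0] + dr, current[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                new_g = g + 1
                if new_g < best_g.get(nxt, float("inf")):
                    best_g[nxt] = new_g
                    came_from[nxt] = current
                    heapq.heappush(frontier, (new_g + h(nxt), new_g, nxt))
    return None                                 # goal unreachable

# Example: plan around a small wall of obstacles.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))   # [(0,0), (0,1), (0,2), (1,2), (2,2), (2,1), (2,0)]
```

Because the Manhattan distance never overestimates the true remaining cost on this grid, the heuristic is admissible, so A* returns an optimal path (tying back to the "admissibility of heuristics" topic above).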
New York Times: "Google lobbies Nevada to allow self-driving cars"
http://www.nytimes.com/2011/05/11/science/11drive.html

Sociable Robotics

Kismet
Kismet and Rich (photos)

What can Kismet do?
• Vision
• Visual attention
• Speech recognition (emotional tone)
• Speech production (prosody)
• Speech turn-taking
• Head and face movements
• Facial expression
• Keeping appropriate "personal space"

Overview and Hardware

Expression examples

Perceiving "affective intent"
From Recognition of Affective Communicative Intent in Robot-Directed Speech, C. Breazeal and L. Aryananda (figures)

Vision system
From A context-dependent attention system for a social robot, C. Breazeal and B. Scassellati (figures)

External influences on attention
(From people.csail.mit.edu/paulfitz/present/social-constraints.ppt)
• Pre-attentive filters (skin tone, color, motion, habituation) applied to the current input are weighted by behavioral relevance to produce a saliency map.
• Attention is allocated according to salience.
• Salience can be manipulated by shaking an object, bringing it closer, moving it in front of the robot's current locus of attention, object choice, hiding distractors, …

Vision System: Attention

Internal influences on attention
(From people.csail.mit.edu/paulfitz/present/social-constraints.ppt)
• "Seek toy" behavior: low skin-tone gain, high saturated-color gain; looking time 28% face, 72% block.
• "Seek face" behavior: high skin-tone gain, low color-saliency gain; looking time shifts toward the face.
• Internal influences bias how salience is measured: the robot is not a slave to its environment. (A minimal sketch of this weighted-saliency idea appears at the end of these notes.)

Attention: Gaze direction

Attention System

Negotiating interpersonal distance
(From people.csail.mit.edu/paulfitz/present/social-constraints.ppt)
• Distance zones, nearest to farthest: too close (withdrawal response) → comfortable interaction distance → too far (calling behavior) → beyond sensor range; the person backs off or draws closer in response.
• The robot establishes a "personal space" through expressive cues.
• It tunes the interaction to suit its vision capabilities.

Negotiating personal space

Negotiating object showing
(From people.csail.mit.edu/paulfitz/present/social-constraints.ppt)
• Presentation responses: comfortable interaction speed; too fast → irritation response; too fast and too close → threat response.
• The robot conveys preferences about how objects are presented to it through irritation and threat responses.
• Again, this tunes the interaction to suit its limited vision.
• It also serves a protective role.

Turn-Taking
(Adapted from people.csail.mit.edu/paulfitz/present/social-constraints.ppt)
• Cornerstone of human-style communication, learning, and instruction.
• Phases of the turn cycle:
  – Listen to speaker: hold eye contact
  – Reacquire floor: break eye contact and/or lean back a bit
  – Speak: vocalize
  – Hold the floor: look to the side
  – Stop one's speaking turn: stop vocalizing and re-establish eye contact
  – Relinquish floor: raise brows and lean forward a bit

Conversational turn-taking

Web page for all these videos: http://www.ai.mit.edu/projects/sociable/videos.html

How to evaluate Kismet? What are some applications for Kismet and its descendants?

Leonardo
http://www.youtube.com/watch?v=ilmDN2e_Flc
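Looking back at the Kismet attention slides above: salience there is a weighted combination of pre-attentive feature maps (skin tone, color, motion, habituation), with behavior-dependent gains ("seek toy" vs. "seek face") biasing which features win. The sketch below illustrates only that weighted-sum idea; the feature maps, gain values, and function names are hypothetical and are not Kismet's actual code.

```python
# Minimal saliency-based attention sketch (illustrative only -- not Kismet's code).
# Assumptions: each pre-attentive filter yields a 2-D feature map in [0, 1];
# behavior-dependent gains weight the maps before they are summed.
import numpy as np

def saliency_map(feature_maps, gains):
    """Weighted sum of pre-attentive feature maps."""
    total = np.zeros_like(next(iter(feature_maps.values())))
    for name, fmap in feature_maps.items():
        total = total + gains.get(name, 0.0) * fmap
    return total

def gaze_target(saliency):
    """(row, col) of the most salient location -- where the robot directs its gaze."""
    return np.unravel_index(np.argmax(saliency), saliency.shape)

# Toy feature maps on a 4x4 "retina": a skin-toned face region and a saturated-color block.
skin  = np.zeros((4, 4)); skin[0, 1]  = 1.0     # face near the top of the image
color = np.zeros((4, 4)); color[3, 2] = 1.0     # brightly colored toy near the bottom
maps = {"skin": skin, "color": color,
        "motion": np.zeros((4, 4)), "habituation": np.zeros((4, 4))}

# Hypothetical gain settings for two behaviors (values invented for illustration).
seek_toy  = {"skin": 0.1, "color": 0.9, "motion": 0.5, "habituation": -0.3}
seek_face = {"skin": 0.9, "color": 0.1, "motion": 0.5, "habituation": -0.3}

print(gaze_target(saliency_map(maps, seek_toy)))    # -> (3, 2): the block wins
print(gaze_target(saliency_map(maps, seek_face)))   # -> (0, 1): the face wins
```

With the "seek toy" gains the colorful block location wins; with the "seek face" gains the skin-toned location wins. This is the sense in which internal influences bias how salience is measured, so the robot is not a slave to its environment.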