Eye Tracking and Point of
Gaze Estimation
Ryan Dusina
Problem: Tracking What Humans Track
● Accurate estimation of what a person is focusing on visually
● Identifying facial features isn’t enough
● Feature detection must be extended in order to estimate gaze
Feature Detection
Application - Marketing
Application - UI Design
Application - Psychology
Other Applications
● Fatigue detection in vehicles
● Laser refractive surgery
● New interaction methods (user intention)
● Training (driving, sports, military / police, etc.)
● Medium of communication for disabled people
● Field sobriety testing
● Many more...
Implementation Challenges
● Eye structure is complex
● Adjusting for motion and head pose
● Sensor calibration
● Individual user calibration
● User distance
● Ambiguous depth fixation
● Changing light conditions
● Eyelids
● Lazy eyes
● Noise
Challenge - Eye Structure
Challenge - Ambiguous Depth
Strategy 1: User Mounted Sensors
Approaches:
● Cameras attached to head
  ○ Outward / inward
● Specialized contact lenses
  ○ Accurate yet expensive
● Electromagnetic sensors (not vision related)
  ○ Works with eyes closed
  ○ Not good at determining fixation alone
  ○ Possible aid to other techniques
Strategy 2: Optical Tracking
Preferred Method
● Less invasive
● Easier to use
● Cheaper
Trends in Accurate Models
● Infrared / near-infrared imaging
● Structured infrared light source
● Purkinje images
● Hough transforms (or similar)
● Stereo cameras
● Facial feature detection
● Pose estimation
● Calibration per user
Stereo, Feature Detection, and Pose
Gaze detection is very sensitive to head movement...
Idea:
● Use feature detection to track known features
● Use stereo vision to estimate depth
● Use this information to estimate pose
● Incorporate this into equations for:
  ○ Improved accuracy
  ○ Free range of motion (to an extent)
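A minimal sketch of that pose step, assuming 2D facial landmarks already come from a feature detector; the 3D face-model coordinates, the focal-length guess, and all names here are illustrative assumptions rather than details from the slides:

```python
import cv2
import numpy as np

# Illustrative 3D face-model points in mm (an assumption for this
# sketch): nose tip, chin, eye corners, mouth corners.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),          # nose tip
    (0.0, -63.6, -12.5),      # chin
    (-43.3, 32.7, -26.0),     # left eye outer corner
    (43.3, 32.7, -26.0),      # right eye outer corner
    (-28.9, -28.9, -24.1),    # left mouth corner
    (28.9, -28.9, -24.1),     # right mouth corner
])

def estimate_head_pose(image_points, frame_size):
    """Recover head rotation / translation from six 2D landmarks.

    image_points: (6, 2) float array of detected landmark pixels,
    ordered to match MODEL_POINTS (the landmark detector itself is
    assumed to exist upstream).
    """
    h, w = frame_size
    focal = w  # crude focal-length guess; a real system calibrates this
    camera_matrix = np.array([[focal, 0, w / 2],
                              [0, focal, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    dist_coeffs = np.zeros(4)  # assume negligible lens distortion
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points,
                                  camera_matrix, dist_coeffs)
    return rvec, tvec  # Rodrigues rotation vector + translation
```

With stereo cameras, measured depth can stand in for the guessed camera geometry, which is what makes limited free head motion workable.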
Infrared / Near-Infrared Imaging
Very easy to identify pupil / reconstruct with Hough transforms
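As a rough illustration: under IR illumination the pupil shows up as a dark, nearly perfect circle, which suits OpenCV's Hough circle transform directly. Every parameter value below is an illustrative starting point, not a tuned setting from any cited system:

```python
import cv2

def find_pupil(ir_frame):
    """Locate the pupil in a near-infrared eye image (BGR frame)."""
    gray = cv2.cvtColor(ir_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # suppress sensor noise before voting
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=100, param1=60, param2=30,
                               minRadius=10, maxRadius=60)
    if circles is None:
        return None
    x, y, r = circles[0][0]          # strongest circle candidate
    return int(x), int(y), int(r)    # pupil center and radius in pixels
```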
Infrared Light Source / Purkinje
Helps estimate relative eye rotation and pose
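Many gaze models work from the vector between the pupil center and the corneal glint (the first Purkinje image of the IR point source), since that vector rotates with the eye but is fairly insensitive to small head translations. A rough sketch, assuming a grayscale eye crop and a known pupil center; the threshold value is an assumption:

```python
import cv2
import numpy as np

def pupil_glint_vector(gray_eye, pupil_center):
    """Compute the pupil-to-glint vector used by many gaze models."""
    # The corneal reflection of a point IR source is a small, very
    # bright spot; isolate it with a high threshold (illustrative).
    _, bright = cv2.threshold(gray_eye, 240, 255, cv2.THRESH_BINARY)
    ys, xs = np.nonzero(bright)
    if len(xs) == 0:
        return None                   # no glint visible in this frame
    glint = (xs.mean(), ys.mean())    # centroid of the bright spot
    return (glint[0] - pupil_center[0],
            glint[1] - pupil_center[1])
```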
User Calibration
Account for scenario / biological differences to improve accuracy.
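One common calibration scheme, sketched here as an assumption rather than the method of any particular cited system, fits a low-order polynomial from measured gaze features (e.g. pupil-glint vectors) to known on-screen target positions:

```python
import numpy as np

def fit_calibration(gaze_features, screen_points):
    """Fit a per-user 2nd-order polynomial gaze-to-screen mapping.

    gaze_features: (N, 2) feature vectors at calibration targets
    screen_points: (N, 2) known on-screen target positions
    """
    x, y = gaze_features[:, 0], gaze_features[:, 1]
    # Design matrix of polynomial terms: 1, x, y, xy, x^2, y^2
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def apply_calibration(coeffs, feature):
    x, y = feature
    terms = np.array([1, x, y, x * y, x**2, y**2])
    return terms @ coeffs  # predicted (screen_x, screen_y)
```

With six unknowns per axis, at least six targets are required; a nine-point grid gives a comfortable margin.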
My Project… Much Simpler
Goal:
Consistently track the pupils / irises to quantify movement of the eyes. Hopefully, I can apply this data to get a rough estimate of my gaze on my computer screen.
Assumptions
● User’s head is always in a fixed location
● Distance from camera is measured manually
● Consistently well-lit environment
First Approach
1. Set up a fixed location for:
   a. Camera
   b. Computer monitor
   c. Face
2. Took physical measurements:
   a. Approx. 15” from camera
   b. Screen dimensions: approx. 15” x 8.5”
   c. Tried to minimize camera angle relative to face
3. Created calibration points...
...First Approach
4. Took calibration selfies of how my eyes looked for each of the points.
5. Found centroids of pupils and measured relative pixel locations.
6. Next step was to use pupil location vectors to measure displacement.
7. Finally, map the displacement onto a computer screen location (sketched below).
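A rough sketch of steps 5-7 under the stated assumptions (fixed head, ~15” x 8.5” screen, hand-measured geometry). The ±15-pixel displacement range matches the numbers reported later in these slides; the threshold value and function names are illustrative:

```python
import cv2

SCREEN_W_IN, SCREEN_H_IN = 15.0, 8.5  # measured screen size (inches)

def pupil_centroid(eye_gray):
    """Step 5: threshold the dark pupil region and take the blob
    centroid via image moments (threshold value is illustrative)."""
    _, mask = cv2.threshold(eye_gray, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None  # nothing dark enough found
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def displacement_to_screen(d, d_max=(15.0, 15.0)):
    """Steps 6-7: linearly map a pupil displacement (pixels from the
    center-gaze calibration point) onto screen coordinates. d_max is
    the displacement seen when looking at the screen edges, roughly
    +/-15 px in this setup."""
    sx = (d[0] / d_max[0] + 1.0) * SCREEN_W_IN / 2
    sy = (d[1] / d_max[1] + 1.0) * SCREEN_H_IN / 2
    return sx, sy  # inches from the top-left corner of the screen
```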
What Went Wrong?
● Even with a decent webcam at a relatively close distance, pixel displacement was small…
● About a maximum of 30 pixel differences
● Only about 15 pixels to work with in any direction from center
● Even if I could get accurate measurements of just 3 pixel differences...
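To put rough numbers on it: if ~30 pixels of pupil travel spans the full ~15” screen width, each pixel of measured displacement corresponds to about half an inch of screen, so a 3-pixel measurement error already means roughly 1.5” of gaze error, before head motion or glare are even considered.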
5 Pixels?
What Went Wrong?
● Incredibly difficult to assume head will remain in a fixed location.
● Glare on eyes made it surprisingly difficult to accurately track pupils.
Next Steps
● Film much closer to the eye
● Construct a theoretical pinhole-camera model of the eye
● Experiment with special lighting conditions
  ○ Set up a special point light to take advantage of Purkinje reflections
● Quantify rotation vectors using Purkinje images to construct a rotation matrix
● Apply 3D transformation models we learned about in class to at least be able to set up theoretical projection models (see the sketch below)
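As a sketch of where those next steps lead, here is a toy projection model: build a rotation matrix from yaw / pitch angles (which Purkinje measurements would supply) and intersect the rotated gaze ray with a screen plane. The geometry and the 15-inch eye-to-screen distance are assumptions carried over from the earlier setup:

```python
import numpy as np

def rotation_matrix(yaw, pitch):
    """Eye rotation: yaw about the y axis, then pitch about x (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    return Rx @ Ry

def gaze_on_screen(eye_pos, yaw, pitch, screen_z=15.0):
    """Rotate the forward gaze ray and intersect it with a screen
    plane at z = screen_z inches in front of the eye (the plane
    placement mirrors the ~15-inch setup measured earlier)."""
    eye_pos = np.asarray(eye_pos, dtype=float)
    direction = rotation_matrix(yaw, pitch) @ np.array([0.0, 0.0, 1.0])
    t = (screen_z - eye_pos[2]) / direction[2]
    hit = eye_pos + t * direction
    return hit[0], hit[1]  # (x, y) on the screen plane, in inches
```

Given yaw / pitch estimated from the Purkinje glints, this provides a first theoretical projection model to validate against the calibration points.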
Questions?
Sources
● Bednarik, R., Kinnunen, T., Mihaila, A., & Fränti, P. (2005). Eye-Movements as a Biometric. In H. Kalviainen, J. Parkkinen, & A. Kaarna (Eds.), Image Analysis (pp. 780–789). Springer Berlin Heidelberg. http://link.springer.com/chapter/10.1007/11499145_79
● Chen, J., Tong, Y., Gray, W., & Ji, Q. (2008). A robust 3D eye gaze tracking system using noise reduction. In Proceedings of the 2008 symposium on Eye tracking research & applications (pp. 189–196). New York, NY, USA: ACM. doi:10.1145/1344471.1344518
● Hennessey, C., Noureddin, B., & Lawrence, P. (2006). A single camera eye-gaze tracking system with free head motion. In Proceedings of the 2006 symposium on Eye tracking research & applications (pp. 87–94). New York, NY, USA: ACM. doi:10.1145/1117309.1117349
● Pfeuffer, K., Vidal, M., Turner, J., Bulling, A., & Gellersen, H. (2013). Pursuit calibration: making gaze calibration less tedious and more flexible (pp. 261–270). ACM Press. doi:10.1145/2501988.2501998