Real Time Visual Body Feedback & IR Tracking in HMD Based Virtual Environments Using Microsoft Kinects
Speaker: Srivishnu (Kaushik) Satyavolu
Advisor: Dr. Pete Willemsen
Thesis Problem
Analysis of using multiple Kinects to implement an inexpensive IR-based tracking system, together with real-time 3D visual body feedback and natural interaction for the user, in HMD-based Virtual Environments.
Motivation
• Absence of real-time (self) visual body feedback in existing HMD-based VEs
• Difficulty of studying the perception / interaction of multiple users simultaneously in HMD-based VEs

Full-body tracking is currently possible only with highly expensive tracking systems & motion capture suits
Initial Approach Using OpenCV
Issues?
Microsoft Xbox Kinect
Issues
• Camera Calibration
• RGB-to-Depth Mapping
• 3D User segmentation, smoothing & representation
• IR Interference between multiple Kinects
• Relative Orientation between multiple Kinects
• Tracking User Position and/or Pose
Camera Calibration & RGB-to-Depth Mapping
Issues: Intrinsic camera calibration; incorrect mapping between the RGB and depth images
Why is this interesting? Accurate RGB-to-depth mapping
Solution: OpenCV camera calibration using standard chessboard recognition techniques
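
As a rough sketch of this step (not the thesis code), intrinsic calibration with OpenCV's chessboard detector could look like the following; the board dimensions, square size, and image file names are placeholder assumptions:

    import cv2
    import numpy as np

    # Inner-corner count and square size of the printed chessboard (assumed values).
    board_size = (9, 6)
    square_size = 0.025  # meters

    # 3D corner positions on the board plane (z = 0), reused for every view.
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

    obj_points, img_points, image_size = [], [], None
    for fname in ["kinect_rgb_00.png", "kinect_rgb_01.png"]:  # placeholder capture names
        img = cv2.imread(fname)
        if img is None:
            continue
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
            obj_points.append(objp)
            img_points.append(corners)

    # Recover the intrinsic matrix and distortion coefficients for this Kinect camera.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    print("RMS reprojection error:", rms)

Running the same procedure on the depth/IR camera gives the second set of intrinsics needed to map depth pixels onto the RGB image.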
3D User Representation
Issue: 3D point cloud representation
Why is this interesting? Real-time visual body feedback to the user
Approaches used: Triangulation; RGB / IR / depth background subtraction; basic filtering techniques
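
A minimal sketch of the depth background-subtraction idea, assuming known depth intrinsics (fx, fy, cx, cy) and a change threshold; all numbers here are placeholder assumptions:

    import numpy as np

    FX, FY, CX, CY = 594.2, 591.0, 339.5, 242.7   # assumed Kinect depth intrinsics
    THRESHOLD_MM = 50                              # depth change treated as "user"

    def segment_user(depth_mm, background_mm):
        """Return an (N, 3) point cloud (meters) of pixels that differ from the background."""
        diff = np.abs(depth_mm.astype(np.int32) - background_mm.astype(np.int32))
        mask = (diff > THRESHOLD_MM) & (depth_mm > 0)       # drop invalid zero depths
        v, u = np.nonzero(mask)
        z = depth_mm[v, u].astype(np.float32) / 1000.0      # millimeters -> meters
        x = (u - CX) * z / FX                               # pinhole back-projection
        y = (v - CY) * z / FY
        return np.column_stack((x, y, z))

Here background_mm would be a depth frame of the empty lab and depth_mm a live frame; the resulting cloud can then be colored from the registered RGB image.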
IR Interference
Issue: IR interference between multiple Kinects
Why is this interesting? For many reasons, which you will see shortly
Solution: Not as bad as expected; unresolved at the moment
Relative Orientation Between Multiple Kinects
Issue: Extrinsic camera calibration
Why is this interesting? Relative orientation of multiple Kinects
Solutions: Manual orientation; automatic calibration using OpenCV’s chessboard recognition techniques; others?
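
One way the automatic variant could work, sketched here under the assumption that both Kinects see the same chessboard and that the intrinsics (K, dist) and detected corners come from the calibration step above:

    import cv2
    import numpy as np

    def board_pose(corners, objp, K, dist):
        """Pose of the chessboard in one camera's frame as a 4x4 homogeneous matrix."""
        ok, rvec, tvec = cv2.solvePnP(objp, corners, K, dist)
        R, _ = cv2.Rodrigues(rvec)
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = R, tvec.ravel()
        return T

    def relative_pose(corners1, corners2, objp, K1, dist1, K2, dist2):
        """Transform that maps points from Kinect 2's frame into Kinect 1's frame."""
        T1 = board_pose(corners1, objp, K1, dist1)   # board -> Kinect 1
        T2 = board_pose(corners2, objp, K2, dist2)   # board -> Kinect 2
        return T1 @ np.linalg.inv(T2)

With such a transform per Kinect pair, each camera's point cloud and marker positions can be expressed in one common lab frame.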
User Tracking
Issues: Tracking user position, orientation & pose across large VR lab spaces
Why is it interesting? Real-time 3D natural interaction by the user across VR lab spaces
Solution: An IR-based Kinect tracking system?
IR-Based Kinect Tracking System
• What is it for?
– Track the user's head position and (possibly) orientation in VR lab spaces
• How is it done?
– Tracking an IR marker across a VR space (how large?); see the detection sketch after this list
• Why do we need it?
– Interference issues (again?) with existing tracking systems such as WorldViz
– Extending OpenNI and Microsoft SDK pose estimation techniques (effects of interference?)
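
As referenced under "How is it done?", a very simple detection step could be the following sketch: take the centroid of the brightest IR pixels and look up their depth. The intrinsics and the relative-brightness cutoff are assumed values.

    import numpy as np

    FX, FY, CX, CY = 594.2, 591.0, 339.5, 242.7   # assumed Kinect depth/IR intrinsics

    def marker_position(ir_frame, depth_mm, rel_threshold=0.95):
        """3D position (meters) of the brightest IR blob in the Kinect's camera frame."""
        ir = ir_frame.astype(np.float32)
        v, u = np.nonzero(ir >= rel_threshold * ir.max())   # near-saturated pixels
        if u.size == 0:
            return None
        uc, vc = u.mean(), v.mean()                         # pixel centroid of the blob
        z = depth_mm[int(round(vc)), int(round(uc))] / 1000.0
        if z == 0:                                          # no valid depth at the marker
            return None
        return np.array([(uc - CX) * z / FX, (vc - CY) * z / FY, z])

Each Kinect covering part of the lab could report such a position over the network, with the relative-orientation transforms from the previous section bringing them into a common frame.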
Issues
• External Interference (and again?)
• Marker visibility
• Jitter / Noise
Current Approach
• Principle of locality
• Multiple Kinects (interference?) over the network
• Mean filter (see the smoothing sketch after this list)
• Interpolation?
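
A sketch of how the mean filter and the principle of locality could fit together; the window size and jump threshold are assumed values:

    from collections import deque
    import numpy as np

    class PositionFilter:
        """Sliding-window mean filter with a locality gate for tracked positions."""

        def __init__(self, window=10, max_jump=0.5):
            self.history = deque(maxlen=window)   # recently accepted positions (meters)
            self.max_jump = max_jump              # reject samples that jump farther than this

        def update(self, position):
            p = np.asarray(position, dtype=float)
            # Principle of locality: the head cannot teleport between frames.
            if self.history and np.linalg.norm(p - self.history[-1]) > self.max_jump:
                return self.estimate()
            self.history.append(p)
            return self.estimate()

        def estimate(self):
            return np.mean(self.history, axis=0) if self.history else None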
Experiments & Results
Experiment #1: Evaluation of the noise/jitter of the Kinect's IR-based position tracking over a large space
Results: Standard deviation of 0.001 – 0.003 % of the mean
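
For reference, a sketch of the jitter metric as it is read here (per-axis standard deviation of a stationary marker's tracked position, as a percentage of the mean); this is an interpretation of the reported numbers, not the thesis code:

    import numpy as np

    def jitter_percent(samples):
        """samples: (N, 3) array of tracked positions of a fixed marker."""
        samples = np.asarray(samples, dtype=float)
        return 100.0 * samples.std(axis=0) / np.abs(samples.mean(axis=0))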
Future Scope & Conclusion
• Photorealistic and real-time 3D self-representation
– 3D visual body feedback to the user
• Real-time 3D natural interaction of the user across VR lab spaces
– Kinect-based IR tracking system in conjunction with Microsoft / OpenNI pose estimation techniques
• Increased perception and level of presence in VEs
– Yet to be verified, but quite possible
• Better understanding of HMD-based VEs with multiple users
References
• Libfreenect
• Nicholas Burrus’ work (RGBDemo v0.4)
• Oliver Kreylos’ Kinect Viewer
• OpenCV
• All other websites from which images used in this presentation were downloaded
THANK YOU
Questions?