AUGMENTED REALITY
VIDEO PLAYLIST
Surya Chandra
EENG512
Augmented Reality
- Augmented reality is the integration of digital information with live video or the user's environment in real time.
- It is an enhanced reality, possibly involving human interaction.
Goal
[Using either the live feed of a webcam or a video recording as reference]
1) Playlist: Play a set of random videos on top of everyday objects (photos, books, etc.) lying around the room.
2) Interaction: Allow the user to point at and select a particular video and view it on the user's palm as desired.
1a. Selecting markers
[Figure: candidate reference images, labeled YES/NO by suitability as markers]
1b. Video feed
Webcam/Recording
2a. Extract Features: SURF (Speeded-Up Robust Features)
WEBCAM - FRAME 1
MATLAB: detectSURFFeatures, extractFeatures
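The two MATLAB calls named above can be combined into a short sketch. Variable names are illustrative; `refImage` (the marker) and `sceneFrame` (the webcam frame) are assumed to be grayscale images:

```matlab
% Detect SURF interest points and compute their descriptors
% in both the reference image and the current webcam frame
refPoints   = detectSURFFeatures(refImage);
scenePoints = detectSURFFeatures(sceneFrame);

[refFeatures,   refValidPts]   = extractFeatures(refImage,   refPoints);
[sceneFeatures, sceneValidPts] = extractFeatures(sceneFrame, scenePoints);
```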
2b. Match Features
WEBCAM - FRAME 1
MATLAB: matchFeatures
REFERENCE IMAGE/MARKER
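Continuing the sketch above (same illustrative variable names), the descriptor sets are matched and the corresponding point pairs pulled out:

```matlab
% Match descriptors between the reference image and the webcam frame;
% indexPairs is an M-by-2 list of matching feature indices
indexPairs   = matchFeatures(refFeatures, sceneFeatures);
matchedRef   = refValidPts(indexPairs(:, 1));
matchedScene = sceneValidPts(indexPairs(:, 2));
```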
2c. Inlier Matches
WEBCAM - FRAME 1
REFERENCE IMAGE/MARKER
MATLAB: estimateGeometricTransform
- Returns the inlier points and the geometric transformation
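Sketch of this step, with the matched points from the previous slide as inputs:

```matlab
% RANSAC-based fit: rejects outlier matches and returns the projective
% transform that maps the reference image into the webcam frame
[refTform, inlierRef, inlierScene] = estimateGeometricTransform( ...
    matchedRef, matchedScene, 'projective');
```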
3a. Rescaling Video
Resizing the video frame to match the dimensions of the reference image
REFERENCE IMAGE/MARKER
MATLAB: imresize
VIDEO FRAME 1 (rescaled)
3b. Transforming Video
Applying the obtained reference image transformation to video frame 1
WEBCAM - FRAME 1
VIDEO FRAME 1 (Transformed)
MATLAB: imwarp
vision.AlphaBlender: Given a mask, it blends two images
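Steps 3a-3c can be sketched together as below. This assumes `refImage` is the grayscale marker, `sceneFrame` and `videoFrame` are same-class RGB images, and `refTform` is the transform from step 2c:

```matlab
% 3a. Rescale the playlist video frame to the reference image size
videoFrame = imresize(videoFrame, [size(refImage, 1), size(refImage, 2)]);

% 3b. Warp the video frame (and a mask of its footprint) into the
%     webcam frame's coordinate system
outputView  = imref2d([size(sceneFrame, 1), size(sceneFrame, 2)]);
warpedFrame = imwarp(videoFrame, refTform, 'OutputView', outputView);
mask        = imwarp(true(size(refImage)), refTform, 'OutputView', outputView);

% 3c. Blend the warped video over the webcam frame wherever the mask is set
blender = vision.AlphaBlender('Operation', 'Binary mask', ...
                              'MaskSource', 'Input port');
outputFrame = step(blender, sceneFrame, warpedFrame, mask);
```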
3c. Projected Result
RESULT: OUTPUT VIDEO FRAME 1
Problem:
- This method takes about 1 to 1.5 seconds per frame.
- Hence, applying it to every frame would be very expensive.
Solution:
POINT TRACKER - vision.PointTracker
- It tracks a given set of points using the KLT (Kanade-Lucas-Tomasi) feature tracking algorithm.
- Works well for tracking objects that do not change shape.
- Used in video stabilization, object tracking, and camera motion estimation.
4a. PointTracker
Initialize a point tracker with the inlier points obtained previously
INITIALIZED POINT TRACKER
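A minimal initialization sketch, reusing `inlierScene` and `refTform` from step 2c (the `MaxBidirectionalError` value is an illustrative choice, not taken from the deck):

```matlab
% Create a KLT tracker and seed it with the SURF inlier locations
pointTracker = vision.PointTracker('MaxBidirectionalError', 2);
initialize(pointTracker, inlierScene.Location, sceneFrame);

% Transform from the reference image into frame 1; tracking will
% extend this mapping to later frames
trackingTransform = refTform;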
4b. PointTracker
Advance to the next webcam frame and track the initialized points
Webcam : Video Frame 1
Webcam : Video Frame 2
Repeat…
- Rescale the next video frame
- Transform the video frame
- Project and blend
- Reset PointTracker with new inliers
- Next frame
- The PointTracker method executes at around 8-10 frames per second.
- The transformation needs to be accumulated up to the current frame:
trackingTransform.T = refTransform.T * trackingTransform.T;
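The tracking loop can be sketched as follows. `webcamReader`, the point count threshold, and variable names are illustrative; MATLAB's row-vector convention means composed transforms multiply in application order, matching the accumulation line on the slide:

```matlab
% Per-frame loop: track points, estimate the frame-to-frame transform,
% and accumulate it so the reference image maps into the current frame
oldPoints         = inlierScene.Location;
trackingTransform = projective2d(eye(3));      % frame 1 -> current frame
while hasFrame(webcamReader)                   % illustrative video source
    sceneFrame = readFrame(webcamReader);
    [points, validity] = step(pointTracker, sceneFrame);

    if nnz(validity) < 6
        break;                                 % too few points: redo SURF
    end

    % Transform from the previous frame's points to this frame's points
    frameTform = estimateGeometricTransform( ...
        oldPoints(validity, :), points(validity, :), 'projective');
    trackingTransform.T = trackingTransform.T * frameTform.T;

    % Full mapping reference image -> current frame, as on the slide
    warpTform = projective2d(refTform.T * trackingTransform.T);

    oldPoints = points(validity, :);
    setPoints(pointTracker, oldPoints);        % reset with surviving points
end
```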
Problem:
- The point tracker works only for short-term tracking.
- Over time, points are lost due to lighting variations and out-of-plane rotation.
- Points must be reacquired periodically to track for a longer time.
Solution:
- Break the loop when fewer than 6 points are being tracked and restart from step 1, extracting SURF features.
- In practice, it breaks and restarts once every 70-100 frames.
Preliminary Testing
RESULT VIDEO
https://www.youtube.com/watch?v=qCWVcxSxAo4
Preliminary Testing 2
TWO VIDEOS
https://www.youtube.com/watch?v=5XZ1_utCYIQ
Problem:
- The transformation matrix was sometimes badly conditioned.
- Easy fix: if rcond < 10^-6, break and restart.
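A minimal guard for this check, assuming an accumulated transform object `trackingTransform` as in the earlier steps (the flag name is illustrative):

```matlab
% If the accumulated homography becomes numerically ill-conditioned,
% abandon tracking and fall back to SURF detection (step 1)
if rcond(trackingTransform.T) < 1e-6
    needsRedetection = true;   % illustrative flag: restart from step 1
end
```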
5a. Interaction
Red tape is used as the marker.
- RGB image
- Filter to remove noise
- Subtract the grayscale (average) from the red component
- Apply a threshold
5b. Interaction
- Used BlobAnalysis to extract red regions within a certain area range and find their centroids.
https://www.youtube.com/watch?v=xjguVXAZdnk
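The red-marker pipeline from 5a-5b can be sketched as below. The threshold and blob-area limits are illustrative, not the deck's actual values:

```matlab
% Detect the red-tape markers: subtract the grayscale average from the
% red channel, denoise, threshold, then find blob centroids
redMap = sceneFrame(:, :, 1) - rgb2gray(sceneFrame);  % assumes RGB frame
redMap = medfilt2(redMap, [3 3]);                     % remove speckle noise
bw     = redMap > 50;                                 % threshold on redness

blobAnalyzer = vision.BlobAnalysis('AreaOutputPort', true, ...
    'CentroidOutputPort', true, 'BoundingBoxOutputPort', false, ...
    'MinimumBlobArea', 100, 'MaximumBlobArea', 5000);
[areas, centroids] = step(blobAnalyzer, bw);
```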
5c. Interaction
- Scaled the video to the distance between these centroids.
- Found the 2D transformation matrix using the angle.
- Used the same process as in part 1 to project the video.
https://www.youtube.com/watch?v=ePx_H3LTvRo
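One plausible construction of that 2D transform, assuming the two marker `centroids` from 5b; the deck does not show its exact formula, so this is a sketch of the scale-from-distance, rotation-from-angle idea:

```matlab
% Build a similarity transform: scale the video width to the centroid
% gap, rotate by the angle between the centroids, translate to marker 1
d     = centroids(2, :) - centroids(1, :);
ang   = atan2(d(2), d(1));                   % rotation angle
s     = norm(d) / size(videoFrame, 2);       % scale video width to the gap
R     = s * [cos(ang) sin(ang); -sin(ang) cos(ang)];
tform = affine2d([R, [0; 0]; centroids(1, :), 1]);
```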
6. Adding Interaction (Results)
https://www.youtube.com/watch?v=G69nCvYhJGA
7. Multiple Videos (Results)
Selecting the video closest to the markers
https://www.youtube.com/watch?v=zTttISVHhV8
8. Future Work
- Fix some coding issues with handling the remaining point trackers while a video is being viewed by the user (as seen in the previous video).
- Try to implement marker-less detection of fingertips and hand pose estimation.
9. Takeaway
- The frame rate of the webcam video and the videos in the playlist should be the same.
- The PointTracker loses accuracy eventually and has to be re-initialized.
- Cases where the reference image is completely out of frame have to be handled.
- ... and Computer Vision works!
10. REFERENCES
[1] Bay, Herbert, et al. "Speeded-Up Robust Features (SURF)." Computer Vision and Image Understanding 110.3 (2008): 346-359.
[2] Lee, Taehee, and Tobias Hollerer. "Handy AR: Markerless inspection of augmented reality objects using fingertip tracking." 11th IEEE International Symposium on Wearable Computers, 2007.
[3] Ta, Duy-Nguyen, et al. "SURFTrac: Efficient tracking and continuous object recognition using local feature descriptors." IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2009.