Satellites in Our Pockets:
An Object Positioning System
using Smartphones
Justin Manweiler, Puneet Jain, Romit
Roy Choudhury
TsungYun 20120827
Outline
• Introduction
• Primitives for Object Localization
• System Design
• Evaluation
• Future Work
• Conclusion
Introduction
• Augmented Reality (AR)
– Location-based query
• “Restaurants around me?”
– Distant-object-based query
• “How expensive are rooms in that nice hotel far away?”
• “Is that cell tower I can see from my house too close for radiation effects?”
Introduction
• Wikitude
– http://www.youtube.com/user/Wikitude
• Out-of-band tagging
– Objects in the environment must be annotated out-of-band
– e.g., someone has already visited Google Earth and entered a tag
Introduction
• Problem
– Can a distant object be localized by looking at it
through a smartphone?
– The problem would have been far more difficult
five years ago
• The modern smartphone makes it possible
– Camera, GPS, accelerometer, compass, and gyroscope
Introduction
• OPS (Object Positioning System)
– Computer vision
– Smartphone sensors
– Mismatch optimization
• Contribution
– Localization for distant objects within view
– System design and implementation on the
Android Nexus S platform
Introduction
• OPS overview
Primitives for Object Localization
• (A) Compass triangulation
Primitives for Object Localization
• We cannot ask the user to walk too far
– distance between camera views is much smaller
than the distance from the camera to the object
– Compass precision becomes crucial
– Smartphone sensors are not designed to support
such a level of precision
• GPS accuracy can also be impacted
– Weather, clock error, …
Primitives for Object Localization
• (B) Visual trilateration
– Trilateration is used in GPS, but has not been applied
to distant-object positioning
Primitives for Object Localization
• The possible position lies on a curve
– Visual angle: Computer vision + accelerometer
Primitives for Object Localization
• (C) Visual Triangulation
– Parallax
• Multiple views of an object from different angles
produce visual distortions
– The properties of parallax and visual perception in
general are well-understood
• We can find the interior angle
– The result is still a curve, not a single point (arc sketch below)
Primitives for Object Localization
• Combining Triangulation and Trilateration
Primitives for Object Localization
• We do not obtain a single point of intersection
across all curves
– Due to errors from GPS, compass, and inaccurate
parameter estimation from the visual dimensions
– Increasing the number of camera views will help
• but it also increases the number of curves (each with
some error)
– Rely on optimization techniques to find a single
point of convergence (least-squares sketch below)
System Design
• Structure from Motion (SfM)
– State-of-the-art computer vision technique
– Input: multiple photos from the user
– Feature detection, bundle adjustment, Levenberg-Marquardt algorithm
– Output: (a) a 3D point cloud of the scene geometry
(b) the relative positions and orientations of the cameras
System Design
• Structure from Motion (SfM)
System Design
• Other issues
– Capturing user intent
• OPS must automatically infer which object in view
the user is most likely interested in
• Assumption: the object of interest is roughly at the
center of the camera’s viewfinder
– Privacy
• The user can upload only keypoints and feature
descriptors, rather than the raw photos (sketch below)
System Design
• We utilize SfM as a “black box” utility
• However, the GPS/compass readings themselves
will be noisy
• Optimization steps
– Minimize compass error
– Minimize GPS noise, relative to vision
– Optimize the object location
System Design
• Triangulation by minimizing compass error
– Scales to support an arbitrary number of
GPS and compass bearing pairs
– We want the intersections of all C(n, 2) bearing pairs
to converge to a single point
– Posed as a minimization problem (sketch below)
• Add an error term to each compass value
System Design
• Triangulation by minimizing compass error
System Design
• Minimization of GPS Noise, Relative to Vision
– Adjust the GPS reading at each position where the
user took a photograph
– Solve for a scaling factor λ that proportionally
expands the distances in the SfM point cloud
to match the equivalent real-world distances
– Adjusted camera position = reported GPS reading + correction term
System Design
• Minimization of GPS Noise, Relative to Vision
System Design
• OPS Optimization on Object Location
System Design
• Extending the Location Model to 3D
– Pitch
• rotational movement orthogonal to the plane of the
phone screen, relative to the horizon
– Adjustment
• From our 3D point cloud, there is a unique mapping of
every 3D point back to each original 2D image
Evaluation
• Experiment
– More than 50 buildings
– Distance from user to the building: 30~150m
• far enough away that it makes sense to use the system
• limited by the user’s ability to clearly see the object and
focus a photograph
– 4 pictures per building, with consecutive photographs
taken between 0.75m and 1.5m apart
Evaluation
• Experiment
– Processing time: 30~60s
• primarily attributable to structure from motion
– Quality of photographs matters
• Lighting
• Blur
• Overexposure
Evaluation
• Introduce synthetic noise
– Gaussian distribution with mean 0 and a varied
standard deviation
– Sensitivity to GPS error (perturbation sketch below)
– Sensitivity to compass error
• Sensitivity to photograph detail
– Varied resolution
Future Work
• Live Feedback to Improve Photograph Quality
– Give the user real-time feedback while composing the shot
• Improving GPS Precision with Dead Reckoning
– Dead reckoning calculates one's current position from a
previously determined position plus estimated movement
(toy sketch after this list)
• Continual Estimation of Relative Positions
with Video
Conclusion
• Localization of distant objects using only the camera
view, without any out-of-band tagging effort
• Real system design and implementation on the Android
Nexus S platform
• Achieves promising results
– The primary limitation is GPS error
• The two equations used to calculate the “height”