talk - Microsoft Research

Parameterized Environment Maps
Ziyad Hakura, Stanford University
John Snyder, Microsoft Research
Jed Lengyel, Microsoft Research
Static Environment Maps (EMs)
Generated using standard techniques:
•Photograph a physical sphere in an environment
•Render six faces of a cube from object center
Ray-Traced vs. Static EM
Self-reflections are missing
Parameterized Environment Maps (PEM)
[Figure: environment maps EM1–EM8 sampled at viewpoints along the view path]
3-Step Process
1) Preprocess: Ray-trace images at each viewpoint
2) Preprocess: Infer environment maps (EMs)
3) Run-time: Blend between 2 nearest EMs
[Figure: inferred maps EM1–EM8 at the sampled viewpoints]
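Step 3 above, blending the two nearest viewpoint EMs, can be sketched in a few lines. This is a minimal numpy sketch under the talk's 1D view-space setup; the function name and array layout are my own, not the paper's code.

```python
import numpy as np

def blend_nearest_ems(view_t, sample_ts, ems):
    """Blend the two EMs whose sampled viewpoints bracket the
    current view parameter (1D view space).

    view_t:    current view parameter (scalar)
    sample_ts: sorted 1D array of sampled view parameters
    ems:       list of EM images (numpy arrays), one per sample
    """
    # Clamp to the sampled range, then find the bracketing pair.
    i = int(np.clip(np.searchsorted(sample_ts, view_t) - 1,
                    0, len(sample_ts) - 2))
    t0, t1 = sample_ts[i], sample_ts[i + 1]
    w = (view_t - t0) / (t1 - t0)          # linear interpolation weight
    return (1.0 - w) * ems[i] + w * ems[i + 1]
```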
Environment Map Geometry
[Figure: a reflection ray from the eye reflects off the surface about normal N, intersects the EM geometry, and maps through the EM mapping to texel (u,v) of the EM texture]
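The lookup pictured above (reflect the eye ray about N, intersect the EM geometry, then index the EM texture) can be sketched for a finite-sphere EM geometry. This is a minimal numpy sketch under my own assumptions (unit-length ray direction, reflection point inside the sphere); the function names are mine.

```python
import numpy as np

def reflect(eye_to_point, normal):
    """Mirror reflection of the view ray about the surface normal."""
    d = eye_to_point / np.linalg.norm(eye_to_point)
    n = normal / np.linalg.norm(normal)
    return d - 2.0 * np.dot(d, n) * n

def intersect_sphere(origin, direction, center, radius):
    """Exit intersection of a unit-length ray with a finite EM sphere;
    the reflection point is assumed to lie inside the sphere."""
    oc = origin - center
    b = np.dot(direction, oc)
    c = np.dot(oc, oc) - radius * radius
    t = -b + np.sqrt(b * b - c)   # take the exit point
    return origin + t * direction
```

The intersection point, rather than the reflection direction alone, is what a finite EM geometry feeds into the EM mapping; a sphere at infinity would use only the direction.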
Why Parameterized Environment Maps?
•Captures view-dependent shading in environment
•Accounts for geometric error due to approximating the environment with simple geometry
How to Parameterize the Space?
•Experimental setup
•1D view space
•1° separation between views
•100 sampled viewpoints
•In general, author specifies parameters
•Space can be 1D, 2D or more
•Viewpoint, light changes, object motions
Ray-Traced vs. PEM
Closely match local reflections like self-reflections
Movement Away from Viewpoint Samples
[Figure: Ray-Traced vs. PEM comparison away from the sampled viewpoints]
Previous Work
•Reflections on Planar Surfaces [Diefenbach96]
•Reflections on Curved Surfaces [Ofek98]
•Image-Based Rendering Methods
•Light Field, Lumigraph, Surface Light Field, LDIs
•Decoupling of Geometry and Illumination
•Cabral99, Heidrich99
•Parameterized Texture Maps [Hakura00]
Surface Light Fields [Miller98,Wood00]
•Surface Light Field: dense sampling over surface points of low-resolution lumispheres
•PEM: sparse sampling over viewpoints of high-resolution EMs
Parameterized Texture Maps [Hakura00]
[Figure: textures parameterized over a 2D view space (U,V) with sample viewpoints p1 and p2]
Captures realistic pre-rendered shading effects
Comparison with
Parameterized Texture Maps
•Parameterized Texture Maps [Hakura00]
•Static texture coordinates
•Pasted-on look away from sampled views
•Parameterized Environment Maps
•Bounce rays off the reflector and intersect simple EM geometry
•Layered maps for local and distant environment
•Better quality away from sampled views
EM Representations
•EM Geometry
•How reflected environment is approximated
•Examples:
•Sphere at infinity
•Finite cubes, spheres, and ellipsoids
•EM Mapping
•How geometry is represented in a 2D map
•Examples:
•Gazing ball (OpenGL) mapping
•Cubic mapping
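One of the listed EM mappings, cubic mapping, can be sketched as a direction-to-face/(u,v) lookup. This follows the common OpenGL cube-map face-selection convention; it is my sketch, not code from the talk.

```python
def cube_map_lookup(r):
    """Map a reflection direction to a cube face index and (u, v) in [0, 1],
    following the usual OpenGL face-selection rules."""
    x, y, z = r
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:          # +X or -X face
        face, ma, sc, tc = (0 if x > 0 else 1), ax, (-z if x > 0 else z), -y
    elif ay >= az:                     # +Y or -Y face
        face, ma, sc, tc = (2 if y > 0 else 3), ay, x, (z if y > 0 else -z)
    else:                              # +Z or -Z face
        face, ma, sc, tc = (4 if z > 0 else 5), az, (x if z > 0 else -x), -y
    u = 0.5 * (sc / ma + 1.0)
    v = 0.5 * (tc / ma + 1.0)
    return face, u, v
```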
Layered EMs
[Figure: reflector enclosed by a local EM, with a distant EM beyond]
Segment environment into local and distant maps
•Allows different EM geometries in each layer
•Supports parallax between layers
Segmented, Ray-Traced Images
[Figure: ray-traced image segmented into Distant, Local Color, Local Alpha, and Fresnel layers]
EMs are inferred for each layer separately
Distant Layer
[Figure: the reflected ray R (normal N) reaches the distant EM directly]
Ray directly reaches distant environment
Distant Layer
[Figure: the reflected ray bounces off the reflector again before reaching the distant EM]
Ray bounces more times off reflector
Distant Layer
[Figure: the reflected ray is propagated through the reflector to the distant EM]
Ray propagated through reflector
Local Layer
[Figure: Local Color and Local Alpha maps]
Fresnel Layer
Fresnel modulation is generated at run-time
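The run-time Fresnel modulation is stored in a 1D map indexed per vertex. Schlick's approximation is a common way to fill such a table; that choice, and the function names, are my assumption here, not stated in the talk.

```python
import numpy as np

def fresnel_table(f0, size=256):
    """1D Fresnel table indexed by cos(theta) = N . V, using Schlick's
    approximation F = f0 + (1 - f0) * (1 - cos)^5 (an assumed choice)."""
    cos_theta = np.linspace(0.0, 1.0, size)
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def fresnel_per_vertex(normal, view_dir, table):
    """Per-vertex lookup: index the 1D map by the clamped N . V."""
    c = np.clip(np.dot(normal, view_dir), 0.0, 1.0)
    return table[int(round(c * (len(table) - 1)))]
```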
EM Inference
Solve A x = b:
•A = hardware filter coefficients
•x = unknown EM texels (the EM texture)
•b = ray-traced image on the screen
[Figure: the hardware render maps the EM texture to the screen]
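The inference step poses EM fitting as a linear system: the hardware filter coefficients A map EM texels x to screen pixels, and b is the ray-traced target. A minimal least-squares sketch (the function name is mine):

```python
import numpy as np

def infer_em(A, b):
    """Least-squares solve of A x = b for the EM texels x.

    A: (num_screen_pixels, num_em_texels) hardware filter coefficients
    b: (num_screen_pixels,) ray-traced image, flattened
    """
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```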
Inferred EMs per Viewpoint
[Figure: inferred Distant, Local Color, and Local Alpha maps]
Run-Time
•“Over” blending mode to composite local/distant layers:
  result = [α_L · rgb_L + (1 − α_L) · rgb_D] · F
•Fresnel modulation, F, generated on-the-fly per vertex
•Blend between neighboring viewpoint EMs
•Teapot object requires 5 texture map accesses:
2 EMs (local/distant layers) at each of
2 viewpoints (for smooth interpolation) and
1 1D Fresnel map (for better polynomial interpolation)
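The "over" composite of the local layer over the distant layer, modulated by the Fresnel factor F, can be sketched as follows. This is a minimal numpy sketch; the function name and the exact placement of F are my reading of the blend formula, not verified against the paper.

```python
import numpy as np

def composite(alpha_l, rgb_l, rgb_d, fresnel):
    """'Over' composite of local over distant layer, modulated by Fresnel:
    result = (alpha_L * rgb_L + (1 - alpha_L) * rgb_D) * F
    """
    return (alpha_l * rgb_l + (1.0 - alpha_l) * rgb_d) * fresnel
```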
Video Results
•Experimental setup
•1D view space
•1° separation between views
•100 sampled viewpoints
Layered PEM vs.
Infinite Sphere PEM
Real-time Demo
Summary
•Parameterized Environment Maps
•Layered
•Parameterized by viewpoint
•Inferred to match ray-traced imagery
•Accounts for environment’s
•Geometry
•View-dependent shading
•Mirror-like, local reflections
•Hardware-accelerated display
Future Work
•Placement/partitioning of multiple environment shells
•Automatic selection of EM geometry
•Incomplete imaging of environment “off the manifold”
•Refractive objects
•Glossy surfaces
Questions
Timing Results

                    On the Manifold   Off the Manifold
#geometry passes           2                  3
texgen time              35ms               35ms
frame time               45ms               57ms
FPS                       22                17.5
Texel Impulse Response
[Figure: the hardware render maps a texture to the screen]
To measure the hardware impulse response,
render with a single texel set to 1.
Single Texel Response
H(1) = (s_0, s_1, …, s_{k−1})^T
Rendering with a single texel set to 1 yields the vector of per-pixel screen responses s_0 … s_{k−1}.
Model for Single Texel
A has one column per texel and one row per screen pixel. Setting a single texel to 1 selects that texel's column:

A · (0, …, 1, …, 0)^T = (s_0, s_1, …, s_{k−1})^T

so each single-texel response fills in one column of A in the system A x = b.
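The measurement procedure above, rendering one texel at a time to recover the columns of A, can be sketched directly. A minimal numpy sketch in which `render` stands in for the hardware render pass (the names are mine):

```python
import numpy as np

def build_A(render, num_texels, num_pixels):
    """Assemble the filter matrix A, one column per texel:
    column i is the screen image produced by rendering with
    texel i set to 1 and all others 0 (the impulse response).

    render: callable mapping a flat texture vector to a flat screen image
    """
    A = np.zeros((num_pixels, num_texels))
    for i in range(num_texels):
        impulse = np.zeros(num_texels)
        impulse[i] = 1.0                 # single texel set to 1
        A[:, i] = render(impulse)        # measured screen response
    return A
```

In practice one would batch many non-overlapping impulses per render pass; the loop here keeps the idea explicit.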
Model for MIPMAPs
A has one row per screen pixel, holding that pixel's filter coefficients:

A = ( filter coefficients for s_{0,0}
      filter coefficients for s_{0,1}
      ⋮
      filter coefficients for s_{m−1,n−1} )

x stacks the unknown texels of all MIPMAP levels:

x = ( x^0_{0,0}, …, x^0_{u−1,v−1},        level 0
      x^1_{0,0}, …, x^1_{u/2−1,v/2−1},    level 1
      ⋮
      x^{l−1}_{0,0}, … )^T                level l−1

b = ( s_{0,0}, s_{0,1}, …, s_{m−1,n−1} )^T, giving A x = b.
Conclusion
PEMs provide:
•faithful approximation to ray-traced images at pre-rendered viewpoint samples
•plausible movement away from those samples using real-time graphics hardware
PEM vs. Static EM