• Wednesday, December 10, 7-10 pm,
Gates B01
• Mainly from material in the second half of the quarter
– will not include material from last part of last lecture (volume rendering, image-based rendering)
• Review session slides available from class website
• Office hours as regularly scheduled
• Image warping, texture mapping
• Perspective
• Visibility
• Lighting / Shading
• Coordinate systems
– [u, v, q] => [x_o, y_o, z_o, w_o] => [x_w, y_w, z_w, w_w] => [x, y, w]
– Assuming all transforms are linear, then
– [A][u, v, q]' = [x, y, w]
• Common mappings
– forward mapping (scatter), texture->screen
– backward mapping (gather)
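As a sketch of the backward (gather) strategy: for each screen pixel, apply the inverse transform to find the source texture coordinate, then sample there. The 2x2 inverse matrix and the tiny texture here are made-up examples, and the sampling is plain nearest-neighbor.

```python
# Backward (gather) mapping sketch: screen -> texture via an inverse transform.
def backward_map(texture, inv, width, height):
    th, tw = len(texture), len(texture[0])
    out = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # inverse-transform the pixel position into texture space
            u = inv[0][0] * x + inv[0][1] * y
            v = inv[1][0] * x + inv[1][1] * y
            # nearest-neighbor gather, clamped to the texture bounds
            ui = min(max(int(round(u)), 0), tw - 1)
            vi = min(max(int(round(v)), 0), th - 1)
            out[y][x] = texture[vi][ui]
    return out

texture = [[1, 2], [3, 4]]
identity = [[1, 0], [0, 1]]
print(backward_map(texture, identity, 2, 2))  # -> [[1, 2], [3, 4]]
```

Gathering avoids the holes and overlaps that forward (scatter) mapping produces, which is why it is the common choice in practice.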
• Rotation, translation
• perspective
• Minification (decimation)
– unweighted average: average projected texel elements that fall within a pixel’s filter support
– area-weighted average: average based on area of texel support
• Magnification
– unweighted
– area-weighted
– bilinear interpolation
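The bilinear-interpolation option above can be sketched in a few lines: sample a texture at a fractional coordinate (u, v) by blending the 4 nearest texels, first horizontally and then vertically.

```python
# Minimal bilinear magnification filter over a 2D texture (list of rows).
def bilinear(texture, u, v):
    u0, v0 = int(u), int(v)
    u1 = min(u0 + 1, len(texture[0]) - 1)
    v1 = min(v0 + 1, len(texture) - 1)
    fu, fv = u - u0, v - v0
    # blend along u on the two bracketing rows, then blend along v
    top = (1 - fu) * texture[v0][u0] + fu * texture[v0][u1]
    bot = (1 - fu) * texture[v1][u0] + fu * texture[v1][u1]
    return (1 - fv) * top + fv * bot

tex = [[0.0, 1.0],
       [1.0, 2.0]]
print(bilinear(tex, 0.5, 0.5))  # -> 1.0 (average of all 4 texels)
```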
1. Mipmapping
   1. multi-resolution texture
   2. bilinear interpolation at the 2 closest resolutions to get 2 color values
   3. linearly interpolate the 2 color values based on the actual resolution
2. Summed area tables
   1. fast calculation of the prefilter integral in texture space
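The three mipmapping steps above amount to a trilinear lookup. In this sketch, `bilinear_sample(level, u, v)` is assumed to already return a bilinearly filtered color from one pyramid level (faked here with a per-level constant), so only the final level-blending step is shown.

```python
import math

# Trilinear mipmap lookup: blend bilinear samples from the two nearest levels.
def trilinear(lod, bilinear_sample, u, v):
    level0 = int(math.floor(lod))        # nearest coarser level
    level1 = level0 + 1                  # nearest finer level
    f = lod - level0                     # fractional level-of-detail
    c0 = bilinear_sample(level0, u, v)
    c1 = bilinear_sample(level1, u, v)
    return (1 - f) * c0 + f * c1         # final linear interpolation

fake_levels = {0: 1.0, 1: 3.0}           # hypothetical per-level colors
sample = lambda level, u, v: fake_levels[level]
print(trilinear(0.25, sample, 0.5, 0.5))  # -> 1.5
```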
• 1. What are some of the problems associated with Mipmaps?
• 2. What are some of the problems associated with SAT?
• Perspective Projection
– rays pass through center of projection
– parallel lines intersect at vanishing points
• Parallel Projection
– center of projection is at infinity
– oblique
– orthographic
How many vanishing points are there in an image produced by parallel projection?
• Observer position (eye, center of projection)
• Viewing direction (normal to picture plane)
• Clipping planes (near, far, top, bottom, left, right)
• Object Space
• Eye Coordinates
• Projection Matrix
• Clipped to Frustum
• Homogenize to normalized device coordinates
• Window coordinates
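The last two stages above can be sketched with the simplest possible perspective matrix: w' = -z, so x and y shrink with distance after the homogeneous divide. This is a minimal "divide by -z" matrix with the picture plane at d = 1, not the full OpenGL frustum matrix.

```python
# Transform an eye-space point to normalized device coordinates.
def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

def homogenize(v):
    # divide out the homogeneous coordinate w
    return [c / v[3] for c in v[:3]]

persp = [[1, 0,  0, 0],
         [0, 1,  0, 0],
         [0, 0,  1, 0],
         [0, 0, -1, 0]]    # w' = -z

eye_point = [2.0, 4.0, -2.0, 1.0]   # 2 units in front of the eye
clip = mat_vec(persp, eye_point)
print(homogenize(clip))             # -> [1.0, 2.0, -1.0]
```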
Why do we not just uniformly scale Z coordinates?
1. 6 visible-surface determination algorithms:
   1. Z-buffer
   2. Watkins
   3. Warnock
   4. Weiler-Atherton
   5. BSP Tree
   6. Ray Tracing
2. For each algorithm, know:
   – how does it work? what are the necessary preconditions?
   – asymptotic time complexity?
   – how can anti-aliasing be done? how can shading be incorporated?
   – well-suited for hardware? parallelizable?
   – ease of implementation? best-case/worst-case scenarios?
• Project all polygons to the image plane; at each pixel, pick the color corresponding to the closest polygon
What has to be done to render transparent polygons?
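The Z-buffer idea above fits in a few lines: each "fragment" carries (x, y, depth, color), and at each pixel only the color of the smallest-depth fragment seen so far is kept.

```python
# Minimal Z-buffer sketch over pre-rasterized fragments.
def zbuffer(width, height, fragments, background=None):
    depth = [[float("inf")] * width for _ in range(height)]
    color = [[background] * width for _ in range(height)]
    for x, y, z, c in fragments:
        if z < depth[y][x]:      # closer than what is stored: overwrite
            depth[y][x] = z
            color[y][x] = c
    return color

frags = [(0, 0, 5.0, "red"), (0, 0, 2.0, "blue"), (1, 0, 9.0, "green")]
print(zbuffer(2, 1, frags))      # -> [['blue', 'green']]
```

Note the order-independence for opaque fragments: this is exactly what breaks for transparency, where fragments must be composited in sorted order.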
• Scanline + depth
– progressing across scanline, if pixel is inside two or more polygons, use depth to pick
– to process interpenetrating polygons, add intersection events
• Start with area as original image
– subdivide areas until either:
• all surfaces are outside the area
• only one surface is inside, overlapping, or surrounding
• a surrounding surface obscures all other surfaces
• Cookie-cutter algorithm: clips polygons against polygons
– front-to-back sort of the polygon list
– clip remaining polygons with the front polygon
Why is this so difficult?
• Provides a data structure for back-to-front or front-to-back traversal
– split polygons according to specified planes
– create a tree where edges are front/back, leaves are polygons
• “Ray Casting”
– for each pixel, cast a ray into the scene, and use the color of the point on the closest polygon
– Parametric form of a line: u(t) = a + (b - a)t
• Sphere: |P - P_c|^2 - r^2 = 0
• Plane: N • P = -D
• Can you compute the intersection of a ray and a plane? A ray and a sphere?
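As a sketch of those two intersection tests, substitute the ray u(t) = a + (b - a)t, with direction d = b - a, into each surface equation and solve for t:

```python
import math

def dot(p, q):
    return sum(pi * qi for pi, qi in zip(p, q))

def ray_plane(a, d, n, D):
    # Solve N . (a + t d) = -D for t; no hit if the ray is parallel.
    denom = dot(n, d)
    if abs(denom) < 1e-9:
        return None
    return (-D - dot(n, a)) / denom

def ray_sphere(a, d, center, r):
    # Substitute the ray into |P - Pc|^2 - r^2 = 0 and solve the quadratic.
    oc = [ai - ci for ai, ci in zip(a, center)]
    A, B, C = dot(d, d), 2 * dot(oc, d), dot(oc, oc) - r * r
    disc = B * B - 4 * A * C
    if disc < 0:
        return None
    return (-B - math.sqrt(disc)) / (2 * A)   # nearest intersection

# Ray from the origin along +z against the plane z = 5 (N = (0,0,1), D = -5)
print(ray_plane([0, 0, 0], [0, 0, 1], [0, 0, 1], -5))       # -> 5.0
# Same ray against a unit sphere centered at (0, 0, 3)
print(ray_sphere([0, 0, 0], [0, 0, 1], [0, 0, 3], 1.0))     # -> 2.0
```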
• Point in polygon tests
– Odd-even rule
• draw a line from point to infinity in one direction
• count intersections: odd = inside, even = outside
– Non-zero winding rule
• counts the number of times the polygon's edges wind around the point in the clockwise direction
• winding number non-zero = inside, else outside
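The odd-even rule above can be sketched as the classic crossing-number test: shoot a ray from the point toward +x and count how many polygon edges it crosses.

```python
# Odd-even point-in-polygon test; poly is a list of (x, y) vertices.
def point_in_polygon(pt, poly):
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        # does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # x coordinate where the edge crosses that line
            xc = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if xc > x:                 # crossing lies to the right: count it
                inside = not inside    # odd count = inside, even = outside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon((2, 2), square))   # -> True
print(point_in_polygon((5, 2), square))   # -> False
```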
• Terminology
– Radiant flux: energy/time (joules/sec = watts)
– Irradiance: incident radiant flux / area (how much light energy hits a unit area, per unit time)
– Radiant intensity (of a point source): radiant flux per unit solid angle
– Radiance: radiant intensity per unit area
• Q. As every scout knows, you can start a fire on a sunny day by holding a magnifying glass between the sun and a piece of paper placed on the ground.
– Is the radiance of the sun as seen from the focal point of the lens more, less, or the same as the radiance seen from the same point in the absence of the magnifying glass?
– Is the irradiance due to the sun at the focal point more, less, or the same as the irradiance at the same point in the absence of the magnifying glass?
• Point to area transport
– Computing the irradiance at a surface
– Cosine falloff: N • L
– E = F_att x I x (N • L)
• Lambertian (diffuse) surfaces
– Radiant intensity has a cosine falloff with respect to angle
– Radiance is constant with respect to angle
– Reason: the projected unit area ALSO shrinks with a cosine falloff!
– F_att x I x K_d x (N • L)
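The diffuse term above, F_att x I x K_d x (N • L), can be sketched directly, with the dot product clamped so surfaces facing away from the light receive no contribution:

```python
# Lambertian diffuse term: attenuation * intensity * K_d * max(N . L, 0).
def dot(p, q):
    return sum(pi * qi for pi, qi in zip(p, q))

def diffuse(f_att, intensity, k_d, n, l):
    return f_att * intensity * k_d * max(dot(n, l), 0.0)

n = [0.0, 0.0, 1.0]          # surface normal
l = [0.0, 0.0, 1.0]          # light straight overhead: N . L = 1
print(diffuse(1.0, 2.0, 0.5, n, l))   # -> 1.0
```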
• If you place a candle in the middle of a hollow sphere, what happens to the total amount of light falling on the inside surface of the sphere as the sphere gets bigger? Defend your answer in one or two sentences.
• BRDF = Bidirectional Reflectance Distribution
Function
– description of how the surface interacts with incident light and emits reflected light
– Isotropic
• independent of the absolute incident and reflected angles
– Anisotropic
• absolute angles matter
– Don’t forget the generalizations to the BRDF!
• Spatially/spectrally varying, florescence, phosphorescence, etc.
• Phong specular model
– Isn’t true to the physics, but works pretty well
– reflected light is greatest near the reflection angle of the incident light, and falls off with a cosine power
– L_spec = K_s x cos^n(a), a = angle between the viewer and the reflected ray
– how do you compute the reflected ray vector?
• (assume normalized vectors)
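For the reflected-ray question above: with normalized N and L (L pointing from the surface toward the light), the standard formula is R = 2 (N • L) N - L.

```python
# Reflect the light vector L about the surface normal N (both normalized).
def reflect(n, l):
    d = sum(ni * li for ni, li in zip(n, l))   # N . L
    return [2 * d * ni - li for ni, li in zip(n, l)]

n = [0.0, 1.0, 0.0]
l = [1.0, 0.0, 0.0]           # grazing light along +x
print(reflect(n, l))          # -> [-1.0, 0.0, 0.0]
```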
• Local vs. infinite lights
– Understand them! Know how to draw the goniometric diagrams for various light/viewer combinations
• N • H model
– H is the halfway vector between the viewer and the light
– What is the difference in the specular highlight?
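The N • H model above can be sketched by normalizing the sum of the light and view directions; the exponent plays the same role as in the Phong model. K_s and the shininess value here are illustrative parameters.

```python
import math

# Blinn (N . H) specular term with normalized N, L, V.
def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return [c / m for c in v]

def blinn_specular(n, l, v, k_s, shininess):
    h = normalize([li + vi for li, vi in zip(l, v)])   # halfway vector
    ndoth = max(sum(ni * hi for ni, hi in zip(n, h)), 0.0)
    return k_s * ndoth ** shininess

n = [0.0, 1.0, 0.0]
l = normalize([1.0, 1.0, 0.0])
v = normalize([-1.0, 1.0, 0.0])       # mirror direction: H lines up with N
print(blinn_specular(n, l, v, 1.0, 32))   # -> 1.0
```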
• Gouraud shading
– Compute lighting information (i.e., colors) at polygon vertices, then interpolate those colors
– Problems?
• Misses highlights
• need high resolution mesh to catch highlights
• Mach bands!
• Angle interpolation
– interpolate normal angles according to the implicit surface
– compute shading at each point of the implicit surface
– CORRECT! But very expensive
• Phong shading
– Compute lighting normals at all points on the polygon via interpolation, and do the lighting computation on the interpolated normals (of the polygon)
– Problems? Difference with angle interpolation?
(figure: normals N1, N2 interpolated across the polygon approximation of an implicit surface)
• Know the OpenGL 1.1, 1.2 light equations (what terms mean what)
• Environment/reflection mapping
• Alphas for selecting between textures/shading parameters
• Bump mapping
• Displacement mapping
• Object placement
• 3d textures
More review questions at: http://graphics.stanford.edu/courses/cs248-99/final_review