CS248 Final Review



CS248 Final

• Wednesday, December 10, 7-10 pm, Gates B01

• Mainly from material in the second half of the quarter

– will not include material from the last part of the final lecture (volume rendering, image-based rendering)

• Review session slides available from class website

• Office hours as regularly scheduled

CS248 Final Review Contents

• Image warping, texture mapping

• Perspective

• Visibility

• Lighting / Shading

Texture Mapping

• Coordinate systems

[u, v, q] => [x_o, y_o, z_o, w_o] => [x_w, y_w, z_w, w_w] => [x, y, w]

Assuming all transforms are linear, then

– [A][u, v, q]’ = [x, y, w]

• Common mappings

– forward mapping (scatter), texture->screen

– backward mapping (gather), screen->texture (see the sketch below)
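A minimal sketch of backward (gather) mapping, assuming a hypothetical 3x3 homogeneous matrix Ainv that takes screen [x, y, 1] back to texture [u, v, q]; dividing by q recovers the perspective-correct (u, v):

```cpp
#include <array>

// Hypothetical 3x3 homogeneous transform, stored row-major.
using Mat3 = std::array<std::array<double, 3>, 3>;

// Backward (gather) mapping: for a screen pixel (x, y), apply the inverse
// texture transform and divide by the homogeneous coordinate q to get (u, v).
void screenToTexture(const Mat3& Ainv, double x, double y,
                     double& u, double& v) {
    double uh = Ainv[0][0]*x + Ainv[0][1]*y + Ainv[0][2];
    double vh = Ainv[1][0]*x + Ainv[1][1]*y + Ainv[1][2];
    double q  = Ainv[2][0]*x + Ainv[2][1]*y + Ainv[2][2];
    u = uh / q;   // homogeneous divide gives perspective-correct (u, v)
    v = vh / q;
}
```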

Texture Warps

• Rotation, translation

• Perspective

• Minification (decimation)

– unweighted average: average projected texel elements that fall within a pixel’s filter support

– area-weighted average: average based on area of texel support

Texture Warps

• Magnification

[Figure: unweighted vs. area-weighted averaging of texels within a pixel's support]

– bilinear interpolation (see the sketch below)
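A minimal bilinear magnification sketch, assuming a hypothetical grayscale Texture struct with a texel accessor; real texture units also handle wrapping/clamping and multiple channels:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Hypothetical texture: a w x h grid of grayscale texels.
struct Texture {
    int w, h;
    std::vector<float> texels;               // row-major, size w*h
    float at(int x, int y) const { return texels[y * w + x]; }
};

// Bilinear magnification: blend the 4 texels surrounding (u, v),
// where (u, v) are continuous texel-space coordinates.
float sampleBilinear(const Texture& tex, float u, float v) {
    int x0 = (int)std::floor(u), y0 = (int)std::floor(v);
    int x1 = std::min(x0 + 1, tex.w - 1), y1 = std::min(y0 + 1, tex.h - 1);
    float fx = u - x0, fy = v - y0;          // fractional offsets in [0,1)
    float top = (1 - fx) * tex.at(x0, y0) + fx * tex.at(x1, y0);
    float bot = (1 - fx) * tex.at(x0, y1) + fx * tex.at(x1, y1);
    return (1 - fy) * top + fy * bot;        // blend the two rows
}
```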

Textures

• Mipmapping

1. multi-resolution texture

2. bilinear interpolation at the 2 closest resolutions to get 2 color values

3. linearly interpolate the 2 color values based on the actual resolution

• Summed area tables (see the sketch below)

– fast calculation of the prefilter integral in texture space
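A sketch of a summed area table over a hypothetical grayscale image; once built, any axis-aligned box average (the prefilter integral) costs four lookups regardless of box size:

```cpp
#include <vector>

// Summed area table (SAT): sat(x, y) holds the sum of all texels with
// coordinates < (x, y), padded with a zero row and column.
struct SAT {
    int w, h;
    std::vector<double> sum;                       // (w+1) x (h+1), row-major
    double at(int x, int y) const { return sum[y * (w + 1) + x]; }

    SAT(const std::vector<float>& img, int w_, int h_)
        : w(w_), h(h_), sum((w_ + 1) * (h_ + 1), 0.0) {
        for (int y = 1; y <= h; ++y)
            for (int x = 1; x <= w; ++x)
                sum[y * (w + 1) + x] = img[(y - 1) * w + (x - 1)]
                    + at(x - 1, y) + at(x, y - 1) - at(x - 1, y - 1);
    }

    // Average over the texel box [x0, x1) x [y0, y1).
    double boxAverage(int x0, int y0, int x1, int y1) const {
        double total = at(x1, y1) - at(x0, y1) - at(x1, y0) + at(x0, y0);
        return total / double((x1 - x0) * (y1 - y0));
    }
};
```

Unlike a mipmap lookup, the averaging box may be rectangular rather than square, but it is still restricted to axis-aligned boxes in texture space.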

Questions

• 1. What are some of the problems associated with Mipmaps?

• 2. What are some of the problems associated with SAT?

Viewing: Planar Projections

• Perspective Projection

– rays pass through center of projection

– parallel lines intersect at vanishing points

• Parallel Projection

– center of projection is at infinity

– oblique

– orthographic

How many vanishing points are there in an image produced by parallel projection?

Specifying Perspective Views

• Observer position (eye, center of projection)

• Viewing direction (normal to picture plane)

• Clipping planes (near, far, top, bottom, left, right)
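These three ingredients map directly onto the classic OpenGL 1.x calls; the numeric values below are placeholders:

```cpp
#include <GL/gl.h>
#include <GL/glu.h>

// Set up a perspective view from the three ingredients above.
void setupPerspectiveView() {
    // Clipping planes: left, right, bottom, top, near, far.
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(-1.0, 1.0, -1.0, 1.0, 1.0, 100.0);

    // Observer position (eye) and viewing direction (via a look-at point).
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, 0.0, 5.0,   // eye / center of projection
              0.0, 0.0, 0.0,   // point looked at (defines the view direction)
              0.0, 1.0, 0.0);  // up vector
}
```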

Viewing: OpenGL Pipeline

• Object Space

• Eye Coordinates

• Projection Matrix

• Clipped to Frustum

• Homogenize to normalized device coordinates

• Window coordinates

Why do we not just uniformly scale Z coordinates?
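A sketch of the last two steps (all names here are illustrative). Note that z goes through the same divide by w as x and y, which is why depth is not simply a uniformly scaled copy of eye-space z:

```cpp
#include <array>

using Vec4 = std::array<double, 4>;   // homogeneous clip coordinates (x, y, z, w)

// Homogenize clip coordinates to normalized device coordinates (NDC),
// then apply the viewport transform to get window coordinates.
// Dividing z by w makes depth nonlinear in eye-space z, which is what lets
// it be interpolated linearly in screen space.
void clipToWindow(const Vec4& clip, int viewportW, int viewportH,
                  double& winX, double& winY, double& winZ) {
    double ndcX = clip[0] / clip[3];
    double ndcY = clip[1] / clip[3];
    double ndcZ = clip[2] / clip[3];          // NDC in [-1, 1] after clipping

    winX = (ndcX * 0.5 + 0.5) * viewportW;    // map [-1, 1] to [0, width]
    winY = (ndcY * 0.5 + 0.5) * viewportH;    // map [-1, 1] to [0, height]
    winZ =  ndcZ * 0.5 + 0.5;                 // depth-buffer range [0, 1]
}
```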

Visibility

• 6 visible-surface determination algorithms:

1. Z-buffer

2. Watkins

3. Warnock

4. Weiler-Atherton

5. BSP Tree

6. Ray Tracing

Things to know

• how does it work?

• what are the necessary preconditions?

• asymptotic time complexity

• how can anti-aliasing be done?

• how can shading be incorporated?

• well-suited for hardware?

• parallelizable?

• ease of implementation

• best-case/worst-case scenarios

Z-buffer

• Project all polygons to the image plane; at each pixel, pick the color corresponding to the closest polygon

What has to be done to render transparent polygons?
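A minimal z-buffer sketch (names are illustrative); fragments may arrive in any order, and the depth test keeps the closest one:

```cpp
#include <limits>
#include <vector>

// Z-buffer: per-pixel depth and color, depth initialized to +infinity.
struct Framebuffer {
    int w, h;
    std::vector<float>    depth;
    std::vector<unsigned> color;

    Framebuffer(int w_, int h_)
        : w(w_), h(h_),
          depth(w_ * h_, std::numeric_limits<float>::infinity()),
          color(w_ * h_, 0) {}

    // Called once per rasterized fragment of every polygon.
    void writeFragment(int x, int y, float z, unsigned rgba) {
        int i = y * w + x;
        if (z < depth[i]) {        // closer than what is stored?
            depth[i] = z;
            color[i] = rgba;
        }
    }
};
```

Transparent polygons break this simple overwrite: they are typically drawn after the opaque geometry, sorted back to front, and blended with the stored color instead of replacing it.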

Watkins

• Scanline + depth

– progressing across a scanline, if a pixel span lies inside two or more polygons, use depth to pick the visible one

– to handle interpenetrating polygons, add their intersection points as additional events

Warnock Subdivision

• Start with the whole image as the initial area

– subdivide an area until one of the following holds:

• all surfaces are outside the area

• only one surface is inside, overlapping, or surrounding the area

• a surrounding surface obscures all other surfaces


Weiler-Atherton Subdivision

• Cookie-cutter algorithm: clips polygons against polygons

– sort the polygon list front to back

– clip the remaining polygons against the frontmost polygon

Why is this so difficult?

BSP Trees

• Provides a data structure for back-to-front or front-to-back traversal

– split polygons according to specified planes

– create a tree where edges are front/back, leaves are polygons
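A sketch of back-to-front traversal over such a tree (types are illustrative); at each node, the subtree on the opposite side of the splitting plane from the eye is drawn first:

```cpp
#include <memory>
#include <vector>

struct Plane { double a, b, c, d; };          // plane: ax + by + cz + d = 0
struct Point { double x, y, z; };
struct Polygon { /* vertices, color, ... */ };

// Interior nodes store a splitting plane; polygons lying on the plane
// live at the node, everything else is pushed to the front/back subtrees.
struct BSPNode {
    Plane split;
    std::vector<Polygon> polys;
    std::unique_ptr<BSPNode> front, back;
};

double sideOf(const Plane& p, const Point& e) {
    return p.a * e.x + p.b * e.y + p.c * e.z + p.d;
}

template <class DrawFn>
void drawBackToFront(const BSPNode* node, const Point& eye, DrawFn draw) {
    if (!node) return;
    if (sideOf(node->split, eye) >= 0.0) {    // eye is on the front side
        drawBackToFront(node->back.get(), eye, draw);   // far side first
        for (const Polygon& poly : node->polys) draw(poly);
        drawBackToFront(node->front.get(), eye, draw);  // near side last
    } else {                                  // eye is on the back side
        drawBackToFront(node->front.get(), eye, draw);
        for (const Polygon& poly : node->polys) draw(poly);
        drawBackToFront(node->back.get(), eye, draw);
    }
}
```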

Ray Tracing

• “Ray Casting”

– for each pixel, cast a ray into the scene, and use the color of the point on the closest polygon

Parametric form of a line from a to b: u(t) = a + (b - a)t

Ray Tracing

• Sphere: |P - P_c|² - r² = 0

• Plane: N • P = -D

• Can you compute the intersection of a ray and a plane? A ray and a sphere?
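One way to work these out, substituting the parametric ray u(t) = o + d·t into each implicit equation (types and names are illustrative):

```cpp
#include <cmath>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
};
double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Plane N . P = -D: substitute P = o + d*t and solve for t.
bool intersectPlane(const Vec3& o, const Vec3& d,
                    const Vec3& N, double D, double& t) {
    double denom = dot(N, d);
    if (std::fabs(denom) < 1e-12) return false;     // ray parallel to plane
    t = -(dot(N, o) + D) / denom;
    return t >= 0.0;
}

// Sphere |P - Pc|^2 - r^2 = 0: a quadratic in t; keep the nearest root >= 0.
bool intersectSphere(const Vec3& o, const Vec3& d,
                     const Vec3& Pc, double r, double& t) {
    Vec3 oc = o - Pc;
    double a = dot(d, d);
    double b = 2.0 * dot(oc, d);
    double c = dot(oc, oc) - r * r;
    double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return false;                   // ray misses the sphere
    double s = std::sqrt(disc);
    double t0 = (-b - s) / (2.0 * a), t1 = (-b + s) / (2.0 * a);
    t = (t0 >= 0.0) ? t0 : t1;                      // nearest non-negative root
    return t >= 0.0;
}
```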

Ray Tracing

• Point in polygon tests

Odd-even rule (see the sketch below)

• draw a line from point to infinity in one direction

• count intersections: odd = inside, even = outside

Non-zero winding rule

• counts number of times polygon edges wind around a point in the clockwise direction

• winding number non zero = inside, else outside
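A sketch of the odd-even rule for a 2D polygon (the classic crossing-number test): shoot a ray from the point in the +x direction and flip the inside/outside parity at each edge crossing:

```cpp
#include <vector>

struct Pt { double x, y; };

// Odd-even rule: odd number of crossings = inside, even = outside.
bool insideOddEven(const std::vector<Pt>& poly, const Pt& p) {
    bool inside = false;
    size_t n = poly.size();
    for (size_t i = 0, j = n - 1; i < n; j = i++) {
        const Pt& a = poly[i];
        const Pt& b = poly[j];
        // Does edge (a, b) straddle the horizontal line y = p.y,
        // with the crossing point to the right of p?
        bool straddles = (a.y > p.y) != (b.y > p.y);
        if (straddles) {
            double xCross = a.x + (p.y - a.y) * (b.x - a.x) / (b.y - a.y);
            if (xCross > p.x) inside = !inside;   // each crossing flips parity
        }
    }
    return inside;
}
```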

Lighting

• Terminology

Radiant flux: energy/time (joules/sec = watts)

Irradiance: amount of incident radiant flux / area (how much light energy hitting a unit area, per unit time)

Radiant intensity (of point source): radiant flux over solid angle

Radiance: radiant intensity per unit projected area
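In symbols (Φ = radiant flux, Q = energy, A = area, ω = solid angle, θ = angle from the surface normal):

```latex
\Phi = \frac{dQ}{dt}\ [\mathrm{W}], \qquad
E = \frac{d\Phi}{dA}\ [\mathrm{W/m^2}], \qquad
I = \frac{d\Phi}{d\omega}\ [\mathrm{W/sr}], \qquad
L = \frac{d^2\Phi}{dA\,\cos\theta\, d\omega}\ [\mathrm{W/(m^2\,sr)}]
```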

Sample question (2000)

• Q. As every scout knows, you can start a fire on a sunny day by holding a magnifying glass between the sun and a piece of paper placed on the ground.

Is the radiance of the sun as seen from the focal point of the lens more, less, or the same as the radiance as seen from the same point in the absence of the magnifying glass?

Is the irradiance due to the sun at the focal point more, less, or the same as the irradiance at the same point in the absence of the magnifying glass?

Lighting

• Point to area transport

Computing the irradiance at a surface

– Cosine falloff: N • L

E = F_att × I × (N • L)
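As a tiny worked sketch (N and L assumed normalized, names illustrative):

```cpp
#include <algorithm>

// Irradiance at a surface point from a point light, following the slide's
// E = F_att * I * (N . L). Fatt is the distance attenuation factor.
double irradiance(double Fatt, double I,
                  double Nx, double Ny, double Nz,
                  double Lx, double Ly, double Lz) {
    double cosTheta = Nx * Lx + Ny * Ly + Nz * Lz;
    return Fatt * I * std::max(cosTheta, 0.0);   // no light from below the surface
}
```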

Lighting

• Lambertian (diffuse) surfaces

Radiant intensity has cosine fall off with respect to angle

Radiance is constant with respect to angle

Reason: the projected unit area ALSO gets smaller as a cosine fall off!

F_att × I × K_d × (N • L)

[Figure: Lambertian geometry with N, I, V; the projected length of a unit area seen at angle θ is cos(θ)]

Sample question (2002)

• If you place a candle in the middle of a hollow sphere, what happens to the total amount of light falling on the inside surface of the sphere as the sphere gets bigger? Defend your answer in one or two sentences.

Lighting

• BRDF = Bidirectional Reflectance Distribution Function

– description of how the surface interacts with incident light and emits reflected light

– Isotropic

• independent of rotation about the surface normal (only the relative azimuth of the incident and reflected directions matters)

– Anisotropic

• the absolute azimuthal orientation matters (e.g. brushed metal)

– Don’t forget the generalizations to the BRDF!

• Spatially/spectrally varying, fluorescence, phosphorescence, etc.

Lighting

• Phong specular model

– Isn’t true to the physics, but works pretty well

– reflected light is greatest near the reflection angle of the incident light, and falls off with a cosine power

– L_spec = K_s × cos^n(α), α = angle between the viewer and the reflected ray

– how do you compute the reflected ray vector? (see the sketch below)

• (assume normalized vectors)

[Figure: light direction L, surface normal N, reflection direction R, and viewer direction V at a surface point]
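A sketch of both pieces, assuming N, L, V are normalized and L points from the surface toward the light:

```cpp
#include <algorithm>
#include <cmath>

struct V3 { double x, y, z; };
double dot(const V3& a, const V3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Reflect the light direction L about the normal N: R = 2 (N . L) N - L.
V3 reflect(const V3& N, const V3& L) {
    double s = 2.0 * dot(N, L);
    return { s * N.x - L.x, s * N.y - L.y, s * N.z - L.z };
}

// Phong specular term: Ks * cos^n(alpha), alpha = angle between R and V.
double phongSpecular(const V3& N, const V3& L, const V3& V, double Ks, double n) {
    V3 R = reflect(N, L);
    double cosA = std::max(dot(R, V), 0.0);
    return Ks * std::pow(cosA, n);
}
```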

Lighting

• Local vs. infinite lights

Understand them! Know how to draw the goniometric diagrams for various light/viewer combinations

• N • H model

H is the halfway vector between the viewer and the light

What is the difference in the specular highlight? (see the sketch below)

[Figure: L, N, H, R, V vectors at a surface point]
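A matching sketch for the N • H form (again assuming normalized vectors). Because the angle between N and H is roughly half the angle between R and V, the same exponent n produces a broader, softer highlight than the R-based form:

```cpp
#include <algorithm>
#include <cmath>

struct V3 { double x, y, z; };
double dot(const V3& a, const V3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

V3 normalize(const V3& v) {
    double len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Blinn's halfway-vector form: H bisects L and V, and the specular term
// uses (N . H)^n instead of (R . V)^n.
double blinnSpecular(const V3& N, const V3& L, const V3& V, double Ks, double n) {
    V3 H = normalize({ L.x + V.x, L.y + V.y, L.z + V.z });
    double cosA = std::max(dot(N, H), 0.0);
    return Ks * std::pow(cosA, n);
}
```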

Shading

• Gouraud shading

Compute lighting information (i.e., colors) at the polygon vertices, then interpolate those colors across the polygon (see the sketch below)

Problems?

• Misses highlights

• need high resolution mesh to catch highlights

• Mach bands!
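A minimal Gouraud sketch (names illustrative): lighting is evaluated only at the three vertices, and interior fragments just blend those vertex colors with the fragment's barycentric weights:

```cpp
struct Color { double r, g, b; };

// Gouraud fragment: c0, c1, c2 are the lit vertex colors; (w0, w1, w2) are
// the fragment's barycentric weights inside the triangle (they sum to 1).
Color gouraudFragment(const Color& c0, const Color& c1, const Color& c2,
                      double w0, double w1, double w2) {
    return { w0 * c0.r + w1 * c1.r + w2 * c2.r,
             w0 * c0.g + w1 * c1.g + w2 * c2.g,
             w0 * c0.b + w1 * c1.b + w2 * c2.b };
}
```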

Shading

• Angle interpolation

– interpolate normal angles according to the implicit surface

– compute shading at each point of the implicit surface

CORRECT! But very expensive

Shading

• Phong shading

Compute normals at all points on the polygon via interpolation, and do the lighting computation with the interpolated normals (see the sketch below)

Problems? Difference from angle interpolation?

[Figure: vertex normals N1 and N2 on the polygon approximation of the implicit surface]
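A minimal Phong-shading sketch (names illustrative): the normals, not the colors, are interpolated with the barycentric weights, re-normalized, and fed into the full lighting computation per fragment:

```cpp
#include <cmath>
#include <functional>

struct N3 { double x, y, z; };

N3 normalizeN(const N3& v) {
    double len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Phong fragment: n0, n1, n2 are vertex normals, (w0, w1, w2) barycentric
// weights; shade() stands for any lighting model (ambient + diffuse + specular).
double phongFragment(const N3& n0, const N3& n1, const N3& n2,
                     double w0, double w1, double w2,
                     const std::function<double(const N3&)>& shade) {
    N3 n = { w0*n0.x + w1*n1.x + w2*n2.x,
             w0*n0.y + w1*n1.y + w2*n2.y,
             w0*n0.z + w1*n1.z + w2*n2.z };
    return shade(normalizeN(n));    // lighting evaluated with the per-pixel normal
}
```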

Lighting and Shading

• Know the OpenGL 1.1, 1.2 light equations (what terms mean what)
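As a memory aid, the OpenGL 1.1 fixed-function vertex lighting equation has roughly this shape (m = material, li = light i; each light term is scaled by its attenuation and spotlight factors):

```latex
\mathbf{c} = \mathbf{e}_m + \mathbf{a}_{scene}\,\mathbf{a}_m
+ \sum_i \mathrm{att}_i\,\mathrm{spot}_i
\Big[ \mathbf{a}_{li}\,\mathbf{a}_m
+ \max(N \cdot L_i, 0)\,\mathbf{d}_{li}\,\mathbf{d}_m
+ \max(N \cdot H_i, 0)^{shininess}\,\mathbf{s}_{li}\,\mathbf{s}_m \Big]
```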

Exotic uses of textures

• Environment/reflection mapping

• Alphas for selecting between textures/shading parameters

• Bump mapping

• Displacement mapping

• Object placement

• 3d textures

Good Luck!

Good Luck on the Final!

More review questions at: http://graphics.stanford.edu/courses/cs248-99/final_review
