
CS 395/495-25: Spring 2004
IBMR:
Measuring Lights, Materials,
Lenses and more
Jack Tumblin
jet@cs.northwestern.edu
Recall: An Image Is…
2D Image:
A map of light
intensities
A ‘Camera’:
?What are ALL
the possibilities?
Position(x,y)
Light + 3D Scene:
Illumination,
shape, movement,
surface BRDF,…
A Planar Projection Image Is…
2D Image:
Collection of rays
through a point
Image Plane
I(x,y)
‘Center of
Projection’
(P3 or P2 Origin)
Position(x,y)
Angle(θ,φ)
Light + 3D Scene:
Illumination,
shape, movement,
surface BRDF,…
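A minimal sketch of that position/angle correspondence, assuming a pinhole at the P3 origin and an image plane at distance f (my parameterization, not the slide’s):

    import math

    def ray_to_pixel(theta, phi, f=1.0):
        """Map a ray through the center of projection, given by polar
        angle theta (from the optical axis) and azimuth phi, to its
        image-plane position (x, y) at distance f from the pinhole."""
        r = f * math.tan(theta)          # radial offset from the principal point
        return (r * math.cos(phi), r * math.sin(phi))

    print(ray_to_pixel(math.radians(10), 0.0))   # ≈ (0.176, 0.0)

Each ray direction (θ,φ) lands at exactly one position (x,y), which is why the two parameterizations are interchangeable.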
Image Making: Pinhole → Thin Lens
• Interactive Thin Lens Demo
(search ‘physlet optical bench’)
http://www.swgc.mun.ca/physics/physlets/opticalbench.html
• From this geometry (for next time):
can you derive the Thin Lens Law?
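For reference, the result the derivation should reach is the Gaussian thin-lens law, which follows from similar triangles in the optical-bench geometry above:

    \[
    \frac{1}{s_o} + \frac{1}{s_i} = \frac{1}{f},
    \qquad m = -\frac{s_i}{s_o}
    \]

where s_o is the object distance, s_i the image distance, f the focal length, and m the magnification.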
Incident Light Measurement
• Flux W = power, Watts, # photons/sec
• Uniform, point-source light:
flux on a patch of surface falls with distance²:
E ∝ W / r²
[figure: point source at distance r from the surface patch]
Light Measurement
• Flux W = power, Watts, # photons/sec
• Irradiance E: flux arriving per unit area,
(regardless of direction)
E = Watts/area = dW/dA
But direction makes a
big difference when
computing E...
Foreshortening Effect: cos(θ)
• Larger incident angle θi
spreads same flux over larger area
• flux per unit area becomes W·cos(θi) / area
[figure: circular ‘bundle’ of incident rays, flux W, arriving at angle θi]
• Foreshortening geometry imposes
an angular term cos(θi) on energy transfer
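A numeric sketch of the last two slides, assuming an isotropic point source of total flux W watts spreading over the full 4π-steradian sphere:

    import math

    def irradiance_point_source(W, r, theta_i):
        """Irradiance (W/m^2) on a patch at distance r whose normal makes
        angle theta_i (radians) with the incoming direction."""
        E_normal = W / (4.0 * math.pi * r ** 2)   # inverse-square falloff
        return E_normal * math.cos(theta_i)       # foreshortening term

    print(irradiance_point_source(100.0, 1.0, 0.0))   # ≈ 7.96 W/m^2
    print(irradiance_point_source(100.0, 2.0, 0.0))   # ≈ 1.99: doubling r quarters E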
Irradiance E
• To find irradiance at a point on a surface,
• Find flux from each (point?) light source,
• Weight flux by its direction: cos(θi)
• Add all light sources: or more precisely,
integrate over the entire hemisphere Ω
Defines Radiance L:
L = (watts / area) / sr
(sr = steradians; solid angle Ω
= surface area on unit sphere)
Radiance L
• But for distributed (non-point) light sources?
integrate flux over the entire hemisphere Ω.
But what are the units of what we integrate?
Radiance L:
L = (watts / area) / sr
(sr = steradians; solid angle Ω
= surface area on unit sphere)
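A numeric sketch of that hemisphere integral, E = ∫Ω L(θ,φ)·cos θ dω with dω = sin θ dθ dφ (a simple midpoint-rule quadrature; the closed-form check is that a uniform hemisphere of radiance L gives E = π·L):

    import math

    def irradiance_from_radiance(L, n_theta=256, n_phi=512):
        """Integrate E = ∫ L(theta, phi) cos(theta) dω over the hemisphere."""
        d_theta = (math.pi / 2) / n_theta
        d_phi = (2 * math.pi) / n_phi
        E = 0.0
        for i in range(n_theta):
            theta = (i + 0.5) * d_theta
            for j in range(n_phi):
                phi = (j + 0.5) * d_phi
                E += L(theta, phi) * math.cos(theta) * math.sin(theta) * d_theta * d_phi
        return E

    # Uniform radiance 1 (W/m^2/sr) over the hemisphere -> E = pi (W/m^2):
    print(irradiance_from_radiance(lambda th, ph: 1.0))   # ≈ 3.14159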
Lighting Invariants
Why doesn’t surface intensity change with distance?
• We know point source flux drops with distance: 1/r²
• We know surface is made of infinitesimal point sources...
[figure: camera viewing a point light (‘intensity’ ∝ 1/r²) and a surface (‘intensity’: constant ?!?!)]
Lighting Invariants
Why doesn’t surface intensity change with distance?
Because camera pixels measure Radiance, not flux!
– pixel value ∝ flux · cos(θ) / sr
– ‘good lens’ design: cos(θ) term vanishes. Vignetting = residual error.
• Pixel’s size in sr fixed:
– Point source fits in one pixel: 1/r²
– Viewed surface area grows by r²,
cancels 1/r² flux falloff
[figure: camera viewing both; light ‘intensity’ ∝ 1/r², surface ‘intensity’: constant (?!?!)]
Lighting Invariants
Radiance Images are LINEAR:
α·(Radiance caused by (Light 1)) +
β·(Radiance caused by (Light 2))
= Radiance caused by (α·Light 1 + β·Light 2)
[figure: image A + image B = composite image]
http://www.sgi.com/grafica/synth/index.html
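A sketch of what this linearity permits, assuming the pixel values are true linear radiance measurements (the random arrays are stand-ins for captured single-light images):

    import numpy as np

    # Radiance images of the SAME scene, each captured with ONE light on:
    img_light1 = np.random.rand(480, 640)
    img_light2 = np.random.rand(480, 640)

    alpha, beta = 0.7, 1.5     # any scales; negative values give ‘negative’ light
    relit = alpha * img_light1 + beta * img_light2
    # ‘relit’ is the radiance image the camera WOULD capture with both
    # lights on at alpha and beta times their original strengths.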
Lighting Invariants
Light is Linear:
Allows ‘negative’ light!
α·(Radiance caused by (Light 1)) +
β·(Radiance caused by (Light 2))
= Radiance caused by (α·Light 1 + β·Light 2)
[figure: image A – image B = difference image]
http://www.sgi.com/grafica/synth/index.html
Point-wise Light Reflection
• Given:
– Infinitesimal surface patch dA,
– illuminated by irradiance amount E
– from just one direction (θi,φi)
• How should we measure the returned light?
• Ans: by emitted RADIANCE,
measured for all outgoing directions
(measured on the surface of the hemisphere Ω)
[figure: patch dA, incident angle θi, outgoing hemisphere]
Point-wise Light Reflection: BRDF
Bidirectional Reflectance Distribution Function
Fr(θi,φi,θe,φe) = Le(θe,φe) / Ei(θi,φi)
• Still a ratio of (outgoing/incoming) light, but
• BRDF: Ratio of
outgoing RADIANCE in one direction: Le(θe,φe)
that results from
incoming IRRADIANCE in one direction: Ei(θi,φi)
• Units are tricky:
BRDF = Fr = Le / Ei
[figure: patch dA, incident direction θi, exit direction θe]
Point-wise Light Reflection: BRDF
Bidirectional Reflectance Distribution Function
Fr(θi,φi,θe,φe) = Le(θe,φe) / Ei(θi,φi)
• Still a ratio of (outgoing/incoming) light, but
• BRDF: Ratio of
outgoing RADIANCE in one direction: Le(θe,φe)
that results from
incoming IRRADIANCE in one direction: Ei(θi,φi)
• Units are tricky:
BRDF = Fr = Le / Ei =
(Watts/area/sr) / (Watts/area)
[figure: patch dA, incident direction θi, exit direction θe]
Point-wise Light Reflection: BRDF
Bidirectional Reflectance Distribution Function
Fr(θi,φi,θe,φe) = Le(θe,φe) / Ei(θi,φi)
• Still a ratio of (outgoing/incoming) light, but
• BRDF: Ratio of
outgoing RADIANCE in one direction: Le(θe,φe)
that results from
incoming IRRADIANCE in one direction: Ei(θi,φi)
• Units are tricky:
BRDF = Fr = Le / Ei = (Watts/area/sr) / (Watts/area)
= 1/sr
Point-wise Light Reflection: BRDF
Bidirectional Reflectance Distribution Function
Fr(θi,φi,θe,φe) = Le(θe,φe) / Ei(θi,φi), with units of (1/sr)
• ‘Bidirectional’ because the value is the SAME if we
swap in, out directions: (θe,φe) ↔ (θi,φi)
Important Property! aka ‘Helmholtz Reciprocity’
• BRDF results from surface’s
microscopic structure...
• Still only an approximation:
ignores subsurface scattering...
[figure: patch dA with incident irradiance Ei at θi and exit radiance Le at θe]
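A minimal sketch of these definitions for the simplest case, an ideal Lambertian (diffuse) surface, whose BRDF is the constant ρ/π (the π makes an albedo-1 surface conserve energy over the hemisphere):

    import math

    def brdf_lambertian(albedo, theta_i, phi_i, theta_e, phi_e):
        """Ideal diffuse BRDF: Le/Ei = albedo/pi, in units of 1/sr."""
        return albedo / math.pi

    rho, E_i = 0.5, 10.0                                    # irradiance in W/m^2
    L_e = brdf_lambertian(rho, 0.3, 0.0, 1.0, 2.0) * E_i    # radiance in W/m^2/sr

    # Helmholtz reciprocity: swapping in/out directions changes nothing.
    assert brdf_lambertian(rho, 0.3, 0.0, 1.0, 2.0) == \
           brdf_lambertian(rho, 1.0, 2.0, 0.3, 0.0)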
Scattering Difficulties:
For many surfaces, single-point BRDFs do not exist
Example:
Leaf Structure
[figure: leaf cross-section; incident Ei at θi, emitted Le at θe]
– Angles depend on
refractive index,
scattering, cell wall
structures, etc.
– Depends on total area
of cell wall interfaces
Subsurface Scattering Models
Classical: Kubelka-Munk (1930s, for paint; many proprietary variants)
CG approach: Hanrahan & Krueger (1990s)
More recent: ‘dipole model’ (Jensen et al., 2001)
Marble BRDF
Marble BSSRDF
Subsurface Scattering Models
Classical: Kubelka-Munk (1930s, for paint; many proprietary variants)
CG approach: Hanrahan & Krueger (1990s)
More recent: ‘dipole model’ (Jensen et al., 2001)
Skin BRDF (measured)
Skin BSSRDF (approximated)
BSSRDF Model
Approximates scattering result as
embedded point sources below a BRDF surface:
BSSRDF: “A Practical Model for Subsurface Light Transport” Henrik
Wann Jensen, Steve Marschner, Marc Levoy, Pat Hanrahan,
SIGGRAPH’01 (online)
BSSRDF Model
• Embedded point sources
below a BRDF surface
• Ray-based, tested,
Physically-Measurable
Model
• ?Useful as a
predictive model for
IBMR data?
Wann Jensen et al., 2001
Summary: Light Measurement
• Flux W = power, Watts, # photons/sec
• Irradiance E = Watts/area = dW/dA
• Radiance L = (Watts/area)/sr = (dW/dA)/sr
• BRDF: Measure EMITTED radiance that
results from INCOMING irradiance from just
one direction:
BRDF = Fr = Le / Ei = (Watts/area/sr) / (Watts/area) = 1/sr
IBMR Tools
• Digital Light Input:
– Light meter: measure visible irradiance E
(some have plastic ‘dome’ to ensure accurate foreshortening)
– Camera: pixels measure Radiance Li ; flux arriving at
lens from one (narrow solid) angle
• Digital Light Output:
– Luminaires: point lights, extended(area) sources
– Emissive Surfaces: CRT, LCD surface
– Projectors: laser dot, stripe, scan; video display
• Light Modifiers (Digital?):
– Calibration objects, shadow sources, etc.
– Lenses, diffusers, filters, reflectors, collimators...
– ?Where are the BRDF displays / printers?
Two Big Missing Pieces
• Computer-controlled BRDF.
– Can we really do without it?
– Are cameras and projectors enough to
‘import the visible world’ into our computers?
• BRDF is not enough:
– Subsurface scattering is a
crucial aspect of photographed images
– ?How can we model it? Measure it? Use it?
More help:
• GREAT explanation of BRDF:
• www.cs.huji.ac.il/~danix/advanced/RenderingEq.pdf
• Some questions about measuring light:
END
IBMR --- May 13, 2004:
Projects?
(due Tues May 25!)
Let’s discuss them…
Summary: Light Measurement
• Flux W = power, Watts, # photons/sec
• Irradiance E = Watts/area = dW/dA
• Radiance L = (Watts/area)/sr = (dW/dA)/sr
• BRDF: Measure EMITTED radiance that
results from INCOMING irradiance from just
one direction:
BRDF = Fr = Le / Ei = (Watts/area/sr) / (Watts/area) = 1/sr
IBMR: Measure, Create, Modify Light
How can we measure ‘rays’ of light? Light sources? Scattered rays? etc.
[figure: scene diagram labeled with shape, position, movement,
emitted light; BRDF, texture, scattering; reflected, scattered light…
Cameras capture a subset of these rays; digital light sources
(projectors) can produce a subset of these rays.]
‘Scene’ modifies Set of Light Rays
What measures light rays in, out of scene?
Measure Light LEAVING a Scene?
Towards a camera?...
Measure Light LEAVING a Scene?
Towards a camera: Radiance.
Light Field Images
measure Radiance L(x,y)
Measure light ENTERING a Scene?
from a (collection of) point sources at infinity?
Measure light ENTERING a Scene?
from a (collection of) point sources at infinity?
‘Light Map’ Images
(texture map light source)
describe Irradiance E(x,y)
Measure light ENTERING a Scene?
leaving a video projector lens?
‘Reversed’ Camera:
emits Radiance L(x,y)
Measure light ENTERING a Scene?
from a video projector? — Leaving the lens:
[figure: projector beam; Radiance L leaving the lens,
Irradiance E arriving at the scene]
‘Full 8-D Light Field’ (10-D, actually: add time t and wavelength λ)
• Cleaner Formulation:
– Orthographic camera,
positioned on a sphere
around the object/scene
– Orthographic projector,
positioned on a sphere
around the object/scene
– (and wavelength and time)
F(xc, yc, θc, φc, xl, yl, θl, φl, λ, t)
[figure: camera and projector on spheres around the scene]
Summary: Light Measurement
• Flux W = power, Watts, # photons/sec
• Irradiance E = Watts/area = dW/dA
• Radiance L = (Watts/area)/sr = (dW/dA)/sr
• BRDF: Measure EMITTED radiance that results
from INCOMING irradiance from just one direction:
BRDF = Fr = Le / Ei = (Watts/area/sr) / (Watts/area) = 1/sr
• Lenses map radiance to the image plane (x,y):
THUS: Pixel x,y must measure Radiance L at x,y.
Well, not exactly; there are distortions!…
What do Photos Measure?
What We Want
What We Get
Film Response:
(digital cameras, video cards too!)
approximately linear,
but ONLY on log-log axes.
Two Key parameters:
m == scale == exposure
γ == gamma == ‘contrastyness’
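A sketch of that response model, assuming the common power-law form Z = (m·L)^γ, which plots as a straight line of slope γ on log-log axes:

    import numpy as np

    def film_response(L, m=1.0, gamma=0.6):
        """Power-law model: pixel value Z = (m*L)**gamma, so
        log Z = gamma * (log L + log m): linear on log-log axes."""
        return (m * L) ** gamma

    L_vals = np.logspace(-2, 2, 5)              # radiances spanning 4 decades
    Z = film_response(L_vals)
    print(np.diff(np.log(Z)) / np.diff(np.log(L_vals)))   # slope = gamma everywhere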
Problem:Map Scene to Display
Domain of Human Vision:
from ~10^-6 to ~10^+8 cd/m²
Range of Typical Displays:
from ~1 to ~100 cd/m²
[figure: luminance axis from starlight (10^-6) through moonlight (10^-2),
office light (~1–100), daylight (10^+4), to flashbulb (10^+8);
display pixel values 0–255 span only the narrow middle of this range]
High-Contrast Image Capture?
• An open problem! (esp. for video...)
• Direct (expensive) solution:
– Flying Spot Radiometer:
brute force instrument, costly, slow, delicate
– Novel Image Sensors:
line-scan cameras, logarithmic CMOS circuits,
cooled detectors, rate-based detectors...
• Most widely used idea: multiple exposures
• An elegant paper (Debevec ’97) describes how:
(on class website)
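The core multiple-exposure idea in sketch form; this toy version assumes an already-linear sensor clipped to [0,1], so real film still needs the response-curve recovery on the next slides:

    import numpy as np

    def merge_exposures(images, exposure_times):
        """images: J same-scene shots (values 0..1), shape (J, H, W);
        exposure_times: J durations. Returns relative radiance per pixel."""
        images = np.asarray(images, dtype=float)
        times = np.asarray(exposure_times, dtype=float)
        est = images / times[:, None, None]          # per-shot radiance estimates
        valid = (images > 0.02) & (images < 0.98)    # drop under/over-exposed pixels
        return (est * valid).sum(axis=0) / np.maximum(valid.sum(axis=0), 1)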
Use Overlapped Exposure Values
[figure, built up over four slides: several exposures’ response
curves f(logL), shifted along the scene-intensity axis from
starlight through moonlight, office light, and daylight to
flashbulb; together the overlapped curves cover the full range]
What is the camera response curve?
And what are the pixel radiances?
(See Debevec SIGGRAPH 1997.)
[figure: unknown response curve f(logL) vs. Pixel Value Z, with samples j = 0…6]
Debevec’97 Method
STEP 1:
--number the images ‘j’,
--pick fixed spots (xi,yi)
that sample the scene’s
radiance values logLi well
[figure: unknown response curve f(logL) vs. Pixel Value Z,
sampled at logLi for images j = 0…6]
Debevec’97 Method
STEP 2:
--Collect pixel values Zij
(from image j, location i)
--(All of them sample the
response curve f(logL)…)
[figure: samples Zij on the response curve f(logL) vs. Pixel Value Z, j = 0…6]
Debevec’97 Method
• In image Ij, log ‘exposure’ is shifted by log(2^j) = j·log(2)
• In image Ij, the pixel at (xi,yi) has known pixel value Zij
and unknown radiance logLi
• Film response curve f(logL) = Z; each unknown logLi
is sampled at several known pixel values Zij:
f(log(Li · 2^j)) = Zij, or more simply: f(logLi + j·C) = Zij, with C = log(2)
[figure: samples Zij, j = 0…6, at logLi on the unknown curve F(logL) vs. Pixel Value Z]
Debevec’97 Method:
It’s another Null-Space Problem…
• In image Ij, the pixel at (xi,yi) has known pixel value Zij
and unknown radiance logLi
• f(logLi + j·C) = Zij. How do we find f() and logLi?
• TRICK: Use f() as a scale factor for each pixel value Zij:
fij · (logLi + j·C) – Zij = 0
[figure: samples Zij, j = 0…6, at logLi; scale factors fij on the curve]
Debevec’97 Method:
It’s another Null-Space Problem…
fij · (logLi + j·C) – Zij = 0
Wait, wait, wait. We have TWO unknowns?
NEXT WEEK: Read Debevec’97;
explain how we solve this!
[figure: samples Zij, j = 0…6, at logLi; scale factors fij on the curve]
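For reference, a minimal numpy sketch of the least-squares system Debevec & Malik actually solve: they recover the log-inverse response g = f⁻¹ from g(Zij) = logLi + j·C, plus a smoothness term and one equation pinning the curve’s free offset. This simplified version omits the paper’s hat-shaped weighting, and the variable names are mine:

    import numpy as np

    def solve_response(Z, log_dt, smoothness=100.0, n_levels=256):
        """Z[i, j]: integer pixel value of spot i in image j (0..255).
        log_dt[j]: log exposure of image j (here, j * log(2)).
        Returns g[z] (log-inverse response) and logL[i] (log radiances)."""
        n_spots, n_images = Z.shape
        n_rows = n_spots * n_images + (n_levels - 2) + 1
        A = np.zeros((n_rows, n_levels + n_spots))
        b = np.zeros(n_rows)
        k = 0
        # Data terms: g(Z_ij) - logL_i = log_dt_j
        for i in range(n_spots):
            for j in range(n_images):
                A[k, Z[i, j]] = 1.0
                A[k, n_levels + i] = -1.0
                b[k] = log_dt[j]
                k += 1
        # Smoothness terms: g(z-1) - 2*g(z) + g(z+1) ≈ 0
        for z in range(1, n_levels - 1):
            A[k, z - 1:z + 2] = smoothness * np.array([1.0, -2.0, 1.0])
            k += 1
        A[k, n_levels // 2] = 1.0      # pin g(middle gray) = 0
        x = np.linalg.lstsq(A, b, rcond=None)[0]
        return x[:n_levels], x[n_levels:]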
Camera Abilities / Limitations
• Nonlinear Intensity Response: S-shaped
(on log-log axes)
• Low-Contrast Devices: Noise limited (~500:1)
• Varied Spectral Response: RGB1 != RGB2...
• Color Sensing Strategies:
– 3-chip cameras: best, but expensive!
– Mosaic sensor: trades resolution for color
• Nonuniform sensitivity & geometry
– Lens limitations (vignetting, radial distortion,
bloom/scatter, uneven focus, ...)
– CCD Sensor geometry: VERY exact, repeatable
Display Abilities / Limitations
• Nonlinear Intensity Response: S-shaped
• Low-Contrast Devices
– scattering usually sets upper bounds
– Best Contrast: laser projectors, some DLP
devices, specialized devices...)
• Varied Spectral Response: RGB1 != RGB2...
• Color Reproducing Strategies: varied...
• Nonuniform sensitivity & geometry:
– CRTs: e-beam cos(θ), distortion, focus,
convergence...
– LCDs, DLPs: VERY exact, (but pixels die, etc.)
Light Modifiers?   Discuss!
• Low-Contrast BRDF ‘Devices’ to measure light?
– ‘Light Probe’ mirror sphere: BRDF = ?
– Diffuse reflectances limited to about 0.02–0.95
– Diffractive materials: complex BRDF may be useful...
– (Transmissive LCDs?) ?Can you name more?
• PRECISELY Linear ‘Response’ to light...
BRDFs are fixed ratios; no intensity dependence!
• A smudge or nick may modify BRDF drastically
• Shadows? Precision? Inter-reflections?
• PRECISE input/output symmetry --BUT--
• Scattering WITHIN material can be trouble...
What is the complete IBMR toolset?
• Camera(s) + light probe, etc.:
→ arbitrary Radiance meter.
• Sphere of Projectors/CRTs:
→ arbitrary Irradiance source.
• Some (as yet unknown) device:
→ arbitrary BRDF / light ray modifier
Is our toolset complete?
Have we spanned the IBMR problem? ...
Missing the most important tool…
• Human Visual System.
– the receiver/user for MOST IBMR data.
– Eye is a very poor light meter, but very good at
sensing BRDF and (some) shape.
– Eye senses change;
integration used to estimate the world
– Eye permits tradeoffs of
geometry vs. surface appearance
– Eye permits selective radiance distortions,
especially to illumination:
Picture: Copy Appearance
Details Everywhere;
segmented partial ordering of intensities.
Local changes matter.
Absolute intensities don’t
matter much,
but boundaries, shading,
& CHANGES do.
--- WANTED: ---
visually important information
in machine-readable form.
Visible Light Measurement
• ‘Visible Light’ = what our eyes can perceive;
– narrow-band electromagnetic energy:
λ ≈ 400-700 nm (nm = 10^-9 meter)
<1 octave; (honey bees: 3-4 ‘octaves’ ?chords?)
• Not uniformly visible vs. wavelength λ:
– Equiluminant Curve
defines ‘luminance’
vs. wavelength
– eyes sense spectral
CHANGES well, but
not wavelength
– Metamerism
http://www.yorku.ca/eye/photopik.htm
Visual Appearance Measurement
• Measurement of Light—easy. Perception?—hard.
– ‘Color’ == crudely perceived wavelength spectrum
– 3 sensed dimensions from spectra.
– CIE-standard X,Y,Z color spectra: linear coord. system
for spectra that spans all perceivable colors
– Projective!
luminance = Y
chromaticity = (x,y) = (X/(X+Y+Z), Y/(X+Y+Z))
– NOT perceptually uniform... (MacAdam’s ellipses...)
• Many Standard Texts, tutorials on color
– Good: http://www.colourware.co.uk/cpfaq.htm
– Good: http://www.yorku.ca/eye/toc.htm
– Watt & Watt pg 277-281
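A one-function sketch of that projective step, using the standard CIE definitions (the sample value is the equal-energy white point):

    def xyz_to_xy(X, Y, Z):
        """CIE chromaticity: Y carries luminance; (x, y) = the projection
        of (X, Y, Z) onto the plane X + Y + Z = 1."""
        s = X + Y + Z
        return (X / s, Y / s)

    print(xyz_to_xy(1.0, 1.0, 1.0))   # equal-energy white -> (0.333…, 0.333…)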
END