
Introduction to Computer Graphics: ITCS 4120/5120
Dr. Zachary Wartell
Revision 1.2
2/16/07
Copyright 2006, Dr. Zachary Wartell, UNCC,
All Rights Reserved
Introduction to Computer Graphics: ITCS 4120/5120
Professor: Dr. Zachary Wartell
www.cs.uncc.edu/~zwartell
TA: unknown
Textbook: Computer Graphics with OpenGL, Hearn &
Baker
Suggested (depending on background):
-C++ for Java Programmers, Timothy Budd, 1999.
-C++ Primer Plus: Teach Yourself Object-Oriented
Programming, Stephen Prata
©Zachary Wartell
Prerequisites for ITCS 4120/5120
Juniors/Seniors/Masters/Ph.D. Students
MATH 2164: Matrices and Linear Algebra
ITCS 2214: Data Structures
Strong programming and debugging skills!
Languages:
-you know C++
or
-you know Java
- & understand low-level programming such as:
-ITCS 3182 Computer Organization and Architecture or
-ITCS 3110 Compiler Construction or
-have programmed in C
- & capable of learning a new language on your own
©Zachary Wartell
Tools for Programming Projects in ITCS 4120/5120
 C++
 Microsoft Visual Studio 2005
Woodward 335
COIT/BISOM students MSDNAA:
http://www.labs.uncc.edu/basics/compguide.html
OpenGL – [MSVC 2005] – 2D/3D graphics API
library
FLTK – [class website] – GUI API library
SVN – [class website] - source code revision control
system. Used for turning in projects.
Assignments are best done in Woodward 335
Tools and API’s have been tested in 335
©Zachary Wartell
What is computer graphics (CG)?
It’s a core software & hardware technology in:
-Computer Aided Design (CAD)
-Scientific Visualization
-Medical Visualization
-Education
-Computer Interfaces
-Computer/Video Games
-Virtual Reality & Visual Simulation
-Movies
©Zachary Wartell
CG: a core technology in CAD
AutoCAD™
5Spice™
©Zachary Wartell
CG: a core technology in Scientific Visualization
Molecular Vis.
Weather
©Zachary Wartell
CG: a core technology in Medical Visualization
©Zachary Wartell
CG: a core technology in Computer Interfaces
Microsoft Windows
GNOME
Mac OS X
©Zachary Wartell
CG: a core technology in Games
FarCry (UbiSoft)
Falcon 4.0 (Microprose)
Zelda (Nintendo)
©Zachary Wartell
CG: a core technology in VR & Vis. Sim.
Pilot Training (AlSim, Inc.)
Exposure Therapy for Flying Phobia (Larry Hodges et al.)
©Zachary Wartell
CG: a core technology in VR & Vis. Simulation
Two-hand Interface for Weather Vis.
“Holospace” Surround-Screen Display (Barco)
©Zachary Wartell
CG: a core technology in Movies
Star Wars: Episode II™ (LucasFilm)
Shrek 2™ (DreamWorks)
©Zachary Wartell
What disciplines does CG technology draw on?
algorithms
math
-basic graphics (ITCS 4120) –
linear/vector algebra, geometry & trig.
-advanced graphics
advanced calculus, computational geometry, differential
geometry, topology, …..
optics (treated only approximately in ITCS 4120)
software engineering and programming
hardware engineering
psychophysics (branch of psychology)
-human visual system
industrial art & design
©Zachary Wartell
ITCS 4120 covers a subset of these disciplines
ITCS 4120: Lecture Material
45% - algorithms
45% - math (linear/vector algebra, geometry & trig.)
3% - optics
3% - hardware engineering
3% - psychophysics
1% - programming
ITCS 4120: Projects
50(?)% - programming (C++, OpenGL, FLTK)
50(?)% - deeply understanding Lecture Material
“you don’t really understand it until you’ve
implemented it”
©Zachary Wartell
How long has CG been around?
Ivan Sutherland, SketchPad, 1963 MIT
CRT, light-pen, direct-manipulation 2D graphics
©Zachary Wartell
How long has CG been around?
William Fetter, 1960, Boeing Aircraft Co.
“Boeing Man”, human figure simulation; credited with coining the term “computer graphics”
©Zachary Wartell
In what way do CG applications differ?
2D versus 3D
Speed – Frames Per Second (FPS)
Realism
$$$
vs
1950’s, Whirlwind, $4.5M, 40K adds/s
 today’s PC: $1K, 2-3B ops/s
 CG: 1995, $100K, SGI = 2004, $1K PC
©Zachary Wartell
Differences in CG applications: Speed
Speed: time to compute one image, and the corresponding frames per second (FPS):
- Movies: hours per image (0.0xx FPS)
- General Interactive Application: 200 ms (5-15 FPS)
- Games: 33 ms (15-60 FPS)
- Visual Simulation: 16 ms (60+ FPS)
(the last three categories are “Interactive Graphics”; shorter times mean faster display)
©Zachary Wartell
Differences across CG applications: Realism
Realism
- more math, more physics → more realism
(real-time CG → ray-tracing → radiosity → “rendering equation”)
- example applications, roughly in order of increasing photorealism: Games, Vis. Sim., “Cartoon” Movies, Movie Special FX, Product Evaluation
- display technology & human visual perception
(image fidelity, stereopsis, motion parallax)
©Zachary Wartell
CG: Speed vs. Realism
generally: more realism → less speed
but Moore’s Law continues to reign
-price/performance improves 2x every 18 months
-since 1995 gaming market driving
graphics hardware
-Nintendo GameCube™ (ATI)
-Xbox™ (Nvidia inside)
-PC: nVidia Geforce 7900, ATI Radeon X1900
display capability still lags human eye’s precision
(but there are substantial and continuing advances)
©Zachary Wartell
What are a CG application’s components?
[Diagram: components of a CG application. Application Software sits on a CG 3D API and a CG GUI (2D) API, which sit on CG System Software and the OS, running on Graphics Computing Hardware and General Computing Hardware. Image Synthesis output goes to the Display and on to the user's Eye and Brain; the user's Body operates an Input Device that feeds back into the application.]
©Zachary Wartell
Image Synthesis Processes
Image Synthesis
Modeling
Viewing
Rendering
 Modeling: The process of creating objects of a scene that will be
rendered by the graphics hardware.
 Viewing: Specification of camera and a viewing window
that determines the part of the world (of objects) that will be included
in the final image.
 Rendering: The process that creates an image of the objects within
the current view, taking into account lighting parameters and material
characteristics.
©Zachary Wartell, K.R. Subramanian
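To make the three processes concrete, here is a minimal sketch in legacy OpenGL/GLUT (the API family used for the course projects); the triangle geometry, camera values, and window setup are illustrative assumptions, not part of the slides.

// Minimal sketch of modeling / viewing / rendering using legacy OpenGL/GLUT.
// Assumes the standard <GL/glut.h> header; geometry and camera are illustrative.
#include <GL/glut.h>

void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Viewing: specify the viewing volume and the camera (eye, look-at point, up vector).
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, 1.0, 0.1, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, 0.0, 5.0,    // eye
              0.0, 0.0, 0.0,    // center
              0.0, 1.0, 0.0);   // up

    // Modeling: create the objects of the scene (here, a single triangle).
    glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  1.0f, 0.0f);
    glEnd();

    // Rendering: the pipeline rasterizes the primitives into the framebuffer.
    glFlush();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("modeling / viewing / rendering");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}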
2D Image Synthesis: Coordinate Systems (CS)
[Pipeline diagram; data (with its CS) and computations:]
2D Models (Model CS)
→ Coordinate Transforms →
2D World (World CS)
→ View Transform (2D→2D), defined by the View (2D Window) →
→ Convert Geometry to Image →
2D Image ([Display] Device CS, DCS)
©Zachary Wartell
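A small C++ sketch of this 2D chain, assuming simple translate/scale model-to-world transforms and a top-left device-CS origin; the type and function names are hypothetical, not a required interface.

// Sketch of the 2D coordinate-system chain: Model CS -> World CS -> [Display] Device CS.
// Conventions below (scale+translate model transform, top-left device origin) are assumptions.
#include <cstdio>

struct Point2 { double x, y; };

// Model -> World: a simple scale + translate "coordinate transform".
Point2 modelToWorld(Point2 p, double sx, double sy, double tx, double ty)
{
    return { p.x * sx + tx, p.y * sy + ty };
}

// World -> Device: map the 2D view window [wxMin,wxMax] x [wyMin,wyMax]
// onto a raster of width x height pixels (origin assumed at the top-left).
Point2 worldToDevice(Point2 w,
                     double wxMin, double wxMax, double wyMin, double wyMax,
                     int width, int height)
{
    double u = (w.x - wxMin) / (wxMax - wxMin);   // 0..1 across the window
    double v = (w.y - wyMin) / (wyMax - wyMin);   // 0..1 up the window
    return { u * width, (1.0 - v) * height };     // flip y for the device CS
}

int main()
{
    Point2 pModel = { 0.5, 0.5 };
    Point2 pWorld = modelToWorld(pModel, 2.0, 2.0, 10.0, 10.0);
    Point2 pDev   = worldToDevice(pWorld, 0.0, 20.0, 0.0, 20.0, 640, 480);
    std::printf("device pixel: (%.1f, %.1f)\n", pDev.x, pDev.y);
    return 0;
}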
3D Image Synthesis: Coordinate Systems (CS)
[Pipeline diagram; data (with its CS) and computations:]
3D Models (Model CS)
→ Coordinate Transforms →
3D World (World CS), plus Lights
→ View Transform (3D→2D), defined by the View (Eye & Window) →
→ Convert Geometry to Image →
2D Image ([Display] Device CS, DCS)
©Zachary Wartell
Image Data Structures & Display Hardware
 So far: CG application components, Image Synthesis,
Coordinate Systems. Let’s start with details about:
- the 2D Image in the [Display] Device CS (DCS)
- the Display, Graphics Computing Hardware, and General Computing Hardware components (see the earlier component diagram)
©Zachary Wartell
Basic Definitions
 RASTER: A rectangular array of points or dots (either on physical
display or a data structure in memory).
 PIXEL (Pel): One dot or picture element of the raster
 SCAN LINE: A row of pixels
Raster displays create an image by sequentially drawing out the pixels of the scan lines that form the raster.
©Larry F. Hodges, Zachary Wartell
Pixel
• Pixel - The most basic addressable element in an image or on a display
– CRT - Color triad (RGB phosphor dots)
– LCD - Single color element
• Resolution - measure of the number of pixels in an image (m by n)
– m - Horizontal image resolution
– n - Vertical image resolution
©Larry F. Hodges, Zachary Wartell
Other meanings of resolution
• Dot Pitch [Display] - Size of a display pixel, distance
from center to center of individual pixels on display
• Cycles per degree [Display] - Addressable elements
(pixels) divided by twice the FOV measured in
degrees.
• Cycles per degree [Eye] - The human eye can
resolve 30 cycles per degree (20/20 Snellen acuity).
©Larry F. Hodges, Zachary Wartell
Basic Image Synthesis Hardware (Raster Display)
[Block diagram: CPU, System Memory (where raster images are found), and Peripheral Devices on the System Bus; Display Processor with its Display Processor Memory; Framebuffer; Video Controller driving the display.]
©Larry F. Hodges, Zachary Wartell
Raster – Bit Depth
• A raster image may be thought of as computer memory
organized as a two-dimensional array with each (x,y)
addressable location corresponding to one pixel.
• Bit Planes or Bit Depth is the number of bits
corresponding to each pixel.
• A typical framebuffer resolution might be
1280 x 1024 x 8
1280 x 1024 x 24
1600 x 1200 x 24
©Larry F. Hodges, Zachary Wartell
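A quick sketch of the arithmetic implied by these bit depths: framebuffer memory is width x height x bit depth / 8 bytes (assuming tightly packed pixels).

// Sketch: framebuffer memory required for a given resolution and bit depth.
#include <cstdio>

long long framebufferBytes(int width, int height, int bitDepth)
{
    return static_cast<long long>(width) * height * bitDepth / 8;
}

int main()
{
    // The example resolutions from the slide.
    std::printf("1280 x 1024 x  8: %lld bytes\n", framebufferBytes(1280, 1024,  8));
    std::printf("1280 x 1024 x 24: %lld bytes\n", framebufferBytes(1280, 1024, 24));
    std::printf("1600 x 1200 x 24: %lld bytes\n", framebufferBytes(1600, 1200, 24));
    return 0;
}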
Displaying Color
• There are no commercially available small pixel
technologies that can individually change color.
• spatial integration – place “mini”-pixels of a few fixed
colors very close together. The eye & brain spatially
integrate the “mini”-pixel cluster into a perception of a
pixel of arbitrary color
• temporal integration - field sequential color uses red,
blue and green liquid crystal shutters to change color in
front of a monochrome light source. The eye & brain
temporally integrate the result into a perception of pixels
of arbitrary color
©Larry F. Hodges, Zachary Wartell
CRT Display
[Diagram: color CRT. Red, green, and blue inputs drive the electron guns; the beams pass through the focusing system and deflection yoke, then through the shadow mask onto the red, green, and blue phosphor dots on the screen.]
©Larry F. Hodges, Zachary Wartell
Electron Gun
•Contains a filament that, when heated, emits a stream of
electrons.
•Electrons are focused with an electromagnet into a sharp
beam and directed to a specific point of the face of the
picture tube.
•The front surface of the picture tube is coated with small
phosphor dots.
•When the beam hits a phosphor dot it glows with a
brightness proportional to the strength of the beam and how
often it is excited by the beam.
©Larry F. Hodges, Zachary Wartell
Color CRT
•Red, Green and Blue electron guns.
•Screen coated with phosphor triads.
•Each triad is composed of a red, blue and green phosphor dot.
•Typically 2.3 to 2.5 triads per pixel.
[Diagram: interleaved R, G, B phosphor dots grouped into triads.]
FLUORESCENCE - Light emitted while the phosphor is being struck by
electrons.
PHOSPHORESCENCE - Light given off once the electron beam is
removed.
PERSISTENCE - Is the time from the removal of excitation to the moment
when phosphorescence has decayed to 10% of the initial light output.
©Larry F. Hodges, Zachary Wartell
Shadow Mask
•Shadow mask has one small hole for each phosphor triad.
•Holes are precisely aligned with respect to both the triads and the electron
guns, so that each dot is exposed to electrons from only one gun.
•The number of electrons in each beam controls the amount of red, blue and
green light generated by the triad.
[Diagram: the red, green, and blue beams converge at a hole in the shadow mask, behind which lies the phosphor-dot screen.]
©Larry F. Hodges, Zachary Wartell
Scanning An Image
Frame: The image to be scanned out on the CRT.
•Some minimum number of frames must be displayed
each second to eliminate flicker in the image.
CRITICAL FUSION FREQUENCY
•Typically 60-85 times per second for
raster displays.
•Varies with intensity, individuals,
phosphor persistence, room
lighting.
©Larry F. Hodges, Zachary Wartell
Interlaced Scanning
[Timeline: each 1/30 s frame is divided into two 1/60 s fields (Field 1 and Field 2), scanned out one after the other.]
•Display frame rate 30 times per second
•To reduce flicker at lesser bandwidths (Bits/sec.),
divide frame into two fields—one consisting of the
even scan lines and the other of the odd scan lines.
•Even and odd fields are scanned out alternately to
produce an interlaced image.
•non-interlaced also called “progressive”
©Larry F. Hodges, Zachary Wartell
Scanning
[Diagram: scanning order in the Device CS; alternate conventions place the origin (0,0) at different corners of the screen.]
VERTICAL SYNC PULSE — Signals the start of the next field.
VERTICAL RETRACE — Time needed to get from the bottom of
the current field to the top of the next field.
HORIZONTAL SYNC PULSE — Signals the start of the new scan
line.
HORIZONTAL RETRACE — Time needed to get from the end of
the current scan line to the start of the next scan line.
©Larry F. Hodges, Zachary Wartell
Example Video Formats
NTSC – ? x 525, 30f/s, interlaced (60 fld/s)
PAL – ? x 625, 25f/s, interlaced (50 fld/s)
HDTV – 1920 x 1080i, 1280 x 720p
XVGA – 1024x768, 60+ f/s, non-interlaced
generic RGB – 3 independent video signals
and synchronization signal, vary in resolution
and refresh rate
generic time-multiplexed color – R,G,B one
after another on a single signal, vary in
resolution and refresh rate
©Larry F. Hodges, Zachary Wartell
Calligraphic/Vector CRT
[Diagram: the video controller reads a display-list entry Line(P0,P1) and the beam strokes directly from P0 to P1.]
older technology
vector file instead of framebuffer
wireframe engineering drawings
flight simulators: combined raster-vector CRT
©Zachary Wartell
Flat-Panel Displays
[Taxonomy: Flat-Panel displays divide into Emissive (Plasma, Thin-Film electroluminescent, LED, CRT (90° deflected)) and Non-Emissive (LCD, with Active-Matrix (TFT) or Passive-Matrix addressing, and DMD).]
©Zachary Wartell
Flat-Panel Displays (Plasma)
[Taxonomy diagram as on the previous slide; this slide's example: Plasma.]
Toshiba™ 42" Plasma HDTV, $4,500 (circa 2005)
©Zachary Wartell
Flat-Panel Displays (LED)
[Taxonomy diagram as on the previous slide; this slide's example: LED.]
Barco™ “Light Street” (LED)
©Zachary Wartell
Flat-Panel Displays (DMD)
[Taxonomy diagram as on the previous slide; this slide's example: DMD.]
Digital Micro-mirror (DMD), mirror size ≈ 4 μm
©Zachary Wartell
LCD
• Liquid crystal displays use small flat chips which change their transparency properties when a voltage is applied.
• LCD elements are arranged in an n x m array called the LCD matrix.
• Level of voltage controls gray levels.
• LCD elements do not emit light; backlights are used behind the LCD matrix.
©Larry F. Hodges, Zachary Wartell
LCD Components
[Diagram: LCD component stack: linear fluorescent tubes (backlight), diffuser, linear polarizers, LCD module, color filter, and wavefront distortion filter.]
©Larry F. Hodges, Zachary Wartell
LCD Resolution
LCD resolution is occasionally quoted as the number of pixel elements (sub-pixels), not the number of RGB pixels.
Example: 3840 horizontal by 1024 vertical pixel elements ≈ 4M elements.
Equivalent to 4M/3 ≈ 1.3M RGB pixels; the "pixel resolution" is 1280x1024.
©Larry F. Hodges, Zachary Wartell
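A small sketch of the same conversion in C++, using the slide's 3840 x 1024 sub-pixel example; three elements per RGB pixel is the assumption here.

// Sketch: converting a quoted "pixel element" (sub-pixel) count into RGB pixels.
#include <cstdio>

int main()
{
    int elementsH = 3840, elementsV = 1024;                  // sub-pixel elements
    long long elements = (long long)elementsH * elementsV;   // ~3.9M elements
    long long rgbPixels = elements / 3;                      // 3 elements per RGB pixel
    std::printf("RGB pixel resolution: %d x %d (%lld pixels)\n",
                elementsH / 3, elementsV, rgbPixels);
    return 0;
}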
LCD
• Passive LCD screens
  – Cycle through each element of the LCD matrix, applying the voltage required for that element.
  – Once aligned with the electric field, the molecules in the LCD hold their alignment for a short time.
• Active LCD (TFT)
  – Each element contains a small transistor that maintains the voltage until the next refresh cycle.
  – Higher contrast and much faster response than passive LCD.
  – Circa 2005 this is the commodity technology.
©Larry F. Hodges, Zachary Wartell
LCD vs CRT
LCD:
- flat & lightweight
- low power consumption
- always some light
- noticeable pixel response time (12-30 ms)
- viewing-angle limitations
- resolution interpolation required
CRT:
- heavy & bulky
- strong EM field & high voltage
- true black
- better contrast
- pixel response time not noticeable
- inherent multi-resolution support
©Larry F. Hodges, Zachary Wartell
Recall our generic CG box…..
[Block diagram: CPU, System Memory, and Peripheral Devices on the System Bus; Display Processor with its Display Processor Memory; Framebuffer; Video Controller.]
©Larry F. Hodges, Zachary Wartell
Framebuffer
• A frame buffer may be thought of as computer
memory organized as a two-dimensional array with
each (x,y) addressable location corresponding to one
pixel.
• Bit Planes or Bit Depth is the number of bits
corresponding to each pixel.
• A typical frame buffer resolution might be
640 x 480 x 16
1280 x 1024 x 24
1920 x 1600 x 24
©Larry F. Hodges, Zachary Wartell
1-Bit Memory: Monochrome Display
(Bit-map Display)
[Diagram: a single bit plane drives the electron gun; 1 bit per pixel gives 2 intensity levels.]
©Larry F. Hodges, Zachary Wartell
3-Bit Color Display
3 bit planes: one each for red, green, and blue.
R G B   COLOR
0 0 0   black
1 0 0   red
0 1 0   green
0 0 1   blue
1 1 0   yellow
0 1 1   cyan
1 0 1   magenta
1 1 1   white
©Larry F. Hodges, Zachary Wartell
True Color Display
2^24 = 16,777,216 colors
24 bitplanes, 8 bits per color gun.
[Diagram: N bit planes each for Red, Green, and Blue.]
©Larry F. Hodges, Zachary Wartell
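A sketch of how a 24-bit true-color pixel might be packed into a word, 8 bits per gun; the 0xRRGGBB packing order is an assumption for illustration, since real framebuffer layouts vary.

// Sketch: packing/unpacking a 24-bit true-color pixel (8 bits per gun).
// The 0xRRGGBB ordering is an illustrative assumption.
#include <cstdint>
#include <cstdio>

uint32_t packRGB(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint32_t(r) << 16) | (uint32_t(g) << 8) | uint32_t(b);
}

void unpackRGB(uint32_t pixel, uint8_t& r, uint8_t& g, uint8_t& b)
{
    r = (pixel >> 16) & 0xFF;
    g = (pixel >>  8) & 0xFF;
    b =  pixel        & 0xFF;
}

int main()
{
    uint32_t p = packRGB(255, 128, 0);   // an orange pixel
    uint8_t r, g, b;
    unpackRGB(p, r, g, b);
    std::printf("packed 0x%06X -> R=%d G=%d B=%d\n",
                static_cast<unsigned>(p), r, g, b);
    return 0;
}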
Color-Map Lookup Table
extends the number of colors that can be
displayed by a given number of bit-planes.
[Figure: video look-up table organization. The pixel at (x', y') in the bit map holds the value 67 (binary 01000011); that value indexes the look-up table, whose 12-bit entry 1001 1010 0001 drives the red electron gun at 9/15 of maximum, green at 10/15, and blue at 1/15, producing the pixel displayed at (x', y'). Up to 24 bits per entry are common.]
©Larry F. Hodges, Zachary Wartell
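A C++ sketch of the look-up-table indexing described in the figure, using the slide's 12-bit entry with 4 bits per gun; the array layout is illustrative.

// Sketch of color-map (look-up table) indexing, following the slide's example:
// a 12-bit LUT entry holds 4 bits each of R, G, B, and pixel value 67 selects entry 67.
#include <cstdint>
#include <cstdio>

int main()
{
    uint16_t lut[256] = {0};     // 2^8 entries; 12 bits used per entry here
    lut[67] = 0x9A1;             // binary 1001 1010 0001 -> R=9, G=10, B=1

    uint8_t pixelValue = 67;     // value stored in the bit map at (x', y')
    uint16_t entry = lut[pixelValue];

    int r = (entry >> 8) & 0xF;  // each gun driven at n/15 of maximum
    int g = (entry >> 4) & 0xF;
    int b =  entry       & 0xF;
    std::printf("guns: R=%d/15 G=%d/15 B=%d/15\n", r, g, b);
    return 0;
}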
Pseudo-Color: 2^8 x 24 Color Map LUT
256 colors chosen from a palette of 16,777,216.
Each entry in the color map LUT can be user defined.
Could be used to define 256 shades of green, or 64 shades each of red, blue, green and white, etc.
[Figure: a 256-entry look-up table (indices 0-255), each entry holding RED, GREEN, and BLUE components.]
©Larry F. Hodges, Zachary Wartell
Recall our generic CG box…..
[Block diagram: CPU, System Memory, and Peripheral Devices on the System Bus; Display Processor with its Display Processor Memory; Framebuffer; Video Controller.]
©Larry F. Hodges, Zachary Wartell
Display Processor
Synonyms: Graphics Controller, Display Co-Processor,
Graphics Accelerator, or GPU
Specialized hardware for rendering graphics primitives into the
frame buffer.
[Examples: block-pixel copy (bitblt) of a rectangular region, and scan-conversion of Line(a,b) into the pixels P0,0, P1,1, P2,2, P3,2, …, P7,7.]
©Larry F. Hodges, Zachary Wartell, K.R. Subramanian
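As a flavor of scan conversion, here is a minimal DDA line sketch (one common algorithm a display processor might implement; the endpoints below are illustrative, not the slide's exact pixel list, and the course covers such algorithms in detail later).

// Minimal DDA line scan-conversion sketch.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <cstdlib>

void scanConvertLine(int x0, int y0, int x1, int y1)
{
    // Step once per unit along the longer axis.
    int steps = std::max(std::abs(x1 - x0), std::abs(y1 - y0));
    if (steps == 0) { std::printf("P%d,%d\n", x0, y0); return; }
    double dx = (x1 - x0) / double(steps);
    double dy = (y1 - y0) / double(steps);
    double x = x0, y = y0;
    for (int i = 0; i <= steps; ++i) {
        std::printf("P%ld,%ld ", std::lround(x), std::lround(y));
        x += dx;
        y += dy;
    }
    std::printf("\n");
}

int main()
{
    scanConvertLine(0, 0, 7, 7);   // prints P0,0 P1,1 P2,2 ... P7,7
    return 0;
}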
Display Processor (2)
• Fundamental difference among display systems is
how much the display processor does versus how
much must be done by the graphics subroutine
package executing on the general-purpose CPU.
[Diagram: a spectrum showing how the work is split between the CPU and the Display Processor.]
Example GPU: Nvidia™ GTX 285 (2009): 1400M transistors, 240 shader processors @ 1400 MHz, 1 GB memory, 21.4 B pixels/s.
©Larry F. Hodges, Zachary Wartell
Video Controller
Cycles through the frame buffer, one scan line at a time. Contents of the memory are used to control the CRT's beam intensity or color.
[Diagram: the raster-scan generator produces the X address (set or increment) and Y address (set or decrement), which are combined into a linear framebuffer memory address; the pixel value(s) read out pass through the RAMDAC to produce the intensity or color signal, while the scan generator also produces the horizontal and vertical deflection signals.]
©Larry F. Hodges, Zachary Wartell
Single Frame Buffer & Animation
problem with one object
problem with multiple objects
[Diagrams: the graphics processor updates the framebuffer at the same time the video controller reads it out for display.]
©Zachary Wartell
Double Buffer & Animation
[Diagram: two frame buffers. The graphics processor renders into the back buffer while the video controller displays the front buffer; each frame the two buffers swap roles ("page flipping").]
©Zachary Wartell
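A minimal double-buffered animation sketch in legacy OpenGL/GLUT, assuming the standard <GL/glut.h> header; GLUT_DOUBLE requests the two buffers and glutSwapBuffers performs the page flip.

// Sketch of double buffering with legacy OpenGL/GLUT: draw into the back
// buffer, then swap, so only completed frames are ever scanned out.
#include <GL/glut.h>

static float angle = 0.0f;

void display()
{
    glClear(GL_COLOR_BUFFER_BIT);          // draw into the BACK buffer
    glLoadIdentity();
    glRotatef(angle, 0.0f, 0.0f, 1.0f);
    glBegin(GL_TRIANGLES);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
    glEnd();
    glutSwapBuffers();                     // page flip: back buffer becomes front
}

void idle()
{
    angle += 1.0f;
    glutPostRedisplay();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);   // request two buffers
    glutCreateWindow("double buffering");
    glutDisplayFunc(display);
    glutIdleFunc(idle);
    glutMainLoop();
    return 0;
}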
Projectors
• Use bright CRT, LCD or
DMD screens to generate
an image which is sent
through an optical system
to focus on a (usually)
large screen.
• Full color obtained by having a separate monochromatic projector for each of the R, G, & B color channels
©Larry F. Hodges, Zachary Wartell
Advantages/Disadvantages of Projection Display
• Very large screens can provide large FoV and can be
seen by several people simultaneously.
• Image quality can be fuzzy and somewhat dimmer
than conventional displays.
• Sensitivity to ambient light.
• Delicate optical alignment.
©Larry F. Hodges, Zachary Wartell
Displays in Virtual Reality
• Head-Mounted Displays (HMDs)
– The display and a position tracker are
attached to the user’s head
• Head-Tracked Displays (HTDs)
– Display is stationary, tracker tracks the
user’s head relative to the display.
– Example: CAVE, Workbench, Stereo
monitor
©Larry F. Hodges, Zachary Wartell
3D Display
binocular vision & stereopsis – a strong depth cue in physical
world
3D Display – must generate at least two different images
simultaneously, one per eye. The two images mimic the
geometric differences seen in the physical world
volumetric display – pixels physically (or optically) spread over real 3D volume
stereoscopic display – two planar images. Optically channel left image to left
eye and right image to right eye. Used in HMD, HTD, and regular displays.
-auto-stereoscopic versus wearing glasses
-commodity products available ($100 for LCD 3D glasses)
“wavefront” display – (in practice a hologram) a single planar display surface.
Uses holographic optics to create full (omni-directional) wavefronts of light
emanating from the virtual 3D world. [Interactive holography still in pure
research phase – MIT, 3” partial-parallax hologram]
©Larry F. Hodges, Zachary Wartell
Color and Vision (Brief)
Before leaving displays, let's briefly discuss:
• Light
• Eye
- Physiology
- Functionality
• Color
- Perceptually based models
Light & Vision
• Vision is perception of electromagnetic energy
(EM radiation).
• Humans can only perceive a very small portion
of the EM spectrum:
[Diagram: the EM spectrum: gamma, X-ray, UV, the visible band (violet/blue near 400 nm, green near 500 nm, yellow near 600 nm, red near 700 nm), then infrared, radar, FM, TV, AM, AC.]
© Kessler, Watson, Hodges, Ribarsky
Eye Structure
• The eye can be viewed as a dynamic, biological camera: it
has a lens, a focal length, and an equivalent of film.
• A simple diagram of the eye's structure:
[Diagram: eye structure: cornea, iris, pupil, lens, ciliary muscle, suspensory ligaments, retina.]
© Wartell
Eye: The Lens
• The lens must focus (accommodation) light directly on the retina for perfect vision.
• But age, genetic factors, malnutrition and disease can unfocus the eye, leading to near- and farsightedness.
[Diagram: nearsighted vs. farsighted focus.]
© Kessler, Watson, Hodges, Ribarsky
Eye: The Retina
• The retina functions as the eye's "film".
• It is covered with cells sensitive to light. These cells turn the light into electrochemical impulses that are sent to the brain.
• There are two types of cells, rods and cones.
[Diagram: the retina at the back of the eye.]
© Kessler, Watson, Hodges, Ribarsky
The Retina: Cell Distribution
[Plot: number of rods or cones per mm² (20,000 to 180,000) across the right eye's retina, from the temporal periphery through the fovea and the blind spot (optic disk) to the nasal periphery. Rods dominate away from the fovea; cones are concentrated at the fovea. "Blind Spot Trick".]
© Kessler, Watson, Hodges, Ribarsky
The Retina: Rods
• Sensitive to most visible frequencies
(brightness).
• About 120 million in eye.
• Most located outside of fovea, or center of
retina.
• Used in low light (theaters, night)
environments, result in achromatic (b&w)
vision.
• Absorption function: [Plot: rod absorption over the 400-700 nm range.]
© Kessler, Watson, Hodges, Ribarsky
The Retina: Cones
• R cones are sensitive to long wavelengths, G cones to middle wavelengths, and B cones to short wavelengths.
• R: 64%, G: 32%, B: 2%
• About 8 million in eye.
• Highly concentrated in fovea, with B cones more
evenly distributed than the others (hence less in
fovea).
• Used for high detail color vision (CRTs!), so they
will concern us most.
© Kessler , Watson, Hodges, Ribarsky
The Retina: Cones
• The absorption functions of the cones peak near 445 nm (B), 535 nm (G), and 575 nm (R).
[Plot: cone absorption functions over the 400-700 nm range.]
© Kessler, Watson, Hodges, Ribarsky
Color Constancy
• If color is just light of a certain
wavelength, why does a yellow object
always look yellow under different lighting
(e.g. interior/exterior)?
• This is the phenomenon of color
constancy.
• Colors are constant under different
lighting because the brain responds to
ratios between the R, G and B cones, and
not magnitudes.
© Kessler , Watson, Hodges, Ribarsky
Vision: Metamers
• Because all colors are represented to the brain as ratios
of three signals it is possible for different frequency
combinations to appear as the same color. These
combinations are called metamers. This is why RGB
color works!
• Example – [Goldstein,pg143]
mix 620nm red light with 530nm green light
matches color percept of 580 nm yellow
[Diagram: bar charts of the B, G, R cone responses. The responses to the 530 nm + 620 nm mixture (roughly B = 1.0, G = 5.0, R = 8.0) match the responses to 580 nm light.]
© Kessler, Watson, Hodges, Ribarsky
Sensitivity vs Acuity
• Sensitivity is a measure of the dimmest light
the eye can detect.
• Acuity is a measure of the smallest object
the eye can see.
• These two capabilities are in competition.
– In the fovea, cones are closely packed.
Acuity is at its highest, sensitivity is at its
lowest (30 cycles per degree).
– Outside the fovea, acuity decreases
rapidly. Sensitivity increases
correspondingly.
© Kessler , Watson, Hodges, Ribarsky
Eye Versus 1280 x 1024 Display
[Diagram: a 33 cm wide, 1280 x 1024 display viewed from 66 cm subtends about 28°.]
1280 pixels = 640 cycles; 640 cycles / 28° ≈ 22.8 c/d (cycles/degree).
Relative to the eye's 30 c/d limit, this corresponds to roughly 20/26.25 vision (Snellen acuity).
The same pixels spread over a 60° widescreen field of view drop to roughly 20/56.25 vision.
© Kessler, Watson, Hodges, Ribarsky
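The same arithmetic as a small C++ sketch (assuming 20/20 corresponds to 30 cycles/degree, as on the earlier slide).

// Sketch: cycles/degree of a display and its approximate Snellen equivalent.
#include <cstdio>

int main()
{
    double hPixels = 1280.0;
    double cycles  = hPixels / 2.0;          // 640 cycles
    double fovDeg  = 28.0;                   // 33 cm screen viewed from 66 cm
    double cpd     = cycles / fovDeg;        // ~22.8 cycles/degree
    double snellen = 20.0 * 30.0 / cpd;      // 26.25 -> "20/26.25 vision"
    std::printf("%.1f c/d -> 20/%.2f\n", cpd, snellen);

    double cpdWide = cycles / 60.0;          // same pixels over a 60 degree FOV
    std::printf("%.1f c/d -> 20/%.2f\n", cpdWide, 20.0 * 30.0 / cpdWide);
    return 0;
}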
Input Devices
Logical Devices
• Locator, to indicate a position and/or orientation
• Pick, to select a displayed entity
• Valuator, to input a single value in the space of real numbers
• Keyboard/String, to input a character string
• Choice, to select from a set of possible actions or choices
Locator Devices: Tablet, Mouse, Trackball, Joystick, Touch Panel,
Light Pen
Keyboard devices: Alphanumeric keyboard (coded - get single
ASCII character, unencoded - get state of all keys - more flexible)
Valuator Devices: Rotary dials (Bounded or Unbounded), Linear
sliders
Choice Devices: Function keys
©Larry F. Hodges, Zachary Wartell
Locator Devices
degrees of freedom
2 DOF (mouse)
3 DOF (mouse+wheel), 3-space position tracker, mouse with rotation
6 DOF: 3-space position and orientation tracker
isotonic, isometric, elastic
controller order (differential mapping of physical to virtual
movement)
0th order – position
1st order – velocity
2nd order – acceleration
Spaceball™
©Larry F. Hodges, Zachary Wartell
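A small sketch of controller order as a per-frame update, with illustrative input and time-step values; in 0th order the input sets position, in 1st order velocity, in 2nd order acceleration.

// Sketch: controller order maps locator input to virtual movement per frame.
#include <cstdio>

int main()
{
    double input = 0.2;    // normalized locator displacement this frame (illustrative)
    double dt    = 0.016;  // ~16 ms frame time

    // 0th order (position control): input maps directly to position.
    double pos0 = input;

    // 1st order (rate/velocity control): input sets a velocity.
    // (pos1, vel2, pos2 would persist across frames in a real update loop.)
    double pos1 = 0.0;
    pos1 += input * dt;

    // 2nd order (acceleration control): input sets an acceleration.
    double vel2 = 0.0, pos2 = 0.0;
    vel2 += input * dt;
    pos2 += vel2 * dt;

    std::printf("0th: %.3f  1st: %.4f  2nd: %.6f\n", pos0, pos1, pos2);
    return 0;
}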
Revisions
Revision 1.1 – updated slide 75 on metamers, updated eye diagram slide 68
Revision 1.2 – added double buffer slides