Photo-realistic Rendering and Global Illumination in Computer Graphics
Spring 2012
Visual Appearance
K. H. Ko
Department of Mechatronics, Gwangju Institute of Science and Technology

Shadows
Shadow algorithms determine which surfaces can be "seen" from the light source.
They are essentially the same as visible-surface determination algorithms: the surfaces that are visible from the light source are not in shadow, and those that are not visible from the light source are in shadow.
First we consider shadow algorithms for point light sources.

Shadows
When a point on a surface cannot be seen from a light source, the illumination calculation must be adjusted to take this into account:
  Iλ = Iaλ ka Odλ + Σ_{1 ≤ i ≤ m} Si fatt,i Ipλ,i [ kd Odλ (N · Li) + ks Osλ (Ri · V)^n ]
  Si = 0 if light i is blocked at this point; Si = 1 if light i is not blocked.
Note that areas in the shadow of all point light sources are still illuminated by the ambient light.
For simplification, we assume that all objects are polygons.

Scan-Line Generation of Shadows
One of the oldest methods: augment a scan-line algorithm to interleave shadow and visible-surface processing.
Using the light source as a center of projection, the edges of polygons that might potentially cast shadows are projected onto the polygons intersecting the current scan line.
When the scan crosses one of these shadow edges, the colors of the image pixels are modified accordingly.

A Two-Pass Object-Precision Shadow Algorithm
It performs shadow determination before visible-surface determination.
It processes the object description with the same algorithm twice: once for the viewpoint and once for the light source.
The results are then combined to determine the pieces of each visible part of a polygon that are lit by the light source, and the scene is scan-converted.
The shadows do not depend on the viewpoint, so all the shadow calculations may be performed just once for a series of images of the same objects seen from many different viewpoints, as long as the light source and objects are fixed.

A Two-Pass Object-Precision Shadow Algorithm
Determine the surfaces that are visible from the light source's viewpoint; the output of this pass is a list of lit polygons.
The lit polygons are transformed back into modeling coordinates and merged with a copy of the original database as surface-detail polygons, creating a viewpoint-independent merged database.
Hidden-surface removal is then performed on a copy of this merged database from the viewpoint of an arbitrary observer.
Multiple light sources can be handled by processing the merged database from the viewpoint of each new light source, merging the results of each pass.

Shadow Volumes
A shadow volume is defined by the light source and an object, and is bounded by a set of invisible shadow polygons.
Shadow polygons are not rendered themselves, but are used to determine whether the other objects are in shadow.

A Two-Pass z-Buffer Algorithm
A shadow-generation method based on two passes through a z-buffer algorithm: one for the viewer and one for the light source.
It is based on image-precision calculations.

Global Illumination Shadow Algorithms
Ray-tracing and radiosity algorithms have been used to generate some of the most impressive pictures of shadows in complex environments.
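The two-pass z-buffer idea can be sketched as a shadow-map test: the first pass renders a depth buffer from the light, and in the second pass each shaded point is transformed into the light's view and compared against the stored depth. The following is a minimal Python sketch; the names (to_light_space, light_depth_map) and the bias value are illustrative assumptions, not part of the original algorithm description.

import numpy as np

def shadow_factor(point, light_depth_map, to_light_space, bias=1e-3):
    """Image-precision shadow test for one point (two-pass z-buffer idea).

    light_depth_map : 2D array of depths rendered from the light (first pass).
    to_light_space  : function mapping a world-space point to (u, v, depth)
                      in the light's image, with u, v in [0, 1].
    Returns Si = 1.0 if the point is lit, 0.0 if it is in shadow.
    """
    u, v, depth = to_light_space(point)
    h, w = light_depth_map.shape
    # Points outside the light's view are treated as lit in this sketch.
    if not (0.0 <= u < 1.0 and 0.0 <= v < 1.0):
        return 1.0
    stored = light_depth_map[int(v * h), int(u * w)]
    # Lit only if nothing closer to the light was recorded in the first pass.
    return 1.0 if depth <= stored + bias else 0.0

The value returned here plays the same role as the Si term in the illumination equation above.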
Transparency
Much as surfaces can have specular and diffuse reflection, surfaces that transmit light can be transparent or translucent.
Transparent materials: we can see clearly through them, although in general the rays are refracted.
Diffuse transmission occurs through translucent materials such as frosted glass.
We consider two cases here: non-refractive transparency and refractive transparency.

Non-refractive Transparency
Simple model: ignores refraction, so light rays are not bent as they pass through the surface.
Note that total photographic realism is not always the objective in making pictures.
Two methods approximate the way in which the colors of two objects are combined when one object is seen through the other: interpolated transparency and filtered transparency.

Non-refractive Transparency: Interpolated Transparency
Determines the shade of a pixel in the intersection of two polygons' projections by linearly interpolating the individual shades calculated for the two polygons:
  Iλ = (1 − kt1) Iλ1 + kt1 Iλ2
kt1 is the transmission coefficient; it measures the transparency of polygon 1, the polygon nearer the viewer along the line of sight. When kt1 = 1, the polygon is perfectly transparent; 1 − kt1 is the polygon's opacity.
For a more realistic effect, interpolate only the ambient and diffuse components of polygon 1 with the full shade of polygon 2, and then add in polygon 1's specular component.

Non-refractive Transparency: Filtered Transparency
Treats a polygon as a transparent filter that selectively passes different wavelengths:
  Iλ = Iλ1 + kt1 Otλ Iλ2
Otλ is polygon 1's transparency color.

Refractive Transparency
More complex to model, since the geometrical and optical lines of sight are different.
Governed by the indices of refraction and Snell's law.

Refractive Transparency: Calculating the Refraction Vector
With N the surface normal, I the unit vector from the surface point toward the viewpoint, and ηr = ηiλ / ηtλ the ratio of the indices of refraction, the refraction (transmission) vector is
  T = ( ηr (N · I) − √(1 − ηr² (1 − (N · I)²)) ) N − ηr I

Refractive Transparency: Total Internal Reflection
When light passes from one medium into another whose index of refraction is lower, the angle θt of the transmitted ray is greater than the angle θi.
If θi becomes sufficiently large, then θt exceeds 90° and the ray is reflected from the interface between the media, rather than being transmitted.
The smallest angle θi at which this occurs is called the critical angle; setting sin θt = 1 in Snell's law gives θi = sin⁻¹(ηtλ / ηiλ).

Global Illumination
An illumination model computes the color at a point in terms of light directly emitted by light sources and of light that reaches the point after reflection from and transmission through its own and other surfaces.
Global illumination: indirectly reflected and transmitted light.
Local illumination: light that comes directly from the light sources to the point being shaded.
So far, global illumination has been modeled by an ambient illumination term that is held constant for all points on all objects.

Global Illumination
Much of the light in real-world environments does not come from direct light sources.
There are two different classes of algorithms for global illumination.
Ray tracing (view-dependent): given the viewer's direction, discretize the view plane to determine the points at which to evaluate the illumination equation.
Radiosity (view-independent): discretize the environment and process it in order to provide enough information to evaluate the illumination equation at any point and from any viewing direction.

Recursive Ray Tracing
The basic ray-tracing algorithm for visible-surface determination is extended to handle shadows, reflection, and refraction.
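The two blending rules and the refraction-vector formula from the transparency slides above translate directly into short helper functions. A minimal Python (NumPy) sketch; the function names and the convention that I points from the surface toward the viewer are assumptions for illustration only.

import numpy as np

def interpolated_transparency(i1, i2, kt1):
    """Interpolated transparency: Iλ = (1 - kt1) Iλ1 + kt1 Iλ2."""
    return (1.0 - kt1) * i1 + kt1 * i2

def filtered_transparency(i1, i2, kt1, ot):
    """Filtered transparency: Iλ = Iλ1 + kt1 Otλ Iλ2, with Otλ the filter color."""
    return i1 + kt1 * ot * i2

def refract(n, i, eta_r):
    """Refraction vector T = (ηr(N·I) - sqrt(1 - ηr²(1 - (N·I)²))) N - ηr I.

    n : unit surface normal, i : unit vector from the surface toward the viewer,
    eta_r : ηi/ηt. Returns None on total internal reflection (negative radicand).
    """
    cos_i = float(np.dot(n, i))
    radicand = 1.0 - eta_r * eta_r * (1.0 - cos_i * cos_i)
    if radicand < 0.0:          # beyond the critical angle: ray is totally reflected
        return None
    return (eta_r * cos_i - np.sqrt(radicand)) * n - eta_r * i

Note that refract() returns None exactly in the total-internal-reflection case described above, which can only occur when eta_r > 1 (light passing into a medium with a lower index of refraction).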
Recursive Ray Tracing
It determines the color of a pixel at the closest intersection of an eye ray with an object, using any of the illumination models.
To calculate shadows, we fire an additional ray from the point of intersection to each of the light sources.
If one of these shadow rays intersects any object along the way, the object is in shadow at that point and the shading algorithm ignores the contribution of that shadow ray's light source.

Recursive Ray Tracing
The illumination model is extended to include specular reflection and refractive transparency:
  Iλ = Iaλ ka Odλ + Σ_{1 ≤ i ≤ m} Si fatt,i Ipλ,i [ kd Odλ (N · Li) + ks Osλ (Ri · V)^n ] + ks Irλ + kt Itλ
Irλ is the intensity of the reflected ray, kt is the transmission coefficient, and Itλ is the intensity of the refracted transmitted ray.
Values for Irλ and Itλ are determined by recursively evaluating the equation at the closest surface that the reflected and transmitted rays intersect.
To approximate attenuation with distance, the Iλ calculated for each ray is multiplied by the inverse of the distance traveled by the ray.
Si can also be made a continuous function of the kt of the objects intersected by the shadow ray: a transparent object obscures less light than an opaque one at the points it shadows.

Recursive Ray Tracing
In addition to shadow rays, the recursive ray-tracing algorithm conditionally spawns reflection rays and refraction rays from the point of intersection.
The shadow, reflection, and refraction rays are often called secondary rays, to distinguish them from the primary rays cast from the eye.
Each reflection and refraction ray can, in turn, recursively spawn shadow, reflection, and refraction rays (a sketch of this recursion appears below).
Ray tracing is particularly prone to problems caused by limited numerical precision; false intersections can result in visual problems.

Recursive Ray Tracing: Efficiency Considerations
Item buffers: avoid casting rays at all when determining the objects directly visible to the eye.
Reflection maps: do less work for the secondary rays than for the primary rays.
Adaptive tree-depth control: a ray is not cast if its contribution to the pixel's intensity is estimated to be below some preset threshold.
Light buffers: shadow rays are special in that each is fired toward one of a relatively small set of objects (the light sources); a light buffer, similar to a 3D spatial partitioning of the scene as seen from the light, increases the speed with which shadow rays are processed.
Ray classification: a spatial-partitioning approach applied to the rays themselves.

Recursive Ray Tracing: A Better Illumination Model
The specular light expressions are scaled by a wavelength-dependent Fresnel reflectivity term.
The model also takes into account the contribution of transmitted light directly emitted by the light sources, scaled by a Fresnel transmissivity term.
The globally reflected and refracted rays take into account the transmittance of the medium through which they travel.
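The recursion described above (primary ray, shadow rays per light, then conditionally spawned reflection and refraction rays with adaptive tree-depth control) can be summarized in a Python sketch. Everything here is a hedged illustration: the scene interface (closest_intersection, blocked, local_shade, surface fields such as ks, kt, eta_r) and the contribution threshold are assumptions, not the algorithm's canonical interface; refract() is the helper from the transparency sketch earlier.

MAX_DEPTH = 5
MIN_WEIGHT = 0.01   # adaptive tree-depth control: skip rays that contribute little

def reflect(d, n):
    """Mirror the travel direction d about the unit normal n."""
    return d - 2.0 * (d @ n) * n

def trace(scene, origin, direction, depth=0, weight=1.0):
    """Color seen along a ray; spawns shadow, reflection, and refraction rays."""
    hit = scene.closest_intersection(origin, direction)
    if hit is None:
        return scene.background
    color = scene.ambient_term(hit)
    # Shadow rays: one per light source; a blocked light contributes nothing (Si = 0).
    for light in scene.lights:
        if not scene.blocked(hit.point, light):
            color += scene.local_shade(hit, light, direction)
    if depth < MAX_DEPTH:
        # In practice hit.point is offset slightly along the normal to avoid
        # false self-intersections caused by limited numerical precision.
        if hit.surface.ks * weight > MIN_WEIGHT:        # reflection ray
            r = reflect(direction, hit.normal)
            color += hit.surface.ks * trace(scene, hit.point, r,
                                            depth + 1, weight * hit.surface.ks)
        if hit.surface.kt * weight > MIN_WEIGHT:        # refraction ray
            t = refract(hit.normal, -direction, hit.surface.eta_r)
            if t is not None:                           # None on total internal reflection
                color += hit.surface.kt * trace(scene, hit.point, t,
                                                depth + 1, weight * hit.surface.kt)
    return color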
Recursive Ray Tracing: Area-Sampling Variations
One of conventional ray tracing's biggest drawbacks is that it point samples the scene on a regular grid.
The aliasing that results from point sampling on a regular grid can be avoided by casting solid beams rather than infinitesimal rays.
Cone tracing: generalizes the linear rays into cones.
Beam tracing: an object-precision algorithm for polygonal environments that traces pyramidal beams rather than linear rays.
Pencil tracing: an approach that addresses some of the problems of cone tracing and beam tracing.

Recursive Ray Tracing: Distributed Ray Tracing
It is based on a stochastic approach to supersampling that trades off the objectionable artifacts of aliasing for the less offensive artifacts of noise.
The ability to perform antialiased spatial sampling can also be exploited to sample a variety of other aspects of the scene: motion blur, specular reflection from rough surfaces, and so on.
"Distributed" means that the rays are stochastically distributed to sample these quantities.

Recursive Ray Tracing: Stochastic Sampling
Aliasing results when a signal is sampled with regularly spaced samples below the Nyquist rate.
If the samples are not regularly spaced, the sharply defined frequency spectrum of the aliases is replaced by noise, an artifact that viewers find much less objectionable than the individually recognizable frequency components of regular aliasing, such as staircasing.
Purely random samples, however, may cluster together in some areas and leave others unsampled.
Instead, use a minimum-distance Poisson distribution, in which no pair of random samples is closer than some minimum distance.
A satisfactory approximation to the minimum-distance Poisson distribution: displace the position of each element of a regularly spaced sample grid by a small random distance (a sketch of this jittering follows below).

Recursive Ray Tracing: Sampling Other Dimensions
The same basic technique of stochastic sampling can also be used to distribute the rays to sample other aspects of the environment.
Motion blur is produced by distributing rays in time.
Depth of field is modeled by distributing the rays over the area of the camera lens.
The blurred specular reflections and translucent refractions of rough surfaces are simulated by distributing the rays according to the specular reflection and transmission functions.
Soft shadows are obtained by distributing the rays over the solid angle subtended by an extended light source as seen from the point being shaded.
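Jittering a regular grid, as described above, is simple to write down. A minimal Python (NumPy) sketch; the function name and the choice of one sample per pixel cell are illustrative assumptions.

import numpy as np

def jittered_samples(width, height, rng=None):
    """One stochastic sample per pixel: regular grid positions displaced by a
    small random offset, approximating a minimum-distance Poisson distribution."""
    rng = np.random.default_rng() if rng is None else rng
    xs, ys = np.meshgrid(np.arange(width), np.arange(height))
    # Displace each sample uniformly within its own cell: samples never cluster
    # more tightly than one per pixel, yet the regular-grid structure is broken.
    jitter_x = rng.uniform(0.0, 1.0, size=xs.shape)
    jitter_y = rng.uniform(0.0, 1.0, size=ys.shape)
    return xs + jitter_x, ys + jitter_y

# Usage: sample positions on a 640 x 480 view plane, one primary ray per pixel.
# sx, sy = jittered_samples(640, 480)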
Radiosity Methods
Disadvantage of ray tracing: the use of a directionless ambient-lighting term to account for all other global lighting contributions.
For a more accurate treatment of inter-object reflections, approaches based on thermal-engineering models for the emission and reflection of radiation are used, eliminating the need for the ambient-lighting term.
They assume the conservation of light energy in a closed environment: all energy emitted or reflected by every surface is accounted for by its reflection from, or absorption by, other surfaces.
The rate at which energy leaves a surface, called its radiosity, is the sum of the rate at which the surface emits energy and the rate at which it reflects or transmits energy from that surface or other surfaces.

Radiosity Methods
They first determine all the light interactions in an environment in a view-independent way.
Then one or more views are rendered, with only the overhead of visible-surface determination and interpolation shading.

Radiosity Methods
Radiosity methods allow any surface to emit light, so all light sources are modeled inherently as having area.
Break up the environment into a finite number n of discrete patches, each of which emits and reflects light uniformly over its extent.
Each patch is an opaque Lambertian diffuse emitter and reflector.

Radiosity Methods
For surface patch i,
  Bi = Ei + ρi Σ_{1 ≤ j ≤ n} Bj Fj-i (Aj / Ai)
Bi, Bj: the radiosities of patches i and j.
Ei: the rate at which light is emitted from patch i.
ρi: patch i's reflectivity.
Fj-i: a dimensionless form factor.
Ai, Aj: the areas of patches i and j.
The form factor specifies the fraction of energy leaving the entirety of patch j that arrives at the entirety of patch i, taking into account the shape and relative orientation of both patches and the presence of any obstructing patches.
The equation states that the energy leaving a unit area of surface is the sum of the light emitted plus the light reflected.

Radiosity Methods
A simple reciprocity relationship holds between form factors in diffuse environments:
  Ai Fi-j = Aj Fj-i
Rearranging terms,
  Bi = Ei + ρi Σ_{1 ≤ j ≤ n} Bj Fi-j
The interaction of light among the patches in the environment can thus be stated as a set of simultaneous equations.
These equations must be solved for each band of wavelengths considered in the lighting model, since ρi and Ei are wavelength dependent.
The form factors are independent of wavelength and are solely a function of geometry.

Radiosity Methods: Computation of a Form Factor
The form factor from differential area dAi to differential area dAj is
  dFdi-dj = (cos θi cos θj / (π r²)) Hij dAj
where r is the distance between dAi and dAj, θi and θj are the angles between the line joining them and the respective surface normals, and Hij is 1 if dAj is visible from dAi and 0 otherwise.
To determine Fdi-j, the form factor from differential area dAi to finite area Aj, we integrate over the area of patch j:
  Fdi-j = ∫Aj (cos θi cos θj / (π r²)) Hij dAj
The form factor from Ai to Aj is the area average over patch i:
  Fi-j = (1 / Ai) ∫Ai ∫Aj (cos θi cos θj / (π r²)) Hij dAj dAi

Radiosity Methods: Computation of a Form Factor
Computing Fdi-j is equivalent to projecting those parts of Aj that are visible from dAi onto a unit hemisphere centered about dAi, projecting this projected area orthographically down onto the hemisphere's unit-circle base, and dividing by the area of the circle.
Rather than analytically projecting each Aj onto a hemisphere, an efficient image-precision algorithm projects onto the upper half of a cube centered about dAi, with the cube's top parallel to the surface at dAi.
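Once the form factors Fi-j are known, the simultaneous equations Bi = Ei + ρi Σj Bj Fi-j can be solved iteratively. A minimal Python (NumPy) sketch using simple fixed-point (Jacobi-style) iteration for one wavelength band; the tolerance, iteration cap, and the toy patch numbers are illustrative assumptions, and practical systems use refinements such as Gauss-Seidel or progressive refinement.

import numpy as np

def solve_radiosity(E, rho, F, tol=1e-6, max_iters=1000):
    """Solve B_i = E_i + rho_i * sum_j F_ij B_j by fixed-point iteration.

    E   : (n,) emitted radiosities for one wavelength band
    rho : (n,) patch reflectivities for the same band
    F   : (n, n) form factors, F[i, j] = F_{i-j} (wavelength independent)
    """
    B = E.copy()                      # patches start with only their own emission
    for _ in range(max_iters):
        B_new = E + rho * (F @ B)     # add light gathered from all other patches
        if np.max(np.abs(B_new - B)) < tol:
            return B_new
        B = B_new
    return B

# Usage with a toy 3-patch environment (numbers purely illustrative):
E = np.array([1.0, 0.0, 0.0])         # patch 0 is the only emitter
rho = np.array([0.0, 0.5, 0.8])
F = np.array([[0.0, 0.4, 0.3],
              [0.2, 0.0, 0.5],
              [0.1, 0.6, 0.0]])
B = solve_radiosity(E, rho, F)

Because each ρi < 1 and each row of F sums to at most 1, the iteration converges; the resulting radiosities are view-independent and can be reused for any number of rendered views.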