Photographic Lenses FAQ by David M. Jacobson

Newsgroups: rec.photo.moderated,rec.answers,news.answers
Subject: Photographic Lenses FAQ
Distribution:
Followup-To: rec.photo.moderated
Summary: This posting contains a list of Frequently Asked Questions about lenses. It is intended for photographers. It defines terms, gives a large number of formulas, discusses depth of field issues, vignetting, diffraction, lens aberrations, and the Modulation Transfer Function.
Organization: Hewlett-Packard Laboratories
Approved: news-answers-request@MIT.EDU
Expires: 22 April 1997 06:00:00 GMT
Supersedes: <5eltmh$eer@cello.hpl.hp.com>
Archive-name: rec-photo/lenses/faq
Posting-Frequency: monthly
Last-modified: 1997/03/11
Version: 1.18

Frequently Asked Questions regarding lenses.
By David Jacobson   jacobson@hpl.hp.com

This is the Lens FAQ. It is a technical document describing a lot of optics-related information of use to photographers. It does not answer questions about particular commercial lenses. You won't find out whether Canon or Nikon is better. But you will find all sorts of formulas and technical information.

Q1. What is the meaning of the symbols in the rest of this FAQ?

A.
f       focal length
So      distance from front principal point to object (subject)
Sfar    distance from front principal point to farthest point in focus
Sclose  distance from front principal point to closest point in focus
Si      distance from rear principal point to film (image) plane
M       magnification
N       f-number or f-stop
Ne      effective f-number (corrected for bellows factor)
c       diameter of largest acceptable circle of confusion, or the diameter of the circle of confusion
h       hyperfocal distance

Here we use the more technical term "object" for the thing being focused on. Informally it is equivalent to the subject. See the technical notes at the end for more information on object distances, on the meaning of f-number, and on limitations to be observed when applying these formulas to lenses in which the aperture does not appear the same size front and rear.

Q2. What is the meaning of focal length? In other words, what about a 50mm lens is 50mm?

A. A 50mm lens produces an image of a distant object on the film that is the same size as would be produced by a pinhole 50mm from the film. See also Q5 below.

Q3. What is meant by f-stop?

A. The focal length of the lens divided by the diameter of the aperture (as seen from the front). It is also called an f-number, and is written like f/8, which means the aperture diameter is 1/8th the focal length. The term is used both in regard to the maximum aperture of a lens and in regard to the aperture selected in a specific situation. The brightness of the image on the film is inversely proportional to the f-number squared. The depth of field increases but diffraction is worsened when using a large f-number. The effective f-number for all 3 effects changes if the lens is focused extremely close. See Q7.

The term "stops" purportedly comes from old technology in which the aperture was selected by turning a wheel with various sized holes in it, each one of which let in twice the light of the preceding one. Thus the phrase "open up N stops" means to change to an aperture allowing in 2^N times as much light, and conversely with "stop down N stops".

Q4. What is the basic formula for the conditions under which an image is in focus?

A.
1/Si + 1/So = 1/f            (Gaussian form)
(Si-f)*(So-f) = f^2          (Newtonian form)

Q5. What is the formula for magnification?

A. There are several forms.
M = Si/So
M = (Si-f)/f
M = f/(So-f)
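For readers who prefer to compute rather than rearrange formulas by hand, here is a minimal Python-style sketch of the Q4/Q5 relations. The function names and the 50 mm / 2 m example are illustrative only; they are not part of any standard library or camera software.

    def image_distance(f, So):
        # 1/Si + 1/So = 1/f, solved for Si; all lengths in the same units.
        return 1.0 / (1.0 / f - 1.0 / So)

    def magnification(f, So):
        # M = Si/So = f/(So - f)
        return f / (So - f)

    # Example: a 50 mm lens focused on an object 2 m (2000 mm) away.
    Si = image_distance(50.0, 2000.0)   # about 51.3 mm
    M = magnification(50.0, 2000.0)     # about 0.026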
Q6. For a given lens and format, what is the angle of coverage?

A. If the format has a width, height, or diagonal of distance X, the angle of coverage along the width, height, or diagonal is 2*arctan(X/(2*f*(M+1))). For example a 35mm frame is 24x36 mm, so with a 50 mm lens and a distant object (i.e. M virtually zero), the coverage is 27 degrees by 40 degrees, with a diagonal of 47 degrees. See the technical notes at the end for qualifications.

Q7. How do I correct for bellows factor?

A. Ne = N*(1+M)

Bellows factor is the factor by which the effective f-number gets multiplied as the lens is focused up close. See the technical notes.

Q8. What is meant by circle of confusion?

A. When a lens is defocused, an object point gets rendered as a small circle, called the circle of confusion. (Ignoring diffraction.) If the circle of confusion is small enough, the image will look sharp. There is no one circle "small enough" for all circumstances, but rather it depends on how much the image will be enlarged, the quality of the rest of the system, and even the subject. Nevertheless, for 35mm work c=.03mm is generally agreed on as the diameter of the acceptable circle of confusion. Another rule of thumb is c=1/1730 of the diagonal of the frame, which comes to .025mm for 35mm film. (Zeiss and Sinar are known to be consistent with this rule.)

Q9. What is hyperfocal distance?

A. The closest distance that is in acceptable focus when the lens is focused at infinity. (See below for a variant use of this term.)

h = f^2/(N*c)

Q10. What are the closest and farthest points that will be in acceptably sharp focus?

A.
Sclose = h * So / (h + (So - f))
Sfar   = h * So / (h - (So - f))

or, we can define a "hyperfocal ratio", hr = h/(So - f), or roughly the ratio of the hyperfocal distance to the object distance. Then

Sclose = So * hr/(hr+1)
Sfar   = So * hr/(hr-1)

These formulas are also correct when hr is defined as hr = h/So and the N used in computing h is actually Ne. If the denominator is zero or negative, Sfar is infinity.

Q11. What is depth of field?

A. It is convenient to think of a rear depth of field and a front depth of field. The rear depth of field is the distance from the object to the farthest point that is sharp and the front depth of field is the distance from the closest point that is sharp to the object. Sometimes the term depth of field is used for the combination of these two, i.e. the distance from the closest point that is sharp to the farthest point that is sharp.

frontdepth = So - Sclose
frontdepth = Ne*c/(M^2 * (1 + (So-f)/h))
frontdepth = Ne*c/(M^2 * (1 + (N*c)/(f*M)))
frontdepth = So/(hr + 1)

reardepth = Sfar - So
reardepth = Ne*c/(M^2 * (1 - (So-f)/h))
reardepth = Ne*c/(M^2 * (1 - (N*c)/(f*M)))
reardepth = So/(hr - 1)

In the last three, if the denominator is zero or negative, reardepth is infinity.

These formulas using hyperfocal distance can be used as follows. Suppose I know that the object distance, So, is 1/8th of the hyperfocal distance. Then the range of distances that is acceptably sharp is from 8/9 of So to 8/7 of So. The front and rear depths of field are 1/9 So and 1/7 So.
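The following Python-style sketch ties Q9, Q10, and Q11 together. It assumes the symmetrical-aperture conditions stated in the technical notes, and the 50 mm / f/8 / 3 m numbers are only an example.

    def hyperfocal(f, N, c):
        # h = f^2/(N*c); all lengths in the same units (here mm).
        return f * f / (N * c)

    def dof_limits(f, N, c, So):
        # Nearest and farthest acceptably sharp distances for focus at So.
        # A far limit of None means infinity.
        h = hyperfocal(f, N, c)
        Sclose = h * So / (h + (So - f))
        denom = h - (So - f)
        Sfar = h * So / denom if denom > 0 else None
        return Sclose, Sfar

    # Example: 50 mm lens at f/8, c = 0.03 mm, focused at 3 m (3000 mm).
    # h is about 10.4 m, and roughly 2.3 m to 4.2 m is acceptably sharp.
    Sclose, Sfar = dof_limits(50.0, 8.0, 0.03, 3000.0)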
Q12. What is Depth of Field Preview?

A. It is a feature on higher quality SLR cameras that allows you to stop down the lens while looking through the viewfinder. Ostensibly it allows you to "preview" the depth of field. Of course, since this stops down the aperture, the image also gets dimmer.

Most people find it difficult to judge from a dim viewfinder image whether some part of the image will appear sharp in a slide or print. However, in many cases photographers will select a large aperture to deliberately blur background or foreground objects. DOF preview lets you see just what the effect will be.

Q13. Where should I focus my lens so I will get everything from some close point to infinity in focus?

A. At approximately the hyperfocal distance. More precisely, at So = h + f. In this condition the closest point that will be in focus is at half the object distance. (Some authorities use this as the definition of hyperfocal distance.)

Q14. Are there some simpler approximate formulas for depth of field?

A. Yes. When the object distance is small with respect to the hyperfocal distance, the front and rear depth of field are almost equal and depend only on the magnification and effective f-stop, and the following approximate formulas can be used.

frontdepth = reardepth = Ne*c/M^2
frontdepth = reardepth = So/hr

In non-macro situations Ne is the same as the marked f-number.

Q15. I have heard that one should use a long lens to get a shallow depth of field and a short lens to get a large depth of field. Is this true?

A. Assuming that you frame the subject the same way, and that the object (subject) distance is small compared with the hyperfocal distance for the shortest focal length being considered, the front and rear depths of field are approximately equal and constant regardless of focal length. (See Question 14.)

However, there are two situations in which focal length does matter. First, when the focal length is short enough, the hyperfocal distance, which varies with the square of the focal length, will not be many times longer than the object distance, violating the condition above. In this case the front depth of field is smaller and the rear depth of field is larger, with the latter extending to infinity roughly when the hyperfocal distance is less than the object distance.

Second, the focal length of the lens also has a big effect on how fuzzy very distant points appear. Specifically, if the lens is focused on some nearby object rendered with magnification M, a point at infinity will be rendered as a circle of diameter c, given by

c = f M / N

which shows that the distant background point will be fuzzed out in direct proportion to the focal length. (See the lens tutorial for some graphs that may make this more intuitive.)

Q16. If I focus on some point, and then recompose with that point not in the center, will the focus be off?

A. Yes, but maybe only a little bit. If the object is far enough away, the depth of field will cover the shift in distance. An approximate formula for the minimum distance such that the error will be covered by depth of field is given by

d = w^2/(2 N c)

where
d = minimum distance to make the point be sharply rendered, measured from the film plane
w = distance of the image point on the film from the center of the image

For the 35mm format w^2/(2 c) is 5.4 meters, so you can recompose the image with the subject at the edge of the frame and still have it be sharp if the subject distance (at the center) was at least 5.4 meters (18 feet) divided by the f-number. See the technical notes at the end for a bunch of assumptions.
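As a quick sanity check of the Q16 rule of thumb, here is an illustrative Python-style calculation; the f/4 example is arbitrary and the function name is made up for this FAQ.

    def min_recompose_distance_m(w_mm, N, c_mm=0.03):
        # d = w^2/(2*N*c), converted from mm to metres.
        return (w_mm ** 2) / (2.0 * N * c_mm) / 1000.0

    # Subject recomposed to the edge of a 35mm frame (w = 18 mm) at f/4:
    d = min_recompose_distance_m(18.0, 4.0)   # about 1.35 m, i.e. 5.4 m / 4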
Q17. If I get glasses (or bifocals) will my focusing be off?

A. No. The focusing screen is a diffusing plane in the same optical position as the film. If the image is sharp on the focusing screen, it will also be sharp on the film.

Many single lens reflex cameras have a split image focusing aid. In effect one half causes the eye to see through the left side of the lens and the other half causes the eye to see through the right side of the lens. Objects appear to be in the same position in both halves only if the image plane coincides with the plane of the focusing screen, and thus with the film plane, and hence are in sharp focus. All your glasses do is enable you to see the focusing screen or focusing aid better.

Q18. What are vignetting and light falloff?

A. Vignetting is a reduction in light falling on the film far from the center of the image that is caused by physical obstructions. Light falloff is a reduction of light far from the center because of fundamental optical reasons: First, an off-axis object sees a foreshortened apparent aperture (entrance pupil), so less light is collected. This results in a cos(theta) falloff, where theta is the angle off axis. Second, in a rectilinear lens the solid-angle-to-area magnification increases as 1/cos^3(theta), spreading the light from a patch near the edge over more film than if the patch had been near the center. (The patch is presumed to face the camera at a constant very large distance.) As a result there is an overall cos^4(theta) falloff.

The optical designer can compensate for these effects by making the entrance pupil enlarge and tip when viewed from off the optical axis. An alternative approach is to compensate by using a filter whose density varies appropriately with distance from the center.

Q19. How can I tell if a lens has vignetting, or if a filter is causing vignetting?

A. Open the back and, if necessary, trick the camera into opening the shutter and stopping down. Imagine putting your eye right in the corner of the frame and looking at the diaphragm. Of course, you really can't do this, so you have to move your head and sight through the corner of the frame, trying to imagine what you would see. If you "see" the entire opening in the diaphragm and through it to object space, there is no vignetting. However, at wide apertures in most lenses the edge of the rear element or the edge of the front element or filter ring will obstruct your vision. This indicates vignetting. Try to estimate the fraction of the area of the diaphragm that remains unobstructed. The falloff in f-stops at the corner is minus the log base two of this fraction.

You can also do this from the front. With SLRs hold the camera a fair distance away with a fairly bright area behind the viewfinder hole. With non-SLRs open the back and arrange for a reasonably bright area to be behind the camera. Look through the lens, and rotate the camera until you are looking right at one corner of the viewing screen or frame. (If you are using the mirror-down technique with an SLR, choose an upper corner of the frame, i.e. look from below the axis.) Now for the hard part. Look at the aperture you see. If there is vignetting you see something about the shape of an American football. If the filter is causing the vignetting, one of the edges of the football is formed by the filter ring.

A third way to detect vignetting is to aim the camera at a small bright spot surrounded by a fairly dark background. (A distant street light at night would serve well.) Deliberately defocus the image some and observe the shape of the spot, particularly in the corners. If it is round there is no vignetting. If it looks like the intersection of some arcs (i.e. like an American football), then there is vignetting.

Note that near the top of the image the top of the circle may get clipped a bit. This is because in many cameras some light (from the top part of the image) misses the bottom of the mirror. This affects only the viewfinder, not the film. You can use depth of field preview (if your camera has it) to determine the f-stop at which the spot becomes round. With wide-angle lenses the circle of confusion may not get large enough for this technique to be useful.
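If you want to turn these inspections into numbers, the following Python-style sketch is one way to do it. The 24 mm example, the half-diagonal of 21.6 mm, and the assumption of a distant subject are illustrative only.

    import math

    def falloff_stops_from_unobstructed(fraction):
        # Q19: falloff in f-stops when only `fraction` of the aperture
        # area is visible from a corner of the frame.
        return -math.log2(fraction)

    def cos4_falloff_stops(f_mm, half_diagonal_mm=21.6):
        # Q18: natural cos^4 falloff, in stops, at the corner of a 35mm
        # frame for a rectilinear lens focused on a distant subject.
        theta = math.atan(half_diagonal_mm / f_mm)
        return -4.0 * math.log2(math.cos(theta))

    # Examples: half the aperture area visible gives 1 stop of vignetting;
    # a 24 mm lens shows about 1.7 stops of natural cos^4 falloff.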
Q20. For panoramic pictures, where is the best place to pivot the camera?

A. The axis of the pivot should pass through the entrance pupil. The entrance pupil is the virtual image of the aperture as seen through the front of the lens. When you've got it right, the entrance pupil will not shift relative to fixed objects as you rotate the camera. (Stop down the lens so you can see the diaphragm and aperture.)

There is a whole different type of panoramic camera in which the lens is rotated relative to the film. In this type of camera the lens rotates around the rear nodal point (for objects at infinity). See the lens tutorial for an explanation of nodal points.

Q21. What is diffraction?

A. When a beam of light passes through any aperture it spreads out. This effect limits how sharp a lens can possibly be. The diffraction is caused by the limiting of the beam to the size of the aperture, not primarily by sharp edges of the aperture. Even if one made a "soft edged" aperture that faded slowly from clear to opaque, there would still be diffraction, and the size of the central part of the diffraction pattern would not change much compared with the sharp-edged case.

Q22. What is the diffraction limit of a lens?

A. All lenses are diffraction limited to no more than about 1500/N to 1800/N line pairs per mm. See below under the question "What is MTF?".
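As an illustration (not a statement about any particular lens), the diffraction cutoff from Q22, and explained further under Q27 below, can be computed directly:

    def diffraction_cutoff_lp_per_mm(N, wavelength_nm=555.0):
        # The MTF is forced to zero beyond 1/(lambda*N) cycles per mm.
        wavelength_mm = wavelength_nm * 1.0e-6
        return 1.0 / (wavelength_mm * N)

    # Roughly 225 line pairs/mm at f/8 and 82 line pairs/mm at f/22.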
Q23. What are aberrations?

A. Aberrations are image defects that result from limitations in the way lenses can be designed. Better lenses have smaller aberrations, but aberrations can never be completely eliminated, just reduced. The classic aberrations are:

* Spherical aberration. Light passing through the edge of the lens is focused at a different distance (closer in simple lenses) than light striking the lens near the center.

* Coma. Off-axis points are rendered with tails, reminiscent of comets, hence the name. It can be shown that coma must occur if the image formed by rays passing near the edge of the lens has a different magnification than the image formed by rays passing near the center of the lens.

* Astigmatism. Off-axis points are blurred in the radial or tangential direction, and focusing can reduce one at the expense of the other, but cannot bring both into focus at the same time. Think of the focal length as varying around the circumference of the lens. (Optometrists apply the word "astigmatism" to a defect in the human eye that causes *on-axis* points to be similarly blurred. That astigmatism is not quite the same as astigmatism in photographic lenses.)

* Curvature of field. Points in a plane get focused sharply on a curved surface, rather than a plane (the film). Or equivalently, the set of points in the object space that are brought to sharp focus on the film plane form a curved surface rather than a plane. With a plane subject or a subject at infinite distance the net effect is that when the center is in focus the edges are out of focus, and if the edges are in focus the center is out of focus.

* Distortion (pincushion and barrel). The image of a square object has sides that curve in or out. (This should not be confused with the natural perspective effects that become particularly noticeable with wide angle lenses.) This happens because the magnification is not a constant, but rather varies with the angle from the axis.

* Chromatic aberration. The position (forward and back) of sharp focus varies with the wavelength.

* Lateral color. The magnification varies with wavelength.

Q24. Can I eliminate these aberrations by stopping down the lens?

A. The effect of all aberrations except distortion and lateral color is reduced by stopping down. The amount of field curvature is not affected by stopping down, but its effect on the film is. But note that stopping down also increases diffraction.

Q25. Why do objects look distorted when photographed with a wide angle lens?

A. This is because the size of the image of an object depends on the distance the object is from the lens. This is not a defect in the lens; even pinhole cameras with no lens at all exhibit this perspective effect.

For image calculation purposes, think of the lens as being a pinhole one focal length in front of the film, and centered over the center of the film. (If the lens is not focused at infinity, the distance from the film will be somewhat larger.) Then the image of an object point can be found by drawing a straight line from the object point through the pinhole and finding its intersection with the film. That line represents one light ray. (Diffraction and out-of-focus conditions have been ignored here, since they are irrelevant to this effect.)

If you do this, you'll find that the image of a nearby object will be larger than the image of the same object farther away, by the ratio of the distances. You'll also find that any straight line in object space, no matter at what angle or position, will be rendered as a straight line on the film. (Proof outline: a line, and a point not on the line, define a plane. All rays from the object line will stay in the plane defined by the line and the pinhole, and the intersection of that plane with the film plane is a straight line.) But parallel lines in object space are not necessarily parallel on the film.

Q26. How does focal length affect perspective?

A. It doesn't; it is subject distance that affects perspective. However, a longer lens provides more subject magnification at a given distance, so you can get farther from your subject without having the image be too small. By moving back, you make the magnification ratio between the front and back of your subject smaller, because the distance ratio is closer to one. So, in a portrait, instead of a nose that's magnified much more than the rest of the head, the nose is magnified only very slightly more than the rest of the head, and the picture looks more pleasing. You can get the same perspective with a shorter focal length lens by simply moving back, and enlarging the central portion of the image. Of course, this magnifies grain as well, so it's better to use a longer lens if you have one.
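Here is a small numerical illustration of Q25 and Q26; the 0.1 m nose-to-ear depth and the camera distances are made-up example numbers, not measurements.

    def near_to_far_magnification_ratio(camera_to_nose_m, nose_to_ears_m=0.1):
        # Image size is proportional to 1/distance, so the ratio of the
        # nose's magnification to the ears' magnification is:
        return (camera_to_nose_m + nose_to_ears_m) / camera_to_nose_m

    # From 0.5 m (tight framing with a short lens) the ratio is 1.2;
    # from 2.0 m (the same framing with a lens 4x longer) it drops to 1.05.
    # Only the distance matters; the focal length just restores the framing.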
Q27. What is "MTF"?

A. MTF is an abbreviation for Modulation Transfer Function. It is the normalized spatial frequency response of film or an optical system. The spatial frequency is usually measured in cycles per millimeter. For an ideal lens and ignoring diffraction, the MTF would be a constant 1 at all spatial frequencies. For all practical lenses, the MTF starts out near 1 and falls off at increasing frequencies.

MTFs vary with the aperture, the distance the image region is from the center, the direction of the pattern (along a radius or 90 degrees to that), the wavelength of the light, and the subject distance.

Even for an ideal lens, diffraction effects fundamentally force the MTF to be zero at spatial frequencies beyond 1/(lambda*N) cycles per mm, where lambda is the wavelength of the light. For lambda = 555nm, the peak of the eye's response, this is very close to 1800/N cycles per mm.

The MTF of a system is the product of the properly scaled MTFs of each of its components, as long as there are not two consecutive non-diffusing components. (Thus with proper scaling you can multiply camera lens MTF by film MTF by enlarger lens MTF by paper MTF, but usually not a telescope objective MTF by an eyepiece MTF. There are also some other obscure conditions under which MTFs can be multiplied.)

Note that although MTF is usually thought of as the spatial frequency response function and is plotted with spatial frequency as the abscissa, some manufacturers (e.g. Canon) publish plots of the MTF at specific spatial frequencies with distance from the center of the image as the abscissa.

Q28. What is SQF?

A. SQF is an abbreviation for Subjective Quality Factor. SQF was developed by Ed Grainger of Eastman Kodak as an objective measurement that correlated well with subjective rankings of print quality. Somewhat simplified, it is just the MTF in the print (or referred to a designated print size or magnification) averaged from .5 to 2 cycles per mm. See the technical notes.

One well-known popular magazine reports lens test results in terms of what they claim to be SQF, but they apparently use some other definition of SQF, despite showing Ed Grainger's picture and referring readers to his original SQF paper.

Q29. What are "elements" and "groups", and are more better?

A. The number of elements is the number of pieces of glass used in the lens. A single uncemented element, or two or more elements cemented together, is called a group. Thus a lens that has 8 elements in 7 groups has 8 pieces of glass, with 2 cemented together.

It is impossible to completely correct all aberrations. Each additional element the designer has at his/her disposal gives a few more degrees of freedom to design out an aberration. So one would expect a 6 element lens to be better than a 3 element lens. However, each surface also reflects a little light, causing flare. So too many elements is not good either.

Note that an unscrupulous manufacturer could slap together 13 pieces of glass and claim to have a 13 element lens, but it might be terrible. So by itself the number of elements is no guarantee of quality.

Q30. What is "low dispersion glass"?

A. Low dispersion glass is specially formulated to have a small variation of index of refraction with wavelength. This makes it easier for the designer to reduce chromatic aberration and lateral color. This kind of glass is most often used in long lenses. Marketing designators such as ED and SLD hint at the use of low-dispersion glass.

Q31. What do APO and Apochromatic mean?

A. The distance behind the lens at which monochromatic light (light of a single wavelength) comes to focus varies as a smooth function of the wavelength. If this function has a zero derivative in the visible range, and hence if there are two wavelengths at which the light comes to focus in the same plane, the lens is called achromatic.
If there is a higher order correction, usually with the result that 3 or more visible wavelengths come to focus at the same distance, the lens is called apochromatic. Some authorities add more conditions. Apochromatic lenses often contain special low-dispersion glasses. APO is an abbreviation for apochromatic.

It is frequently asserted in the rec.photo.* newsgroups that marketeers use the terms apochromatic and APO rather loosely.

Q32. What is an "aspheric element"?

A. It is a lens element in which the radius of curvature varies slightly with angle off axis. Aspheric elements give the lens designer more degrees of freedom with which to correct aberrations. They are most often used in wide angle and zoom lenses.

Q33. What is a teleconverter?

A. A teleconverter is a device that enlarges the center portion of the normal frame to fill the whole frame. In 35mm systems, where the frames are 24x36 millimeters, a 2X teleconverter expands the central 12x18 mm to fill the full 24x36mm frame.

Q34. How does a teleconverter affect exposure, focusing, depth of field and image quality?

A. A lens of focal length f and f-number N with a teleconverter of magnification K attached will behave in all respects like a lens of focal length K*f and f-number K*N.

If the aperture diameter and focus are left untouched and an ideal teleconverter is attached, the lens will focus at the same distance, the image, including the diffraction effects and lens aberration effects, will be K times as large, the exposure will need to be K^2 times longer, the hyperfocal distance will be multiplied by K, and the depth of field will be divided by K. A practical teleconverter will also contribute some of its own aberrations. (See the technical notes.)

On the other hand, if you open the aperture to keep the same effective f-number and hence the same exposure time, the image will be enlarged by K, the diffraction will be unchanged, the depth of field will be divided by K^2, and the hyperfocal distance multiplied by K^2. The aberrations are increased by three effects: the lens is opened to a larger aperture, the teleconverter multiplies those (probably larger) aberrations by K, and then combines them with some of its own.

Since the focusing is unchanged, the minimum focusing distance is the same whether or not a teleconverter is attached. (See the technical notes.)

Q35. What is the difference between using a teleconverter at the time the picture is exposed vs. enlarging more in the printing process?

A. Assuming the aperture diameter is the same in the two cases, the teleconverter case will require K^2 times the exposure, while the enlarging case will enlarge the grain in the film K times as much. The teleconverter adds some aberrations of its own, while enlarging more will make aberrations in the enlarging lens more apparent. All other effects are identical. In 35mm format, grain is usually the dominant factor in image quality.
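For reference, here is the Q34 bookkeeping in a Python-style sketch. The 200 mm f/4 plus 2x converter example is illustrative, and a real converter adds aberrations of its own.

    def with_teleconverter(f, N, K):
        # Ideal K-x converter, aperture ring left untouched.
        return K * f, K * N   # effective focal length and f-number

    # Example: 200 mm f/4 with a 2x converter acts like 400 mm f/8.
    # Exposure must be K^2 = 4 times longer, the hyperfocal distance is
    # multiplied by K = 2, and depth of field is divided by K = 2.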
Q36. How can I take "close up" pictures?

A. There are several ways.

a. Use a true macro lens.
b. Move the lens farther from the film with extension tubes or bellows.
c. Screw on "diopter lenses" or closeup "filters".
d. Use the macro setting on many zoom lenses.

A true macro lens generally gives the best quality. The f-number needs to be corrected according to the formulas above, unless metering is done through the lens. Most macro lenses go from infinity to 1:1 or 1:2 (mag = 1 or 1/2). Some that go to 1:2 come with an accessory screw-on lens that gets to 1:1.

Extension tubes and bellows move the lens farther from the film, allowing it to focus closer. All lenses are optimized for specific situations, and using extension tubes or bellows makes the lens operate out of the region for which it was designed, possibly compromising the quality a bit. You can compute the magnification from the extension using the formulas above. If the magnification exceeds one, it is best to reverse the lens with an adapter. Extending the lens also changes the effective f-number. See the formulas above. However, if you meter through the lens, the meter is affected in exactly the same way, so you don't need to do any calculation. Lenses on some modern electronic cameras require electrical connections to the body, complicating the construction of these devices.

Add-on lenses shorten the effective focal length of the lens and reduce the working distance. Single element add-on lenses are of inadequate quality for critical work. Many photographers report good results using 2-element lenses at small apertures. No correction is required to the effective f-number.

Some zoom lenses have special macro ranges. However, few zoom lenses get larger magnification than 1:4, and in many lenses the macro feature operates only at the short focal length end of the zoom. This is not for really serious work.

If you are photographing flat objects, such as postage stamps, freedom from distortion is important, as is a reasonably flat field.

Working distance is the distance from the front of the lens to the subject at a particular magnification. For nature work, a reasonably long working distance is important because working farther away is less likely to frighten insects, etc., and shadows are less likely to fall on the subject.

Q37. What is the optimum aperture for a pinhole camera?

A. d = .036 sqrt(Si), where d is the diameter of the pinhole in millimeters and Si is the distance from the pinhole to the film in millimeters. See the technical notes.

Q38. How can I use my photographic light meter or camera to measure illumination?

A. Take an exposure reading with an incident meter or a reflected meter pointed at an 18% gray subject (gray card). (Meters built into cameras are reflected meters. If the lens is a variable aperture zoom, be sure it is zoomed to where the aperture reading is correct. Turn off special intelligent or evaluative metering modes.) Then use one of the following formulas, where E is illuminance.

E_in_foot_candles = 25 N^2 / (ISO * exposure_time_in_seconds)
E_in_lux = 269 N^2 / (ISO * exposure_time_in_seconds)

See the technical notes.
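A worked example of the Q38 formulas in Python-style pseudocode; remember that the technical notes below flag the constants 25 and 269 as only approximate, and the f/8, 1/60 s, ISO 100 reading is just an example.

    def foot_candles(N, iso, t_seconds):
        return 25.0 * N * N / (iso * t_seconds)

    def lux(N, iso, t_seconds):
        return 269.0 * N * N / (iso * t_seconds)

    # A gray-card reading of f/8 at 1/60 s with ISO 100 film gives
    # about 960 foot-candles, or about 10,300 lux.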
Technical notes:

The object distance, So, as used in the formulas is measured from the object to the lens's front principal point. More commonly one hears of the front nodal point. These two points are equivalent if the front medium and rear medium are the same, e.g. air. They are the effective position of the lens for measurements to the front. In a simple lens the front nodal/principal point is very near the center of the lens. If you know the focal length of the lens, you can easily find the front nodal point by taking the lens off the camera and forming an image of a distant object with the light going through the lens backwards. Find the point of sharp focus, then measure one focal length back (i.e. toward the distant object). That is the position of the front nodal point.

On most cameras the focusing scale is calibrated to read the distance from the object to the film plane. There is no easy way to precisely convert between the focusing scale distance and So.

The formulas presented here all assume that the aperture looks the same size front and rear. If it does not, use the front diameter and note that the formulas for bellows correction and depth of field will not be correct at macro distances. Formulas for this situation are given in the lens tutorial, posted separately.

The formula for angle of coverage applies to rectilinear lenses. An alternative form, 2*arctan(X/(2*Si)), applies to both rectilinear lenses and pinholes. (Rectilinear lenses give the same projection as a pinhole.) These formulas do not usually apply to fisheye lenses, and can't possibly apply to a fisheye lens that covers 180 degrees or more.

The conditions under which the formula for the minimum distance at which the effect of focusing and re-composing will be covered by depth of field are:

1. w is no more than the focal length of the lens. At the edge w=18mm for 35mm, so this will very seldom be a problem.

2. The lens's two nodal points are not very widely separated. But if the front nodal point is in front of the rear nodal point, which I think is the more common case, the formula is too conservative, so this is not a problem either.

3. The camera is rotated about the front nodal point. Almost always the camera will be rotated about an axis behind the front nodal point, which again makes the formula too conservative.

The guide number given in the answer to Q16 assumes c=.03mm.

The SQF is the weighted average of the MTF over the range .5 to 2 lines per mm referred to a designated print size or magnification. The weighting function is 1/spf, where spf is the spatial frequency. It turns out that this is equivalent to just a simple "visual" average when the MTF is plotted against the log of the spatial frequency. A further mean is taken between the sagittal (optics-speak for radial) and tangential components. It appears to this author that an additional, probably weighted, averaging must be done over regions of the image (center, edges, corners). When I find out the specifics of this weighting I will add it to the lens FAQ.

Note that the section on teleconverters in several places assumed that the aperture diameter was left unchanged. On lenses with mechanical aperture setting levers or rings this will happen naturally if the aperture setting is not changed. However, beware that fancy electronic cameras may compensate for the presence of the teleconverter.

Most camera systems have focusing scales that read from some reference mark on the body, usually at the film plane. With a teleconverter attached, they read from a point the thickness of the teleconverter in front of this reference mark.

The optimum aperture for a pinhole camera depends on what criterion is used. The formula given maximizes the spatial frequency at which the MTF for 555nm light will be 20%.

There is no universal agreement on the constant in the relation between exposure, film speed, and illumination. This document tentatively shows 25 for the foot-candles case, which I reverse engineered from a Gossen Lunasix meter, but one can find values from 18 to 30 in the literature or by reverse engineering other meters. The constant for lux is 10.7639 (the number of square feet in a square meter) times the value for foot-candles.

Acknowledgements

Thanks to Bill Tyler for contributing the section on perspective effects. The technique for detecting vignetting in the viewfinder was suggested by Maohai Huang.