Compilation: Determining the Brightness of Light Bulbs

Date: Tue, 9 May 2006
From: Nathan Harada
Subject: Determining Brightness of Light bulbs

In going through circuits, a question came up regarding how to determine the brightness of light bulbs. I thought that the power rating of a light bulb was probably related to the brightness and suggested a direct proportionality between power and brightness. Is this correct? And if so, how can we quantify "brightness" to verify this relationship?

----------------
From: Traci Maxted

I do a lab to test this in one of my classes. Using a wattmeter (W) to measure the input and a photometer (W/cm^2) to measure the output, the results we get are linear. The slope is close to one, but not quite. Generally the lower-wattage bulbs (20 W, 40 W) come out higher than rated, and the larger-wattage ones (100 W, 200 W) a little lower than rated. The difference is usually less than 5%, with 40 W bulbs measuring about 42 W and 100 W bulbs about 98 W.

----------------
Date: Wed, 10 May 2006
From: David Brookes

It has been suggested (not by me) that you can divide all the equations in physics into two broad categories. The first is definitions of physical quantities, such as v = x/t. It does not make sense to talk about testing a proportional relationship between v and x; there is nothing causal there. The second sort of equation is a physical relationship or physical law relating physical quantities. These are either causal (like a = F_net/m) or constraint-based, like conservation of energy. These can be tested.

So, on to brightness and power. I'll assume the same surface area throughout so that we don't have to complicate things with talk about intensity. In some sense brightness IS power (or power is brightness), so this is a physical-quantity definition rather than a physical law, and I would say it's meaningless to test. On the other hand, if you're talking about the relationship between the power of the light bulb and what the human eye perceives, then I am not sure you have the relationship correct. I believe (but have no time to check) that the human eye perceives light power (intensity x area of the eye's aperture) logarithmically, not linearly.

A fun way to test my claim might be to pick light bulbs of doubling power, say 20 W, 40 W, 80 W and 160 W. (Make sure they're all the same physical size; otherwise you mess it all up with the intensity issue.) Don't tell your students what the power ratings are, and just ask them to rank the relative brightness. Namely, pick which bulb they perceive as twice as bright as the dimmest one (the 20 W). Also ask them to rate how much brighter the other bulbs are than the dimmest one (1.4x, like that). Then plot. I'm not even sure this will work. But I just went to a talk by a Harvard astronomer, and she said that good amateur astronomers can rank the intensities of stars with the naked eye down to a tenth of a magnitude (or was it even better than that, I can't remember).
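To make Brookes's proposed classroom test concrete, here is a minimal sketch (an editorial addition, not from the emails) that prints the brightness ratios a strictly linear model and a compressive power-law model would predict for his 20/40/80/160 W bulbs. The 1/3 exponent is a commonly quoted Stevens-law value for perceived brightness and is an assumption, not something claimed in this thread.

# Sketch: predicted perceived-brightness ratios for bulbs of doubling power.
# Assumes radiated light scales with rated power; the 1/3 exponent is assumed.
powers = [20, 40, 80, 160]   # rated powers in watts (Brookes's suggested set)
reference = powers[0]        # the dimmest bulb, 20 W

for p in powers:
    linear = p / reference                   # "double the power looks double the brightness"
    compressive = (p / reference) ** (1 / 3) # compressive (Stevens-type) response
    print(f"{p:>4} W   linear: {linear:4.1f}x   power-law: {compressive:4.2f}x")

Under the compressive model the 160 W bulb comes out only about twice as bright as the 20 W bulb, which is one concrete prediction a class ranking could confirm or refute.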
----------------
Date: Wed, 10 May 2006
From: John Clement

The simple way to determine brightness is to look at the box the bulb comes in: it gives the number of lumens. Other than that, a light sensor from Vernier or Pasco is a good method.

Brightness is obviously not linear in power if you look at the rated lumens and the power of a light bulb. When you go from 40 to 60 to 100 W, the rated lumens roughly double with each step even though the power does not.

If you look at brightness vs. power consumed by a single bulb, it is also not linear. Part of the reason it is not linear for commercial bulbs is that there is a peaked curve of brightness vs. frequency. At lower temperatures more of the curve is in the infrared; as you increase the temperature, the peak of the curve moves up to a higher frequency and the bulb becomes more efficient in the visible region. Perhaps an interesting question for students would be to figure out which is more economical, 60 W or 40 W bulbs, when you have a minimum illumination requirement. The 60 W is obviously better, because one 60 W bulb gives about the same light as two 40 W bulbs.

----------------
Date: Thu, 11 May 2006
From: Andy Edington

If bulb wattage ratings are a measure of "brightness", why is a 15-watt compact fluorescent (Hg-vapor) bulb the same "brightness" as a 60-watt incandescent? Power tells us the rate at which electric potential energy is transformed into electromagnetic energy radiated from the bulb. The bulb's "brightness" depends on the spectral distribution of that radiated energy. (Our eyes detect only a limited range of e-m energy, and they are not equally sensitive to all wavelengths.) For an incandescent bulb, much of the radiation is in the infrared range; for a fluorescent bulb, much of the radiated energy is in the visible range; and for an "uncoated" fluorescent bulb, much of the radiated energy is in the UV range (think tanning beds).

The 1000-watt bulbs in our fieldhouse are interesting (the custodians gave me an old one). The bulbs have both a mercury-vapor chamber and a filament. When you first energize the bulbs, they are dim. The filament radiates energy, which vaporizes the liquid mercury droplet inside the separate, central chamber. Once the mercury is vaporized, the vapor becomes part of the conducting path and the bulb gets very bright. Interestingly, the inner chamber is clear glass with no visible coating. Does anyone know what is incorporated into the glass, or coated on it, to transform the UV into visible e-m radiation?

----------------
Date: Thu, 11 May 2006
From: David Hurwitz

Traci Maxted reported that the photometer readings were slightly lower for the larger wattages. Part of the issue is the physical size of the bulbs and the volume of inert gas between the filament and the wall of the bulb. The loss can be estimated by taking photometer readings at different distances and checking them against the inverse-square rule; decent measurements should yield an offset that may explain the slightly lower readings.
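Hurwitz's check takes only a few lines of analysis. The sketch below is an editorial addition: it reads his "offset" as a distance offset d0 and fits readings taken at several distances to I(d) = A / (d + d0)^2, using the fact that 1/sqrt(I) is then linear in d. The readings here are made-up placeholders; substitute real photometer data.

# Sketch: check photometer readings against an inverse-square law with a
# distance offset, I(d) = A / (d + d0)^2. Placeholder data only.
import numpy as np

d = np.array([0.5, 1.0, 1.5, 2.0, 3.0])    # distance from bulb, metres
I = np.array([31.0, 8.6, 4.0, 2.3, 1.05])  # photometer readings, W/cm^2 (placeholders)

# If I = A / (d + d0)^2, then 1/sqrt(I) = d/sqrt(A) + d0/sqrt(A) is a straight line in d.
slope, intercept = np.polyfit(d, 1.0 / np.sqrt(I), 1)
A = 1.0 / slope**2
d0 = intercept / slope
print(f"fitted source strength A = {A:.2f}, distance offset d0 = {d0:.2f} m")

A clearly nonzero d0 (or a poor straight-line fit) would quantify the departure from a point-source inverse-square falloff that Hurwitz attributes to the bulb's finite size.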
----------------
Date: Fri, 12 May 2006
From: Brant Hinrichs

At what wavelengths is the photometer sensitive? Do you get a spectrum out (intensity vs. wavelength) or just a count of photons?

----------------
Date: Fri, 12 May 2006
From: Jean Oostens

The brightness of a bulb is expressed in lumens, written on the package or even on the bulb itself: for example, 60 Watt - 830 Lumen. One obvious way to measure it is to use a light detector, but beware of the spectral sensitivity:

1. A solar cell connected to an ammeter will respond to the infrared and favor incandescent bulbs over fluorescent ones.
2. CdS (cadmium sulphide) cells have a resistance inversely proportional to the visible light falling on them; take 1/R as a good measure. They respond closely, but not perfectly, like the human eye.
3. Phototransistors, like solar cells, favor the infrared.
4. Photographic exposure meters should work well, by definition!
5. Vernier light sensors are fine, if you have the interface to go with them.
6. The light comparator is low tech, but it follows the human-eye response perfectly by design: two blocks of paraffin are joined and viewed by the operator, with the two lights to be compared falling on the left and right sides of the device. If the lighting is the same, the dividing line disappears. (I can e-mail you a picture of the apparatus I use in my lab.)
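Oostens's paraffin comparator only tells you when the two illuminations match, but combined with the inverse-square law it can give a number: slide the comparator between a reference lamp and an unknown lamp until the dividing line disappears, and the outputs are in the ratio of the squared distances. The sketch below is an editorial illustration of that standard use; the distances and the reference lamp's rated output are assumed placeholder values.

# Sketch: infer the output of an unknown lamp from a balanced paraffin comparator.
# Placeholder numbers throughout.
d_known = 1.00      # metres from the comparator to the reference lamp
d_test = 1.35       # metres from the comparator to the lamp being tested
lumens_known = 450  # assumed rated output of the reference lamp (e.g. a 40 W incandescent)

# Equal illumination on both faces means L_test / d_test**2 = L_known / d_known**2.
lumens_test = lumens_known * (d_test / d_known) ** 2
print(f"estimated output of test lamp: {lumens_test:.0f} lumens")

With these placeholder distances the unknown lamp comes out near 820 lumens, i.e. roughly the 60 W / 830 Lumen bulb Oostens cites.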