One of the problems with using the number of watts consumed to produce a particular output of another form of energy (light, motion, heat, etc.) is that there are always losses when one form of energy is converted to another. Although the law of conservation of energy always applies, the energy lost in the conversion is not always obvious. Let’s consider devices that convert electrical energy into light as an example.
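To make those losses concrete, here is a minimal sketch in Python. The figures are typical published values assumed for illustration (a 60 watt incandescent bulb and a 9 watt LED each producing roughly 800 lumens), not measurements from any particular bulb, and 683 lumens per watt is the theoretical maximum for ideal green light.

```python
# Rough sketch: how much of a bulb's electrical input becomes visible light.
# Illustrative numbers only; actual bulbs vary.

MAX_EFFICACY = 683.0  # lumens per watt for ideal 555 nm light (theoretical maximum)

def luminous_efficiency(lumens_out, watts_in):
    """Fraction of electrical input power that ends up as visible light."""
    efficacy = lumens_out / watts_in   # lumens produced per electrical watt
    return efficacy / MAX_EFFICACY     # compared to the theoretical ideal

# Assumed typical figures: 60 W incandescent vs. 9 W LED, each about 800 lumens.
print(f"Incandescent: {luminous_efficiency(800, 60):.1%} of input becomes light")
print(f"LED:          {luminous_efficiency(800, 9):.1%} of input becomes light")
```

The rest of the input power is lost, mostly as heat, which is exactly the loss the watt rating never revealed.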
There is no better example of the trouble with watts than the one we see in the lighting industry today. When Thomas Edison invented the first practical incandescent light bulb, he rated his bulbs in candlepower. In his wisdom, he knew that the “deliverable” was the amount of light produced. Also in his wisdom, however, he knew that the customer would use a certain amount of energy to produce that light, and guess who provided the energy. He quoted new lighting installations as so many light fixtures, each providing the equivalent light of so many candles, because it was the amount of light produced that mattered to the potential buyer. As time passed, the candlepower rating gave way to watts as a way to rate light bulbs. Although I have not been able to find a clear explanation of why this happened, I suspect that the cost of electricity had something to do with it. Knowing the rate of energy consumption of each bulb provided a way to determine how much energy would be used overall. This calculation, of course, would account for the time each bulb would be illuminated and could be converted directly into cost.
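As a quick sketch of that calculation (the wattage, hours of use, and electricity rate below are assumed example values), energy use is simply power multiplied by time, and cost follows from the rate per kilowatt-hour:

```python
# Minimal sketch of the energy-to-cost calculation described above.
# The electricity rate is an assumed example value, not a quoted figure.

def operating_cost(watts, hours, rate_per_kwh=0.15):
    """Return energy used (kWh) and cost for one bulb over a period of use."""
    kwh = watts / 1000 * hours   # energy = power x time, converted to kilowatt-hours
    return kwh, kwh * rate_per_kwh

# Example: a 60 W bulb lit 5 hours a day for 30 days.
kwh, cost = operating_cost(60, 5 * 30)
print(f"{kwh:.1f} kWh, about ${cost:.2f}")
```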
Despite his best efforts, Edison’s first light bulbs were not very efficient. It takes a lot of energy to heat a carbonized bamboo filament to incandescence, and most of that energy ends up as heat rather than light. Over the years, continued efforts to produce more light using less energy did improve the efficiency of the light bulb but, having become a convention, watts remained the way of expressing its light output capacity. In simple terms, a 60 watt light bulb, for example, became a bit brighter with each improvement. The changes in efficiency were not huge, so people were happy to continue using watts as a way to, indirectly, define the brightness of a light bulb.
In more recent years, starting with the invention of the fluorescent bulb, improvements in efficiency became significant enough to begin the downfall of the watt as a means of expressing the brightness of a light source. At first, the distinctive design of fluorescent lamps (tubes instead of bulbs) provided enough distinction that the mismatch between watts and brightness wasn’t an issue. However, as technology advanced further to light-emitting diode (LED) light sources, and as fluorescent lamps took on a form emulating that of the incandescent bulb, the mismatch between watts and brightness became so significant that a new way had to be found to express the brightness of a light source that was not linked to its energy consumption. As a result, light bulbs today are rated in lumens, with watts used to indicate energy consumption and, in combination with lumens, efficiency. One lumen is defined as the amount of light that falls on a one square foot area at a distance of one foot from a burning candle (right back to Edison). Now, instead of shopping for a 60 watt light bulb (for example), we shop for a light source that produces a certain number (or range) of lumens. The following chart shows some approximate ranges for comparison.
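The figures below are typical published equivalences for common incandescent wattages; actual output varies somewhat by manufacturer and bulb type.

Incandescent watts     Approximate light output
25 watts               about 250 lumens
40 watts               about 450 lumens
60 watts               about 800 lumens
75 watts               about 1,100 lumens
100 watts              about 1,600 lumens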
Watts have given way to lumens as the way of specifying the light output of light sources. But this is just one of the problems with watts. Others will be discussed in upcoming blogs.
– FJF –