In a previous blog, we talked about the effect that different waveforms have on RMS vs. peak voltage. Now we’re going to look at the ramifications of that when it comes to power consumption. For purposes of discussion, we are going to use an old-fashioned light bulb with a filament – something that’s easier for most of us (especially me) to understand than the new-fangled light bulbs, which present a somewhat different picture (a subject for another blog). You may have noticed that light bulbs carry not only a wattage rating but a voltage rating as well. We don’t think about the voltage rating as much because, for ordinary household bulbs, it is almost always 120 volts.

So what does a bulb rating of “100 watts” mean? Exactly this: when the rated voltage (120 volts for a typical light bulb) is applied across the filament, the resistance of the filament allows a current of about 0.83 amps to flow, consuming energy at the rate of 100 watts. Doing a little more math with R (resistance) = E (voltage) / I (current), we can calculate that the resistance of the filament when operating at 100 watts is approximately 145 ohms. It should be noted that this is the resistance at operating temperature (the resistance rises as the filament heats up), but for now let’s say it is fixed at 145 ohms to keep things simple. This is reasonable because the resistance of the filament depends on its temperature, and the temperature does not change appreciably over a 1/60th of a second cycle.
Note – A light bulb rated at 60 watts at 120 volts needs a different filament resistance in order to consume 60 watts instead of 100. The filament in a 60 watt bulb has a resistance of about 240 ohms, compared to roughly 145 ohms for a 100 watt bulb.
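If you like to check this kind of arithmetic with a few lines of code, here is a quick sketch in Python (my own throwaway helper names, nothing official) that works out the current and filament resistance from the nameplate rating, assuming the filament behaves as a fixed resistor at operating temperature:

```python
# Back-of-the-envelope check of the filament figures quoted above.
# Assumes the filament acts as a fixed resistor at operating temperature.

def rated_current(rated_watts, rated_volts=120.0):
    """Current at the rated voltage: I = P / V."""
    return rated_watts / rated_volts

def filament_resistance(rated_watts, rated_volts=120.0):
    """Resistance implied by the nameplate rating: R = V * V / P."""
    return rated_volts ** 2 / rated_watts

for watts in (100, 60):
    print(f"{watts} W bulb: {rated_current(watts):.2f} A, "
          f"{filament_resistance(watts):.0f} ohms")

# Prints roughly 0.83 A / 144 ohms for the 100 watt bulb (rounded to 145
# in the text above) and 0.50 A / 240 ohms for the 60 watt bulb.
```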
The above assumes that the voltage source is a constant 120 volts. But when the source is a sine wave, the voltage is not constant at 120 volts. A sine wave with an RMS value of 120 volts actually swings between roughly minus 170 volts and plus 170 volts. A resistive load draws the same magnitude of current regardless of polarity, so in the case of a light bulb a negative voltage produces the same amount of power (the same amount of light) as a positive voltage of equal size. Thinking a little further, though, because the voltage is varying while the resistance of the filament remains the same, the power consumption varies over time, with the average power consumption being 100 watts.
Current flow (amperage) varies with the voltage applied across the filament. When there is no voltage (at the start of the cycle), there is no power consumption – basically the bulb is off. As the voltage increases to 120 volts, the power increases accordingly to 100 watts. As the voltage climbs toward the peak of the sine wave, it reaches approximately 170 volts. Since the resistance of the filament doesn’t change (145 ohms), the current at 170 volts is about 1.17 amps, and the resulting power consumption at that instant is just short of 200 watts – about 198.5 watts, or twice the average. So the light bulb, rated at 100 watts at 120 volts, is actually consuming energy at the rate of nearly 200 watts at the peak of the voltage cycle!
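For readers who like to see the numbers, here is a rough simulation sketch (again Python, with variable names I made up for this post) that steps through one 60 Hz cycle with the filament held at 145 ohms and a clean sine wave assumed for the supply:

```python
import math

R = 145.0                        # filament resistance in ohms, held constant
V_RMS = 120.0                    # RMS supply voltage
V_PEAK = V_RMS * math.sqrt(2)    # about 170 volts

steps = 1000
powers = []
for n in range(steps):
    t = n / steps / 60.0                              # time within one 1/60 s cycle
    v = V_PEAK * math.sin(2 * math.pi * 60.0 * t)     # instantaneous voltage
    powers.append(v * v / R)                          # instantaneous power, P = V^2 / R

print(f"peak power:    {max(powers):.1f} W")          # just under 199 W
print(f"average power: {sum(powers) / steps:.1f} W")  # about 99 W, i.e. the nominal 100 W
```

The peak comes out to almost exactly twice the average, which is just what you would expect from a sine wave driving a fixed resistance.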
While researching this blog (I do research these things), I found a video that illustrates the above. The filament never goes totally dark in the video because of its thermal inertia, but the pulsations in brilliance are evident.
The “take away” here is that a 100 watt light bulb actually consumes 100 watts only when operated at its intended voltage (direct current or RMS). Reduced voltage results in less power, while higher voltage results in more power, unless the resistance of the filament is changed. A 100 watt light bulb intended for operation at 120 volts no longer consumes 100 watts if the voltage changes!
Note – Yes, this is exactly why “long life” light bulbs of the past were rated 100 watts at 130 volts instead of 120. When operated at 120 volts, they were dimmer – not consuming a full 100 watts. This resulted in a longer operating life because the peak operating power was considerably reduced.
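As a rough illustration (my own back-of-the-envelope figures, not anything from a manufacturer’s data sheet), the same fixed-resistance arithmetic shows why such a bulb runs dimmer on a 120 volt line:

```python
# A bulb rated 100 watts at 130 volts, operated on a 120 volt line.
RATED_WATTS = 100.0
RATED_VOLTS = 130.0   # nameplate voltage of the "long life" bulb
LINE_VOLTS = 120.0    # voltage it actually sees in service

resistance = RATED_VOLTS ** 2 / RATED_WATTS       # about 169 ohms
power_in_service = LINE_VOLTS ** 2 / resistance   # about 85 watts

print(f"filament resistance: {resistance:.0f} ohms")
print(f"power at {LINE_VOLTS:.0f} volts: {power_in_service:.0f} watts")
```

Running about 15 watts under its rating, the peak operating power is correspondingly reduced, which is where the longer life comes from.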
– JF –