Posted 10 March 2004 - 08:38 AM
The company also reasoned that a higher-than-desired supply voltage could result in higher losses and higher operating current. Lamps operating above their optimum current, they said, suffer faster carbonizing of the filaments, reduced ionization strength, and reduced brightness.
The company claimed that their product slows the pace of carbonizing by "optimizing" the current and voltage, thereby giving consistent brightness and a longer lamp lifespan. They stressed that theirs is not a power factor correction device.
As my knowledge of how this kind of technology works is very limited, I would appreciate it if anyone could help verify the validity of the claims made. Any advice on how to evaluate such products is also welcome. To me, it just sounds too good to be true. :p
Posted 10 March 2004 - 09:05 AM
Well, the answer is yes, you can save energy if the voltage is too high, but it is a question of how much.
If you have a resistive load (Incandescent lamps are resistive) with a constant resistance, then the power is equal to the voltage squared divided by the resistance. If the voltage is ten percent higher than it should be, then you will dissipate 21% more power than you should.
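A quick sanity check of that constant-resistance arithmetic (the resistance value below is illustrative, roughly a 100 W lamp at 230 V; it cancels out of the ratio anyway):

```python
# Power dissipated in a constant resistance: P = V^2 / R.
# Compare the nominal voltage with a 10% over-voltage.
R = 529.0            # ohms (illustrative; cancels in the ratio)
V_nominal = 230.0    # volts
V_high = V_nominal * 1.10

P_nominal = V_nominal ** 2 / R
P_high = V_high ** 2 / R

increase_pct = (P_high / P_nominal - 1) * 100
print(f"Power increase at +10% voltage: {increase_pct:.0f}%")  # 1.1^2 = 1.21, i.e. 21%
```

The key point is that power scales with the square of voltage for a fixed resistance, so a 10% voltage rise gives a 21% power rise, not 10%.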
In the case of an incandescent lamp, however, the resistance of the lamp is the resistance of the filament, and this is metallic. With metallic resistances, as the temperature rises, the resistance also rises. This will reduce the increased power consumption of the lamp to a lower figure, but there will still be some increase in lamp power. The higher filament temperature will also increase the rate of vapourization of the filament and degrade the lamp life.
As the filament degrades, it is common for the light output of the filament to rise rather than fall, so the lamp can get brighter up till it fails.
I would guess that with the example above, the increased filament temperature will reduce the power increase to closer to the voltage rise, so a 10% rise in voltage would result in closer to a 10% rise in output than the theoretical 21% for a constant resistance case.
30% sounds like a very high figure to me, unless the voltage is being dropped well below the nominal voltage and the lamps are producing less than their rated output.
In many parts of the world, there is a limit on the range of voltage that is supplied. In this country, the voltage is 230 Volts +/- 5%, so the variation in output power would be closer to 5%.
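As a rough sketch of why the real spread is smaller than the constant-resistance case suggests: a commonly cited rule of thumb for incandescent lamps puts power draw roughly proportional to V^1.6, because the hotter filament's resistance rises with voltage. The exponent is an approximation I am assuming here, not a figure from the posts above:

```python
# Rule-of-thumb scaling for incandescent lamps. A constant resistance
# would give an exponent of 2; the rising filament resistance pulls the
# effective exponent down to roughly 1.6 (assumed, approximate value).
POWER_EXPONENT = 1.6

def power_ratio(v_ratio, exponent=POWER_EXPONENT):
    """Relative power draw P/P_rated for a given voltage ratio V/V_rated."""
    return v_ratio ** exponent

# Spread over a 230 V +/- 5% supply, plus the +10% case discussed above.
for v in (0.95, 1.00, 1.05, 1.10):
    print(f"V/V_rated = {v:.2f} -> P/P_rated = {power_ratio(v):.3f}")
```

On these assumptions a +10% voltage gives roughly a 16% power rise rather than the 21% of the constant-resistance case, which matches the "closer to the voltage rise" argument above.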
As usual, I would strongly recommend tests to be done.
I believe that with some discharge-type lamps there may well be greater voltage dependence, but often as these lamps age, their required voltage increases, so you would need to adjust the voltage over the life of the lamp.
Perhaps someone else out there can add from real life experience in this area.
Mark Empson | administrator
Skype Contact = markempson | phone +64 274 363 067
LMPForum | Power Factor | L M Photonics Ltd | Empson family | Advanced Motor Control Ltd | Pressure Transducers | Smart Relay | GSM Control | Mark Empson Website | Soft Starters