Voltage "Optimiser"


#1 bambam


  • Members
  • 1 posts

Posted 10 March 2004 - 08:38 AM

I have recently been approached by a company about a product that could reportedly save up to 30% of the energy used by lamps (except those fitted with electronic ballasts). The product "optimises" the line voltage by introducing an impedance in the lighting circuit via an autotransformer. It appears that most lamps are designed to operate at a particular voltage level. However, in most cases the lamps are actually operating at a higher voltage. By stepping the circuit voltage down to the design level, the product could evidently reduce kVA and kW as well as reduce I²R losses in the circuit.

The company also reasoned that a higher-than-desired supply voltage results in higher losses and a higher operating current. Lamps operating at a higher current than the optimum will suffer faster carbonizing of the filaments, weaker ionization, and reduced brightness.

The company claimed that their product slows the pace of carbonizing by "optimization" of the current and voltage, resulting in consistent brightness and a longer lamp lifespan. They stressed that theirs is not a power factor correction device.

As my knowledge and understanding of this kind of technology is very limited, I would appreciate it if anyone could help verify the validity of these claims. Any advice on how to evaluate such products is also welcome. To me, it simply sounds too good to be true.

#2 marke


    Posting Freak

  • Moderator
  • 2,600 posts
  • Gender:Male
  • Location:Christchurch, New Zealand

Posted 10 March 2004 - 09:05 AM

Hello bambam

Well, the answer is yes: you can save energy if the voltage is too high, but it is a question of how much.
If you have a resistive load (incandescent lamps are resistive) with a constant resistance, then the power is equal to the voltage squared divided by the resistance. If the voltage is ten percent higher than it should be, then you will dissipate 21% more power than you should.
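The constant-resistance arithmetic can be sketched in a few lines of Python (a toy calculation only, not a lamp model):

```python
# Constant-resistance model: P = V^2 / R, so power scales with the
# square of the voltage ratio.

def power_ratio_constant_r(overvoltage_fraction):
    """Power drawn relative to nominal, for a fixed resistance."""
    return (1.0 + overvoltage_fraction) ** 2

extra = power_ratio_constant_r(0.10) - 1.0
print(f"Extra power at +10% voltage: {extra:.0%}")  # prints 21%
```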
In the case of an incandescent lamp, however, the resistance of the lamp is the resistance of the filament, which is metallic. With metallic resistances, the resistance rises as the temperature rises. This reduces the increased power consumption of the lamp to a lower figure, but there will still be some increase in lamp power. The higher filament temperature will also increase the rate of vaporisation of the filament and degrade lamp life.

As the filament degrades, it is common for the light output to rise rather than fall, so the lamp can actually get brighter until it fails.

I would guess that in the example above, the increased filament temperature will pull the power increase closer to the voltage rise, so a 10% rise in voltage would result in something closer to a 10% rise in power than the theoretical 21% for the constant-resistance case.
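To put a rough number on that guess: lamp handbooks often quote an empirical relation of roughly P ∝ V^1.5 to V^1.6 for tungsten filaments. Taking an exponent of 1.55 (an assumed figure, not one from this thread), the two models compare as follows:

```python
# Compare the constant-resistance exponent (2.0) with an assumed
# empirical tungsten-filament exponent (~1.55) at 10% over-voltage.

def power_ratio(voltage_ratio, exponent):
    return voltage_ratio ** exponent

v = 1.10  # +10% voltage
print(f"constant R (n=2.00): +{power_ratio(v, 2.00) - 1:.1%}")
print(f"empirical  (n=1.55): +{power_ratio(v, 1.55) - 1:.1%}")
```

The empirical exponent lands the power increase between the 10% voltage rise and the 21% constant-resistance figure, consistent with the reasoning above.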
30% sounds like a very high figure to me, unless the voltage is being dropped well below the nominal voltage and the lamps are producing less than their rated output.

In many parts of the world, there is a limit on the range of voltage that is supplied. In this country, the supply is 230 V +/- 5%, so the variance in output power would be closer to 5%.
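The supply-tolerance point can be checked the same way (constant-resistance sketch; a real filament will swing less, as described above):

```python
# Worst-case power swing for a 230 V +/-5% supply under the
# constant-resistance model: bounded by (1.05)^2 - 1, about 10%.

nominal = 230.0
for v in (nominal * 0.95, nominal, nominal * 1.05):
    ratio = (v / nominal) ** 2
    print(f"{v:6.1f} V -> {ratio - 1:+.1%} power (constant R)")
```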

As usual, I would strongly recommend tests to be done.

I believe that with some discharge-type lamps there may well be greater voltage dependence, but as these lamps age their required voltage often increases, so you would need to adjust the voltage over the life of the lamp.

Perhaps someone else out there can add from real life experience in this area.

Best regards,
