
Powerperfector


peaceje


I visited the e2 energy exhibition in London yesterday and was introduced to the powerperfector voltage optimisation product, which claims to save 9 to 20% of energy use by optimising the voltage supply on the LV side of site transformers. Is anyone familiar with the product, can anyone explain the technology and, more importantly, does it work?

 

 


peaceje,

can you supply a link to their web site?

 

On another website I found this statement, "The estimated cost per kWA ", but this cannot be a serious statement.

 

We have already seen many tens of products claiming energy savings; we would be interested to learn about another one.

Regards

Mario

Mario Maggi - Italy - http://www.evlist.it - https://www.axu.it


There seem to be three camps of "energy savers" out there now: the Nola Circuiteers, who try to sell various new and improved versions of the same old tired PF controller; the Capaciteers, who try to convince everyone that adding PF correction capacitors saves energy; and the Transformer Jockeys, who assume that because you reduce the voltage at a site, the power consumption will go down. This outfit falls into that last category, the Transformer Jockeys.

 

http://www.powerperfector.com/index.html

 

The website is chock-full of glitzy marketing words and "save the planet from deadly carbon" stuff, but very short on descriptive technology information. But I found it, right here under their "Ohms Law" paper:

 

It is simply assessing the incoming supply Voltage, selecting a Voltage tap down which would be acceptable throughout the site (i.e. the performance of the equipment would not be affected) and Bobs your Auntie! Savings from day 1.

 

That part about "the performance of the equipment would not be affected" is the big lie IMHO. MOST people do not have significant overvoltage problems. Some do, and they would probably benefit from this. But most do not, so lowering your voltage will not necessarily be of benefit and in fact may even be a problem. If you have loaded motors, the current will increase, kW stays the same, no savings. Unloaded motors, same issue as the Nola controllers: if they are unloaded, why are they running? If you have lighting, you get less light output. Maybe you can live with that, maybe not. If you have electric heating, the power goes down, but it then just takes longer to do the required work, so the kWh stays the same! If you have computer loads, lowering the voltage may cause your power supplies to run hotter and fail sooner. Sure, you save energy when your computers are not working, so I guess that is technically not a lie.
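To put rough numbers on the heating and loaded-motor points, here is a back-of-the-envelope sketch in Python. Every figure in it (element resistance, motor size, power factor, the voltages) is assumed purely for illustration, not a measurement of this or any product:

# Illustrative arithmetic only -- assumed numbers, not measurements.

# Resistive heating: the energy the job needs is fixed, so lower voltage
# just means lower power for a longer time.
V_rated, V_reduced = 240.0, 220.0          # volts (hypothetical supply levels)
R = 28.8                                   # ohms, a nominal heating element
P_rated = V_rated**2 / R                   # 2000 W
P_reduced = V_reduced**2 / R               # ~1680 W
E_required = 2.0                           # kWh of heat the process actually needs
t_rated = E_required / (P_rated / 1000)    # hours at rated voltage
t_reduced = E_required / (P_reduced / 1000)
print(f"Heater: {P_rated:.0f} W for {t_rated:.2f} h vs {P_reduced:.0f} W for "
      f"{t_reduced:.2f} h -- same {E_required} kWh either way")

# Loaded motor: shaft power is set by the load, so (ignoring small loss changes)
# the current simply rises as the voltage falls and the input kW stays put.
P_input = 15_000 / 0.92                    # W input for a 15 kW motor at 92% efficiency
pf = 0.85
for V in (400.0, 380.0):                   # hypothetical line-to-line voltages
    I = P_input / (3**0.5 * V * pf)
    print(f"Motor at {V:.0f} V: ~{I:.0f} A, input still ~{P_input/1000:.1f} kW")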

 

Bottom line, they are selling something that does on the surface appear to work, but requires you to look at what you are willing to sacrifice to get it. For some, it may be worth it, for others, probably not.

"He's not dead, he's just pinin' for the fjords!"

mariomaggi, this link attempts to explain the theory

 

http://www.powerperfector.com/downloads/energy_world.pdf

 

I don't claim to understand the theory, but I'm sure there is someone out there who does?

 

When I spoke to the guy on the stand, he talked about low copper losses, advanced Japanese thyristor technology, etc. However, I came away with the feeling that it could work in certain applications but to the detriment of output.

 

There seemed to be lots of case studies for commercial / public sector installations in the UK, though not many industrial applications. I wonder if anyone out there has bought one of these units? And if so, what was the outcome with regard to:

 

1. Overall savings.

2. Payback.

3. Performance of equipment - heating lighting machinery etc.


Technically, most savings are realised in induction motors and lighting equipment.

When you optimise voltage to motors, within their normal operating range, core and winding losses are reduced, so they run more efficiently, with less stress, and live longer.

Lighting benefits by being returned to its 'design' voltage and brightness, so both current and power are reduced and lamp life is increased substantially.

 

The moment that I see reference to saving money/energy on the operation of induction motors, I get very suspicious! When motors operate at efficiencies above 92%, there is very little to be saved. You will only save energy on induction motors operating at a constant speed if they are significantly unloaded and you reduce the voltage applied to them well below the rated voltage.

An interesting point is, if you increase the voltage on a loaded motor, you increase the iron loss, but you reduce the winding loss. If you reduce the voltage, you reduce the iron loss, but increase the winding (copper) loss. You do not reduce both, unless the motor is unloaded.
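A minimal sketch of that trade-off, using first-order proportionalities only: iron loss scales roughly with V squared, while at constant shaft load the current scales roughly with 1/V, so I²R copper loss scales roughly with 1/V squared. The 300 W / 500 W rated-loss split below is an assumption for illustration, not data for any real motor:

# First-order sketch of the iron-loss vs copper-loss trade-off on a loaded
# induction motor. Proportionalities only; the rated-loss split is assumed.
V_rated = 400.0
P_iron_rated = 300.0   # W, assumed core (iron) loss at rated voltage
P_cu_rated = 500.0     # W, assumed copper loss at rated voltage and full load

for V in (360.0, 400.0, 440.0):        # -10%, rated, +10%
    ratio = V / V_rated
    p_iron = P_iron_rated * ratio**2   # iron loss ~ V^2
    p_cu = P_cu_rated / ratio**2       # constant load => I ~ 1/V => I^2*R ~ 1/V^2
    print(f"{V:.0f} V: iron {p_iron:.0f} W, copper {p_cu:.0f} W, "
          f"total {p_iron + p_cu:.0f} W")

On those assumed figures, dropping the voltage 10% cuts the iron loss but pushes the copper loss up by more, which is exactly the point: you trade one loss for the other unless the motor is lightly loaded.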

 

If you operate lamps above their rated voltage, they will produce a higher than rated light output and draw a higher than rated power input. Operation below the rated voltage will result in reduced light output, reduced colour temperature and reduced power input: no free lunch!
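For incandescent lamps there are commonly quoted rule-of-thumb exponents for how power, light output and life move with voltage; the exact values vary by source and they do not apply to fluorescent, discharge or LED lamps. A quick sketch using those approximate exponents:

# Rule-of-thumb incandescent lamp relations; approximate exponents, values vary by source.
def incandescent_at(v_ratio, exp_power=1.6, exp_light=3.4, exp_life=-13.0):
    """Relative power, light output and life for V / V_rated = v_ratio."""
    return v_ratio**exp_power, v_ratio**exp_light, v_ratio**exp_life

for v_ratio in (0.95, 1.00, 1.05):
    p, lm, life = incandescent_at(v_ratio)
    print(f"V = {v_ratio:.2f} x rated: power {p:.2f}x, light {lm:.2f}x, life {life:.1f}x")

At 5% under voltage that works out to roughly 8% less power but about 16% less light (with a much longer lamp life), which is the "no free lunch" point in a nutshell.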

 

For more info see Energy_Savers

 

Best regards,


Hi

 

I work in the water industry in electrical engineering and energy roles. I have tried the Nola-principle energy-saving motor controllers and, to the astonishment of the manufacturer, they did nothing to reduce consumption on a 66% loaded motor. Didn't surprise me, though!

 

I was recently introduced to the PowerPerfector literature and again I am highly sceptical. The only benefit can be to those whose voltage is well above the declared norm (230 V in the UK), and the best solution to that is not to buy £1000s' worth of additional step-down transformer (claimed to be better than 99% efficient, apparently!) but simply to tap down the supply transformer to give you normal voltage. In our industry, most larger sites have either our own transformer(s) or a supply company transformer that feeds us alone, so it can be tapped to whatever we want within the limits of its tap changer (usually -5% to +5%), provided the voltage remains within statutory limits (currently 230 V +10%/-6%, ultimately +10%/-10%).
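To illustrate the tap-down arithmetic, here is a sketch with assumed nominal figures (a typical UK 11 kV / 433 V distribution transformer, taps adjusting the LV output in 2.5% steps over +/-5%), checking each setting against the statutory limits quoted above:

# Illustrative numbers only: assumed 11 kV / 433 V transformer, off-load taps
# adjusting the LV output in 2.5% steps over +/-5%.
NOMINAL_LV = 433.0 / 3**0.5        # ~250 V phase-to-neutral at the transformer
STATUTORY_MIN = 230.0 * 0.94       # 230 V -6%
STATUTORY_MAX = 230.0 * 1.10       # 230 V +10%

for step in (-0.05, -0.025, 0.0, 0.025, 0.05):
    v = NOMINAL_LV * (1 + step)
    status = "within limits" if STATUTORY_MIN <= v <= STATUTORY_MAX else "outside limits"
    print(f"LV adjustment {step:+.1%}: ~{v:.0f} V phase-to-neutral ({status})")

On those assumed figures, tapping down a step or two trims the site voltage while staying comfortably inside the statutory band, which is the point: no extra kit required.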

 

For fluorescent lighting, there are good savings to be made by reducing the supply voltage. In my tests, we used a device which lowered the output voltage by 15% via an auto-transformer, provided that the output current remained fairly static and the supply voltage was above a certain limit in the first place. Any increase in current causes it to revert to full voltage for 3 minutes, to help newly switched-on lamps reach normal operating conditions, then it reduces again. The measured drop in light levels was imperceptible, whilst the reduction in energy was considerable (>25%). In large fluorescent-lit areas I would recommend these, but in our case the cost of combining circuits to make adequate use of as few devices as possible, on a building soon to be sold off, was not justifiable.
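As a rough sketch of the control behaviour described above (entirely hypothetical code with assumed thresholds; only the 15% reduction and the 3-minute full-voltage period come from the description):

import time

REDUCTION = 0.85          # hold output at 85% of supply voltage when saving
MIN_SUPPLY_V = 235.0      # assumed: only reduce if the incoming supply is above this
CURRENT_RISE_A = 0.5      # assumed: a rise this big means lamps have just switched on
WARMUP_S = 3 * 60         # run at full voltage for 3 minutes after a current rise

def control_step(supply_v, current_now, current_prev, warmup_until, now):
    """One control pass: return (output voltage, updated warm-up deadline)."""
    if current_now - current_prev > CURRENT_RISE_A:
        warmup_until = now + WARMUP_S             # new load: give lamps full voltage
    if now < warmup_until or supply_v < MIN_SUPPLY_V:
        return supply_v, warmup_until             # full, unreduced voltage
    return supply_v * REDUCTION, warmup_until     # steady state: ~15% reduction

# Example pass: steady lighting load on a healthy supply -> reduced output
v_out, _ = control_step(supply_v=245.0, current_now=12.0, current_prev=12.0,
                        warmup_until=0.0, now=time.time())
print(f"Steady-state output ~{v_out:.0f} V")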


Excellent post JohnD, thanks. I think your experience sums up the entire "energy saver" issue quite nicely. Some of the benefits are real, most are not and those that are can usually be accomplished much more simply.
"He's not dead, he's just pinin' for the fjords!"
