
Analytical or Field Studies of PF Correction Energy Savings


Dan_G


This forum has quite an extensive history of discussing power factor correction topics. I've read many posts on energy savings, and several of them state that there are no energy savings whatsoever. I've also seen plenty of unscrupulous vendors claiming extraordinary energy savings of 20% or more, but the fact remains that the savings are non-zero: when power factor is corrected, less current flows, so I²R losses are reduced in transformers, conductors, etc. Several PQ experts have told me that such savings are usually below 1% of consumption in commercial and industrial facilities, though in rare instances they could amount to as much as 3% of consumption.
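To make the reasoning above concrete, here is a rough back-of-the-envelope sketch. All figures (load, voltage, series resistance, power factors) are assumptions chosen for illustration, not measurements from any real facility:

```python
# Illustrative only: how power factor correction reduces I^2*R losses
# in the supply path. All numbers below are assumed, not measured.

def line_current(kw_load, voltage, pf):
    """Current drawn for a given real power and power factor (single-phase)."""
    return kw_load * 1000.0 / (voltage * pf)

def i2r_loss_kw(current, resistance_ohms):
    """Resistive loss in the conductors and transformer windings."""
    return current ** 2 * resistance_ohms / 1000.0

kw = 100.0   # real load in kW, unchanged by correction (assumed)
v = 480.0    # supply voltage (assumed)
r = 0.05     # total series resistance of the supply path, ohms (assumed)

i_before = line_current(kw, v, 0.8)   # uncorrected, pf = 0.8
i_after = line_current(kw, v, 1.0)    # corrected,   pf = 1.0

loss_before = i2r_loss_kw(i_before, r)
loss_after = i2r_loss_kw(i_after, r)

# Current falls by 20%, so I^2*R losses fall by 1 - 0.8^2 = 36%,
# but as a fraction of total consumption the saving is small.
saving_pct = (loss_before - loss_after) / kw * 100
print(f"Loss before: {loss_before:.2f} kW, after: {loss_after:.2f} kW")
print(f"Saving as % of consumption: {saving_pct:.2f}%")
```

With these assumed figures the loss drops by 36% of itself, yet the saving works out to only on the order of 1% of total consumption, consistent with what the PQ experts quoted above.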

 

I have tried several times to find studies documenting these small energy savings from PF correction, but so far all I've been able to find is anecdotal. I'd really like a report based on a field study, or perhaps a theoretical study using a model of a typical C&I facility's electrical distribution system. Such evidence would provide a firm foundation for combatting inflated energy savings claims. Does anyone know of any studies that could help?

 

 


Hello Dan_G

 

The addition of power factor correction does save energy; there is no dispute about that. It will not, however, reduce the energy consumed by the equipment connected to the supply. The energy saved is in the distribution equipment on the supply side of the power factor correction equipment.

 

If you apply power factor correction to your main switchboard, you will not see any savings on your supply meter unless there is a long cable between your meter and your main board.

 

Some salespeople claim that energy (kWh) will be reduced significantly in the corrected plant, and this is not true. The energy savings are in the distribution system, which is not metered at the consumer's plant. The cost savings come from power factor or kVA penalties.

 

In the distribution system, the addition of power factor correction not only reduces I²R losses in the cables and transformers, it also increases the utilization of transformers and generators. Transformer and alternator loading is based on current, often expressed as kVA. If you have a load power factor of 0.5, the maximum kW loading on a transformer is halved. Correcting the power factor up to 1.0 would enable twice the kW loading compared to its maximum at a power factor of 0.5. This saves a lot of investment in distribution plant.
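The utilization point above can be sketched in a couple of lines. The 500 kVA rating here is just an assumed example figure:

```python
# Sketch of transformer utilization vs. load power factor.
# A transformer's rating is in kVA (it is current-limited),
# so the usable real power is kVA * power factor.

def usable_kw(kva_rating, pf):
    """Maximum real (kW) loading of a kVA-rated transformer at a given pf."""
    return kva_rating * pf

rating = 500.0  # kVA rating (assumed)
print(usable_kw(rating, 0.5))  # half the rating in kW at pf 0.5
print(usable_kw(rating, 1.0))  # full rating in kW at pf 1.0, i.e. double
```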

 

I do not have any case studies covering the savings in the distribution system, but it is easy to calculate the savings that can be made, provided you know what the losses are.
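As a sketch of that calculation: if the I²R losses at the present power factor are known, the loss after correction scales with the square of the current ratio, i.e. (pf_now / pf_target)². The 10 kW loss figure and the power factors below are assumptions for illustration:

```python
# Estimating the distribution-loss saving from power factor correction,
# given a known (measured or calculated) loss at the present pf.
# All input figures are assumed for illustration.

def corrected_loss(loss_now_kw, pf_now, pf_target):
    """I^2*R loss after correction; current scales as pf_now / pf_target."""
    return loss_now_kw * (pf_now / pf_target) ** 2

loss_now = 10.0  # kW of I^2*R loss at pf 0.75 (assumed)
saving = loss_now - corrected_loss(loss_now, 0.75, 0.95)
print(f"Estimated saving: {saving:.2f} kW")  # -> Estimated saving: 3.77 kW
```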

 

Best regards,


Here is a study that supports this.

Look at Table 2 on page 3. It shows that they implemented a $55,000 power factor improvement program but lists the energy savings as "N/A", meaning there were none. They do show cost savings and payback, but only because they avoided a lot of penalties; that had nothing to do with energy savings.

 

http://www1.eere.energy.gov/industry/bestp...rld_alumina.pdf

 

By the way, here are the results of my search at this site; there are some other interesting reads there.

 

http://search.nrel.gov/query.html?qp=url%3...amp;x=0&y=0

"He's not dead, he's just pinin' for the fjords!"
