Voltage Reduction for Energy Saving

Status
Not open for further replies.

mogley

Electrical
Apr 11, 2005
3
Can anyone advise me on the subject of end-users operating their commercial and industrial buildings at a reduced site voltage to reduce energy consumption?

I can understand the potential savings on offer by reducing the supply voltage to resistive-type loads (within the manufacturer's tolerances), but for motor loads with a fixed mechanical load, for example, a reduction in the supply voltage will not reduce power consumption (and thus energy consumption).
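
To illustrate what I mean about the fixed mechanical load, here is a rough back-of-the-envelope sketch (the motor figures below are just assumed typical values, not from any real site):

```python
# Illustrative sketch: why lowering the voltage does not cut the energy drawn
# by a motor driving a fixed mechanical load. All numbers are assumed values.

shaft_kw = 7.5          # fixed mechanical load on the shaft (assumed)
efficiency = 0.90       # assumed motor efficiency near full load
power_factor = 0.85     # assumed power factor near full load

def line_current(voltage_ll, shaft_kw, eff, pf):
    """Approximate 3-phase line current for a given line-to-line voltage."""
    input_kw = shaft_kw / eff          # electrical input needed for the fixed shaft load
    return input_kw * 1000 / (3**0.5 * voltage_ll * pf)

for v in (400, 380):                   # nominal vs. roughly 5% reduced supply
    i = line_current(v, shaft_kw, efficiency, power_factor)
    print(f"{v} V: input = {shaft_kw/efficiency:.2f} kW, line current = {i:.1f} A")

# The input kW is set by the load, so it is essentially unchanged; the current
# rises as voltage falls, which increases I^2R heating in the windings and cable.
```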

Does anyone out there have experience of such applications and if so, could you advise me on the issues that would need to be considered?

Thanks
 

Even the resistive loads can be troublesome if controls are involved. They would be better managed by simply turning down the temperature thereby saving energy.

The motor guys here will probably drop on this idea like a ton of bricks. :)

Me, I think it is a very poor idea. The chances of a control problem being caused by low voltage are great. A single control problem could cause machinery damage, product loss, or production disruption far exceeding any savings.
 
Where did they get the idea that reducing voltage would reduce energy consumption in the first place? If that were the case, why not just run everything at reduced voltage? Heck, if you reduce it to 0, you save a TON of energy!

There are still plenty of people on the internet promoting this concept for AC motors, based on an old NASA patent that reduced the voltage of UNLOADED motors in order to reduce the total energy consumption. That concept is for the most part bogus, because you can only save energy that is being wasted, and their claims are based on amounts of waste that are completely false. Check out this paper for more info on that: Energy Savers paper link. Anyway, it only worked, as I said, on UNLOADED motors. If your motors are actually DOING anything, reducing the voltage does nothing more than overheat them, actually wasting more energy.

As for other loads, what are the other loads? Power supplies such as those in electronic equipment and HID or fluorescent lighting will draw higher current if the voltage is reduced, so again, no savings and in fact a risk of overheating them. Resistive loads will draw less power, but of course that is because they will produce less of what they are supposed to be producing. For instance, incandescent lights will be dimmer and resistive heaters will be cooler. If you are heating a tank of fluid and you reduce the voltage to the heaters, they will produce less heat, but then it will take longer to heat up the tank. Ultimately you gain nothing. In fact, taking longer allows more time for convection losses, so it may take MORE kWh to heat the tank! So again, what's the point?
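
To put rough numbers on the tank example (all values below are made up for illustration):

```python
# Sketch of the tank-heating argument with assumed numbers: lower voltage
# reduces heater power (P = V^2 / R) but not the energy the water needs,
# so heat-up takes longer and standing losses have more time to add up.

R = 26.5                 # heater element resistance in ohms (assumed)
energy_needed_kwh = 50.0 # energy to bring the tank up to temperature (assumed)
standing_loss_kw = 1.0   # convection/standing loss while heating (assumed)

for v in (230, 218):     # nominal vs. roughly 5% reduced voltage
    p_kw = v**2 / R / 1000                                  # heater output at this voltage
    hours = energy_needed_kwh / (p_kw - standing_loss_kw)   # longer at lower power
    total_kwh = p_kw * hours                                # energy actually purchased
    print(f"{v} V: {p_kw:.1f} kW heater, {hours:.1f} h to heat, {total_kwh:.1f} kWh drawn")

# The lower-voltage case draws less power but runs longer, and ends up buying
# MORE kWh because the standing loss is paid for over more hours.
```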

"Our virtues and our failings are inseparable, like force and matter. When they separate, man is no more."
Nikola Tesla

 
I should have mentioned one other possibility, however.
If you already have TOO HIGH a voltage at the facility, you are already wasting energy, so lowering it might work by bringing everything back into spec. By too high, I mean 10% or more over nominal. So if a facility is getting 528V instead of 480V, then it would make sense to lower the voltage, because you would be getting increased excitation losses on most of your loads.
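
Quick arithmetic on those numbers:

```python
# How far a measured 528 V service is above a 480 V nominal (values from the post).
nominal = 480.0
measured = 528.0
over_pct = (measured - nominal) / nominal * 100
print(f"{over_pct:.0f}% above nominal")   # -> 10%, right at the edge of the usual +10% limit
```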

"Our virtues and our failings are inseparable, like force and matter. When they separate, man is no more."
Nikola Tesla

 
One possible area where overvoltage can cost you money is power factor penalties. A motor designed to put out 10 hp at 460V and connected to a load which requires 10 hp will still draw about the same total power to do the job at 480V, but will do it at a significantly worse power factor. Check whether the site is paying unusual power factor penalties (assuming it is that large), and whether the service voltage is at the higher end of the specification, which is not uncommon. If so, your first step would be to evaluate the feasibility of dropping the transformer by one or two taps, provided you are sure there aren't any long low-voltage runs to motors which might be suffering a serious voltage drop in the feeder past your voltage test point.
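
As a rough illustration of the kVA effect (the power factor values below are assumed for illustration, not measured):

```python
# Illustrative power-factor-penalty check with assumed numbers: the same 10 hp
# of real work done at a worse power factor means more kVA, and some tariffs
# bill demand in kVA or add a surcharge below a PF threshold.

real_kw = 10 * 0.746 / 0.90      # 10 hp shaft load at an assumed 90% efficiency

cases = {
    "460 V (nameplate)": 0.86,   # assumed PF at rated voltage
    "480 V (high end)":  0.80,   # assumed, poorer PF at elevated voltage
}

for label, pf in cases.items():
    kva = real_kw / pf
    print(f"{label}: {real_kw:.2f} kW real, {kva:.2f} kVA apparent, PF {pf:.2f}")

# If the utility bills demand in kVA, the higher-voltage case costs more for
# exactly the same useful output; whether that matters depends on the tariff.
```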

A site's ideal voltage is relative. The goal is to keep the voltage seen by all the load equipment at its terminals within manufacturer's specs at all times, and, given that, at the level which minimizes billing by minimizing energy wasted as I^2R in conductors and minimizing power factor penalties. Don't forget, occasionally service providers will reduce the voltage they provide to you for reasons of their own as well, though in my experience that would be an unusual circumstance. It was apparently quite common in California in 2000, though.

Pechez les vaches.
 
Voltage reduction does save energy with lighting fixtures, and also with motors running at partial load most of the time.

With illumination, you need to be satisfied with lower light output at reduced voltage than at rated voltage (especially true with incandescent lamps).

In the case of motors, the saving is due to reduced flux, resulting in reduced hysteresis and eddy-current losses.

In the case of street lights / perimeter lights, after all, it is not a bad idea to reduce voltage during low-traffic periods; it is akin to switching off selected light fixtures through timers.
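
For the lighting trade-off, a commonly quoted rule of thumb (approximate exponents, not exact physics) gives a feel for what you give up:

```python
# Rule-of-thumb sketch for an incandescent lamp (often-quoted approximate
# exponents): power scales roughly with V^1.6 and light output roughly with
# V^3.4, so a voltage cut saves some energy but dims the lamp disproportionately.

v_ratio = 0.95                      # 5% voltage reduction (assumed)
power_ratio = v_ratio ** 1.6        # approximate power scaling
light_ratio = v_ratio ** 3.4        # approximate lumen scaling

print(f"Power:  {power_ratio:.0%} of rated")   # ~92%
print(f"Light:  {light_ratio:.0%} of rated")   # ~84%

# Roughly an 8% energy saving bought with a ~16% drop in light output --
# tolerable for low-traffic street lighting, not for a working office.
```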
 
Reducing voltage will generally reduce power consumption on mixed-use feeders. The amount of reduction depends on the types of loads. As jraef mentioned, reducing voltage to an AC motor does not really reduce energy consumption in the motor. But in aggregate, less voltage means less power consumed.

Some utilities take advantage of this in their distribution substations and automatically adjust the voltage regulators to cut voltage during periods of high demand, in order to reduce their demand charges paid to the power wholesaler. They don't brag much about this, obviously, but they do it, and all modern regulator controllers have this capability. However, these are relatively small voltage decreases (0.5% to maybe 2.5%).
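
Roughly, the arithmetic the utilities use looks like this (the CVR factor below is an assumed, typical-looking value, not from any real feeder):

```python
# Sketch of the "conservation voltage reduction" arithmetic:
# percent demand reduction ~= CVR_factor * percent voltage reduction.

feeder_demand_mw = 20.0      # assumed feeder peak demand
voltage_reduction_pct = 2.5  # top of the 0.5%-2.5% range mentioned above
cvr_factor = 0.8             # assumed: % demand change per % voltage change

demand_drop_mw = feeder_demand_mw * cvr_factor * voltage_reduction_pct / 100
print(f"Estimated demand relief: {demand_drop_mw:.2f} MW on a {feeder_demand_mw:.0f} MW feeder")

# Small per feeder, but worthwhile to a utility paying wholesale demand charges;
# it says little about kWh savings at any single customer's meter.
```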
 
Many thanks guys.....appreciate all your help.
 
As dpc and steveal's link pointed out, this has some (debatable) effect on the utility as a whole, but there would be no "upside" to it for a single facility, only risks.

"Our virtues and our failings are inseparable, like force and matter. When they separate, man is no more."
Nikola Tesla

 
Also, you have to distinguish between reducing peak demand and reducing energy consumption.

If you are truly wanting to reduce energy consumption (kWh), then reducing voltage probably won't do too much.

If you want to reduce your demand charge (kW), you would be better off looking at shutting down non-critical loads automatically as demand increases.
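
A minimal sketch of that load-shedding logic, with hypothetical load names, ratings, and target:

```python
# Minimal sketch of the "shed non-critical loads" approach to demand charges.
# The load names, target, and readings are all hypothetical.

demand_target_kw = 900.0

# Non-critical loads in the order they may be shed, with assumed ratings.
sheddable = [("chiller_2", 120.0), ("battery_chargers", 45.0), ("comfort_heaters", 60.0)]

def loads_to_shed(projected_demand_kw):
    """Return which non-critical loads to drop to get back under the target."""
    shed, excess = [], projected_demand_kw - demand_target_kw
    for name, kw in sheddable:
        if excess <= 0:
            break
        shed.append(name)
        excess -= kw
    return shed

print(loads_to_shed(1010.0))   # -> ['chiller_2']: 110 kW over, first shed covers it
print(loads_to_shed(1060.0))   # -> ['chiller_2', 'battery_chargers']
```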

Again, this really depends a lot on the type of facility and the types of loads you have.

 
Thanks again for all of your responses guys.

After receiving more information from the client, he claims that since reducing the voltage at two substations by 5% in May/June, he's realised energy savings of £10k at each substation.

The buildings being fed by the substations are office buildings, as well as industrial process buildings.
 