Can anyone tell me why, in single phase, reducing the voltage reduces the power consumption (e.g. lighting), whereas this is not the case with 3-phase equipment or non-resistive loads?
I have no idea what this should have to do with single- or 3-phase operation. To my knowledge such techniques are only applied to single-phase loads, but of course the voltage reduction devices are built in a 3-phase design, because they can only be economic where larger numbers of these (smallish) loads are being operated. Then, as a rule, they are (more or less) evenly distributed across the three phases.
But it remains to be considered that different loads react in totally different manners to the application of this technique:
The voltage reduction technique is a very good means of making fluorescent lamps with magnetic ballasts more efficient. In fact, at 207 V a 58 W lamp with a class B1 ballast is already more efficient than with an electronic class A3 ballast! Similar effects may possibly be expected in street lighting (high-pressure discharge lamps).
With incandescent lamps a certain energy saving is possible if a substantial loss of light output is accepted. Lamp life can be substantially extended.
With nearly all other applications the effect is either futile or adverse, and may even become dangerous! So the voltage reduction technique should only be applied where dedicated lines for the lighting are provided and the type of ballasts used is known.
An excellent comment Stefan but you forgot to mention that many people who fit voltage reduction equipment do so for an entirely different reason.
UK nominal voltage is 230 V single phase and 400 V three phase. But measure your actual supply and you may discover that, like a large number of household and industrial sites, you are in the range 253-256 V.
This high voltage doesn't inherently "damage" your loads, but it can overstress them, and therefore voltage reduction may well increase the life of the sensitive components.
Further, since many devices are designed to operate over a large input window and have fixed power outputs, it may seem as though changing the voltage will only increase the current without reducing the power. But even in laptop power supplies a very small amount of power is wasted (as heat) when the input voltage is higher than "nominal". So savings on a varied load won't be the square of the voltage reduction, as you would expect on a purely resistive load, but if you started with above-nominal voltage they may be higher than predicted by Stefan's comments.
Great comments from Stefan & Nathan.
Nathan wrote "but you forgot to mention that many people who fit voltage reduction equipment do so for an entirely different reason"
But Nathan himself forgot (or was somehow reluctant) to mention the further reduction of voltage below nominal, done safely of course. You can find many documents on the effect of, for example, a 1% voltage reduction from nominal; one can check:
Kirshner, D.; Giorsetto, P., "Statistical Test of Energy Saving Due to Voltage Reduction", IEEE Transactions on Power Apparatus and Systems, Volume PAS-103, Issue 6, June 1984, pages 1205-1210, DOI 10.1109/TPAS.1984.318450.
Summary: A number of utilities have conducted field tests of the effect of distribution voltage reductions on energy consumption. The tests have been interpreted as giving mixed results. This paper reviews eight tests and statistically analyzes the results. These tests consistently support significant energy savings as a result of voltage reduction. Percent energy savings for each 1% reduction in voltage average 0.76%, 0.99%, and 0.41% for residential, commercial, and industrial class loads, respectively.
Well, it is two different pairs of boots, as we say in Germany, whether you reduce your line voltage to get below the rated voltage, down to the lower tolerance limit or even lower, in order to (try to) save energy, or whether you reduce your overvoltage in order to come down to rated voltage!
The study confirms just what I say: if all loads in the households had been of the (uncontrolled) ohmic type, 1% of voltage reduction would have led to 1.21% energy saving. Since a part of them is controlled and the rest is not ohmic and hence does not save anything, you reap a mean saving varying from 0.4% to 1%. These savings occurred in fluorescent lamps with magnetic ballasts and in incandescent lamps, the former losing a little bit (0.5%?), the latter a relatively great deal of brightness (4%?), neither of which was realized by the consumers, or they felt there was nothing they could do about it.
Voltage reduction for lighting (not including HID lamps) is one method of saving energy by exploiting the human eye's adaptation to lower illuminance levels; a study by the US Lighting Research Center found that our eyes hardly notice a 10% drop in illuminance. Also, a higher colour temperature lamp (6500 K) gives better visual performance than a lower colour temperature lamp (2700 K). They call this the "pupil lumen" effect.
Voltage stabilizer-reducers are commonly used in public lighting too, mostly with sodium vapour and mercury lamps, with savings better than 25%. They are used in both ways, as reflected above: as stabilizers, with improvements in lamp life, and as reducers, to achieve energy savings.
Fan Loads & Power Regulations For High Profits
In cities like Delhi, where I live, peak demand varies between 2,200 MW and 4,200 MW depending on the season. Even during the peak summer season, rains due to a local depression can bring down the cooling load and avoid the irrigation load – mainly fans, air conditioning and tube wells – so power demand can be managed well by reducing voltage in the distribution network.
The fan load, which is significant, responds very well, reducing power consumption when the supply voltage is lowered. Theoretically a 10% reduction in rpm results in roughly 30% lowering of load while air circulation is reduced by only 10%. Whenever there is a power cut and the fans are operating on the inverter, I turn the regulator one step lower; that is enough to bring the load on the inverter down by 25% while reducing air delivery by only about 10%, thus reducing the load on the inverter and extending its battery life.
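The rpm-versus-load claim above follows the standard fan affinity laws (airflow scales with speed, shaft power with the cube of speed). A quick sketch; nothing here is taken from measured fan data:

```python
# Fan affinity laws (illustrative): airflow ~ rpm, pressure ~ rpm^2,
# shaft power ~ rpm^3. A 10% speed cut therefore costs ~10% airflow
# but saves roughly a quarter of the power.
def fan_power_ratio(rpm_ratio):
    """Shaft power relative to nominal for a given speed ratio."""
    return rpm_ratio ** 3

ratio = fan_power_ratio(0.9)               # 10% speed reduction
print(f"power fraction: {ratio:.3f}")      # 0.729
print(f"power saving: {1 - ratio:.0%}")    # 27%, close to the ~30% quoted
```

The exact cube-law figure is 27%, which is in the same ballpark as the ~30% quoted above; real fans deviate somewhat because motor efficiency also changes with load.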
Utilities in Delhi, like those in the UK, tend to supply power in the 240-250 V range against the specified 230 V, to pump more units into consumer bills in affluent areas, where every additional unit billed is charged at the highest tariff rate, and at 200 V in low-income areas, thereby reducing the units billed where tariff rates for the saved units are substantially lower. (There are three tariff slabs in Delhi and most of India for domestic power.)
Utilities use voltage regulation effectively to bill more units at the highest tariff while simultaneously reducing sales in low-tariff zones.
Ravinder Singh, Inventor
If lighting levels can be reduced without affecting activities in the lit-up spaces, voltage reduction could serve as a temporary measure. The permanent measure ought to be to downsize the lights where possible.
Ah yes, another version of the perpetual motion machine. As described in the original response, energy reduction (and corresponding reduction in output) by voltage reduction is simple and easy, the big question is whether efficiency improvements can be achieved by voltage reduction.
The primary case described is that of the magnetic fluorescent ballast. Light output and energy use are both decreased by voltage reduction. Lower light output is often not noticeable but of course you don't need a fancy voltage reduction gadget to achieve this and at some point it most certainly is noticeable. What is more interesting is the claim that energy efficiency of the fluorescent magnetic ballast systems is improved by the voltage reduction. This claim is made on the basis of improved efficiency of the fluorescent tube operating at lower voltage and of reduced ballast losses at lower voltage.
Fluorescent tubes do not necessarily improve in efficiency when operated at lower voltages. They operate most efficiently at a particular power level, corresponding to a particular voltage and current, and at lower efficiencies at both higher and lower power levels. In the particular case study (funded, of course, by a magnetic ballast manufacturer lobby group), the common 58 W T8 tube was used and shown to operate slightly more efficiently below 58 W. Most tubes do not show this relationship; in fact many of them operate more efficiently at significantly increased power levels. When either lower light output is desired from an existing number of fittings, or when improved efficiency can be achieved by lowering power levels, ballasts with a ballast factor below 1.00 are available.
The second factor is the reduced (magnetic) ballast losses at lower input voltage. These are described here as proportional to the square of the current, which is a little simplistic; nevertheless, ballast losses do decrease when the ballast is operated at a lower power. A different way of achieving lower ballast losses would be to use a more robust ballast (thicker windings, etc.) to achieve lower losses at the same power. The case study I have seen (again funded by the copper sellers) plays a little fast and loose with such things as power factor correction and the measurement of the real and apparent ballast losses. Be very careful, since in most applications you will need power factor correction and will not achieve the savings shown in the study.
In summary, energy efficiency savings by this method can be made when applied to certain types of existing equipment, not to others. In all cases, the best energy efficiency will be achieved by appropriate choice of equipment to operate at your supplied voltage, not by adding additional equipment in front of sub-optimal appliances. Whether and when an optimal solution should be chosen will depend on retro-fit costs vs the payback period.
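The I²-proportional ballast copper loss discussed in the last two comments can be put in numbers. The winding resistance and lamp currents below are invented for the sketch, not taken from any cited study:

```python
# Magnetic ballast copper loss scales roughly as I^2 * R, per the
# approximation quoted above. All figures here are hypothetical.
def ballast_copper_loss(i_lamp, r_winding=12.0):
    """Copper loss in watts; r_winding is an assumed winding resistance."""
    return i_lamp ** 2 * r_winding

nominal = ballast_copper_loss(0.67)   # ~0.67 A is typical for a 58 W T8
reduced = ballast_copper_loss(0.60)   # assumed current after a modest voltage cut
print(f"loss at nominal current: {nominal:.1f} W")   # 5.4 W
print(f"loss at reduced current: {reduced:.1f} W")   # 4.3 W
print(f"relative saving: {1 - reduced / nominal:.0%}")   # ~20%
```

Note this only covers the copper loss; iron losses and the change in lamp power behave differently, which is exactly why the "square of the current" rule is called simplistic above.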
We are not promoting any perpetual motion machine. We do not promote any specific machine at all. Where did you read we do?
Indeed the big question is whether efficiency improvements can be achieved by voltage reduction. However, we did not want to ruminate over this question but rather provide you with an answer, and that can only be done by a qualified measurement. Believe us, we have been searching for such measurement results for years now, but either this has never really been measured, or the results were such that the organisations concerned preferred not to disclose them. So we had to have it done on our own account. Unfortunately this costs money. Raising questions, doubts and speculations does not. So the world is teeming with statements but lacking results and evidence. Finally we set out to put an end to this and spent the money on a measurement. Feel free to repeat our efforts, financed from another source than the copper industry! Feel free to provide us with measured results from other lamp types (instead of allegations) if you can afford more than we could! Afterwards let us continue discussing. But so far it is evident that, with the configurations we have had tested, the input energy saving was greater than the output energy loss. Nothing more did we allege. We do not doubt there are many other claims around, but you must have read those elsewhere.
By the way, in response to another request we recently received, we took on another expenditure and had the new 51 W T8 lamp by Philips tested, which is fully compatible with the present commonplace 58 W model. We will publish the results in the foreseeable future. So far we must say: a major loss of output for a minor saving on the input side when operated with a magnetic ballast, and roughly proportional input savings versus output losses with an electronic ballast. What we unfortunately were not able to include is the influence of temperature, so feel free to support us with your own results!
Of course, in principle it does not take a fancy voltage reduction gadget to achieve the achievable saving. Not in principle. But we concentrate our efforts on solutions the existing user is able to buy on the market and implement. Note that we also point out that e.g. using a 240 V magnetic ballast in a luminaire rated 230 V or 220 V also does the job (provided you use electronic starters).
Also note that it says there that the loss in a magnetic ballast is approximately proportional to the square of the current! So if you call our statement "a little simplistic", you are just quoting us. Of course it is! This is not science. The practitioner needs approximate, easily understandable data and implementable instructions. This is what we want to provide, not indigestible science. May I guess that your employer is a university?
Next, you say "A different way of achieving lower ballast losses would be to use a more robust ballast (thicker windings, etc.) to achieve lower losses at the same power." Again, you are only quoting what we say (add "more and/or better magnetic steel" to make the list of the most important items complete and "simplistic" enough for the practitioner). This is nothing else but what is actually done: shifting from ballast class D through C through B2, and hopefully one day up to B1 – but perhaps rather on a voluntary basis, for we see some drawbacks in a general prohibition of B2. So do the producers.
Of course the loss of light does most certainly become noticeable at some point. We do point this out, and more than that: even where it is not noticeable, we claim that the user must be notified of the loss, even if it shows only on a metering instrument! Anything else would be dishonest.
We are talking of energy savings here, i.e. active power. Please do not confuse this with reactive power! That is dealt with in great detail elsewhere (see http://lighting.copperwire.org/5.1.php). Not everything at once, please! The poor practitioner!
Of course the best energy efficiency will be achieved by appropriate choice of equipment to operate at the rated supplied voltage and not by adding additional equipment in front of sub-optimal appliances. Unfortunately not all appliances are optimized with respect to efficiency but rather to maximum output. In fact a fluorescent lamp with a magnetic ballast is at present the only device we can think of which overall works better at (slightly) reduced voltage. All others lose more at the end than is saved at the beginning, if any at all. Mind our publications on this.
Are people still playing with this silly business of autotransformers to "save energy"? If so, it's beyond ridiculous and they should be discredited immediately.
Fluorescent lighting cannot save energy by lowering the input voltage without a virtually equal reduction in the light level. If you want the figures, I'll send them to anybody who asks. I design, manufacture and test lighting electronics.
Autotransformers are peddled by con merchants who should stop preying on ill-informed members of the public. Any business considering buying one of these units should take advice from a knowledgeable expert, not a salesperson.
Sure, there are some still looking at this. I manage a warehouse where the vast majority of the equipment is still from the '60s, '70s and early '80s, much of which uses linear power supplies. Is it not true that a reduction in the AC voltage level will mainly mean a reduction in the filtered DC voltage into the regulator, and therefore fewer watts needing to be bled off in the form of heat to get the regulated voltage?
I measured my line voltage at 123 volts, which seems a bit high to me. Most of my heavy equipment specifies 110-120 V, so this is out of range, although the 123 V reading is with nothing running.
I've re-tapped the transformer to bring it down 2.5%, to 119.5 volts or so, and would like to believe the 20 or more T12 8-foot fluorescent light fixtures won't be affected much. Most of the ballasts, I believe, are 10+ years old and likely magnetic.
Does any of this make sense to anyone else? Am I fooling myself into thinking the total watts is also going to go down?
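The linear-supply reasoning in the question above can be sketched with a few invented figures: a linear regulator burns off the difference between its unregulated rail and its output as heat, so a slightly lower mains voltage means a slightly lower rail and less heat:

```python
# Linear regulator heat: the pass element drops the difference between
# the unregulated DC rail and the regulated output. All voltages and
# currents here are hypothetical, not measurements.
def regulator_dissipation(v_rail, v_out, i_load):
    """Watts burned off as heat in the pass element."""
    return (v_rail - v_out) * i_load

V_OUT, I_LOAD = 5.0, 1.0          # assumed 5 V output at 1 A
rail_high = 9.0                   # assumed rectified rail at 123 V mains
rail_low = rail_high * 0.975      # rail scales roughly with mains, -2.5%

print(regulator_dissipation(rail_high, V_OUT, I_LOAD))  # 4.0 W
print(regulator_dissipation(rail_low, V_OUT, I_LOAD))   # ~3.8 W
```

So yes, under these assumptions the regulator wastes a little less, though the load itself still draws the same regulated power; the saving is only in the headroom, and it disappears if the rail drops below the regulator's dropout voltage.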
OK - I would like to make the following comments
Resistive loads like cookers, ovens and water heaters: watts = V × I, and since I = V/R, the power falls with the square of the voltage, so voltage reduction just reduces the heat output.
Incandescent lamps - almost the same, except that the design voltage gives the design light output and the stated life span. In fact, taking the voltage down to the design level does not make a noticeable difference in light output with high-temperature halogen lamps.
Fluorescent lights - the overvoltage just causes heat in the inductor, as it does in motors and power supplies; it gives no gain in output performance (Faraday's law, not Ohm's law).
The incoming voltage varies during the day due to the resistive loads on the supply cables, and therefore the energy consumed will vary.
This works no matter how many phases there are. In fact, balancing the output voltage of each phase further increases the savings; in multi-phase equipment, imbalance of the phase voltages just causes vibration due to saturation variation in rotating or fixed cores.
Hope this helps
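The resistive-load point above (Ohm's law) can be put in numbers: with a roughly fixed element resistance, power falls with the square of the voltage. The element rating below is illustrative:

```python
# Purely resistive load (kettle, oven element): R is roughly fixed,
# so I = V/R and P = V^2/R. Illustrative 3 kW element at 230 V.
def resistive_power(v, r):
    """Power in watts drawn by a resistance r at voltage v."""
    return v ** 2 / r

R = 230 ** 2 / 3000          # element resistance sized for 3 kW at 230 V
print(f"{resistive_power(230, R):.0f} W at 230 V")   # 3000 W
print(f"{resistive_power(207, R):.0f} W at 207 V")   # 2430 W, a 19% drop
```

Note this shows power reduction, not efficiency improvement: for a heater essentially all the input becomes useful heat at either voltage, it just arrives more slowly.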
OK, expert, how do fluorescent lights work? Do you know anything about valence electrons, and the fact that the speed at which an electron exits its orbit is due to the excitation caused by the energy of the applied voltage? All this is determined by the laws of physics.
The electron produces photons in the UV range, which then produce visible light by striking the fluorescing powder.
The other factor is the inductor that is in line with the tube.
The back EMF that the inductor produces (its magnetic "resistance") is determined by the number of turns, the frequency and the saturation value of the inductor core, which is in turn determined by the B-H loop and the designed working voltage. All this is governed by Faraday's and Lenz's laws, as determined by Maxwell's equations, so a higher voltage does not mean a higher light output for fluorescent light sources.
And with incandescent lighting, yes, the higher the voltage, the greater the illumination, up to the point where the filament vaporises. So overvoltage just causes a 5% increase in visible light and a 40% increase in infrared (non-visible) light, which is heat! And finally it reduces the effective life by 40% or more.
Finally, before making assumptions regarding "autotransformers", understand the laws of magnetism and physics, as there are systems on the market that are proven to save energy and money by operating equipment at the right designed voltage.
(Not a salesman)
OK – you are right-ish!
Have you checked the voltage variations that occur during the day? Or night?
The inductor will have an optimal voltage to produce the back EMF (magnetic "resistance"); it is designed to work at the ideal point of the B-H loop. What does all this mean? A magnet can only do so much, i.e. a big magnet has more force than a small one, so the magnetic core is designed to work at its designed voltage. Put more in and you just get heat; put in too little and you get heat; put in the ideal and you get the best efficiency.
An easy way to check: reduce the voltage slowly while watching how much current is drawn at full load. While the current drops, great; as soon as it starts to rise, bad. Then put your set point just above the level where it started to go up.
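The set-point procedure described above can be sketched as a toy loop; the load model here is entirely invented, purely to illustrate the "stop when the current starts rising" rule:

```python
# Toy version of the set-point hunt: step the voltage down, watch the
# load current, and stop just above the point where it rises again.
def find_setpoint(voltages, current):
    """Return the voltage at which current(v) bottomed out."""
    best_v, best_i = voltages[0], current(voltages[0])
    for v in voltages[1:]:
        i = current(v)
        if i > best_i:        # current rising again: keep the previous step
            break
        best_v, best_i = v, i
    return best_v

# Hypothetical load whose current is lowest near 215 V:
model = lambda v: 1.0 + ((v - 215) / 40) ** 2
steps = list(range(250, 199, -5))     # 250 V down to 200 V in 5 V steps
print(find_setpoint(steps, model))    # 215
```

A real installation would of course use a meter and manual steps rather than a model, and would allow some margin above the minimum, as the comment above suggests.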
OK – To explain in simple terms.
Cookers, water heaters, etc. are resistive loads. THESE WORK ON OHM'S LAW.
Motors, transformers and ballasts are all inductive loads. THESE WORK ON FARADAY'S LAW.
Every volt over the minimum designed voltage just produces heat and has (or should have) no effect on the output performance. Therefore a 1 HP (746 W) motor will produce that output power at the minimum designed voltage; you may get 1.01 HP if you raise the voltage, but the core should be designed to operate at its optimum efficiency, not over saturation, as that is an impossibility physics will not allow.
Yes, there is a design tolerance, but if you are sold something with a stated output, it must deliver it at the minimum designed voltage.
CE marking has required that equipment operate over the "harmonised" voltage range, which is 207 V to 253 V. It therefore provides its most efficient operation at 207 V but must get rid of the additional heat at voltages up to 253 V – which you pay for, as most suppliers want you to have the maximum because it earns them more money.
Putting aside the many references to lighting above, I have to ask myself: does the reduction of voltage from the apparent 240-250 volts on our site down to the 220 V nominal UK voltage actually save energy? If I am to believe http://www.streamline-power.com/ and http://www.powerperfector.com/ then the answer is yes, with many case studies, at least for the latter.
Picking up lighting again briefly: the idea that the drop in voltage causes a reduction in light level disturbs me a little, as while the eye might not discern a small decrease, where the light levels are already low for a number of reasons any further reduction would be unwelcome.
In the end, given the pressure to show energy savings for environmental reasons and for budgetary reasons, we have little choice but to go ahead with a sample project on one of our sites. The many comments above do not make the task easy, as there are both strong emotive comments against and some quite technical proofs in favour. My own view, having read the comments above, is that in our case at least the installation of some form of device is justified.
If it is found to be advantageous to have a lower voltage coming into your home or business, couldn't this be done by the power company changing the tap on the step-down transformer outside the building?
I am considering reducing the voltage at a number of installations where the landlord's load is almost exclusively designplan DMQ282DLPB fittings.
These have 28 watt 2D lamps in them with electronic ballasts.
A typical site may have 100 of these installed, and the voltage is around 245 V.
Would using transformer equipment to reduce the voltage to 220 V actually produce any energy savings, and if so, by how much?
One source tells me no, another tells me yes.
Can anyone point me to any case studies?
Thanks in advance
Would I be correct in my assumption that certain electrical appliances would in fact cost more to run with a reduced voltage supply?
Example: an electric kettle. A quick calculation tells me that a 2.2 kW, 220 V rated kettle, when supplied with 200 V, would effectively be reduced to a 1.8 kW kettle. So it would take longer to boil, because the water still needs the same amount of energy (4.186 joules per gram per degree C). But as the kettle loses energy through convection (joules lost per second = temperature differential × surface area, etc.), the amount of energy lost would actually be greater due to the extra time taken to boil?
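The kettle arithmetic above can be checked with a short sketch. The standing heat-loss figure is an invented assumption; the element resistance follows from the 2.2 kW / 220 V rating:

```python
# Kettle at reduced voltage: element power drops as V^2, boiling takes
# longer, and a (hypothetical) constant standing loss acts for more
# seconds, so total energy drawn rises slightly.
R = 220 ** 2 / 2200          # 22-ohm element: 2.2 kW at 220 V
E_WATER = 1.0 * 4186 * 80    # joules to heat 1 kg of water by 80 C
LOSS = 30.0                  # assumed standing heat loss, watts

for v in (220, 200):
    p = v ** 2 / R                    # element power at this voltage
    t = E_WATER / (p - LOSS)          # seconds until the water boils
    print(f"{v} V: {p:.0f} W, {t:.0f} s, total {p * t / 1000:.1f} kJ")
```

With these assumed numbers the 200 V boil does draw slightly more total energy than the 220 V boil, supporting the question's intuition, though the difference is small compared with the extra waiting time.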
Actually, if we reduce the voltage for lighting, we can achieve power savings, because normally 210 V is enough for lights instead of 240 V and above.
I am considering using a device called Fluoresave that purportedly saves 30% power on magnetic-ballast fluorescent lighting circuits. Can't some of the "magic" come from reduced heat?
Please send me anything scientific you have that can dissuade me.