Why is the rating of transformers given in kVA and not in kW?

kVA is the unit for apparent power.

Apparent power consists of active and reactive power. Active power is the share of the apparent power that transmits energy from the source (generator) to the load. Reactive power is the share of the apparent power that represents a useless oscillation of energy from the source to the load and back again. It occurs when, on account of some »inertia« in the system, there is a phase shift between voltage and current, meaning that the current does not change polarity synchronously with the voltage.

But the heat generated in a winding, as well as the eddy current losses generated in a transformer core, depend on the current alone, regardless of whether it is aligned with the voltage or not. The heat is therefore always proportional to the square of the current amplitude, irrespective of the phase angle (the shift between voltage and current). So a transformer has to be rated (and selected) by apparent power.

It is often helpful to think of an extreme example: imagine a use case where the one and only load is a static var compensator (and such cases do exist). Would the load then be zero because the active power is zero? Most certainly not. Caution: in this situation the voltage across the output terminals will rise with load rather than drop!
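A small numerical sketch makes the point concrete (plain Python with made-up example values for voltage and winding resistance; single-phase assumed): two loads drawing the same active power, but at different power factors, draw different currents, and the winding heat follows the current squared.

```python
V = 400.0          # example supply voltage in volts (assumed)
R_WINDING = 0.05   # example winding resistance in ohms (assumed)

def current_and_heat(p_active_w, power_factor):
    """Current drawn for a given active power, and the resulting copper loss."""
    s_apparent_va = p_active_w / power_factor  # S = P / cos(phi)
    current_a = s_apparent_va / V              # I = S / V
    heat_w = current_a ** 2 * R_WINDING        # heat ~ I^2 * R, phase-independent
    return current_a, heat_w

for pf in (1.0, 0.7, 0.5):
    i, q = current_and_heat(10_000.0, pf)      # the same 10 kW of active power
    print(f"pf = {pf:.1f}: I = {i:5.1f} A, copper loss = {q:6.1f} W")
```

At 0.5 power factor the current doubles and the copper loss quadruples compared with unity power factor, even though the delivered active power is unchanged. This is why the current, and hence the kVA figure, is the limiting quantity.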

► Access to further resources on transformers


Supplement:

Special care has to be taken if the load current of a transformer contains any higher-frequency components such as harmonics. The transformer may then overheat even though the TRMS load current, measured correctly with a true-RMS meter, does not exceed the current rating!

Why is this? It is because the copper loss includes a share of roughly 5% to 10% of so-called supplementary losses. These arise from eddy currents in mechanical, electrically conductive parts made of ferromagnetic materials, and especially in the low voltage windings with their large conductor cross sections. The magnetic stray fields originating from the imperfect magnetic coupling between the HV and LV windings (the main stray channel) induce something that could be called an “eddy voltage” inside the conductors, which drives an eddy current circulating within the conductor, perpendicular to the main load current. Now the amplitude of this “eddy voltage” is proportional to the rate of change of the magnetic field strength, and that rate of change is proportional to both the amplitude and the frequency of the load current. Since the eddy current is limited only by the conductor’s resistance (Ohm’s law), it increases proportionally to the load current and proportionally to the operating frequency. The supplementary power loss caused by the eddy current is the eddy current times the “eddy voltage”.
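The scaling that follows from this argument can be written down in two lines (a per-unit sketch of the proportionalities, not a transformer model):

```python
def supplementary_loss(i_pu, f_pu):
    """Per-unit supplementary (winding eddy) loss.

    The "eddy voltage" scales with i_pu * f_pu, and by Ohm's law the eddy
    current scales with that voltage, so the loss (voltage times current)
    scales with (i_pu * f_pu) squared.
    """
    return (i_pu * f_pu) ** 2

# Rated current at rated frequency vs. the same amplitude at the 5th harmonic:
print(supplementary_loss(1.0, 1.0))  # 1.0  (reference)
print(supplementary_loss(1.0, 5.0))  # 25.0 (25x the supplementary loss)
```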

Hence, the supplementary losses increase with the square of the load current, which excites the magnetic stray field, and with the square of the frequency, while the “main copper loss” increases only with the square of the load current amplitude. Therefore the transformer runs hotter when the load current has the same amplitude but is superimposed with higher-frequency constituents above the rated frequency. This additional heat loss is difficult to quantify, especially as the transformer’s stray reactance limits the passage of higher-frequency currents to some extent, but in an extreme case it may drive the supplementary loss up from 10% to 80% of the copper loss. This means that the transformer’s temperature rise above ambient may be some 70% greater than specified for rated (sinusoidal) current. Since the ohmic heat loss depends on the square of the current, however, it is enough to limit the load current to some 65% of its rating to avoid overheating.
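The same per-unit bookkeeping lets the figures be checked (a back-of-envelope sketch; 0.10 and 0.80 are the supplementary-loss shares quoted above):

```python
def total_copper_loss(i_pu, supp_share):
    """Total copper loss in per unit: main I^2*R loss plus supplementary share."""
    return i_pu ** 2 * (1.0 + supp_share)

rated   = total_copper_loss(1.0, 0.10)  # 1.10 pu: sinusoidal rated current
extreme = total_copper_loss(1.0, 0.80)  # 1.80 pu: heavily distorted current
print(extreme / rated)                  # ~1.64, i.e. some 60% to 70% hotter

# Current limit that brings the distorted-load loss back to the rated loss:
i_max = (rated / extreme) ** 0.5
print(i_max)                            # ~0.78 pu; the rounder 65% figure above
                                        # leaves an additional safety margin
```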

Comments


This article is very detailed; however, it could be "dumbed down" a little. This would make great sense to an expert in this field, but then the article isn't written for experts, since experts don't need to know this. It would be nice to see an article that explains these principles so that even a common electrician can understand them.

By Anonymous (not verified) 12/07/2011

Simple answer: the reason is the power of the load and its power factor. It is not all about the copper and core losses, because we know that the transformer is one of the most efficient electrical devices we have.

A 1000 kVA, 100 kV transformer is to be loaded with:

Condition 1: a load of 800 kVA at 0.5 pf lagging.
Result: the transformer is capable of handling the load
(not overloaded).

Condition 2: a load of 800 kW at 0.9 pf lagging.
Result: the transformer still works fine,
since S = 800 kW / 0.9 pf = 888.89 kVA.

Condition 3: a load of 800 kW at 0.6 pf lagging.
Result: the transformer is now overloaded (bad scenario),
since S = 800 kW / 0.6 pf = 1333.33 kVA.

Worst condition: if the transformer were rated in kilowatts, say 1000 kW,
and it were loaded with only 500 kW at 0.3 pf lagging:
walahhh!!! S = 500 kW / 0.3 pf = 1666.67 kVA; it is really overloaded.

Note: I = S/V, so S is proportional to I, and the current exceeds the maximum current rating of the transformer windings in condition 3 and in the worst condition. There you have it!!!
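The arithmetic in this comment can be reproduced in a few lines (a sketch; only S = P/pf and the comparison against the nameplate rating are involved):

```python
RATING_KVA = 1000.0  # the nameplate rating from the comment above

def check(p_kw, pf):
    """Apparent power S = P / pf, compared against the kVA rating."""
    s_kva = p_kw / pf
    status = "overloaded" if s_kva > RATING_KVA else "OK"
    print(f"P = {p_kw:6.1f} kW, pf = {pf:.1f} -> S = {s_kva:7.2f} kVA: {status}")

check(800 * 0.5, 0.5)  # condition 1: 800 kVA at pf 0.5, i.e. P = 400 kW
check(800.0, 0.9)      # condition 2
check(800.0, 0.6)      # condition 3
check(500.0, 0.3)      # "worst condition"
```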

By Robert (not verified) 04/12/2011

The only answer to this question is that the power factor of the load is not known in advance, whether leading or lagging, so transformers are rated in kVA.

By eng_2013 (not verified) 01/02/2012

What about the losses in a transformer?

By Parthiban (not verified) 21/02/2012
