Who, when and why established the standard 50 (or 60) Hz?

Status
Not open for further replies.

MissMe

Electrical
Jun 27, 2003
6
Hello, all!
I was trying to find the answer to the question in the subject line on some official pages (CENELEC, IEC...) but I didn't get any response.
I hope some of you know the answer: who, when and why decided that the standard value of network frequency should be 50/60 Hz?
I'm more interested in 50 Hz, as I am in Europe.
Thanks in advance.
MissMe
 

I am sure there is a story behind this...

But all I want to say is: if you are an inventor or a trailblazer, you get to set your own standards and others follow.

I would not be surprised if this is traced back to the fact that flicker in incandescent lamps becomes unnoticeable (very comfortably) at that frequency, without going to a much higher frequency.

 
I think rbulsara is right about the flicker issue. . .

It was either IEEE IAS or Spectrum that had an interesting article on this about a year ago. I recall the article mentioning that many different frequencies were tried back in the day -- there was a 15 Hz hydro plant mentioned (possibly still in operation?).
 
Isn't there a relation between frequency, line losses, and how big the equipment becomes?

For example, at 50 Hz are the line losses greater, but the equipment (transformers) smaller (for the same kVA as at 60 Hz)? Or is it the other way around?
 
Just found this at
"The system of three-phase alternating current electrical generation and distribution was invented by a nineteenth century creative genius named Nikola Tesla. He made many careful calculations and measurements and found out that 60 Hz (Hertz, cycles per second) was the best frequency for alternating current (AC) power generating. He preferred 240 volts, which put him at odds with Thomas Edison, whose direct current (DC) systems were 110 volts. Perhaps Edison had a useful point in the safety factor of the lower voltage, but DC couldn't provide the power to a distance that AC could.

When the German company AEG built the first European generating facility, its engineers decided to fix the frequency at 50 Hz, because the number 60 didn't fit the metric standard unit sequence (1, 2, 5). At that time, AEG had a virtual monopoly and their standard spread to the rest of the continent. In Britain, differing frequencies proliferated, and only after World War II was the 50-cycle standard established. A big mistake, however.

Not only is 50 Hz 20% less effective in generation, it is 10-15% less efficient in transmission, it requires up to 30% larger windings and magnetic core materials in transformer construction. Electric motors are much less efficient at the lower frequency, and must also be made more robust to handle the electrical losses and the extra heat generated. Today, only a handful of countries (Peru, Ecuador, Guyana, the Philippines and South Korea) follow Tesla’s advice and use the 60 Hz frequency together with a voltage of 220-240 V."

Great posting by the way!
 
Can anyone confirm this statement: "Not only is 50 Hz 20% less effective in generation, it is 10-15% less efficient in transmission, it requires up to 30% larger windings and magnetic core materials in transformer construction"

Those numbers seem too big. . . . And the Europeans CAN'T be blindly ignoring all this if it's true (can they?). . . .
 
There have been numerous previous threads on this subject - try doing a search in this forum.

 
That's exactly what I thought too, PB. Didn't make sense. I'm sure it's some conspiracy theorist at work.
 
Try these....
thread238-89493
thread238-66746
thread238-59627
Thread237-31684
Thread238-38672
Thread237-43167
 
I am with peebee. Not only that, I have worked in both systems, and all those efficiency and size claims in controlnovoce's post appear baseless based on my experience.

Most losses in an electrical system are I^2R losses; frequency has nothing to do with those. Impedance in transmission lines consists largely of inductance, affecting voltage drops etc., and a given inductor will have greater impedance at higher frequencies (capacitive impedance is inversely proportional to frequency, but it plays a lesser role than inductance), so those claims just appear counterintuitive.

I don't think magnetics are affected that much by a 20% difference in frequency. Speaking of generation, 50 Hz generating machines run at a lower speed than those at 60 Hz; if anything, that has to translate into lower mechanical and heat losses.

I have no backup or research data, but would love to see some if someone has any.
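rbulsara's loss argument can be sketched numerically. A minimal Python sketch, using made-up line parameters (R, L and I below are illustrative assumptions, not data for any real line): the I^2R loss does not depend on frequency at all, while the series inductive reactance X_L = 2*pi*f*L is about 20% higher at 60 Hz than at 50 Hz.

```python
import math

# Hypothetical transmission line -- all values are illustrative assumptions.
R = 3.0     # total series resistance, ohms
L = 0.13    # total series inductance, henries
I = 400.0   # line current, amperes (held constant for the comparison)

for f in (50.0, 60.0):
    x_l = 2 * math.pi * f * L   # inductive reactance: proportional to f
    p_loss = I**2 * R           # resistive (I^2R) loss: independent of f
    print(f"{f:.0f} Hz: X_L = {x_l:.1f} ohm, I^2R loss = {p_loss / 1e3:.0f} kW")
```

With these assumed numbers the resistive loss is 480 kW in both cases, while X_L rises from about 40.8 ohm to 49.0 ohm -- the I^2R term simply never sees the frequency.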
 
I do not know if those numbers are consistent, but a higher-frequency transformer requires less iron in the core than a lower-frequency one. I am thinking of the 400 Hz used in aircraft; this is done to reduce weight. Whether or not the efficiency is affected is another question. It may be that when a transformer designed for a lower frequency is compared with one capable of operation at higher frequencies, but built on the same core material and size (and number of turns, or even wire size), the lower-frequency unit would appear less efficient. But then who would design both transformers with the same core and material, knowing this?
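The core-size point follows from the standard transformer EMF equation, E = 4.44 * f * N * B_max * A: for a fixed voltage, turn count and peak flux density, the required core cross-section scales as 1/f. A quick sketch (the voltage, turns and flux-density figures below are illustrative assumptions, not any particular design):

```python
# Core cross-section from the transformer EMF equation:
#   E = 4.44 * f * N * B_max * A   =>   A = E / (4.44 * f * N * B_max)
# All parameter values are illustrative assumptions.
E = 230.0     # winding RMS voltage, volts
N = 200       # number of turns
B_MAX = 1.5   # peak core flux density, tesla

for f in (50.0, 60.0, 400.0):
    area_cm2 = E / (4.44 * f * N * B_MAX) * 1e4   # m^2 -> cm^2
    print(f"{f:5.0f} Hz: core area ~ {area_cm2:5.2f} cm^2")
```

With these assumed figures the 50 Hz core comes out 20% larger in cross-section than the 60 Hz one (the 1/f ratio, 60/50), and the 400 Hz core roughly eight times smaller -- which is exactly why aircraft systems use 400 Hz.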
 
It is probably way too late to change all of Europe, most of Africa, Australia, India and other places running at 50Hz. And isn't China also 50Hz? It would be a huge investment in power plant and machinery. It may be inefficient, but then so are lots of things if taken in isolation.

Maybe the Yanks should change to 50 Hz instead, to rationalise the sizes of motors, generators etc. in the marketplace, and as their market is probably the smaller of the two, they should change ;-)



Bung
Life is non-linear...
 
