Eng-Tips is the largest engineering community on the Internet

Intelligent Work Forums for Engineering Professionals


Why 400 Hz in mainframes?

Status
Not open for further replies.

peebee (Electrical)
Jun 10, 2002
Does anyone know why IBM used to require 400 Hz power to their old mainframes?
 

Now I am really getting interested. I cannot tolerate an unsolved issue, even if it is only trivia.
I worked on the construction of an IBM data center in California for So. Cal Edison.
They moved their computer mainframes from the San Onofre nuke plant. I understand a radiation leak will erase the discs.
I asked them if they had moved the humans for the same reason, since I understood that a radiation leak would erase them also. My humor was not appreciated.

3600 RPM is the max on 60 Hz. There has to be a logical reason for 400 Hz. I will research this with some friends from Sandia Labs.
 
Hi,

400 Hz makes it easier to create the -0.85 V (logical 1) and the -1.85 V (logical 0) supplies.
These power supplies were capable of supplying 100-200 A and stayed stable to within a few mV.
Try that with 60 Hz....

The disk motors ran on either single-phase 50/60 Hz or three-phase 60 Hz and, if I remember correctly, at 3000 RPM (but I am not sure anymore).
It depended on the type of disk used (e.g. a 3330 had a 1 HP three-phase 380 V/50 Hz motor... here in Europe).
The 400 Hz was not used for the disk motors... only for the power supplies.

Best regards, Jan

 
JanH, can you provide further explanation for your statement that "400Hz is easier to create the - 0.85V (Logical 1) and the -1.85V (logical 0) from." We are talking about DC here, aren't we? A 60Hz to DC conversion is no more difficult to achieve than a 400Hz to DC conversion, except the capacitors would need to be larger to maintain the same ripple voltage. It still seems like it would be much smaller, lighter, and cheaper to go from 60Hz directly to DC and provide larger caps than to go through two conversions.

If you register your name with eng-tips and give me just a little more convincing detail I'll vote you for a star. . . .
 
peebee,

The idea is that at 60 Hz you have 16.7 ms between voltage peaks, and at 400 Hz only 2.5 ms.

A DC voltage under load will droop while the AC voltage is going through zero (changing polarity).
This is simply because the AC voltage is then lower than the DC voltage you want.
The longer this takes, the more the DC voltage will drop.
Where I live we call this the "ripple" of the DC voltage.

With 400 Hz this time is about 6.7x shorter than with 60 Hz.
That means this ripple is much smaller.

The power supplies we are talking about can handle 100-200 A.
That is such a heavy load that with 60 Hz you will have real problems keeping the ripple small enough.
If I remember correctly, these power supplies were allowed a max ripple of 3 mV... under full load...


Best regards, Jan
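A quick back-of-the-envelope sketch of the scaling Jan describes, assuming brute-force full-wave rectification with only a filter capacitor (the load current and 3 mV ripple figure are taken from the post; everything else is an assumption for illustration):

```python
# Brute-force full-wave rectifier: between charging peaks (spaced 1/(2f) apart)
# the load discharges the filter capacitor, so ripple dV ~= I_load / (2 * f * C).
# Solving for C gives the capacitance needed to hold a given ripple.
def cap_for_ripple(i_load_a, f_hz, ripple_v):
    """Capacitance (farads) needed to keep ripple at or below ripple_v."""
    return i_load_a / (2.0 * f_hz * ripple_v)

I_LOAD = 100.0   # amps, low end of the range discussed above
RIPPLE = 0.003   # 3 mV max ripple, per the post

c60 = cap_for_ripple(I_LOAD, 60.0, RIPPLE)
c400 = cap_for_ripple(I_LOAD, 400.0, RIPPLE)
print(f"60 Hz:  {c60:.0f} F")   # ~278 F of capacitance
print(f"400 Hz: {c400:.1f} F")  # ~41.7 F, i.e. 6.7x less
```

Both numbers are absurdly large, which is why real supplies added regulation and filtering on top, but the 6.7x scaling with frequency holds regardless.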



 
That's assuming you're using brute-force full-wave rectification without any output filtering.

The only specification I've ever seen for 400 Hz power is MIL-STD-704, which is nearly unregulated to start with, so any benefit that comes from the higher frequency is completely overwhelmed by variations in line voltage and frequency.

I was going to go on about circuitry and decided to search for the history of ECL and ran across the following link about a computer built at Rice. Of particular note is the mention of 400 Hz power provided by a Navy SURPLUS generator:


So the answer may simply be that there was an abundance of surplus high power generators available after the war ;-)

TTFN
 
JanH & IR, stars to each of you.

Cbarn, if I had the background to do a cost estimate to build 1970's vintage power supplies, do you think I would have posted this question in the first place? I've got a high respect for you & your eng-tips posts, but geez, you seem to have a wild hair going on this question. . . .

Anyway, I still tend to believe that the answer falls somewhere between the marketing thing and IR's surplus power supply suggestion.
 
In data centers, they likely copied airplanes.

Why they picked 400 or 415 Hz in planes, though, seems to be a good question. You'd think they'd want a nice multiple of 60 Hz since they were using M-G sets, but you couldn't go too high and still fit the windings in easily. 7 x 60 = 420. It's not clear to me, though, how 420 drops to 415 or 400 -- 5% slip? I guess frequency drift might not have been a big problem, though, so synchronous speed might not have been a concern.

Just a guess.
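The slip arithmetic in that guess checks out. A minimal sketch, taking the 7:1 multiple of 60 Hz from the post and asking what slip would pull a 420 Hz M-G output down to 400 Hz:

```python
# M-G set speculation: a 7x multiple of 60 Hz gives 420 Hz synchronous output;
# induction-motor slip would pull the actual output frequency below that.
def slip_fraction(f_sync_hz, f_actual_hz):
    """Fractional slip needed for the output to land at f_actual_hz."""
    return (f_sync_hz - f_actual_hz) / f_sync_hz

s = slip_fraction(7 * 60, 400)
print(f"slip = {s:.1%}")  # ~4.8%, close to the 5% guessed above
```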
 
Hi,

I see that there is an idea that 400 Hz power supplies were used because there was a surplus of them.
Well, IBM mainframes were using brand-new, specially designed power supplies, so no surplus whatsoever.

400 Hz in planes is for the same reason as in mainframes:
smaller transformers, smaller power supplies, smaller electric motors... and, very important for planes... therefore less weight.

Best regards, Jan
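The size argument falls straight out of the transformer EMF equation, V_rms = 4.44 f N A B_max: at the same voltage, turns, and peak flux density, the required core cross-section scales as 1/f. A sketch with assumed, purely illustrative numbers (none are from the thread):

```python
# Transformer EMF equation: V_rms = 4.44 * f * N * A_core * B_max.
# For fixed voltage, turn count, and peak flux density, the core
# cross-section (and hence iron weight) scales as 1/f.
def core_area_m2(v_rms, f_hz, turns, b_max_t):
    """Required core cross-sectional area in square meters."""
    return v_rms / (4.44 * f_hz * turns * b_max_t)

# Assumed values for illustration only:
V, N, B = 115.0, 100, 1.2
a60 = core_area_m2(V, 60.0, N, B)
a400 = core_area_m2(V, 400.0, N, B)
print(f"core area ratio 60 Hz / 400 Hz = {a60 / a400:.2f}")  # 6.67
```

So a 400 Hz transformer needs roughly 1/6.7 the core area of a 60 Hz one for the same job, which is the weight saving Jan points to.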
 
If they were starting from scratch, they could have used any frequency they wanted. Why not 1000 Hz? The transformers would have been even smaller.

The reason 400 Hz was used by a variety of companies, not just IBM, was the simple fact that the military had already developed the hardware, the design know-how and the engineering talent.

Don't forget that the electrical engineering talent that IBM and other companies used after the war were mostly busy developing military hardware during WWII.

So, in essence, while IBM hardware wasn't surplus, most of the engineers were. Additionally, remember that the first computers were almost exclusively used by the military, since they were the only ones that could afford it.

TTFN
 
400 Hz has proved to be the best frequency... actually 415 Hz, if my memory serves me right.
1000 Hz is too high. Transformer coils effectively act as heavy-duty chokes, so to say... high impedance at high frequency, and core and winding losses climb with frequency.

I completely agree with you that other companies also made their own supplies.
I mentioned IBM because I knew they had/have their own designs.

One good thing about a war is that technology takes a leap forward.
It could very well be that the 400 Hz supplies were developed during the war.

Best regards, Jan
 
My quick question is: a lot of computer power supplies run at 120 VAC/60 Hz. Using a Chroma programmable AC power source, we were able to run the power supplies at 400 Hz and up to 1000 Hz. Our intention is to place these into military airplanes. My question is, what are the effects on these power supplies due to the higher frequency? Are there any formulas that rate efficiency of the load based on input frequency? Any help would be greatly appreciated.


joshua.noonan@tag.com

Thanks!
 