
Heat load reduction?


John_187 (Mechanical), Apr 21, 2018
Hello, there is an electrical/server room. The reported heat rejection load is 45 kW, but the room temperature is allowed to reach 85 F.

I figure most reported heat loads assume the heat is rejected into a room at a typical 72 F. If the heat is rejected into a hotter room, the temperature difference and driving potential are smaller, so in theory the heat rejection would be less.

Is there a rule-of-thumb equation for heat load reduction in a scenario like this? What is a typical surface temperature for electrical panels and IT servers? Thanks
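For what it's worth, a minimal sketch of the scaling the question has in mind, treating the rejected heat as simple convection proportional to the surface-to-room temperature difference. The 160 F surface temperature and the 72 F baseline are illustrative assumptions, not data from the thread, and the replies below explain why this scaling is misleading for electronics:

[code]
# Rough first-order scaling of convective heat rejection with room temperature.
# Assumes Q ~ h*A*(T_surface - T_room) with h*A held constant.
# All values below are illustrative assumptions, not data from this thread.

T_SURFACE_F = 160.0   # assumed average surface temperature of panels/servers
T_BASELINE_F = 72.0   # room temperature the 45 kW figure presumably assumes
T_ACTUAL_F = 85.0     # allowed room temperature
Q_REPORTED_KW = 45.0  # reported heat rejection load

scaling = (T_SURFACE_F - T_ACTUAL_F) / (T_SURFACE_F - T_BASELINE_F)
print(f"Scaling factor: {scaling:.2f}")
print(f"Scaled heat rejection: {Q_REPORTED_KW * scaling:.1f} kW")

# As the replies below point out, the dissipated power is actually fixed by the
# electrical load, so the surfaces simply run hotter until the same 45 kW is
# rejected at the higher room temperature.
[/code]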
 

Questions you should be asking
[ul]
[li]Define room temperature: equipment intake, exhaust or a mixed condition?[/li]
[li]Does your equipment have temperature ratings which define upper temperature limits?[/li]
[li]Does the IT/electrical equipment have fans? This defines a deltaT for the space (typically ~ 20 degF); see the airflow sketch after this list[/li]
[li]Is the equipment arranged in hot/cold aisles?[/li]
[li]What is the configuration of your cooling equipment?[/li]
[li]Do you need redundant cooling equipment?[/li]
[/ul]
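A minimal sketch of how the fan deltaT ties the heat load to airflow, using the standard-air sensible heat relation Q [BTU/hr] ~ 1.08 x CFM x deltaT. The 45 kW load is from the question; the 20 degF deltaT is the typical value mentioned above, not equipment data:

[code]
# Airflow needed to carry a sensible heat load at a given air-side delta-T,
# using the standard-air relation Q [BTU/hr] = 1.08 * CFM * deltaT [deg F].

KW_TO_BTUH = 3412.14

def required_cfm(load_kw: float, delta_t_f: float) -> float:
    """CFM of standard air needed to absorb load_kw at delta_t_f."""
    return load_kw * KW_TO_BTUH / (1.08 * delta_t_f)

if __name__ == "__main__":
    # ~7,100 CFM for the 45 kW load at an assumed 20 degF rise
    print(f"{required_cfm(45.0, 20.0):,.0f} CFM")
[/code]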



 
To first order, the heat load does not change; the temperature at the components rises.

You are limited by the component ratings for surface temperature on the parts, typically 70C or possibly 85C.

However, this is not free. Component reliability can decrease by about a factor of 2 compared to the same room kept at 70F.
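A quick sketch of where a factor like that comes from, assuming the common Arrhenius-style rule of thumb that expected life roughly halves for every ~10 degC of temperature rise. The doubling interval and reference temperature are assumptions, not component data from this thread:

[code]
# "Life halves per ~10 degC rise" rule of thumb (Arrhenius-style approximation).
# Actual derating depends on the specific components and their datasheets.

def relative_life(t_room_f: float, t_ref_f: float = 70.0,
                  doubling_degc: float = 10.0) -> float:
    """Expected life relative to operation at t_ref_f, halving every doubling_degc of rise."""
    rise_degc = (t_room_f - t_ref_f) * 5.0 / 9.0
    return 0.5 ** (rise_degc / doubling_degc)

print(f"85 F room vs 70 F: {relative_life(85.0):.2f}x expected life")  # ~0.56x
[/code]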

TTFN (ta ta for now)
 
A lot of good answers above, but to sum up and add a little more:

Most server-type equipment has a reduced operating life when continuously exposed to environments above 85 deg F. Manufacturer information will override this, but if they don't give that data, this is a typical value.

The heat rejection from the servers will remain constant as long as you keep the room temperature low enough to effectively exchange the energy with the internal components. That temperature likely aligns with the maximum operating temperature/environment limits the manufacturer should give you. If you maintain a lower value, the equipment will simply cool itself more effectively, dissipate the heat more quickly, and/or operate at lower internal temperatures, depending on how the server's internal systems work.

The other important aspect of your room temperature setpoint is the associated higher or lower operating relative humidity in the space (compared to your system's delivered air condition, the adjacent building spaces, etc.). Be sure to check whether your server manufacturer publishes a range for that as well. Higher RH can lead to corrosion; lower RH can lead to electrostatic charge.
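A small sketch of why the warmer space runs at lower RH for the same moisture content: RH = Pv / Psat(T), and saturation pressure grows quickly with temperature. The 55 F / 50% RH supply condition is an illustrative assumption, not from the thread; the Magnus approximation is used for Psat:

[code]
import math

def psat_kpa(t_c: float) -> float:
    """Saturation vapor pressure over water (kPa), Magnus approximation."""
    return 0.61094 * math.exp(17.625 * t_c / (t_c + 243.04))

def rh_after_sensible_heating(rh_in: float, t_in_f: float, t_out_f: float) -> float:
    """RH after heating air sensibly (constant moisture) from t_in_f to t_out_f."""
    t_in_c = (t_in_f - 32.0) * 5.0 / 9.0
    t_out_c = (t_out_f - 32.0) * 5.0 / 9.0
    return rh_in * psat_kpa(t_in_c) / psat_kpa(t_out_c)

# Assumed 55 F / 50% RH supply air warming to an 85 F room: ~18% RH in the space
print(f"{rh_after_sensible_heating(0.50, 55.0, 85.0):.0%}")
[/code]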
 
There used to be a time when all computer rooms were blisteringly cold, to keep the computers more reliable and faster. Note that today's computers are all CMOS, which runs slower when hot. This means that in addition to any gross reliability loss, the chips run slightly slower; that might be fine 99.999% of the time, but some calculations could take excessive time and produce the wrong answer because the propagation delays exceed the clock periods.

Nowadays, server rooms are often quite toasty, since the hardware will likely go obsolete before it fails. However, the dictum about processing speed still holds: the colder the room, the more reliable the calculations.
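Purely to illustrate the timing argument above, a sketch of how a timing margin can erode as propagation delay grows with temperature. The 0.1%/degC delay coefficient and the path/clock numbers are hypothetical, not from the thread or any datasheet:

[code]
# Hypothetical example: slack = clock period minus a path delay that grows
# linearly with temperature. All numbers here are made up for illustration.

def timing_slack_ns(clock_period_ns: float, path_delay_ns_25c: float,
                    temp_c: float, delay_coeff_per_c: float = 0.001) -> float:
    """Remaining slack after scaling the path delay with temperature."""
    delay = path_delay_ns_25c * (1.0 + delay_coeff_per_c * (temp_c - 25.0))
    return clock_period_ns - delay

for t in (25, 55, 85):
    print(f"{t} C: slack = {timing_slack_ns(2.0, 1.95, t):.3f} ns")

# Slack goes negative above roughly 50 C in this made-up example, which is the
# "propagation delays exceed the clock periods" failure mode described above.
[/code]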

TTFN (ta ta for now)
 