
Minisplit AC in Small Server Room


AxisCat (Mechanical)
Hi All,

Let me start by making you aware that I am an HVAC contractor, but I have lots of years in the field. I am selecting a system for a small server room: a couple of normal-size racks and all the stuff you would normally see. I am awaiting a full audit of the equipment from their IT person so I can run the cooling load.

In the meantime I am playing around with a nominal 3-ton unit. If this were a really serious room I would be using a unit specifically designed for the task, but around here these small rooms all seem to get mini-splits, which is what I am looking at using.
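
Once the audit comes back, the load check itself is straightforward. Here is a rough sketch of how I plan to run the numbers; the wattages are placeholders until the IT person gets back to me, and the lighting, people, and envelope allowances are my own assumptions:

# Rough server room sensible load check (placeholder numbers, not the audit)
it_watts = 6000.0         # assumed total measured/nameplate IT load, W
lights_watts = 300.0      # assumed lighting load, W
people_btuh = 250.0       # one occasional occupant, sensible Btu/h (assumed)
envelope_btuh = 2000.0    # assumed wall/roof/door gain allowance, Btu/h

total_btuh = (it_watts + lights_watts) * 3.412 + people_btuh + envelope_btuh
print(f"Estimated load: {total_btuh:,.0f} Btu/h = {total_btuh / 12000.0:.1f} tons")
# With these placeholders: ~23,700 Btu/h, about 2 tons, so a nominal 3-ton
# unit leaves some margin (or room for a lead/standby pair at part load).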

I plugged in 68°F DB and 57°F WB (50% RH) as my entering air conditions in my design software. IT guys like to keep things cold. To my surprise, the software complained about low inlet temperatures on the air handler. After playing around a bit I discovered that 59°F WB is the minimum regardless of the DB. I can only assume that when running at low DB and WB we could approach freezing conditions on the evaporator coil.

But like I mentioned, there are tons of minis out there doing these small computer rooms and they don't seem to have any issues. But as we all know, 99% of the cooling load is going to be sensible, so how can these units not get into trouble by drying the air out below 50% RH?
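
For what it's worth, here is a quick sanity check on that entering-air condition using Stull's (2011) wet-bulb approximation, rather than the full psychrometric routines the design software presumably uses, so treat it as approximate:

import math

def wet_bulb_c(t_c, rh_pct):
    # Stull (2011) wet-bulb approximation; T in deg C, RH in percent.
    # Reasonable for roughly 5-99% RH and -20 to 50 deg C near sea level.
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

f_to_c = lambda f: (f - 32.0) / 1.8
c_to_f = lambda c: c * 1.8 + 32.0

wb_f = c_to_f(wet_bulb_c(f_to_c(68.0), 50.0))    # 68 F DB at 50% RH
print(f"Approx. WB: {wb_f:.1f} F")               # about 56.7 F
print("Below the 59 F WB floor?", wb_f < 59.0)   # True

So the software's complaint makes sense: a mostly sensible load at 68°F DB / 50% RH already puts the entering air below the unit's published 59°F WB minimum, and any further drying only pushes it lower.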

What do you all think?
 

Does having dry air not cause problems with the IT equipment?

What about using a refrigeration system where you could control the suction pressure, and use a larger coil? Or maybe a water chiller and a fan coil unit?

I have seen some mini-splits work fine and others ice up in server rooms.

 
Thanks for the reply. What concerns me is the minimum 59°F WB temperature on this particular mini-split, and it is a major-brand, commercial-duty unit. I can see cases where the entering air will fall below that limit. I am going to reach out to my vendor and see if they can help me out. I just hope I don't get some answer like "our mini-split systems are not approved for computer room applications"... but I bet I do.

Around here in Kentucky, mini-splits are all I see on these small rooms, and I have not run into any issues with them freezing up. I just don't want to be the first.
 
You should talk to the manufacturer about the limitations. You should also use a commercial-grade unit, given the much longer run hours compared to home or office cooling use.

Review the equipment requirements. Most IT equipment is very tolerant of air conditions, with a very wide range of acceptable RH and temperature. It isn't the old punched-paper-tape days when humidity had to be controlled to keep the paper from sticking. Electronics don't care very much about the conditions as long as you are within manufacturer spec. My car is full of electronics and just works in winter and in humid summer. Many IT server rooms just get cooled with filtered ambient air. Even if you leave a server at 100°F, its life doesn't get diminished enough to worry about, because it becomes obsolete before dying anyway. Maybe keeping the server at 70°F instead of 100°F will extend its life from 12 to 13 years. So what? No one keeps a server more than 5 years.

Ultimately, you should talk to the owner about what they want.
 
Thanks for the reply, HVAC-Novice. I agree with everything you said, and I plan on using commercial-grade equipment. Very true, modern IT equipment is very tolerant of temperature and humidity. It is the IT guys who think they need to run their room like a walk-in cooler.

Quick lil story... I recently installed a RAID card in my PC and had a fan jury-rigged to blow on it. The fan vibrated loose and stopped blowing on the heat sink, and the chip on the card was reading over 150 degrees... Celsius! And it was still working.

Here is one I did a few years back... each one is 20 tons.
[installation photos attached]
 
I work for a large employer and often work with our IT department on their needs. Almost all of their equipment is rated to work at over 100°F. Obviously you sometimes have humans in those spaces as well, so you would design for lower temps, but the equipment really doesn't care. Some of their switches are even available as 156°F-rated (or something like that) units. We sometimes have them install those in a small switch closet in unconditioned rooms, since that avoids adding an HVAC unit (which adds maintenance needs, potentially water, noise, and so on). So talking to the data center manager is best... ideally that person is also in charge of the payments. So, if they have ridiculous demands, they have to pay for them :)
 
The equipment DOES care; every 3°C could increase the failure rate by 2x. The cooler everything runs, the longer it'll keep running.
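
Taking that 2x-per-3°C figure at face value (2x per 10°C is the gentler rule of thumb you'll also see quoted), the effect compounds quickly; a quick illustration:

# Acceleration factor if the failure rate doubles every "step" degrees C:
#     AF = 2 ** (delta_T / step)
def accel_factor(delta_t_c, step_c=3.0):
    return 2.0 ** (delta_t_c / step_c)

print(accel_factor(15.0))        # room run 15 C warmer -> ~32x the failure rate
print(accel_factor(15.0, 10.0))  # with the 2x-per-10-C rule -> ~2.8x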

Temperature ratings are not drop-dead points. We accidentally got a bunch of chips to run at 225°C junction temperatures and 95% of them survived 1000 hrs; the bad news is that they weren't supposed to go over 185°C junction temps and it was supposed to be 100% survival, so we had to scrap the test devices and start the qual test over.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
225°C is smoking hot. I can't imagine why any silicon in typical equipment would need to run much over 100°C. Of course, all I can do is provide a target space temperature for the room; what happens inside the cases is up to someone else. For design purposes that target is 68°F. In reality, the cooling load will be somewhat lower than pencil and paper shows, so they could run this room down to the point where they freeze up the AC units if they wanted to.
 
Nice Alistair,

I am doing primary with backup, and they offer a unit rotation mode, but it is still tied to a single controller. I guess I can't get everything I want. But really, thanks, I am going to check into these.
 
There is a third generation just released, called NX, in which the controllers are more intelligent and you can have two or more of them, plus Bluetooth and WiFi in the mix.

I just bought myself a 10 kW twin system, 3rd-generation NX. I haven't installed it yet, so the only reason I know about the server room stuff is that it kept coming up in the marketing. Note that the system I bought isn't yet listed on the web as available. You have to join the pro club, go hunting through the documentation, work out the product codes, and stick your invented code into the document search, and it suddenly appears. Then ask the dealer for one... never heard of it; hey, the ordering system recognises it...

They have a cold room solution as well, which can be linked to the server room system and which they market for wine cellars. From memory it can hold the room steady at 7°C with an outside temp of -15 to +45°C. I used to work for Ericsson at an R&D base, and all the server rooms were kept at 16°C and the AXE10 test plant at 18°C.

They had ducted cold air coming into the back of the server racks.
 
Great stuff, Alistair. I appreciate you pointing me to it; it looks like exactly the setup for my project. Tomorrow I will search out my regional sales rep for these products and see where it goes. Thanks again, and enjoy the rest of your day!
 
If you had a 225°C junction temperature, several other things went wrong, and none of them have anything to do with the server room temperature:
- a fan or contact to the heatsink failed
- thermal throttling failed
- auto-shutoff failed

Yes, if you have the server room at 75°F instead of 60°F, the server will fail after 25 years instead of 26 years. But boo-hoo... servers and other equipment get replaced every 3-5 years, since the new equipment is so much more efficient and powerful. With the energy and HVAC equipment saved by designing for 75°F instead of 60°F, you can buy IT equipment en masse. A server room running old equipment is literally burning money in the form of electricity. Even if an IT department doesn't care about energy, a 5-year-old server isn't able to meet today's load anyway. This isn't equipment where you wrote a Word document 5 years ago and today you have the exact same need; IT needs increase exponentially. Just for that reason you don't need a very long equipment life. And the random failures due to manufacturing defects and the general silicon lottery aren't related to temps. Temps play a role only in the very long term.
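
To put a rough number on the energy side: the sensible load itself doesn't change with the setpoint, but the refrigeration system runs more efficiently at a warmer room (and evaporating) temperature. The COP values below are purely illustrative assumptions, not catalog data for any particular unit:

# Back-of-envelope: same IT load, different room setpoints.
it_load_kw = 10.0        # assumed continuous IT load
hours_per_year = 8760.0
cop_at_60F = 3.0         # assumed
cop_at_75F = 3.6         # assumed (warmer evaporating temp -> better COP)

kwh_60 = it_load_kw * hours_per_year / cop_at_60F   # ~29,200 kWh
kwh_75 = it_load_kw * hours_per_year / cop_at_75F   # ~24,300 kWh
print(f"Annual cooling energy saved: {kwh_60 - kwh_75:,.0f} kWh")
# Roughly 4,900 kWh/yr here, i.e. on the order of $600/yr at $0.12/kWh,
# before counting the smaller equipment you can get away with.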
 
Depends what the servers are doing.

The server rooms I dealt with mostly had Solaris multi-core simulation servers in them, running at 0.9 load most of the time, plus enterprise-level RAID disk banks. When they moved away from Unix email to Exchange Server, that thing was a pain.

Personally, I would see if you could do a couple of high-pressure cassettes instead of wall units and pump the flow into the racks. It has the added advantage that the servers get the really cold air directly and then warm it up to become the ambient air for the workers. The Ericsson server rooms were very cold for humans, but let the temp go up by a couple of degrees and the hardware got quite a bit hotter. A high-pressure cassette gives you way more options anyway, and they cost about the same. I ended up with a ceiling and cassette twin setup.

I did make this suggestion at Ericsson, before I left for a career change, about the cooling in the test plant where they did telephone exchange stress tests, and I am still in contact with a couple of the network guys there. They did move to direct hardware injection of cold air when they had to start thinking about water cooling. The server rooms with wall units are not pleasant, because when the wall units are working hard the outflow air is very cold and blows over the humans, then gets sucked into the back of the racks and exits out the top. So head height was at about 30°C and your ankles at sub-15°C.

BTW, we had TCs stuck all over the hardware on a separate alarm system: 65°C was normal, and 85°C was deemed stressed but acceptable for short durations. Over 95°C, various high-up managers started getting messages on their phones, and us grunts started getting multiple emails. It happened quite often; they reckoned the exchange hardware could handle it for 6 hours, then the failure possibility increased by 5% every hour after that.

 
Pumping the cold air directly into the racks would be a nice optimization. I will have to talk to them about that option. I intend to do ceiling cassettes, as they have better air distribution than the wall mounts, but that is me thinking from the perspective of human comfort. In my small pond I only do a couple of projects like this a year, and most are very small compared to what you all are describing. Personally, I hate wall-mount units; they are a pain to get the condensate to work and even more of a pain to service.
 
This place was quite unusual. I was an FEA nonlinear coupled-field analyst and ended up as a research engineer at a university, the department guru with the Unix boxes and network. When I decided that academia wasn't for me and my CV went out, some recruitment "pimp" phoned me and asked, can I dump all this engineering stuff off your CV and just leave the IT stuff?

Err why?

'Cause I can get you on site tomorrow with that, at 30 UKP an hour.

Delete away

Ten minutes later: phone interview. I told them the truth about my experience. "You can mount a disk, which is more than most I speak to can, and you're honest, which is a novelty in this game. See you tomorrow."

At Ericsson I was taken on as extremely low-level 3rd-line Unix and MS support, but as it was a fluid R&D base the Mech Eng knowledge started getting used a lot. I also seem to have a knack for lateral problem solving across flavours of engineering. I enjoyed myself over the 2-3 years I was there, and it gave me the time and money to change careers to my current profession.

My unit is for a workshop I am building DIY. I am really no expert on this stuff; I just spent a month researching a solution for keeping my workshop above zero during the winter, when the outside temps go down to -20°C. The local "experts" all gave proposals that were utterly useless, not fit for the use, the conditions, or my power supply capability. So I had to sort out a solution myself and found the PACi gear.

If you're interested, there is a thread by me in the hobbies forum.

I would like to hear what your eventual solution is. My lateral-problem-solving quirk had already got me thinking about flap gates across the output from the cassettes; they do have plenums for the cassettes, but again it's a bit of a mission to get the product codes for them.

 
How are the servers arranged, and what can they re-arrange? Most server rooms use a hot-aisle/cold-aisle setup. The hottest server is what matters, so good distribution of cold air matters a lot.
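
A quick check that goes along with the arrangement question: each rack needs enough cold airflow to carry its own load at a reasonable temperature rise, per the standard sensible-heat relation q = 1.08 x CFM x dT (Btu/h, °F). The rack wattage below is just an assumed value for illustration:

# Airflow needed per rack: q_sensible = 1.08 * CFM * dT  (Btu/h, deg F)
rack_watts = 5000.0    # assumed per-rack load, for illustration only
delta_t_f = 20.0       # front-to-back temperature rise across the rack, deg F
q_btuh = rack_watts * 3.412
cfm = q_btuh / (1.08 * delta_t_f)
print(f"{cfm:.0f} CFM per rack")   # about 790 CFM

If the cold air can't actually get to the hottest rack at that rate, the whole room ends up being set colder than it needs to be.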
 
Very on topic post above.

The Microsoft Exchange server was my personal enemy for 6 months. We used to run 3500 users' email on an old SPARC 5 with a keyboard you could bang nails into a concrete wall with. It would never go above 0.3 load and never got hot, and it had an 18-month uptime when I eventually shut it down. I know the only reason it wasn't longer was that one of the MS guys pulled its power supply to prove a point. Which didn't work, because the SPARC running DHCP and network services was the cache for it, so everything was back up in under 3 minutes after restore and processing again 5 minutes later.

The Microsoft boys managed to sell the additional fluff to the boss, and this Compaq quad-core with Veritas RAID 0 turned up, costing a 5-figure sum.

It ended up in prime cooling at the top of the rack, with the duct going virtually straight into it and its RAIDs in the rack next to it in the same position. The number of reboots that thing needed was just nuts. And to add insult to injury, it used to take 4 hours to come off load 1, re-index everything, and start moving mail in real time.

 