Posts: 381
Threads: 1
Joined: Aug 2014
02-11-2016, 03:16 PM
(This post was last modified: 02-11-2016, 03:53 PM by Flomac.)
(02-11-2016, 07:46 AM)mamexp Wrote: Assume you go with the build, do you have a dedicated room for the cluster? If you don't, have you considered watercooling? If you do, have you checked the air ventilation of that room to be able to cool the cluster?
Very good point about the cooling. A 10 kW heating system cannot be dealt with by tilting a window open.
But the concept of watercooling is keeping the temperature low at the spot (GPU, CPU); it does not guarantee a cool system overall, because at the other end the heat still has to dissipate through a radiator. It's tricky to build, not very reliable, and expensive.
Air cooling, on the other hand, is no rocket science at all. Just use reference-design cards, put a bunch of coolers in the case, and you're done.
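To put a rough number on that 10 kW figure, here is a back-of-the-envelope energy-balance sketch of the airflow needed to carry the heat out of the room. The load and the allowed air temperature rise are illustrative assumptions, not measurements.
Code:
# Back-of-the-envelope: airflow needed to carry away a given heat load.
# Assumes dry air near room conditions (density ~1.2 kg/m^3, specific
# heat ~1005 J/(kg*K)). The 10 kW load and the 10 K allowed temperature
# rise are illustrative assumptions, not measured values.

heat_load_w = 10_000   # total dissipation of the cluster, in watts
delta_t_k = 10.0       # allowed air temperature rise across the rig, in K
air_density = 1.2      # kg/m^3
air_cp = 1005.0        # J/(kg*K)

# Energy balance: Q = m_dot * c_p * dT  ->  m_dot = Q / (c_p * dT)
mass_flow = heat_load_w / (air_cp * delta_t_k)   # kg/s
volume_flow = mass_flow / air_density            # m^3/s
cfm = volume_flow * 2118.88                      # 1 m^3/s ~ 2118.88 CFM

print(f"{mass_flow:.2f} kg/s -> {volume_flow:.2f} m^3/s (~{cfm:.0f} CFM)")
# ~0.83 m^3/s, roughly 1750 CFM of fresh air -- far more than a tilted
# window will ever move.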
Posts: 2,936
Threads: 12
Joined: May 2012
02-11-2016, 08:57 PM
lol watercooling, gtfo.
Posts: 9
Threads: 1
Joined: Feb 2016
02-12-2016, 05:11 AM
(02-11-2016, 08:57 PM)epixoip Wrote: lol watercooling, gtfo.
How old are you? ... 'cause you talk like my teenage son.
(02-11-2016, 03:16 PM)Flomac Wrote: (02-11-2016, 07:46 AM)mamexp Wrote: Assume you go with the build, do you have a dedicated room for the cluster? If you don't, have you considered watercooling? If you do, have you checked the air ventilation of that room to be able to cool the cluster? Very good point about the cooling. A 10 kW heating system cannot be dealt with by tilting a window open.
But the concept of watercooling is keeping the temperature low at the spot (GPU, CPU); it does not guarantee a cool system overall, because at the other end the heat still has to dissipate through a radiator. It's tricky to build, not very reliable, and expensive.
Air cooling, on the other hand, is no rocket science at all. Just use reference-design cards, put a bunch of coolers in the case, and you're done.
Watercooling does not mean no system fans. I thought that was a safe assumption.
Posts: 2,936
Threads: 12
Joined: May 2012
02-12-2016, 06:58 AM
(This post was last modified: 02-12-2016, 07:17 PM by epixoip. Edit Reason: grammar)
(02-12-2016, 05:11 AM)mamexp Wrote: (02-11-2016, 08:57 PM)epixoip Wrote: lol watercooling, gtfo.
How old are you? ... 'cause you talk like my teenage son.
My apologies. I was merely attempting to talk to you on your level, since recommending water cooling means you're about as experienced as my teenage nephew.
First, water is completely unnecessary. If you can't cool on air, you're doing something very, very wrong. Especially with new Nvidia GPUs; they draw a fraction of the power AMD GPUs draw, and are a cakewalk to cool.
Second, just because you're cooling with water doesn't mean you've removed the heat; you've merely moved the heat. This is what Flomac was attempting to tell you: you still have to have a way to dissipate all that heat you've collected. If you try to do it with a traditional radiator setup, you'll end up in the exact same position you'd be in with air cooling, in that you still have to remove all that heat from the room. So you've gained nothing by using water (see the sketch further down for rough numbers).
Third, no respectable datacenter or colocation facility will permit you to rack up anything water cooled. They will laugh you out of the building. And if your company has their own datacenter / server room, good luck convincing your Network Infrastructure team to let you rack up something that's water cooled.
Fourth, water cooling is a major pain in the ass to maintain. Water-cooled systems require a very watchful eye to regularly inspect for leaks and to ensure the reservoirs are topped off, which is highly inappropriate for remote systems located in a datacenter. They also need to be completely torn down, flushed, and thoroughly cleaned at least once every six months, because the water blocks and reservoirs will get all gummed up due to the biocides, antimicrobial additives, and other gunk. When you tear the system down for cleaning, you also need to replace all of the seals and o-rings, because they never quite seal right once they've been used, removed, and replaced (they're essentially one-time-use items). The hoses also need to be replaced at least annually, because they too will start to break down over time. All of this means continued maintenance expenses and downtime, both of which are highly inappropriate for production systems.
And if you decide to forego or skimp on the maintenance, then you get to experience the joys of catastrophic failure, because when water cooling fails, it fails spectacularly. De-ionized water doesn't stay de-ionized; as soon as de-ionized water comes into contact with metal and air, it begins stripping ions from them. This means as soon as you add de-ionized water to your system, your de-ionized water is no longer de-ionized and will conduct electricity, which is great news for your GPUs and motherboard!
Anyone who has been doing water cooling for at least a few years, if they're honest with you, will tell you they have experienced at least one leak, if not two or three. But it's hard to find someone who's actually been doing water cooling that long, because most abandon it after a year or two due to the maintenance overhead. And most people who will tell you that water cooling is awesome and worry-free are those who have been doing it for less than two years, and have not yet had their first leak.
All in all, water cooling is a fad that you should indulge only if you genuinely enjoy the hobby of water cooling and don't mind the maintenance involved. If you do it for any other reason, you will be sorely disappointed. Everyone I know who has flirted with water cooling has gone back to air cooling, and the only people I've encountered who think it's cool are people who've never actually done it before.
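As a footnote to the second point above, here is a minimal sketch of how fast an unventilated room warms up when a rig dumps its heat into it, radiator or not. Room size and load are assumed purely for illustration, and ignoring the thermal mass of walls and furniture makes it pessimistic, but the trend holds.
Code:
# Sketch for the "moved, not removed" point: all collected heat ends up
# in the room. How fast does a sealed, unventilated room warm up?
# Room size and heat load are assumed for illustration only; the thermal
# mass of walls and furniture is ignored, which overstates the rate but
# not the trend.

room_volume_m3 = 4 * 5 * 2.5   # a small 4 m x 5 m room, 2.5 m ceiling
air_density = 1.2              # kg/m^3
air_cp = 1005.0                # J/(kg*K)
heat_load_w = 10_000           # watts dumped by the rig, radiator or not

air_mass = room_volume_m3 * air_density              # ~60 kg of air
heating_rate_k_per_s = heat_load_w / (air_mass * air_cp)

print(f"{heating_rate_k_per_s * 60:.1f} K per minute")
# ~10 K per minute: without ventilation the room is unusable within
# minutes, whether the heat arrived via air coolers or a radiator.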
So great job recommending water, grandpa. Looks like you really know your shit.
Posts: 381
Threads: 1
Joined: Aug 2014
02-12-2016, 07:23 PM
(This post was last modified: 02-12-2016, 07:25 PM by Flomac.)
(02-12-2016, 05:11 AM)mamexp Wrote: (02-11-2016, 03:16 PM)Flomac Wrote: But the concept of watercooling is keeping the temperature low at the spot (GPU, CPU); it does not guarantee a cool system overall, because at the other end the heat still has to dissipate through a radiator. It's tricky to build, not very reliable, and expensive.
Air cooling, on the other hand, is no rocket science at all. Just use reference-design cards, put a bunch of coolers in the case, and you're done.
Watercooling does not mean no system fans. I thought that was a safe assumption.
I never said you get rid of system fans with water cooling. I just meant everything gets more complicated, more expensive and, worst of all, more unreliable.
As someone who deals with IT risk management on a regular basis, I look at every component and weigh the risk that its failure would be disastrous for the system in question.
In a typical server the loss of one, two or even three coolers is no big deal. In a multi-GPU setup the failure of a GPU cooler is not nice, but that card will clock down and at worst get destroyed. It is very unlikely to take down the rest of the rig.
With water cooling everything changes. Every seal is a potential hazard. The pump could fail, so you need more than one, but every additional pump increases the chance that one of them fails. So you need a design where each GPU is served by more than one pump. That means more seals, and more seals again increase the chance of one failing. And, as epixoip explained, one broken seal can mean the death of the system.
With such a huge water cooling setup we're talking about several liters of water circulating. If water leaks, it might escape the system and reach other systems. Spilled water can cause a short circuit, and there is also a chance that a system catches fire, affecting the other systems too.
All in all, the risk of a failure within a given timeframe approaches certainty. That alone is a reason not to use water cooling in HPC environments. Or datacenters. Or simply anywhere someone has to provide a warranty for such a system.
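To illustrate how that risk compounds, the sketch below multiplies out independent per-part failure probabilities. Every percentage in it is an invented placeholder, not field data; the point is only the shape of the math.
Code:
# How per-part risks compound: P(at least one failure among independent
# parts) = 1 - prod(1 - p_i). All probabilities below are invented
# placeholders for illustration, not real failure-rate data.

def p_any_failure(per_part_probs):
    """Probability that at least one component fails."""
    p_all_ok = 1.0
    for p in per_part_probs:
        p_all_ok *= 1.0 - p
    return 1.0 - p_all_ok

# Air-cooled rig: say 12 fans at a 2% annual failure chance each.
# A single failed fan is usually survivable anyway.
fans = [0.02] * 12
print(f"air:   {p_any_failure(fans):.0%} chance of losing some fan per year")

# Water loop: say 2 pumps at 5% and 40 seals/fittings at 1% each.
# Here one bad seal can mean total loss, not just a degraded rig.
pumps_and_seals = [0.05] * 2 + [0.01] * 40
print(f"water: {p_any_failure(pumps_and_seals):.0%} chance of a pump/seal failure per year")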
When water cooling is used in HPC, it usually looks like this:
Custom-made, almost no seals, far lower risk.
Posts: 9
Threads: 1
Joined: Feb 2016
02-12-2016, 07:37 PM
(This post was last modified: 02-12-2016, 09:15 PM by epixoip. Edit Reason: remove excessive quoting)
(02-12-2016, 06:58 AM)epixoip Wrote: (02-12-2016, 05:11 AM)mamexp Wrote: How old are you? ... 'cause you talk like my teenage son.
My apologies. I was merely attempting to talk to you on your level, since recommending water cooling means you're about as experienced as my teenage nephew.
Oh, so now you want to discuss?
Too bad being an immature little prick doesn't make me interested in discussing anything further with you.
Maybe one day you will grow up and learn something new.
(02-12-2016, 07:23 PM)Flomac Wrote: (02-12-2016, 05:11 AM)mamexp Wrote: Watercooling does not mean no system fans. I thought that was a safe assumption. I never said you get rid of system fans with water cooling. I just meant everything gets more complicated, more expensive and, worst of all, more unreliable.
First, there are failsafe systems you can use. They're not always off-the-shelf products.
Water-cooled datacenters have been available for years now. Many of your concerns have already been addressed.
Also, the term watercooling is very general. There are actually lots of methods, but since the medium of heat exchange is a liquid, the term watercooling is used.
With the right setup, changing a failed part in a watercooled system is as simple and quick as changing an air cooler.
Posts: 381
Threads: 1
Joined: Aug 2014
(02-12-2016, 07:37 PM)mamexp Wrote: First, there are failsafe systems you can use. They're not always off-the-shelf products.
Water-cooled datacenters have been available for years now. Many of your concerns have already been addressed.
Also, the term watercooling is very general. There are actually lots of methods, but since the medium of heat exchange is a liquid, the term watercooling is used.
With the right setup, changing a failed part in a watercooled system is as simple and quick as changing an air cooler.
Yes, there are datacenters specifically designed for water cooling. That does not mean you can place your water-cooled rack in just any datacenter and hope they are fine with it. They're not.
The reasons to use water cooling in a datacenter are the problem of air conditioning and the ability to reuse the heat rather than let it go to waste.
But with hashcat it's always about GPUs, and there is no proper water-cooling solution for cooling 8 GPUs safely over a long period without problems. With air cooling in a typical server chassis with 8-12 fans you cannot go wrong. Water cooling, on top of that, is more expensive, far more complicated and less reliable.
So I still fail to see why anyone should use it. Any suggestions?
Posts: 2,936
Threads: 12
Joined: May 2012
(02-12-2016, 07:37 PM)mamexp Wrote: Oh, so now you want to discuss? Too bad being an immature little prick doesn't make me interested in discussing anything further with you.
Actually, no, I have no interest in discussing anything with you. The few comments you've made in the past two days show me that you have absolutely nothing of interest or value to contribute to this community, and that you clearly lack the experience required to speak with the authority you're attempting to wield. The fact that you believe you are older than me means absolutely nothing; we measure worth in knowledge and experience here, and it's very clear you have neither.
What it seems like to me is, you've seen and/or worked with an enterprise water cooling solution (probably from Asetek or CoolIT) on one occasion, and now you think you're the expert on datacenter cooling. Then you probably built this totally massive 4-GPU rig once, and now think you're the expert on GPU cooling as well. Then you're going to come here and spout off all this uninformed and ill-advised bullshit, and be condescending to those who have proven themselves to be experts in this field? Good luck with that.