Years ago I built a system with 4 GTX 980s that's been great. Looking to upgrade a bit, and it looks like we can build an 8-GPU box for roughly twice the cost of a 4-GPU system. Is Hashtopolis stable enough, and efficient enough, to make building two separate new 4-GPU systems worthwhile instead? That would let us keep using the older system as well, combining all three systems under Hashtopolis. We often have multiple consultants with separate jobs that need to run, so this would also let us use the systems separately when necessary. Anyone actively using Hashtopolis who can share their experience?
Hey, Nathan - good to see you again!
Why not both?
Higher density is always good if affordable/feasible - and then herding them with Hashtopolis can still totally happen on top of that. I'd personally go for the 8-GPU option, since it also allows you to carry out non-HtP jobs easily.
Either way, it's still definitely worth getting Hashtopolis into the mix. Some attacks are more easily "hashtopolized" than others (most attacks are possible, but it gets trickier). If it's mostly a fast hash like NTLM, then even some simple workload assignment - still using Hashtopolis, just throwing some nodes at a stack of mask attacks while doing wordlist+rules on other nodes, etc. - can compensate for that. But even just being able to pause and resume jobs, shift systems between workloads, and so on is definitely worth the setup effort.
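To make that concrete, the split might look something like this at the hashcat level (NTLM as the example; the hash file, wordlist, and rule names below are just placeholders):

# nodes 1-2: incremented brute-force masks
hashcat -m 1000 -a 3 -w 3 -i --increment-min 1 --increment-max 8 ntlm.hashes ?a?a?a?a?a?a?a?a
# nodes 3-4: wordlist + rules
hashcat -m 1000 -a 0 -w 3 ntlm.hashes rockyou.txt -r rules/best64.rule

Hashtopolis then chunks the keyspace within each task, so you can move nodes between the two without losing progress.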
And yeah, Hashtopolis is quite stable - and performant:
https://twitter.com/TychoTithonus/status...7208970240
Perfect. Thanks Royce!
And love that response to Paul Moore. He's trolled some of our blog posts in the past & it's always nice to see him shut down. ;-)
Here's what we've settled on unless someone points out problems.
Case: CHENBRO RM41300-FS81
Mobo: AsRock Rack ROMED8-2T
CPU: AMD EPYC 7252
GPUs: 4x EVGA GeForce GTX 1080 Ti
SSD: Western Digital WD Blue SN550 NVMe M.2 2280 1TB
RAM: G.SKILL Ripjaws V Series 64GB (2 x 32GB) 288-Pin DDR4 3200
PSU: EVGA SuperNOVA G+ 2000W
CPU Cooler: Noctua NH-U12S TR4-SP3
Additional Case Fans: 4x 120mm 110 CFM fans
Fan Controller: Aquacomputer Aquaero 6 PRO USB
Total cost: $6,000
Planning to build two of them, so basically double everything.
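For anyone wondering about the 2000W PSU, here's the rough power math (the GPU/CPU numbers are spec-sheet TDPs; the ~150W for everything else is my guess):

4 x 250W (GTX 1080 Ti) + 120W (EPYC 7252) + ~150W (board/RAM/drive/fans)
= ~1270W peak, so a 2000W unit leaves roughly a third of headroom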
(12-14-2020, 11:56 PM)nsweaney Wrote: Here's what we've settled on unless someone points out problems. [...]
How much are you paying for the GPUs? Why are you getting 1080 Tis?
(12-15-2020, 04:06 PM)Longtail Wrote: How much are you paying for the GPUs? Why are you getting 1080 Tis?
There aren't a lot of options with both a blower-style fan and a front air intake. The 2080s are completely out of stock everywhere. The 1080 Tis are the next best option, but they're fairly hard to find. I found 4 for $750 and 4 for $1,000.
The topic has been fairly well covered in other posts, but for those who come after me, this article is extremely helpful and details what to look for in GPUs.
https://passwordvillage.org/hardware.html#rig_setup
One thing I'd add is that several of the newer cards I've seen with blower fans don't have any air intake at the front (2080 Supers in particular). They just expect to draw air from the side, which doesn't work when you pack them together in a case.
(12-15-2020, 07:47 PM)nsweaney Wrote: There aren't a lot of options with both a blower-style fan and a front air intake.
This is pretty much the biggest problem I'm also seeing in the current state of the hardware market. You either have to go with last-gen (or even older) cards to get a good cooling design, or you have to come up with some other solution.
One option is water cooling, which I don't have any experience with, but it is used by the folks at Decryptum / Comino (although I don't like their rig layout, with 12 GPUs crammed into a system with just a 10900K and 64 GB of RAM).
The other one, which works for us at least, is a setup with axial coolers. I know this is absolutely not optimal, and I would have loved blower styles instead. However, the system we have runs 4x 2080 Ti FE cards in a Nanoxia Hydra II 6U GPU case. We put in 6x Enermax D.F. Storm fans and used Thermaltake riser cables to get it working. It turns out that it works really well if you close the side openings of the case with duct tape or something similar and set the fan speed of the GPUs to 90% flat. Temperatures of the cards never go above 75C, though it has to be said that the system sits in an air-conditioned room that never goes above 18C.
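One way to pin the fans at 90% is with nvidia-settings - this assumes a Linux box with an X session and Coolbits enabled, repeated per GPU/fan index:

# enable manual fan control on GPU 0, then pin fan 0 at 90%
nvidia-settings -a "[gpu:0]/GPUFanControlState=1" -a "[fan:0]/GPUTargetFanSpeed=90"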
Now, this opens up the option of using axial-fan GPUs, but unfortunately the case seems to be pretty much unavailable by now. Also, ribbon-style riser cables are kinda messy, even in a 6U case. I would love to see how these risers from Comino work (https://comino.com/en/risers/), which were 100 bucks before.