Help w/ build
Hello all,

I want to start a build but have some questions. I'm super familiar with most consumer PC components, and through my job as a sysadmin have built/fixed my share of workstations, but find myself lost when it comes to how to power a rig with 4-8 cards in it.

I want to build a cracking rig with 4x GPUs and possibly have room for expansion to eight. I've done a bit of reading and have an idea of what kind of CPU, RAM, etc., but I'm not sure how this works in terms of the PSU(s) and probably the motherboard too. Are there single PSUs that are large enough to power that many GPUs? Or do I need several PSUs?

If these builds need several PSUs, do they live in the chassis or are they external? Do I need a special motherboard to interface with multiple PSUs?

If anyone has a link to a blog or article that details each component of a cracking rig from the last year or two, I would love to take a look at it. Also, if anyone has recommendations on a rack-mounted chassis and motherboard, I would very much appreciate those as well! As I understand, I need a CPU with 10+ cores if I wanted to have 8 GPUs, so I've been leaning towards a 5900x, unless there's a Xeon build that will be cheaper (which I'm curious about given the motherboards that are made for Xeons vs consumer CPUs).

Lots of questions here, thanks for any answers!

TLDR for my part:
- Reasonable builds with consumer hardware top out at 4 GPUs per system
- With server hardware AND blower-style GPUs you can do up to 10, depending on your barebone (e.g. Tyan Thunder HX FA77-B7119)

If you want more GPUs with consumer hardware, you will need to use risers, and that is a really ugly topic. For me, ribbon risers (like these) worked fine on a 4-GPU system with a single 1700W PSU.
Unfortunately, you are bound to a single PSU with these (with 2 PSUs and ribbon risers, you would mix the circuits of both PSUs, since GPUs draw up to 75W from the PCIe slot on your mainboard and the rest from their 12/8/6-pin connectors). Mining risers solve the power problem, as they are usually separately powered through a Molex or similar connector. But there are several reasons to avoid these:
1. Molex is not rated for 75W and is therefore a literal fire hazard if used that way.
2. Most risers are PCIe x1, which will massively impact your performance for certain attacks, like dictionary attacks with fast hash modes. The x1 bandwidth is simply too small; x4 is the minimum to avoid this.
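To get a feel for the x1 bottleneck, here is a rough sketch (my own ballpark numbers, not from any benchmark): a straight wordlist attack on a fast hash has to stream candidates host -> GPU, so the link bandwidth caps the effective cracking speed.

```python
# Estimate how many password candidates per second the host can even push
# over a PCIe link. PCIE3_LANE_BPS and AVG_CANDIDATE_BYTES are assumptions.

PCIE3_LANE_BPS = 0.985e9   # ~985 MB/s usable per PCIe 3.0 lane
AVG_CANDIDATE_BYTES = 9    # assume ~8-char candidate plus separator

def max_feed_rate(lanes: int) -> float:
    """Upper bound on candidates/s deliverable over the link."""
    return lanes * PCIE3_LANE_BPS / AVG_CANDIDATE_BYTES

for lanes in (1, 4, 16):
    print(f"x{lanes:<2}: ~{max_feed_rate(lanes) / 1e6:,.0f} M candidates/s")
```

Even the fourfold jump from x1 to x4 matters when a single GPU can test billions of fast hashes per second; rule-based and mask attacks sidestep the link entirely by generating candidates on the GPU.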

Apart from that, there is pretty much no consumer chassis that can house more than 4 GPUs (except open frames).

Also, I don't recommend consumer platforms like AM4 / LGA 1200 for more than 2 GPUs. They have something around 24 PCIe lanes, some of which are already reserved for the chipset, M.2, etc. If you want 4 GPUs on consumer hardware, go with HEDT platforms like TR4 / LGA 2066. They have plenty of lanes, but be sure to also check the lane count of the specific chip you want to use.
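A back-of-the-envelope lane budget for a mainstream platform (my rough assumptions; check your actual CPU and board specs). Note the arithmetic alone can look fine, but consumer boards rarely bifurcate the CPU's x16 slot beyond x8/x8, which is why 2 GPUs is the practical ceiling:

```python
# Lane budget sketch: subtract lanes consumed before any GPU is plugged in.
# CPU_LANES and RESERVED are typical values, not a specific product's specs.

CPU_LANES = 24           # typical AM4 / LGA 1200 CPU
RESERVED = {
    "chipset link": 4,   # DMI / chipset uplink
    "M.2 NVMe": 4,       # primary NVMe drive
}

available = CPU_LANES - sum(RESERVED.values())
for gpus in (2, 4):
    print(f"{gpus} GPUs: x{available // gpus} each "
          f"from {available} free lanes (before bifurcation limits)")
```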
(10-29-2021, 07:44 PM)NoReply Wrote: Also, I don't recommend consumer platforms like AM4 / LGA 1200 for more than 2 GPUs. They have something around 24 PCIe Lanes

Ahh, that's a great point that I overlooked. Would it be worthwhile to wait a month and get the new LGA 1700 since it will support PCIe 5.0? Does DDR5 RAM have any advantage here? Or should I just stick with LGA 2066?

As for the PSU... If I plan to have 4x 3080s, will one ~1800W miner PSU be sufficient? I can't seem to find any PSUs greater than 1500W in the pool of consumer stuff I'm familiar with, and I don't know much about miner PSUs. Or should I go with 2x 1200W consumer PSUs? If 2x PSUs, how does that work with a consumer motherboard?
DDR5 won't make a huge difference, since cracking is mostly done on the GPU directly. PCIe 5.0 is also not too important, unless you plan to operate your cards on x4 links.

As for the PSU: NEVER use mining hardware! Pretty much anything with a 'Mining' label is severely lacking in quality, which in the worst case is also a fire hazard. PSUs from good manufacturers usually top out at 1600/1700W. This is probably insufficient for 4x 3080s, because Nvidia's Ampere GPUs are known to have pretty hard spikes in power usage. These can trip the overcurrent protection of the PSU, resulting in an immediate system shutdown. Also, they have a TDP of 320W, which leaves very little headroom for the rest of the system.
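A quick power-budget sanity check for 4x 3080 on one PSU. CPU_W, REST_W and SPIKE_FACTOR are my own rough assumptions, not measured values:

```python
# Sustained vs. transient draw for a 4-GPU rig against a 1600W PSU.
# Ampere cards can spike well past TDP for milliseconds, which is what
# trips overcurrent protection even when the sustained load fits.

GPU_TDP_W = 320      # RTX 3080 board power
N_GPUS = 4
CPU_W = 150          # HEDT CPU under load (assumption)
REST_W = 100         # board, RAM, drives, fans (assumption)
SPIKE_FACTOR = 1.5   # rough transient multiplier (assumption)
PSU_W = 1600

sustained = N_GPUS * GPU_TDP_W + CPU_W + REST_W
transient = N_GPUS * GPU_TDP_W * SPIKE_FACTOR + CPU_W + REST_W

print(f"sustained: {sustained} W / {PSU_W} W ({sustained / PSU_W:.0%})")
print(f"transient worst case: {transient:.0f} W "
      f"-> {'likely trips OCP' if transient > PSU_W else 'within budget'}")
```

With these numbers the sustained draw alone sits near 96% of a 1600W unit, and any simultaneous spike blows past it, which is the argument for two PSUs or properly powered risers.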

Using 2 PSUs is possible, but only if you do not mix the circuits of the PSUs (this also trips a protection, but I'm not sure which one). This means you can't, for example, connect the 24-pin mainboard connector to one PSU and the 8-pin CPU connector to the other. This also holds for the GPUs, because they draw up to 75W from the PCIe slot directly and the rest from their 6/8-pin connectors. So the only isolated devices that remain are your drives and fans. The only way to successfully use 2 PSUs is to have good risers with separate power connectors.
Thanks for the replies!

Since my last post, I did stumble across this Add2PSU adapter, which seems to make sense. Unless anyone has heard anything bad about it, I will probably plan to use it. It seems like it keeps everything completely separate.

I'm very torn on CPU/mobo at this point. Do you have any recommendations on budget-mid priced setup?

I was looking at Threadrippers and noticed that the 1st-gen 1900X is only $150. I'm confused why it's so much cheaper than 2nd and 3rd gen.

Intel HEDT is another option, but we're in a weird spot where the last gen to be released is a few years old and selling for MSRP, if not more, and the next gen is about 6+ months out.

Lastly, I'm assuming that I can pick up a Xeon that's a few years old for a fair price, but I'm very unfamiliar with the related motherboards.

If you have any specific suggestions, I'd be very appreciative.