hashcat Forum

Full Version: Possible use for rack-mounted PCI expansion chassis
I have been searching through some of the posts here to see if anyone has ever done a build using a PCI expansion chassis to house the cards and a separate server to drive them. I know this type of setup is popular for deep learning, protein folding calculations, etc., but I'm curious whether anyone has done this, or whether it's even possible, for a hashcat build.

Thinking of things like these: 

http://www.ebay.com/itm/Dell-PowerEdge-C...Sw5IJWfHVg

Or maybe two of these:

http://www.ebay.com/itm/NVidia-Tesla-S10...Swx2dYGoc0

It would probably take some finagling, but it might be doable and would be much cheaper than something like the SuperMicro SuperServer or what custom shops charge. Thoughts?
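If it does turn out to be doable, I'd want a quick sanity check that the host driving the chassis actually sees every card before kicking off a long job. Below is a rough Python sketch of that check, not anything official: it assumes NVIDIA cards, that nvidia-smi and hashcat are already on the PATH, and EXPECTED_GPUS = 8 is just a placeholder for however many cards the chassis holds. It leans on hashcat's -I (backend device info) and -b (benchmark) options.

#!/usr/bin/env python3
# Rough sanity check (hypothetical helper, not part of hashcat): confirm the
# host driving an external GPU chassis actually sees every card before
# starting a long run. Assumes NVIDIA cards and that nvidia-smi and hashcat
# are on the PATH.

import subprocess

EXPECTED_GPUS = 8  # placeholder -- set to the number of cards in your chassis


def count_visible_gpus() -> int:
    """Count GPUs the driver reports; `nvidia-smi -L` prints one 'GPU n:' line per card."""
    out = subprocess.run(["nvidia-smi", "-L"], capture_output=True, text=True, check=True)
    return sum(1 for line in out.stdout.splitlines() if line.startswith("GPU "))


if __name__ == "__main__":
    found = count_visible_gpus()
    print(f"driver reports {found} GPU(s), expected {EXPECTED_GPUS}")
    if found < EXPECTED_GPUS:
        raise SystemExit("some cards in the expansion chassis are not visible to the host")
    # The driver sees all cards; now let hashcat enumerate its backend devices
    # and run a quick benchmark across them.
    subprocess.run(["hashcat", "-I"], check=True)
    subprocess.run(["hashcat", "-b"], check=True)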
The chassis you just linked to only work with old passive Tesla GPUs and will not work with desktop GPUs.

You can find more comments about expansion chassis in this thread: https://hashcat.net/forum/thread-5064.html
(12-02-2016, 09:13 PM)epixoip Wrote: The chassis you just linked to only work with old passive Tesla GPUs and will not work with desktop GPUs.

You can find more comments about expansion chassis in this thread: https://hashcat.net/forum/thread-5064.html

Ah, thanks for the reply. It seems like hardware to just house and power a bunch of GPUs should exist and be reasonably priced, but I guess it just doesn't.
Hardware to house and power a bunch of GPUs is expensive to produce, and it's not exactly high-demand, so few units are made in each production run, which means prices are higher than for typical commodity hardware.