Posts: 32
Threads: 2
Joined: May 2017
01-02-2018, 11:07 PM
(This post was last modified: 01-03-2018, 01:48 PM by kjs. Edit Reason: Linked to GitHub benchmark log)
Benchmarks for 16 x NVIDIA P104-100 GPUs
GPU: 16 x NVIDIA P104-100 (ref: https://www.techpowerup.com/gpudb/2981/p104-100)
OS: EthOS 1.2.7
Hashcat Version: 4.0.1
NVIDIA Driver Version: 384.90
Memory Clock: 5433
Power Limit: 217W (max)
# sudo ./hashcat64.bin -b | tee hashcat-benchmark.log
https://github.com/kristofferjon/benchma...chmark.log
Posts: 406
Threads: 2
Joined: Dec 2015
A box with 8x V100 does like 1.5-2x the speed for ~half the power usage. And you can rent them from Amazon super cheap too.
https://twitter.com/Chick3nman512/status...4383591424
Posts: 32
Threads: 2
Joined: May 2017
01-03-2018, 01:31 AM
(This post was last modified: 01-03-2018, 01:56 AM by kjs.)
Last time I checked, NVIDIA Tesla V100s were approximately 8k USD each, so that system would cost at least 64k USD (ref: https://www.sabrepc.com/nvidia-gpu-nvtv1...vlink.html).
AWS p3.16xlarge (8 x Tesla V100 GPUs): 17,870.40 USD / month (ref: https://aws.amazon.com/ec2/instance-types/p3/).
The entire 16 x P104-100 system costs less than a single V100 GPU, with a monthly hosting cost (including power) of 90 USD. The outright purchase cost is less than half the monthly AWS instance price you refer to in your Twitter post.
I'm no math wizard, but I recommend that you re-check your numbers.
Posts: 406
Threads: 2
Joined: Dec 2015
01-03-2018, 02:51 AM
(This post was last modified: 01-03-2018, 02:52 AM by Chick3nman.)
I was referring to spot pricing, not reserved pricing. The spot price right now for an 8x V100 box is only $9.10/hr. Running that instance for a month would cost you about $6,552, not almost $18k. Of course the price can fluctuate a bit with spot pricing, but the graph for the N. Virginia region has been pretty stable recently. The price of one V100 is higher than running an AWS instance with 8 of them for a month.
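For anyone who wants to sanity-check that figure, here's a quick Python sketch (the $9.10/hr rate is the spot price quoted above; a 30-day month is assumed):

```python
# Rough AWS p3.16xlarge spot-cost check (rate from this thread; 30-day month assumed)
spot_rate = 9.10            # USD per hour, 8x V100 spot price quoted above
hours_per_month = 24 * 30   # 720 hours in a 30-day month
monthly_cost = spot_rate * hours_per_month
print(monthly_cost)         # 6552.0
```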
Posts: 32
Threads: 2
Joined: May 2017
01-03-2018, 02:58 AM
(This post was last modified: 01-03-2018, 03:01 AM by kjs.)
(01-03-2018, 01:01 AM)Chick3nman Wrote: A box with 8x V100 does like 1.5-2x the speed for ~half the power usage. And you can rent them from Amazon super cheap too.
https://twitter.com/Chick3nman512/status...4383591424
(01-03-2018, 02:51 AM)Chick3nman Wrote: I was referring to spot pricing, not reserved pricing. The spot price right now for an 8x V100 box is only $9.10/hr. Running that instance for a month would cost you about $6,552, not almost $18k. Of course the price can fluctuate a bit with spot pricing, but the graph for the N. Virginia region has been pretty stable recently. The price of one V100 is higher than running an AWS instance with 8 of them for a month.
Right, so the acquisition cost for the 16 x P104-100 GPU box is still less than the AWS 8x V100 instance spot pricing over the course of 1.5 months.
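A rough break-even sketch, assuming a hypothetical box cost of $8,000 (the upper bound implied earlier in the thread, since the 16 x P104-100 box is said to cost less than one ~$8k V100) and the $9.10/hr spot rate quoted above:

```python
# Break-even sketch: $8,000 box cost is a hypothetical upper bound, since the
# 16 x P104-100 box is said to cost less than a single ~$8k V100
box_cost = 8000.0
spot_monthly = 9.10 * 24 * 30      # ~$6,552/month at the quoted spot rate
breakeven_months = box_cost / spot_monthly
print(round(breakeven_months, 2))  # ~1.22 months
```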
Posts: 406
Threads: 2
Joined: Dec 2015
That doesn't account for the 8x V100 box being ~1.5-2x faster, though. The V100 box can get more work done in the same amount of time. Still, if you are going to be using the system long term (2+ months), then building/buying hardware is definitely the way to go over renting. I'm not sure the P104-100 is the best card for hashcat in terms of perf/$, but there's definitely something to be said for building/buying your own system regardless.
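Folding the quoted ~1.5-2x speedup into the break-even math supports that "2+ months" figure. A sketch, again assuming a hypothetical $8,000 box cost and the $9.10/hr spot rate from earlier in the thread:

```python
# Perf-adjusted break-even sketch ($8,000 box cost is a hypothetical upper
# bound from this thread; the 1.5-2x speedup is the estimate quoted above)
box_cost = 8000.0
spot_monthly = 9.10 * 24 * 30          # 8x V100 spot cost for a 30-day month
breakeven = {}
for speedup in (1.5, 2.0):
    # Matching the P104 box's throughput needs only 1/speedup of V100 time,
    # so the equivalent rental cost per month is spot_monthly / speedup.
    breakeven[speedup] = box_cost / (spot_monthly / speedup)
print(breakeven)  # ~1.83 months at 1.5x, ~2.44 months at 2x
```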
Posts: 32
Threads: 2
Joined: May 2017
(01-03-2018, 03:07 AM)Chick3nman Wrote: That doesn't account for the 8x V100 box being ~1.5-2x faster, though. The V100 box can get more work done in the same amount of time. Still, if you are going to be using the system long term (2+ months), then building/buying hardware is definitely the way to go over renting. I'm not sure the P104-100 is the best card for hashcat in terms of perf/$, but there's definitely something to be said for building/buying your own system regardless.
We are agreed on that.
Yes, the P104-100 may not be the best GPU for Hashcat (the P104-100 is effectively a GTX 1070 without display outputs). The P102-100, due in ~2 months, should be a different story, however.