05-22-2017, 07:10 AM
That would only work for a small number of hashes. Otherwise it slows the attack down by far more than the 95^3 factor itself (which is why I was hoping for an automated in-memory solution).
Let's say I have a 100M wordlist. Normally I get 2B hashes/sec MD5 (GeForce 980); with two MD5 calls per candidate here, that's 1B hashes/sec effective. Divide by 95^3 salts (call it 100^3 = 1M) and I'm down to ~1000 words/sec, so the 100M wordlist takes ~10^5 seconds, i.e. about a day, for any reasonable number of hashes.
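To make that arithmetic concrete, here it is as a quick sketch (all numbers are the ones above; the script doesn't measure anything itself):

```python
# Back-of-the-envelope projection for the wordlist case described above.
wordlist_size = 100_000_000    # 100M words
raw_speed     = 2_000_000_000  # 2B MD5/s on a GeForce 980
md5_calls     = 2              # two MD5 calls per candidate
salt_space    = 95 ** 3        # three printable-ASCII salt characters (~857K)

effective_hps = raw_speed / md5_calls          # 1B candidate hashes/s
words_per_sec = effective_hps / salt_space     # ~1,166 words/s
total_seconds = wordlist_size / words_per_sec  # ~85,700 s

print(f"{words_per_sec:,.0f} words/s, {total_seconds / 3600:.1f} hours")
# -> ~1,166 words/s, ~23.8 hours: about a day, as claimed
```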
Now, if I have 10M hashes, that means writing 10M * 95^3 ≈ 10 trillion lines (at ~40 bytes per line, about 400 terabytes). Since hashcat can't eat a 400-terabyte hashlist, I'd have to feed it in chunks of 100M lines, i.e. launch it 100,000 times.
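For reference, a minimal sketch of what that on-disk expansion would look like, assuming the usual hash:salt hashlist format for salted modes (the chunk_*.txt names are just for illustration). This is exactly the approach I'm trying to avoid:

```python
import itertools
import os

CHUNK_LINES = 100_000_000
PRINTABLE = [chr(c) for c in range(0x20, 0x7f)]  # the 95 printable ASCII chars

def salted_lines(hashes):
    # One "hash:salt" line per (hash, 3-char salt) pair:
    # 10M hashes -> ~8.6 trillion lines total.
    for h in hashes:
        for salt in itertools.product(PRINTABLE, repeat=3):
            yield f"{h}:{''.join(salt)}\n"

def write_chunks(hashes, prefix="chunk"):
    # Stream the expansion into 100M-line files, one hashcat run per file.
    gen = salted_lines(hashes)
    for n in itertools.count():
        path = f"{prefix}_{n:06d}.txt"
        wrote = 0
        with open(path, "w") as f:
            for line in itertools.islice(gen, CHUNK_LINES):
                f.write(line)
                wrote += 1
        if wrote == 0:
            os.remove(path)  # clean up the empty trailing file
            break
```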
I just did a test with 10M hashes x 10 salts (100M lines total), and it took hashcat (3.5.0) 12 minutes start to finish (including 8 minutes just to load and sort the hashlist). Scaling that to all 95^3 salts (~86,000 runs of 10 salts at 12 minutes each) projects to about 2 years.
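And the projection from that test, same numbers as above:

```python
# Scale the measured 12-minute run (10 salts) to the full 95^3 salt space.
runs_needed = 95**3 / 10        # ~85,738 runs of 10 salts each
minutes     = runs_needed * 12  # ~1.03M minutes
print(f"{minutes / (60 * 24):,.0f} days")  # -> ~714 days, about 2 years
```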