How to optimize attacking very large hashes
#1
Hi,

I'm trying to attack a very large hash list, a few GB in size, and I've managed to get it down to a few hundred MB. The problem is that it takes a long time to:
  • remove recovered hashes from the hash file (I'm using --remove)


Example:
an -a 3 ?a?a?a?a?a?a attack completes fast on my 2 x GTX 1080s, but getting back to the command prompt takes a while
  • when loading, it takes a long time to compare hashes with the potfile
Code:
Comparing hashes with potfile entries...

How can I optimize this and make it a lot faster?

Thank you.

Best regards,
Azren
#2
(06-14-2016, 08:08 AM)azren Wrote: [quoted post #1]
From what I understand, you are using --remove so each hash whose password is recovered gets removed from the list, and what takes time is reading from the potfile.
Why not just rename the potfile so hashcat won't read from it?
You could also try removing duplicate hashes from your list.
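Both steps can be done from the shell. A minimal sketch, assuming the default potfile location (~/.hashcat/hashcat.potfile) and a hash file named hashes.txt, both placeholders for the real paths:

```shell
POTFILE="$HOME/.hashcat/hashcat.potfile"   # default potfile location
HASHES="hashes.txt"                        # placeholder for the actual hash file

# Move the potfile aside so hashcat starts with an empty one
# (alternatively, point hashcat at a fresh file with --potfile-path)
[ -f "$POTFILE" ] && mv "$POTFILE" "$POTFILE.bak"

# Remove duplicate hashes in place; fewer unique hashes means
# less work at load time and during the attack
[ -f "$HASHES" ] && sort -u "$HASHES" -o "$HASHES"
```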
#3
Don't use --remove, use --show and --left with -o /dev/null instead.
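In practice (assuming hash mode -m 0 and a hash file named hashes.txt, both placeholders for the actual setup), that workflow would look something like:

```shell
# Run the attack without --remove; cracked hashes land in the potfile,
# and -o /dev/null avoids writing a separate outfile
hashcat -m 0 -a 3 -o /dev/null hashes.txt ?a?a?a?a?a?a

# Afterwards, show cracked hash:plain pairs by matching the hash file
# against the potfile
hashcat -m 0 --show hashes.txt

# Or write only the still-uncracked hashes to a new, smaller file
# for the next attack, instead of rewriting the original with --remove
hashcat -m 0 --left -o remaining.txt hashes.txt
```

This keeps the multi-GB hash file read-only, so hashcat never has to rewrite it after every cracked hash.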
#4
That helps. Saved me about 5 to 10 minutes per iteration. Thanks.

Best regards,
Azren