hashcat Forum

Hash cracking big list of hashes
Hi

To be most efficient, I should split a big file of hashes into multiple smaller ones. My question is: is there a better way of loading those lists than doing it manually? It feels somewhat annoying to keep loading ~200 files one by one, or is this how it's supposed to be?

Regards
Splitting a list of hashes will be everything but efficient.

You attack each list with the same settings and the same password candidates, so you will do the same work multiple times, and that is definitely not efficient.

Don't split the file.
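To illustrate the difference (the hash mode, file names and wordlist below are only placeholders, not taken from this thread): one run over the full list generates and tests each password candidate once against all hashes, while looping over ~200 split files repeats that same work for every chunk:

    # single run over the complete hash list - candidates are generated once
    hashcat -m 0 -a 0 all_hashes.txt wordlist.txt

    # the same attack against 200 split files repeats the identical work 200 times
    for f in split_*.txt; do hashcat -m 0 -a 0 "$f" wordlist.txt; done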
But what about the Bitmap table overflowed message? Should I just ignore it then?
can you show the output of hashcat when starting your attack?

how big is your file? is hashcat starting the attack or does it stop with the overflow message?
(08-29-2022, 08:06 PM)jilaluxs Wrote: But what about the Bitmap table overflowed message? Should I just ignore it then?

Just like the feedback message says, you can adjust the bitmap size (with the --bitmap-max parameter), but it may not be as fast as if you hadn't overflowed. (It can still be faster than splitting the list in half and doing two runs, though!)
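A minimal sketch of how that could look (the hash mode, file names and the chosen bit count are placeholder assumptions, not taken from this thread):

    # raise the bitmap size so the large hash list causes fewer bitmap collisions
    hashcat -m 1000 -a 0 --bitmap-max=24 big_hashlist.txt wordlist.txt

A larger bitmap trades L2-cache efficiency for fewer bitmap collisions, so it is worth comparing speeds with and without the option on your own list.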