Posts: 2
Threads: 1
Joined: Aug 2022
Hi

To be most efficient, I've split a big file of hashes into multiple smaller ones. My question is: is there a better way of loading those lists than doing it manually? It feels somewhat annoying to keep loading ~200 files one by one. Or is this how it's supposed to be?

Regards
Posts: 888
Threads: 15
Joined: Sep 2017
splitting a list of hashes is anything but efficient
you attack each list with the same settings and the same password candidates, so you do the same work multiple times, and that is very, very inefficient
don't split the file
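If the list has already been split, the parts can simply be merged back into a single deduplicated file and attacked in one run. A minimal sketch (the file names `hashes_part*.txt` and the demo values are illustrative, not taken from the thread):

```shell
# Demo setup with throwaway sample files; in practice you would
# already have your ~200 hashes_part*.txt files.
mkdir -p /tmp/hashdemo && cd /tmp/hashdemo
printf 'aaa\nbbb\n' > hashes_part1.txt
printf 'bbb\nccc\n' > hashes_part2.txt

# Merge every part and drop duplicate lines into one combined list:
sort -u hashes_part*.txt > all_hashes.txt
cat all_hashes.txt
```

One hashcat run against `all_hashes.txt` then does the candidate-generation work once, instead of repeating it for every part file.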
Posts: 2
Threads: 1
Joined: Aug 2022
But what about the "Bitmap table overflowed" message? Should I just ignore it then?
Posts: 888
Threads: 15
Joined: Sep 2017
can you show the output of hashcat when starting your attack?
how big is your file? does hashcat start the attack, or does it stop with the overflow message?
Posts: 930
Threads: 4
Joined: Jan 2015
(08-29-2022, 08:06 PM)jilaluxs Wrote: But what about the Bitmap table overflowed message? Should I just ignore it then?
Just like the feedback message says, you can raise the bitmap size with the --bitmap-max parameter, though it may not be as fast as if you hadn't overflowed. (It can still be faster than splitting the list and running it in two separate runs, though!)
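For illustration, a sketch of such an invocation (the hash mode, file names, and the value 24 are placeholders, not taken from this thread; pick a value appropriate for your GPU memory):

```shell
# Allow larger bitmap tables so the full list fits without overflowing.
# -m 1000, all_hashes.txt, and wordlist.txt are illustrative placeholders.
hashcat -m 1000 -a 0 --bitmap-max 24 all_hashes.txt wordlist.txt
```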