Copy and reuse dictionary cache
#21
So this is a limitation of hashcat itself. It uses only one CPU core to cache the dictionary. I hope a future update introduces the ability to cache multiple files at the same time, or to use multiple cores.
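hashcat's actual cache format is internal, so this is only a rough illustration: assuming the caching pass essentially scans the wordlist once (e.g. counting entries), that scan could in principle be split across byte ranges of the file and run concurrently. A minimal sketch (the function names and the use of threads are my own, not hashcat's):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def count_lines_in_chunk(path, start, size):
    """Count newline bytes within one byte range of the file."""
    count = 0
    with open(path, "rb") as f:
        f.seek(start)
        remaining = size
        while remaining > 0:
            block = f.read(min(1 << 20, remaining))
            if not block:
                break
            count += block.count(b"\n")
            remaining -= len(block)
    return count

def parallel_line_count(path, workers=4):
    """Split the file into equal byte ranges and scan them concurrently.

    Illustration only: whether this is actually faster depends on
    whether the scan is CPU-bound or disk-bound (see the posts below).
    """
    total = os.path.getsize(path)
    chunk = (total + workers - 1) // workers
    ranges = [(path, i * chunk, min(chunk, total - i * chunk))
              for i in range(workers) if i * chunk < total]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_lines_in_chunk, *zip(*ranges)))
```

Splitting on raw byte offsets rather than line boundaries works here because each worker only counts newline bytes, so no line is ever double-counted.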
#22
Exactly this.
#23
Simply adding multiprocessing to that step won't do the trick. The bottleneck is likely not the CPU but your hard drive from which the wordlist has to be read in full.
#24
(08-16-2020, 12:43 PM)undeath Wrote: Simply adding multiprocessing to that step won't do the trick. The bottleneck is likely not the CPU but your hard drive from which the wordlist has to be read in full.

I'm using an NVMe drive, which is super fast (around 600 MB/s). The CPU is saturated at 100% usage (one core only).

If the NVMe were being fully utilized, hashcat would finish caching a 20 GB file in under a minute (20 GB / 600 MB/s ≈ 34 seconds), but this is not happening and the disk is not fully utilized.
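One way to separate the two hypotheses is to time a raw sequential read of the same wordlist and compare it with hashcat's caching time. A small sketch (the block size is an arbitrary choice of mine):

```python
import time

def measure_read_throughput(path, block_size=8 * 1024 * 1024):
    """Sequentially read the whole file in large blocks.

    Returns (bytes_read, seconds, MB/s) as a crude measure of how fast
    the drive can feed data to a single reader.
    """
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            total += len(block)
    elapsed = time.perf_counter() - start
    mb_per_s = total / elapsed / 1e6 if elapsed > 0 else 0.0
    return total, elapsed, mb_per_s
```

If this raw read approaches the drive's rated speed while hashcat's caching of the same file runs far slower with one core pegged at 100%, that supports the conclusion that the caching step is CPU-bound rather than disk-bound.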