Copy and reuse dictionary cache

08-16-2020, 12:58 AM
So this is a limitation of hashcat itself: it uses only one core to cache the dictionary. I hope an update will introduce the ability to cache multiple files at the same time, or to use multiple cores.
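To make the idea concrete, here is a minimal sketch (not hashcat's actual code, and the file names are hypothetical placeholders) of handling several wordlists concurrently, one worker process per file:

```python
#!/usr/bin/env python3
"""Illustration only: process several wordlists in parallel.

This is NOT hashcat code; it only sketches the idea of using one worker
per file so that building per-file statistics is not limited to one core.
The file names below are hypothetical placeholders.
"""
from multiprocessing import Pool
import os

WORDLISTS = ["rockyou.txt", "crackstation.txt", "custom.txt"]  # placeholders

def scan_wordlist(path):
    """Read one wordlist sequentially and collect simple stats
    (roughly what a dictionary cache needs: line count and file size)."""
    lines = 0
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):  # 1 MiB chunks
            lines += chunk.count(b"\n")
    return path, lines, os.path.getsize(path)

if __name__ == "__main__":
    with Pool(processes=len(WORDLISTS)) as pool:
        for path, lines, size in pool.imap_unordered(scan_wordlist, WORDLISTS):
            print(f"{path}: {lines} lines, {size} bytes")
```

Whether something like this actually helps depends on whether the drive can serve several sequential readers at once, which is exactly the point raised in the next reply.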
08-16-2020, 12:43 PM
Simply adding multiprocessing to that step won't do the trick. The bottleneck is likely not the CPU but your hard drive, from which the wordlist has to be read in full.
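A quick way to check which side is the limit, as a rough sketch (the path is a placeholder), is to time a raw sequential read of the wordlist and compare it with how long hashcat's caching step takes:

```python
#!/usr/bin/env python3
"""Rough check of raw sequential read speed for a wordlist.

If this finishes much faster than hashcat's caching step while a single
CPU core sits at 100%, that step is CPU-bound; if the times are similar,
the disk is the limit. The path below is a hypothetical placeholder.
"""
import time

PATH = "wordlist.txt"        # placeholder path to the large wordlist
CHUNK = 8 * 1024 * 1024      # 8 MiB reads

start = time.monotonic()
total = 0
with open(PATH, "rb") as fh:
    for chunk in iter(lambda: fh.read(CHUNK), b""):
        total += len(chunk)
elapsed = time.monotonic() - start

print(f"read {total / 1e9:.1f} GB in {elapsed:.1f} s "
      f"({total / 1e6 / elapsed:.0f} MB/s)")
```

Keep in mind the OS page cache can make a repeated run look unrealistically fast, so measure with a cold cache for a fair comparison.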
08-16-2020, 09:43 PM
(08-16-2020, 12:43 PM)undeath Wrote: Simply adding multiprocessing to that step won't do the trick. The bottleneck is likely not the CPU but your hard drive, from which the wordlist has to be read in full.

I'm using an NVMe drive, which is fast (around 600 MB/s), and the CPU is saturated at 100% usage on a single core. If the NVMe were being fully utilized, hashcat would finish caching the 20 GB file in less than a minute, but that is not happening and the disk is not fully utilized.
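For reference, the back-of-the-envelope calculation behind "less than a minute", using the figures quoted above (20 GB file, roughly 600 MB/s sustained read):

```python
# Rough check of the "less than a minute" claim, using the figures
# quoted above (20 GB file, ~600 MB/s sustained read). Both numbers
# are taken from the post, not measured here.
file_size_gb = 20
read_speed_mb_s = 600
seconds = file_size_gb * 1000 / read_speed_mb_s
print(f"expected disk-bound caching time: ~{seconds:.0f} s")  # ~33 s
```

So a purely disk-bound pass over the file would take roughly half a minute; if the caching step takes much longer than that while one core sits at 100%, the disk is not the limiting factor.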