07-22-2022, 05:49 PM
(07-22-2022, 05:36 PM)Chick3nman Wrote: Please read the errors in your output.
Code:
Bitmap table overflowed at 18 bits.
This typically happens with too many hashes and reduces your performance.
You can increase the bitmap table size with --bitmap-max, but
this creates a trade-off between L2-cache and bitmap efficiency.
It is therefore not guaranteed to restore full performance.
Hmmm, I've read that and already halved the number of hashes down from 10,000,000. What is an appropriate number of hashes to process at a time? And how do you still process all of them? Do you just split the file and write a bash script that runs hashcat on each chunk separately (something like the sketch below), or is there a more elegant solution?
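For illustration, this is roughly what I have in mind; the hash mode (-m 0), wordlist name, and chunk size are just placeholders for my setup:

Code:
#!/bin/bash
# Rough sketch: split a large hash list into chunks and run hashcat on each.
# -m 0, wordlist.txt, and the 5M-line chunk size are placeholders.

split -l 5000000 all_hashes.txt chunk_

for f in chunk_*; do
    hashcat -m 0 -a 0 "$f" wordlist.txt -o "cracked_${f}.txt"
done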