Large SHA-1 hashfile
#3
(07-22-2022, 05:36 PM)Chick3nman Wrote: Please read the errors in your output.


Code:
Bitmap table overflowed at 18 bits.
This typically happens with too many hashes and reduces your performance.
You can increase the bitmap table size with --bitmap-max, but
this creates a trade-off between L2-cache and bitmap efficiency.
It is therefore not guaranteed to restore full performance.

Hmmm, I've read that and have already halved the number of hashes, down from 10,000,000. What is an appropriate number of hashes to process at a time? And how do you still get through all of them? Do you just split the file and write a bash script that runs hashcat on each chunk separately, or is there a more elegant solution?
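To make it concrete, something like the sketch below is what I had in mind. The filenames, the 5,000,000-line chunk size, and the -m 100 / -a 0 options are just placeholders for my setup, not a recommendation:

Code:
# Split the hash list into chunks of 5,000,000 lines each
# (the chunk size is a guess; pick whatever keeps the bitmap warning away).
split -l 5000000 hashes.txt chunk_

# Run hashcat once per chunk; cracked results accumulate in the potfile as usual.
for f in chunk_*; do
    hashcat -m 100 -a 0 "$f" wordlist.txt
done

The other route, going by the warning itself, would be raising the bitmap size above the 18 bits it mentions (e.g. --bitmap-max 24), accepting the L2-cache trade-off it describes.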

