Large SHA-1 hashfile - Printable Version

+- hashcat Forum (https://hashcat.net/forum)
+-- Forum: Support (https://hashcat.net/forum/forum-3.html)
+--- Forum: hashcat (https://hashcat.net/forum/forum-45.html)
+--- Thread: Large SHA-1 hashfile (/thread-10889.html)
Large SHA-1 hashfile - miles387 - 07-22-2022

Hello everyone, I'm pretty new to this, so I have a few questions about large hashfiles and how they slow down hashcat. I'm getting nowhere near the performance I see with ./hashcat --benchmark.

I have a hashfile with 5,000,000 SHA-1 hashes and a wordlist with 1,000,000,000 words, and I use the best64 rule.

Command:

Code:
./hashcat -a 0 -O -m 100 ".\hashes\5mil_small.hashes" ".\wordlists\generated_pass_1bil.dict" -r .\rules\best64.rule -o ".\cracked\passGAN_1Bil_small.cracked" --status --status-timer 10

In my benchmark I get a SHA-1 speed of 22237 MH/s; in my actual run I get about 3897 kH/s, which strangely increases over time. In Task Manager there is nearly no load on my GPU. Do you have any idea what my problem could be and how to solve it? Thanks in advance.

This is the output when starting hashcat with the command shown above.

Code:
hashcat (v6.2.5) starting

RE: Large SHA-1 hashfile - Chick3nman - 07-22-2022

Please read the errors in your output.

Code:
Bitmap table overflowed at 18 bits.

RE: Large SHA-1 hashfile - miles387 - 07-22-2022

(07-22-2022, 05:36 PM)Chick3nman Wrote: Please read the errors in your output.

Hmm, I've read that, and I already halved the hash count down from 10,000,000. What is an appropriate number of hashes to process at a time? And how do you still process all of them? Do you just split the file and write a bash script that runs hashcat on each chunk separately, or is there a more elegant solution?
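A minimal sketch of the split-and-loop approach asked about above, assuming a POSIX shell and GNU split are available; the chunk size of 1,000,000 lines and the chunk_ file prefix are arbitrary choices, while the hashcat flags and paths are reused from the original command (with forward slashes for the shell).

Code:
#!/bin/bash
# Sketch only: split the hash list into chunks of 1,000,000 lines
# (chunk size is an assumption, not a recommendation) and run hashcat
# on each chunk with the same options as the original command.
split -l 1000000 ./hashes/5mil_small.hashes ./hashes/chunk_

for f in ./hashes/chunk_*; do
    ./hashcat -a 0 -O -m 100 "$f" \
        ./wordlists/generated_pass_1bil.dict \
        -r ./rules/best64.rule \
        -o ./cracked/passGAN_1Bil_small.cracked \
        --status --status-timer 10
done

With fewer hashes loaded per run, the bitmap table should no longer overflow, which is the warning Chick3nman pointed to. hashcat appends newly cracked hashes to the outfile, so the results from all chunks accumulate in one file.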