Large dictionary
#1
Let's say I have a 10 GB wordlist but only 8 GB of RAM. Does hashcat split the wordlist automatically, or do I have to do it manually?
#2
I'm asking this because the Hashcat-utils docs, under the splitlen section, say that "this optimization is no longer needed by modern hashcat."
#3
Hashcat doesn't load the wordlist into memory all at once; it streams candidates from disk, so even a 100 GB wordlist will work just fine. Only if you have a huge number of rules (millions) could it start filling up your VRAM, but that has nothing to do with wordlist size.
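For intuition, here's a minimal Python sketch of the streaming idea: candidates are read one buffered line at a time, so memory use stays constant regardless of file size. The path and function name are illustrative, not part of hashcat itself.

```python
# Sketch: stream a wordlist line by line instead of loading it into RAM.
# "wordlist.txt" is a placeholder path, not from this thread.
def stream_candidates(path):
    with open(path, "rb") as f:       # binary mode: wordlists may contain non-UTF-8 bytes
        for line in f:                # the file object buffers and yields one line at a time
            yield line.rstrip(b"\r\n")

# Usage: count candidates in a file of any size with near-constant memory.
total = sum(1 for _ in stream_candidates("wordlist.txt"))
print(f"{total} candidates")
```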