Large dictionary
12-04-2021, 12:41 AM
Let's say I have a 10 GB wordlist but only 8 GB of RAM. Does hashcat split the wordlist automatically, or do you have to do it manually?
12-04-2021, 01:18 AM
I'm asking because the hashcat-utils docs, in the splitlen section, say that "this optimization is no longer needed by modern hashcat."
12-04-2021, 01:27 PM
Hashcat doesn't load the wordlist into memory; it streams candidates from disk, so even a 100 GB wordlist will work just fine. Only if you have a huge number of rules (millions) could it start filling up your VRAM, but that has nothing to do with wordlist size.
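Just to illustrate the streaming idea (not hashcat's actual code): reading a file line by line keeps memory use roughly constant no matter how large the file is. A minimal Python sketch, where the file name and the per-line work are placeholder assumptions:

# Sketch: process a large wordlist without loading it all into RAM.
# "wordlist.txt" is a placeholder; replace the counter with real work.
count = 0
with open("wordlist.txt", "rb") as f:
    for line in f:                      # the file object yields one line at a time,
        word = line.rstrip(b"\r\n")     # buffered internally in small chunks
        count += 1                      # per-candidate work would go here
print(f"processed {count} candidates with constant memory use")

Even on an 8 GB machine this handles a 10 GB (or 100 GB) file, because only the current buffer is ever resident.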