Large dictionary - hashcat Forum (https://hashcat.net/forum) - Thread: Large dictionary (/thread-10497.html)
Large dictionary - maxspark - 12-04-2021

Let's say I have a 10 GB wordlist but only 8 GB of RAM. Does hashcat split the wordlist automatically, or do you have to do it manually?

RE: Large dictionary - maxspark - 12-04-2021

I'm asking because the hashcat-utils docs, in the splitlen section, say that "this optimization is no longer needed by modern hashcat."

RE: Large dictionary - penguinkeeper - 12-04-2021

Hashcat doesn't load the wordlist into memory all at once; it streams candidates from disk, so even a 100 GB wordlist will still work just fine. Only a huge number of rules (millions) could start filling up your VRAM, and that has nothing to do with wordlist size.
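To illustrate the streaming idea penguinkeeper describes, here is a minimal C sketch. It is not hashcat's actual code, just an assumed-simple demonstration of reading a wordlist through one small fixed-size buffer so that memory use stays constant no matter how large the file is:

/* Sketch only, not hashcat internals: stream a wordlist line by
 * line through a fixed-size buffer instead of loading the whole
 * file into RAM. */
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    if (argc != 2)
    {
        fprintf(stderr, "usage: %s <wordlist>\n", argv[0]);
        return 1;
    }

    FILE *fp = fopen(argv[1], "rb");
    if (fp == NULL)
    {
        perror("fopen");
        return 1;
    }

    char buf[4096];            /* only this buffer is ever resident */
    unsigned long long count = 0;

    while (fgets(buf, sizeof(buf), fp) != NULL)
    {
        buf[strcspn(buf, "\r\n")] = '\0';  /* strip the line ending */

        /* a real cracker would hash the candidate here; we just
         * count lines to keep the sketch self-contained */
        count++;
    }

    fclose(fp);
    printf("%llu candidates streamed\n", count);
    return 0;
}

Compiled with something like "cc -o stream stream.c" (the file name is hypothetical), this will walk a wordlist far larger than available RAM, since at any moment only one 4 KB buffer holds file data. That is why a 10 GB wordlist on an 8 GB machine is not a problem and manual splitting is unnecessary.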