How does hashcat deal with a list of hashes?
At the end of the day, all hashcat does is essentially this (in Python):

Code:
import hashlib

list_of_hash = []  # the target hashes, loaded beforehand from the hash file
with open("wordlist.txt", "r") as f:
    for line in f:
        word = line.rstrip("\n")
        calculated = hashlib.md5(word.encode()).hexdigest()
        for h in list_of_hash:
            if h == calculated:
                print(f"{h}:{word}")

My question is about this list_of_hash. Let's say I have 100k unsalted MD5 hashes that I want to crack. Which is the better option for the program:
  • Load all hashes into memory
  • Load a chunk of hashes at a time. In this mode it would run every line of the wordlist against the first 10k hashes, then load another 10k and rerun the process (see the sketch below)
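
To make the chunked option concrete, here is roughly what I picture it doing (the 10k chunk size, the file names, and the helper name are all just my own illustration, not anything taken from hashcat itself):

Code:
import hashlib
from itertools import islice

CHUNK_SIZE = 10_000  # made-up chunk size, purely for illustration

def crack_in_chunks(hash_file, wordlist):
    with open(hash_file, "r") as hf:
        while True:
            # pull the next chunk of target hashes into a set
            chunk = set(h.strip() for h in islice(hf, CHUNK_SIZE))
            if not chunk:
                break
            # rerun the whole wordlist against this chunk
            with open(wordlist, "r") as wf:
                for line in wf:
                    word = line.rstrip("\n")
                    digest = hashlib.md5(word.encode()).hexdigest()
                    if digest in chunk:
                        print(f"{digest}:{word}")

crack_in_chunks("hashes.txt", "wordlist.txt")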
At what point does it become too slow to load all hashes into memory? Should I want the program to load all the hashes, or can that actually make it slower? What chunk size is best, and where can I configure it?
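
For scale, my back-of-the-envelope math: 100,000 raw MD5 digests are 100,000 × 16 bytes ≈ 1.6 MB (roughly 3.2 MB stored as 32-character hex strings), which sounds tiny next to typical host or GPU memory, but I don't know where the real cutoff is.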

PS: All these questions are only about fast, unsalted hashes.

