07-20-2023, 03:38 AM
At the end of the day, all hashcat does is just this (in python):
Code:
import hashlib

list_of_hash = []  # the target hashes, loaded beforehand

with open("wordlist.txt", "r") as f:
    for line in f:
        word = line.rstrip("\n")
        calculated = hashlib.md5(word.encode()).hexdigest()
        for h in list_of_hash:
            if h == calculated:
                print(f"{h}:{word}")
My question is about this list_of_hash. Let's say I have 100k unsalted MD5 hashes that I want to crack. Which approach is better for the program:
- Load all hashes into memory
- Load chunks of hashes each time. In this mode it would run all lines of the wordlist against the first 10k hashes, then load another 10k and rerun the process
PS: All these questions are only about fast unsalted hashes.
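For what it's worth, 100k hex digests are only a few megabytes, so loading them all into memory is cheap. The bigger win is using a set instead of a list, so each candidate is checked in O(1) on average instead of scanning all 100k hashes. A minimal sketch (the sample hashes and wordlist here are made up for illustration):

```python
import hashlib

# Hypothetical stand-in for the 100k-hash file: a set of target digests.
targets = {hashlib.md5(w.encode()).hexdigest() for w in ["password", "letmein"]}

# Hypothetical stand-in for wordlist.txt.
wordlist = ["hello", "password", "dragon", "letmein"]

cracked = []
for word in wordlist:
    digest = hashlib.md5(word.encode()).hexdigest()
    if digest in targets:  # O(1) average set lookup instead of a list scan
        cracked.append(f"{digest}:{word}")

for entry in cracked:
    print(entry)
```

With a set, the wordlist is read exactly once no matter how many target hashes there are, which is why chunking the hashes would only add extra passes over the wordlist.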