05-17-2024, 12:34 PM
I'm trying to figure out how hashcat handles duplicate hashes, such as NTLMv2 (mode 5600), when you have, say, 2500+ hashes for only 12 unique users. My understanding was always that the GPUs handle the algorithm and hash the candidates, while the CPU takes over for comparing them against the loaded hashes. So to me, having duplicates (as long as it isn't an excessive amount, like several hundred thousand) shouldn't really slow down progress; I'm currently hitting speeds of around 3300-3400 MH/s. Even if the GPU is doing all the work, duplicates shouldn't slow the speeds down by much.
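To make my mental model concrete, here's a rough Python sketch of what I assumed is going on (the names and the md5 stand-in are mine, not hashcat's actual code): if the loaded hashes get deduplicated into a set, each candidate is hashed once and compared with a single lookup, so duplicates would be nearly free.

```python
import hashlib

# A minimal sketch of my mental model, NOT hashcat internals. hashlib.md5 is a
# stand-in for the real NTLMv2 computation (which is actually an HMAC-MD5 over
# per-session challenge data, i.e. effectively salted).

# 2664 loaded hash strings collapse into a much smaller set of unique targets.
loaded_hashes = ["5f4dcc3b5aa765d61d8327deb882cf99"] * 2664  # duplicates
targets = set(loaded_hashes)  # deduplicated; len(targets) == 1 here

def check_candidate(password: str) -> bool:
    # "GPU" step: hash the candidate once, regardless of how many
    # duplicate target hashes were loaded.
    digest = hashlib.md5(password.encode()).hexdigest()
    # "CPU" step: one O(1) set lookup per candidate, so duplicate
    # entries add no per-candidate comparison cost.
    return digest in targets

print(check_candidate("password"))  # True: md5("password")
```

Of course, if hashcat doesn't actually work this way for 5600 (e.g. if each NTLMv2 entry counts as its own salt and has to be attacked separately), that could explain my friend's results.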
However, my friend tried to disprove me, and his argument looks pretty convincing. I wanted to see whether his test is flawed, or whether someone who knows the platform better could help me understand what is happening under the hood.
I have attached two images: one run with 2664 hashes for the 12 unique users, and one with only a single hash for each of the 12 unique users.