Posts: 10
Threads: 3
Joined: Feb 2018
Hello,
Just wondering if anyone thought this would be a good idea. Normally, when I have an NTLM or LM hash to crack, I'll hit the web first and search a couple of websites that have databases with well over a trillion known hashes.
What do you guys think of hashcat having an option to check these websites before attempting to crack a hash, to see if it has already been cracked, and, if it hasn't and someone then cracks it successfully, to submit the result to the website?
Just a thought.
Thanks!
Posts: 13
Threads: 4
Joined: Jan 2017
Actually, there exists a tool for this purpose:
https://github.com/kangfend/md5crack
Not sure if it is necessary to integrate it into hashcat.
Also, keep in mind that probably not all cracking rigs running hashcat are directly connected to the Internet (IMHO).
I would recommend writing a simple script that first queries popular sites, like the one I linked above, and then starts hashcat with the specific options you require.
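Something like this, as a rough sketch in Python (the lookup URL and its response format here are made up, so adapt them to whichever site's API you actually use):

```python
#!/usr/bin/env python3
"""Check an online hash database first, then fall back to hashcat.

Minimal sketch only: the lookup endpoint and its query/response format
are hypothetical; substitute the real API of whichever site you use.
"""
import subprocess
import sys

import requests


def lookup_online(hash_value):
    """Return the plaintext if the (hypothetical) service knows it, else None."""
    try:
        resp = requests.get("https://example.com/api/lookup",
                            params={"hash": hash_value}, timeout=10)
        if resp.ok and resp.text.strip():
            return resp.text.strip()
    except requests.RequestException:
        pass  # offline rig or site down: just fall through to hashcat
    return None


def main():
    with open(sys.argv[1]) as f:
        hashes = [line.strip() for line in f if line.strip()]

    remaining = []
    for h in hashes:
        plain = lookup_online(h)
        if plain:
            print(f"{h}:{plain}")
        else:
            remaining.append(h)

    if remaining:
        with open("remaining.txt", "w") as f:
            f.write("\n".join(remaining) + "\n")
        # -m 1000 (NTLM) and the wordlist are placeholders; pass your own options.
        subprocess.run(["hashcat", "-m", "1000", "remaining.txt", "wordlist.txt"])


if __name__ == "__main__":
    main()
```

Run it as `python wrapper.py hashes.txt`: anything the site already knows gets printed, and the rest is handed to hashcat as usual.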
Posts: 10
Threads: 3
Joined: Feb 2018
(03-31-2018, 07:13 AM)xkcd3301 Wrote: Actually, there exists a tool for this purpose:
https://github.com/kangfend/md5crack
Not sure if it is necessary to integrate it into hashcat.
Also, keep in mind that probably not all cracking rigs running hashcat are directly connected to the Internet (IMHO).
I would recommend writing a simple script that first queries popular sites, like the one I linked above, and then starts hashcat with the specific options you require.
See, I could do that, or modify the source code of hashcat to check the various websites, but the whole idea was for everyone to benefit; some of these websites have a very, very large number of known hashes. It'd be nice if we could fill in the missing ones. Every time someone successfully cracked a hash, it would be added to a website. Eventually, there wouldn't be much cracking left to do, or people would be using much, much stronger passwords, I guess.
Whenever I successfully crack one that isn't in the HashKiller database (or whatever it is), I submit it. But I don't think anyone there actually adds them; I still check the database for the hashes I submitted, and nada.
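For what it's worth, the submission side would be easy to script too. Here's a rough sketch that reads hashcat's potfile (one hash:plain pair per line); the submit endpoint and payload are made up, since every site has its own upload process:

```python
#!/usr/bin/env python3
"""Submit cracked hashes from hashcat's potfile.

Sketch only: SUBMIT_URL and its payload format are hypothetical, and a
real site would probably want batched uploads rather than single posts.
"""
import requests

SUBMIT_URL = "https://example.com/api/submit"  # placeholder
POTFILE = "hashcat.potfile"  # location depends on your hashcat install


def main():
    with open(POTFILE) as f:
        for line in f:
            line = line.strip()
            if ":" not in line:
                continue
            h, plain = line.split(":", 1)
            try:
                requests.post(SUBMIT_URL, data={"hash": h, "plain": plain},
                              timeout=10)
            except requests.RequestException:
                print(f"failed to submit {h}")


if __name__ == "__main__":
    main()
```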
Thanks for the link.
Posts: 2,301
Threads: 11
Joined: Jul 2010
It's a nice idea for an external tool but definitely should not end up in hashcat.
Posts: 10
Threads: 3
Joined: Feb 2018
(04-01-2018, 10:31 AM)undeath Wrote: It's a nice idea for an external tool but definitely should not end up in hashcat.
Is it because it'd be dealing with third-party websites, or because hashcat supports so many different algorithms that it'd be really hard to maintain? A site goes down and the code has to be changed; a new algorithm gets added to hashcat and you've got to find a site that collects those hashes, if any even exist?
And on top of that, the websites' bandwidth could get hammered?
Or do you have other ideas as to why it shouldn't end up in hashcat?
Posts: 2,301
Threads: 11
Joined: Jul 2010
hashcat is a specialised utility for cracking hashes quickly and efficiently. Querying third-party websites is totally out of scope and would require lots of additional maintenance, too.