I've generated two hashes myself using SHA1 with NO salt value and put them in a file:
40e5cdd056f635757c9df10b27d0e12ffd30c4db
45777ceee45c74daca22edfd44c94cb92a53de7e
I then created a table file with the passwords these hashes were created from:
12345
123456
But hashcat never cracks them. I must not be choosing the right options. What am I doing wrong here?
The command line the GUI generates is: hashcat-cli64.exe --hash-mode 100 --output-file D:/hashes2-recovered.txt D:/hashes2.txt D:\dictionaries
One thing I noticed is that the output file and input files are using forward slashes instead of backslashes.
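For anyone hitting the same wall: before blaming the options, it's worth recomputing the hashes yourself and checking that the hash file and the wordlist actually correspond. A minimal Python sketch (assuming unsalted SHA1 over the UTF-8 bytes of each word, which is what hash mode 100 expects):

```python
import hashlib

# Candidate plaintexts from the wordlist file
words = ["12345", "123456"]

# Targets from the hash file
targets = {
    "40e5cdd056f635757c9df10b27d0e12ffd30c4db",
    "45777ceee45c74daca22edfd44c94cb92a53de7e",
}

for word in words:
    digest = hashlib.sha1(word.encode("utf-8")).hexdigest()
    status = "MATCH" if digest in targets else "no match"
    print(f"{word}: {digest} ({status})")
```

Run against the values in this thread, neither digest matches the targets, which confirms the hashes were not produced by plain SHA1 over the UTF-8 plaintext.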
01-19-2012, 04:07 PM (This post was last modified: 01-19-2012, 04:32 PM by strandedpirate.)
These are generated using C# with the code below. I attached a screenshot too. I can't get my code to generate a string matching the ones you posted. I think the issue is encoding; I've tried all the available encodings in the framework. It seems that hashcat assumes a specific encoding and any other encoding will cause it to fail. Can you post the source code for the way you're generating your hashes?
After a lot of testing, this is either a text encoding problem or a logic mismatch between the SHA1 implementations on Windows and Linux.
On Windows, the strings I've posted are valid and confirmed by many other SHA1 applications I've downloaded and tested.
I hope it's just a text encoding issue, but I'm not certain how, or whether it's even possible, to map the Windows-1252 text encoding to match the Linux encoding that this program can digest. I looped through every single encoding on Windows 7, roughly 80 different ones, and none of them remotely matched the strings being generated on your linux/unix boxes for the value '12345'.
Linux - 8cb2237d0679ca88db6464eac60da96345513964
Windows 7 (UTF8) - 1cba6360d8b03617fb7b33443596691b6e90006c
Windows 7 (UTF16) - 40e5cdd056f635757c9df10b27d0e12ffd30c4db
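SHA1 is defined over bytes, not characters, so the encoding applied before hashing changes the digest completely. One thing worth noting, though: for a pure-ASCII string like '12345', UTF-8 and Windows-1252 produce identical bytes, so if looping through encodings never reproduced the Linux value, the culprit is likely something other than the encoding. A quick Python illustration of the effect:

```python
import hashlib

password = "12345"

# The same string, encoded three ways, yields different byte
# sequences for UTF-16 but identical ones for UTF-8 and cp1252
# (Windows-1252) because '12345' is pure ASCII.
for enc in ("utf-8", "cp1252", "utf-16-le"):
    data = password.encode(enc)
    print(enc, data.hex(), hashlib.sha1(data).hexdigest())
```

The utf-8 and cp1252 lines both print 8cb2237d0679ca88db6464eac60da96345513964, the same digest the Linux box produces; only the UTF-16 bytes diverge.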
there is no algorithm implementation difference between linux and windows. the thing is, passwords are usually not stored as utf8 / utf16. if you want to crack utf8 passwords you have to use hex-charsets, but i think this is not what you want to do.
I think you're basing those two statements on nothing. Both SQL Server and Oracle have nchar and nvarchar(2) columns, which are Unicode data types and are widely used at every company I've consulted for. .NET's algorithm is written by Microsoft, which is clearly not the same group that wrote the Linux version, so yes, there could be a small or even a large difference between the code bases.
And drum roll... I just found the answer. I was using a class called HMACSHA1 to generate the hashes. I found another SHA-1 class called SHA1Managed which DOES produce the exact same string your linux box is spitting out, and it is in UTF-8. So the bottom line is that hashcat knows nothing about HMACSHA1 yet.
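That outcome makes sense once you know what HMACSHA1 actually is: it computes a keyed MAC, not a plain hash, and .NET's parameterless HMACSHA1() constructor generates a random key, so its output can never line up with an unkeyed SHA1 digest. A rough Python illustration of the difference (the b"placeholder-key" value is just a stand-in, not anything from the thread):

```python
import hashlib
import hmac

msg = "12345".encode("utf-8")

# Plain SHA1: what SHA1Managed computes and what hash mode 100 expects.
plain = hashlib.sha1(msg).hexdigest()
# → 8cb2237d0679ca88db6464eac60da96345513964

# HMAC-SHA1 mixes a secret key into the computation; with a random key
# (as .NET's HMACSHA1() default does) the result changes per key and
# never matches the plain digest.
keyed = hmac.new(b"placeholder-key", msg, hashlib.sha1).hexdigest()

print("SHA1     :", plain)
print("HMAC-SHA1:", keyed)
```

Both are 40 hex characters long, which is why a keyed hash can silently masquerade as a plain SHA1 in a hash file.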
Whoever owns this codebase may want to head over to http://msdn.microsoft.com/en-us/library/...raphy.aspx and check out the various .NET cryptography classes. It would be great if hashcat handled the various hashing methods out of the box.