Limit on input-hashes?
#1
Hi,
I have a hash list with ~12000 lines (NTLM hashes). When I try to process the list, I get the following errors:

Code:
./cudaHashcat64.bin -a 0 -m 1000 /AUDITS/XXX/pwdump.txt /XXX/wordlist.txt
....

....
WARNING: Hashfile '/AUDITS/XXX/pwdump.txt' in line 11808 (): Line-length exception
WARNING: Hashfile '/AUDITS/XXX/pwdump.txt' in line 11809 (): Line-length exception
WARNING: Hashfile '/AUDITS/XXX/pwdump.txt' in line 11810 (): Line-length exception

If I copy some lines from the list (e.g. the last 1000 lines) into another file and use this new file as input, everything works fine.

Line format:
Code:
User:ID:LM-Hash:NTLM-Hash:::

The size of the input file is ~2MB. I use the following OS and GPU:

Code:
3.16.0-57-generic #77~14.04.1-Ubuntu SMP Thu Dec 17 23:20:00 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux

01:00.0 VGA compatible controller: NVIDIA Corporation GK106 [GeForce GTX 660] (rev a1)

Is there a way to process all 12000 hashes at once?

Thanks for your help in advance!
#2
Well, you need to check what exactly is within lines 11808, 11809 and 11810.
Are these empty lines?

It is just a warning that the hashes in lines 11808, 11809 and 11810 could not be parsed (probably because there are no valid hashes in those lines).

Did you double-check that?

No, there is no such small limit. Most of the time you will be limited by RAM etc., but we have successfully loaded over 100M MD5 hashes.
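If it helps, one quick way to see exactly what is in those lines is to hex-dump them and to list any lines that don't match the expected pwdump format (the path is the one from your post; the regex is only a rough sanity check):

```shell
# Print lines 11808-11810 and show their raw bytes:
sed -n '11808,11810p' /AUDITS/XXX/pwdump.txt | xxd | head

# List (with line numbers) any lines that do NOT look like a
# pwdump entry, i.e. user:id:32-hex-LM:32-hex-NTLM::: :
grep -nvE '^[^:]*:[^:]*:[0-9a-fA-F]{32}:[0-9a-fA-F]{32}:::' /AUDITS/XXX/pwdump.txt
```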
#3
(01-20-2016, 03:39 PM)philsmd Wrote: Well, you need to check what exactly is within lines 11808, 11809 and 11810.
Are these empty lines?

It is just a warning that the hashes in lines 11808, 11809 and 11810 could not be parsed (probably because there are no valid hashes in those lines).

Did you double-check that?

No, there is no such small limit. Most of the time you will be limited by RAM etc., but we have successfully loaded over 100M MD5 hashes.

Hi,
thanks for the quick reply. The listing shows only the last few lines as an example. They are not empty.

In fact, I get this error for every single line of the input file. But as I said, if I copy some lines into a new file, I don't get any errors. I just tested again using the last 17 lines (which include lines 11808, 11809 and 11810):

Code:
Session.Name...: cudaHashcat
Status.........: Exhausted
Input.Mode.....: File (/PATH/wordlist.txt)
Hash.Target....: File (/AUDITS/***/test.txt)
Hash.Type......: NTLM
Time.Started...: 0 secs
Time.Estimated.: 0 secs
Speed.GPU.#1...: 15630.9 kH/s
Recovered......: 0/17 (0.00%) Digests, 0/1 (0.00%) Salts
Progress.......: 11809/11809 (100.00%)
Rejected.......: 262/11809 (2.22%)
HWMon.GPU.#1...: -1% Util, 43c Temp, 10% Fan

Started: Wed Jan 20 14:55:09 2016
Stopped: Wed Jan 20 14:55:11 2016
#4
Update:
Could it be that there is a problem with the line endings (Unix vs. Windows)?

After I copy-pasted the whole file into a new file on a Linux machine, all 12000 lines were accepted.
#5
Thanks for the information.

But I am not able to reproduce this problem. Here it works with both line endings (\n and \r\n, i.e. 0a and 0d 0a).
Maybe there is something else strange with that output file...

It would be great and much appreciated if you could analyze this problem in more detail (maybe by hex dumping the file with xxd etc.). On my side I even tried with dos2unix, unix2dos etc.
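For reference, here is a quick way to check the line endings and strip any Windows carriage returns (the file name is just an example):

```shell
# Show the raw bytes of the first lines:
# 0d 0a at end of line = \r\n (Windows), plain 0a = \n (Unix)
xxd pwdump.txt | head -n 4

# Strip any carriage returns, writing a cleaned copy:
tr -d '\r' < pwdump.txt > pwdump_unix.txt
```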
If you identify a problem and know how we can reproduce it, please open an issue here: https://github.com/hashcat/oclHashcat/issues and provide an anonymized/masked example + description.
Thank you very much
#6
I extracted the hashes with Invoke-DCSync.ps1 (https://gist.github.com/monoxgas/9d238accd969550136db) and used
PowerShell's Out-File cmdlet to write the output to a text file. This text file was the problem.

After I opened the text file in Ubuntu's gedit and copy-pasted the content into a newly created text file, the new file worked with hashcat.

Hope that helps!
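This is consistent with Out-File's behavior: Windows PowerShell's Out-File writes "Unicode" (UTF-16LE) by default, which would explain why every single line triggered a length exception while a copy-paste into a new (UTF-8) file fixed it. A sketch of checking for this and converting the file on Linux (file names are examples):

```shell
# 'file' should reveal the encoding, e.g.
# "Little-endian UTF-16 Unicode text, with CRLF line terminators":
file pwdump.txt

# Convert to plain UTF-8 with Unix line endings:
iconv -f UTF-16 -t UTF-8 pwdump.txt | tr -d '\r' > pwdump_utf8.txt
```

Alternatively, re-export on the Windows side with Out-File -Encoding ascii (or utf8) to avoid the conversion step.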