01-20-2016, 03:31 PM
Hi,
I have a hash list with ~12000 lines (NTLM hashes). When I try to process the list, I get the following errors:
Code:
./cudaHashcat64.bin -a 0 -m 1000 /AUDITS/XXX/pwdump.txt /XXX/wordlist.txt
....
....
WARNING: Hashfile '/AUDITS/XXX/pwdump.txt' in line 11808 (): Line-length exception
WARNING: Hashfile '/AUDITS/XXX/pwdump.txt' in line 11809 (): Line-length exception
WARNING: Hashfile '/AUDITS/XXX/pwdump.txt' in line 11810 (): Line-length exception
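I have not dug into those specific lines yet; I assume something like this awk one-liner (line numbers taken from the warnings above) would show whether they are unusually long:
Code:
awk 'NR>=11808 && NR<=11810 { print NR ": " length($0) " chars" }' /AUDITS/XXX/pwdump.txt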
If I copy some lines from the list (e.g. the last 1000 lines) into another file and use that file as input, everything works fine.
Line format:
Code:
User:ID:LM-Hash:NTLM-Hash:::
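If it matters, I could strip the list down to just the NTLM field (the fourth colon-separated field) with something like the line below (ntlm-only.txt is just a made-up output name), but I would prefer to feed the pwdump file in as it is:
Code:
# keep only field 4, the NTLM hash; ntlm-only.txt is a placeholder name
cut -d: -f4 /AUDITS/XXX/pwdump.txt > /AUDITS/XXX/ntlm-only.txt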
The input file is ~2 MB in size. I use the following OS and GPU:
Code:
3.16.0-57-generic #77~14.04.1-Ubuntu SMP Thu Dec 17 23:20:00 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
01:00.0 VGA compatible controller: NVIDIA Corporation GK106 [GeForce GTX 660] (rev a1)
Is there a way to process all 12000 hashes at once?
Thanks for your help in advance!