hashcat Forum
Status: Exhausted? - Printable Version

+- hashcat Forum (https://hashcat.net/forum)
+-- Forum: Deprecated; Previous versions (https://hashcat.net/forum/forum-29.html)
+--- Forum: Old oclHashcat Support (https://hashcat.net/forum/forum-38.html)
+--- Thread: Status: Exhausted? (/thread-4264.html)



Status: Exhausted? - kitoliwa - 04-07-2015

I am having a problem with hashcat. I seem to be able to crack hashes only when there is a single hash in my hashes text file. If I put in multiple hashes, it will not crack any of them. I separate the hashes by putting each one on a new line, and I have generated the hashes myself; when I try cracking them separately, it works. So, what am I doing wrong here?

My set-up is:

hashes.txt
Wordlists\wordlist1.txt
cracked.txt

My run of the code is:

cudahashcat64.exe -m 0 -a 0 -o cracked.txt hashes.txt Wordlists\wordlist1.txt

UPDATE: I am now using Notepad++, but the same thing is still happening: no hashes are cracked, even when the passwords are in my wordlist.

Here is the cmd log:

C:\Hashcat>cudahashcat64.exe -m 0 -a 0 hashes.txt Wordlists\wordlist1.txt -o cracked.txt
cudaHashcat v1.35 starting...

Device #1: GeForce GTX 970M, 3072MB, 1038Mhz, 10MCU
Device #1: WARNING! Kernel exec timeout is not disabled, it might cause you errors of code 702
You can disable it with a regpatch, see here: http://hashcat.net/wiki/doku.php?id=timeout_patch

Hashes: 2 hashes; 2 unique digests, 1 unique salts
Bitmaps: 8 bits, 256 entries, 0x000000ff mask, 1024 bytes, 0/1 rotates
Rules: 1
Applicable Optimizers:
* Zero-Byte
* Precompute-Init
* Precompute-Merkle-Demgard
* Meet-In-The-Middle
* Early-Skip
* Not-Salted
* Not-Iterated
* Single-Salt
* Scalar-Mode
* Raw-Hash
Watchdog: Temperature abort trigger set to 90c
Watchdog: Temperature retain trigger set to 80c
Device #1: Kernel ./kernels/4318/m00000_a0.sm_52.64.ptx

Cache-hit dictionary stats Wordlists\wordlist1.txt: 174 bytes, 20 words, 20 keyspace

ATTENTION!
The wordlist or mask you are using is too small.
Therefore, oclHashcat is unable to utilize the full parallelization power of your GPU(s).
The cracking speed will drop.
Workaround: https://hashcat.net/forum/thread-4161.html

Session.Name...: cudaHashcat
Status......: Exhausted
Input.Mode.....: File (Wordlists\wordlist1.txt)
Hash.Target....: File (hashes.txt)
Hash.Type......: MD5
Time.Started...: 0 secs
Time.Estimated.: 0 secs
Speed.GPU.#1...: 0 H/s
Recovered......: 0/2 (0.00%) Digests, 0/1 (0.00%) Salts
Progress......: 20/20 (100.00%)
Skipped......: 0/20 (0.00%)
Rejected......: 0/20 (0.00%)
HWMon.GPU.#1...: 0% Util, 50c Temp, N/A Fan

Started: Wed Apr 08 14:27:30 2015
Stopped: Wed Apr 08 14:27:32 2015


RE: Status: Exhausted? - epixoip - 04-07-2015

hashed with newline char, trailing space in wordlist, wrong capitalization. could be any number of factors.
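Any of those factors changes the digest completely. A minimal sketch in Python (not from the thread) showing how a stray newline, trailing space, or different capitalization of the same plaintext produces unrelated MD5 values, so the wordlist entry can never match:

```python
import hashlib

# The same base word with small variations: each one hashes to a
# completely different MD5 digest, which is why a hash generated
# from "password\n" is never matched by the wordlist entry "password".
for candidate in ["password", "password\n", "password ", "Password"]:
    digest = hashlib.md5(candidate.encode()).hexdigest()
    print(repr(candidate), digest)
```

Checking the hash file and wordlist in a hex editor (or an editor that shows line endings) is the usual way to rule these out.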


RE: Status: Exhausted? - kitoliwa - 04-08-2015

(04-07-2015, 05:34 AM)epixoip Wrote: hashed with newline char, trailing space in wordlist, wrong capitalization. could be any number of factors.

I've updated my post with more information. Thank you.


RE: Status: Exhausted? - atom - 04-08-2015

But how can we verify that your artificial data is correct if you do not post it? If you really want our help, think about what we need to reproduce your case. What about the plaintexts, what about the hashes, how did you generate them, etc.?


RE: Status: Exhausted? - undeath - 04-08-2015

(04-08-2015, 10:59 AM)atom Wrote: What about the plaintexts, what about the hashes, how did you generate them, etc

On other threads people get banned for posting hashes without explicitly being asked to do so. Just saying.


RE: Status: Exhausted? - epixoip - 04-08-2015

There is an issue with -m 0 in cudaHashcat 1.35 and that's likely why it isn't finding the passwords.


RE: Status: Exhausted? - kitoliwa - 04-09-2015

(04-08-2015, 05:09 PM)epixoip Wrote: There is an issue with -m 0 in cudaHashcat 1.35 and that's likely why it isn't finding the passwords.

Yes, you are right. I am happy to report that I've downgraded to 1.31 and it all works now.

Thanks for all your help.