Resume is broken
#1
(I remade this post, so hopefully my old confusing one will be gone soon; I have asked an admin to delete it.)

So I am using the latest version of The-Distribution-Which-Does-Not-Handle-OpenCL-Well (Kali), which is fully up to date and running the Nvidia drivers.

I have my password file, which is 1 TB, and my .hccap file stored on an external 4 TB USB drive.

The command I use is hashcat -w 3 -m 2500 out.hccap.hccap  zyzyzyzy-abababab.txt

when I am in the external HD's directory:

/media/root/Zim

The hashcat command then starts and the output is:

 Generating dictionary stats for zyzyzyzy-abababab.txt: 154010289600 bytes (14.92%), 17112254400 words,

until it reaches 100%, at which point it starts cracking.


Session.Name...: hashcat
Status.........: Running
Input.Mode.....: File (zyzyzyzy-abababab.txt)
Hash.Target....: VM900358-2G (e8:fc:af:26:e7:18 <-> 60:fe:1e:20:1c:ab)
Hash.Type......: WPA/WPA2
Time.Started...: Fri Jul 15 15:13:48 2016 (2 mins, 33 secs)
Time.Estimated.: Thu Jul 28 00:35:02 2016 (12 days, 9 hours)
Speed.Dev.#1...: 107.1 kH/s (15.21ms)
Recovered......: 0/1 (0.00%) Digests, 0/1 (0.00%) Salts
Progress.......: 18743296/114661785600 (0.02%)
Rejected.......: 0/18743296 (0.00%)
Restore.Point..: 18743296/114661785600 (0.02%)

[s]tatus [p]ause [r]esume [b]ypass [c]heckpoint [q]uit =>

When I hit the c key, it quits and saves the data.
I go to /root/.hashcat and copy it, saving it to a folder on my external HD.
Then I type in hashcat --resume
and it goes back to cracking.
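
A minimal sketch of that backup-and-resume step, assuming the default session name and that the saved state (the .restore and .dictstat files mentioned later in the thread) lives in /root/.hashcat as described; the backup destination under /media/root/Zim is only illustrative, and hashcat 3.x exposes the resume step via its --restore option:

cp -r /root/.hashcat /media/root/Zim/hashcat-backup   # back up the folder holding hashcat.restore and hashcat.dictstat
hashcat --restore                                     # resume the default "hashcat" session from its .restore file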


I repeat the same procedure over and over, but sometimes it goes straight back to

Generating dictionary stats for zyzyzyzy-abababab.txt: 154010289600 bytes (14.92%), 17112254400 words,


from 0%, even if I delete the .hashcat folder and replace it with a backup that works.
Because it is a 1 TB file, this can take hours, and I don't think it resumes from where it left off.
It just leaves me cracking from 0% again, which is annoying, as I need 6 days to crack my WPA key.

I have noticed that since I uninstalled hashcat 2 and installed hashcat 3, which was downloaded to my external drive, hashcat runs from a folder in /usr/local/bin and not from the Hashcat-3.00 folder on my external drive.

So can anyone help me please, or will I have to split my password file into chunks that take 8 hours each?

Thanks.
#2
Hashcat will recognize if a wordlist has changed (or was recreated). That seems to be what is happening in your case, so it is correct for it to start from 0: if the wordlist changes, the keyspace is modified. Don't confuse the .dictstat file with the .restore file.
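
One way to check this, as a minimal sketch: assuming (and this is an assumption about how the .dictstat cache is keyed) that the cached dictionary stats are matched against the wordlist's stat metadata, you can compare those fields before and after unplugging or remounting the external drive:

stat -c 'dev=%d ino=%i size=%s mtime=%Y' /media/root/Zim/zyzyzyzy-abababab.txt   # print the stat fields a cache entry might key on

If the device or inode numbers differ between mounts even though the file contents are identical, the cached entry would no longer match and the stats would be regenerated (again, an assumption, not confirmed behaviour).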
#3
Is this because it's on an external drive?
The wordlist is read-only, and I have never edited it or changed its location.
All files remain in the same place; for some reason there is just a reset.

So why would hashcat think the wordlist has changed?
Thanks
#4
So in theory it should continue from where I left off?
A 1 TB file takes hours to calculate the dictionary stats; it's a pain in the ass.
I'm at 50%, so I shall see what happens soon; it should go straight to 10% cracked.
#5
Sorry about all my questions, but without editing or changing the password dictionary file, what would make hashcat think it has been modified or recreated?
Edited to add: I only replace the .hashcat folder when the dictionary-stats generation starts again for no reason; otherwise I just quit with c, back up, and resume.
Most of the time it works.
#6
It came back at 6%, which I think means it resumed from an old hashcat.restore file, so I replaced it with an updated 10% file,
and I'm now back to processing the dictionary file.

It would have been faster just to let it run, as it takes about 4-8 hours just to start cracking because of the size of my dictionary file.
#7
The restore system is fine as it is and works well. It would be better not to use wordlist files that obviously contain a brute-force (BF) keyspace. This is the root of your problem.
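
To illustrate that advice: if the 1 TB list really is just every candidate from abababab to zyzyzyzy (an assumption based on the filename, i.e. eight lowercase letters), the same ground can be covered with a mask attack instead, so there is no 1 TB file to read or to generate stats for:

hashcat -w 3 -m 2500 -a 3 out.hccap.hccap ?l?l?l?l?l?l?l?l   # brute-force/mask attack over eight lowercase characters

The ?l?l?l?l?l?l?l?l mask covers all eight-character lowercase candidates (a superset of whatever the list actually contains), and checkpoint/restore still works with mask attacks, with no dictionary stats to regenerate.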
#8
Yeah, 1 TB is a bit big, especially when recalculating; it takes a few hours and then some.
I do love this program and the restore function.
For some reason, with the big file size and it being on a portable HD, it glitches.
I guess 1 TB is far too big, so what is the recommended maximum file size to use with this program?
Would 100 GB be better, or is that still too big?

Thanks
#9
I'll try this command then: split -l 14613385216 zyzyzyzy-abababab.txt
since 14613385216 was the last line number to be checked. Because that was around 12%, it should split into something like ten files of 140 GB or so.
Is this still too big?
Hopefully I'll have fewer problems in the future.
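
A sketch of that split, run from the drive's directory, using numeric suffixes and an output prefix (the chunk_ prefix is made up for illustration):

split -l 14613385216 -d zyzyzyzy-abababab.txt chunk_   # 14613385216 lines per output file, numeric suffixes

This writes chunk_00, chunk_01, and so on, each holding 14613385216 lines except the last, shorter one.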
#10
I was close: it's 131.5 GB. Just another 8 to do. Even this is time-consuming, and it's not as if I have a slow machine either.
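
If it helps, the chunks from the split sketch above could be fed through hashcat back to back in a single shell loop (the chunk_ names follow from that illustrative prefix):

for f in chunk_*; do
    hashcat -w 3 -m 2500 out.hccap.hccap "$f"   # each ~131 GB chunk gets its own, much shorter, dictionary-stats pass
done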