Possible bug using external salts (-e)
#1
Today I was trying to recover a big list of SMF hashes where the salts are missing. Some hashes were being recovered, but after a while hashcat was supposed to stop. To my surprise, it was not responding, and in the folder where the hash file was stored there was also a .new file that kept growing until it reached 7 GB! Also, no hashes were removed from the hash file. These are my logs.

Command
Code:
hashcat -a 0  -m 120 -e  '/root/dic/words.txt'     -o /root/Desktop/h_smf.txt '/root/Downloads/sha1-Uncracked.txt'   '/root/dic/antichat.ru.dic'  --remove

Initializing hashcat v0.44 by atom with 8 threads and 32mb segment-size...

Added external salts from file /root/dic/words.txt: 817 salts
Added hashes from file /root/Downloads/sha1-Uncracked.txt: 863601 (817 salts)

NOTE: press enter for status-screen


Input.Mode: Dict (/root/dic/antichat.ru.dic)
Index.....: 1/1 (segment), 3168698 (words), 28857060 (bytes)
Recovered.: 1/863601 hashes, 0/817 salts
Speed/sec.: 2.97M plains, 3.63k words
Progress..: 120368/3168698 (3.80%)
Running...: 00:00:00:33
Estimated.: 00:00:13:59

After a while:

Code:
Input.Mode: Dict (/root/dic/antichat.ru.dic)
Index.....: 1/1 (segment), 3168698 (words), 28857060 (bytes)
Recovered.: 7/863601 hashes, 0/817 salts
Speed/sec.: 3.07M plains, 3.75k words
Progress..: 2995652/3168698 (94.54%)
Running...: 00:00:13:18
Estimated.: 00:00:00:46

Input.Mode: Dict (/root/dic/antichat.ru.dic)
Index.....: 1/1 (segment), 3168698 (words), 28857060 (bytes)
Recovered.: 7/863601 hashes, 0/817 salts
Speed/sec.: 3.04M plains, 3.72k words
Progress..: 3168698/3168698 (100.00%)
Running...: 00:00:14:11
Estimated.: --:--:--:--

I was forced to press Ctrl+C because that file kept growing and I did not have enough space.
These are some screenshots.

http://i47.tinypic.com/jakqb5.jpg - 6GB
http://i45.tinypic.com/ae9rbq.jpg - 7 GB
http://i48.tinypic.com/2meqozm.jpg - 7.2 GB when I finally pressed Ctrl+C

screenshot from terminal

http://i48.tinypic.com/2mdf615.jpg --- at the beginning
http://i50.tinypic.com/2dsntlg.jpg ---- after I pressed Enter a few times, hashcat did not respond, and I finally pressed Ctrl+C

Maybe I'm missing something about the -e option?

Thanks
#2
What is being stored in the files that are growing in size?
#3
Hashes with salt only.

http://i46.tinypic.com/2lvg0f7.jpg --screenshot
#4
Yes, I have this problem too. As the input file is being rewritten (with the .new extension), it seems to grow without bounds under certain conditions.

The solution is simple: don't use the --remove option together with -e. Remove the cracked hashes in a second pass, with a different program.
#5
Which program? There are thousands of hashes, 876728 of them.
#6
I wrote my own. It's quite a bit faster than using hashcat, especially if you are removing a lot of hashes.

So, for example:

hashcat -a 0 -m 2611 -e salts -o mytemp.out input.txt passwords.txt

perl remove.pl mytemp.out <input.txt >input.new

where remove.pl is something like:

$infile = shift;
open (IN, "<$infile") || die "can't find input file $infile: $!";

# First pass: remember every cracked hash (32 hex chars) from hashcat's outfile.
while (<IN>) {
    chomp;
    if (/([0-9a-f]{32}):/) {
        $hf{$1} = $_;
    }
}

# Second pass: copy the hash list from stdin, skipping any hash that was cracked.
while (<>) {
    if (/([0-9a-f]{32})/) {
        if (exists $hf{$1}) {
            # optional, if you want the solutions in the output
            # print "$hf{$1}\n";
            next;
        }
    }
    print;
}
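
The pattern above matches 32-hex-character hashes (MD5-length, matching the -m 2611 example). The SHA1 hashes from -m 120 in this thread are 40 hex characters, so a variant keyed on the longer digest could look like the sketch below. This is untested, and the file names (cracked.out, hashlist.txt) are only placeholders, so adjust them and the line format to whatever your files actually contain.

Code:
#!/usr/bin/perl
# remove_sha1.pl -- variant of remove.pl keyed on 40-hex-char (SHA1) digests.
# Usage: perl remove_sha1.pl cracked.out <hashlist.txt >hashlist.new
use strict;
use warnings;

my $infile = shift or die "usage: $0 cracked.out <hashlist >hashlist.new\n";
open(my $in, '<', $infile) or die "can't open $infile: $!";

# First pass: remember every cracked hash from hashcat's outfile.
my %cracked;
while (<$in>) {
    chomp;
    $cracked{$1} = $_ if /([0-9a-f]{40})/;
}
close $in;

# Second pass: copy the hash list from stdin, dropping cracked hashes.
while (<>) {
    next if /([0-9a-f]{40})/ && exists $cracked{$1};
    print;
}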
#7
Thanks a lot, I'll try it!