PRINCE creating dups
#1
For example, I have a wordlist containing the following:
Code:
my
password
is
the
best

But I get a lot of duplicates even though the wordlist itself contains only unique words:
Code:
mypasswordisthebest
mypasswordisthebest
mypasswordisthebest
mypasswordisthebest
mypasswordisthebest
mypasswordisthebest

Code:
hashcat-cli64.bin -a 6 --stdout example.dict | wc -l
124184
hashcat-cli64.bin -a 6 --stdout example.dict | sort -u | wc -l
113987

Any way to de-dup this?
#2
It's not about PRINCE, it's about hashcat CPU. You can post a Trac ticket if you want.

Quote:root@ht:~/princeprocessor# ./pp64.bin < wordlist | grep mypasswordisthebest | wc -l
1
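
Until that's fixed, a possible workaround (just a sketch, not tested) is to generate the candidates with --stdout and drop duplicates on the fly before piping them back into hashcat. The awk filter below streams instead of sorting, but it keeps every candidate it has seen in memory, so it only makes sense for small keyspaces like this example; your.hashes is a placeholder hash file.
Code:
# untested sketch: dedupe the -a 6 output without sorting, then feed it
# back to hashcat CPU on stdin (your.hashes is a placeholder)
./hashcat-cli64.bin -a 6 --stdout example.dict | awk '!seen[$0]++' | ./hashcat-cli64.bin your.hashes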
#3
Does this mean it's better to use standalone PRINCE and pipe it into hashcat, versus using -a 6 mode?
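Something like this, for example? (Just a sketch of what I mean; hashes.txt is a placeholder, and I'm assuming hashcat reads candidates from stdin when no wordlist is given.)
Code:
./pp64.bin < example.dict | ./hashcat-cli64.bin hashes.txt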
#4
Since piping is considerably slower, it's probably still faster to use the integrated mode.
#5
Perhaps I've misunderstood the hashcat status screen, but I thought the 113986 number was the total number of words in the wordlist? It would appear the integrated PRINCE produced more chains than the standalone version.

integrated prince = 113986
standalone prince = 5638

Code:
forumhero@cpx1:~$ cat md5.test.hash
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

forumhero@cpx1:~$ cat example.dict
my
password
is
the
best

forumhero@cpx1:~$ pp64.bin < example.dict | wc -l
5638

forumhero@cpx1:~$ hashcat-0.49/hashcat-cli64.bin -a 6 md5.test.hash example.dict
Initializing hashcat v0.49 with 8 threads and 32mb segment-size...

Added hashes from file md5.test.hash: 1 (1 salts)
Activating quick-digest mode for single-hash

NOTE: press enter for status-screen


Input.Mode: Dict (example.dict)
Index.....: 0/1 (segment), 113986 (words), 0 (bytes)
Recovered.: 0/1 hashes, 0/1 salts
Speed/sec.: - plains, - words
Progress..: 0/113986 (0.00%)
Running...: --:--:--:--
Estimated.: --:--:--:--

Started: Thu Jan  8 15:32:13 2015
Stopped: Thu Jan  8 15:32:14 2015
#6
It depends on some default values. The ones in standalone PRINCE are set to lower values than in hashcat CPU.

However, it does not matter. PRINCE is meant to be used with a dictionary of 100k-10M words, and in that case the total number of candidates is so high that it will run forever anyway (regardless of your guessing speed).
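
For reference, the standalone version lets you check and override those limits yourself. The option names below are from pp64.bin --help and may differ between versions, and the values are only examples, not the actual defaults:
Code:
# example values, not the real defaults; see pp64.bin --help for the
# exact option names in your version
./pp64.bin --elem-cnt-max=8 --pw-max=16 < example.dict | wc -l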
#7
Both problems from this thread and the memory problem from a different thread should be fixed in the latest beta version.
#8
Thank you, atom.