[-m 16300] HOW: multiple hashes with the same salt
#1
Question 
Hi,
I need to work with hashes of this type (Ethereum Pre-Sale Wallet, PBKDF2-HMAC-SHA256):
$ethereum$w*100...00*ab...00*cd...00
$ethereum$w*200...00*ab...00*cd...00
The peculiarity of these hashes is that only one character in the "ecseed" field differs; everything else (ethaddr, bkp) is the same. As a result, I get a warning: "This hash mode plugin cannot crack multiple hashes with the same salt, please select one of the hashes."
Hm. So how can I process many (thousands of) hashes with the same salt in mode 16300?
Thanks for your attention.
Reply
#2
You may have fake wallets, but either way you can either downgrade to an earlier Hashcat release or separate each individual hash into its own file. That error can sometimes be ignored. You may also want to try just one hash first and see if that works; if they all share the same password, cracking them all at once may not even be necessary. I'm not too sure, though.
Reply
#3
Thanks for your reply.
1. Is there any release of Hashcat where this error will not appear?
2. If I divide all the hashes into a thousand files (one in each), how can I automate the processing? Should I run one Hashcat instance or many in parallel?
Reply
#4
(05-16-2024, 10:33 AM)tao Wrote: 1. Is there any release of Hashcat where this error will not appear?

Try 6.2.5. There's no guarantee it'll crack all the hashes, though. See https://github.com/hashcat/hashcat/issues/3641
Reply
#5
(05-16-2024, 11:18 AM)buka Wrote:
(05-16-2024, 10:33 AM)tao Wrote: 1. Is there any release of Hashcat where this error will not appear?

Try 6.2.5. There's no guarantee it'll crack all the hashes, though. See https://github.com/hashcat/hashcat/issues/3641

Thanks for your reply.
The risk of skipped hashes looks critical, but I'll try it.
Still, is it possible to automate the processing of multiple hash files (in parallel or sequentially) using Hashcat's own options?
Reply
#6
(05-16-2024, 11:18 AM)buka Wrote:
(05-16-2024, 10:33 AM)tao Wrote: 1. Is there any release of Hashcat where this error will not appear?

Try 6.2.5. There's no guarantee it'll crack all the hashes, though. See https://github.com/hashcat/hashcat/issues/3641

No, it doesn't work in version 6.2.5 either:
Hashes: 150000 digests; 1 unique digests, 1 unique salts
Reply
#7
(05-16-2024, 05:23 PM)tao Wrote:
(05-16-2024, 11:18 AM)buka Wrote:
(05-16-2024, 10:33 AM)tao Wrote: 1. Is there any release of Hashcat where this error will not appear?

Try 6.2.5. There's no guarantee it'll crack all the hashes, though. See https://github.com/hashcat/hashcat/issues/3641

Thanks for your reply.
The risk of skipped hashes looks critical, but I'll try it.
Still, is it possible to automate the processing of multiple hash files (in parallel or sequentially) using Hashcat's own options?

A batch file could accomplish this. Try this site:

https://jpm22.github.io/txt/Line-Combina...rator.html

Add a box, put all your hashes in one box and the relevant hashcat commands in the other, and generate. Then add
#!/bin/bash
as the top line, run
chmod +x batch.sh

and then
./batch.sh

Hope that gives you a rough idea.
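For reference, the split-into-files approach can also be sketched directly as a small shell script, without the generator site. The filenames hashes.txt and wordlist.txt below are placeholders, and the hashcat command is only echoed as a dry run; drop the echo to actually launch it:

```shell
#!/bin/bash
# Demo stand-in for the real hash list; replace with your $ethereum$ hashes.
printf 'hash1\nhash2\nhash3\n' > hashes.txt

# One hash per file, so each file contains only one salt.
mkdir -p split
split -l 1 -d hashes.txt split/hash_

# Run hashcat on each file in turn (the echo makes this a dry run).
for f in split/hash_*; do
    echo hashcat -m 16300 -a 0 "$f" wordlist.txt
done
```

As far as I know, hashcat takes a single hash-file argument per invocation, so the loop has to live outside hashcat; running the instances sequentially in one loop also avoids several processes fighting over the same GPU.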
Reply
#8
(05-30-2024, 10:15 AM)CmdFlaz Wrote: Hope that gives you a rough idea.

Thanks for your reply.
I tried this here: https://hashcat.net/forum/thread-12004-p...l#pid60908
But it's too slow for me: too much prep work (~20 seconds) for a very short main run (~5 seconds).
Reply