Hashes from cleartext passwords
#1
Is there a utility to create hashes from cleartext passwords? (Hashcat in reverse.)

I have a large list of passwords recovered from one hash format (SHA1) that I would like to hash in other formats such as NTLM, SHA256, etc., where salts are not used (obviously). I could script this in bash or Python for some hashes, but that is inefficient and has limited hash-type support compared to hashcat.

Is there a capability/tool to perform this procedure?

Thanks,

Example: given the password "P@ssw0rd", the SHA1 is 21bd12dc183f740ee76f27b78eb39c8ad972a757

# convert to sha256 
echo -n "P@ssw0rd" | sha256sum
b03ddf3ca2e714a6548e7495e2a03f5e824eaac9837cd7f159c67b90fb4b7342

# convert to md5
echo -n "P@ssw0rd" | md5sum
161ebd7d45089b3446ee4e0d86dbcf92
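The same conversions can be scripted with Python's hashlib (a minimal sketch; the algorithm names are standard hashlib identifiers):

```python
import hashlib

# Hash one cleartext password with several unsalted algorithms.
password = "P@ssw0rd"
data = password.encode("utf-8")

for algo in ("sha1", "sha256", "md5"):
    # hashlib.new() accepts any name in hashlib.algorithms_available
    print(f"{algo}: {hashlib.new(algo, data).hexdigest()}")
# sha1: 21bd12dc183f740ee76f27b78eb39c8ad972a757
# sha256: b03ddf3ca2e714a6548e7495e2a03f5e824eaac9837cd7f159c67b90fb4b7342
# md5: 161ebd7d45089b3446ee4e0d86dbcf92
```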
#2
"I could script this in bash or Python for some hashes, but that is inefficient and has limited hash-type support compared to hashcat."

You could do every hash type that hashcat supports in Python. The question is just: why?
#3
(03-12-2024, 09:58 AM)DanielG Wrote: You could do every hash type that hashcat supports in Python. The question is just: why?

Yes, I 'could' recreate all the hash logic, but existing libraries have limited algorithm support.

As to why:
  -- rapid audit of client environments during red teaming, meeting requirements not to reveal cleartext credentials (yes, you could do an A=B and B=C, therefore A=C exercise, but this meets most requirements)
  -- simplified data integration into products with specific hashing requirements
  -- etc.
#4
Okay, fair enough. I tried looking around, but I could not find a pre-made tool that creates lists of different hashes from a wordlist. I'm afraid you'll need to define the hash types you want and get a custom tool/script made, or make it yourself.
#5
Do you want to build some kind of rainbow table?

A script solution à la Python/bash may seem slow, but you only need to run it once per input file, so it seems a legitimate way to get your desired lists; you can store the results in different ways.

Or do you want an in-memory solution, generating all hashes on the fly?

I did a quick test with Python and SHA256:

It generated 1,000,000 hashes in 1.84 seconds, so ~540,000 per second; I think this is fast enough for any input file.

The only limiting factors are RAM (if you want to work in memory) and storage I/O; the main problem will be writing to disk or storing that amount of data (there is a reason rainbow tables are considered obsolete).

I think Python's hashlib will cover the most-used hash formats for your case, so I would stick with a Python script.
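A minimal sketch of that hashlib approach (the function and variable names are my own): hash each cleartext once per algorithm and emit one colon-separated line per input word.

```python
import hashlib

# Algorithms to emit, in output order; any name from
# hashlib.algorithms_available would work here.
ALGOS = ("md5", "sha1", "sha256")

def hash_line(word: str) -> str:
    """Return 'word:md5:sha1:sha256' for one cleartext password."""
    data = word.encode("utf-8")
    return ":".join([word] + [hashlib.new(a, data).hexdigest() for a in ALGOS])

# Usage: iterate a wordlist (an inline tuple here; a real script would
# read a file or STDIN line by line).
for word in ("P@ssw0rd",):
    print(hash_line(word))
```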
#6
Thanks all. This will not be updated all that frequently (monthly, I'd guess), so I think Python it is. As mentioned, most common cases can be handled by hashlib.

No need for in-memory or rainbow tables (which are mostly useless given modern GPUs). Not trying to recreate a broken wheel :)
#7
If anyone is interested, I created the script and posted a copy to Pastebin:

https://pastebin.com/uQcK3V1h
#8
My response is a bit late, but for anyone else needing to hash a wordlist, I published a tool on GitHub several years ago called hashgen. It's multi-threaded (30+ million MD5/sec), supports $HEX[], and currently supports over a dozen output modes such as md5, ntlm, sha*, base64 encode/decode, crc32/64, etc.

https://github.com/cyclone-github/hashgen
#9
(05-23-2024, 05:51 PM)cyclone Wrote: My response is a bit late, but for anyone else needing to hash a wordlist, I published a tool on GitHub several years ago called hashgen. It's multi-threaded (30+ million MD5/sec), supports $HEX[], and currently supports over a dozen output modes such as md5, ntlm, sha*, base64 encode/decode, crc32/64, etc.

https://github.com/cyclone-github/hashgen


Haven't had a chance to read through your code yet, but I did something similar in Python (multi-threaded, supports $HEX[], STDIN or file input, STDOUT or file output, etc.).

https://github.com/wallacebw/hashutil 
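For anyone unfamiliar with it, $HEX[] is hashcat's convention for wordlist lines that aren't cleanly printable: the raw password bytes are hex-encoded inside the brackets. A minimal decoder sketch (the function name is my own):

```python
import binascii
import re

# Matches hashcat-style $HEX[...] wordlist entries.
_HEX_RE = re.compile(r"^\$HEX\[([0-9a-fA-F]+)\]$")

def decode_candidate(line: str) -> bytes:
    """Return the raw bytes to hash for one wordlist line."""
    match = _HEX_RE.match(line)
    if match:
        return binascii.unhexlify(match.group(1))
    return line.encode("utf-8")

print(decode_candidate("$HEX[5040737377307264]"))  # b'P@ssw0rd'
```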

Here's a snippet of the verbose output (it generates ~235M SHA1 hashes in 11 seconds on a Threadripper 3970X):

Code:
VERBOSE (process_multi): Total process pool cpu time: 0:09:54.737065
VERBOSE (process_multi):    Average process cpu time: 0:00:09.292767
VERBOSE (process_multi):                Elapsed time: 0:00:11.007720
VERBOSE (process_multi):    Process pool speed gain: 54.03x
VERBOSE (process_multi):          Process efficiency: 84.42%
VERBOSE (process_multi):          Temp File I/O time: 0:00:05.217029
VERBOSE (process_multi):      Process Pool loop time: 0:00:27.350114

Results:
      Input lines:
    skipped lines: 0
    Output lines: 236805847
======================================================
VERBOSE (main): Python init time: 0:00:00.031992
VERBOSE (main): Total execution time: 0:00:27.353321
VERBOSE (main): Total processor time: 0:00:10.872765

Here are the parameters:

Code:
usage: hash_generator.py [-h] [-a HASH_ALGORITHMS] [-i [FILE]] [-p PARALLEL] [-t TEMP_DIRECTORY] [-s SEPARATOR] [-n] [-o OUTPUT_FILE] [-e ERROR_FILE] [-u] [-v | -q]

Translate a file of cleartext strings (passwords) to hashes of the specified format(s)
to a text based separated file (TSV) with fields separated by -s / --separator [default ':']

options:
  -h, --help            show this help message and exit
  -a HASH_ALGORITHMS, --hash-algorithms HASH_ALGORITHMS
                        Comma-separated hash list to use (default: sha1); options are: sha1, sha224, sha256, sha384, sha512, sha3_224, sha3_256, sha3_384, sha3_512, blake2b, blake2s, md5
  -i [FILE], --input-file [FILE]
                        The input file(s) of strings to parse, if omitted STDIN is used (comma separated)
  -p PARALLEL, --parallel PARALLEL
                        Number of processes to use or 'a' (default) for automatic detection
  -t TEMP_DIRECTORY, --temp-directory TEMP_DIRECTORY
                        Directory to use for temp files when --parallel is used (default: PWD)

Output Formatting:
  -s SEPARATOR, --separator SEPARATOR
                        The column separator to use in the output (default ':')
  -n, --no-header      Do not print the header line
  -o OUTPUT_FILE, --output-file OUTPUT_FILE
                        Output file, if omitted STDOUT is used
  -e ERROR_FILE, --error-file ERROR_FILE
                        Optional file to write lines that cannot be parsed
  -u, --hash-upper      Output the hash value in UPPERCASE (default is lowercase)

Output Verbosity:
  -v, --verbose        Verbose reporting of warnings (skipped lines) to STDERR (see -e) *** specify twice [-vv] for debugging (multiple messages per file line) ***
  -q, --quiet          Suppress all console output (STDOUT/STDERR)