
Distributing workload in oclHashcat


This article explains how to split large brute-force projects using oclHashcat.

Why do it this way?

Rather than just leaving a project running for days on end, you can split it into chunks, making it more manageable. Another benefit is that you can hand parts of the project to other users or computers to run, decreasing the overall run-time.


In this example, I'll set up a project to run mixalpha-numeric, length 8:

Charset: 62 characters (a-z, A-Z, 0-9)

Length: 8

Let's say I get 2 billion guesses/sec on my GPU. To calculate the total crack-time, divide the total number of combinations by the GPU speed: 62^8 / 2,000,000,000 ≈ 109,170 seconds, or roughly 30.3 hours.
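As a sanity check, the arithmetic can be done in a few lines (the 2 billion/sec figure is just the example speed assumed above; substitute your own benchmark):

```python
# Keyspace for mixalpha-numeric (62 characters), length 8
charset_size = 62
length = 8
speed = 2_000_000_000  # guesses per second (example figure)

total = charset_size ** length   # 62^8 combinations
seconds = total / speed

print(total)                     # 218340105584896
print(round(seconds / 3600, 1))  # ~30.3 hours
```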

Now, I want to break it up into projects of roughly 4 hours each. To work out how many characters of the charset go into each project, divide the total run-time (about 30.3 hours, from 62^8 combinations at 2 billion/sec) by the target chunk length: 30.3 / 4 ≈ 7.6 projects. Then divide the 62-character charset by that figure: 62 / 7.6 ≈ 8.2 characters per project.

Round that down to a whole number (8 characters per project) and use custom charsets to define what each project searches:
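The split arithmetic can be restated compactly; this sketch just reruns the worked example above (2 billion/sec, 4-hour chunks):

```python
import math

charset_size = 62
total_hours = 62 ** 8 / 2_000_000_000 / 3600      # ~30.3 h at 2 billion/sec
target_hours = 4

projects_needed = total_hours / target_hours       # ~7.6 projects
chars_per_project = int(charset_size // projects_needed)   # floor -> 8
num_projects = math.ceil(charset_size / chars_per_project) # 8 (7 full + 1 partial)

print(chars_per_project, num_projects)             # 8 8
```

The last project ends up with only 6 characters (62 = 7 x 8 + 6), which is why the final command line below has a shorter -2 charset.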

./oclHashcat.bin -1 ?d?l?u -2 01234567 -o [output-file] [hashes-file] ?1?1?1?1?1?1?1?2
./oclHashcat.bin -1 ?d?l?u -2 89abcdef -o [output-file] [hashes-file] ?1?1?1?1?1?1?1?2
./oclHashcat.bin -1 ?d?l?u -2 ghijklmn -o [output-file] [hashes-file] ?1?1?1?1?1?1?1?2
./oclHashcat.bin -1 ?d?l?u -2 opqrstuv -o [output-file] [hashes-file] ?1?1?1?1?1?1?1?2
./oclHashcat.bin -1 ?d?l?u -2 wxyzABCD -o [output-file] [hashes-file] ?1?1?1?1?1?1?1?2
./oclHashcat.bin -1 ?d?l?u -2 EFGHIJKL -o [output-file] [hashes-file] ?1?1?1?1?1?1?1?2
./oclHashcat.bin -1 ?d?l?u -2 MNOPQRST -o [output-file] [hashes-file] ?1?1?1?1?1?1?1?2
./oclHashcat.bin -1 ?d?l?u -2 UVWXYZ -o [output-file] [hashes-file] ?1?1?1?1?1?1?1?2

Notice how the first custom charset (?1, used for the first seven positions of the mask) covers the whole 62-character set, while the second (?2, used for the final position) is restricted to one 8-character slice per project (6 characters for the last one).
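Rather than typing the eight command lines by hand, they can be generated; this sketch keeps the [output-file] and [hashes-file] placeholders as above and slices the 62-character set in the same order (digits, lowercase, uppercase), using a mask of seven ?1 positions plus one ?2 for length 8:

```python
# Slice the 62-char mixalpha-numeric set into 8-character chunks,
# one chunk per project (the last chunk holds the remaining 6 chars).
import string

charset = string.digits + string.ascii_lowercase + string.ascii_uppercase
chunks = [charset[i:i + 8] for i in range(0, len(charset), 8)]

for chunk in chunks:
    # [output-file] and [hashes-file] are placeholders, as above
    print(f"./oclHashcat.bin -1 ?d?l?u -2 {chunk} "
          f"-o [output-file] [hashes-file] ?1?1?1?1?1?1?1?2")
```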