Posts: 5
	Threads: 3
	Joined: May 2010
		How about a way to divide the workload across multiple computers, so that a job that would take 10 days on one machine could be done in a third of the time?  Something for the brute-force implementation.  For example, suppose a keyspace has 100 steps: you could have an option to divide it into multiple stages (3, 4, 5, etc.).  Then computer 1 runs stage 1 while, at the same time, computer 2 runs stage 2, and so on.
	Posts: 5,232
	Threads: 233
	Joined: Apr 2010
		 (05-17-2010, 04:44 AM)richardsguy Wrote:  How about a way to divide the workload across multiple computers, so that a job that would take 10 days on one machine could be done in a third of the time?  Something for the brute-force implementation.  For example, suppose a keyspace has 100 steps: you could have an option to divide it into multiple stages (3, 4, 5, etc.).  Then computer 1 runs stage 1 while, at the same time, computer 2 runs stage 2, and so on.
This is already supported by hashcat and oclHashcat by using the -s and -l parameters in combination. Example:
The wordlist contains 10000 words and you have 4 PCs (all of the same speed):
-s 0 -l 2500
-s 2500 -l 2500
-s 5000 -l 2500
-s 7500 -l 2500
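To make the pattern above concrete, here is a small sketch that computes the -s (skip) and -l (limit) offsets for an arbitrary wordlist size and machine count, and prints the command you would run on each machine. The file names hashes.txt and wordlist.txt are placeholders, and the script only prints the commands rather than invoking hashcat:

```shell
#!/bin/sh
# Split a wordlist evenly across PCS machines using -s/-l offsets.
TOTAL=10000   # number of words in the wordlist
PCS=4         # number of machines (assumed equal speed)

CHUNK=$((TOTAL / PCS))
i=0
while [ "$i" -lt "$PCS" ]; do
  SKIP=$((i * CHUNK))
  # Command to run on machine $((i + 1)):
  echo "hashcat -s $SKIP -l $CHUNK hashes.txt wordlist.txt"
  i=$((i + 1))
done
```

Note that this even split only keeps all machines busy for roughly the same time if they run at the same speed; for machines of different speeds you would size each chunk in proportion to that machine's throughput.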
--
atom