hashcat Forum
Reading very large dicts - Printable Version

Thread: Reading very large dicts (/thread-1084.html)



Reading very large dicts - blandyuk - 04-16-2012

Hi atom,

When reading very large dicts it obviously takes a while before cracking even starts. Could you make oclHashcat-plus store a dict's properties, so that once it has read a dict it remembers the line count for next time unless the file has been modified? When running a batch job that would save quite a bit of time when it has to read 36GB dicts :)


RE: Reading very large dicts - atom - 04-16-2012

No, because if you change the dictionary the stored numbers become invalid.

You can work around the line counter by feeding the dict via <; cracking then starts instantly.

With ETA:

./oclHashcat-plus... hash dict

Without ETA:

./oclHashcat-plus... hash < dict


RE: Reading very large dicts - blandyuk - 04-16-2012

Yeah, OK, thanks man. I'll figure out how long they take, then use < on the dict so oclHashcat doesn't check it :)
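
One way to keep a rough idea of the keyspace while still skipping hashcat's own counting pass is to count the lines yourself once and reuse that number. A minimal sketch, assuming a Unix shell; big.dict and the use of wc are illustrative, not from the thread:

wc -l < big.dict                      # count the lines once and note the number
./oclHashcat-plus... hash < big.dict  # later runs start instantly, with no built-in ETA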


RE: Reading very large dicts - blandyuk - 04-16-2012

Tried < with the Windows exe and it doesn't work. Any ideas?


RE: Reading very large dicts - undeath - 04-16-2012

Are you sure? It works fine for me.
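
For reference, input redirection with < does exist in the Windows command prompt (cmd.exe), and piping via type is an alternative. A minimal sketch with hypothetical file and exe names (assuming the 64-bit Windows build):

oclHashcat-plus64.exe ... hash.txt < big.dict
type big.dict | oclHashcat-plus64.exe ... hash.txt

Note that PowerShell does not support < for input redirection, so from PowerShell the type/pipe form (or plain cmd.exe) is the one to use.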


RE: Reading very large dicts - proinside - 04-16-2012

Using < dic gives me this message: "Access is denied." (dic is the name of the dict folder, of course).
Without the < parameter the behavior is normal, but it reads every single dict before it starts.
What's wrong, perhaps the version we are running?


RE: Reading very large dicts - undeath - 04-16-2012

doesn't work for directories of course.
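
A workaround for directories is to concatenate the wordlists onto stdin yourself. A minimal sketch, assuming a Unix shell and an illustrative directory name (not from the thread):

cat dicts/*.txt | ./oclHashcat-plus... hash

This keeps the instant start, at the cost of the ETA, just like redirecting a single dict.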


RE: Reading very large dicts - M@LIK - 04-17-2012

Great..
I was facing similar problems xD


RE: Reading very large dicts - proinside - 04-17-2012

(04-16-2012, 10:54 PM)undeath Wrote: doesn't work for directories of course.

atom, can this option be implemented? At least it would give a faster start for those who don't care too much about details like the estimated time to finish, for example.


RE: Reading very large dicts - atom - 04-17-2012

If there is enough demand, yes.