fgets_sse2 versions of hashcat-utils - Printable Version
hashcat Forum - hashcat-utils, maskprocessor, statsprocessor, md5stress, wikistrip (https://hashcat.net/forum/thread-2629.html)
fgets_sse2 versions of hashcat-utils - Mangix - 09-19-2013

TL;DR: Use the regular ones unless you really need speed.

Summary: Most of the hashcat-utils use fgetl (a wrapper around fgets), so for speed I ported them to use atom's fgets_sse2 (the general idea is sketched at the end of this page).

Benefits: Speed, I guess. Some benchmarks are available via the link at the bottom of this post. Not all tools become faster, since the bottleneck is not always fgets.

Drawbacks: They don't handle \r correctly at all. For example, "mp64 -i ?d?d?d | len 1 2" will show only 0-9 with the Windows binary of mp64, but it probably works fine under Linux (the likely cause is sketched at the end of this page). Some tools are also totally broken; rli and rli2 are among them, and unfortunately I do not understand C well enough to fix them. Some data is also handled very oddly; enwik8 is one such example:

Code:
$ cat enwik8 | md5sum

It seems to work just fine with wordlists, though, so no worries there:

Code:
$ cat rockyou.txt | md5sum

combinator is also missing; it does not use fgets, last I remember.

Extras: I wrote a line-count tool equivalent to "wc -l"; it was faster than cygwin's "wc -l" on large files, last I checked. I also wrote a catf tool that should work just as well as cat, and I increased LEN_MAX in splitlen to 55. Not sure if that will really help anyone.

Feel free to edit the source. My command of C is not that great...

https://github.com/neheb/hashcat-utils


RE: fgets_sse2 versions of hashcat-utils - Rolf - 09-19-2013

Looks nice, but the fact that some tools don't work is worrisome.
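
As background on what the port changes: fgets-based readers like fgetl locate the end of each line essentially one byte at a time, while an SSE2 reader can compare sixteen bytes per iteration and use a bitmask to find the newline. The fragment below is only a rough sketch of that scanning idea; it is not atom's actual fgets_sse2, and the function name and buffer handling are made up for illustration.

Code:
/*
 * Sketch of SSE2-accelerated newline scanning. NOT the real fgets_sse2,
 * just an illustration of the technique it is based on. Requires a
 * compiler with SSE2 intrinsics and __builtin_ctz (gcc/clang).
 */
#include <emmintrin.h>  /* SSE2 intrinsics */
#include <stddef.h>

/* Return the index of the first '\n' in buf[0..len), or len if none. */
static size_t find_newline_sse2 (const char *buf, size_t len)
{
  const __m128i nl = _mm_set1_epi8 ('\n');
  size_t i = 0;

  /* compare 16 bytes per iteration */
  for (; i + 16 <= len; i += 16)
  {
    __m128i chunk = _mm_loadu_si128 ((const __m128i *) (buf + i));
    int mask = _mm_movemask_epi8 (_mm_cmpeq_epi8 (chunk, nl));

    if (mask) return i + (size_t) __builtin_ctz (mask);
  }

  /* scalar tail for the last few bytes */
  for (; i < len; i++)
    if (buf[i] == '\n') return i;

  return len;
}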
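
On the \r drawback: Windows builds of mp64 terminate lines with "\r\n", so a reader that only strips the '\n' leaves a stray '\r' on every line and each measured length comes out one too long. The snippet below is a minimal sketch of the usual fix, not code from the port; the helper name is illustrative.

Code:
/*
 * Illustrative helper (not from the actual port): trim one trailing '\r'
 * left over from a Windows "\r\n" line ending before the length is used.
 */
#include <stddef.h>

static size_t strip_cr (const char *line, size_t line_len)
{
  if (line_len > 0 && line[line_len - 1] == '\r') line_len--;

  return line_len;
}

/*
 * Without this, a Windows-generated line "0\r\n" is measured as length 2,
 * so "len 1 2" keeps "0" but rejects "00\r" (measured length 3), which is
 * why the mp64 pipe in the post prints only 0-9.
 */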