hashcat Forum

Full Version: Large zip/gz wordlists give an error
If multiple wordlists are compressed with the same software (7z v19.00 and WinRAR v5.50), they all work fine with hashcat up to a certain size. I don't know exactly where the threshold is, but I know this: 1GB still works, while my main 34GB list gives the error "No such file or directory" in red, the same error you would get if you used anything other than the Deflate compression method. I used Deflate on all my wordlists and they work fine apart from that big one. Same with gz.

I have 16GB RAM. Could that be the reason?

EDIT: just to clarify, the 34GB figure is the plain-text size. Zipped it's 9GB, and that is what gives the error. The plain text works fine.
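
For anyone hitting the same error: hashcat only accepts the Deflate method inside .zip archives, and it can also read .gz files directly. A minimal sketch of producing a compatible archive (file names here are hypothetical):

    # Repack with 7-Zip, forcing the Deflate method for the .zip format
    7z a -tzip -mm=Deflate wordlist.zip wordlist.txt

    # Or gzip it instead; hashcat decompresses .gz wordlists on the fly
    gzip -k wordlist.txt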
Same here. 1GB is OK but 64GB doesn't work. If not compressed, it works fine. I have 32GB RAM.
Must be a bug, as it makes no sense to support this only for small files.


(01-20-2021, 01:53 AM)skalderis Wrote: If multiple wordlists are compressed with the same software (7z v19.00 and WinRAR v5.50), they all work fine with hashcat up to a certain size. [...]
Good news: .gz worked for me on a 65GB file. It loads to 90-98% and then you have to wait a few minutes for hashcat to start. This is on v6.2.5. Will test on 150GB.
Just to report that .gz compression on big wordlists works very nicely with hashcat. A 2.5TB wordlist compressed to 250GB took around 3 hours to start in hashcat from an NVMe drive. I used -a 0 to read the file, then -a 1 to cross it with other wordlists. You only need to build the dictionary cache once.
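
As a concrete sketch of that workflow (the hash mode -m 0 / MD5 and all file names are assumptions, and it is assumed the combinator attack accepts the compressed list too):

    # Compress once; hashcat decompresses the .gz on the fly
    gzip -k big_wordlist.txt                  # produces big_wordlist.txt.gz

    # Straight (dictionary) attack
    hashcat -m 0 -a 0 hashes.txt big_wordlist.txt.gz

    # Combinator attack, crossing it with a second wordlist
    hashcat -m 0 -a 1 hashes.txt big_wordlist.txt.gz other_wordlist.txt

    # The dictionary cache (hashcat.dictstat2) is built on the first run
    # and reused afterwards, so the long startup only happens once.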
How did you do the "big list compressed" part? Where was that big file located before you compressed it?