Dictionary cache building
#1
Dictionary cache creation is extremely slow and appears to hang. What could be the problem?
AMD Ryzen 5950x, 3080 Ti
[Video: https://youtu.be/19oGJGPLTGE]
#2
How big is your dict?

According to your video, 0.74% is 1.27 GB, so in total your password list should be around 171 GB?
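(Rough arithmetic behind that estimate; both figures are read off the video, so treat it as approximate: 1.27 GB / 0.0074 ≈ 171.6 GB.)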

That is a huge dictionary, and it is almost certainly the problem.
#3
(11-11-2021, 02:15 PM)Snoopy Wrote: How big is your dict?

According to your video, 0.74% is 1.27 GB, so in total your password list should be around 171 GB?

That is a huge dictionary, and it is almost certainly the problem.

On another, weaker PC I loaded this dictionary without problems, but on this one, dictionaries larger than 2 gigabytes do not load!
#4
Well, then try running hashcat with your dict stored somewhere other than C:. Windows is sometimes a little strange when running such programs on the main partition; if possible, try another partition.
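For example, something like the following; the D:\wordlists path, the hashes.hc22000 file name, and the -m 22000 mode are only placeholders, so substitute your own hash file, hash mode, and dictionary location:

Code:
hashcat.exe -m 22000 -a 0 hashes.hc22000 D:\wordlists\dict.txt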
#5
The problem is with the solid-state drive that the dictionary is on. I can't even copy it to another hard drive! I don't know yet how to solve the problem without formatting!
#6
If it is a 10 GB dictionary, having to reload it every time wastes a lot of time, and waiting for the dictionary load to complete is annoying. This way of working is really bad.

Especially when running in a loop, reloading the dictionary on every pass wastes a huge amount of time.

If the dictionary only had to be cached once, the wait would be acceptable, but having to cache it on every run is probably unbearable for many people.

Code:
rem batch loop: every iteration restarts hashcat, so the 10 GB dictionary is re-cached on each run
:Loop
hashcat -m 22000 -a 0 HC.hc22000 10GB.dic
goto Loop
#7
The problem is with the solid-state drive; from the other drive everything loads fine!
#8
You have to reload the dictionary every time you use it. That is an unbearable wait for everyone.
I just can't accept a workflow where the dictionary has to be reloaded on every run.
In fact, once the dictionary has been cached, if the path and the file name have not changed, there should be no need to cache it again. That is a direction in which hashcat could be optimized and improved.
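(As far as I know, hashcat already keeps per-dictionary statistics in a hashcat.dictstat2 file, typically in the hashcat folder. A quick way to check whether that cache file is being written and reused between runs, assuming hashcat is launched from its own directory:)

Code:
rem hashcat.dictstat2 holds cached dictionary statistics; if it exists and its
rem timestamp stops changing on repeated runs with the same unchanged wordlist,
rem the cached stats are being reused rather than rebuilt
dir hashcat.dictstat2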
#9
I suggest never using wordlists larger than 20 GB. It's useless.
#10
Hi guys!
Same problem here. I'm trying to build a rig of 3x 3080 Ti to crack a password I forgot; I'll probably be stuck with this work for years. Now I'm trying to figure out the optimal settings. At first, when I supplied a password list under 1 GB, it complained about too little work and I got an integer overflow; now, testing with a pre-generated 170 GB list, it won't start.
- I'm thinking about buying an 8 TB NVMe SSD. This should speed up the start-up time somewhat and help with manipulating the lists.
What more can be done?
Change the process priority in Windows to real-time?
Are there any flags that can speed up the process in hashcat? I see only one CPU core involved; it barely uses any memory or CPU.
What are the bitmap max and segment size settings, and how should they be set for faster performance? (See the sketch below.)
Any input on optimization will be very much appreciated.
I've been searching this topic for days but can't find anything.
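For reference, the settings mentioned above correspond to hashcat's --segment-size and --bitmap-max options (and -w for the workload profile). A minimal sketch of where they go on the command line; the mode, file names, and values here are arbitrary examples, not tuned recommendations:

Code:
rem -w 3 raises the workload profile, --segment-size is the wordlist cache size in MB,
rem --bitmap-max caps the bitmap size in bits (example values only)
hashcat.exe -m 22000 -a 0 -w 3 --segment-size 512 --bitmap-max 24 hashes.hc22000 D:\wordlists\170GB.dic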