Is my plan a good idea?
#11
(12-29-2014, 01:42 AM)undeath Wrote:
(12-29-2014, 12:03 AM)Saint Wrote: It would be saved in the list as this

098f6bcd4621d373cade4e832627b4f6:test

not like rainbow tables are already bad, you want to make it even worse.

That wasn't an example of a rainbow table structure, it was just a suggestion.
Reply
#12
Rainbow tables don't actually store the hash value; they store hash chains built with a reduction function. So you still have to do quite a bit of computation and false-alarm checking to look up a hash in a rainbow table; it is by no means instantaneous. And it does not scale well: every target hash has to be searched separately, so the more hashes you have, the slower it gets. Rainbow tables are slow as shit against e.g. 20k hashes.
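To make the chain/reduction idea concrete, here is a toy sketch in Python. Everything here (charset, 4-char passwords, chain length 100, the three chain start words) is made up for illustration; real tables use far longer chains and millions of them.

```python
import hashlib

CHARSET = "abcdefghijklmnopqrstuvwxyz"
PW_LEN = 4
CHAIN_LEN = 100

def md5(pw: str) -> str:
    return hashlib.md5(pw.encode()).hexdigest()

def reduce_hash(digest: str, position: int) -> str:
    # Map a hash back into the password space; the chain position is mixed
    # in so every step of the chain uses a different reduction function.
    n = int(digest, 16) + position
    pw = []
    for _ in range(PW_LEN):
        pw.append(CHARSET[n % len(CHARSET)])
        n //= len(CHARSET)
    return "".join(pw)

def build_chain(start: str) -> str:
    pw = start
    for i in range(CHAIN_LEN):
        pw = reduce_hash(md5(pw), i)
    return pw  # only the start and end of each chain get stored

# The table maps chain endpoint -> chain start.
table = {build_chain(s): s for s in ("test", "cats", "zzzz")}

def lookup(target_digest: str):
    # Try every position the target could occupy in a chain and walk from
    # there to the chain end -- quadratic work per target hash.
    for pos in range(CHAIN_LEN - 1, -1, -1):
        pw = reduce_hash(target_digest, pos)
        for i in range(pos + 1, CHAIN_LEN):
            pw = reduce_hash(md5(pw), i)
        start = table.get(pw)
        if start is None:
            continue
        # Possible hit: regenerate the chain to confirm, because
        # reduction collisions cause false alarms.
        cand = start
        for i in range(CHAIN_LEN):
            if md5(cand) == target_digest:
                return cand
            cand = reduce_hash(md5(cand), i)
    return None
```

Note that `lookup` has to do all of that chain-walking again for every single target hash, which is why lookup time grows with the size of the hash list.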

For much faster lookups you could store the hashes (ideally as binary blobs) in an indexed database. The tradeoff is significantly more disk space, since you're not using any reduction function, and it still won't be incredibly fast, since your index will be far too large to hold in memory. If you don't care about disk space, and have a few months you can spend running your laptop 24/7 to generate and index the table, then this is probably what you'd want to do.
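A minimal sketch of that indexed-database approach using Python's sqlite3, with an in-memory DB and a three-word list purely for illustration; the real thing would be an on-disk table many gigabytes in size.

```python
import hashlib
import sqlite3

# In-memory DB for the sketch; a real table would live on disk.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE lookup (digest BLOB PRIMARY KEY, plain TEXT)")

words = ["test", "password", "letmein"]
db.executemany(
    "INSERT INTO lookup VALUES (?, ?)",
    ((hashlib.md5(w.encode()).digest(), w) for w in words),
)
db.commit()

def crack(digest: bytes):
    # The PRIMARY KEY gives a B-tree index, so this is one index seek
    # per target hash instead of chain-walking or a table scan.
    row = db.execute(
        "SELECT plain FROM lookup WHERE digest = ?", (digest,)
    ).fetchone()
    return row[0] if row else None
```

Storing the raw 16-byte digest rather than the 32-char hex string halves the key size, which matters once the index no longer fits in RAM.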

But again, I can't stress enough that this is a very silly solution to your perceived problem. It would be much better to run straight dictionary attacks against fast hashes on your CPU and leave your GPU for more advanced, high-yield attacks. And it doesn't really matter that you only have a laptop with a weak CPU and GPU: several members of Team Hashcat got their start with nothing but a netbook with an embedded CPU and no GPU, and have been more successful than people with multiple high-end GPUs because they are smart about their attacks.

And remember: quality over quantity. Most wordlists you download from the Internet are going to be pure garbage, and the larger the list, the more garbage it will contain.
Reply
#13
(12-29-2014, 09:29 AM)epixoip Wrote: Rainbow tables don't actually store the hash value, they store hash chains using a reduction function. [...]

Thanks for enlightening me on how a rainbow table works, I wasn't quite sure.

I have made a couple of wordlists myself. One is small, compiled from a few small leaks including MySpace, Twitter, and the 10k most common passwords. The other is bigger:

http://forum.insidepro.com/viewtopic.php?t=34386

Feel free to try it out.

What I find weird is that my CPU doesn't seem to be running at its full potential. On a straight wordlist run it reports ~20 MH/s, but it only seems to achieve ~10 MH/s. My wordlist has 302M words:

302,000,000 words / 20,000,000 H/s ≈ 15 seconds

However, it takes 27 seconds to complete. When running through a list of 45k hashes it takes 42 seconds, which works out to ~7 MH/s, even though it displays an average of 10 MH/s.
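Working the numbers, treating the displayed figure as the pure hashing rate and everything beyond it as fixed overhead (that split is an assumption, not something hashcat reports):

```python
WORDS = 302_000_000
DISPLAYED_RATE = 20_000_000   # H/s shown on the status screen
RUNTIME_FEW = 27              # seconds observed, small hash list
RUNTIME_45K = 42              # seconds observed, 45k hashes

# Time the run would take if the displayed rate held end to end:
ideal = WORDS / DISPLAYED_RATE          # ~15.1 s

# Whatever exceeds that is startup, I/O and compare overhead:
overhead_few = RUNTIME_FEW - ideal      # ~11.9 s
overhead_45k = RUNTIME_45K - ideal      # ~26.9 s

# Effective end-to-end rates:
effective_few = WORDS / RUNTIME_FEW     # ~11.2 MH/s
effective_45k = WORDS / RUNTIME_45K     # ~7.2 MH/s
```

So roughly half the wall-clock time is spent on something other than hashing, and the overhead grows with the number of target hashes.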

Threads: 4
Segment size: 64 MB
CPU: i5-2450M

Is there a bottleneck? Maybe my hard drive?
Reply
#14
Hard drive and RAM can both be a bottleneck, yes.

You also have to take into consideration startup and shutdown time; there is overhead there during which you are not doing any cracking.

I had never really considered this before, but this might be a scenario where the operating system actually makes a difference. I suspect the Linux kernel has better CPU scheduling and NUMA-aware memory/task placement than the Windows kernel does. The preemption model would also likely play a big role on desktop systems, where the system is constantly being interrupted to respond to events. This might be a fun experiment to try.
Reply
#15
(12-29-2014, 10:28 PM)epixoip Wrote: Hard drive and RAM can both be a bottleneck, yes. [...]

Ah, funny you should mention that. I actually have Ubuntu on a USB stick lying around somewhere that I was going to test your theory with, but I forgot to, haha. I'll give it a try right away, and maybe install it as a dual boot while I'm at it.

Btw, my hard drive read speed is 65 MB/s, and I have 8 GB of RAM.
Reply
#16
65 MB/s is pretty slow, but not atypical of a laptop with a spinny disk.
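As a rough sanity check (assuming an average of ~9 bytes per line, which is a guess), sequential read time alone for a 302M-word list works out to:

```python
WORDS = 302_000_000
AVG_LINE_BYTES = 9            # assumed: ~8-char word plus newline

wordlist_bytes = WORDS * AVG_LINE_BYTES          # ~2.7 GB

for mb_per_s in (65, 120):
    seconds = wordlist_bytes / (mb_per_s * 1_000_000)
    print(f"{mb_per_s} MB/s -> {seconds:.0f} s just to read the list")
```

At 65 MB/s that's on the order of 42 seconds just to stream the file off disk, which is in the same ballpark as the runtimes you reported.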

For this task RAM quantity is not quite as important as RAM speed, and perhaps even more important than that is NUMA-aware scheduling.
Reply
#17
If your laptop has an ExpressCard slot you can use one of those ExpressCard-to-PCIe adapters. I used one with my old (2008) Dell laptop and, to my surprise, it worked quite well with an ATI HD 5770 desktop GPU. I haven't tried a higher-end GPU, but it's something to think about if you're limited by laptop hardware.
Reply
#18
Well, I've installed Ubuntu and downloaded hashcat onto it, but it is still running at the same speed, which leads me to believe my hard drive is the bottleneck. I will buy the external hard drive, because it has a read speed of 120 MB/s, to see if that makes a difference.
Reply
#19
(12-30-2014, 05:52 PM)Saint Wrote: Well, I've installed Ubuntu and downloaded hashcat onto it, but it is still running at the same speed, which leads me to believe my hard drive is the bottleneck. I will buy the external hard drive, because it has a read speed of 120 MB/s, to see if that makes a difference.

Your logic is flawed: unless you are using ATA-1 or something, you should be bound by processing limits. Just run 'top' or something and check the CPU usage. If you are still convinced it's the I/O, apply some rules and see whether the speed changes; rules mean more processing per word, which slows how fast hashcat pulls data from the disk, so a disk bottleneck would stop mattering.
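Another quick way to eyeball the split, sketched in Python (caveat: the OS page cache can serve the second pass from RAM, so for an honest disk number use a cold cache or a file larger than RAM; the wordlist here is a throwaway one generated just for the sketch):

```python
import hashlib
import os
import tempfile
import time

# Build a small throwaway wordlist so the sketch is self-contained.
path = os.path.join(tempfile.mkdtemp(), "words.txt")
with open(path, "w") as f:
    for i in range(200_000):
        f.write(f"password{i}\n")

def read_only():
    # Disk (or page cache) work only.
    n = 0
    with open(path, "rb") as f:
        for _ in f:
            n += 1
    return n

def read_and_hash():
    # Disk work plus the MD5 a cracker would do per candidate.
    n = 0
    with open(path, "rb") as f:
        for line in f:
            hashlib.md5(line.rstrip(b"\n")).digest()
            n += 1
    return n

t0 = time.perf_counter(); read_only(); t_read = time.perf_counter() - t0
t0 = time.perf_counter(); read_and_hash(); t_hash = time.perf_counter() - t0
print(f"read only: {t_read:.3f}s  read+hash: {t_hash:.3f}s")
```

If the read-only pass already accounts for most of the read-and-hash time, you are I/O bound; if hashing dominates, the CPU is the bottleneck.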
Reply
#20
I am going to put the wordlists and hashcat onto the external hard drive and run them through Lubuntu, which I have on my USB stick.

Lubuntu is blazingly fast, by the way, but still has all of the major functions you need. A perfect little operating system, in my opinion.
Reply