oclHashcat-plus v0.13 and oclHashcat-lite v0.14
#21
Upgraded my AMD drivers and installed the latest version of hashcat-plus.

Works great: did my test network in 7 seconds using a password list.

AMAZING, keep up the good work...
#22
(02-02-2013, 09:22 AM)James1 Wrote: Thank you. But the problem with dictionaries is not completely fixed. The limit is now 239-240 dictionary files: when loading 300 dictionary files, oclHashcat-plus v0.13 does not start the attack. Thanks a lot.

... the hell is wrong with you, boy?
#23
who needs -a3? there's a dict for that!
#24
(02-02-2013, 10:37 PM)undeath Wrote: who needs -a3? there's a dict for that!

:D :D :D

That's very good! :)
#25
(02-01-2013, 03:21 PM)Kuci Wrote: I'm getting clBuildProgram() -11 errors on all of my ATI Radeon HD 4800 cards. I'm using Catalyst 12.6 because these cards are not supported in newer drivers. So, are my cards now officially unsupported by oclHashcat?

Hi Kuci,
AMD supports the HD4800 with Catalyst 13.1. The driver was released on Jan 21. I own an HD4830 and it runs hashcat-plus 0.12 on this driver (with --force). The new hashcat-plus 0.13 errors out.
#26
(02-02-2013, 12:15 AM)nylithic Wrote: MSSQL 2005 went down from 1000 M/s to 192 M/s. I wasn't expecting that big of a drop.
The speed meters now work differently for salted algorithms. See:
https://hashcat.net/trac/ticket/24
https://hashcat.net/trac/ticket/46 (the last couple of posts)

EDIT: and this changelog entry:
Code:
type: feature
file: host programs
desc: for salted algorithms, report the number of words tried per second, not crypts
trac: #24
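
In other words, the cracking itself didn't get slower; the meter changed units from crypts per second to unique words per second, which for a multi-salt hashlist divides the displayed figure by roughly the salt count. A quick sketch of the relationship (plain Python, not hashcat source; the five-salt figure is just a hypothetical that would roughly fit nylithic's numbers):

Code:
# Old meter: every (word, salt) pair counted as one crypt per second.
# New meter: only unique candidate words are counted per second.

def new_meter_reading(old_crypts_per_sec, salt_count):
    """Words tried per second, derived from the old crypts/s figure."""
    return old_crypts_per_sec / salt_count

# Hypothetical hashlist with 5 unique salts:
print(new_meter_reading(1000e6, 5))  # 200e6 words/s -- same work, smaller number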
#27
Again, good job atom! Luxurious updates, thanks.
#28
(02-03-2013, 06:02 PM)kolme Wrote:
(02-01-2013, 03:21 PM)Kuci Wrote: I'm getting clBuildProgram() -11 errors on all of my ATI Radeon HD 4800 cards. I'm using Catalyst 12.6 because these cards are not supported in newer drivers. So, are my cards now officially unsupported by oclHashcat?

Hi Kuci,
AMD supports the HD4800 with Catalyst 13.1. The driver was released on Jan 21. I own an HD4830 and it runs hashcat-plus 0.12 on this driver (with --force). The new hashcat-plus 0.13 errors out.
Same for me.
#29
The HD4000 series is not supported anymore.
#30
(02-02-2013, 01:19 PM)Dolphin Wrote: I'm personally seeing a slight decrease in performance after upgrading to Catalyst 13.1 (up to 15% on some algorithms), but that's not really a reason to complain. I'm glad the bugs have been fixed, as I've just been turning my clock back for 2 months :P.

I really don't see the point in having 300 dictionaries. I understood the need to fix the issue of running 20 dictionaries together, but I really don't see why you would need to use 300 dictionaries at once. If you have multiple small files, just download a program that combines them into one; there are heaps around.

Hardly a bug in my eyes, but you could submit it at https://hashcat.net/trac/
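
To Dolphin's point, merging files hardly even needs a separate program; here's a rough sketch, assuming plain newline-separated wordlists (paths and filenames are made up):

Code:
# Merge several wordlists into one file, dropping exact duplicate lines.
# Keeps the seen-set in memory, so it suits small-to-medium lists.
import glob

seen = set()
with open("combined.txt", "w", encoding="latin-1") as out:
    for path in sorted(glob.glob("wordlists/*.txt")):
        with open(path, encoding="latin-1") as f:
            for word in f:
                if word not in seen:
                    seen.add(word)
                    out.write(word)

That said, merging sidesteps the question rather than answering it: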

First, each of us works differently, and that's an advantage: as a community we can accomplish more (crack more passwords, hashes, password types, and so on) than we would working individually.

Second, the core questions here are:
Is the limit a hard constraint imposed by external forces? If so, and we're at that limit, we're done. If not, continue.

Is the limit a soft constraint imposed by external forces (for example, cracking speed decreases even for only 2 dictionaries if the limit is increased)? If so, and we're at that limit, we could hold a reasonable discussion of options, but there's a solid reason for the limit being there. If not, continue.

Do the developers want to impose an arbitrary limit? If so, that is their privilege, though I would personally advocate allowing the program to accept as many dictionaries as it can before a soft or hard constraint is hit.
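
As a concrete illustration of the hard/soft distinction, operating systems expose exactly such a pair of limits for open file descriptors; whether the ~240-file ceiling is actually related to this is pure speculation on my part, but it shows what an externally imposed constraint looks like (Unix-only sketch):

Code:
# Query this process's open-file limits on a Unix system.
# The soft limit can be raised by the process up to the hard limit;
# the hard limit is the externally imposed ceiling.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft limit:", soft, "hard limit:", hard)
# e.g. "soft limit: 1024 hard limit: 4096" on a typical Linux box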

I would respectfully request that, whether the limit turns out to be hard, soft, or arbitrary, once it is stable it be listed in the help text.


Third, as far as 300+ dictionaries go, I can understand techniques that split dictionaries into many subsections: common [words | names | surnames | jargon | sports X | sports Y | etc.], uncommon [...], rare [...], short [...], long [...], various keywalking sets, etc. This by itself could get fairly large; if someone does this for 30 or 40 languages, and regenerates keywalking sets for dozens of different keyboard layouts, it multiplies very quickly.
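
To put rough numbers on that multiplication (all counts hypothetical):

Code:
# Back-of-the-envelope count of how fast per-category wordlists multiply.
categories = 10        # words, names, surnames, jargon, sports, ...
tiers = 3              # common, uncommon, rare
languages = 40
keywalk_layouts = 24   # keywalking sets regenerated per keyboard layout

total = categories * tiers * languages + keywalk_layouts
print(total)  # 1224 files -- well past a 239-240 file limit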