oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - Printable Version

+- hashcat Forum (https://hashcat.net/forum)
+-- Forum: Deprecated; Ancient Versions (https://hashcat.net/forum/forum-46.html)
+--- Forum: Very old oclHashcat-plus Announcements (https://hashcat.net/forum/forum-19.html)
+--- Thread: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 (/thread-2024.html)
RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - dwjs1974 - 02-02-2013

Upgraded the AMD drivers and the latest version of hashcat-plus. Works great; it did my test network in 7 seconds using a password list. AMAZING, keep up the good work...

RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - epixoip - 02-02-2013

(02-02-2013, 09:22 AM)James1 Wrote: Thank you. But the problem with dictionaries is not completely fixed. Now the limit is 239-240 dictionary files. When loading 300 dictionary files, oclHashcat-plus v0.13 does not start cracking. Thanks a lot.

... the hell is wrong with you, boy?

RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - undeath - 02-02-2013

Who needs -a3? There's a dict for that!

RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - Hash-IT - 02-02-2013

(02-02-2013, 10:37 PM)undeath Wrote: who needs -a3? there's a dict for that!

That's very good!

RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - kolme - 02-03-2013

(02-01-2013, 03:21 PM)Kuci Wrote: I'm getting clBuildProgram() -11 errors on all of my ATI Radeon HD 4800 cards. I'm using Catalyst 12.6 because these cards are not supported in newer drivers. So, are my cards now officially unsupported by oclHashcat?

Hi Kuci, AMD supports the HD4800 with Catalyst 13.1. The driver was released on Jan 21. I own an HD4830 and it runs hashcat-plus 0.12 on this driver (with force). The new hashcat-plus 0.13 errors out.

RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - M@LIK - 02-03-2013

(02-02-2013, 12:15 AM)nylithic Wrote: MSSQL 2005 went down from 1000M/s to 192M/s. I wasn't expecting that big of a jump.

The speed meters are now different for salted algos. See:
https://hashcat.net/trac/ticket/24
https://hashcat.net/trac/ticket/46 (the last couple of posts)

EDIT: and this:

Code: type: feature

RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - M@LIK - 02-03-2013

Again, good job atom! Luxurious updates, thanks.
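The 239-240 wordlist cap James1 describes can usually be sidestepped by merging many dictionaries into a single file before the run, so only one wordlist argument is ever passed. A minimal sketch of the idea; all directory and file names here are hypothetical:

```shell
# Merge many small wordlists into one file before the cracking run.
mkdir -p wordlists
printf 'password\n123456\n' > wordlists/common.txt
printf 'zxcvbn\n'           > wordlists/keywalk.txt
cat wordlists/*.txt > merged.txt
wc -l < merged.txt   # 3 candidates now live in a single file
```

merged.txt can then be supplied as the lone dictionary argument (or the stream piped on stdin, which oclHashcat-plus reportedly also accepts), so the per-invocation file limit never comes into play.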
RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - Kuci - 02-04-2013

(02-03-2013, 06:02 PM)kolme Wrote:
(02-01-2013, 03:21 PM)Kuci Wrote: I'm getting clBuildProgram() -11 errors on all of my ATI Radeon HD 4800 cards. I'm using Catalyst 12.6 because these cards are not supported in newer drivers. So, are my cards now officially unsupported by oclHashcat?

Same for me.

RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - undeath - 02-04-2013

The HD4000 series is not supported anymore.

RE: oclHashcat-plus v0.13 and oclHashcat-lite v0.14 - Incisive - 02-04-2013

(02-02-2013, 01:19 PM)Dolphin Wrote: I'm personally seeing a slight decrease in performance when upgrading to Catalyst 13.1 (up to 15% on some algorithms), but that's not really a reason to complain. I'm glad that the bugs have been fixed, as I've just been turning my clock back for 2 months.

First, every one of us works in different ways, and that's advantageous: it allows us, as a community, to accomplish more (crack more passwords, hashes, password types, and so on) than we would working individually.

Second, the core questions here are:

1. Is the limit a hard constraint imposed by external forces? If so, and we're at that limit, we're done. If not, continue.
2. Is the limit a soft constraint imposed by external forces (for example, cracking speed decreases even with only 2 dictionaries if the limit is raised)? If so, and we're at that limit, we could hold a reasonable discussion of options, but there's a solid reason for the limit being there. If not, continue.
3. Do the developers want to impose an arbitrary limit? If so, that is their privilege, though I would personally advocate letting the program accept as many dictionaries as it can before a soft or hard constraint is hit.

I would respectfully request that, whether the limit is hard, soft, or arbitrary, it be listed in the help text once it is stable.
Third, as far as 300+ dictionaries go, I can understand techniques that split dictionaries into many subsections: common [words | names | surnames | jargon | sports X | sports Y | etc.], uncommon [...], rare [...], short [...], long [...], various keywalking sets, and so on. This by itself could get fairly large; if someone does this for 30 or 40 languages, and regenerates keywalking sets for dozens of different keyboard layouts, the file count multiplies very quickly.
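To make the "multiplies very quickly" point concrete, here is a back-of-the-envelope sketch; every figure is hypothetical, chosen only to show how fast such a per-language, per-layout split blows past a roughly 240-file cap:

```shell
# Hypothetical dictionary split: category files per language, plus
# one regenerated keywalk set per keyboard layout.
categories=12   # common, uncommon, rare, short, long, jargon, ...
languages=40    # languages covered by the split
layouts=24      # keyboard layouts with their own keywalk sets

total_files=$(( categories * languages + layouts ))
echo "$total_files"   # 504 files, more than double a ~240-file limit
```

Even these modest counts land at 504 files, so a per-run file limit would force merging or batching long before the collection is exhausted.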