hashcat beta and Ubuntu 16.04 summary
#1
Many issues have been reported when running the latest beta (which will also be the next release) on Ubuntu 16.04. 

I've installed it myself to see what's going on. 

Here we go:
  • Make sure to install Ubuntu 16.04 Server as minimally as possible: no X11, no drivers, etc.
There's ultimately just one package you need:
  • ocl-icd-libopencl1
Additionally, if you compile from GitHub source (otherwise you can skip this):
  • opencl-headers
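As a minimal sketch, installing the base packages above looks like this (the second line only applies if you build hashcat from source):

```shell
# Runtime ICD loader that hashcat needs on every setup
sudo apt-get install --no-install-recommends ocl-icd-libopencl1

# Only needed when compiling hashcat from GitHub source
sudo apt-get install --no-install-recommends opencl-headers
```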
And that's it for the base. Now, depending on your hardware you need to install:

For NVIDIA GPUs:
  • nvidia-361
  • nvidia-opencl-icd-361
I was shocked that the default installation pulls in 350+ additional dependency packages. You do not need them. 
When you install these packages, you really want to do it like this: 

Code:
$ apt-get install --no-install-recommends nvidia-361 nvidia-opencl-icd-361

Triple-check that you did not install any other packages beginning with nvidia-*
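One way to check this, as a sketch:

```shell
# List installed packages whose name starts with nvidia-;
# only the driver and the OpenCL ICD should show status "ii"
dpkg -l 'nvidia-*' | grep '^ii'
```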

For AMD GPUs:

You typically do _not_ want to install Ubuntu 16.04, because AMD has stopped updating the Catalyst driver and will not support future releases. The latest Ubuntu that supports Catalyst is 15.10. AMD users have to decide between switching to Windows or using the OSS drivers, which are several times slower than Catalyst. If you absolutely want to stick with Linux, you need to switch to NVIDIA or wait and pray for better drivers. You have been warned.
  • mesa-opencl-icd
For AMD CPUs:

Support is part of Catalyst, which you can no longer install. Your last resort is pocl, which is not available as an Ubuntu package either: you need to clone it from GitHub and compile it yourself.
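A rough sketch of such a build; note that the dependency package names and build steps below are assumptions on my part, so check pocl's own README for the exact list for your version:

```shell
# Build dependencies (names are assumptions; pocl needs LLVM/Clang and hwloc)
sudo apt-get install build-essential cmake git llvm clang libclang-dev libhwloc-dev

# Fetch and build pocl from source
git clone https://github.com/pocl/pocl.git
cd pocl
mkdir build && cd build
cmake ..
make -j"$(nproc)"
sudo make install
```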

For Intel GPUs:
  • beignet-opencl-icd
For Intel CPUs:

There's no Ubuntu package; you need to install Intel's OpenCL SDK yourself (which is great!)
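Whichever ICD you end up installing, a quick sanity check with the standard clinfo package might look like this; if your CPU or GPU doesn't show up here, hashcat won't see it either:

```shell
# Install clinfo and list every OpenCL platform/device the ICD loader can see
sudo apt-get install clinfo
clinfo | grep -E 'Platform Name|Device Name'
```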
Reply
#2
AMD's driver is available for Ubuntu 16.04 now. http://phoronix.com/scan.php?page=news_i...id-16.20.3
Reply
#3
Thanks. Has anyone tried it?
Reply
#4
(05-30-2016, 09:54 AM)atom Wrote: Thanks. Has anyone tried it?

I have completed a variation of an Ubuntu 16 install. I chose Desktop instead of Server, though.

I had to work through a number of issues in order to get it to work.

I am running an ASUS NVIDIA GTX 1080 Founders Edition with Ubuntu 16 Desktop.

First, after installing to a hard drive from a USB drive, upon reboot, I got an "Out Of Range" error message on my monitor.  After that, I had a black screen and couldn't even toggle to a non-graphical terminal (Ctrl-Alt-F1, Ctrl-Alt-F2, etc...).  I couldn't even see the grub menu for recovery mode (or upstart).  To solve this, I had to boot off the live USB in order to edit /boot/grub/grub.cfg.  I added the word "nomodeset" to the top menu entry, on the linux line, after the word splash.  This allowed me to boot into a working GUI.
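For illustration, the edited linux line in /boot/grub/grub.cfg looked roughly like this; the kernel version and root UUID below are placeholders and will differ on your system:

```
linux /boot/vmlinuz-4.4.0-21-generic root=UUID=... ro quiet splash nomodeset
```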

Next was the actual NVIDIA driver install. I used a method similar to the one outlined here: http://www.sandalssoftwareconsulting.com...ntu-16-04/

I added these lines to my /etc/apt/sources.list:
Code:
#Nvidia proprietary drivers
deb http://ppa.launchpad.net/graphics-drivers/ppa/ubuntu xenial main
deb-src http://ppa.launchpad.net/graphics-drivers/ppa/ubuntu xenial main

Then I removed any existing NVIDIA packages, switched to root, and installed the driver:
Code:
sudo apt-get remove nvidia*
su -
apt-get update && apt-get upgrade
apt-get install --no-install-recommends nvidia-367 nvidia-opencl-icd-367
reboot

After the reboot, I seemed to have everything working, minus the GUI fan control available in nvidia-settings.  That was easily fixed by running "nvidia-settings" and then going to the bottom option in the GUI, "nvidia-settings Configuration", then clicking on "Save Current Configuration" on the bottom right.

After that I closed nvidia-settings and ran "nvidia-xconfig --cool-bits=12" (4 enables the thermal monitor page, which allows configuration of the GPU fan speed, and 8 allows setting per-clock-domain and per-performance-level offsets to apply to clock values).

Finally, after a restart, I had the options in nvidia-settings to control the fan speed manually.

The only issue that remains at the end of this process is not seeing any grub menu during bootup.  But everything else is functional so I can live with that.

The remaining issues were reported by hashcat 3.00 on startup. I didn't have libx11-dev installed, so I fixed that with "sudo apt-get install libx11-dev". I also made the change to /etc/X11/xorg.conf under the Device section to avoid the 702 error: https://hashcat.net/wiki/doku.php?id=timeout_patch
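For reference, the xorg.conf change from the linked wiki page adds an Interactive option to the Device section. Sketched roughly below; the Identifier is a placeholder and your section will contain additional entries:

```
Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    Option     "Interactive" "False"
EndSection
```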

Conclusion: I might try the headless server option in the near future, as opposed to the GUI desktop version of Ubuntu 16.  I will report here if I do.  One issue I predict is controlling clocks and fan speeds in a headless environment.  How to do that has been posted here in these forums, but working that out will be interesting.

Also, I went to all of this effort as a measure of troubleshooting.  I was trying to run hashcat-3.00 in a Windows 10/Cygwin environment, but it was making the entire system crash (a full powerdown).  It was doing this predictably. In other words, in the Windows 10/Cygwin environment, it would run for almost the same amount of time on every attempt before it would crash.  It is also doing this in Ubuntu 16.  I am still trying to rule out issues with the hardware as I am running hashcat 3.00 on a new build with fresh hardware.  It is entirely possible that I may have lost the silicon lottery somewhere along the way.  I will probably go buy another PSU to rule that out as well.  I'm already thinking about changing out the motherboard, but that's another story altogether.

Good night and good luck.
Reply
#5
This is my first post after reading a lot in this forum. I just finished setting up the hardware of my first rig, which I will document in a separate post later. I quickly installed Ubuntu 16.04 and got hashcat running on my GTX 980 and 1070.

By the way: I solved the "out of range" problem (HDMI related?) by "disabling" the cards first (my motherboard has switches for that), booting with the onboard GPU, and then installing the NVIDIA driver.

But I get the same warning about the initial fan speed as in this post. I guess I messed something up with one of the commands I found in the forum while trying to set up the nvidia or X11 settings. I'll start over with a fresh install in the next few days.

I'd suggest creating a new bulletproof and tested HOW-TO like this one, but based on hashcat 3.0, Ubuntu 16.04 LTS, and NVIDIA drivers. I would test it with my GTX 1070 and GTX 980 and write the first version. The mentioned wiki article and atom's comments here are a nice baseline, but I guess for newbies like me some things are missing. We could add tuning tips for the different NVIDIA cards which are floating around in the forum but aren't documented well (at least for hashcat 3.0).

Do I have to install a minimal X11 environment (for NVIDIA cards)? You, atom, say no in the first post of this thread, but talk about xorg.conf in the other one. Am I mixing something up? Furthermore, nvidia-settings complains without X11.

Okay, let's start:

1. Install Ubuntu Server without anything except openssh.

2. Install some packages and disable multiarch (64-bit!): 
Code:
sudo apt install ocl-icd-libopencl1
sudo dpkg --remove-architecture i386
sudo apt update && sudo apt upgrade
sudo sync && sudo reboot

3. Add some repos and install the latest NVIDIA drivers (that currently means 367, correct?): 
Code:
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update
sudo apt-get install --no-install-recommends nvidia-367 nvidia-opencl-icd-367
Is this approach the same as the following? Which one is recommended?
Code:
wget http://de.download.nvidia.com/XFree86/Linux-x86_64/367.35/NVIDIA-Linux-x86_64-367.35.run
sudo sh NVIDIA-Linux-x86_64-367.35.run

4. Install Intel-GPU-Stuff:
Code:
sudo apt install beignet-opencl-icd

5. Install Intel-CPU-Stuff (do I really need the full SDK or just the runtime?):
Code:
wget something
unzip something
run something

6. X11 stuff? I am referring to this post. What exactly do I have to set up?

7. Configure nvidia-settings and nvidia-xconfig. Are there special things for the different GPU models (980 vs. 1070)? Here I ran into errors.

8. Install hashcat...

9. benchmark...
....

I stopped here to see if I'll run into errors again. 

What do you think about the idea of a new HOWTO? And please answer my questions above.
Reply
#6
For the moment I switched to 14.04 and did the configuration as described.

I configured using:
Code:
sudo nvidia-xconfig -s -a --force-generate --allow-empty-initial-configuration --cool-bits=12 --registry-dwords="PerfLevelSrc=0x2222" --no-sli --connected-monitor="DFP-0"
(which seemed to work) and
Code:
sudo nvidia-settings -a GPUPowerMizerMode=1 -a GPUFanControlState=1 -a GPUCurrentFanSpeed=90 -a GPUGraphicsClockOffset[3]=100
which threw some errors saying that some of the variables can't be set because they are read-only...

So the "initial fan speed" issue persists, but I set the fan speed manually to 100% and used the --gpu-temp-disable option. Is there (besides reduced fan lifetime and higher power consumption) a problem with that? For me that's okay for the moment, but if I can help spot this bug (or misconfiguration), just tell me which input you need.

I'm using hashcat v3.00-69-g804ee28 with
Code:
OpenCL Platform #1: NVIDIA Corporation  
======================================
- Device #1: GeForce GTX 1070, 2027/8110 MB allocatable, 15MCU
- Device #2: GeForce GTX 980, 1009/4037 MB allocatable, 16MCU

....

Session.Name...: hashcat
Status.........: Running
Rules.Type.....: File (hashcat-3.00/rules/best64.rule)
Input.Mode.....: File (wordlists/crackstation-human-only.txt)
Hash.Target....: File (hccap/multi.hccap)
Hash.Type......: WPA/WPA2
Time.Started...: Sat Aug 13 20:31:23 2016 (30 mins, 27 secs)
Time.Estimated.: Sun Aug 14 03:26:26 2016 (6 hours, 24 mins)
Speed.Dev.#1...:   286.6 kH/s (12.54ms)
Speed.Dev.#2...:   199.3 kH/s (12.43ms)
Speed.Dev.#*...:   486.0 kH/s
Recovered......: 0/6 (0.00%) Digests, 0/6 (0.00%) Salts
Progress.......: 18248438622/29461118610 (61.94%)
Rejected.......: 348976320/18248438622 (1.91%)
Restore.Point..: 38922456/63768655 (61.04%)

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 367.35                 Driver Version: 367.35                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 1070    Off  | 0000:01:00.0     Off |                  N/A |
|100%   65C    P2   136W / 151W |   1146MiB /  8110MiB |    100%      Default |
+-------------------------------+----------------------+----------------------+
|   1  GeForce GTX 980     Off  | 0000:02:00.0     Off |                  N/A |
|100%   68C    P2   171W / 180W |   1201MiB /  4037MiB |    100%      Default |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID  Type  Process name                               Usage      |
|=============================================================================|
|    0      4045    G   /usr/bin/X                                       0MiB |
|    0      8950    C   ./hashcat-3.00/hashcat64.bin                  1145MiB |
|    1      4045    G   /usr/bin/X                                       0MiB |
|    1      8950    C   ./hashcat-3.00/hashcat64.bin                  1200MiB |
+-----------------------------------------------------------------------------+

user@server:~/hashcat-3.00$ nvidia-settings -q GPUCurrentClockFreqs

  Attribute 'GPUCurrentClockFreqs' (server:0.0): 1728,3802.
    'GPUCurrentClockFreqs' is a packed integer attribute.
    'GPUCurrentClockFreqs' is a read-only attribute.
    'GPUCurrentClockFreqs' can use the following target types: X Screen, GPU.
  Attribute 'GPUCurrentClockFreqs' (server:0.1): 1240,3004.
    'GPUCurrentClockFreqs' is a packed integer attribute.
    'GPUCurrentClockFreqs' is a read-only attribute.
    'GPUCurrentClockFreqs' can use the following target types: X Screen, GPU.
  Attribute 'GPUCurrentClockFreqs' (server:0[gpu:0]): 1728,3802.
    'GPUCurrentClockFreqs' is a packed integer attribute.
    'GPUCurrentClockFreqs' is a read-only attribute.
    'GPUCurrentClockFreqs' can use the following target types: X Screen, GPU.
  Attribute 'GPUCurrentClockFreqs' (server:0[gpu:1]): 1240,3004.
    'GPUCurrentClockFreqs' is a packed integer attribute.
    'GPUCurrentClockFreqs' is a read-only attribute.
    'GPUCurrentClockFreqs' can use the following target types: X Screen, GPU.
Reply
#7
(08-13-2016, 09:06 PM)hashcrash Wrote: For the moment I switched to 14.04 and did the configuration as described.

[...]

Hey hashcrash, how did you set your fans manually to 100 percent?
I've tried different options, but that didn't work for me. I set the gpu-temp to 90, but after 10 minutes it cuts out, saying I've reached my max temp.
Reply
#8
Like this:
Code:
nvidia-settings -a [gpu:0]/GPUFanControlState=1 -a [fan-0]/GPUTargetFanSpeed=100

EDIT: but it's not necessary anymore: https://hashcat.net/forum/thread-5681-po...l#pid30787
Reply