Hashcat 4.1.0 slow speed
I'm using a GTX 1050 graphics card and I have installed the latest driver. There is no speed problem with hashcat version 3.00: my card runs about 95 kH/s with it. The problem occurs with hashcat 4.1.0, where the card runs at only 1 kH/s. When I append -w 3 to the command line, the speed grows to 15 kH/s, which is still slow compared with hashcat 3.00. How can I speed it up?
Are you sure that you are comparing the correct values?

Did you run a benchmark (-b) with both versions?

A lot of people make the mistake of comparing apples with oranges, for instance by comparing a huge hccapx file containing dozens of WLAN networks against an old hccap file with only a single network. Please double-check that your numbers are correct and that your testing/benchmark was performed as it should be (e.g. with --benchmark).
Yes, values are correct. I have run benchmarks for both.

Hashtype: WPA/WPA2  97557 H/s (89.38ms)  -- hashcat 3.00
Hashmode: 2500 - WPA/WPA2 (Iterations: 4096)  102.2 kH/s (49.58 ms) @ Accel:128 Loops:32 Thr:1024 Vec:1  -- hashcat 4.1.0

The mask is the problem. Why does the hashcat 4.1.0 speed drop gradually? I get different speeds for different masks. Hashcat 4.1.0 gets only 1 kH/s for the mask "dfrtg?d?d?d". Full speed, about 100 kH/s with -w 3, exists for the mask ?d?d?d?d?d?d?d?d, but even there the speed drops gradually (100k, 99k, 98k, 97k, ...).
A static mask prefix is bad for performance.
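A rough way to see why (my own sketch, not hashcat's internal logic): the GPU needs many candidates in flight to stay busy (the 4.1.0 benchmark line above shows Accel:128 x Thr:1024, i.e. on the order of 131k work items), but every static character in a mask contributes a factor of 1 to the keyspace, so a heavily static mask leaves very few candidates to parallelize over:

```python
# Sizes of hashcat's built-in charsets (?d digits, ?l lower, ?u upper,
# ?s specials, ?a all printable ASCII).
CHARSET_SIZES = {"?d": 10, "?l": 26, "?u": 26, "?s": 33, "?a": 95}

def mask_keyspace(mask_tokens):
    """Multiply the sizes of the variable positions; static chars add nothing."""
    n = 1
    for tok in mask_tokens:
        n *= CHARSET_SIZES.get(tok, 1)  # a literal character contributes x1
    return n

# "dfrtg?d?d?d": only 10**3 = 1000 candidates in total
print(mask_keyspace(["d", "f", "r", "t", "g", "?d", "?d", "?d"]))  # 1000

# "?d?d?d?d?d?d?d?d": 10**8 candidates, plenty to keep the GPU fed
print(mask_keyspace(["?d"] * 8))  # 100000000
```

With only 1000 candidates total, the GPU never fills up, so the reported H/s collapses regardless of how fast the card actually is.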
Thanks, I didn't know that.
Note that for slow hashes like WPA you can still get full cracking speed even with a static-prefix mask by using a pipe.

Full speed test:

root@ht:~/hashcat# ./hashcat -b -m 2500


Speed.Dev.#1.....:    62363 H/s (81.34ms) @ Accel:128 Loops:32 Thr:1024 Vec:1

With a pipe (make sure to write the prefix backwards in the prepend rules):

root@ht:~/hashcat# ./hashcat -a 3 ?a?a?a! --stdout | ./hashcat -m 2500 hashcat.hccapx -j '^h ^s ^a ^h' -w 3
hashcat (v4.1.0-29-g547025ec) starting...


Starting attack in stdin mode...


Speed.Dev.#1.....:    62680 H/s (81.33ms) @ Accel:128 Loops:32 Thr:1024 Vec:1



Status...........: Cracked
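To see why the prefix has to be written backwards: each ^X rule prepends one character, so the rules are applied in order and each new character lands in front of the previous one. A small sketch of that behavior (my own illustration of the prepend rule, not hashcat code):

```python
def apply_prepend_rules(word, rules):
    """Apply hashcat-style ^X prepend rules in order: each puts X at position 0."""
    for r in rules:
        if r.startswith("^"):
            word = r[1] + word  # prepend the rule's character
    return word

# The rules ^h ^s ^a ^h, applied left to right, build up the prefix "hash":
print(apply_prepend_rules("cat!", ["^h", "^s", "^a", "^h"]))  # hashcat!
```

So to prepend "hash" you write its characters in reverse order as rules: ^h ^s ^a ^h.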
Ooh, that's good to know - thanks!