02-18-2019, 01:58 AM
I think not all AMD cards will work in an eGPU on macOS by default. Here's the official Apple document on what is supported: https://support.apple.com/en-ca/HT208544
If you're not worried about getting support from Apple, there's also the unofficial route: https://egpu.io/
I'm running an AMD RX 580 in a Razer Core X and it Just Worked.
Code:
$ hashcat -d 3 -m 1000 -b
hashcat (v5.1.0) starting in benchmark mode...
Benchmarking uses hand-optimized kernel code by default.
You can use it in your cracking session by setting the -O option.
Note: Using optimized kernel code limits the maximum supported password length.
To disable the optimized kernel code in benchmark mode, use the -w option.
OpenCL Platform #1: Apple
=========================
* Device #1: Intel(R) Xeon(R) W-2140B CPU @ 3.20GHz, skipped.
* Device #2: AMD Radeon Pro Vega 56 Compute Engine, skipped.
* Device #3: AMD Radeon RX 580 Compute Engine, 2048/8192 MB allocatable, 36MCU
Benchmark relevant options:
===========================
* --opencl-devices=3
* --optimized-kernel-enable
Hashmode: 1000 - NTLM
Speed.#3.........: 19455.1 MH/s (61.15ms) @ Accel:256 Loops:512 Thr:256 Vec:1
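If the benchmark looks good, a real cracking session can target the eGPU the same way: -d 3 selects the RX 580, and -O enables the optimized kernels the benchmark mentions (at the cost of a lower maximum supported password length). The hash file and wordlist names here are just placeholders, not anything from my setup.

Code:
$ hashcat -d 3 -m 1000 -a 0 -O ntlm-hashes.txt wordlist.txt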