Posts: 117
Threads: 6
Joined: Aug 2011
02-16-2015, 03:02 PM
While Nvidia fanboys are being fooled by Nvidia and feeling cool, AMD is working silently on its new card to top the 290X.
It will include the Fiji GPU, and Cooler Master will develop the massive cooling :=)) (we can be sure it's water)
300 watts, including the brand-new HBM memory.
Also not news, we all know it's made in 20 NanometerZzzz
Very nice info about AMD's new HBM:
http://wccftech.com/amd-20nm-r9-390x-fea...han-gddr5/
http://videocardz.com/amd/radeon-rx-300/radeon-r9-390x
Careful: specs are not validated yet...
Release Date: April 2015
Launch Price: $599 USD
Board Model: AMD C880
GPU Model: 28nm Fiji XT
Cores : TMUs : ROPs: 3584 : 224 : 64

Clocks
Base Clock: 1200 MHz
Memory Clock (Effective): 1000 (1000) MHz

Memory
Memory Size: 4096 MB HBM
Memory Bus Width: 4096-bit
Memory Bandwidth: 512 GB/s

Physical
Interface: PCI-Express 3.0 x16
Thermal Design Power: 300 W
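For what it's worth, the 512 GB/s figure is at least internally consistent: a 4096-bit bus at 1000 MHz effective is 1 Gbps per pin. A quick sanity check in Python, using only the rumored numbers from the table above:

Code:
# Sanity check on the rumored HBM bandwidth:
# bus width (bits) * effective rate (Gbps per pin) / 8 bits per byte
bus_width_bits = 4096
effective_rate_gbps = 1.0   # 1000 MHz effective, per the table

bandwidth_gb_s = bus_width_bits * effective_rate_gbps / 8
print(bandwidth_gb_s)       # 512.0, matching the quoted 512 GB/s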
3584 sounds good for hashcat ;--)
Posts: 381
Threads: 1
Joined: Aug 2014
02-16-2015, 04:06 PM
(This post was last modified: 02-16-2015, 09:58 PM by Flomac.)
Although these are only rumors: 3584 shaders will be the 390.
The 390X comes with 4096 shaders.
Watercooling is for sure.
Not so the 300W.
But definitely the 28nm process.
EDIT: It's expected in the next four to six weeks, according to the people who should know.
The 380X will be a rebranded Hawaii (290X).
Tonga with 2048 shaders will play the 370X.
Posts: 2,936
Threads: 12
Joined: May 2012
Just as worthless for oclHashcat as the 295X2.
Posts: 117
Threads: 6
Joined: Aug 2011
We will see. It's AMD I trust.
Posts: 2,936
Threads: 12
Joined: May 2012
There's no innovation here. AMD's game plan for the last generation and the next is simply "let's keep adding more cores to our aging architecture." That's obviously not a scalable plan, and it's why they're releasing a reference-design card that requires water.
Water cooling is an automatic non-starter. The power consumption is grotesque for a single-GPU card. They claim 300W, which means it will draw around 375W in practice (that's 6990 territory!). But of course PowerTune will try to keep power consumption under 300W, so the card will throttle under load regardless of temperature.
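To spell out the throttling mechanism: PowerTune is a power governor, so once measured board power exceeds the cap, it steps the core clock down no matter how cool the card is running. A rough illustrative sketch in Python (my simplification with made-up step sizes, not AMD's actual control loop):

Code:
# Illustrative power-cap governor in the spirit of PowerTune
# (hypothetical simplification, not AMD's real algorithm).
POWER_CAP_W = 300

def next_clock_mhz(clock_mhz, measured_power_w,
                   step_mhz=13, min_mhz=300, max_mhz=1200):
    # Over the cap: step the clock down, regardless of temperature.
    if measured_power_w > POWER_CAP_W:
        return max(min_mhz, clock_mhz - step_mhz)
    # Under the cap: recover toward the 1200 MHz base clock.
    return min(max_mhz, clock_mhz + step_mhz)

# A sustained ~375W compute load keeps forcing the clock below base:
clock = 1200
for measured in [380, 377, 375, 374, 372]:
    clock = next_clock_mhz(clock, measured)
print(clock)  # 1135 MHz and still falling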
This card is proof that AMD's days are numbered.
Posts: 36
Threads: 6
Joined: Jan 2015
Quote:This card is proof that AMD's days are numbered.
Hope you are wrong. Once AMD is out of the GPU game, Nvidia has no one motivating them to try a bit harder. So if you are right, the GTX 980 is the best card we'll see in the near future.
Hmm?
Posts: 2,936
Threads: 12
Joined: May 2012
Not really. Nvidia is still the de facto compute vendor. AMD was never competitive in that space.
But remember that GPGPU is nothing more than a hack. Manycore CPUs will eventually displace GPUs for accelerated computation, possibly even within the next 3-5 years.
Posts: 27
Threads: 3
Joined: Feb 2015
02-21-2015, 03:39 AM
(This post was last modified: 02-21-2015, 03:45 AM by logistix111.)
(02-17-2015, 01:14 AM)epixoip Wrote: Not really. Nvidia is still the de facto compute vendor. AMD was never competitive in that space.
But remember that GPGPU is nothing more than a hack. Manycore CPUs will eventually displace GPUs for accelerated computation, possibly even within the next 3-5 years.
Yes! There is some truth and foresight to this. GPUs are kind of maxing out these days for the general public and consumers. So just imagine if a person wrote their own code to take advantage of this much horsepower:
http://bit.ly/1CXMlMH
ASIC stands for "application-specific integrated circuit." REPEAT AFTER ME: "application-specific" (the Bitcoin stuff is SHA-256, for example). This is where things get scary. Scary along the lines of FPGAs. Scary along the lines of a particular agency that has a motto: "We BUILD what we cannot BUY."
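For the record, the fixed function a Bitcoin mining ASIC hard-wires is just double SHA-256 over an 80-byte block header. A minimal Python sketch (the all-zero header is a dummy placeholder, not real block data):

Code:
import hashlib

# Double SHA-256 over an 80-byte block header: the one and only
# job a Bitcoin mining ASIC is built for.
header = bytes(80)  # dummy header, for illustration only

digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
print(digest[::-1].hex())  # Bitcoin convention: display little-endian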
Posts: 381
Threads: 1
Joined: Aug 2014
02-21-2015, 01:18 PM
(This post was last modified: 02-21-2015, 09:29 PM by Flomac.)
ASICs have been around forever. The NSA surely has the latest, even commercially unavailable, technology for breaking hashes.
Besides that, I don't think CPUs will displace GPUs for accelerated computing; the next big step will be CPU and GPU working closely together, like AMD's HSA or Nvidia's NVLink. And maybe both will merge together forever, like the FPU did (I come from the days when you had to buy the FPU separately and plug it into its own socket on the mainboard...).
Posts: 36
Threads: 6
Joined: Jan 2015
Guess we have to save this question for Snowden's AMA.