
View Full Version : AMD



John P. Myers
09-25-13, 10:26 PM
So I managed to dig up very, very few specs on the new AMD GPUs, even after watching the entire live broadcast of their launch party in Hawaii.

First, the 7970 equivalent of the new series will be called the R9 290X. As far as crunching goes, TFLOPS rating is ~25% higher than the 7970 (~5 TFLOPS vs. ~4 TFLOPS). Also seems to have 4GB VRAM.

These can be preordered beginning Oct. 3rd. Price unknown, but guessing $600 - $650.

Overall, the higher-end GPUs will be known as the R9 200 series. The lower end will be the R7 200 series, and even lower the R5 200 series, both of which appear to be rebrands of the 7000 series.

No more Xfire connections on the top of the card. GPUs will now communicate through the PCIe 3.0 bus. What if you don't have 3.0? No idea. Xfire is overrated anyway :p

Audio processing is offloaded to the GPU. No more sound cards needed or CPU cycles used up.

Stated to be more power efficient, though TDP is unknown.

Edit: 300W TDP mentioned. Not official, but likely. Also, 6GB versions of the R9 290X are expected later.

zombie67
09-26-13, 12:23 AM
Yeah, once you approach 10k, time to restart the numbering. And clever with the two sets of numbers. R9/7/5 is like Intel's i7/5/3. People are comfortable with three levels of performance; it feels familiar and easy to understand. And then using a separate xxx set of numbers for the specific models allows room to grow again.

DrPop
09-26-13, 01:36 AM
It makes sense, but why not have the xxx part start at 100? :) Seems like they just X'ed out a whole year's worth of naming by starting at the 200 series instead of 100. :p heh...

zombie67
09-26-13, 01:47 AM
Nobody wants the first gen of anything...

John P. Myers
09-27-13, 03:16 AM
As for the preorders beginning on the 3rd, there will only be 8,000 units available. Worldwide. Good luck with that :)

John P. Myers
09-30-13, 11:32 PM
The NDA expires Oct. 15th. Availability could also be the same day or soon after, but surely not later than the 24th. Not much more waiting to go :)

John P. Myers
10-05-13, 05:30 PM
If this is any indication...


[attachment 1530]

John P. Myers
10-05-13, 05:50 PM
***Update***

New info that's probably true :p

AMD claimed the R9 290X would have a bandwidth >300 GB/s. Turns out it's rated at 320 GB/s.

AMD claimed the R9 290X would compute at a rate >5 TFLOPS.
Base clock appears to be 800 MHz, which means 4.5056 TFLOPS.
Boost clock appears to be 1000 MHz, which means 5.632 TFLOPS.

Quite impressive. Oh and it has a 512-bit bus.
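For anyone wanting to check those numbers: a quick sketch of the standard GFLOPS formula (shaders x 2 FLOPs per clock for fused multiply-add x clock rate). The 2816 stream processor count isn't stated above; it's the figure implied by the math (4505.6 / (2 x 0.8) = 2816).

```python
# FP32 rate for a GCN GPU: shaders * 2 FLOPs per clock (FMA) * clock in GHz.
# 2816 stream processors is implied by the figures above: 4505.6 / (2 * 0.8) = 2816.
def sp_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000

print(sp_gflops(2816, 800))   # 4505.6 at the 800 MHz base clock
print(sp_gflops(2816, 1000))  # 5632.0 at the 1000 MHz boost clock
```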

Fire$torm
10-06-13, 01:41 AM
***Update***

New info that's probably true :p

AMD claimed the R9 290X would have a bandwidth >300GB/s. Turns out to be rated at 320GB/s.

AMD claimed the R9 290X would compute at a rate >5 TFLOPS.
Base clock appears to be 800MHz which means 4.5056 TFLOPS
Boost clock appears to be 1000MHz which means 5.632 TFLOPS

Quite impressive. Oh and it has a 512-bit bus.

Just out of curiosity, with the 512 bit bus and those processing speeds, how close does the R9 come to saturating the PCIe 3.0 lanes?

John P. Myers
10-06-13, 04:06 AM
My Titan, which runs at 1071 MHz, posts about 5.2 TFLOPS. I have had it installed in a PCIe 2.0 x1 slot since I got it. Works perfectly fine :)

For gaming, again, it wouldn't saturate even a 2.0 x16 slot. At least not on its own. As I mentioned several posts back, there is no more Xfire bridge on these GPUs. All info is passed between the cards via the PCIe bus. I imagine running quad Xfire on a PCIe 2.0 bus would definitely cause some lag because of this, but 3.0 would have room to spare.

The *effective* bandwidth of PCIe 3.0 is 985 MB/s per lane = 15,760 MB/s (15.39 GiB/s) @ x16. To use it all, a single GPU running a 1080p monitor would have to achieve 1,992 frames per second.
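The back-of-the-envelope math above can be sketched like so, in decimal units (the ~1,992 fps figure comes out if you convert to GiB and MiB instead; the gap is purely the unit convention):

```python
# Sanity check: how many uncompressed 1080p frames per second would it take
# to saturate a PCIe 3.0 x16 link? (Effective rate: 985 MB/s per lane.)
lanes = 16
bw_mb_s = 985 * lanes                 # 15760 MB/s total for x16
frame_mb = 1920 * 1080 * 4 / 1e6      # one 1080p frame at 32-bit color, ~8.29 MB
fps_to_saturate = bw_mb_s / frame_mb
print(round(fps_to_saturate))         # ~1900 fps in decimal units
```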

DrPop
10-06-13, 04:21 AM
. . .The *effective* bandwidth of PCIe 3.0 is 985 MB/s per lane = 15,760 MB/s (15.39 GiB/s) @ x16. To use it all, a single GPU running a 1080p monitor would have to achieve 1,992 frames per second.

Well, good to know I so desperately "NEEDED" those 40 lanes of PCIe 3.0 on these Socket 2011 mobos. :p I might actually use all that bandwidth in 20 years! :)) :D :rolleyes:

John P. Myers
10-06-13, 04:26 AM
Well, good to know I so desperately "NEEDED" those 40 lanes of PCIe 3.0 on these Socket 2011 mobos. :p I might actually use all that bandwidth in 20 years! :)) :D :rolleyes:

LOL well there is an advantage to having 3.0 over 2.0, which is time. Even if you only come up with 100 bytes of data that needs to be passed between the GPU and CPU, 3.0 can pass that data back and forth faster than 2.0, which speeds things up a little :)
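A toy illustration of that point, using the commonly cited effective per-lane rates (~500 MB/s for 2.0, 985 MB/s for 3.0) and ignoring fixed protocol latency, which in practice dominates for payloads this small:

```python
# Time to move a small payload over a single lane, bandwidth-only model.
rates_mb_s = {"PCIe 2.0": 500, "PCIe 3.0": 985}  # effective per-lane rates
payload_bytes = 100
for gen, rate in rates_mb_s.items():
    t_ns = payload_bytes / (rate * 1e6) * 1e9
    print(f"{gen} x1: {t_ns:.0f} ns")  # 3.0 moves the same bytes in about half the time
```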

John P. Myers
10-15-13, 02:49 AM
Sooo...it's the 15th and Z is wondering where the hell his case of R9-290Xs is... :p

Yeah...seems AMD is having issues and has changed the launch date. Fortunately, it's still soon: the 24th, or maybe the 18th. Not even AMD seems to know which :/ E-tailers and review sites are also under orders to keep quiet until then.

Remember the R9-290 and R9-290X are the only new GPUs. The 280X on down, which you can buy right now, are all 7000 series rebrands, the 280X being the 7970 GHz Edition.

Fire$torm
10-15-13, 05:57 PM
Sooo...it's the 15th and Z is wondering where the hell his case of R9-290Xs is... :p

Yeah...seems AMD is having issues and changed the launch date. Fortunately though, it's still soon, the 24th or maybe the 18th. Not even AMD seems to know which :/ eTailers and review sites are also under orders to keep quiet until then.

Remember the R9-290 and R9-290X are the only new GPUs. The 280X on down, which you can buy right now, are all 7000 series rebrands, the 280X being the 7970 GHz Edition.

So which are cheaper, the older 7xxx or their "R" clones?

Mumps
10-15-13, 06:02 PM
And the R clones aren't even clones with a die shrink? Or other minor improvements in the silicon?

John P. Myers
10-15-13, 06:03 PM
So which are cheaper, the older 7xxx or their "R" clones?

Pretty much the same price, ±$10. At least on Newegg the 280Xs seem to be $10 cheaper than the 7970GEs ($300 vs. $310). However, if you're in the market for a used 7970GE, those can be found even cheaper, while there are no used 280Xs yet, so that could save some money :)

Edit
@Mumps: Nope, no die shrink. Same 28nm as before. Last I heard, TSMC won't be making the new 20nm wafers in useful quantity until June, but I'm not sure how AMD will handle that. It's too soon to release a "next gen" model based on the new die shrink, but at the same time there's no way they'd throw away the chance to use it. I see release delays in the future...

Unlike Nvidia, AMD did tweak the original 7000 series chips a little for use in the Rx series, and based on compute benchmarks from Anandtech, the 280X seems to be 1-5% faster (though usually closer to 1%).

DrPop
10-16-13, 03:44 AM
@JPM - so what is the target price for the only "new" GPU - the 290X which we haven't seen yet? For the money, do you think it will be worth it over a used-price on a 7970 - I mean as far as cost per credits ratio?

John P. Myers
10-16-13, 07:15 AM
@JPM - so what is the target price for the only "new" GPU - the 290X which we haven't seen yet? For the money, do you think it will be worth it over a used-price on a 7970 - I mean as far as cost per credits ratio?

Cost/credit, I think even a brand-new 7970GE would beat a 290X. You'll be able to buy two 7970GEs for the price of one R9 290X, which will be around $600-$650. If the 7970GEs are used, you might even get three for that.
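For what it's worth, the comparison pencils out like this, using the $310 7970GE price quoted earlier in the thread and the low end of the guessed $600-$650 290X price (FP32 GFLOPS as a stand-in for crunching output):

```python
# Rough FP32-throughput-per-dollar at the prices quoted in this thread.
cards = {
    "7970GE":  (4096, 310),   # (GFLOPS FP32, price in USD)
    "R9 290X": (5632, 600),
}
for name, (gflops, usd) in cards.items():
    print(f"{name}: {gflops / usd:.2f} GFLOPS per dollar")
```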

DrPop
10-16-13, 12:32 PM
OK, thanks. Guess that answers my question. ;)

John P. Myers
10-23-13, 06:33 PM
Rumor has it AMD may be dropping the price of the 290X just to stick it to Nvidia. Seems plausible, since AMD intentionally crashed an Nvidia tech conference in Montreal last week (or the week before, I don't recall). Anyway, the new *rumored* price would be $550. We'll find out for sure in a few hours when the NDA goes poof :D

zombie67
10-23-13, 11:57 PM
It's been a few hours.... Come on man, google it for me!!

John P. Myers
10-24-13, 05:51 AM
It's been a few hours.... Come on man, google it for me!!

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600100181%20600473898%20600473871&IsNodeId=1&name=Radeon%20R9%20290X
Though Newegg is charging $580, the rumor of $550 turned out to be true. Goodbye, Nvidia, with your slower $650 GPUs (though they're issuing a price cut next month).

No more boost clock. The stated clock speed of the GPU is now the "up to" speed, and that speed will be maintained as long as temps don't exceed 95C. Yes, 95C. That is AMD's default max temp. On the plus side, the way the new settings work, the GPU will never run hotter than 95C unless you change the setting, which could be a good safety feature: if it tries to go above 95C, it throttles itself. The standard "up to" clock is 1000MHz.
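That behavior amounts to a simple control loop, something like this toy model (the step size and clock floor are made-up values for illustration, not AMD's actual algorithm):

```python
# Toy model of the "up to" clock behavior described above: climb toward the
# rated clock while below the temp limit, back off once the limit is hit.
def next_clock(temp_c, clock_mhz, up_to=1000, temp_limit=95, step=50):
    if temp_c >= temp_limit:
        return max(300, clock_mhz - step)   # too hot: throttle down
    return min(up_to, clock_mhz + step)     # headroom: climb back toward "up to"

print(next_clock(96, 1000))  # over 95C -> drops to 950
print(next_clock(80, 950))   # cool again -> back to 1000
```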

L2 cache on the 290X has been increased to 1MB, up from 768kB on the 7970.

Length is 11", which is about 1/2" longer than a 780 or Titan.

6-pin + 8-pin power connectors.

Power usage... they claim TDP is 250W, but it's clearly closer to 300W. As I mentioned a while back, it's obvious this GPU was meant to be made on a 20nm lithography, but AMD got in a hurry because of Nvidia being faster (at gaming), and TSMC was unable to pick up the pace, so AMD gets what they get.
[attachment 1539]

John P. Myers
10-24-13, 06:22 AM
Crunching specs that matter to us :)



GPU            GFLOPS FP32   GFLOPS FP64   Price
7970           3788.8        947.2         $300
GTX 780        3977          165.7         $500
7970GE         4096          1024          $310
Titan          4494          1299          $1000
GTX 780Ti      5046          210           $650
Titan @ 1GHz   5376          1593          $1000
R9 290X        5632          704           $550

Update: FP64 on the 290X may have been intentionally crippled to 1/8 of the SP rate instead of the standard 1/4. Though I have found only one source stating this, it's a very reliable one. Needs verification.

Update 2: Verified :(
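The FP64 column in the table follows directly from the FP32 rate and each chip's DP:SP ratio, which is why the 1:8 cap stings:

```python
# FP64 rate from the FP32 rate and the chip's DP:SP ratio.
def fp64_gflops(fp32_gflops, dp_sp_ratio):
    return fp32_gflops * dp_sp_ratio

print(fp64_gflops(4096, 1/4))  # 7970GE at the standard 1:4 -> 1024.0
print(fp64_gflops(5632, 1/8))  # R9 290X crippled to 1:8    -> 704.0
```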

Mumps
10-24-13, 07:25 AM
No, that 165.7 on the 780 is not a typo. It just sucks that badly. :)

zombie67
10-24-13, 09:44 AM
Thanks!

DrPop
10-24-13, 01:19 PM
So clearly the 7970GE (or any 7970 O/Ced over 1GHz) is the best credits / dollar ratio. Thanks, this thread helped me with the big decision. ;)

Beerdrinker
10-24-13, 01:37 PM
So clearly the 7970GE (or any 7970 O/Ced over 1GHz) is the best credits / dollar ratio. Thanks, this thread helped me with the big decision. ;)


Me thinks that DrPop is getting ready to break out the credit card!!

conf
12-26-13, 02:54 PM
I have 3 7970s and 1 7950 running (and a 5870, a 6950, and 3 Nvidias).
All cards are slightly overclocked, and the first thing I do is reset them to standard clocks with MSI Afterburner.
The older ones tend to get hot RAM after years of use, so I decrease the RAM speed.
They work 24/7, and that's the best way to keep them alive.

FourOh
07-18-14, 10:42 AM
Ouch. AMD stock down 20% on news they have been downgraded to "underperform" by analysts at Canaccord Genuity. Yesterday AMD was trading at a two-year high... somehow I knew it wouldn't last!

John P. Myers
07-18-14, 04:26 PM
TSMC also lost 6% yesterday after shareholders found out Apple is switching back to Samsung for their lolphone processors. May drop more today because of it.

Sent from my Galaxy S4 using Tapatalk Pro

Fire$torm
07-19-14, 02:09 PM
TSMC also lost 6% yesterday after shareholders found out Apple is switching back to Samsung for their lolphone processors. May drop more today because of it.

Sent from my Galaxy S4 using Tapatalk Pro

Why does anyone still bother to shell out cash for any *yuck* iJunk?

The "White" status thing died with Jobs.

zombie67
07-19-14, 03:28 PM
Because it's a better product with a better ecosystem. IMO, of course.

dave c
07-20-14, 01:53 AM
Why does anyone still bother to shell out cash for any *yuck* iJunk?

The "White" status thing died with Jobs.
I got a brand new iPhone for free through my job. I couldn't stand the thing; had it for over a year and couldn't wait to get back to an Android phone.

cineon_lut
07-20-14, 02:13 AM
iPhones, and Apple products in general, are not about being the sum of their parts. If I wanted the highest megapixels / fastest gigahertz / most cores / biggest screen, of course I wouldn't buy Apple.

But when I want something that works for me and not vice versa, sorry, I'll pay the premium. It's the OS.


Vic (mobile)

zombie67
07-20-14, 02:47 AM
Everyone: Can we PLEASE not have arguments about which is obviously best and which is obviously worst?

We have team mates who prefer all kinds of phone OSs. Making proclamations about which is obviously "best" does nothing but force people into defending their opinion.

What is the point in instigating fights with friends?

John P. Myers
07-20-14, 03:51 AM
Making proclamations about which is obviously "best" does nothing but force people into defending their opinion.


My opinion is your new avatar might be even hotter than the last. And yes, I'm prepared to defend my position :D

Maxwell
07-20-14, 04:11 AM
My opinion is your new avatar might even be hotter than the last. And yes, i'm prepared to defend my position :D
LOUD NOISES!!!:p

myshortpencil
07-20-14, 09:18 AM
My opinion is your new avatar might even be hotter than the last. And yes, i'm prepared to defend my position :D

So, JPM is a leg man. :))

Duke of Buckingham
07-20-14, 11:31 AM
I am sad, I thought I was the best without discussion.
http://www.desibucket.com/wp-content/uploads/2012/12/Why-are-You-Sad00.jpg

John P. Myers
07-20-14, 03:48 PM
So, JPM is a leg man. :))

Yes I am =P~