
Nvidia GTX 680



John P. Myers
03-20-12, 11:03 PM
It's not official yet, but retailers have reported the SRP of the GTX 680 has been dropped to $499. I've seen dozens upon dozens of benchmarks and actual in-game tests that have shown the GTX 680 to be consistently faster than the 7970; some of those tests were questionable, some seemed fairly straight up. Also of note: the GTX 680 has a TDP of only 195W, while the 7970 has a max of 230W.

It's thought Nvidia may have lowered the price to start another price war with AMD, which, looking at their profit margins lately, can't really afford to drop the price of the 7970, though they may have to anyway. Happy shopping :)

DrPop
03-21-12, 02:29 AM
Thanks for the update! That would be very nice if it turns out to be true. Nothing better for us crunchers than a high-end GPU price war!:cool:

John P. Myers
03-21-12, 06:28 AM
Well...something else I heard that makes a lot of sense, but was quickly hushed, was that the GTX 670 Ti was supposed to be the first Kepler released. But after Nvidia tested it and saw how powerful it was compared to the 7970, they renamed it the GTX 680. Makes me shiver a little wondering just what kind of beast the 680 would have been....and I just stumbled upon a clue. The 670 Ti (now the GTX 680) is based on the GK104 GPU. But there's this other chip, the GK110 (or maybe GK112...whatever they decide to call it) which has an 87% larger die size than the GK104, putting the transistor count around 6 billion (the GK104 has 3.5 billion). So...if you're Nvidia, what do you do? Make the GTX 690 using 2 GK104s and a 685 using the GK110? Plan a 700 series? It's just speculation, of course, but surely it's gotta show up somewhere...

Mike029
03-21-12, 08:00 AM
Thanks for the update! That would be very nice if it turns out to be true. Nothing better for us crunchers than a high-end GPU price war!:cool:

Very true. If the price is around $500.00, that keeps it in my wheelhouse. These $600+ prices are getting nuts.

John P. Myers
03-21-12, 07:29 PM
In case it was missed in the other thread where I posted this, here you go: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&N=-1&isNodeId=1&Description=gtx+680&x=24&y=14. Many are in stock as of this posting.


Edit: and now it's gone :(

kaptainkarl1
03-21-12, 08:28 PM
I think I just might. I really do.

John P. Myers
03-21-12, 09:04 PM
http://tof.canardpc.com/preview2/5dd2824a-dc85-4459-a5fb-a5ae135882d0.jpg

Here's a screenshot of all the GTX 680s Newegg had listed.

spingadus
03-21-12, 10:35 PM
So, any idea about which model is best?

Mr. Hankey
03-21-12, 10:53 PM
I just got an email invite from Newegg to be one of the first to buy the new GTX 680... when you click on the buy now link, it goes to a "no search items found" page. The official launch is 3/22, so they should be available after midnight (west coast time, since that is where the Nvidia HQ is).

coronicus
03-22-12, 12:38 PM
Hurry up, NVIDIA! I need you to fill up all the shelves so AMD will be forced to lower the price on those sexy 7970s... hehe

DrPop
03-22-12, 12:52 PM
Hurry up, NVIDIA! I need you to fill up all the shelves so AMD will be forced to lower the price on those sexy 7970s... hehe

Yep! Gotta love that they're both getting competitive again. :D. Fun crunchin' times ahead!

Fire$torm
03-22-12, 12:54 PM
Well Newegg is listing the cards again, and they are all "Out of Stock"..... :P

Mr. Hankey
03-22-12, 02:34 PM
So check out the OpenCL bench numbers for double / single precision.

http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-14.html

DrPop
03-22-12, 02:52 PM
So check out the OpenCL bench numbers for double / single precision.

http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-14.html

Crap. I don't know what else to say. Is NVIDIA this pathetic? Are they saying they purposely detuned the card so it sucks for crunching? This is probably to sell their high-end Tesla cards or whatever instead, right?
Very, very disappointed with this. What's next to evaluate? Any other benchmarks we need to consider, etc., to make our decisions?

Mike029
03-22-12, 04:09 PM
Crap. I don't know what else to say. Is NVIDIA this pathetic? Are they saying they purposely detuned the card so it sucks for crunching? This is probably to sell their high-end Tesla cards or whatever instead, right?
Very, very disappointed with this. What's next to evaluate? Any other benchmarks we need to consider, etc., to make our decisions?

Well, this changes things a bit. The 7970 clearly outperforms the GTX680 for our purposes. Perhaps you should wait till we start to see these cards on DiRT, PG and others.

DrPop
03-22-12, 04:56 PM
Yeah, I agree with you there. The waiting game is on now. I will use this time to start getting it together for whichever GPU is better.

Maybe there is something to JPM's post about the GK110 chip that NVIDIA was supposed to come out with for the higher end? Wonder when all this will finally be out in the open and we can see our real options...:confused:

Fire$torm
03-22-12, 04:56 PM
Unfortunately that review is tainted. The AMD 6990 is a dual-GPU unit. Apples to oranges. Stupid, very stupid. Or at least the review should have included the 6970. And somehow the review's conclusion says the GTX 680 is a good card. How? Its compute performance is intentionally crippled. At idle, it sucks up the same amount of power as the 7970. It cannot beat its predecessor on several benchmarks. Plus, its new transcoder is limited to H.264 output! So for all that, they want more money than a 580???

Sorry, but from my point of view nVidia is the biggest tech scam artist on the planet. All they have done is set it up so that Joe Consumer is footing the bill for all their R&D, slanted reviews and overpriced ad campaigns while raking in the motherlode from their commercial Fermi products. And the really sad news? Joe Consumer is eating it up like a nine-year-old given free access to the candy store.......

Well at least the 680 will put pressure on the rest of the GPU market, nVidia and AMD alike.

John P. Myers
03-22-12, 05:08 PM
Crap. I don't know what else to say. Is NVIDIA this pathetic? Are they saying they purposely detuned the card so it sucks for crunching? This is probably to sell their high-end Tesla cards or whatever instead, right?
Very, very disappointed with this. What's next to evaluate? Any other benchmarks we need to consider, etc., to make our decisions?

Nvidia has always done that, starting with the 400 series; however, back then they only crippled double precision to 1/8th of single-precision speed. This 1/24th crap is pretty bad.

However "We asked NVIDIA about this and they said that a folding update is in the works"

As it stands now, imagine DiRT on the 680. Nearly double what a 580 can do? Yes please.

EDIT: but yeah, if you work on double precision projects, always go with AMD. Nvidia will always intentionally screw you over there. I remember back on the old forums I ranted about this for days. Even ranted on Nvidia's forums and a few team members joined in. Clearly it hasn't changed things. But now, just because I can, I will write an article about it for LegitReviews directly aimed at letting us folders know what's really going on. I just need a little more info first...

Also, Nvidia's GPUs have always been better at double precision than AMD's. No one knows that because they keep it crippled, but uncrippled, Nvidia can do SP : DP @ 2:1, meaning the 680 *could have* a DP rating of 1545 GFLOPS (which is about 40% faster than AMD's 6990!!!!!). That's what we're really missing out on.
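
For anyone who wants to check my math, here's a minimal back-of-the-envelope sketch in Python (assuming the usual 2 FLOPs per core per clock for theoretical peaks; these are paper numbers, not measured crunching throughput):

# Theoretical peak GFLOPS, assuming 2 FLOPs per core per clock (fused multiply-add).
def peak_sp_gflops(cores, shader_clock_mhz):
    # Single-precision peak.
    return 2 * cores * shader_clock_mhz / 1000.0

def peak_dp_gflops(cores, shader_clock_mhz, dp_ratio):
    # Double-precision peak at a given DP:SP rate.
    return peak_sp_gflops(cores, shader_clock_mhz) * dp_ratio

print(peak_sp_gflops(1536, 1006))          # GTX 680 SP: ~3090 GFLOPS
print(peak_dp_gflops(1536, 1006, 1 / 24))  # GTX 680 DP as shipped (1/24): ~129 GFLOPS
print(peak_dp_gflops(1536, 1006, 1 / 2))   # GTX 680 DP uncrippled at 2:1: ~1545 GFLOPS
print(peak_dp_gflops(512, 1544, 1 / 8))    # GTX 580 DP at 1/8 (1544 MHz shader clock): ~198 GFLOPS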

zombie67
03-22-12, 05:35 PM
Making it slower than their *own* previous generation product is insane. Suicidal even.

Which projects require double precision again?

John P. Myers
03-22-12, 07:34 PM
Making it slower than their *own* previous generation product is insane. Suicidal even.

Which projects require double precision again?

milkyway and gpugrid

zombie67
03-22-12, 07:59 PM
milkyway and gpugrid

I asked over on GPUGRID, and they said almost none. Which is good news.

And I think that leaves only MW. Since ATI always did better there anyway.

http://www.ps3grid.net/forum_thread.php?id=2776&nowrap=true#24089

Mr. Hankey
03-22-12, 08:54 PM
The new GPU LLR applications over at PrimeGrid also require DP. If they could get an OpenCL app, the ATI cards would be ripping it up over there.

DrPop
03-22-12, 10:28 PM
So this is really turning into a project debate. If I want to run X project, then I need to purchase X GPU.
Hmmm...

Bryan
03-23-12, 12:10 AM
Tonight Robert, from the Polish National Team, posted on our boards that he has benchmarked his GTX 570 @ 840MHz and the new GTX 680 @ 1047MHz, and the 680 produces 44% higher credits. That was on PPS Sieve. I would expect that to be about the same on DiRT. That would make it 34% higher than a GTX 580. So if you normally run those projects it would be a pretty good boost!
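
Side note: on paper the gap should be even bigger, which tells you something about Kepler's cores. A rough sketch (my numbers, assuming Fermi shaders run at 2x the listed core clock and Kepler dropped that hot clock; theoretical peaks only, not PPS Sieve throughput):

# Theoretical single-precision peaks vs Robert's measured 44% gain.
# Fermi (GTX 570) shaders run at 2x core clock; Kepler (GTX 680) has no hot clock.
def peak_sp_gflops(cores, shader_clock_mhz):
    return 2 * cores * shader_clock_mhz / 1000.0

gtx570_oc = peak_sp_gflops(480, 2 * 840)  # 570 @ 840 MHz core: ~1613 GFLOPS
gtx680 = peak_sp_gflops(1536, 1047)       # 680 @ 1047 MHz: ~3216 GFLOPS
print(gtx680 / gtx570_oc)                 # ~2.0x on paper vs the 1.44x measured

If the paper ratio is ~2x and the measured gain is only 44%, Kepler's cores are getting noticeably less done per clock than Fermi's.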

Mike029
03-23-12, 08:16 AM
Tonight Robert, from the Polish National Team, posted on our boards that he has benchmarked his GTX 570 @ 840MHz and the new GTX 680 @ 1047MHz, and the 680 produces 44% higher credits. That was on PPS Sieve. I would expect that to be about the same on DiRT. That would make it 34% higher than a GTX 580. So if you normally run those projects it would be a pretty good boost!

Hmm, that does not seem to be "crippling" the GTX 680. I think that does make this card a viable high-end player.

Bryan
03-23-12, 10:19 AM
They limited the 64-bit double precision math capability, and PG is 32-bit single precision. I'm guessing that DiRT is also single precision, but I'm not positive. MW would be limited, but as pointed out, ATIs have always had a big advantage there.

DrPop
03-23-12, 11:54 AM
They limited the 64-bit double precision math capability, and PG is 32-bit single precision. I'm guessing that DiRT is also single precision, but I'm not positive. MW would be limited, but as pointed out, ATIs have always had a big advantage there.

OK, so to run the typical CUDA projects - i.e. PG and DiRT, with perhaps the occasional Collatz - the 680 is indeed a good choice now?:confused:

John P. Myers
03-23-12, 12:34 PM
At PrimeGrid, the PPS and CW Sieves don't require double precision, but the GFN prime search does.

Al
03-23-12, 04:07 PM
OK, so to run the typical CUDA projects - i.e. PG and DiRT, with perhaps the occasional Collatz - the 680 is indeed a good choice now?:confused:

DrPop, I think you should see a few extra patients a day and do a bulk buy on the 680s...then send me one...or two! :rolleyes: Sphinx would like that! She told me so!

DrPop
03-23-12, 04:26 PM
DrPop, I think you should see a few extra patients a day and do a bulk buy on the 680s...then send me one...or two! :rolleyes: Sphinx would like that! She told me so!

LOL! :)) Yeah, and Kim just told me Kat needs new shoes and she'd like to save up $ for a trip...go figure...what timing! :p:cool:
Seriously, though - I am going to figure out how to swing something new for the rig...either a new CUDA card or one of the new AMDs; give me about a week or so.
What I don't know is - is this the best time or not? I mean, what are they going to introduce in 2 or 3 weeks? Or...maybe this is it, and we'll be waiting a long time for anything better now. I don't know, but I wish I did.

Al
03-23-12, 04:44 PM
LOL! :)) Yeah, and Kim just told me Kat needs new shoes and she'd like to save up $ for a trip...go figure...what timing! :p:cool:
Seriously, though - I am going to figure out how to swing something new for the rig...either a new CUDA card or one of the new AMDs; give me about a week or so.
What I don't know is - is this the best time or not? I mean, what are they going to introduce in 2 or 3 weeks? Or...maybe this is it, and we'll be waiting a long time for anything better now. I don't know, but I wish I did.

I take that as a "no" on the whole Sphinx idea. Wow...Kim & Kat are really putting a damper on things here in NC. :D

DrPop
03-23-12, 04:45 PM
Ah, here's what I'm talking about. http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review In this article Anand refers to both a GK110 and a GK114 as "Big Kepler".
What I want to know (and no one seems to) is, when will these suckers hit the market, and how much $ above the 680 will they be?:confused:

zombie67
03-23-12, 04:50 PM
Ah, here's what I'm talking about. http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review In this article Anand refers to both a GK110 and a GK114 as "Big Kepler".
What I want to know (and no one seems to) is, when will these suckers hit the market, and how much $ above the 680 will they be?:confused:

Buy a 680 now. Sell it for funds when the better one comes out, in late summer. Waiting is just losing credits every day.... ;)

John P. Myers
03-23-12, 04:59 PM
Ah, here's what I'm talking about. http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review In this article Anand refers to both a GK110 and a GK114 as "Big Kepler".
What I want to know (and no one seems to) is, when will these suckers hit the market, and how much $ above the 680 will they be?:confused:

I know, I know! lol

August, at best. But seriously...those things will murder your electric bill. You're talking 250W - 300W just for the single-GPU card....seriously...

BUT! You asked if anything new would appear in the next 2 or 3 weeks. Yes. And some good stuff, I might add. Hold on a sec while I dig up a link...

http://www.evga.com/articles/00669/#GTX680SC overclocked 680s
http://www.evga.com/articles/00669/#GTX680FTW For The Win editions, overclocked, and one includes 2x the memory (4GB)
http://www.evga.com/articles/00669/#GTX680Classified Classified edition - overclocked with 4GB memory
http://www.evga.com/articles/00669/#GTX680HydroCopper Hydro Copper (overclock, 2GB memory) and Hydro Copper Classified (overclocked with 4GB memory)

*click on the product names to the upper-right of the image for a listing of features and specs for each device

On the AMD side, they may get the 7990 out by then. No guarantees, though.

John P. Myers
03-23-12, 05:47 PM
At PrimeGrid, the PPS and CW Sieves don't require double precision, but the GFN prime search does.

About 18 mins ago, PrimeGrid proved the GTX 680 to be about 40% slower in double precision than the 580. Good thing I fueled the flames on that one to get it done sooner rather than later. The test was done using the Genefer app's benchmark, not by running a Genefer WU itself, which eliminated any possible variances due to WUs being different lengths.


Assuming that device 0 is the GTX 580 and device 1 is the GTX 680 -- that is BAD news.

The GTX 680 is substantially slower than the GTX 580, presumably due to the crippled floating point performance.

Were both GPUs running at stock clock speeds? (Just to make sure we're comparing apples to apples.)

EDIT: "One small step for a gamer; one giant leap backwards for number crunching."

A link to the important part of the thread: Primegrid GTX680 vs GTX580 results (http://www.primegrid.com/forum_thread.php?id=4044&nowrap=true#51930)
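
That "about 40% slower" also lines up with the crippled paper peaks (same assumed formula as before; theoretical only, the benchmark is the real evidence):

# Paper DP peaks: GTX 680 at 1/24 vs GTX 580 at 1/8 (1544 MHz shader clock).
print(1 - (2 * 1536 * 1006 / 24) / (2 * 512 * 1544 / 8))  # ~0.35, i.e. ~35% slower

So the 1/24 crippling alone predicts roughly a 35% deficit, right in the neighborhood of what Genefer measured.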

coronicus
03-23-12, 06:25 PM
For me, I'm waiting for Nvidia to build up stock, hoping that AMD drops about $20-30 along with some rebate offers; then I'm set. But that might be just wishful thinking; we will see. I would def give it a few weeks to see how the market settles...

Bryan
03-23-12, 11:52 PM
Okay, another post from Robert of the Polish National Team:

He said that PG PPS Sieve was 44% faster with his new 680 vs 570, BUT look at this on DiRT:



DiRT (GTX 680 | GTX 570):
mysqlsha1    3:28       2:30
md5-1        ~30 min    25:48
md5-2        ~30 min    ~22 min

No good...



Unless they come out with a driver release that cuts loose the 64-bit floating point math, this card isn't very good for crunching!

Fire$torm
03-24-12, 12:09 AM
Unless they come out with a driver release that cuts loose the 64-bit floating point math, this card isn't very good for crunching!

But that is not going to happen as it would tempt the commercial market to buy the cheaper consumer units.

Also, for what they want for the 680 you could buy two 570s. This gives you two advantages. First, more credits than a single 680. And second, if one card fails you only lose 50% of your production. Compared to losing 100% of production if the 680 fails (and 1st-gen cards tend to fail more often), I'd think it's better to go the 570 route.

John P. Myers
03-24-12, 01:13 AM
Zotac Prepares 2GHz GeForce GTX 680. Yes, Really.


If you think NVIDIA's GeForce GTX 680 video card is tough, you're right, but that doesn't change the fact that it won't even hold a candle to what Zotac is working on.

According to a QQ report, Zotac has decided to develop a Kepler-based video adapter whose graphics processing unit runs at 2 GHz.

That's right, even though the GeForce GTX 680 already operates at the gargantuan clock speed of 1,006 MHz, Zotac is not satisfied.

In fact, all the specifications of the Kepler product are high, as expected of the so-called strongest single-GPU video controller in the world, but the OEM wants to go above and beyond them anyway.

The company intends to complete the “Godly” GeForce GTX 680 by the middle of next month (April, 2012).

The graphics chip has 1,536 CUDA cores operating at 2,012 MHz, which may or may not have been a deliberate setting on NVIDIA's part to reflect the year of launch. Probably not, but the coincidence is interesting nonetheless.

The amount of memory and the speed of memory are to remain the same. Connections are also expected to stay the same.

Fire$torm
03-24-12, 01:30 AM
And what kind of premium should one expect for the "godly" edition?

John P. Myers
03-24-12, 01:33 AM
And what kind of premium should one expect for the "godly" edition?

Unknown. But we definitely need to add it to the list of stuff available within the next 3 weeks that DrPop is looking for :)

EDIT: You know, at that speed, this GTX 680 *might* just equal the double precision power of the GTX 580 =))
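
Quick sanity check on that, reusing the same back-of-the-envelope formula from earlier (and assuming the 1/24 DP ratio is unchanged at that clock; paper numbers only):

# GTX 680 @ 2012 MHz, DP at 1/24, vs GTX 580 @ 1544 MHz shader clock, DP at 1/8.
print(2 * 1536 * 2012 / 1000 / 24)  # ~258 GFLOPS
print(2 * 512 * 1544 / 1000 / 8)    # ~198 GFLOPS

So at 2 GHz it wouldn't just equal the 580's double precision, it would actually edge past it...on paper, anyway.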

DrPop
03-24-12, 01:41 AM
So this thing sucks wind at PG AND DiRT? Who would've thought? Combine this NVIDIA nuthouse with all the crap my bro's rig is doing on me right now, and man, my head hurts.:p:-/

John P. Myers
03-24-12, 01:48 AM
So this thing sucks wind at PG AND DiRT? Who would've thought? Combine this NVIDIA nuthouse with all the crap my bro's rig is doing on me right now, and man, my head hurts.:p:-/

You forgot it sucks at milkyway too. Oh, but there's more good news: for the guys over at Folding@Home it doesn't work at all. And even more good news (you're gonna throw in the towel on this one): it appears that when Nvidia sent all the review sites sample GTX 680s and drivers, the drivers performed better for the reviews than what was actually released to the general public. Specifically, even on a brand new X79 motherboard with a brand new SB-E CPU full of PCIe 3.0 slots, the GTX 680 will only run at PCIe 2.0 speeds. I think now we can call it a day :)


GeForce GTX 680 supports PCI Express 3.0. The Intel X79/SNB-E PCI Express 2.0 platform is only currently supported up to 5GT/s (PCIE 2.0) bus speeds even though some motherboard manufacturers have enabled higher 8GT/s speeds.

Fire$torm
03-24-12, 02:11 AM
So this thing sucks wind at PG AND DiRT? Who would've thought? Combine this NVIDIA nuthouse with all the crap my bro's rig is doing on me right now, and man, my head hurts.:p:-/

You forgot it sucks at milkyway too. Oh, but there's more good news: for the guys over at Folding@Home it doesn't work at all. And even more good news (you're gonna throw in the towel on this one): it appears that when Nvidia sent all the review sites sample GTX 680s and drivers, the drivers performed better for the reviews than what was actually released to the general public. Specifically, even on a brand new X79 motherboard with a brand new SB-E CPU full of PCIe 3.0 slots, the GTX 680 will only run at PCIe 2.0 speeds. I think now we can call it a day :)


GeForce GTX 680 supports PCI Express 3.0. The Intel X79/SNB-E PCI Express 2.0 platform is only currently supported up to 5GT/s (PCIE 2.0) bus speeds even though some motherboard manufacturers have enabled higher 8GT/s speeds.

Holy F'n Toledo! I said it before and I'll say it now. nVidia is the biggest tech scam artist on the planet......

FYI tangent: The reason I write nVidia the way I do is that I have never had any respect for that company. It's the way they used to present their name back in the day. They switched up to the capital N to make themselves more presentable to the "Adult" market. phhht. They are full of themselves..... and it.

Mike029
03-24-12, 08:02 AM
The last chance for the 680 will be with the Bitcoin miners. I wonder how it does at mining? If it's as bad at that as it is at crunching, then they've limited themselves to gamers only.

I still believe that more crunchers and miners bought HD 5970s than gamers did. You can prob. add the 5870 as well.

DrPop
03-25-12, 05:20 PM
What I want to know now - maybe JPM or F$ or Zombie or somebody else knows - is this stupid limitation on crunching a software limitation? Like a BIOS/firmware or driver issue that could be altered or remedied in some way? Or is it a hardware limitation where nVidia somehow screwed with the chip just to make sure that crunching with the card is impossible?:confused:

The whole thing just stinks. It's one thing not to have the technology, period. It's another thing for that technology to be "present" but out of reach financially to all but giant corporations or Gov entities such as NASA. It is quite another thing altogether for the technology to be present, and financially accessible...and then denied the consumer because of stupidity and uber greed on the part of the corporation who makes it. I'm not against them making a profit. I am against them scalping me because I want to use the CUDA portion of their technology which EVERY ad of theirs boasts about.

I think my next GPU may have to be an AMD just for the principle of it. Is the 7970 turning out to be that hot of a card on Moo! or Donate or whatever the hot AMD/ATI project flavor of the month is?:rolleyes::p

John P. Myers
03-25-12, 07:38 PM
In the 680, it's a physical reduction of FP64 cores, which are then additionally crippled. Nothing can be done.

In the 400 series and 500 series I never found out how it's done for sure, but I DO know it isn't done through BIOS. So hacking that won't help at all.

It's possible it's done in the drivers. If that's true, the last driver release that wouldn't've been crippled would've been during the reign of the 200 series; however, conveniently enough, those drivers won't allow a 400 or 500 series card to work at all.

DrPop
03-25-12, 10:45 PM
In the 680, it's a physical reduction of FP64 cores...

In the 400 series and 500 series ... it isn't done through BIOS...

...the last driver release that wouldn't've been crippled [is] the 200 series, however, conveniently enough, those drivers won't allow a 400 or 500 series card to work at all.


Dude, you are the man with the info! Unfortunately it is not real positive info, but it is the truth nonetheless. OK, well that means for high-end crunching I need a high-end AMD card next, I suppose. We'll see what I can come up with...maybe nothing, we'll see. This whole thing still blows my mind. They get us sooooooo excited. For...NOTHING.:p

coronicus
03-25-12, 10:53 PM
Yeah, that's how I'm taking it also... I'm assuming the worst: they didn't want the 680 to compete with the Quadro, or whatever it's called. Either that, or it was a way to reduce the transistor count on the chip, making it cheaper to produce and yet still do extremely well in games. We can hope they release something better later on, but I'm guessing at this point. The good thing is this will force AMD to lower the price, and soon. So be patient; it will happen, and you will be able to save $100 on the 7970 if you can wait...

John P. Myers
03-26-12, 12:03 AM
Yes, full-speed double precision performance is a feature we reserve for our professional customers. Consumer applications have little use for double precision, so this does not really affect GeForce users.

!! <---That was so I could meet the 2 character minimum message limit. Quotes don't count. Grrrrr

EDIT: Actually, I would like to add a few more characters. How do such incompetent f__kwits, people so GD retarded they think cheerios are donut seeds, get the jobs they get while people like me sit here counting the number of empty Dr. Pepper cans on my desk? Makes me want to pull (the rest of) my hair out! [/rant]

Fire$torm
03-26-12, 01:57 AM
What I want to know now - maybe JPM or F$ or Zombie or somebody else knows - is this stupid limitation on crunching a software limitation? Like a BIOS/firmware or driver issue that could be altered or remedied in some way? Or is it a hardware limitation where nVidia somehow screwed with the chip just to make sure that crunching with the card is impossible?:confused:

The whole thing just stinks. It's one thing not to have the technology, period. It's another thing for that technology to be "present" but out of reach financially to all but giant corporations or Gov entities such as NASA. It is quite another thing altogether for the technology to be present, and financially accessible...and then denied the consumer because of stupidity and uber greed on the part of the corporation who makes it. I'm not against them making a profit. I am against them scalping me because I want to use the CUDA portion of their technology which EVERY ad of theirs boasts about.

I think my next GPU may have to be an AMD just for the principle of it. Is the 7970 turning out to be that hot of a card on Moo! or Donate or whatever the hot AMD/ATI project flavor of the month is?:rolleyes::p


In the 680, it's a physical reduction of FP64 cores, which are then additionally crippled. Nothing can be done.

In the 400 series and 500 series I never found out how it's done for sure, but I DO know it isn't done through BIOS. So hacking that won't help at all.

It's possible it's done in the drivers. If that's true, the last driver release that wouldn't've been crippled would've been during the reign of the 200 series; however, conveniently enough, those drivers won't allow a 400 or 500 series card to work at all.

I thought I read somewhere it was a firmware/driver-imposed limitation, as nVidia didn't want to muck with fabrication layouts or some such thing. That is for pre-680 units.

Edit: For anyone that has not seen the "Pro" units ----> http://www.nvidia.com/object/workstation-solutions.html
Their top-shelf card sells for $3,998.00. Now you see why nVidia limits the 680! For 8x the price you get your card unlocked..........

John P. Myers
03-26-12, 06:07 AM
Yeah http://www.nvidia.com/object/buy_now_results_ci.html?id=QD6000 That's nothing but a GTX 560 Ti. A 560 Ti for $4k. F__k Nvidia up the a__ with a pinecone.

Mumps
03-26-12, 08:25 PM
Yeah http://www.nvidia.com/object/buy_now_results_ci.html?id=QD6000 That's nothing but a GTX 560 Ti. A 560 Ti for $4k. F__k Nvidia up the a__ with a pinecone.

Come on JPM. Don't hold back. Let us know how you really feel about it. :)

Duke of Buckingham
03-26-12, 08:48 PM
Tell us about the love and other secret feelings, don't leave us out of it, JPM.=))=))=))

Crazy Duke:o

DrPop
03-26-12, 09:22 PM
Haha...oh man...:)) .... :D .... :o

John P. Myers
04-06-12, 07:18 PM
http://www.legitreviews.com/news/12801/

Gotta admit that really looks awesome. Too bad they wasted all that effort on Nvidia :( And only 7dB max fan noise. I'm figuring that doesn't count the radiator fan, only the one on the card.