First! by kwrzesien on Monday, November 12, 2012
My first First! Okay, now back to work.
kwrzesien
RE: First! by DigitalFreak on Monday, November 12, 2012
I wouldn't call riding the short-bus work...
DigitalFreak
RE: First! by kwrzesien on Monday, November 12, 2012
Hey, I wouldn't call reading news work either!
kwrzesien
Thank you, insane amd fanboys. For months on end you've been screaming that nVidia yields are horrible and they're late to the party, while nVidia itself has said yields are great, especially in the GPU gaming card space.
Now the big amd fanboy lie is exposed.
" Interestingly NVIDIA tells us that their yields are terrific – a statement backed up in their latest financial statement – so the problem NVIDIA is facing appears to be demand and allocation rather than manufacturing."
(that's in the article above amd fanboys, the one you fainted...after raging... trying to read)
Wow.
I'm so glad this site is so fair: as we see, as usual, what nVidia has been telling them is treated as a lie for a very, very long time, until the proof that it was the exact truth all along is slammed hard into the obstinate amd fan brain.
So nVidia NEVER had an ongoing yield issue on the 600 series.
That's what they said all along, and the liars, known as amd fanboys, just lied instead, even after they were informed over and over again that nVidia did not buy up a bunch of manufacturing time early.
Thanks amd fanboys, months and months of your idiot lies make supporting amd that much harder, and now they are truly dying.
Thank you for destroying competition.
CeriseCogburn
Prejudice and bias by mayankleoboy1 on Monday, November 12, 2012
Anand, I am a Nvidia fanboi.
But still I was surprised by your AMD S10000 coverage. That merited only a page in the _pipeline_ section.
And a product from Nvidia gets a front-seat _3 page_ article?

Bias, or page hits ?
mayankleoboy1
RE: Prejudice and bias by Ryan Smith on Monday, November 12, 2012
I had more to write about the K20, it's as simple as that. This is the first chance I've had to write in-depth about GK110, whereas S10000 is a dual-chip board using an existing GPU.
Ryan Smith
RE: Prejudice and bias by lx686x on Monday, November 12, 2012
Ohhh, the W9000/8000 review that never got its promised part 2? And the S9000 and S7000 that were also thrown in the pipeline?
lx686x
RE: Prejudice and bias by tviceman on Monday, November 12, 2012
Just like the GTX 650 that never got its own review. Get over it.
tviceman
RE: Prejudice and bias by lx686x on Monday, November 12, 2012
It wasn't promised, get over it.
lx686x
RE: Prejudice and bias by The Von Matrices on Tuesday, November 13, 2012
It was promised, but it never was published.

http://www.anandtech.com/show/6289/nvidia-launches...

"We’ll be looking at the GTX 650 in the coming week, at which point we should have an answer to that question."
The Von Matrices
RE: Prejudice and bias by CeriseCogburn on Thursday, November 29, 2012
LOL - wrong again amd fanboy
CeriseCogburn
RE: Prejudice and bias by Bullwinkle J Moose on Tuesday, November 13, 2012
K20X
384 bit bus
6 GB VRAM
7.1 Billion Transistors
3.95 TFLOP Single Precision
1.31 TFLOP Double Precision
$3200

Sounds impressive but can it play Crisis?
Bullwinkle J Moose
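The throughput figures in the spec list above imply GK110's double-precision rate directly; a quick sanity check (the TFLOPS numbers are from the comment; reading them as an FP32:FP64 rate ratio is my interpretation):

```python
# K20X peak throughput as quoted in the comment above (TFLOPS)
sp_tflops = 3.95   # single precision
dp_tflops = 1.31   # double precision

ratio = sp_tflops / dp_tflops
print(f"FP32:FP64 ratio ~= {ratio:.2f}")  # roughly 3:1
```

That ~3:1 ratio is far richer in FP64 than the consumer Kepler parts discussed later in the thread, which is the whole point of selling GK110 into HPC first.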
RE: Prejudice and bias by Bullwinkle J Moose on Wednesday, November 14, 2012
CRYSIS
Bullwinkle J Moose
RE: Prejudice and bias by CeriseCogburn on Thursday, November 29, 2012
You both meant Crysis Warhead, frost bench, an amd advantage favorite for a single amd win of late, not Crysis 2.
LOL
the bias is screaming out
CeriseCogburn
RE: Prejudice and bias by eddman on Monday, November 12, 2012
"I am a Nvidia fanboi."

You do know that fanboy means "A stupid and highly biased fan", right?
The term you'd want to use is simply "Fan".
eddman
RE: Prejudice and bias by Denithor on Monday, November 12, 2012
I think that was his point exactly: he's a RABID nVidia fan but still finds the differential treatment of the two companies off balance.
Denithor
RE: Prejudice and bias by CeriseCogburn on Thursday, November 29, 2012
The truth is probably he's a rabid amd fanboy in disguise
CeriseCogburn
RE: Prejudice and bias by Sabresiberian on Tuesday, November 13, 2012
The "stupid" part is YOUR interpretation, it's not what it means to most people.

Biased, yes, stupid, no.
Sabresiberian
RE: Prejudice and bias by CeriseCogburn on Thursday, November 29, 2012
Biased because, while being so stupid, so obsessed, so doused in tampon and motrin lacking estrogenic emoting for the team, facts simply do not matter, and spinning that Barney and Nancy and Harry would be proud of becomes all that spews forth.
Stupid is definitely part of it, coupled with a huge liar factor.
It may be that the washing of the brain, coupled with the excessive female hormone problems, is the base cause, but in every case except the flat-out devious lying troll for amd, paid or unpaid, stupidity is a very large component.
CeriseCogburn
RE: Prejudice and bias by dragonsqrrl on Monday, November 12, 2012
'An article about an AMD product got only 1 page of coverage while an article about an Nvidia product got 3, BIAS, FAVORITISM, FANBOI'

Dude really, grow up. You're just about the last person who should be throwing around accusations of bias and fanboism. Do you really have nothing better to do than to troll and whine on Tom's and Anand, about how the whole world is conspiring against your benevolent AMD?
dragonsqrrl
RE: Prejudice and bias by CeriseCogburn on Thursday, November 29, 2012
Maybe he's the hacker that gives the -20 to every post on every gpu article at Tom's that is not 100% amd fanboy lie plus up based, or even hints at liking any nVidia card, ever.
I'm so SICK of having to hit the show post crap at Tom's in order to read any comments that aren't radeon rager amd favor boys
CeriseCogburn
I've heard through back channels that nVidia may be moving away from supporting OpenCL. Can you confirm any of this?
nutgirdle
There's always going to be that nagging concern since NVIDIA has CUDA, but I haven't heard anything to substantiate that rumor.
Ryan Smith
You mean you're worried, and still sore over how pathetic AMD has been in its severely long-lacking support for OpenCL compared to nVidia's far-ahead, long-running, great job of actually supporting it and breaking all the new ground, while amd fanboys whine OpenCL is the way and amd PR propaganda liar freaks paid by amd squeal OpenCL should be the only way forward?
Yeah, that's what you really meant, or rather should have said.
Hey, cheer up, amd finally got the checkmark in GPU-Z for OpenCL support. Like YEARS after nVidia.
Thanks, I love the attacks on nVidia, while amd is crap.
It's one of the major reasons why amd is now nearly dead. The amd fanboys focus on hating the rich, prosperous, and profitable competition 100%, instead of directing their efforts at kicking the loser, amd, in the head or groin, or at all, let alone hard enough, for their failures to become apparent and self-evident, so that they actually do something about them, fix them, and perform.
Most famous Catalyst Maker quote: "I didn't know we had a problem."
That's amd professionalism for you.
CeriseCogburn
Gaming GPU by Jorange on Monday, November 12, 2012
So the GK110 will form the basis of the GTX 680's replacement?
Jorange
RE: Gaming GPU by thebluephoenix on Monday, November 12, 2012
Yes.
thebluephoenix
RE: Gaming GPU by suryad on Friday, November 16, 2012
That's pretty much what I needed to hear. My GeForce GTX 285 OC editions in SLI are getting a bit long in the tooth!
suryad
RE: Gaming GPU by Ryan Smith on Monday, November 12, 2012
Frankly we have no idea. It is a GPU, so NVIDIA could absolutely do that if they wanted to. But right now the entire allocation is going to Tesla. And after that I would expect to see a Quadro GK110 card before we ever saw a consumer card.
Ryan Smith
RE: Gaming GPU by mayankleoboy1 on Monday, November 12, 2012
Probably not. Why should it? With HPC, they can sell it at $4000, making at least $2000 in profit.

With a consumer gaming card, they would have to sell it at $600 max, making $150-200 max.
mayankleoboy1
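The margin argument above can be made concrete using the commenter's own rough figures (all numbers are the commenter's guesses, not NVIDIA's actual costs or prices):

```python
# per-die profit estimates quoted in the comment above
hpc_profit = 2000        # claimed profit on a ~$4000 HPC card
consumer_profit = 175    # midpoint of the claimed $150-200 consumer profit

print(round(hpc_profit / consumer_profit, 1))  # ~11.4x more profit per die in HPC
```

On those assumptions every GK110 die diverted to a gaming card forgoes roughly an order of magnitude in profit, which is the commenter's case for HPC-first allocation.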
RE: Gaming GPU by Assimilator87 on Monday, November 12, 2012
nVidia already sells a $1k consumer graphics card, aka the GTX 690, so why can't they introduce one more?
Assimilator87
RE: Gaming GPU by HisDivineOrder on Monday, November 12, 2012
More to the point, they don't need to. The performance of the GK104 is more or less on par with AMD's best. If you don't need to lose money keeping up with the best your opponent has, then why should you lose money?

Keep in mind, they're charging $500 (and have been charging $500) for a GPU clearly built to be in the $200-$300 segment when their chief opponent in the discrete GPU space can't go a month without either dropping the prices of their lines or offering up a new, even larger bundle. This is in spite of the fact that AMD has released not one but two spectacular performance driver updates and nVidia disappeared on the driver front for about six months.

Yet even still nVidia charges more for less and makes money hand over fist. Yeah, I don't think nVidia even needs to release anything based on Big Daddy Kepler when Little Sister Kepler is easily handing AMD its butt.
HisDivineOrder
RE: Gaming GPU by RussianSensation on Monday, November 12, 2012
"Big Daddy Kepler when Little Sister Kepler is easily handing AMD its butt."

Only in sales. Almost all major professional reviewers have handed the win to HD7970 Ghz as of June 2012. With recent drivers, HD7970 Ghz is beating GTX680 rather easily:

http://www.legionhardware.com/articles_pages/his_7...

Your statement that Little Kepler is handing AMD's butt is absurd when it's slower and costs more. If NV's loyal consumers want a slower and more expensive card, more power to them.

Also, it's evident based on how long it took NV to get volume production on K20/20X, that they used GK104 because GK100/110 wasn't ready. It worked out well for them and hopefully we will get a very powerful GTX780 card next generation based on GK110 (or perhaps some other variant).

Still, your comment flies in the face of facts, since GK104 was never built to be a $200-300 GPU; NV couldn't possibly have launched a volume 7B chip when they are only now shipping thousands of them. Why would NV open pre-orders for K20 parts in Spring 2012 and let its key corporate customers wait until November 2012 to start getting their orders filled? This clearly doesn't add up with what you are saying.

Secondly, you make it sound like price drops on AMD's part are a sign of desperation but you don't acknowledge that NV's cards have been overpriced since June 2012. That's a double standard alright. As a consumer, I welcome price drops from both camps. If NV drops prices, I like that. Funny how some people view price drops as some negative outcome for us consumers...
RussianSensation
RE: Gaming GPU by CeriseCogburn on Thursday, November 29, 2012
So legion has the 7970 vanilla winning nearly every benchmark.
LOL
I guess amd fanboys can pull out all the stops, or as we know, they are clueless as you are.
http://www.hardocp.com/article/2012/10/08/his_rade...

Oh look at that, the super expensive amd radeon ICE Q X2 GIGAHERTZ EDITION overclocked can't even beat a vanilla MSI 680.

LOL

Reality sucks for amd fanboys.
CeriseCogburn
RE: Gaming GPU by Gastec on Tuesday, November 13, 2012
Right now, in the middle of the night, an idea sprang into my abused brain. nVidia is like Apple. And their graphics cards are like the iPhones. There are always a few million people willing to buy their products no matter what, no matter what price they put up. Even if the rest of the world stopped buying nVidia and iPhones, there will always be some millions of Americans who will buy them, and their sons, and their sons' sons, and so on and so forth until the end of days. Heck, even one of my friends, when we were chatting about computer components, uttered the words: "So you are not a fan of nVidia? You know it has PhysX." In my mind I was like: "FAN? What the... I bought my ATI card because it was cheaper and consumed less power, so I pay less money when the bloo... electricity bill comes." And after reading all your comments I understand now what you mean by "fanboy" or "fanboi" or whatever. Typically American bs.
Gastec
RE: Gaming GPU by CeriseCogburn on Thursday, November 29, 2012
LOL - another amd fanboy idiot who needs help looking in the mirror.
CeriseCogburn
RE: Gaming GPU by Kevin G on Monday, November 12, 2012
A consumer card would make sense if yields are relatively poor. A die this massive has to yield very few fully functional chips (in fact, K20X only has 14 of 15 SMX clusters enabled). I can see a consumer card with 10 or 12 SMX clusters being active, depending on yields for successful K20 and K20X dies.
Kevin G
RE: Gaming GPU by RussianSensation on Monday, November 12, 2012
It would also make sense if the yields are very good. If your yields are exceptional, you can manufacture enough GK110 die to satisfy both the corporate and consumer needs. Right now the demand for GK110 is outstripping supply. Based on what NV has said, their yields are very good. The main issue is wafer supply. I think we could reasonably see a GK110 consumer card next year. Maybe they will make a lean gaming card though as a lot of features in GK110 won't be used by gamers.
RussianSensation
RE: Gaming GPU by Dribble on Tuesday, November 13, 2012
Hope not - much better to give us another GK104 style architecture but increase the core count.
Dribble
Too costly for the enthusiast? by wiyosaya on Monday, November 12, 2012
IMHO, at these prices, I won't be buying one, nor do I think the average enthusiast is going to be interested in paying perhaps one and a half to three times the price of a good performance PC for a single Tesla card. Though nVidia will probably make hoards of money from supercomputing centers, I think they are doing this while forsaking the enthusiast market.

The 600 series seriously cripples double-precision floating-point capability, making a Tesla imperative for anyone needing real DP performance; even so, I won't be buying one. Now, if one of the 600 series had DP performance on par with or better than the 500 series, I would have bought one rather than buying a 580.

I don't game much, however, I do run several BOINC projects, and at least one of those projects requires DP support. For that reason, I chose a 580 rather than a 680.
wiyosaya
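The DP gap wiyosaya describes can be roughed out from the cards' advertised FP64 rates (a sketch; the peak single-precision figures and the 1/8 vs 1/24 FP64 ratios are my recollection of public spec sheets, not numbers from the article):

```python
# approximate peak single-precision GFLOPS and FP64:FP32 rate per card
cards = {
    "GTX 580 (Fermi)":  (1581, 1 / 8),   # GeForce Fermi ran FP64 at 1/8 rate
    "GTX 680 (Kepler)": (3090, 1 / 24),  # GeForce Kepler is cut to 1/24 rate
}

for name, (sp_gflops, dp_rate) in cards.items():
    print(f"{name}: ~{sp_gflops * dp_rate:.0f} GFLOPS double precision")
# the older 580 comes out ahead in DP, matching the commenter's choice
```

Despite the 680's much higher single-precision throughput, the deeper FP64 cut leaves it behind the 580 for double-precision workloads like some BOINC projects.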
RE: Too costly for the enthusiast? by DanNeely on Monday, November 12, 2012
The Tesla (and quadro) cards have always been much more expensive than their consumer equivalents. The Fermi generation M2090 and M2070Q were priced at the same several thousand dollar pricepoint as K20 family; but the gaming oriented 570/580 were at the normal several hundred dollar prices you'd expect for a high end GPU.
DanNeely
RE: Too costly for the enthusiast? by wiyosaya on Tuesday, November 13, 2012
Yes, I understand that; however, IMHO, the performance differences are not significant enough to justify the huge price difference unless you work in very high end modeling or simulation.

To me, with this generation of chips, this changes. I paid close attention to 680 reviews, and DP performance on 680-based cards is below that of the 580 - not, of course, that it matters to the average gamer. However, I have little doubt that the chips in these Teslas could easily be adapted for use as graphics cards.

While it is nVidia's right to sell these into any market they want, as I see it, the only market for these cards is the HPC market, and that is my point. It will be interesting to see if nVidia continues to be able to make a profit on these cards now that they are targeted only at the high-end market. With the extreme margins on these cards, I would be surprised if they are unable to make a good profit on them.

In other words, do they sell X units at consumer prices, or Y units at professional prices, and which target market is better for them in terms of profit? IMHO, X is likely the market where they would sell many times as many chips as in the Y market, but, for example, they can only charge 5X for the Y card. If they would sell ten times the chips in the X market, they will have lost profit by targeting the Y market with these chips.

Also, nVidia is writing their own ticket on these. They are making the market. They know that they have a product that every supercomputing center will have on its must buy list. I doubt that they are dumb.

What I am saying here is that nVidia could sell these for almost any price they choose to any market. If nVidia wanted to, they could sell this into the home market at any price. It is nVidia that is making the choice of the price point. By selling the 680 at high-end enthusiast prices, they artificially push the price points of the market.

Each time a new card comes out, we expect it to be more expensive than the last generation, and, therefore, consumers perceive that as good reason to pay more for the card. This happens in the gaming market, too. It does not matter to the average gamer that the 580 outperforms the 680 in DP operations; what matters is that games run faster. Thus, the 680 becomes worth it to the gamer and the price of the hardware gets artificially pushed higher - as I see it.

IMHO, the problem with this is that nVidia may paint themselves into an elite market. Many companies have tried this, notably Compaq and currently Apple. Compaq failed, and Apple, depending on what analysts you listen to, is losing its creative edge - and with that may come the loss of its ability to charge high prices for its products. While nVidia may not fall into the "niche" market trap, as I see it, it is a pattern that looms on the horizon, and nVidia may fall into that trap if they are not careful.
wiyosaya
RE: Too costly for the enthusiast? by CeriseCogburn on Thursday, November 29, 2012
Yep, amd is dying, rumors are it's going to be bought up after a chapter bankruptcy, restructured, saved from permadeath, and of course, it's nVidia that is in danger of killing itself... LOL
Boinc is that insane sound in your head.
NVidia professionals do not hear that sound, they are not insane.
CeriseCogburn
RE: Too costly for the enthusiast? by shompa on Monday, November 12, 2012
These are not "home computer" cards. These are cards for high performance calculations "super computers". And the prices are low for this market.

The unique thing about this year's launch is that Nvidia has always sold consumer cards first and supercomputer cards later. This time it's the other way around.

Nvidia uses the supercomputer cards to more or less subsidise its "home PC" graphics cards. Usually it's the same card but with different drivers.

Home 500 dollars
Workstation 1000-1500 dollars
Supercomputing 3000+ dollars

Three different prices for the same card.

But 7 billion transistors on 28nm will be expensive for home computing. It costs more than twice as much to manufacture these GPUs as Nvidia's 680.

7 BILLION. Remember that the first Pentium was the first chip with 1 million transistors. This has 7,000 times as many.
shompa
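The transistor comparison above is easy to check (the 1-million figure for the original Pentium is the commenter's; published counts actually put the Pentium nearer 3.1 million):

```python
gk110_transistors = 7_000_000_000   # per the article
pentium_transistors = 1_000_000     # figure used in the comment above

print(gk110_transistors // pentium_transistors)  # 7000x the transistor count
```

Note this is a ratio of transistor counts, not of density; density also depends on die area (GK110 is roughly 550 mm^2 versus the Pentium's much smaller die).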
RE: Too costly for the enthusiast? by kwrzesien on Monday, November 12, 2012
All true.

But I think what has people complaining is that this time around Nvidia isn't going to release this "big" chip to the Home market at all. They signaled this pretty clearly by putting their "middle" chip into the 680. Unless they add a new top-level part name like a 695 or something they have excluded this part from the home graphics naming scheme. Plus since it is heavily FP64 biased it may not perform well for a card that would have to be sold for ~$1000. (Remember they are already getting $500 for their middle-size chip!)

Record profits - that pretty much sums it up.
kwrzesien
RE: Too costly for the enthusiast? by DanNeely on Monday, November 12, 2012
AFAIK that was necessity speaking. The GK100 had some (unspecified) problems, forcing them to put the GK104 in both the mid and upper range of their product line. When the rest of the GK11x-series chips show up and nVidia launches the 7xx series, I expect to see GK110s at the top as usual. Having seen nVidia's midrange chip trade blows with their top-end one, AMD is unlikely to be resting on its laurels for their 8xxx series.
DanNeely
RE: Too costly for the enthusiast? by RussianSensation on Monday, November 12, 2012
Great to see someone who understood the situation NV was in. Also, people think NV is a charity or something. When they were selling 2x 294mm^2 GTX690 for $1000, we can approximate that on a per wafer cost, it would have been too expensive to launch a 550-600mm^2 GK100/110 early in the year and maintain NV's expected profit margins. They also faced wafer shortages which explains why they re-allocated mobile Kepler GPUs and had to delay under $300 desktop Kepler allocation by 6+ months to fulfill 300+ notebook design wins. Sure, it's still Kepler's mid-range chip in the Kepler family, but NV had to use GK104 as flagship.
RussianSensation
RE: Too costly for the enthusiast? by CeriseCogburn on Thursday, November 29, 2012
kwrzesien, another amd fanboy idiot loser with a tinfoil brain and a rumor-mongered, brainwashed gourd.
Everything you said is exactly wrong.
Perhaps an OWS gathering will help your emotional turmoil; maybe you can protest in front of the nVidia campus.
Good luck, wear red.
CeriseCogburn
RE: Too costly for the enthusiast? by bebimbap on Monday, November 12, 2012
Each "part" being made with the "same" chip is more expensive for a reason.

For example, hard drives made by the same manufacturer have different price points for enterprise, small business, and home users. I remember an Intel server rep saying to use parts designed for their workload, so enterprise "should" use an enterprise drive and so forth, because of costs. He added that, under extensive testing, the bearings used in home-user drives will force out their lubricant fluid in certain enterprise scenarios, causing the drive to spin slower and give read/write errors; but if you let the drive sit on a shelf after it has "failed", it starts working perfectly again because the fluid has returned to where it needs to be. Enterprise drives also tend to have 1 or 2 orders of magnitude better bit read error rates than consumer drives.

In the same way, I'm sure the Tesla, Quadro, and GTX all have different firmwares, different accepted error rates, different loads they are tested for, and different binning. So though you say "the same card", they are different.

And home computing has changed and gone in a different direction. No longer are we gaming in a room that needs a separate AC unit because of the 1500W of heat coming from the computer. We have moved from 130W CPUs to only 78W. Single-GPU cards are no longer using 350W but only 170W. So we went from 600-1500W systems on ~80%-efficient PSUs to about 300-600W on 90%+-efficient PSUs, and that is just under high load. If we compare idle power, instead of using 1/2 we are only using 1/10. We no longer need a GK110-based GPU, and it might be said that it would not make economic sense for the home user.

GK104 is good enough.
bebimbap
RE: Too costly for the enthusiast? by EJ257 on Monday, November 12, 2012
The consumer model of this with the fully operational die will be in the $1000 range. 7 billion transistors is a really big chip even for a 28nm process.
EJ257
RE: Too costly for the enthusiast? by CeriseCogburn on Thursday, November 29, 2012
We can look forward to a thousand morons screaming it's not twice as fast as the last gen, again.
The pea brained proletariat never disappoints.
They always have memorized 15 or 20 lies, and have hold of half a fact to really hammer their points home.
I can hardly wait.
CeriseCogburn