Fermi

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 15301 - Posted: 18 Feb 2010, 17:00:20 UTC

Snow Crash

Joined: 4 Apr 09
Posts: 450
Credit: 539,316,349
RAC: 0
Message 15305 - Posted: 18 Feb 2010, 17:25:13 UTC - in response to Message 15301.  

I'm getting at least one and maybe two. We just need to have them available so we can see how quickly they process WUs!

Thanks - Steve
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 15316 - Posted: 18 Feb 2010, 21:42:08 UTC

Nice link with that video explaining the basics! Just be sure to check the price and power supply requirements before you buy a couple of them ;)

MrS
Scanning for our furry friends since Jan 2002
Snow Crash

Joined: 4 Apr 09
Posts: 450
Credit: 539,316,349
RAC: 0
Message 15366 - Posted: 22 Feb 2010, 17:12:46 UTC

http://www.nvidia.com/object/paxeast.html
Thanks - Steve
Profile Zydor

Joined: 8 Feb 09
Posts: 252
Credit: 1,309,451
RAC: 0
Message 15382 - Posted: 23 Feb 2010, 15:59:09 UTC - in response to Message 15366.  
Last modified: 23 Feb 2010, 15:59:28 UTC

Don't waste your money - Fermi is a dead duck.

It will be released late March or early April, but in very limited quantities, each one fielded at a huge loss and each knocking hard against the 300 W PCIe certification limit. It will be a PR stunt to keep them going; by May/June you will not get one for love nor money, because NVidia cannot afford to subsidise more. The current design is grossly uneconomical, and can be wiped out in a performance sense by current ATI cards, let alone the ATI cards due for release this summer.
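
For context on that 300 W figure: the PCIe specification allows 75 W from the x16 slot itself plus 75 W per 6-pin and 150 W per 8-pin auxiliary connector. A minimal Python sketch of the budget arithmetic follows; the one-6-pin-plus-one-8-pin mix is only an assumption about the card's layout:

    # PCIe board power budget: slot power plus auxiliary connectors.
    SLOT_W = 75        # available through the x16 slot itself
    SIX_PIN_W = 75     # per 6-pin auxiliary connector
    EIGHT_PIN_W = 150  # per 8-pin auxiliary connector

    def board_power_budget(six_pins=0, eight_pins=0):
        """Maximum certifiable board power for a given connector mix."""
        return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

    # A card with one 6-pin and one 8-pin connector tops out at exactly 300 W:
    print(board_power_budget(six_pins=1, eight_pins=1))  # -> 300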

NVidia will not fix its silicon engineering problems until it can go to 28 nm production, and that will not be until 2011. On top of that, a viable production-level NVidia DX11 part is just a marketing dream at present; NVidia is not capable of producing production-standard DX11 hardware for sustained mass deployment until it goes 28 nm. Even if it took the suicidal route of sticking with a "fix" for the current Fermi design, it would take a minimum of six months to get to market, and the final fix would be uneconomic; it would break an already theoretically insolvent NVidia.

By the time Fermi2 comes out in 2011, ATI will be two generations ahead and NVidia will be strategically left for dust (which, frankly, it already is). 2010 will be a year in which NVidia internally positions itself for a re-branding and market repositioning. By 2011 it will be finished in its current guise, a rump of its former self.

The 275, 280 and 285 are already formally EOL, and the 295 is all but EOL - OEMs will not get any further 295s. There are no shots left in the NVidia armoury.

I have been an NVidia fan for many years, but the party's over ... I will be changing to ATI for my next PC build.

NVidia did not listen to warnings given about engineering architecture issues in 2007, and will now pay the price for that.

Regards
Zy
Snow Crash

Joined: 4 Apr 09
Posts: 450
Credit: 539,316,349
RAC: 0
Message 15383 - Posted: 23 Feb 2010, 16:18:26 UTC - in response to Message 15382.  

Can you reveal your sources?

NVidia is still profitable: http://www.istockanalyst.com/article/viewarticle/articleid/3838318
Thanks - Steve
Profile Zydor

Joined: 8 Feb 09
Posts: 252
Credit: 1,309,451
RAC: 0
Message 15384 - Posted: 23 Feb 2010, 16:49:55 UTC - in response to Message 15383.  
Last modified: 23 Feb 2010, 17:07:40 UTC

That's based on the first three quarters of 2009, when they still had four mainstream cards to sell - they don't now, and will not until Fermi is out. NVidia is not only about graphics cards at the consumer level, and there is no suggestion from me that it will go out of business, but it will retreat from the classic PC graphics card market; it has nothing left. It will go into specialist graphics applications in 2011; it can't compete with ATI.

Fermi is a disaster. The returns from the final runs of silicon at the end of January had a mere 2-5% yield; the silicon engineering is wrong. To fix it without a total redesign they have to increase the core voltage; that pushes up the watts needed to get the performance they want, and that in turn butts them up against the PCIe certification limit. They must, must come in under 300 W, because without PCIe certification OEMs will not touch them. To come in under 300 W they will have to disable parts of the GPU when running at full speed.
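
The voltage-to-watts link in that argument is the usual dynamic power relation, P ~ C * V^2 * f: power grows with the square of the core voltage. A minimal sketch of that scaling; the base power and voltage figures below are made up purely for illustration:

    # Dynamic power scales roughly as C * V^2 * f (switched capacitance,
    # core voltage squared, clock frequency).
    def scaled_power(base_power_w, v_old, v_new, f_old=1.0, f_new=1.0):
        """Estimate power after a voltage and/or frequency change."""
        return base_power_w * (v_new / v_old) ** 2 * (f_new / f_old)

    # Hypothetical example: a 250 W part bumped from 1.00 V to 1.10 V at the
    # same clock already lands past the 300 W PCIe ceiling.
    print(round(scaled_power(250, 1.00, 1.10)))  # -> 302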

Source - my opinion from searching the Web. Sit down one day and google really, really hard along the lines of:

GTX480, GF100 design, GF100 silicon, Fermi silicon, Fermi engineering, NVidia architecture, TSMC silicon NVidia, etc.

Keep away from "card reviews" and the like; they just regurgitate NVidia specs. Delve into comments at a pure engineering level. The story is not a happy one.

Regards
David
Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist

Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Message 15385 - Posted: 23 Feb 2010, 17:21:50 UTC - in response to Message 15384.  

Fermi launch date:
http://www.theinquirer.net/inquirer/news/1593186/fermi-finally-appear

gdf
cenit

Joined: 11 Nov 09
Posts: 23
Credit: 668,841
RAC: 0
Message 15386 - Posted: 23 Feb 2010, 17:42:05 UTC - in response to Message 15385.  

there is some really harsh news about Fermi...

look at this one:
The chip is unworkable, unmanufacturable, and unfixable.

it also contains a lot of links and technical data to support the statement... ugh!
Profile Zydor

Joined: 8 Feb 09
Posts: 252
Credit: 1,309,451
RAC: 0
Message 15387 - Posted: 23 Feb 2010, 17:54:02 UTC - in response to Message 15385.  

They will be out of stock by 1 July, maybe earlier, depending on how fast take-up is from existing users. After that they can't produce more - they don't have the cash to subsidise the fix they need to apply to the silicon. It's the 250 spin saga all over again, but this time there is nothing like the 295 producing cash flow to make up for it. The current cards are just a PR stunt pre-Fermi; they can't stand up to ATI on their own performance.

There are just too many indicators. Another one: why are ATI/AMD dragging their heels on an FFT fix? They say they have one and that it will be deployed in the next OpenCL release "in a few months". A few months?? And this while both ATI and NVidia are (at least publicly) shouting the virtues of OpenCL. Look at it from the other angle: given NVidia's engineering problems, why should ATI accelerate OpenCL development and fixes? Without NVidia in its current form there is no real need for ATI to push OpenCL.

All speculative, for sure; it's only an opinion. However, there are far too many indicators supporting this and, equally significant, no rebuttal from NVidia or their partners/supporters of the main issues raised in some very strong reviews and statements made about the silicon.

Time will tell, I guess, but there is certainly enough to say "hold for now". If the Fermi production reviews pick this up - and they will - or the cards suddenly become scarce "due to extraordinary demand" at the end of June, then we will know. However, I suspect the industry experts and analysts will have blown this wide open before then, once the production card is out there and its engineering can be picked apart. Then the next phase will begin, with spin covering the delay over Fermi2, but I don't think they will get away with it this time.

Regards
Zy
Profile Zydor

Joined: 8 Feb 09
Posts: 252
Credit: 1,309,451
RAC: 0
Message 15388 - Posted: 23 Feb 2010, 17:59:42 UTC - in response to Message 15386.  
Last modified: 23 Feb 2010, 18:03:45 UTC

there is some really harsh news about Fermi...

look at this one:
The chip is unworkable, unmanufacturable, and unfixable.

it also contains a lot of links and technical data to support the statement... ugh!


Charlie is a known NVidia hater - he would spit on their grave given the chance, so we have to take some of his ire with a pinch of salt. However, even if only half of that report is true, NVidia is sunk without trace.

In fairness to him, he has been right with his predictions on NVidia for the last 2 years, especially on the engineering aspects.

I'm trying to keep an open mind ... I really am ... but it's really hard to with what's out there.

Regards
Zy
Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist

Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Message 15389 - Posted: 23 Feb 2010, 18:25:15 UTC - in response to Message 15388.  

Luckily, we now only need to wait one month to know what is true.

gdf
Profile liveonc
Joined: 1 Jan 10
Posts: 292
Credit: 41,567,650
RAC: 0
Message 15410 - Posted: 24 Feb 2010, 20:49:02 UTC - in response to Message 15384.  
Last modified: 24 Feb 2010, 20:49:55 UTC

That's based on the first three quarters of 2009, when they still had four mainstream cards to sell - they don't now, and will not until Fermi is out. NVidia is not only about graphics cards at the consumer level, and there is no suggestion from me that it will go out of business, but it will retreat from the classic PC graphics card market; it has nothing left. It will go into specialist graphics applications in 2011; it can't compete with ATI.

Fermi is a disaster. The returns from the final runs of silicon at the end of January had a mere 2-5% yield; the silicon engineering is wrong. To fix it without a total redesign they have to increase the core voltage; that pushes up the watts needed to get the performance they want, and that in turn butts them up against the PCIe certification limit. They must, must come in under 300 W, because without PCIe certification OEMs will not touch them. To come in under 300 W they will have to disable parts of the GPU when running at full speed.

Source - my opinion from searching the Web. Sit down one day and google really, really hard along the lines of:

GTX480, GF100 design, GF100 silicon, Fermi silicon, Fermi engineering, NVidia architecture, TSMC silicon NVidia, etc.

Keep away from "card reviews" and the like; they just regurgitate NVidia specs. Delve into comments at a pure engineering level. The story is not a happy one.

Regards
David


Stupid question, but I'll ask anyway. If the GTX470/480 comes out clocked at 676 MHz to meet the PCIe certification limit, and NVidia gave the option to go beyond 300 W (at your own risk), would they get that "edge" people are looking for? Overclocking always voids warranties anyway, so why not give the option (at your own risk) and package that extra PCIe 12 V connector in a way that voids all warranties if unsealed...
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 15413 - Posted: 24 Feb 2010, 22:32:02 UTC - in response to Message 15410.  

You will probably be able to OC the Fermi cards, just like the previous cards. However, almost any attempt at doing so will push you past 300 W, so NVidia doesn't have to do much to allow this ;)

And it's not like your system suddenly fails if you draw more than that. At least in quality hardware there are always some safety margins built in. A nice example of this is ATI's dual-chip 5970: it draws almost exactly 300 W under normal conditions, but when OCed goes up to ~400 W. It works, but the heat and noise are unpleasant, to say the least. And the power supply circuitry is quite challenged by such a load and can become a limiting factor (which could be helped by a more expensive board design).

Is that the "edge"? It depends... I wouldn't want to draw about twice the power of an ATI for a few tens of percent more performance. For me it would have to be at least ~80% more performance for 100% more power. Oh, and don't forget that you can OC the ATIs as well ;)
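
MrS's rule of thumb is easy to put into numbers as performance per watt. A toy comparison in Python; every figure here is made up purely to show the arithmetic, not a measurement of either card:

    # Compare two hypothetical cards by performance per watt.
    def perf_per_watt(relative_perf, power_w):
        return relative_perf / power_w

    ati = perf_per_watt(1.00, 150)    # baseline: relative perf 1.0 at 150 W
    fermi = perf_per_watt(1.30, 300)  # 30% faster, but at double the power

    print(f"ATI:   {ati:.4f} perf/W")    # -> 0.0067
    print(f"Fermi: {fermi:.4f} perf/W")  # -> 0.0043

    # By the "~80% more performance for 100% more power" rule of thumb,
    # +30% performance at +100% power clearly fails the test.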

MrS
Scanning for our furry friends since Jan 2002
Profile robertmiles

Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 15419 - Posted: 25 Feb 2010, 2:46:27 UTC - in response to Message 15413.  

Would you like to order some Fermi cards before they are built, and before the power requirements and other specs are released? You can.

http://techreport.com/discussions.x/18515#jazz

I'll let you decide if it looks worth it.

Profile robertmiles

Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 15422 - Posted: 25 Feb 2010, 6:37:35 UTC

Seems that my last post in this thread has turned out questionable - when I found the site listing them for sale again, it now shows both GTX4xx models as out of stock.

I also found an article saying that the amount of memory shown on that web page is questionable for the Fermi design.

http://www.brightsideofnews.com/news/2010/2/21/nvidia-geforce-gtx-480-shows-up-for-preorder-and-the-specs-look-fishy.aspx

There is also an article showing NVidia's announcement plans for that series:

http://www.nvidia.com/object/paxeast.html
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 15428 - Posted: 25 Feb 2010, 13:22:46 UTC - in response to Message 15413.  
Last modified: 25 Feb 2010, 14:17:49 UTC

NVidia has taken huge steps over the last year towards the low- and mid-range card market, for many reasons, none of them being ATI. This should not be seen as a migration from the top-end market, but rather a necessary defence and improvement of their low- to mid-range market. This is where most GPUs are sold, so it's the main battlefield, and in this war NVidia has a bigger threat than ATI.

NVidia's mid-range cards, such as the GT220 and GT240, offer much lower power usage than previous generations, making them very attractive to the occasional gamer, home media centre users (with their HDMI interface), the office environment, and of course GPUGrid. The GT240 offers similar performance to an 8800GS or 9600GT in out-and-out GPU processing, but incorporates new features and technologies. Yet where the 8800GS and 9600GT use around 69 W when idle and 105 W under high usage, the GT240 uses about 10 W idle and up to 69 W (typically 60 to 65 W) in high use. As these cards do not require any special power connectors, and are quiet and not oversized, they fit into more systems and are more ergonomically friendly.
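
Those idle and load figures translate directly into running costs for a card crunching 24/7. A back-of-the-envelope sketch; the electricity price is an assumption, and the wattages are the rough load figures quoted above:

    # Annual energy cost of a GPU crunching 24/7 at a given load power.
    HOURS_PER_YEAR = 24 * 365

    def annual_cost(load_watts, price_per_kwh=0.15):  # price assumed; adjust locally
        kwh = load_watts * HOURS_PER_YEAR / 1000
        return kwh * price_per_kwh

    print(f"9600GT-class (105 W): ${annual_cost(105):.2f} per year")  # -> $137.97
    print(f"GT240 (65 W):         ${annual_cost(65):.2f} per year")   # -> $85.41

    # The ~40 W difference is worth roughly $50 per year at $0.15/kWh.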

WRT crunching at GPUGrid, the GT240 will significantly outperform these and many more powerful, and power-hungry, CC 1.1 cards (such as the GTS 250, 150 W TDP), because the GT240 is CC 1.2 and is much, much more reliable (GT215 core rather than G92)!

NVidia has also entered the beyond-the-PC market, creating chips for things such as TVs and mobiles. However, this is not just an attempt to stave off Intel's offensive on the low-end GPU market. NVidia has found an ally in the form of ARM; Intel has also started to take the low-end CPU fight to ARM, with the release of Atom and the continued development of small CPUs of limited power. However, ARM is not an easy target, not by a long way: over a billion ARM-based CPU chips ship every year, in various devices including just about every mobile on the planet. So NVidia has found very solid support from ARM (especially through Tegra 2, its computer-on-a-chip built around the ARM Cortex-A9, which could be making its way to a TV, games console, PDA or netbook near you any time soon).

So Intel's recent move towards low-end CPUs and GPUs, and more importantly its development of single-chip CPU-plus-GPU parts, naturally makes it the common enemy of NVidia and ARM. If NVidia failed to respond to this challenge, its marketplace would be threatened not just by ATI/AMD but by Intel, on the main battlefield (where most cards are sold).


A few things to note about PCIe technologies:
PCIe 1.x operates at 2.5 GHz.
PCIe 2.0 operates at 5 GHz.
PCIe 2.1 operates at 5 GHz.
PCIe 3.0 will operate at 8 GHz, but it has not yet been ratified; rather, it has been delayed until the second quarter of 2010, and perhaps then some.
It is no surprise that it has not been decided upon; Fermi has not been released!

This highlights the secretive nature of GPU manufacturers, too scared to reveal what they are up to even if it hampers their own sales. It lends weight to the argument that the Fermi cards will be an NVidia loss leader; however, we are not privy to the internal cost analysis, just the speculation, and you could argue that Fermi is a trailblazing technology: a flagship for newly developed technology that will lend itself to new mid-range cards over the next year or so and set a new bar for performance.

It is worth noting that Intel developed PCIe, so it might want to scupper the technology to nobble both ATI and NVidia; seeing as Intel doesn't have a top-end GPU, it might want to deny other GPU manufacturers the opportunity to compete on something it cannot control.

It strikes me as likely that people who buy a Fermi in a month's time might have to buy a new motherboard later in the year to fully appreciate the card. Today it would have to work on a PCIe 2.0 (or 2.1) based motherboard, as these are the best there are, but I would not be surprised if better performance came with a future PCIe 3.0 motherboard:
PCIe 2.0 uses an 8b/10b encoding scheme (at the data transmission layer), which results in a 20% overhead. PCIe 3.0 will use a 128b/130b encoding scheme, resulting in a <2% overhead, but this is only one area of improvement.
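
To see what that encoding change means in bandwidth terms, here is a minimal sketch of usable per-lane throughput for each generation, using the line rates listed above:

    # Usable per-lane bandwidth = line rate * encoding efficiency / 8 bits per byte.
    def lane_bandwidth_mbs(gtransfers_per_s, payload_bits, total_bits):
        """Effective MB/s per lane after encoding overhead."""
        return gtransfers_per_s * 1e9 * (payload_bits / total_bits) / 8 / 1e6

    print(f"PCIe 1.x: {lane_bandwidth_mbs(2.5, 8, 10):.0f} MB/s per lane")    # -> 250
    print(f"PCIe 2.0: {lane_bandwidth_mbs(5.0, 8, 10):.0f} MB/s per lane")    # -> 500
    print(f"PCIe 3.0: {lane_bandwidth_mbs(8.0, 128, 130):.0f} MB/s per lane") # -> 985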
Profile [AF>Libristes] Dudumomo

Joined: 30 Jan 09
Posts: 45
Credit: 425,620,748
RAC: 0
Message 15429 - Posted: 25 Feb 2010, 13:40:35 UTC - in response to Message 15428.  

Thanks SKGiven.
Interesting post.
Profile Beyond
Joined: 23 Nov 08
Posts: 1112
Credit: 6,162,416,256
RAC: 0
Message 15430 - Posted: 25 Feb 2010, 14:45:49 UTC - in response to Message 15428.  

NVidia has taken huge steps over the last year towards the low- and mid-range card market, for many reasons, none of them being ATI. This should not be seen as a migration from the top-end market, but rather a necessary defence and improvement of their low- to mid-range market. This is where most GPUs are sold, so it's the main battlefield, and in this war NVidia has a bigger threat than ATI.

The highest-profit area, however, is the high end, and the truth is that NVidia has made numerous missteps in the high-end market. AMD/ATI is executing much better at the moment. NVidia spent a lot of money developing the G200 series, and now those cards are becoming more expensive and harder to find as they reach EOL status. Fermi sounds like it might be in trouble. That's too bad, as we need strong competition to spur advances and hold down prices.

It strikes me as likely that people who buy a Fermi in a month's time might have to buy a new motherboard later in the year to fully appreciate the card. Today it would have to work on a PCIe 2.0 (or 2.1) based motherboard, as these are the best there are, but I would not be surprised if better performance came with a future PCIe 3.0 motherboard:
PCIe 2.0 uses an 8b/10b encoding scheme (at the data transmission layer), which results in a 20% overhead. PCIe 3.0 will use a 128b/130b encoding scheme, resulting in a <2% overhead, but this is only one area of improvement.

Interesting, but not very relevant to our needs. Even the slowest current versions of PCIe are overkill for interfacing the GPU and CPU for the purposes of GPU computing. Maybe gaming is a different matter, but NVidia seems to be moving away from the high-end gaming market anyway.

As for Intel, over the years it has made several attempts to take over the graphics card market and has failed miserably every time. It seems that some of its much-ballyhooed technologies have recently been scrapped this time too. Still, Intel is huge, has many resources, and has never been one to let the law stand in its way when attempting to dominate a market. As always, time will tell.

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 15439 - Posted: 25 Feb 2010, 18:35:40 UTC - in response to Message 15430.  

Interesting, but not very relevant to our needs.

Probably, but I was talking generally because I wanted to make a point about the politics behind the situation: if delaying PCIe 3.0 means another company can't release a significantly better product, one that could be used here, some will see that as a battle won.

I expect gaming would improve in some ways with a faster PCIe interface; otherwise there would be no need for PCIe 3.0. I doubt that all the hold-ups are due to fab problems.

NVidia seems to be moving away from the high-end gaming market anyway.


Well, since the GTX 295 that has been true, because the battle was on a different front, but Fermi will be a competitive top-end card. We will see in a month, or two, hopefully!

Intel has been trying desperately for years to be all things to all users, but its flood-the-market strategy often means that the biggest competitor to a new product is one of its own existing products. I don't see Intel as a graphics chip developer, and I think most people would feel the same.
I think most of its attempts to take over the graphics card market have been through ownership and control methods rather than engineering, so it is no wonder those attempts failed miserably; Intel never actually made a good GPU.