Fermi

Message boards : Graphics cards (GPUs) : Fermi
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17495 - Posted: 1 Jun 2010, 21:25:27 UTC - in response to Message 17493.  

The GF100 with half the GPU disabled would not be competitive; too much leakage – better off in the bin!
There are too many rumours about the GF104. I heard 384 shaders, 750 MHz and 130 W, which would do nicely doubled up for the fall, but then I also saw a Photoshopped image of a stretched GF100!


That's why I said "next smaller chip" ;)
And a 750 MHz core clock matches the 1.5 GHz shader clock I've read about very well (and the >1.5 GHz I'd expect). However, I'd rather expect 256 shaders at 150 W, with double the number of texture units, than 384 shaders at 130 W.
Based on the numbers for the GTX480 I actually estimate 143 W for "my" card, neglecting the TMU increase and any voltage increase needed to reach the higher clock. 384 shaders at 130 W is daydreaming - it's the same design on the same process.
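
For anyone wanting to reproduce that 143 W figure, here is a minimal sketch of the scaling assumption (linear in shader count and shader clock; it deliberately ignores TMUs, voltage and leakage, as stated above):

```python
# Naive linear scaling from GTX480 numbers (480 shaders at 1401 MHz,
# 250 W TDP): assumes board power scales with shaders x shader clock,
# ignoring the TMU increase and any voltage bump, as noted above.
GTX480_SHADERS, GTX480_MHZ, GTX480_TDP_W = 480, 1401, 250

def scaled_tdp_w(shaders, shader_mhz):
    """Estimate the TDP of a hypothetical cut-down part."""
    ratio = (shaders * shader_mhz) / (GTX480_SHADERS * GTX480_MHZ)
    return GTX480_TDP_W * ratio

print(round(scaled_tdp_w(256, 1500)))  # 143 (W) - the estimate above
```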

MrS
Scanning for our furry friends since Jan 2002
ID: 17495
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17496 - Posted: 2 Jun 2010, 10:55:37 UTC - in response to Message 17495.  
Last modified: 2 Jun 2010, 11:02:13 UTC

This thread suggests 336 and 384 shaders:
336 shaders, 768 MB, 675/1350/1800 MHz ??
384 shaders, 1 GB, 750/1500 MHz ??
Wishful thinking perhaps!
It's more likely to be 256 shaders, perhaps with 1 GB at 750/1500/1800 MHz (though 800 MHz versions might turn up).
The 130 W suggestion was probably for the GTS450 or GTS440. We can only guess whether these will have the full complement of 256 shaders (at, for example, 675 MHz and 625 MHz); with the GTS430 only having 192 shaders, a 224-shader part might also be possible.
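
As a quick sanity check on those counts, here is a sketch assuming GF104 parts inherit GF100's granularity of 32 shaders per SM and are binned by disabling whole SMs (my assumption, nothing confirmed):

```python
# Plausible shader counts if GF104 keeps GF100's 32-shader SMs and parts
# are binned by disabling whole SMs - an assumption, not a confirmed spec.
SHADERS_PER_SM = 32

def shader_options(max_sms):
    """All counts reachable by enabling 1..max_sms SMs."""
    return [n * SHADERS_PER_SM for n in range(1, max_sms + 1)]

print(shader_options(8))  # [..., 192, 224, 256] - the counts discussed above
```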

150 W is being bounced about for the GTX460, and there is talk of a dual GTX460 (GTX490). 295 W seems more likely than 375 W (a dual GTX480), but if a GF102 Fermi turns up with 512 shaders, it would be competing against itself! So perhaps there is something in the 336/384-shader speculation, even if it is a different product and many months away.

Although they were correct about the GTX465 specs, it's still all speculation until announced by NVidia.

This is my rough guess as to what might turn up, and when.
- GTX 495 - 2xGF104 - 295 W (long way off)
- GTX 485 - GF102 - 245 W (3 or 4 months)
- GTX 475 - GF102 - 215 W (3 or 4 months)
- GTX 465 - GF102 - 200 W (released)
- GTX 460 - GF104 - 150 W (within 2 months)
- GTS 450 - GF106 - 140 W (within 2 months)
- GTS 440 - GF106 - 130 W (within 2 months)
- GTS 430 - GF108 - 120 W (within 2 months)

As for what turns up under the hood, who knows?
It is probably not even worth speculating on, because TDPs and frequencies are likely to vary as open-design cards become the norm. A few of us here suggested this was the way forward for NVidia: the chips get built by NVidia and the card makers design the boards and cooling, as they do a better job of that anyway!

What's clear is that there will be more efficient Fermis with fewer shaders, targeting the middle market.
A competitive range of cards with GPUGrid performance between that of a GT240 and a GTX285, for between £100 and £250, would be very good news for the project!
ID: 17496
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17498 - Posted: 2 Jun 2010, 13:39:34 UTC - in response to Message 17496.  

Galaxy have a single-slot GTX470:
http://www.anandtech.com/show/3731/news-a-single-slot-gtx-470-from-galaxy

Might be of interest to a few people with many slots on their boards, and the techs!


http://www.evga.com/articles/00501/
ID: 17498
Beyond
Joined: 23 Nov 08
Posts: 1112
Credit: 6,162,416,256
RAC: 0
Message 17502 - Posted: 2 Jun 2010, 18:56:51 UTC

SK, that looks like a good board for four of those single-slot GTX 470 cards. It leaves good airflow space between them.

Here's Tom's take on the GTX 465:

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-465-fermi-gf100,2642.html
ID: 17502
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17503 - Posted: 2 Jun 2010, 20:42:25 UTC - in response to Message 17496.  
Last modified: 2 Jun 2010, 20:48:47 UTC

I meant that the rumours I read were right about the release date of the GTX465, so I'll give them the benefit of the doubt and assume they'll also be right about the "beginning of July" release of the next smallest Fermi architecture chip (they may call it GF104 or Miss Piggy, I don't really care ;)

Regarding your first link: that looks quite bad to me. They show 8 memory modules - they even write that it's 8 of them - yet they claim a 192-bit memory bus. The latter would require six of the usual 32-bit GDDR5 memory chips, or eight 24-bit GDDR5 chips - something I've never heard of.

Furthermore, they give the card 67% of the raw shader performance of the GTX480 but only 49% of the memory bandwidth, which would yield an unbalanced design (the GTX 480/470 don't have that much bandwidth to spare).
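
For anyone who wants to verify those percentages, here is a minimal sketch using the real GTX480 figures (480 shaders at 1401 MHz; 384-bit bus at 3696 MT/s) against the rumoured specs quoted above - the 336-shader numbers are the rumour under test, and 3600 MT/s is my reading of the quoted "1800 MHz" as the DDR figure:

```python
# Ratio check of the rumoured card against the real GTX480 (480 shaders
# at 1401 MHz, 384-bit bus at 3696 MT/s). The 336-shader figures are the
# rumour under test; 3600 MT/s reads the quoted "1800 MHz" as DDR.
def gflops_sp(shaders, shader_mhz):
    return shaders * shader_mhz * 2 / 1000  # 2 FLOPs/clock (FMA)

def bandwidth_gbs(bus_bits, mtps):
    return bus_bits / 8 * mtps / 1000  # bytes per transfer x transfers/s

gtx480 = (gflops_sp(480, 1401), bandwidth_gbs(384, 3696))
rumour = (gflops_sp(336, 1350), bandwidth_gbs(192, 3600))

print(f"shader ratio:    {rumour[0] / gtx480[0]:.0%}")  # ~67%
print(f"bandwidth ratio: {rumour[1] / gtx480[1]:.0%}")  # ~49%
```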

And 256-bit memory hasn't been that expensive in recent years. Plus this GF104 is supposed to be the fastest of three mainstream chips - which need to be smaller, cheaper and slower.

Why are their benchmark screenshots so small that you can't recognize anything? You only do this if you've got something to hide, don't you?

Actually, now that I've taken a closer look at the die shots, it appears to me the entire post is complete BS. Take a closer look: the "GTX460 die" is exactly the same, just squeezed vertically to half the original size. Even the dark line (some photographic / lighting artefact) from the bottom left to the upper right is similar.
However, GF100 is designed in blocks. NVidia can easily take these blocks, reduce their number and call it a day. Or a new GeForce. So what you'll see in a real chip is a smaller number of (at least very similar) blocks, not the same number of smaller blocks (which would be bad for numerous reasons). Plus they can't scale things like the video engine and each of the 64-bit memory controllers in size.

Enough said.. and I don't see much point in speculating on the remaining lineup before we know what the next card / chip will actually be ;)

MrS

BTW: my suggested GF104 top model would have 57% of the shader power and 58% of the memory bandwidth of the GTX480, at a modest 800 MHz memory clock.
Scanning for our furry friends since Jan 2002
ID: 17503
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17506 - Posted: 3 Jun 2010, 0:59:41 UTC - in response to Message 17503.  
Last modified: 3 Jun 2010, 1:24:34 UTC

I saw early July as well, but they may stagger the releases according to regional seasonal events (back to school); so it might appear in the US in early July, but not turn up in the UK until the end of July or early August.
I know the names don't really matter much, but NVidia like to keep using the same numbering systems (with a few twists). So in some respects the GTX480 is a reflection of the GTX280. We are missing the GTX460, 75, 85, 95 and the lower GTS cards for a full Fermi flush. Most of these should appear with the GF104.
Power usage is probably key for the lesser GF104 cards if they are to compete with ATI (especially in OEM systems). However, if they are open design, there could be some special GTX460 cards with performance matching that of a standard GTX465 while using about 50 W less.
You read too much into that first link. I did say it was probably incorrect and that 256 shaders is more likely, and I had earlier mentioned the Photoshopped image of a stretched GF100!
We will know the core numbers and frequencies soon enough. Not knowing does not take away from the fact that there will be a range of GF104 cards, which means more choice for GPUGrid crunchers, and hopefully more participants.

The single-slot Galaxy cards are almost there, but not quite (better than the liquid-cooled cards that are 2 mm too wide to be single slot). The Galaxy cards have no ability to exhaust heat via the back plate, so it just blows around the case!
To be able to use 7 of those single-slot GTX 470 Galaxy cards, you would need an extended server-style case with at least 2 large side extractor fans, probably opposing door intake fans (if it's a full tower case) and heavy-duty front intake fans. I would take the back plates off to let heat escape that way - not that I could afford 7 Fermis, that board, two 1500 W PSUs and the rest of what would be a £3000+ system.

PS. A dual GF100 would require two 8-pin power connectors, making it only usable with very expensive PSUs (750 W or over). At 295 or 300 W, a more reasonable 650 W PSU might suffice (see the connector arithmetic sketched below).
The GTX465 RAM runs at 802 MHz (×4) rather than the 837 MHz of the GTX470 - not that it makes much difference here.
The GTX465 is also an A3 revision, the same as my GTX470, so no changes there.
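
On the power-connector point above, here is a minimal sketch of the PCIe power-budget arithmetic (75 W from the x16 slot, 75 W per 6-pin plug, 150 W per 8-pin plug; 300 W was the ceiling the spec defined at the time):

```python
# PCIe board power budget: 75 W from the x16 slot, 75 W per 6-pin
# connector, 150 W per 8-pin connector. The PCIe spec of the day tops
# out at 300 W, which is why a 375 W dual card would be problematic.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_budget_w(six_pins=0, eight_pins=0):
    """Maximum board power for a given connector configuration."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget_w(six_pins=1, eight_pins=1))  # 300 W - fits a ~295 W dual card
print(board_budget_w(eight_pins=2))              # 375 W - beyond the 300 W spec
```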
ID: 17506
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17509 - Posted: 3 Jun 2010, 10:07:27 UTC - in response to Message 17506.  

Is there anything else of interest in this article besides the "probably made up specs" of GF104 / GTX460?

And I'm not sure a more open design would really improve things. NVidia still needs to come up with a good reference design, that's for sure - otherwise every partner would have to do the same work again. And partners can already choose alternative cooling solutions, clock speeds (factory OC) and voltages (e.g. Asus Matrix). And I'm not impressed by the custom PCB designs of the HD5870: despite the use of expensive high-quality circuitry, they mostly run at slightly higher clock speeds, but at the cost of substantially increased power consumption (Anand had a review of 3 recently). So I don't think there's much to be gained from such a move, apart from the obvious benefits of better / different coolers.

MrS
Scanning for our furry friends since Jan 2002
ID: 17509
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17511 - Posted: 3 Jun 2010, 10:22:13 UTC - in response to Message 17509.  
Last modified: 3 Jun 2010, 10:56:07 UTC

A more open design framework would create greater variety and thus more choice for the end user. Circuitry changes would allow different lengths and widths of card to fit more cases, different temperatures with new heatsink designs (e.g. vapour chamber coolers) for quieter environments or loud crunching farms, and of course different prices.
There may be green varieties that only require one 6-pin power connector (for lesser PSUs), and a range of top-end dual-card versions. These GF104 cards may even turn up in small form factor cases (media centre systems).
I see lots of excellent potential for innovation with open design, and I think there is a good chance more people will crunch.
If we are left stuck with NVidia's default GPU-in-a-shoebox design, none of this can happen!
ID: 17511
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17521 - Posted: 3 Jun 2010, 16:19:55 UTC - in response to Message 17511.  

Also consider that there are two sides to the coin: more freedom in actual card design also means more manufacturers trying to screw people by selling them inferior stuff. A reference design also stands for guaranteed quality and driver support.
I think the shoebox is flexible already. And isn't it you who posts these links about Fermi cards with non-reference coolers in this thread? It's been just 2 months since the cards became available!

MrS
Scanning for our furry friends since Jan 2002
ID: 17521
liveonc
Joined: 1 Jan 10
Posts: 292
Credit: 41,567,650
RAC: 0
Message 17522 - Posted: 3 Jun 2010, 17:04:16 UTC - in response to Message 17521.  

But just like Bush Jr.: "You fooled me once, shame on you; you fool me again, shame on me!"

Consumers aren't stupid, they'll know when they're getting cheated, eventually...

It's against free competition to forbid the sale of inferior products, as long as they don't cause bodily harm. But how many cheap-o-crap-o brands can get away with junk before it affects their pricing? If ASUS, for example, wants to shoot itself in the foot, it just needs to sell junk, again, & again, & again. Very quickly people will start calling them @SSUS.
ID: 17522
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17523 - Posted: 3 Jun 2010, 17:52:20 UTC - in response to Message 17522.  

But it's not exactly in nVidia's interest to let customers find out for themselves over time which design is good and which one isn't. The argument would go like: "Should I get nVidia or ATI?" "Well, if you want to take a crap shot at something unproven, go and choose one of the experimental xxx designs. If you want reliable quality, go ATI. On the other hand, I heard the Hawazuzi XXX1234XTXTXT Superpro Very Limited Edition Ultra Clocked Monster (TM) is not such a bad model, if you can find one manufactured in the first half of week 34."

Example: considering the power draw of Fermi cards, you really don't want them to use inferior capacitors to drive the price down by a couple of $...

MrS
Scanning for our furry friends since Jan 2002
ID: 17523
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17524 - Posted: 3 Jun 2010, 18:09:44 UTC - in response to Message 17522.  
Last modified: 3 Jun 2010, 18:36:52 UTC

I have no doubt that cheap inferior products will turn up anyway; they are called OEM systems!
If Galaxy, XFX, BFG, Palit, Gigabyte, Asus... sell you a Fermi with a 1-year warranty and your card fails after 13 months, hard luck. You should have bought a card with a 3-, 5- or 10-year, or lifetime, warranty!
If they sell cheap parts and the cards start failing after 6 months, then the manufacturer also has a problem: they have to repair the cards, and will lose face (and customers).
Nice to see some Green oldies, but one size does not fit all when we move on from G92.

Your argument - that we can't have a Green Fermi, a shorter card, a narrower card, a dual card or any other type of bespoke card because someone in China might build a cheap GPU - makes little sense. It is what OEMs do, so why should I not be able to buy a thinner version of a Fermi from a reputable dealer to put into my slimline case? NVidia could require a minimal reference design. You don't open a packet at both ends!

It is up to Europe to determine what is brought into the continent, and the same applies everywhere else. If we don't want cheap rubbish, legislate rather than blockade. If NVidia are worried about having their reputation tarnished, then it's up to them to set minimum standards.
Heatsink design is not GPU design - they don't make tyres in an engine factory.
Besides, if a manufacturer can alter the card by using a better heatsink and fan, then why limit the board dimensions or voltages?
Why can Asus use better capacitors to allow more OC room? Why not use smaller or fewer capacitors on a Green board that uses lower voltages?
It's not as if such cards would no longer fit a PCIe slot.
ID: 17524
liveonc
Joined: 1 Jan 10
Posts: 292
Credit: 41,567,650
RAC: 0
Message 17525 - Posted: 3 Jun 2010, 18:11:30 UTC - in response to Message 17523.  

But hearsay is how the net works. It's a jungle of knowledge & ignorance, spin & lies - just as consumers like it; otherwise it would be different. Nothing comes from nothing, & it's just a matter of time before a full tank in a good car runs out & the car stops, becomes useless, & needs bailing out.

Here in Denmark, retailers must guarantee a card for at least 2 years. If something is too cheap, it'll be a disaster of never-ending RMAs. Before I buy something, though, I like to google for reviews of the card I'm interested in. Sure, there can be spin & manipulation there too, but then the reputation of the site that reviews GPUs gets hurt, along with its credibility.

I'm just saying, let it loose! I lived in Indonesia, where there was once a choice of 20+ different flavours of Fanta; now there are 3-4. If there's a greater choice of Coca-Cola in the USA than in the EU, that's because there's a market; otherwise it'll sort itself out in time.
ID: 17525
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17529 - Posted: 5 Jun 2010, 18:22:32 UTC - in response to Message 17524.  

Your argument - that we can't have a Green Fermi, a shorter card, a narrower card, a dual card or any other type of bespoke card because someone in China might build a cheap GPU - makes little sense.


That's because this is not what I said - or not the point I wanted to make. The point is that nVidia needs a solid reference design, and anything in their official lineup has to guarantee a certain minimum level of performance and quality.

As to why you can't buy all the different flavours of Fermi you'd like to see today... well, I'd say it all comes down to money.

First, a very interesting option: a "Green Edition" Fermi. Why not? Let's consider what such a product could do and what it could not. The operating voltage of Fermi is already set quite close to what it needs to reach its current stock speeds, so simply lowering the voltage is not an option - there's just not enough room left. What's left is lowering clock speed and voltage together. This does save power and increase efficiency, but you also lose performance. So your Green Fermi will either (a) be slower or (b) need more hardware units enabled to make up for the loss in clock speed.
Graphics card prices are still determined by gaming performance, and I dare say the crunching / HPC market is still too small to change that. So with option (a), nVidia would have to sell chips at a lower price than if they were regular GF100s. That's not something they're going to do while they can still sell every functional chip. Option (b) is financially unattractive as well: assume you took a GTX480, reduced its clock speed to ~1 GHz and lowered the voltage a little more. You'd get about GTX470 performance at lower power consumption. However, you'd now be selling a chip that could have been sold as a GTX480 for the price of a GTX470, because it's slower. NVidia would lose profit this way as long as they can still sell all such chips as GTX480s instead (which is currently the case).
BTW: you can't cut the power consumption of a massive chip like Fermi arbitrarily by lowering clocks & voltage. The reason is subthreshold leakage: every transistor consumes a small amount of power regardless of whether it is switching or not. This gives you a certain baseline of minimum power consumption. At idle they can keep this in check by power-gating entire chip regions off the supply voltage, but under load you can't do that.
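
A toy model of why that floor matters - the constants are made up purely to illustrate the shape, not real Fermi numbers: dynamic power scales with V²·f, while the leakage term only falls with voltage:

```python
# Toy CMOS power model: P = C_eff * V^2 * f (switching) + V * I_leak (leakage).
# The constants are illustrative only - not measured Fermi values.
def chip_power_w(v, f_ghz, c_eff=70.0, i_leak_a=60.0):
    dynamic = c_eff * v**2 * f_ghz  # switching power, scales with V^2 * f
    static = v * i_leak_a           # subthreshold leakage floor
    return dynamic + static

full = chip_power_w(1.0, 1.4)  # ~158 W at stock-ish voltage and clock
slow = chip_power_w(0.9, 0.7)  # ~94 W at half the clock and reduced voltage
print(f"{full:.0f} W -> {slow:.0f} W: half the clock, only ~{1 - slow/full:.0%} saved")
```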

The dual card is quickly dealt with: they're experimenting with one and have landed at 430 W so far. Neither easy nor very attractive (there's no PCIe spec for that).

A shorter card might be nice, but PCBs already cost something, so it's actually in ATI's and nVidia's interest to keep their cards as short as possible before performance suffers. The current high-end ATIs use less power than Fermis, have fewer memory channels to route, and are longer. Yet I can't see any custom design (which are allowed) that "fixes" this. One would think it should be much easier to shave off some length there than with a Fermi. Why not? I can't say for sure, but routing the high-speed GDDR5 lines is certainly not trivial. If you limit the amount of space your engineers can work with, your signal quality will suffer, and hence the obtainable memory clocks. So you'd have to pay for additional development and probably sell the card at a slightly lower price, to a limited audience. This is not set in stone, but I can see solid reasons for a company not to do this, even if it were allowed.

And by narrower card you mean a low-profile one? If so, then how do you route 384 bits of memory traces in such a limited area? And the fan in the cooler would have to be much smaller, so cooling really suffers. These are probably the reasons why low-profile cards have traditionally been limited to 64- and 128-bit memory buses.

It is what OEMs do, so why should I not be able to buy a thinner version of a Fermi from a reputable dealer to put into my slimline case? NVidia could require a minimal reference design. You don't open a packet at both ends!

It is up to Europe to determine what is brought into the continent, and the same applies everywhere else. If we don't want cheap rubbish, legislate rather than blockade. If NVidia are worried about having their reputation tarnished, then it's up to them to set minimum standards.
Heatsink design is not GPU design - they don't make tyres in an engine factory.
Besides, if a manufacturer can alter the card by using a better heatsink and fan, then why limit the board dimensions or voltages?
Why can Asus use better capacitors to allow more OC room? Why not use smaller or fewer capacitors on a Green board that uses lower voltages?
It's not as if such cards would no longer fit a PCIe slot.

And to make one thing clear: I don't think the current Fermi offerings are the best we can and should get. I also want higher efficiency, more choices and better cooling. But I also think you're barking up the wrong tree here - the reason for the currently limited range of choices is not nVidia being stupid or evil, but rather them being profit-oriented, so they try to get the most money out of the chips they have.

MrS
Scanning for our furry friends since Jan 2002
ID: 17529
liveonc
Joined: 1 Jan 10
Posts: 292
Credit: 41,567,650
RAC: 0
Message 17530 - Posted: 5 Jun 2010, 19:06:36 UTC - in response to Message 17529.  

The question then is: what is green? Some ask what shade of green, but I'd like to ask what green is. Is green cheap? How can green be cheap if it's supposed to be an investment, something that makes up for the "added costs" of being green? If it's tomorrow that you save, then today you pay more. Efficient PSUs cost more than inefficient PSUs. So when a "green" GPU costs less than a "non-green" GPU, it makes me wonder. Just because it's slow doesn't make it green, & if the power savings mean more time spent on my side, then my CPU, HDD, RAM, & the rest of my PC aren't going to be "greener" just because my GPU is "slower".

Did they use recycled parts to make that "green" GPU? Is there a "green" funeral plan for my GPU when it dies? Do "green" GPUs go to Heaven?

So as far as I see it, a really "green" GPU is expensive, made of high-quality / high-efficiency "recycled" components that can in turn be "recycled" themselves, & will only live as long as it's best for me to have it around, because an old senior GPU isn't going to be very "green".
ID: 17530
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17531 - Posted: 5 Jun 2010, 21:26:49 UTC - in response to Message 17530.  

The question then is: what is green?


To me that's quite clear: a green edition uses less power and achieves higher power efficiency. It's got nothing to do with its price or the rest of the PC. If it's any different from other PCBs regarding recycling, then it's probably inferior. So I wouldn't want that, but in the end it's not us who define which card a manufacturer declares green. Currently they don't give a ***** about that ;)
And regarding high-efficiency components: they're already using them, at least at the high end. The reason is simple: both ATI's and nVidia's current high-end chips are power-constrained, i.e. the designs could take higher voltages & clocks and thus get more performance out of the same silicon (same production cost), but power consumption, heat & noise would rise painfully. And nVidia is being beaten up quite a bit over the high power consumption of Fermis. So if it were as simple as using $1 more expensive voltage converters, I'm almost sure they'd have done so already. The custom HD5870 designs, which presumably use even higher-quality components, all feature worse energy efficiency.

MrS
Scanning for our furry friends since Jan 2002
ID: 17531
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17533 - Posted: 6 Jun 2010, 16:43:54 UTC - in response to Message 17531.  
Last modified: 6 Jun 2010, 16:53:14 UTC

When I spoke of Green cards, I was only talking about the forthcoming GF104 flavours of Fermi, with potentially 256 shaders - not the GF100 monsters! The potential GTS440 could be a candidate.

PS. Galaxy made a dual GTX470 prototype,

http://hothardware.com/News/Computex-2010-Dual-GPU-GTX-470-Videocard-Spotted/
Potentially a good card for GPUGrid clustering: 896 shaders.

The 8-phase VRM says a lot!

ID: 17533
liveonc
Joined: 1 Jan 10
Posts: 292
Credit: 41,567,650
RAC: 0
Message 17534 - Posted: 6 Jun 2010, 17:28:41 UTC - in response to Message 17533.  
Last modified: 6 Jun 2010, 17:29:09 UTC

Looks like Galaxy didn't want to wait for NVidia, and it looks like they linked the GPUs with a Lucid Hydra alternative. If they use this, it just might work.
ID: 17534
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17535 - Posted: 6 Jun 2010, 19:55:15 UTC - in response to Message 17534.  
Last modified: 6 Jun 2010, 20:19:18 UTC

Several manufacturers are now producing motherboards with 7 PCI-E slots:


http://www.atomicmpc.com.au/Gallery/171976,gigabyte-x58a-ud9-motherboard.aspx
http://www.gigabyte.com.tw/products/product-page.aspx?pid=3434#dl

Now all we need are inexpensive, cool, low-power, slim-line (1 slot wide) GF104 Fermis to fill up the board with :p

PS. That dual GTX470 might actually be a reference design!
I'm guessing we will see a dual GTX460 before a dual GTX470; they may produce a dual GTX460 with 512 cores, as these would be able to use higher shader and RAM clocks first and would be NVidia's fastest card, before going for a dual GTX470 or even 480 when they move to GF102 - should they manage to reduce the heat & power somehow. But they will need to be quick! So don't disregard the possibility of a dual GTX460 within 6 weeks, or a dual GTX470 (which would outperform a reference 5970 in many ways) before the end of the year.
It is strange that there is so little confirmed info on the GTX460, given that it is about a month from release. One report I read said 240 shaders, and then confirmed it when questioned. Most say 256, but some even say 300-odd with redesigned ratios; they can't all be correct!
ID: 17535
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17539 - Posted: 6 Jun 2010, 21:54:41 UTC - in response to Message 17533.  

When I spoke of Green cards, I was only talking about the forthcoming GF104 flavours of Fermi


LOL! Well, then the discussion was a little pointless. I fully agree that GF104 (whatever it ends up being) is the chip to use for a power-consumption-optimized card. Production cost & quantity should be much less critical for this one, and cards with a single 6-pin PCIe connector should well be possible. However, the chip hasn't even been announced yet, so there's not much point discussing whether nVidia would forbid its partners green versions of it :p

And, yes, that dual-GPU monster is what I saw, together with mention of a 430 W TDP. No thanks - for me, anyway.

MrS
Scanning for our furry friends since Jan 2002
ID: 17539