Fermi

Message boards : Graphics cards (GPUs) : Fermi

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 19409 - Posted: 10 Nov 2010, 11:32:37 UTC - in response to Message 19116.  
Last modified: 10 Nov 2010, 11:34:22 UTC

NVidia seems to have released the GTX580 out of the blue, to compete against the top ATI cards about to be released. This sudden change of direction may have come at the expense of other planned releases. To my dismay I read a suggestion that the long-awaited GTX475 is not coming in the predicted form or timeframe. Instead it is to be renamed the GTX560 and, according to blogspot, it will be GF114 based rather than GF104.

If this is correct it would mean no GTX475, and waiting on a GTX560. Let's hope they are wrong and NVidia releases both, but if not, some manufacturers may still want to push out a few golden samples of GTX460s with full complements of cores and shaders. A pair of such cards would just about match a GTX580, and one might match a GTX560; something you might think NVidia would not be too keen on, but if there is a limited supply of GTX580's they might, and many retailers want you to pre-order the GTX580.
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 19436 - Posted: 11 Nov 2010, 21:47:00 UTC

Difficult to say, but I have a hard time imagining nVidia rushing to replace GF104. It's not broken in the way GF100 was/is.

MrS
Scanning for our furry friends since Jan 2002
skgiven
Message 19513 - Posted: 16 Nov 2010, 19:02:48 UTC - in response to Message 19436.  

Sounds like a GTX570 is on its confusing way:

It is reported to be turning up with 480 shaders, so it's really a GF110 version of a GTX480, with a better cooler.

http://www.fudzilla.com/graphics/item/20814-geforce-gtx-570-comes-soon
Fred J. Verster
Joined: 1 Apr 09
Posts: 58
Credit: 35,833,978
RAC: 0
Message 19516 - Posted: 16 Nov 2010, 21:47:18 UTC - in response to Message 19513.  
Last modified: 16 Nov 2010, 21:48:02 UTC

It's about time NVidia got clear about what it's doing, or not.
This is all very confusing, and it looks like an attempt to keep ahead of ATI and the 6000-series GPUs; they have to make up their minds :(

Knight Who Says Ni N!
skgiven
Message 19518 - Posted: 16 Nov 2010, 23:11:13 UTC - in response to Message 19516.  

Indeed, that's their game.

Not sure how their GeForce GTX 460 SE fits into the picture; maybe it's just a case of selling the chaff too. Just so people know, it only has 288 shaders and is relatively underclocked compared to the GTX 460. As its additional ROP is disabled, it is no cheaper to run than a 768MB GTX460. Make your own mind up what the S stands for.

Still, for crunching here people are better off with a 448 CUDA core GTX470 for £189 than any GTX460, and when the GTX570 turns up it should outperform the GTX480.


ExtraTerrestrial Apes
Message 19532 - Posted: 17 Nov 2010, 20:11:02 UTC

Well, nVidia decided that a small bump in performance and a somewhat larger bump in efficiency is worth a new generation tag. If you accept this, then the rumoured GTX570 neither adds confusion nor will it be a bad card. It's going to perform more slowly than a GTX580, probably by about the amount one would expect from the numbering, and be cheaper. And it should outperform the GTX480.

Personally I don't see a problem with this, apart from the whole "5 series = 4 series" issue. But given it's nVidia, I'm actually glad they don't mix DX10 chips into the 4 and 5 series.

MrS
Scanning for our furry friends since Jan 2002
skgiven
Message 19539 - Posted: 18 Nov 2010, 0:36:07 UTC - in response to Message 19532.  
Last modified: 18 Nov 2010, 6:46:29 UTC

The GTX 560 will be GF114 based, have 8 ROPs, 384 shaders (48 per multiprocessor), higher clocks (725MHz core, 1450MHz shader) and faster RAM. Basically it will be the much-rumoured GTX475, but cooler and faster. Still, the GTX580 and GTX570 will do more work.
GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Message 19550 - Posted: 19 Nov 2010, 9:00:40 UTC - in response to Message 19539.  

Yes,
remember that GPUs based on 48 cores per multiprocessor, like the 460, do not perform well. Most likely the most cost-effective GPU for running GPUGRID will be the GTX570.

gdf
Hypernova
Joined: 16 Nov 10
Posts: 22
Credit: 24,712,746
RAC: 0
Message 19551 - Posted: 19 Nov 2010, 9:25:30 UTC - in response to Message 19550.  

Excuse my ignorance, but suppose I put two Fermi boards in my machine without any SLI bridge, just plugging them into their respective PCI-Express slots as two independent boards (no gaming use, just crunching). Will they then be seen as two GPUs, so that I get 4 WUs at a time on my machine (instead of 2 with one GPU)? Is this correct?

The CPU share is displayed as 0.30, 0.20, etc. But if I have a multi-core CPU like the 980X (hexacore, 12 threads), does the GPU really need 30%, which is about 2 full cores, to manage one board? And if I put in two boards, do I monopolise 4 physical cores? That would be overkill. So what is the reality of the CPU share?
skgiven
Message 19557 - Posted: 19 Nov 2010, 19:28:28 UTC - in response to Message 19551.  
Last modified: 19 Nov 2010, 20:16:25 UTC

You will get 4 tasks, because at GPUGrid you can only have 2 per card. Note that only one runs at a time on each card; the other sits in the queue. You don't need an SLI bridge (in fact, using one is generally not recommended).

The GPU requires varying amounts of CPU depending on the app, the task running and your system, in particular your CPU.

Generally, for a Fermi it is recommended to free up one CPU thread per GPU in BOINC Manager and to use swan_sync=0. So for your W7 x64 system with 12 threads this would mean giving up 2 threads. If you don't do this, your would-be dual-Fermi setup would run about 40% less efficiently. As you are new to GPUGrid, I would recommend sticking to this setup for now.

For a GTX260 this is not the situation; you don't need to free up a CPU core or use swan_sync=0, and usage is about 0.3 threads/cores per GPU. In practice even that overstates it: you have top-end CPUs, so you will use less than 0.10 CPU threads per GPU.
vaio
Joined: 6 Nov 09
Posts: 20
Credit: 10,781,505
RAC: 0
Message 19560 - Posted: 19 Nov 2010, 21:55:51 UTC - in response to Message 19557.  
Last modified: 19 Nov 2010, 21:57:10 UTC

Another naive cruncher chipping in.

From reading the above am I correct in thinking I could run 2 x GTX-460's independently without using SLI?

Does that mean any board with dual pci-e slots will do and doesn't need to be an SLI certified board?

Also, would a thread from an X4 620 or i3 530 be adequate?

As you can see I know little.....I just like to crunch!
ExtraTerrestrial Apes
Message 19561 - Posted: 19 Nov 2010, 22:09:58 UTC - in response to Message 19560.  

From reading the above am I correct in thinking I could run 2 x GTX-460's independently without using SLI?


Yes. And it's actually the other way around: you couldn't make them work together on one WU, even if you wanted to. SLI is meant for games, computing works in a very different way.

Does that mean any board with dual pci-e slots will do and doesn't need to be an SLI certified board?


Yes. Your main problem will likely be supplying enough power to those beasts and cooling them. A PCIe slot which mechanically provides 16 lanes but "only" 4 electrically connected lanes is fine for GPU-Grid, even at PCIe 1 (half the bandwidth of PCIe 2). Avoid 1x slots, though.

Also, would a thread from an X4 620 or i3 530 be adequate?


Yes.

MrS
Scanning for our furry friends since Jan 2002
vaio
Message 19562 - Posted: 19 Nov 2010, 22:27:05 UTC - in response to Message 19561.  

Thank you ETA.

Looks like I need to add another Corsair 850 to my shopping wishlist.
skgiven
Message 19565 - Posted: 20 Nov 2010, 1:01:12 UTC - in response to Message 19562.  
Last modified: 29 Nov 2010, 17:15:45 UTC

With the release of the GTX580 and the planned release of the GTX570 (7th Dec), we are likely to see further price drops for the GTX480, GTX470 and GTX465. People should be aware that for GPUGrid the GTX470 will outperform the planned GTX560. So when a GTX560 does turn up it should work here, but not nearly as well as a GTX470, never mind a GTX570 or GTX580; a GTX460 gets about half the credits of a GTX480.

The GTX 570 has a reported reference frequency of 732MHz, about 5% higher than the 700MHz reference of a GTX 480. However, the GTX570 will use less power (225W vs 250W), will be quieter, and with a better-designed cooler it should clock better. Its release price will also be less than the release price of the GTX480, and this will no doubt fall in time.

I remember seeing a dual-GTX470 prototype, and I think it is likely this will emerge in some form as a dual GTX570 (a GTX595 perhaps) next year, to challenge the big dual-GPU ATI cards of course. I have heard no mention of a GTX565, but I expect these could turn up in OEM form, with a further core disabled.

Expect the GTX480, GTX 470 and GTX465 to be phased out over the next six months.
Hypernova
Message 19570 - Posted: 20 Nov 2010, 12:36:16 UTC - in response to Message 19557.  
Last modified: 20 Nov 2010, 12:36:49 UTC

Thanks skgiven. All is clear now. I have on order two Asus Fermi GTX580 boards and am anxious to see how they will perform. I have one machine with a Corsair 1000 Watt 80 Plus Silver rated (over 90% efficiency) PSU and an Asus Rampage Extreme III board that should be able to handle two 580 boards very well. The CPU is a 980X with water cooling, with 6GB of Corsair 2000MHz Dominator DDR3 memory. The disk is a Crucial C300 SATA3 256GB SSD. It will be my most powerful machine to date.
I will post when it is up and running.
Beyond
Joined: 23 Nov 08
Posts: 1112
Credit: 6,162,416,256
RAC: 0
Message 19573 - Posted: 20 Nov 2010, 18:46:45 UTC - in response to Message 19560.  

Also, would a thread from an X4 620 or i3 530 be adequate?

An Athlon X4 620 would easily be more than enough and would still have power to drive CPU projects. I have an X4 620 in a machine with a GTX 460 and an HD 5770 (both heavily OCed) running 2 GPU projects at 99% and 4 CPU projects at full speed; the system uses ~375 watts. It's been running on an Antec Earthwatts 500 watt PSU with no problems at all.

skgiven
Message 19713 - Posted: 29 Nov 2010, 19:01:34 UTC - in response to Message 19573.  
Last modified: 29 Nov 2010, 19:05:28 UTC

Details of the GTX570 have been released/leaked:
    480 CUDA cores
    GPU clock @ 732MHz
    Shaders @ 1464MHz
    320-bit memory interface
    1280MB GDDR5
    RAM @ 3800MHz
    TDP of 225W

So it’s about 4.6% faster than a GTX480 (by clocks alone), and uses about 10% less power. It should also be cooler and less noisy.
Just for comparison, it should be (732/607) × (15/14) = 1.29 times the speed of a stock GTX470, i.e. 29.2% faster, at the expense of an additional 4.6% power usage.
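The comparison above can be sketched in a few lines of Python. The clock and multiprocessor figures are taken from the post, and the clock-times-multiprocessor scaling is the rough assumption the post itself uses, not a measured result:

```python
# Rough scaling sketch: assume crunching speed ~ shader clock x active multiprocessors.
# Figures are from the post above (GF100/GF110 reference cards); not benchmarks.
gtx480 = {"clock": 700, "sms": 15}  # 480 shaders
gtx470 = {"clock": 607, "sms": 14}  # 448 shaders
gtx570 = {"clock": 732, "sms": 15}  # 480 shaders (rumoured)

def speedup(a, b):
    """Estimated speed of card a relative to card b."""
    return (a["clock"] / b["clock"]) * (a["sms"] / b["sms"])

print(f"GTX570 vs GTX480: {speedup(gtx570, gtx480) - 1:+.1%}")  # +4.6%
print(f"GTX570 vs GTX470: {speedup(gtx570, gtx470) - 1:+.1%}")  # +29.2%
```

This reproduces the 4.6% and 29.2% figures quoted in the post.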

The release date is supposed to be the 7th Dec. While this suggests there will not be a market flood (tomorrow is the peak e-sales day in the UK, not next week), NVidia also wanted to get the card out before the 6950 and 6970, due out on the 13th Dec.

As for the price, it should drop in between a GTX580 and a GTX480 (lowest card prices from one UK site):

    GTX580 £380
    GTX570 £3??
    GTX480 £320
    GTX470 £180
    GTX465 £170

From the above list of present retail prices it would be fair to say the GTX570 will be released for around £350, but the GTX480's price will likely fall somewhat during the next week or so, bringing the GTX570 down to around £320.

The GTX470 is clearly still the best deal in terms of crunching power per £, and this is likely to remain the case until the GTX570 drops below £250, which might be 6 months from now, going by the GTX470's price drops. However, I would not write off a GTX565 either; given their track record I can't see NVidia binning 14 good ROPs, and if the GTX470 is to be phased out there would be a gap in the market between the GTX570 and a GTX560. This could be filled by a GTX565 with 448 shaders. The project could do with a good replacement card for under £200.

While a GTX595 is unlikely to appear until well into next year, some manufacturers are building single-slot-width GTX580s.
In theory, someone (with a spare £5K) could fill all 7 PCIe slots of the best motherboard and have a system which would get close to 700K credits per day. Or, put another way, if one task were run in parallel over 7 cards, reduce the time per step to under 0.5ms on a single computer.
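The back-of-envelope numbers in that thought experiment can be checked quickly. The per-card credit rate and single-card step time below are assumptions implied by the post's 700K/day and 0.5ms figures, not measured values:

```python
# Hypothetical 7-card scaling check; both figures below are implied by the
# post's totals (700K credits/day, 0.5 ms/step), not measurements.
cards = 7
total_credits_per_day = 700_000
per_card_credits = total_credits_per_day / cards  # ~100K credits/day per GTX580
single_card_step_ms = 3.5  # assumed single-card step time, so 3.5 / 7 = 0.5 ms
print(per_card_credits)             # 100000.0
print(single_card_step_ms / cards)  # 0.5
```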

ExtraTerrestrial Apes
Message 19714 - Posted: 29 Nov 2010, 21:11:54 UTC - in response to Message 19713.  

That GTX570 looks nice, and there's not much reason to doubt these specs. And remember that the GTX480 tended to draw way more than its 250W TDP suggested, so the power consumption advantage of the GTX570 over the GTX480 may be even higher.

And I would also expect a further cut-down version, though I won't speculate on specs. Alternatively, they could use many of the partly-defective GF110 GPUs in Teslas and Quadros (they sell because of software anyway) and in the mobile space, where the demand for "unreasonably power-hungry GPUs" appears to be surprisingly high.

MrS
Scanning for our furry friends since Jan 2002
skgiven
Message 19791 - Posted: 7 Dec 2010, 11:49:55 UTC - in response to Message 19714.  
Last modified: 8 Dec 2010, 20:25:32 UTC

Just checked some UK prices for the GTX570; I was surprised to see one on offer for £260. That's a good price, and in terms of performance per monetary unit only 10% shy of a GTX470 for £180.
Also of note is the SuperClocked EVGA GTX 570 @ 797MHz (slightly higher than a GTX580 @ 772MHz), available for £300. I would expect the performance of that card to be within 5% of a reference GTX580, and for £80 less.
To put these prices in perspective, I picked up my first GTX470 for £320, and that EVGA is 40% faster; in terms of original price per performance, that's about 50% better. A decent improvement in less than a year.
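A quick sanity check of that price-per-performance claim, using only figures quoted in the post (the 40% speed difference is the post's own estimate, not a benchmark):

```python
# Price/performance comparison from the post's own figures (assumptions, not benchmarks).
gtx470_launch_price = 320.0  # GBP, poster's original GTX470
evga_570_price = 300.0       # GBP, SuperClocked EVGA GTX570
relative_speed = 1.40        # EVGA card ~40% faster, per the post

gain = relative_speed * gtx470_launch_price / evga_570_price - 1
print(f"performance per pound: {gain:+.0%}")  # +49%
```

That 49% is the figure the post rounds to "about 50%".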

©2025 Universitat Pompeu Fabra