GTX 590 coming?

Message boards : Graphics cards (GPUs) : GTX 590 coming?
Hypernova
Joined: 16 Nov 10 · Posts: 22 · Credit: 24,712,746 · RAC: 0
Message 20327 - Posted: 1 Feb 2011, 12:07:17 UTC
Last modified: 1 Feb 2011, 12:07:46 UTC

It seems we may get a new card in February: the GTX 590, which will be a dual-580 card. All cores (1024 in total) will be active, but the frequencies will be a little lower to cut consumption and heat. It will be interesting to see how they behave on GPUGrid.
Carlesa25
Joined: 13 Nov 10 · Posts: 328 · Credit: 72,619,453 · RAC: 0
Message 20329 - Posted: 1 Feb 2011, 13:19:38 UTC - in response to Message 20327.  

Hello: It will be very interesting, but bear in mind that it is not really 1024 cores: it is 16 + 16 SMs = 512 + 512 shaders, like the earlier GTX295 (240 + 240, which is the card I have), which lets it run TWO tasks at the same time.

I suppose that in practice it will double the performance, but the internal structure is very different, in particular the layout of the compute units. Best regards
skgiven (Volunteer moderator, Volunteer tester)
Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0
Message 20331 - Posted: 1 Feb 2011, 14:40:13 UTC - in response to Message 20329.  

It will allow some single PCIE slot users to basically have 2 GPUs, and dual slot users to have more than 2 GPUs. Good news. My guess is that they will be slightly more energy efficient than the GTX580 cards (performance per Watt). Hopefully they will drive the price of other cards down too; the GTX580 is still far too rich for many, and if the GTX570 is only the 3rd fastest Fermi then those prices may see an early fall too.
Hypernova
Joined: 16 Nov 10 · Posts: 22 · Credit: 24,712,746 · RAC: 0
Message 20332 - Posted: 1 Feb 2011, 21:23:28 UTC - in response to Message 20329.  

Hello: It will be very interesting, but bear in mind that it is not really 1024 cores: it is 16 + 16 SMs = 512 + 512 shaders, like the earlier GTX295 (240 + 240, which is the card I have), which lets it run TWO tasks at the same time.

I suppose that in practice it will double the performance, but the internal structure is very different, in particular the layout of the compute units. Best regards


You are right. You won't have 1024 cores available for a single task. It will be two GPUs with 512 cores each, each crunching a separate task.

The reason I mentioned 1024 cores was that, for power-consumption and thermal reasons, Nvidia had to choose between reducing the number of active cores per GPU and reducing the frequencies. In my opinion, keeping all cores active is the better variant for crunching.

For game playing, where framerates (fast cycle times) are paramount, keeping the frequencies high would probably have been the better choice.

But maybe I am wrong.
skgiven (Volunteer moderator, Volunteer tester)
Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0
Message 20351 - Posted: 6 Feb 2011, 15:21:41 UTC - in response to Message 20332.  

AMD will be showcasing Bulldozer and Antilles at CeBIT, which will be in Hannover from 1st to 5th March this year.

NVidia are now expected to release the GTX595 sometime after this; no doubt they want to test Antilles so that they can fine-tune their GTX595 to outperform it, at least in some promotional way.

So my guess is that a GTX595 will turn up by the middle of the year.

With high-end Sandy Bridge CPUs, Bulldozer, Antilles and a GTX595 en route, this will be a big year for big CPUs and very big GPUs.
ExtraTerrestrial Apes (Volunteer moderator, Volunteer tester)
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
Message 20488 - Posted: 21 Feb 2011, 9:40:35 UTC

For games and regarding shaders, it doesn't matter whether you increase shader count or frequency. The task is "embarrassingly parallel": even at just 1024x768 we've got about 0.8 million pixels per frame, and could to a first approximation make use of just as many shaders in parallel (this number goes down by a factor of 20 or so if you consider pipeline depth, but it's still safe). And regarding frequency: we need frames at rates in the Hz range, whereas GPU frequencies are in the MHz range.

There are some parts of the GPU, however, whose number is not reduced when shaders are cut. These will work faster in a card with fewer but higher-clocked shaders, as they get the higher frequencies too.

And regarding the GTX595 rumours: even a dual GTX570 exceeds the 300 W power wall set by the PCIe specification by quite a bit. And they're already down to ~0.95 V, so there's not much room left at the bottom for TSMC's 40 nm process. I could imagine two full GF110s at no more than 0.90 V and frequencies below the GTX570's, making the card about as fast as GTX570 SLI. It's going to be a powerful and interesting card, but don't expect miracles ;)
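The pixels-versus-shaders estimate above can be sketched in a few lines of Python (the factor of 20 is the post's rough allowance for pipeline depth, not a measured number):

```python
# Rough parallelism estimate for one game frame, per the reasoning above.
width, height = 1024, 768
pixels = width * height                      # independent pixels per frame
pipeline_factor = 20                         # rough allowance for pipeline depth
usable_parallelism = pixels // pipeline_factor
print(pixels, usable_parallelism)            # 786432 39321
```

Even after the pipeline-depth haircut, tens of thousands of work items remain available per frame, far more than any card's shader count, which is why shader count and shader frequency are interchangeable for this workload.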

MrS
Scanning for our furry friends since Jan 2002
skgiven (Volunteer moderator, Volunteer tester)
Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0
Message 20648 - Posted: 11 Mar 2011, 18:32:46 UTC - in response to Message 20488.  
Last modified: 11 Mar 2011, 18:59:24 UTC

The anticipated release date is the 22nd of March, according to several reports - only 11 days away.
It will be interesting to see what the performance is, as we have not seen a dual-Fermi of any kind. One thing's for sure: at the suggested entry price I won't be the first to fork out. Hopefully the other Fermis will start to be more reasonably priced; I fancy a GTX570, but only when the price is right.

I can't see NVidia releasing a sub-300W GF110 dual-Fermi; the Radeon HD 6990 uses 350W (or 450W if you flick the performance switch).
ExtraTerrestrial Apes (Volunteer moderator, Volunteer tester)
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
Message 20649 - Posted: 11 Mar 2011, 21:57:34 UTC - in response to Message 20648.  

Agreed - if they follow the HD6990 they can probably put out a decent dual GF110, i.e. without castrating it too much to stay within 300 W.

MrS
Scanning for our furry friends since Jan 2002
skgiven (Volunteer moderator, Volunteer tester)
Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0
Message 20683 - Posted: 17 Mar 2011, 17:59:31 UTC - in response to Message 20649.  
Last modified: 17 Mar 2011, 18:02:24 UTC

This one is funny.

Perhaps a bit closer to the real thing.
ExtraTerrestrial Apes (Volunteer moderator, Volunteer tester)
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
Message 20685 - Posted: 17 Mar 2011, 20:11:54 UTC

LOL!

MrS
Scanning for our furry friends since Jan 2002
alephnull
Joined: 8 Jul 09 · Posts: 13 · Credit: 306,850,267 · RAC: 0
Message 20691 - Posted: 18 Mar 2011, 4:07:34 UTC - in response to Message 20683.  

Hopefully these come out soon. I was waiting for them, but when they got delayed again my patience ran out and I just got the 580s. Anyone have guesstimates on what the 590s may go for? I feel a second mortgage coming in the near future...

Any speculation on the CPU usage for GPUGrid with these cards? Since they will be able to crunch 2 WUs each, will that mean 2 CPU cores per card?
skgiven (Volunteer moderator, Volunteer tester)
Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0
Message 20692 - Posted: 18 Mar 2011, 7:08:51 UTC - in response to Message 20691.  

You're about right on the cost - mortgage territory.

Yeah, with 2 GPUs you would want two CPU cores/threads kept free if you use swan_sync (which you would do, of course).
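As a sketch of that setup on Linux (hedged: SWAN_SYNC is the environment variable GPUGrid's ACEMD application checks, 0 was the commonly advised value at the time, and the core count here simply follows this thread's dual-GPU case):

```shell
# Sketch only: enable busy-wait synchronisation for GPUGrid's ACEMD app.
# SWAN_SYNC=0 dedicates a full CPU core/thread to each running GPU task.
export SWAN_SYNC=0

# A GTX 590 shows up as two CUDA devices, so plan for two CPU threads
# being consumed by GPU tasks when both halves are crunching.
GPUS=2
echo "SWAN_SYNC=$SWAN_SYNC: reserve $GPUS CPU threads for $GPUS GPU tasks"
```

In practice that means reducing the CPU tasks BOINC runs by the same number of threads, so the GPU feeder cores are never starved.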
skgiven (Volunteer moderator, Volunteer tester)
Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0
Message 20705 - Posted: 18 Mar 2011, 18:54:55 UTC - in response to Message 20692.  
Last modified: 23 Mar 2011, 14:15:37 UTC

Hexus suggests a very reasonable TDP of 365W. This might mean the 622MHz I read elsewhere (607MHz in another report) is real, and it casts some doubt on whether it will outperform a 6990. That said, I still think it will be 50% faster than a single GTX580.
ExtraTerrestrial Apes (Volunteer moderator, Volunteer tester)
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
Message 20709 - Posted: 18 Mar 2011, 21:50:44 UTC

Well, if you'd need a mortgage to afford one.. you probably shouldn't ;)

Anyway, I fail to be impressed by these cards (rumored GTX590 and HD6990). In my opinion it's pushing single cards too far, especially for serious 24/7 crunching. I'd rather see more flexible use of more cards of the GTX580 caliber (card arrangement & cases, PCIe slots & their spacing).

MrS
Scanning for our furry friends since Jan 2002
Carlesa25
Joined: 13 Nov 10 · Posts: 328 · Credit: 72,619,453 · RAC: 0
Message 20711 - Posted: 18 Mar 2011, 23:38:51 UTC - in response to Message 20709.  

Hi: My experience with the GTX295 has been very good (on Linux and Windows): better than its SLI equivalent, with lower consumption, easy installation and quite broad overclocking headroom; it also makes mounting 4 GPUs feasible, which is no nonsense.
When I change it will almost certainly be to a GTX590 (if I can get a loan...). Greetings.
Jeremy
Joined: 15 Feb 09 · Posts: 55 · Credit: 3,542,733 · RAC: 0
Message 20715 - Posted: 19 Mar 2011, 3:37:41 UTC
Last modified: 19 Mar 2011, 3:38:50 UTC

Dual-GPU single-card configurations have certain advantages. The ability to install in a single-PCIe-slot motherboard is one, but another (the one that interests me, frankly) is water cooling. A single water-cooling block is typically about $110-$120 or so. With proper cooling, there's little reason a dual-GF110 card won't be able to achieve full GTX580 clock speeds, if your power supply can handle it. Once you factor in the cost of the waterblock(s), it might even be less expensive than two 570s/580s.

Time will tell, but I'm interested to see what nVidia actually brings to the table.
C2Q, GTX 660ti
skgiven (Volunteer moderator, Volunteer tester)
Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0
Message 20718 - Posted: 19 Mar 2011, 15:21:58 UTC - in response to Message 20715.  
Last modified: 24 Mar 2011, 13:08:33 UTC

Most people who buy these cards will probably remove the heatsink and fans and go straight to water cooling. If its TDP is only 365W and the user has a PCIE2 slot, then the system can supply up to 450W to the card - plenty of headroom to overclock to at least the GTX580 reference clock of 772MHz, probably more, seeing as these will be the sweetest of cores.
My concern would be a potential lack of capacitors on the card (to save on power). I hope NVidia did not scrimp in this area, but you never know. I doubt it, but if there is a switch such as the 6990's, then perhaps it could be a 365W/440W card.
I also read a suggestion that only 1000 of these would be made available at the outset. A very low number, suggesting a very high price tag, but I think in the long run many more will be released.
skgiven (Volunteer moderator, Volunteer tester)
Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0
Message 20778 - Posted: 24 Mar 2011, 12:52:28 UTC - in response to Message 20718.  
Last modified: 27 Mar 2011, 21:56:03 UTC

Ref specs:

512 shaders per GPU (1024 total)
Memory: 3072 MB GDDR5 (1536 MB per GPU)
Core Clock: 607 MHz
Shader Clock: 1215 MHz
Memory Clock: 3414 MHz
Memory Interface: 768 bit (384 bit per GPU)
DirectCompute 5.0 and OpenCL support
Microsoft DirectX 11 support
NVIDIA PhysX-ready
Quad SLI ready
Three dual-link DVI + mini-DisplayPort connectors
Power consumption: 365W

It's GF110 (Rev 1A), so it should work here straight out of the box (using the 267.71 driver). Of course ASUS already have one listed at 612 MHz, so expect some variation. GPUz image
Prices from £550 ($700~ish, 600 Euro)

Review by Ryan Smith at AnandTech.
Going by this Folding graph it should outperform a GTX580 by around 55% for crunching here or at Folding, and if OC'd might get to about 190% of a ref GTX580.
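As a rough sanity check on that ~55% figure, the peak single-precision throughput can be worked out from the spec list (assuming Fermi's 2 floating-point ops per shader per clock and the GTX 580's 1544 MHz reference shader clock, neither of which is stated in this thread):

```python
# Peak single-precision GFLOPS from shader count and shader clock.
def peak_gflops(shaders, shader_mhz, ops_per_clock=2):
    return shaders * shader_mhz * ops_per_clock / 1000.0

gtx590 = peak_gflops(1024, 1215)   # dual GF110 at reduced clocks
gtx580 = peak_gflops(512, 1544)    # reference GTX 580 for comparison
print(round(gtx590, 1), round(gtx580, 1), round(gtx590 / gtx580, 2))
# 2488.3 1581.1 1.57
```

A theoretical ~1.57x over a reference GTX 580 lines up well with the ~55% crunching advantage the Folding graph suggests.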
liveonc
Joined: 1 Jan 10 · Posts: 292 · Credit: 41,567,650 · RAC: 0
Message 20780 - Posted: 24 Mar 2011, 20:28:14 UTC

There's also this review from benchmarkreviews.com

At first it seemed like pulling rabbits out of a hat, sold as a "miracle", since the specs were (copied from benchmarkreviews):

Graphics Clock: 607 MHz
Processor Clock: 1215 MHz
Memory Clock: 854/3414 MHz
Thermal Design Power: 365 Watts

But even though the benchmarks point to roughly 1.5x the performance of a single GTX 580: the TDP of a GTX 580 SLI is 492 Watts (2 x 246), and 1.5x one GTX 580's TDP is 369 Watts - almost exactly the GTX 590's 365 Watts.

So where is this "miracle"???
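The efficiency arithmetic in that post can be laid out explicitly (a sketch using the thread's quoted figures; the TDPs are board ratings and the 1.5x speedup is a benchmark estimate, not measurements):

```python
gtx580_tdp = 246.0          # single GTX 580 TDP, watts
sli_tdp = 2 * gtx580_tdp    # GTX 580 SLI: 492 W
gtx590_tdp = 365.0          # quoted GTX 590 TDP
speedup_vs_one_580 = 1.5    # benchmark estimate from the thread

# Watts needed for 1.5x a GTX 580's performance at GTX 580 efficiency:
equivalent_tdp = speedup_vs_one_580 * gtx580_tdp
print(sli_tdp, equivalent_tdp, round(gtx590_tdp / speedup_vs_one_580, 1))
# 492.0 369.0 243.3
```

So per GTX 580's worth of performance, the 590 draws about 243 W against the 580's 246 W: essentially the same performance per watt, which is the poster's point that there is no "miracle".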
Zydor
Joined: 8 Feb 09 · Posts: 252 · Credit: 1,309,451 · RAC: 0
Message 20787 - Posted: 26 Mar 2011, 23:06:35 UTC - in response to Message 20780.  

There is a major factor lurking outside the marketing/fanboy hype of both the 590 and 6990, and those considering a long-term purchase of high-end cards may well like to consider it if they don't already know. Both the 590 and 6990 are cobbled-together designs, patching over the cracks left when 32nm fabrication was canned; they had no choice but to release new designs on 40nm. AMD were in a better position to slip the Northern Islands design to 40nm; NVidia had a bigger hassle, as it was still getting over the Fermi debacle. The result is of course two fast cards, but both are hobbled by the fact that they had to go to 40nm.

Both AMD and NVidia are due to release 28nm designs this year (allegedly late second quarter, which probably means by Xmas in reality). The designs are already proven and not vaporware. In September last year NVidia claimed they would release the Kepler card - 28nm - by the end of the second quarter this year, but given NVidia are always late on marketing promises, Xmas is probably it. AMD will be ready to go by then with 28nm Northern Islands, as they usually do a product refresh in the mid-to-late fourth quarter.

The import of this is a step change in performance of 3-6 times current levels (NVidia even claim Maxwell, due in 2013 on 28nm, will be 15x current levels). Normally I would personally go for the cards that meet the need now; there is always something better around the corner, and you could end up waiting forever. In this instance, however, the releases later this year come with a massive step change in performance. That would also explain NVidia's claim of only releasing 1000 590s.

Both companies were caught out by the 32nm fabrication failure; AMD arguably came out better on balance in that scramble over the last 2-3 years, but the real change, where ears need to prick up, is the 28nm fabrication process. The latter will put both AMD and NVidia on a level playing field for the first time in three years, and will deliver massive performance increases along with reduced power and heat.

If individuals need a high-end card now, that's life, but if the need is not critical, waiting to see how the 28nm options develop makes sense. There is lots of hype flying around re the 590/6990 at present, all driven by the respective marketing departments - but that's the wrong war. 40nm is dead now; the 6990 and 590 are the last of the 40nm line, cobbled-together designs that were originally intended for 32nm. The real war starts again at 28nm, and the first battle is about to start in a few months. So unless the purchase of a high-end card is urgent, there is, for once, a real case for waiting until the 28nm picture is clarified.

Horses for courses, we all have our individual needs and drivers, but if individuals were not fully aware of the 40/32/28nm fabrication saga, it's worth pausing to see what it means for them - 28nm is only a few months away....

Regards
Zy

©2025 Universitat Pompeu Fabra