Tests on GTX680 will start early next week [testing has started]

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 24673 - Posted: 1 May 2012, 21:43:43 UTC - in response to Message 24672.  
Last modified: 1 May 2012, 21:44:27 UTC

I wish the latest Intel processors were only 50% faster!

If it's faster here then it's a good card for here.
For each GPU project, different cards perform differently. AMD chose to keep their excellent level of FP64 in their top (enthusiast) cards (HD 7970 and 7950), but dropped FP64 to really poor levels in their mid-range and budget cards (HD 7870, 7850, 7770 and 7750; all 1/16th).

It's not actually a new thing from NVidia; the CC2.1 cards reduced their FP64 compared to the CC2.0 cards (trimmed the fat), making for relatively good & affordable gaming cards, and they were popular.
I consider the GTX680 more of an update to these CC2.1 cards than to the CC2.0 cards. We know there will be a full-fat card along at some stage. It made sense to concentrate on the gaming cards - that's where the money's at. Also, NVidia have some catching up to do in order to compete with AMD's big FP64 cards.
NVidia's strategy is working well.

By the way, the GTX690 offers excellent performance per Watt compared to the GTX680, which offers great performance to begin with. The GTX690 should be ~18% more efficient.
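As a rough sanity check on that ~18% figure, here is a minimal sketch in Python, assuming the GTX690 delivers about 90% of the throughput of two GTX680s (board TDPs 300W and 195W; the 90% figure is an assumption, not a measurement):

    # Hypothetical perf-per-Watt comparison, GTX 690 vs GTX 680.
    # Assumption: 690 ~= 0.9 x (2 x 680) throughput; TDPs 300 W and 195 W.
    gtx680_perf, gtx680_tdp = 1.0, 195.0
    gtx690_perf, gtx690_tdp = 2 * 0.9 * gtx680_perf, 300.0

    eff_680 = gtx680_perf / gtx680_tdp
    eff_690 = gtx690_perf / gtx690_tdp
    print(f"efficiency gain: {eff_690 / eff_680 - 1:.0%}")  # -> 17%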
FAQs

HOW TO:
- Opt out of Beta Tests
- Ask for Help
]{LiK`Rangers`
Joined: 5 Jan 12
Posts: 117
Credit: 77,256,014
RAC: 0
Message 24675 - Posted: 1 May 2012, 23:05:54 UTC - in response to Message 24673.  

Well, a 50% increase in compute speed sounds good to me, especially since NVidia had (not sure if they still do) a 620 driver link on their site, as someone here noted. But if it comes down to it, I guess a new 570 probably won't be a bad deal.
robertmiles
Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 24677 - Posted: 2 May 2012, 1:14:48 UTC - in response to Message 24673.  
Last modified: 2 May 2012, 1:17:03 UTC

> I wish the latest Intel processors were only 50% faster!
>
> If it's faster here then it's a good card for here.
> For each GPU project, different cards perform differently. AMD chose to keep their excellent level of FP64 in their top (enthusiast) cards (HD 7970 and 7950), but dropped FP64 to really poor levels in their mid-range and budget cards (HD 7870, 7850, 7770 and 7750; all 1/16th).
>
> It's not actually a new thing from NVidia; the CC2.1 cards reduced their FP64 compared to the CC2.0 cards (trimmed the fat), making for relatively good & affordable gaming cards, and they were popular.
> I consider the GTX680 more of an update to these CC2.1 cards than to the CC2.0 cards. We know there will be a full-fat card along at some stage. It made sense to concentrate on the gaming cards - that's where the money's at. Also, NVidia have some catching up to do in order to compete with AMD's big FP64 cards.
> NVidia's strategy is working well.
>
> By the way, the GTX690 offers excellent performance per Watt compared to the GTX680, which offers great performance to begin with. The GTX690 should be ~18% more efficient.


Unfortunately, both Nvidia and AMD are now locking out reasonable BOINC upgrades for users like me who are limited by how much extra heating the computer room can stand, and therefore cannot handle the power requirements of any of the new high-end cards.
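To put a number on the heating: essentially every Watt a card draws ends up as heat in the room. A minimal sketch of the conversion (the example draws are figures discussed in this thread; 1 W = 3.412 BTU/h is the standard conversion):

    # Convert a card's power draw into room heat load.
    # Essentially all electrical power drawn becomes heat; 1 W = 3.412 BTU/h.
    def heat_btu_per_hour(watts):
        return watts * 3.412

    for draw in (112, 130, 195):  # example in-task draws (W)
        print(f"{draw} W -> ~{heat_btu_per_hour(draw):.0f} BTU/h of heating")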
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 24678 - Posted: 2 May 2012, 3:36:51 UTC

I posted a question on NVIDIA's forums regarding GPU Boost, the high voltage given to the card (1.175 V) and my concerns about this running 24/7, asking (pleading) that we be allowed to turn Boost off.

An admin's response:

> Hi 5pot,
>
> I can understand about being concerned for the wellbeing of your hardware, but in this case it is unwarranted. :) Previous GPUs used fixed clocks and voltages and these were fully guaranteed and warrantied. GPU Boost has the same guarantee and warranty, to the terms of your GPU manufacturer's warranty. :thumbup: The graphics clock speed and voltage set by GPU Boost is determined by real-time monitoring of the GPU core and it won't create a situation that is harmful for your GPU.
>
> Amorphous@NVIDIA

Figured I would share this information with everyone else here.
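For anyone who wants to watch what Boost actually does during a crunching session, nvidia-smi can log the relevant sensors. A minimal polling sketch in Python (power.draw is only reported on boards with power sensors, so treat that field as an assumption for your particular card):

    import subprocess
    import time

    # Fields nvidia-smi can report; power.draw may be unsupported on some boards.
    QUERY = "clocks.sm,temperature.gpu,power.draw"

    while True:
        result = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
            capture_output=True, text=True,
        )
        print(result.stdout.strip())  # e.g. "1058 MHz, 72, 165.30 W"
        time.sleep(5)  # poll every 5 seconds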
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 24686 - Posted: 3 May 2012, 13:20:30 UTC - in response to Message 24677.  

Hi Robert,
At present there is nothing below a GTX680, but there will be.
GeForce GT 630 and GT 640 cards will come from NVidia in the next few months.
Although I don't know how they will perform, I expect these GK107 cards will work here. These will be 50/75W cards, but when running tasks they should only use ~75% of that (38/56W).

It's probably best to avoid the GF114 and GF116 GF600 cards for now (40nm). These are just re-branded GF500 cards (with Fermi rather than Kepler designs).

We should also see a GTX670, GTX660 Ti, GTX660 and probably a GTX650 Ti (or similar) within a few months. I think the GTX670 is expected around the 10th of May.

My guess is that a GTX670 would have a TDP of ~170W/175W and therefore actually use ~130W. There is likely to be at least one card with a TDP of no more than 150W (only one 6-pin PCIE power connector required). Such a card would actually use ~112W when running tasks.
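As a quick sketch of the arithmetic behind those estimates (the ~75% load factor is an observation from tasks here, not an NVidia spec):

    # Estimate in-task power draw from TDP using the ~75% load factor
    # observed on GPUGrid tasks (an assumption, not a published spec).
    def estimated_draw(tdp_watts, load_factor=0.75):
        return tdp_watts * load_factor

    for tdp in (50, 75, 150, 175):
        print(f"TDP {tdp:3d} W -> ~{estimated_draw(tdp):.0f} W while crunching")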

I think these might actually compare favorably with their CC2.1 GF500 predecessors, but we will have to wait and see.

> Unfortunately, both Nvidia and AMD are now locking out reasonable BOINC upgrades for users like me who are limited by how much extra heating the computer room can stand, and therefore cannot handle the power requirements of any of the new high-end cards.


FAQs

HOW TO:
- Opt out of Beta Tests
- Ask for Help
frankhagen
Joined: 18 Sep 08
Posts: 65
Credit: 3,037,414
RAC: 0
Message 24687 - Posted: 3 May 2012, 14:40:15 UTC - in response to Message 24686.  
Last modified: 3 May 2012, 14:41:43 UTC

> Hi Robert,
> At present there is nothing below a GTX680, but there will be.
> GeForce GT 630 and GT 640 cards will come from NVidia in the next few months.


You will definitely have to take a close look at what you get there:


http://www.geforce.com/hardware/desktop-gpus/geforce-gt-640-oem/specifications

6 different versions under the same label!

Mixed up, mangled up, fraudulent - at least potentially. :(
robertmiles
Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 24688 - Posted: 3 May 2012, 16:34:35 UTC - in response to Message 24687.  

>> Hi Robert,
>> At present there is nothing below a GTX680, but there will be.
>> GeForce GT 630 and GT 640 cards will come from NVidia in the next few months.
>
> You will definitely have to take a close look at what you get there:
>
> http://www.geforce.com/hardware/desktop-gpus/geforce-gt-640-oem/specifications
>
> 6 different versions under the same label!
>
> Mixed up, mangled up, fraudulent - at least potentially. :(


I see only three versions there, but definitely mixed up.

However, a GT 645 is also listed now, and it's short enough that I might find some brand that will fit in the computer that now has a GTS 450. I may have to look at that one some more while waiting for the GPUGRID software to be updated enough to tell whether the results make it worth upgrading.
frankhagen
Joined: 18 Sep 08
Posts: 65
Credit: 3,037,414
RAC: 0
Message 24689 - Posted: 3 May 2012, 16:39:50 UTC - in response to Message 24688.  
Last modified: 3 May 2012, 16:40:32 UTC

> I see only three versions there, but definitely mixed up.


Take a closer look!

It's 2 Keplers and 1 Fermi.

It's 1 or 2 - or 1.5 or 3 - GB of RAM.

Plus DDR3 vs. GDDR5.

And that's only the suggested specs - OEMs are free to do whatever they want with clock rates..

> However, a GT 645 is also listed now, and it's short enough that I might find some brand that will fit in the computer that now has a GTS 450.


..if you want a rebranded GTX 560 SE.
robertmiles
Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 24690 - Posted: 3 May 2012, 17:09:49 UTC - in response to Message 24689.  

>> I see only three versions there, but definitely mixed up.
>
> Take a closer look!
>
> It's 2 Keplers and 1 Fermi.
>
> It's 1 or 2 - or 1.5 or 3 - GB of RAM.
>
> Plus DDR3 vs. GDDR5.
>
> And that's only the suggested specs - OEMs are free to do whatever they want with clock rates..

I see what you mean about RAM sizes.

>> However, a GT 645 is also listed now, and it's short enough that I might find some brand that will fit in the computer that now has a GTS 450.
>
> ..if you want a rebranded GTX 560 SE.

I see nothing about it that says Fermi or Kepler. But if that's correct, I'll probably wait longer before replacing the GTS 450, and check whether one of the Kepler GT 640 versions is a good replacement for the GT 440 in my other desktop.
frankhagen
Joined: 18 Sep 08
Posts: 65
Credit: 3,037,414
RAC: 0
Message 24691 - Posted: 3 May 2012, 17:16:37 UTC - in response to Message 24690.  
Last modified: 3 May 2012, 17:18:46 UTC

> I see nothing about it that says Fermi or Kepler. But if that's correct, I'll probably wait longer before replacing the GTS 450, and check whether one of the Kepler GT 640 versions is a good replacement for the GT 440 in my other desktop.


Look there:
http://en.wikipedia.org/wiki/GeForce_600_Series

Probably best for you to wait for a GT?-650 to show up..
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 24692 - Posted: 3 May 2012, 19:19:24 UTC - in response to Message 24687.  
Last modified: 3 May 2012, 19:23:00 UTC

These have already been released as OEM cards. That doesn't mean you can get them yet, and I would still expect retail versions to turn up, but exactly when I don't know.
Anything that is PCIE2 probably has a 40nm Fermi design. Anything PCIE3 should be Kepler.

GeForce GT 600 OEM list:
GT 645 (GF114, Not Kepler, 40nm, 288 shaders) – should work as an entry level/mid-range card for GPUGrid
GT 630 (GK107, Kepler, 28nm, 384 shaders) – should work as an entry level card for GPUGrid
GT 620 (GF119, Not Kepler, 40nm, 48 shaders) – too slow for GPUGrid
605 (GF119, Not Kepler, 40nm, 48 shaders) – too slow for GPUGrid

GT 640 – 3 or 6 types:
GK107 (Kepler), 28nm, PCIE3, 384 shaders, 950MHz, 1GB or 2GB, GDDR5, 729GFlops, 75W TDP
GK107 (Kepler), 28nm, PCIE3, 384 shaders, 797MHz, 1GB or 2GB, DDR3, 612GFlops, 50W TDP
GF116 (Fermi), 40nm, PCIE2, 144 shaders, 720MHz, 1.5GB or 3GB, DDR3, 414GFlops, 75W TDP

Although these are untested, the 729GFlops card looks like the best OEM option.
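Ranking those three by GFlops per Watt from the suggested specs (a sketch; OEM clock changes would shift the numbers):

    # GFlops-per-Watt for the three GT 640 variants listed above.
    variants = [
        ("GK107 GDDR5", 729, 75),  # (name, GFlops, TDP in W)
        ("GK107 DDR3", 612, 50),
        ("GF116 DDR3", 414, 75),
    ]
    for name, gflops, tdp in variants:
        print(f"{name}: {gflops / tdp:.1f} GFlops/W")

On paper the 50W DDR3 Kepler is the most efficient per Watt, but the GDDR5 card still has the most raw throughput.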
FAQs

HOW TO:
- Opt out of Beta Tests
- Ask for Help
frankhagen
Joined: 18 Sep 08
Posts: 65
Credit: 3,037,414
RAC: 0
Message 24693 - Posted: 3 May 2012, 19:37:43 UTC - in response to Message 24692.  

> These have already been released as OEM cards. That doesn't mean you can get them yet, and I would still expect retail versions to turn up, but exactly when I don't know.
> Anything that is PCIE2 probably has a 40nm Fermi design. Anything PCIE3 should be Kepler.


Probably that's the only clue we will have.

Only one thing left on the bright side: the low-TDP Kepler version of the GT 640 will most likely even show up fanless.
Retvari Zoltan
Joined: 20 Jan 09
Posts: 2380
Credit: 16,897,957,044
RAC: 1
Message 24694 - Posted: 3 May 2012, 21:33:49 UTC - in response to Message 24670.  

> Just skimming this, I'm getting a lot of mixed signals. I read that there's a 50% increase on the 680, and also that the coding on the 680 almost isn't worth it. While I know it's just come out, should I be waiting for a 600 or not?

This 50% increase is actually around 30%.
The answer depends on what you prefer.
The GTX 680 and especially the GTX 690 are expensive cards, and they will stay expensive at least until Xmas. However, considering the running costs, they could be worth the investment in the long term.
My personal opinion is that nVidia won't release BigKepler as a GeForce card, so there is no point in waiting for a better cruncher card from nVidia this time. In a few months we'll see if I was right in this matter. Even if nVidia releases BigKepler as a GeForce card, its price will be between the price of the GTX 680 and the 690.
On the other hand, there will be a lot of cheap Fermi-based (CC2.0) cards, either second-hand ones or some "brand new" from a stuck stockpile, so one could buy approximately 30% less computing power at half (or maybe less) the price.
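To put rough numbers on that trade-off, a sketch with illustrative prices (the dollar figures are assumptions for the arithmetic, not quotes):

    # Illustrative upfront price/performance: used Fermi vs new GTX 680.
    # Both prices and the 30% performance gap are assumptions.
    gtx680_perf, gtx680_price = 1.00, 500.0  # normalised throughput, USD
    fermi_perf, fermi_price = 0.70, 250.0    # ~30% less compute at ~half price

    print(f"GTX 680: {gtx680_perf / gtx680_price * 100:.3f} perf per $100")
    print(f"Fermi:   {fermi_perf / fermi_price * 100:.3f} perf per $100")

On upfront cost alone the Fermi card wins; the running costs then eat into that advantage, which is the long-term point above.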
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 24695 - Posted: 3 May 2012, 23:26:53 UTC - in response to Message 24694.  
Last modified: 3 May 2012, 23:28:55 UTC

Until the GF600 app gets released there's not much point buying any GF600.

Upgrading to a GF500 on the cheap seems reasonable (and I've seen a few at reasonable prices), but I expect that when the GTX 670 turns up (launching next week, supposedly) we will see a lot of price drops.

The GTX690 didn't really change anything; firstly there are none to be had, and secondly a $999 card is way beyond most people, so it doesn't affect the prices of other cards. In fact the only thing it really competes against is the GTX680.
I suppose a few people with HD6990's and GTX 590's might upgrade, but not many, and not when they can't get any.

I have a feeling 'full-fat' Kepler might have a fat price tag too. I'm thinking that the Quadro line-up will expand to include amateur video editors as well as the professionals. The old Quadros were too pricey and most just used the GeForce Fermi cards, but now that the GF600 has put all its eggs in the gaming basket, there is nothing for video editors. The design of the GTX690 suggests as much. The Teslas might also change, possibly becoming more university-friendly.
FAQs

HOW TO:
- Opt out of Beta Tests
- Ask for Help
Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Message 24705 - Posted: 4 May 2012, 8:04:17 UTC - in response to Message 24677.  

> Unfortunately, both Nvidia and AMD are now locking out reasonable BOINC upgrades for users like me who are limited by how much extra heating the computer room can stand, and therefore cannot handle the power requirements of any of the new high-end cards.


The solution is easy: don't vent the hot exhaust from your GPU into the room. Two ways to do that:

1) Get a fan you can mount in the window. If the window is square/rectangular, then get a fan with a square/rectangular body as opposed to a round body. Mount the fan in the window, then put the computer on a stand high enough to allow the air that blows out of the video card to blow directly into the fan intake. Plug the open space not occupied by the fan with whatever cheap plastic material you can find in a building supply store - a painted piece of 1/4" plywood, kitchen counter covering (arborite) or whatever.

2) I got tired of all the fan noise, so I attached a shelf outside the window and put both machines out there. An awning over my window keeps the rain off, but you don't have to have an awning; there are other ways to keep the rain off. Sometimes the wind blows snow into the cases in the winter, but it just sits there until the spring thaw. Sometimes I need to pop a DVD in the tray, so I just open the window. I don't use DVDs much anymore, so it's not a problem. I screwed both cases to the shelf so they can't be stolen. It never gets much colder than -30 C here, and that doesn't seem to bother them. Now I'm finally back to peaceful computing, the way it was before computers needed cooling fans.
wdiz
Joined: 4 Nov 08
Posts: 20
Credit: 871,871,594
RAC: 0
Message 24707 - Posted: 4 May 2012, 8:29:47 UTC - in response to Message 24705.  

Any news about GPUGRID support for the GTX 680 (under Linux)?

Thx
wiyosaya
Joined: 22 Nov 09
Posts: 114
Credit: 589,114,683
RAC: 0
Message 24780 - Posted: 7 May 2012, 14:13:35 UTC - in response to Message 24619.  

CUDA 4.2 comes in drivers which support cards as far back as the GeForce 6 series. Of course, GeForce 6 and 7 cards are not capable of contributing to GPUGrid. So the question might be: will GeForce 8 series cards still be able to contribute?

At this point, I run the short queue tasks on my 8800 GT. It simply cannot complete long queue tasks in a reasonable time. If tasks in the short queue start taking longer than 24 hours to complete, I will probably retire it from this project.

That said, if CUDA 4.2 brings significant performance improvements to Fermi cards, I'll be looking forward to it.

As to the discussion of what card to buy, I found a new GTX 580 for $370 after rebate. Until I complete my new system, which should be in the next two weeks or so, I have been and will be running it in the machine where the 8800 GT was. It is about 2.5x faster than my GTX 460 on GPUGrid tasks.

As I see it, there are great deals on 580s out there considering that about a year ago, these were the "top end" cards in the $500+ range.

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 24782 - Posted: 7 May 2012, 14:17:51 UTC
Last modified: 7 May 2012, 14:40:47 UTC

670s are looking to perform AT LEAST at 580 levels, if not better, and with a GIANT decrease in power consumption. They come out Thursday.

EDIT: Any chance we could get an update on the new app? An ETA, or how things are moving along? I know you guys are having issues with drivers, but an update would be appreciated.

Thanks, keep up the hard work.
wiyosaya
Joined: 22 Nov 09
Posts: 114
Credit: 589,114,683
RAC: 0
Message 24785 - Posted: 7 May 2012, 17:07:57 UTC
Last modified: 7 May 2012, 17:10:06 UTC

How is DP performance on 670s? Given the DP performance of the 680, I would expect DP performance on the 670 to be worse than on the 680.

I know power consumption is not optimal on the 580 compared to the 600 series in most "gamer" reviews that I have seen; however, I chose the 580 since I run a project that requires DP capability. For projects that require DP capability, I would not be surprised if the 580 is more efficient, power-consumption-wise, than any of the 600 series, as the 680's DP benchmarks are a fraction of the 580's. On the project I run, Milkyway, I am seeing a similar 2.5-3x performance gain with the GTX 580 over my GTX 460.

Unfortunately, anyone shopping for a GPU has many factors to weigh, and that only makes the task of choosing one harder and more confusing.

For a GPU dedicated to GPUGrid, a 600 series card may be an optimal choice; however, for anyone running projects that require DP capability, the 600 series may be disappointing at best.
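The peak numbers bear that out. A sketch from published peak FP32 rates and the FP64 ratios of each chip (GeForce GF110 runs FP64 at 1/8 of FP32, GK104 at 1/24):

    # Peak FP64 throughput, GTX 580 vs GTX 680, from peak FP32 GFlops
    # and each chip's FP64:FP32 ratio.
    cards = {
        "GTX 580": (1581.0, 1 / 8),   # GF110: FP32 GFlops, FP64 ratio
        "GTX 680": (3090.0, 1 / 24),  # GK104
    }
    for name, (fp32, ratio) in cards.items():
        print(f"{name}: ~{fp32 * ratio:.0f} GFlops peak FP64")

So despite the 680's much higher FP32 throughput, the 580 comes out well ahead on peak FP64 (~198 vs ~129 GFlops).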
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 24786 - Posted: 7 May 2012, 17:32:14 UTC

Agreed. Do not even consider a 6xx if you're looking for DP.