gtx680

Message boards : Graphics cards (GPUs) : gtx680

Damaraland
Message 24085 - Posted: 22 Mar 2012, 21:06:15 UTC
Last modified: 22 Mar 2012, 21:08:09 UTC

$499.99 in USA :(
Amazon

Retvari Zoltan
Message 24086 - Posted: 22 Mar 2012, 21:25:31 UTC - in response to Message 24079.  

Summarizing the reviews: gaming performance is as we expected; computing performance is still unknown, since Folding@home isn't working on the GTX 680 yet, and the GPUGrid client probably won't work either without some optimization.

zombie67 [MM]
Message 24087 - Posted: 22 Mar 2012, 21:51:49 UTC - in response to Message 24079.  

http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-14.html

Moreover, Nvidia limits 64-bit double-precision math to 1/24 of single-precision, protecting its more compute-oriented cards from being displaced by purpose-built gamer boards. The result is that GeForce GTX 680 underperforms GeForce GTX 590, 580 and to a much direr degree, the three competing boards from AMD.

Does GPUGRID use 64-bit double-precision math?
Reno, NV
Team: SETI.USA

GDF (project administrator)
Message 24088 - Posted: 22 Mar 2012, 21:56:14 UTC - in response to Message 24087.  

If somebody can run here on a gtx680, let us know.

thanks,
gdf

GDF (project administrator)
Message 24089 - Posted: 22 Mar 2012, 21:56:52 UTC - in response to Message 24087.  

Almost nothing, this should not matter.
gdf

http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-14.html

Moreover, Nvidia limits 64-bit double-precision math to 1/24 of single-precision, protecting its more compute-oriented cards from being displaced by purpose-built gamer boards. The result is that GeForce GTX 680 underperforms GeForce GTX 590, 580 and to a much direr degree, the three competing boards from AMD.

Does GPUGRID use 64-bit double-precision math?
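To put that 1/24 figure in perspective, here is a back-of-envelope sketch (plain C, nothing from the project's code). The specs are the commonly quoted ones - 1536 shaders at ~1.006 GHz and 2 FLOPs per shader per clock for the GTX 680, 512 shaders at 1.544 GHz with DP at 1/8 of SP for the GTX 580 - not anything measured on GPUGRID:

#include <stdio.h>

int main(void) {
    /* GTX 680: 1536 CUDA cores, ~1.006 GHz base clock, 2 FLOPs/core/clock */
    double sp_680 = 1536 * 1.006 * 2.0;   /* ~3090 GFLOPS single precision */
    double dp_680 = sp_680 / 24.0;        /* ~129 GFLOPS at the 1/24 rate  */

    /* GTX 580: 512 CUDA cores, 1.544 GHz shader clock, DP at 1/8 of SP */
    double sp_580 = 512 * 1.544 * 2.0;    /* ~1581 GFLOPS single precision */
    double dp_580 = sp_580 / 8.0;         /* ~198 GFLOPS double precision  */

    printf("GTX 680: SP ~%.0f GFLOPS, DP ~%.0f GFLOPS\n", sp_680, dp_680);
    printf("GTX 580: SP ~%.0f GFLOPS, DP ~%.0f GFLOPS\n", sp_580, dp_580);
    return 0;
}

So on paper the GTX 680 loses in double precision but roughly doubles single-precision throughput, which is what counts if the application uses almost no DP.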


JohnSheridan
Message 24091 - Posted: 22 Mar 2012, 23:01:30 UTC - in response to Message 24088.  

If somebody can run here on a gtx680, let us know.

thanks,
gdf


Should be getting an EVGA version tomorrow morning here in the UK - it cost me £405.

I've already been asked to do some other BOINC tests first, though.

ExtraTerrestrial Apes (volunteer moderator)
Message 24092 - Posted: 23 Mar 2012, 8:48:12 UTC - in response to Message 24091.  

It would be nice if you also reported here what you find for other projects - thanks!

MrS
Scanning for our furry friends since Jan 2002

Evil Penguin
Message 24093 - Posted: 23 Mar 2012, 9:32:21 UTC

I wonder how this tweaked architecture will perform with these BOINC projects.
So far compute doesn't seem like Kepler's strong point.

Also, a little off topic...
But is there any progress being made on the AMD side of things?
I haven't heard a single peep about it for over a month.
If the developers still don't have a 7970, fine.
Please at least confirm as much...

Thanks.

GDF (project administrator)
Message 24094 - Posted: 23 Mar 2012, 9:39:13 UTC - in response to Message 24093.  

We have a small one, good enough for testing. The code works on Windows with some bugs. We are assessing the performance.

gdf

skgiven (volunteer moderator)
Message 24096 - Posted: 23 Mar 2012, 11:47:49 UTC - in response to Message 24094.  

It would seem NVidia have dropped support for XP; there are no XP drivers for the GTX 680!
http://www.geforce.com/drivers/results/42929 I think I posted about this a few months ago.

Reports suggest that the 301.1 driver is needed (for Windows, presumably).
http://www.geforce.com/drivers/beta-legacy
A Linux 295.33 driver was also released on the 22nd, and NVidia's driver support for Linux is far better than AMD's.

The card's fan profile is such that the fan doesn't make much noise, so it might run hot. This isn't a good situation. If we can't use WinXP, then we are looking at Win7 (and presumably an 11% or greater performance hit)? If we use Linux we could be faced with cooling issues.

The 301.1 driver might work on a 2008 R2 server, but probably not on earlier server versions.

Good luck,
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

JohnSheridan
Message 24099 - Posted: 23 Mar 2012, 14:41:40 UTC

OK, I have now got my 680 and started by running some standard GPU tasks for SETI.

On the 580 it would take (on average) 3m 40s to do one task. On the 680 (at normal settings) it takes around 3m 10s.

The card I have is an EVGA, so it can be overclocked using their Precision X tool.

The first overclock was too aggressive and was clearly causing the GPU tasks to error out; however, lowering the overclock resulted in GPU tasks now taking around 2m 50s each.
Going to try and get a GPUGRID task shortly to see how that goes.
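Rough arithmetic on those times, assuming the quoted averages are representative:

#include <stdio.h>

int main(void) {
    /* average seconds per SETI task, as quoted above */
    double t_580    = 3 * 60 + 40;   /* 220 s on the GTX 580             */
    double t_680    = 3 * 60 + 10;   /* 190 s on the GTX 680 at stock    */
    double t_680_oc = 2 * 60 + 50;   /* 170 s on the GTX 680 overclocked */

    printf("stock:       %.0f%% faster\n", (t_580 / t_680    - 1.0) * 100);  /* ~16% */
    printf("overclocked: %.0f%% faster\n", (t_580 / t_680_oc - 1.0) * 100);  /* ~29% */
    return 0;
}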

JohnSheridan
Message 24101 - Posted: 23 Mar 2012, 15:19:17 UTC

Tried to download and run two GPUGRID tasks, but both crashed out before completing the download, saying acemd.win2382 had stopped responding.

So I'm not sure what the problem is.

JohnSheridan
Message 24102 - Posted: 23 Mar 2012, 15:23:05 UTC

Just reset the graphics card back to "normal", i.e. no overclock, and it still errors out. This time it did finish downloading, but it crashed as soon as it started on the work unit, so it looks like this project does not yet work on the 680?

Richard Haselgrove
Message 24106 - Posted: 23 Mar 2012, 16:14:11 UTC

stderr_txt:

# Using device 0
# There is 1 device supporting CUDA
# Device 0: "GeForce GTX 680"
# Clock rate: 0.71 GHz
# Total amount of global memory: -2147483648 bytes
# Number of multiprocessors: 8
# Number of cores: 64
SWAN : Module load result [.fastfill.cu.] [200]
SWAN: FATAL : Module load failed

Assertion failed: 0, file swanlib_nv.c, line 390

We couldn't be having a 31-bit overflow on that memory size, could we?
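For illustration, a standalone C sketch (not the actual acemd code) of how a 2 GiB total comes out as exactly that number once it is squeezed through a signed 32-bit integer:

#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void) {
    /* 2 GiB = 2,147,483,648 bytes: one more than INT32_MAX, so it cannot
       be represented in a signed 32-bit field.                           */
    uint64_t total_mem = 2147483648ULL;
    uint32_t low32 = (uint32_t)total_mem;          /* keep only the low 32 bits  */
    int32_t  as_signed;
    memcpy(&as_signed, &low32, sizeof as_signed);  /* reinterpret them as signed */
    printf("Total amount of global memory: %d bytes\n", as_signed);
    /* prints: Total amount of global memory: -2147483648 bytes */
    return 0;
}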

JohnSheridan
Message 24107 - Posted: 23 Mar 2012, 16:22:55 UTC - in response to Message 24106.  

stderr_txt:

# Using device 0
# There is 1 device supporting CUDA
# Device 0: "GeForce GTX 680"
# Clock rate: 0.71 GHz
# Total amount of global memory: -2147483648 bytes
# Number of multiprocessors: 8
# Number of cores: 64
SWAN : Module load result [.fastfill.cu.] [200]
SWAN: FATAL : Module load failed

Assertion failed: 0, file swanlib_nv.c, line 390

We couldn't be having a 31-bit overflow on that memory size, could we?


In English please?

skgiven (volunteer moderator)
Message 24109 - Posted: 23 Mar 2012, 16:35:11 UTC - in response to Message 24107.  
Last modified: 23 Mar 2012, 17:48:19 UTC

The GPUGRID application doesn't support the GTX680 yet. We'll have test units soon and - if there are no problems - we'll update over the weekend or early next week.

MJH


In English: GPUGrid's applications don't yet support the GTX 680. MJH is working on an app and might have one ready soon, over the weekend or early next week.

PS. Your SETI runs show the card has some promise: ~16% faster at stock than a GTX 580 (244 W TDP), or ~30% faster overclocked. Not sure that will be possible here, but you'll know fairly soon. Even if the GTX 680 (195 W TDP) only matches the GTX 580, the performance-per-Watt gain might be noteworthy: roughly 125% of a GTX 580's performance/Watt, rising to about 145% if the ~16% speedup holds.
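The back-of-envelope behind those percentages, using the TDP figures only (board power under GPUGRID loads will differ, so treat it as a ballpark):

#include <stdio.h>

int main(void) {
    double tdp_580 = 244.0, tdp_680 = 195.0;    /* Watts, TDP rather than measured draw */
    double equal   = tdp_580 / tdp_680;         /* 680 merely matches a 580: ~1.25      */
    double faster  = 1.16 * tdp_580 / tdp_680;  /* 680 keeps its ~16% SETI lead: ~1.45  */
    printf("perf/Watt vs GTX 580: ~%.0f%% if equal, ~%.0f%% if 16%% faster\n",
           equal * 100, faster * 100);
    return 0;
}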
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

JohnSheridan
Message 24110 - Posted: 23 Mar 2012, 16:36:15 UTC

Thanks for that simple to understand reply :)

I will suspend GPUGRID on that machine until the project does support the 680.

skgiven (volunteer moderator)
Message 24111 - Posted: 23 Mar 2012, 16:39:03 UTC - in response to Message 24110.  
Last modified: 23 Mar 2012, 16:41:45 UTC

Good idea; no point returning lots of failed tasks!

I expect you will see an announcement when there is a working/beta app.

Thanks,
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

JAMES DORISIO
Message 24114 - Posted: 23 Mar 2012, 18:12:00 UTC


matlock
Message 24115 - Posted: 23 Mar 2012, 18:59:04 UTC - in response to Message 24096.  

Why would there be cooling issues in Linux? I keep my 560 Ti 448-core very cool by manually setting the fan speed in the nvidia-settings application, after setting "Coolbits" to "5" in xorg.conf.
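For reference, the xorg.conf fragment looks something like the sketch below (the Identifier is a placeholder - match it to your existing Device section). The relevant part is the Coolbits option; after restarting X, nvidia-settings should then expose a manual fan-speed control on the GPU's Thermal Settings page:

Section "Device"
    Identifier "GPU0"
    Driver     "nvidia"
    # "5" = 1 + 4; the 4 bit is what enables manual fan control in nvidia-settings
    Option     "Coolbits" "5"
EndSection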


It would seem NVidia have dropped support for XP; there are no XP drivers for the GTX 680!
http://www.geforce.com/drivers/results/42929 I think I posted about this a few months ago.

Reports suggest that the 301.1 driver is needed (for Windows, presumably).
http://www.geforce.com/drivers/beta-legacy
A Linux 295.33 driver was also released on the 22nd, and NVidia's driver support for Linux is far better than AMD's.

The card's fan profile is such that the fan doesn't make much noise, so it might run hot. This isn't a good situation. If we can't use WinXP, then we are looking at Win7 (and presumably an 11% or greater performance hit)? If we use Linux we could be faced with cooling issues.

The 301.1 driver might work on a 2008 R2 server, but probably not on earlier server versions.

Good luck,
