Tests on GTX680 will start early next week [testing has started]

Munkhtur

Joined: 13 Nov 10
Posts: 3
Credit: 105,044,879
RAC: 0
Message 24450 - Posted: 17 Apr 2012, 9:50:05 UTC - in response to Message 24234.  
Last modified: 17 Apr 2012, 9:53:06 UTC

My GTX 680 didn't compute any work from GPUGrid.
I tested it on S@H and it works without problems.

So I bought it in the US, then sent it to Mongolia,
and from Mongolia to Korea.

fml
ID: 24450

Profile Retvari Zoltan
Joined: 20 Jan 09
Posts: 2380
Credit: 16,897,957,044
RAC: 0
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24451 - Posted: 17 Apr 2012, 10:51:32 UTC - in response to Message 24450.  

My GTX 680 didn't compute any work from GPUGrid.
I tested it on S@H and it works without problems.

Please be patient. The current GPUGrid application doesn't support the GTX 680. A new version that does support it is under construction.
ID: 24451
Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist

Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Level
Gly
Scientific publications
watwatwatwatwat
Message 24457 - Posted: 17 Apr 2012, 20:29:18 UTC - in response to Message 24451.  

Compared to the current production application running on a GTX 580, the new app is 17% faster on the same GTX 580 and 50% faster on a GTX 680.

It will come out first as a beta and remain a separate application for now. We will try to get it out quickly, as it makes a big difference.

It should come out within this week for Linux and Windows.

gdf

ID: 24457

Profile Stoneageman
Joined: 25 May 09
Posts: 224
Credit: 34,057,374,498
RAC: 0
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24458 - Posted: 17 Apr 2012, 20:55:33 UTC

Want NOW
ID: 24458

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Level
Phe
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24459 - Posted: 17 Apr 2012, 21:19:13 UTC
Last modified: 17 Apr 2012, 21:23:10 UTC

50%!!! WOW! Great work, guys! Waiting "patiently"... :)

My profile has been changed to accept only betas for that rig. Again, 50%!

Sorry, Einstein, but your apps have NOTHING on this jump in performance! And that doesn't even account for performance per watt. My EVGA Step-Up queue position had better start moving faster - my 570 is still at #501!
ID: 24459

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24460 - Posted: 17 Apr 2012, 22:06:11 UTC - in response to Message 24459.  
Last modified: 17 Apr 2012, 22:15:40 UTC

Compared to the current production application running on a GTX 580, the new app is 17% faster on the same GTX 580 and 50% faster on a GTX 680.

I don't think that means the GTX 680 is 50% faster than a GTX 580!
I think it means the new app will be 17% faster on a GTX 580, and a GTX 680 on the new app will be 50% faster than a GTX 580 on the present app.
That would make the GTX 680 ~28% faster than a GTX 580 on the new app (1.50 / 1.17 ≈ 1.28).
In terms of performance per watt, that would push it to ~160% of the GTX 580, or twice the performance per watt of a GTX 480 ;)
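
To make the arithmetic behind that estimate explicit, here is a minimal sketch. The 17% and 50% figures come from GDF's post above; the board-power values (244 W for the GTX 580, 195 W for the GTX 680) are NVIDIA's nominal TDPs and are an assumption here, not something stated in this thread.

```python
# Back-of-the-envelope check of the speed-up and performance-per-watt estimates.
# Speed figures are relative to a GTX 580 running the current production app.
new_app_on_580 = 1.17   # new app on a GTX 580 (from GDF's post)
new_app_on_680 = 1.50   # new app on a GTX 680 (from GDF's post)

speedup_680_vs_580 = new_app_on_680 / new_app_on_580
print(f"GTX 680 vs GTX 580, both on the new app: {speedup_680_vs_580:.2f}x")   # ~1.28x

# Assumed nominal board powers (not taken from this thread).
tdp_watts = {"GTX 580": 244.0, "GTX 680": 195.0}
perf_per_watt = speedup_680_vs_580 * tdp_watts["GTX 580"] / tdp_watts["GTX 680"]
print(f"Performance per watt, GTX 680 vs GTX 580: {perf_per_watt:.2f}x")        # ~1.60x
```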
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 24460
Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist

Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Level
Gly
Scientific publications
watwatwatwatwat
Message 24462 - Posted: 17 Apr 2012, 22:26:42 UTC - in response to Message 24460.  

Just to give some numbers for clarity:
- production app on GTX 580: 98 ns/day
- new app on GTX 580: 115 ns/day
- new app on GTX 680: 150 ns/day
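
Working those figures through, a quick sketch of the speed-ups they imply (using only the ns/day numbers above):

```python
# Speed-ups implied by the quoted ns/day figures.
def pct_faster(new, old):
    """Percentage speed-up of `new` over `old`."""
    return (new / old - 1.0) * 100.0

old_app_580, new_app_580, new_app_680 = 98.0, 115.0, 150.0   # ns/day

print(f"new app on GTX 580 vs production app on GTX 580: +{pct_faster(new_app_580, old_app_580):.0f}%")  # ~17%
print(f"new app on GTX 680 vs production app on GTX 580: +{pct_faster(new_app_680, old_app_580):.0f}%")  # ~53%
print(f"new app on GTX 680 vs new app on GTX 580:        +{pct_faster(new_app_680, new_app_580):.0f}%")  # ~30%
```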

gdf
ID: 24462

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Level
Phe
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24463 - Posted: 17 Apr 2012, 22:47:38 UTC

Lol, I knew that... Either way, a lot faster than my 570 that's currently attached! And ~160% of the GTX 580's performance per watt is amazing. Again, great work, guys! Still need to find a mobo for Ivy Bridge that supports PCIe 3.0 at 2 x16.
ID: 24463

Butuz
Joined: 13 Sep 10
Posts: 5
Credit: 17,517,835
RAC: 0
Level
Pro
Scientific publications
watwat
Message 24464 - Posted: 17 Apr 2012, 23:17:31 UTC

Wow, that is excellent news. I had been waiting for news on this before committing to my GFX card purchase. You guys rock. Your dedication and speed of reaction are outstanding!

Will the new app also give speed boosts on GTX 570 and 560 cards?

Cheers
ID: 24464

Butuz
Joined: 13 Sep 10
Posts: 5
Credit: 17,517,835
RAC: 0
Level
Pro
Scientific publications
watwat
Message 24465 - Posted: 17 Apr 2012, 23:24:45 UTC

Also, I have one more question for those in the know: if I run a GTX 680 on a PCIe 2 motherboard, will it take a performance hit relative to that 150 ns/day figure? Could this be tested if you have time, GDF - I know it's not a high priority, but it may help people like me who don't have a next-gen motherboard make an informed decision.

Cheers
ID: 24465

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Level
Phe
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24468 - Posted: 18 Apr 2012, 3:08:21 UTC

All I can say on the note about the performance hit is that I THINK there won't be one: PCIe 3.0 allows 16 GB/s in each direction, and for what we do this is A LOT of bandwidth. From the results I've seen, which are based on games, the performance increase from PCIe 3.0 seems to be only 5-7%; if that is the case, I would ASSUME there wouldn't be that big of a performance hit.

The only reasons I want a PCIe 3 mobo that can run 2 cards at x16 each are, one, that I play games, and two, that it's just a mental thing for me (meaning running at full capacity) even if the difference isn't noticeable. I also don't plan on building another rig for some time, and I would like this one to be top notch ;).

It will MOST LIKELY only make a difference to those who run either a) huge monitors or b) multiple monitors using NVIDIA Surround, which I plan on doing with a 3+1 monitor setup.

Think of it like this: even the biggest GPUGrid tasks only use a little over a GB of memory, if I'm not mistaken, so 16 GB/s is way more than needed, I would imagine. I'll let you know how my 680 runs once the beta is out (it's on a PCIe 2.0 mobo currently).
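
As a rough illustration of that point, here is a minimal sketch. The ~1 GB working-set size comes from the estimate above; the per-direction bandwidths (roughly 8 GB/s for PCIe 2.0 x16 and 16 GB/s for PCIe 3.0 x16) are nominal peak figures, not measurements.

```python
# Upper-bound time to move a GPUGrid-sized working set across the bus at
# nominal peak rates. Real transfers are much smaller and overlap with compute,
# so the link generation matters far less than even these numbers suggest.
working_set_gb = 1.0                      # ~1 GB, per the estimate above
bandwidth_gb_per_s = {
    "PCIe 2.0 x16": 8.0,                  # GB/s per direction, nominal
    "PCIe 3.0 x16": 16.0,
}

for link, bw in bandwidth_gb_per_s.items():
    print(f"{link}: ~{working_set_gb / bw * 1000:.0f} ms to move {working_set_gb:.0f} GB")
# PCIe 2.0 x16: ~125 ms; PCIe 3.0 x16: ~62 ms
```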
ID: 24468
Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist

Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Level
Gly
Scientific publications
watwatwatwatwat
Message 24475 - Posted: 18 Apr 2012, 7:51:21 UTC - in response to Message 24465.  

I don't know whether PCIe 3 will make a difference or not. We are testing on a PCIe 3 motherboard.
The fact that the PCIe controller is now inside the CPU might make some difference through lower latency.

gdf
ID: 24475

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24478 - Posted: 18 Apr 2012, 10:48:22 UTC - in response to Message 24475.  

This post, in this thread, speculatively discusses PCIe 3 vs PCIe 2.
Basically, for a single card it's probably not worth the investment; for two cards it depends on what you want from the system; and for three or four cards it is worth it.
As you are looking to get a new system, it may be worth it. Obviously we won't know for sure until someone posts actual results for both PCIe 2 and PCIe 3 setups with multiple cards.
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 24478

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Level
Phe
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24482 - Posted: 19 Apr 2012, 0:05:27 UTC
Last modified: 19 Apr 2012, 0:43:28 UTC

Regarding PCIe 3.0 running 2 x16: if I am reading this correctly, am I going to be "forced" to get an SB-E now? I would most likely get the 3930K, since the 3820 isn't unlocked, and an unlocked chip is what I would prefer. http://www.anandtech.com/show/4830/intels-ivy-bridge-architecture-exposed

Further, IB-E won't be released until MAYBE Q3-Q4 (probably towards Christmas would be my guess) and won't really offer any benefit besides a die shrink.

I guess this explains why I was having a hard time finding a PCIe 3.0 2 x16 mobo. Wow, my idea of 100% GPU functionality just increased the price by about another $250. Hmmmm...

Oh, and I found this on AnandTech (though it's for an AMD GPU):

Simply enabling PCIe 3.0 on our EVGA X79 SLI motherboard (EVGA provided us with a BIOS that allowed us to toggle PCIe 3.0 mode on/off) resulted in a 9% increase in performance on the Radeon HD 7970. This tells us two things: 1) You can indeed get PCIe 3.0 working on SNB-E/X79, at least with a Radeon HD 7970, and 2) PCIe 3.0 will likely be useful for GPU compute applications, although not so much for gaming anytime soon.

It doesn't list what they ran or any specs, though.

EDIT: It appears the 3820 can OC to 4.3 GHz, which would be most of what I need. Wouldn't mind having a 6-core though; 4 extra threads for WUs would be nice but not mandatory. At $250 at Micro Center, quite a nice deal.
ID: 24482

Profile Carlesa25
Joined: 13 Nov 10
Posts: 328
Credit: 72,619,453
RAC: 0
Level
Thr
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24485 - Posted: 19 Apr 2012, 14:08:12 UTC
Last modified: 19 Apr 2012, 14:12:38 UTC

Hi. An interesting comparison of NVIDIA vs AMD: GTX 680/580 versus HD 6970/7970.

It makes quite clear the poor FP64 performance of the GTX 680 and the better performance of the HD 6970/7970 in MilkyWay. Greetings.

http://muropaketti.com/artikkelit/naytonohjaimet/gpgpu-suorituskyky-amd-vs-nvidia

(In the linked charts, less is better.)


ID: 24485

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Level
Phe
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24487 - Posted: 19 Apr 2012, 15:11:13 UTC
Last modified: 19 Apr 2012, 15:15:59 UTC

Glad we do FP32

Further, the 680 has only 8 FP64 cores per SMX, which aren't included in its core count and which run at full speed, in contrast to the reduced FP64 rate of previous generations.
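
For context, a minimal sketch of the FP64 throughput this implies; the per-SMX unit counts (192 FP32 cores and 8 dedicated FP64 units per SMX, 8 SMX in total) are taken from NVIDIA's published GK104 specifications rather than from this thread, so treat them as an assumption.

```python
# Theoretical FP64:FP32 throughput ratio for GK104 (GTX 680), assuming the
# published per-SMX unit counts.
smx_count = 8
fp32_per_smx = 192
fp64_per_smx = 8

fp32_units = smx_count * fp32_per_smx   # 1536 -- the advertised CUDA core count
fp64_units = smx_count * fp64_per_smx   # 64   -- not included in that count

print(f"Peak FP64 rate = 1/{fp32_units // fp64_units} of FP32")   # 1/24
```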
ID: 24487

wiyosaya
Joined: 22 Nov 09
Posts: 114
Credit: 589,114,683
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24525 - Posted: 22 Apr 2012, 2:30:57 UTC - in response to Message 24482.  

EDIT: It appears the 3820 can OC to 4.3 GHz, which would be most of what I need. Wouldn't mind having a 6-core though; 4 extra threads for WUs would be nice but not mandatory. At $250 at Micro Center, quite a nice deal.

I've been looking at the 3820 myself. In my opinion, that is the only SB-E to get. TechSpot got the 3820 up to 4.625 GHz, and at that speed it performs pretty much as well as a 3960X at 4.4 GHz. To me, it's a no-brainer - a $1000 3960X, a $600 3930K, or a $250 3820 that performs as well as the $1K chip. According to the Micro Center website, that price is in-store only.

Where SB-E will really excel is in memory-intensive applications such as FEA and solid modelling - a conclusion I came to from the TechSpot review, which tested the 3820 in a real-world SolidWorks scenario.

Anyway, IB is releasing on Monday, and it might be worth the wait. Personally, I don't think IB will beat SB-E in memory-intensive applications; however, I'll be looking very closely at the IB reviews.
ID: 24525

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Level
Phe
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24526 - Posted: 22 Apr 2012, 2:49:02 UTC

IB is not going to beat ANY SB-E. Its slight performance improvement and energy savings may very well be negated by its lower overclocking headroom compared to Sandy (from what I've read, anyway).

The real advantage of Ivy will come from its PCIe 3 support, but SB-E already has that, plus the ability to natively provide 40 lanes instead of the 16 on 1155 CPUs, which is my MAIN reason for choosing it.

We power users probably won't see much improvement, if any, until Haswell, the next "tock" phase, is released.
ID: 24526

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Level
Phe
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 24563 - Posted: 23 Apr 2012, 19:16:06 UTC
Last modified: 23 Apr 2012, 19:16:39 UTC

I know patience is a virtue, and I REALLY hate to ask, GDF, but... how's the progress on the beta app coming?

Truly itching to bring my 680 over to you guys. :)

As always, you guys do a great job, and I can't wait to hear how the experiment with the crystallographers works out!!

Keep it up!!
ID: 24563

Evil Penguin
Joined: 15 Jan 10
Posts: 42
Credit: 18,255,462
RAC: 0
Level
Pro
Scientific publications
watwatwatwatwatwatwatwatwatwat
Message 24574 - Posted: 24 Apr 2012, 7:03:42 UTC - in response to Message 24563.  

Better than the ATi version, probably.
ID: 24574