New Kepler's

Message boards : Graphics cards (GPUs) : New Kepler's

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 24765 - Posted: 6 May 2012, 20:36:23 UTC
Last modified: 6 May 2012, 20:37:21 UTC

His had an 8+6, and he never posted power draw (wouldn't expect him to). But at this price point, with it competing with or beating a 580 at less power, I'm gonna pick one up myself once GDF gets this app out.

Multiple "rumors" say not to expect the 660 Ti for quite some time. This may be the August release card.
ID: 24765
Retvari Zoltan
Joined: 20 Jan 09
Posts: 2380
Credit: 16,897,957,044
RAC: 0
Message 24766 - Posted: 6 May 2012, 20:56:22 UTC

The card would officially launch and be available globally by May 7th for a price of $399.

It's already Monday (7th May) in Australia. But I wonder, how did this guy get one so early in the morning? :)
ID: 24766
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 24768 - Posted: 6 May 2012, 21:31:47 UTC
Last modified: 6 May 2012, 21:42:39 UTC

Midnight release?

GPU-Z still says May 10th for the release date. So, assuming May 7th, we should be hearing something "official" from NVIDIA tomorrow, right?

EDIT: The way it's looking so far, the higher clock of the 670 keeps up with, and may beat, a 680 despite losing an SMX. This will be interesting indeed.
ID: 24768
wiyosaya
Joined: 22 Nov 09
Posts: 114
Credit: 589,114,683
RAC: 0
Message 24876 - Posted: 10 May 2012, 15:09:59 UTC

I don't think this is particularly the right thread for this info, but since the conversation has covered some of this already: some benchmarks of GTX 670 compute performance are out. It looks like the 670 is available at the same price as GTX 580s are now. For single precision, the 670 looks to be on par with the 580 and close to the 680 in performance. The advantage, as I see it, that the 6XX series has over the 5XX series is significantly lower power consumption, so flops/watt is better with the 6XX series than with the 5XX series.

However, for double-precision performance, the 580 is about double the 680. So, if you are running projects that need or benefit from DP support, and given the lower prices on 580s at the moment, a 580 may be a better buy, and perhaps more power-efficient too.

Here is the compute performance section of the one 670 review I have seen.
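As a rough sanity check on the flops/watt argument, here is a small sketch using NVIDIA's published spec-sheet numbers (peak single-precision GFLOPS and board TDP; these are paper specs, not measured GPUGrid draw):

```python
# Peak single-precision GFLOPS and TDP from NVIDIA's spec sheets.
# Paper numbers, not measured consumption under GPUGrid load.
cards = {
    "GTX 580": (1581, 244),  # (SP GFLOPS, TDP watts)
    "GTX 670": (2460, 170),
    "GTX 680": (3090, 195),
}

for name, (gflops, tdp) in cards.items():
    # GTX 580 -> 6.5, GTX 670 -> 14.5, GTX 680 -> 15.8 GFLOPS/W
    print(f"{name}: {gflops / tdp:.1f} GFLOPS/W")
```

On paper the Keplers more than double the 580's single-precision flops/watt, which matches the direction of the benchmarks above.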
ID: 24876
Zydor
Joined: 8 Feb 09
Posts: 252
Credit: 1,309,451
RAC: 0
Message 24884 - Posted: 10 May 2012, 17:05:59 UTC

Guru3d's review of the 670 is out, including 2-way and 3-way SLI reviews.

http://www.guru3d.com/news/four-geforce-gtx-670-and-23way-sli-reviews/



Regards
Zy
ID: 24884
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 24885 - Posted: 10 May 2012, 17:23:01 UTC
Last modified: 10 May 2012, 17:43:55 UTC

Ordered a Gigabyte 670. Should be here Monday. Hope the Windows app is up soon. $400.

EDIT: OUT OF STOCK now. Glad I ordered :-)
ID: 24885
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 24893 - Posted: 10 May 2012, 19:20:02 UTC - in response to Message 24876.  

The only BOINC project I know of which currently uses DP is Milkyway, where the AMDs are superior anyway.

MrS
Scanning for our furry friends since Jan 2002
ID: 24893
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 24898 - Posted: 10 May 2012, 21:53:29 UTC - in response to Message 24893.  

I agree, neither a Fermi nor a Kepler is any good at FP64; Kepler is just relatively worse. Neither should be considered practical for MW. Buying a Fermi rather than a Kepler is just sitting on the fence. If you want to contribute to MW, get one of the recommended AMD cards; even a cheap second-hand HD 5850 would outperform a GTX580 by a long way. If you want to contribute here, get one of the recommended cards for here. As yet these don't include Kepler, because I have not updated the list, and won't until there is a live app. That said, we all know the high-end GTX690, GTX680 and GTX670s will eventually be the best performers.
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 24898
wiyosaya
Joined: 22 Nov 09
Posts: 114
Credit: 589,114,683
RAC: 0
Message 24922 - Posted: 11 May 2012, 14:02:11 UTC
Last modified: 11 May 2012, 14:03:17 UTC

I went through the decision-making process of GTX 580 vs. AMD card. The older AMD cards are getting hard to find; basically, a used 5XXX-series card from eBay is about the only realistic choice.

The 6XXX series can still be found new, and if you can catch one, eBay has some excellent deals.

However, Milkyway is one of the very few projects that has an AMD GPU application.

So, I went with the 580. It is not that far behind the top AMD cards on MW, and it is proving to be about 3x as fast as my GTX 460 on MW and on GPUGrid. IMHO, it is a far more flexible card for BOINC, in that it is supported by far more BOINC projects than AMD cards are; thus the reasoning behind my choice. Plus, I got it new for $378 after rebate, and for me it was by far the "value" buy.

I expect that I run projects a bit differently from others, though. On weekends, I run GPUGrid 24x7 on my GPUs and do no work for other projects, due to the length of the GPUGrid WUs. Then during the week, I take work from other projects but not GPUGrid. The other projects' WUs complete mostly in under 30 minutes, except for Einstein, which takes about an hour on the 580, 80 minutes on the 460 and 100 minutes on my 8800 GT; these times are all well within the length of time I run my machines during the week.

My aim is to get maximal value from the few machines I run. If I had more money, I might run a single machine with an AMD card and dedicate that GPU to MW or another project with AMD support; however, I am building a new machine that will host the 580. At some point I may put an AMD card in that machine, if I can verify that the 580 and the AMD card will play nice together, but not yet. The wife is getting nervous about how much I've spent so far on this machine. LOL
ID: 24922
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 24927 - Posted: 11 May 2012, 15:53:21 UTC - in response to Message 24922.  

The Folding@Home CUDA benchmark is interesting, as it has a reference GTX670 within 6% of a reference GTX680.

Anandtech: http://images.anandtech.com/graphs/graph5818/46450.png

Despite using CUDA, this app is very different from the one here, and it would be wrong to assume similar performance here.

Going by the numerous manufacturers' implementations, I'm now expecting substantial variation in performance between GTX670 cards; from what I can see, the boost speeds will tend upwards from the reference rate (980MHz), with some factory-overclocked cards reaching ~1200MHz (+22%), which would best a reference GTX680. Overclockers have gone past this, and 1250 or 1266MHz is common; some even report >1300MHz on air. Of course, a GTX680 can overclock too. For here, the OC has to be sustainable: stable for weeks or months rather than minutes.
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 24927
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25031 - Posted: 13 May 2012, 11:23:30 UTC

This is a little off-topic, but I'll add my 2 cents anyway: buying anything AMD older than the 5000 series wouldn't make sense, since those 55+ nm chips are less power-efficient. Currently the best bet would probably be a Cayman chip; an unlocked HD6950 is still very good value.

At MW my Cayman, which I bought over a year ago for a bit over 200€, was doing a WU every ~50s. I guess that's tough to beat for a GTX580 ;)

Anyway, it's running POEM now: more credits, less power consumption, and hopefully more useful science :)

MrS
Scanning for our furry friends since Jan 2002
ID: 25031
Paul Raney
Joined: 26 Dec 10
Posts: 115
Credit: 416,576,946
RAC: 0
Message 25068 - Posted: 14 May 2012, 12:36:01 UTC - in response to Message 24759.  

What to buy? I have 2 open slots for graphics cards, so I continue to troll eBay for GTX 570s and 580s at good prices. I purchased a 580 for about $300 and have an active auction for another at about the same price.

Is it a good idea to pick up a 580 right now, or just wait a bit and see what happens to the resale value of these cards? Should I wait for the GTX 670s to come down in price a bit, as they appear to overclock very well? What is the expected performance-per-watt advantage of the 6x0 series: is it 20% or 50%? And now for the big question: would it be better to look at the 590s? We don't have that many working on the project, but with limited slots the 590 would appear to be a good way to increase performance.

Advice? Used 580s @ $300 now? Wait and see on GTX 580s? Wait and move to 670s? Start looking for 590s?
Thx - Paul

Note: Please don't use driver version 295 or 296! Recommended versions are 266 - 285.
ID: 25068
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 25070 - Posted: 14 May 2012, 13:57:14 UTC

From what I've been reading, a 670 at stock will AT LEAST be on par with, if not 10% faster than, a 580, and OC'd it should be pretty close to a 680.

The main advantages of spending the extra $100 are shorter runtimes, LESS POWER, and LESS HEAT.

I don't think anyone has attached a 670 to GPUGrid for betas yet, but it should be worth the extra money, or the extra wait until the money is in hand.

The 680 is completing the betas in around 50 seconds (one was doing so consistently), while I believe Stoneageman's 580 was around 75 seconds. So, as the researchers said, a 680 is about 30% faster.

Whenever the Windows app is released I'll have a 670 waiting, and will post results.
ID: 25070
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25080 - Posted: 14 May 2012, 20:45:01 UTC

I don't have new hard numbers, but Kepler is looking quite good for GPU-Grid. Comparing GTX580 and GTX670, you'll save almost 100 W in typical gaming conditions. At 0.23 €/kWh, that's about 200€ less in electricity cost running 24/7 for a single year. I can't say how much the cards will draw running GPU-Grid, but I'd expect about this difference, maybe a bit less (since overall power draw may be lower).

That would mean choosing a GTX670 over a GTX580 would pay for itself in about 6 months.. pretty impressive!
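The electricity arithmetic can be sketched quickly. The 100 W saving and 0.23 €/kWh rate come from the post; the ~100€ price premium between the cards is my own assumption to make the 6-month payback work out:

```python
# Savings estimate: 100 W saved, 0.23 EUR/kWh, running 24/7.
watts_saved = 100
eur_per_kwh = 0.23
annual_saving = watts_saved / 1000 * 24 * 365 * eur_per_kwh
print(f"{annual_saving:.2f} EUR/year saved")       # 201.48

price_premium = 100  # hypothetical GTX670 vs GTX580 price difference
payback_months = price_premium / annual_saving * 12
print(f"{payback_months:.1f} months to pay back")  # 6.0
```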

MrS
Scanning for our furry friends since Jan 2002
ID: 25080
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25084 - Posted: 14 May 2012, 22:02:09 UTC - in response to Message 25080.  
Last modified: 14 May 2012, 22:17:55 UTC

A reference GTX580 has a TDP of 244W, and a GTX670 has a TDP of 170W.
GPUGrid apps typically use ~75% of the TDP, so a GTX580 should use around 183W and a GTX670 around 128W. The GTX670 would therefore save you ~55W.
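As a tiny sketch of that estimate (the ~75% load factor is this post's rule of thumb, not a measured constant):

```python
# Rule-of-thumb draw estimate: GPUGrid apps typically load a card
# to about 75% of its rated TDP.
def estimated_draw(tdp_watts, load_fraction=0.75):
    return tdp_watts * load_fraction

gtx580 = estimated_draw(244)               # 183.0 W
gtx670 = estimated_draw(170)               # 127.5 W
print(f"saving: {gtx580 - gtx670:.1f} W")  # saving: 55.5 W
```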
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 25084
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25097 - Posted: 15 May 2012, 18:18:35 UTC - in response to Message 25084.  

You're assuming Keplers use the same percentage of their TDP rating as Fermis do. That may not be true, for several reasons:

- different architecture
- on high-end GPUs, nVidia tends to specify unrealistically low TDPs (compared to lower-end parts of the same generation)
- for Keplers, actual power consumption tracks the rating much more closely due to their Turbo mode, which essentially makes them run at approximately their maximum boost target power under typical load

For the 100 W difference in games I'm referring to such measurements.

MrS
Scanning for our furry friends since Jan 2002
ID: 25097
frankhagen
Joined: 18 Sep 08
Posts: 65
Credit: 3,037,414
RAC: 0
Message 25098 - Posted: 15 May 2012, 19:18:16 UTC - in response to Message 25097.  

For the 100 W difference in games I'm referring to such measurements.


Probably that's about the range we will see. We're talking about at least 2 kWh per day.

How many tons of CO2 per year?
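A back-of-the-envelope answer, assuming a grid emission factor of ~0.5 kg CO2 per kWh (an assumed average; real factors vary a lot by country and energy mix):

```python
# CO2 estimate for the ~2 kWh/day of extra draw discussed above.
# 0.5 kg CO2/kWh is an assumed grid average, not a measured figure.
kwh_per_day = 2
kg_co2_per_kwh = 0.5
tons_per_year = kwh_per_day * 365 * kg_co2_per_kwh / 1000
print(f"{tons_per_year:.3f} t CO2/year")  # 0.365 t CO2/year
```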
ID: 25098
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 25099 - Posted: 15 May 2012, 20:37:47 UTC

When running 3 Einstein WUs concurrently, at a 1200 core clock and 3110 memory, Precision's power usage monitor says around 60-70% at 85-90% GPU usage.
ID: 25099
Carlesa25
Joined: 13 Nov 10
Posts: 328
Credit: 72,619,453
RAC: 0
Message 25106 - Posted: 16 May 2012, 11:23:09 UTC

NVIDIA GeForce GTX 680: Windows 7 vs. Ubuntu 12.04 Linux

NVIDIA GeForce graphics comparisons between Windows and Linux

http://www.phoronix.com/scan.php?page=article&item=nvidia_gtx680_windows&num=1
ID: 25106
Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Message 25112 - Posted: 16 May 2012, 20:53:37 UTC
Last modified: 16 May 2012, 20:57:50 UTC

I've been checking out 680 cards, and they're all PCIe 3.0. Would I take a performance hit if I put one in a PCIe 2.0 x16 slot? There would be no other expansion cards in that machine, so there would be no lane sharing.

edit added:

This is the mobo
ID: 25112

©2025 Universitat Pompeu Fabra