GPUGRID and Fermi

Danger30Q

Joined: 11 Jul 09
Posts: 21
Credit: 3,021,211
RAC: 0
Message 17215 - Posted: 22 May 2010, 0:18:28 UTC

Thanks, everyone, for the help in getting XP Pro set up. I have completed 2 work units significantly quicker on XP compared to Win7, and that's without SWAN_SYNC=0 enabled. I'll post some comparisons once a few work units complete with XP Pro and SWAN_SYNC enabled. I'll try to find the same work units to compare between XP and Win7. I'm sure my results will be similar to what others have already posted, but maybe there have been some optimizations.

It's nice to see over 97% GPU usage on XP, compared to a maximum of ~70% (if lucky) on Win7.
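For anyone trying the same setup, here is a minimal sketch (Python, purely illustrative - the only thing taken from this thread is the SWAN_SYNC variable name) for confirming the variable is actually visible before comparing runs:

    import os

    # BOINC and the GPUGRID app only see environment variables that existed
    # when the BOINC client/service was started, so restart it after setting
    # SWAN_SYNC=0 (e.g. under System Properties > Environment Variables).
    value = os.environ.get("SWAN_SYNC")
    print("SWAN_SYNC =", value if value is not None else "not set")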
STE\/E

Joined: 18 Sep 08
Posts: 368
Credit: 4,174,624,885
RAC: 0
Message 17249 - Posted: 24 May 2010, 8:37:57 UTC - in response to Message 16604.  

The future will be brighter when ATI also works nicely.

gdf


As I no longer own any NVIDIA cards I keep waiting. I'm almost tempted to go out & buy a couple of Fermis ... NOT ... haha

STE\/E
Danger30Q

Joined: 11 Jul 09
Posts: 21
Credit: 3,021,211
RAC: 0
Message 17333 - Posted: 26 May 2010, 12:24:54 UTC

Looks like the work unit pool for 6.73 has dried up. I can't get any more on either of my systems.
Richard Haselgrove

Joined: 11 Jul 09
Posts: 1639
Credit: 10,159,968,649
RAC: 2
Message 17340 - Posted: 26 May 2010, 13:38:16 UTC - in response to Message 17333.  

Looks like the work unit pool for 6.73 has dried up. I can't get any more on either of my systems.

Go fishing for v6.05 instead. The Fermi version seems to work fine.
Snow Crash

Joined: 4 Apr 09
Posts: 450
Credit: 539,316,349
RAC: 0
Message 17434 - Posted: 29 May 2010, 14:35:15 UTC

My experience with WinXP 32 has shown that there is only a very minor difference between HT ON and HT OFF for an i7-920 when you have SWAN_SYNC=0.
Thanks - Steve
roundup

Joined: 11 May 10
Posts: 68
Credit: 12,531,253,875
RAC: 2,388,659
Message 17435 - Posted: 29 May 2010, 16:14:08 UTC - in response to Message 16932.  


Now running a task at 715MHz GPU, 1430MHz Shaders, 1700MHz RAM:

What voltage did you set to drive the GTX470 with those settings? At stock voltage the application crashed, so I increased it to 1.050 V.

Any other recommendations? Thanks in advance :-) !
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17439 - Posted: 29 May 2010, 19:10:24 UTC - in response to Message 17435.  

Any other recommendations?


Well... yes: better not to increase the voltage ;)
It reduces chip longevity and drives power consumption up. On a chip like Fermi, the increased temperature (due to the voltage increase) also raises leakage quite a bit, so your card becomes less power efficient (= higher electricity cost). I'd rather try to improve cooling and temperatures. That would also give you more frequency headroom - probably not as much as a voltage increase, but without the negative side effects.

MrS
Scanning for our furry friends since Jan 2002
skgiven
Volunteer moderator
Volunteer tester

Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17442 - Posted: 29 May 2010, 19:58:12 UTC - in response to Message 17439.  

My GTX470 is at 704MHz GPU, 1407MHz shaders and 854MHz RAM (x4).
Voltages are not increased!
chumbucket843

Joined: 22 Jul 09
Posts: 21
Credit: 195
RAC: 0
Message 17449 - Posted: 30 May 2010, 4:29:43 UTC - in response to Message 17439.  

Any other recommendations?


Well... yes: better not to increase the voltage ;)
It reduces chip longevity and drives power consumption up. On a chip like Fermi, the increased temperature (due to the voltage increase) also raises leakage quite a bit, so your card becomes less power efficient (= higher electricity cost). I'd rather try to improve cooling and temperatures. That would also give you more frequency headroom - probably not as much as a voltage increase, but without the negative side effects.

MrS

OCing is not as detrimental as you make it sound. Modern processes are designed to handle very high and very low temperatures. A major consideration when designing a chip is robustness; in fact, it is so important that manufacturers are overly conservative with clocks and volts, especially with server/professional chips. That's free performance for us. The 400 series is great for OCing too, partly because of the architecture and partly because of the high leakage.
roundup

Joined: 11 May 10
Posts: 68
Credit: 12,531,253,875
RAC: 2,388,659
Message 17454 - Posted: 30 May 2010, 8:59:52 UTC - in response to Message 17449.  
Last modified: 30 May 2010, 9:53:30 UTC

Thanks to all for your help.
skgiven, I am using your settings now.

Edit:
Sorry, but with the standard setting of 0.962 V the application crashes. I have to select at least 0.975 V to have a stable GTX470 with skgiven's clock settings :-/
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17458 - Posted: 30 May 2010, 11:21:48 UTC - in response to Message 17454.  

@roundup: well, if you want to overclock to a given frequency then there's no guarantee that your chip can make it at a certain voltage. Chips vary. The increase to 0.975 V is not too much, though :)
The jump from 0.96 to 1.05 V would have increased your power consumption by 19%, i.e. would have brought you from 210 W TDP to 250 W. That's not even factoring in increased leakage due to higher temperatures and the power consumption increase due to frequency. If you go to 0.975 V it's a much more modest 2.7% increase.
BTW: the frequency increase itself increases your power consumption by 15% - but that's OK since you're also getting 15% more performance out of the card.
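For anyone who wants to check that arithmetic, here is a minimal sketch of the scaling model being used - dynamic power roughly proportional to frequency times voltage squared. The voltages are the ones quoted in this thread; the stock and overclocked clocks are assumptions for illustration only:

    # Rough model: P_new / P_old ~ (f_new / f_old) * (V_new / V_old)^2
    def power_scale(f_old, f_new, v_old, v_new):
        return (f_new / f_old) * (v_new / v_old) ** 2

    V_STOCK = 0.962                 # roundup's reported stock voltage (V)
    F_STOCK, F_OC = 607.5, 700.0    # assumed stock and overclocked core clocks (MHz)

    print(power_scale(F_STOCK, F_STOCK, V_STOCK, 1.050))   # ~1.19 -> +19% from voltage alone
    print(power_scale(F_STOCK, F_STOCK, V_STOCK, 0.975))   # ~1.03 -> +2.7%
    print(power_scale(F_STOCK, F_OC, V_STOCK, V_STOCK))    # ~1.15 -> +15% from frequency alone
    print(power_scale(F_STOCK, F_OC, V_STOCK, 1.050))      # ~1.37 -> combined, as in the follow-up post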

@chumbucket843: actually, shrinking the transistor dimensions makes the chips more vulnerable to damage. Having a dopant atom swap places with a neighbouring atom starts to hurt if your channel is only a couple of atoms long. We're not that small yet, but it serves to illustrate the problem.
For CPUs I'd agree that they can take quite a beating. So far I've personally only seen one single chip fail (and I've seen many). However, the situation is different for GPUs: the high-end chips are already being driven quite hard, at the edge of stability. And they usually run at 80 - 95°C, much higher than CPUs. Not because they could take it by design, but rather because it's too expensive and loud to cool them any better, and because no one's gaming 24/7. They're made so that most of them survive the warranty period under typical loads - which does not include BOINC.
And the GTX-Fermi cards are not professional cards. They're meant for gamers (hence the crippling of double precision performance). And the high leakage is actually something nVidia would love to get rid of - however, it's a byproduct of the process variation in TSMC's infamous 40 nm process, so there's not much they could do about it without crippling performance. They've got one thing going for them, though: on an absolute scale their stock voltage is quite low, to keep power consumption somewhat in check (one may argue whether they succeed at that - but that's not the point). Hence chip degradation due to voltage alone (not talking about temperature here) is not as strong as for other chips, and is thus less of a concern when increasing the voltage. So in practice you'll only have to deal with the temperature and power consumption increases that come with a higher voltage.

BTW: I think overclocking and overvolting have to be clearly distinguished. I love overclocking, as it provides performance basically for free and actually increases efficiency. But I don't generally recommend high voltages, as they reduce the chip lifetime and efficiency (past a certain point).

MrS
Scanning for our furry friends since Jan 2002
skgiven
Volunteer moderator
Volunteer tester

Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17470 - Posted: 30 May 2010, 20:05:37 UTC - in response to Message 17458.  

Fortunately the Fermis can be over-volted in very small increments. I initially tried to increase the voltage when I was struggling to use a Fermi with Win7 (I failed to get reasonable performance, and that is still the picture). I was able to increase my GTX470 to about 750MHz if I remember correctly, and stay under 1V - not that I was ever going to keep it at that!

I think a small voltage tweak is reasonable if it brings a reasonable performance gain relative to the extra power. So a 15% increase in performance for a 15% increase in power usage seems fair enough. If you take the full power used by the system into account, it might be more like an 8% increase in power consumption.
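To put rough numbers on that system-level point (assumed wattages, not measurements):

    # Assumed for illustration: the card is roughly half of the system's total
    # draw, so +15% on the card alone is a smaller share of the whole system.
    card_watts = 215.0           # assumed GTX470 board power under load
    rest_of_system = 215.0       # assumed CPU, motherboard, drives, PSU losses
    extra = 0.15 * card_watts    # the 15% increase discussed above
    print(extra / (card_watts + rest_of_system))   # ~0.075, i.e. roughly 8%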

I agree that increasing the voltage too much is not just wasteful in terms of power consumption; it also reduces the longevity of your card.
I forked out £320 to crunch with a Fermi; I want to crunch, not crash.
- Getting burnt is bad, but burning yourself is just stupid!
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 17471 - Posted: 30 May 2010, 21:32:28 UTC - in response to Message 17470.  

Agreed... just to clarify a detail, if it wasn't clear before: 15% power for 15% performance is due to the frequency increase alone. Touching the voltage adds to this (or, in fact, multiplies it). So in roundup's example he'd get 1.15 * 1.19 = 1.37, i.e. a 37% increase in power consumption, if he went for the higher clocks at 1.05 V.

MrS
Scanning for our furry friends since Jan 2002
Retvari Zoltan
Joined: 20 Jan 09
Posts: 2380
Credit: 16,897,957,044
RAC: 0
Message 17528 - Posted: 4 Jun 2010, 15:48:18 UTC - in response to Message 17092.  

Yes, just cuda3.1.

gdf


When do you plan to release this CUDA3.1 (beta) client?
GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist

Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Message 17555 - Posted: 10 Jun 2010, 11:09:35 UTC - in response to Message 17528.  

Only when CUDA3.1 is out.

gdf
skgiven
Volunteer moderator
Volunteer tester

Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17561 - Posted: 11 Jun 2010, 0:41:12 UTC - in response to Message 17555.  

GeForce/ION Release 256 BETA 257.15 May 24, 2010

http://www.nvidia.com/Download/Find.aspx?lang=en-us
http://www.nvidia.com/object/winxp-257.15-beta.html
http://www.nvidia.com/object/win7-winvista-32bit-257.15-beta.html

Adds support for CUDA Toolkit 3.1 which includes significant performance increases for double precision math operations. See CUDA Zone for more details.

The XP driver has been working fine for weeks.
Richard Haselgrove

Joined: 11 Jul 09
Posts: 1639
Credit: 10,159,968,649
RAC: 2
Message 17567 - Posted: 11 Jun 2010, 10:07:07 UTC

I've recently moved my Fermi GTX 470 from host 71413 (Windows 7, 32-bit) to host 43404 (Windows XP, 32-bit, running as a service). I moved the 9800GTX+ in the opposite direction.

On both hosts, I started the Fermi with driver 197.75, and then upgraded to driver 257.15_beta to test some CUDA 3.1 stuff for another project. I don't think the speed of the current cuda30 v6.05 ACEMD2 changed significantly with the driver change: if anything, it was slightly slower on the Beta driver (as you might expect). I think we'll have to wait for a new cuda31 app as well before we see any benefit from the driver.

What was significant was the increase in speed when I put the Fermi into the WinXP box. Times went down from 19,000+ seconds (with Swan_Sync and 95% CPU usage) to 11,000+ seconds and under 15% CPU. It's difficult to tell how much of that is due to a more modern hardware platform, and how much was the operating system, but it was a dramatic change.
skgiven
Volunteer moderator
Volunteer tester

Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 17568 - Posted: 11 Jun 2010, 11:07:30 UTC - in response to Message 17567.  

Until they compile the app with the CUDA 3.1 toolkit, you will see no change from using the 257.15 driver.

It has been well reported (by me and others) that Fermi cards don't work well under Vista or Win 7. They work, but at 60% speed. I don't know why, but it is either the driver or the app.

Basically, if you have a Fermi use XP or Linux.
Richard Haselgrove

Joined: 11 Jul 09
Posts: 1639
Credit: 10,159,968,649
RAC: 2
Message 17569 - Posted: 11 Jun 2010, 11:40:01 UTC - in response to Message 17568.  

Those reports are largely what prompted me to make the swap - I just thought I'd post some more actual figures.

The 9800GTX+ which moved in the opposite direction also slowed down, but to a lesser extent - maybe 20% - and started using a lot more CPU. Now, its new host has a slower CPU, so you would expect it to need more seconds - but not a four-fold increase. That must be down to Windows 7, too.
Beyond
Joined: 23 Nov 08
Posts: 1112
Credit: 6,162,416,256
RAC: 0
Message 17571 - Posted: 11 Jun 2010, 16:12:00 UTC - in response to Message 17569.  

Those reports are largely what prompted me to make the swap - I just thought I'd post some more actual figures.

The 9800GTX+ which moved in the opposite direction also slowed down, but to a lesser extent - maybe 20% - and started using a lot more CPU. Now, its new host has a slower CPU, so you would expect it to need more seconds - but not a four-fold increase. That must be down to Windows 7, too.

Yep, you got it. The slowdown for Win7 was reported as soon as the OS was officially released. It's specific to GPUGRID and nothing has been done to resolve the problem so far. Here's a thread on the subject:

http://www.gpugrid.net/forum_thread.php?id=1729