Message boards : Graphics cards (GPUs) : power optimization?
**MrS** · Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
Hi guys, is there anyone out there with a watt-meter, or has anyone already tried this? What I have in mind:

- The calculations run on the shaders, so there shouldn't be much for the "normal" GPU core to do.
- The idle power draw of NV cards is quite high, so it seems they can't switch off everything that isn't needed at the moment.
- Could someone reduce their GPU core clock, while keeping shader and memory clocks constant, and measure whether system power draw drops?
- And observe whether it affects WU runtimes.
- I'd suggest lowering the GPU core clock by something between 100 and 200 MHz to get a measurable difference in power draw.

Thanks in advance,
MrS

Scanning for our furry friends since Jan 2002
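On newer setups the measurement could be automated; a minimal sketch follows, assuming a driver and GPU that expose a power sensor through `nvidia-smi` (the `power.draw` query is a modern feature, and the 9600GT/9800GTX+ era cards in this thread generally lack the sensor, so a wall-socket watt-meter remains the ground truth for them):

```python
# Minimal sketch: sample GPU board power draw while a WU runs.
# Assumes nvidia-smi with power.draw support (modern cards/drivers);
# for the cards discussed in this thread, use a wall-socket watt-meter.
import subprocess
import time

def read_power_watts(gpu_index: int = 0) -> float:
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_index}",
         "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip())

samples = []
for _ in range(60):          # one sample per second for a minute
    samples.append(read_power_watts())
    time.sleep(1)

print(f"mean {sum(samples) / len(samples):.1f} W, "
      f"min {min(samples):.1f} W, max {max(samples):.1f} W")
```

Logging a minute of samples at each clock setting, rather than a single reading, smooths out the load spikes a watt-meter would show flickering.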
**Krunchin-Keith [USA]** · Joined: 17 May 07 · Posts: 512 · Credit: 111,288,061 · RAC: 0
Hi guys, I have BBS's with software that includes a watt meter. I have not tried what you suggest; I was waiting for the project to stabilize first. However, I did measure the increase from running the computer with BOINC on 1 CPU task (LCDs warmed up and active) to running BOINC with 1 GPU task as well: the extra draw was only 26 W. I will do some additional tests, but first we need a stable app here.

Additionally, the GPUs I use are all the same brand and basic model, with the same number of stream processors and the same memory. The two at work are clocked at 601 MHz and the one here at 632 MHz, per the nvmonitor program; those are factory settings. The faster one took around 59,000 s and the slower ones around 61,000 s. This is hard to pin down at this time with all the app and version changes, which is why I say we need a stable app before testing what you suggest. But basically, just that small difference in clock frequency makes the slower ones take half an hour longer.
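A quick back-of-the-envelope check of those figures, using the approximate clocks and times quoted above:

```python
# If runtime scaled exactly with core clock, the 601 MHz cards should take
# 632/601 ~ 5.2% longer than the 632 MHz card; the observed gap is only
# 61000/59000 ~ 3.4%.
fast_clock, slow_clock = 632, 601          # MHz, factory settings
fast_time, slow_time = 59_000, 61_000      # seconds, approximate

print(f"clock ratio:   {fast_clock / slow_clock:.3f}")            # ~1.052
print(f"runtime ratio: {slow_time / fast_time:.3f}")              # ~1.034
print(f"extra time:    {(slow_time - fast_time) / 60:.0f} min")   # ~33 min
```

That the observed gap is smaller than the clock ratio already hints the core clock is not the whole story, which matches the shader-clock suspicion raised in the next post.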
**MrS** · Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
Hi Keith, thanks for the reply. I figured 4.61 was stable enough; that's why I posted this suggestion now :) (On my machine I see runtime differences of about 15 min between different WUs.) I suspect your card with the higher core clock also has its shaders clocked higher, which could explain the difference in speed. And what's a BBS?

MrS

Scanning for our furry friends since Jan 2002
**[XTBA>XTC] ZeuZ** · Joined: 15 Jul 08 · Posts: 60 · Credit: 108,384 · RAC: 0
I have a watt-meter; I will do some tests with my 9600GT.
**Krunchin-Keith [USA]** · Joined: 17 May 07 · Posts: 512 · Credit: 111,288,061 · RAC: 0
> Hi Keith, thanks for the reply. I figured 4.61 was stable enough; that's why I posted this suggestion now :) (On my machine I see runtime differences of about 15 min between different WUs.)

Battery Backup Supply, a.k.a. UPS. Or, if you're from before the internet generation, a BBS used to be a Bulletin Board System: you dialed each computer system's telephone number.
**MrS** · Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
By now I have finished six 4.61 WUs. The average time is 44,569 s, the maximum 44,887 s and the minimum 44,131 s. That's 1% downwards and 0.7% upwards; seems stable enough for me :)

MrS

Scanning for our furry friends since Jan 2002
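Those spread figures check out against the quoted times:

```python
# Spread of the six 4.61 WU runtimes around the quoted average.
avg, lo, hi = 44_569, 44_131, 44_887
print(f"downwards: {(avg - lo) / avg:.1%}")   # ~1.0%
print(f"upwards:   {(hi - avg) / avg:.1%}")   # ~0.7%
```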
**MrS** · Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
OK guys, forget the idea! I lowered the core clock (750 -> 400 MHz), and while the power supply fan slowed by 50 rpm and the card ran 3 °C cooler, performance suffered immediately.

MrS

Scanning for our furry friends since Jan 2002
**pit** · Joined: 23 Sep 08 · Posts: 3 · Credit: 8,255,016 · RAC: 0
> OK guys, forget the idea! I lowered the core clock (750 -> 400 MHz), and while the power supply fan slowed by 50 rpm and the card ran 3 °C cooler, performance suffered immediately.

Lowering the clock won't succeed. You should try to lower the voltage instead; that has an immediate impact on your energy bill. Using a power meter you should be able to see every movement of the voltage regulator. I expect that 20-40 W are possible (NV 260/280). I'll give it a try soon, when I can sit next to my workstation. There might be some reboots necessary ;-)

Good Luck,
Jabba
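The reasoning behind this suggestion is the first-order CMOS dynamic-power model, P_dyn ≈ C · V² · f: frequency enters linearly (and lowering it directly costs performance), while voltage enters quadratically (and costs nothing as long as the card stays stable). A minimal sketch of that scaling, using illustrative numbers from this thread:

```python
# First-order CMOS dynamic-power model: P_dyn ~ C * V^2 * f.
# f_scale and v_scale are the new clock/voltage relative to stock.
def relative_dynamic_power(f_scale: float, v_scale: float) -> float:
    return f_scale * v_scale ** 2

# MrS's clock cut (750 -> 400 MHz) at unchanged voltage: a big drop in
# dynamic power on paper, but it directly costs performance.
print(f"{relative_dynamic_power(400 / 750, 1.0):.0%}")   # ~53% of stock

# A 10% undervolt at unchanged clocks: smaller drop, zero performance
# cost, provided the card stays stable.
print(f"{relative_dynamic_power(1.0, 0.90):.0%}")        # 81% of stock
```

Note this only covers dynamic (switching) power; static leakage and the rest of the system are why the measured wall-socket savings come out smaller than these ratios suggest.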
**MrS** · Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
> Lowering the clock won't succeed.

That's what I said... kind of ;) Sure, lowering the clock only drops the dynamic power consumption linearly, but it drops it nevertheless.

> You should try to lower the voltage.

I'd be glad to, if nVidia would let me. From your post it sounds like on GT200-based cards you can control the voltage yourself? On my 9800GTX+ this doesn't seem to be the case; that's why I went and tried clock speed instead.

MrS

Scanning for our furry friends since Jan 2002
**pit** · Joined: 23 Sep 08 · Posts: 3 · Credit: 8,255,016 · RAC: 0
> Lowering the clock won't succeed.

In general, lowering the voltage should work for all nVidia GPUs. You need the tool 'nvflash', which operates in DOS mode. !!! Make sure that your graphics adapter is not overclocked !!! For further details and how to handle the tool, please check some websites in your preferred language; there are a couple of howtos available.

Greetz,
Jabba
**MrS** · Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
That's interesting! I could lower my GPU voltage from 1.15 V to 1.10, 1.05 or 1.00 V. I guess 1.05 V may still be stable at stock clocks, but it could get borderline, and it would reduce dynamic power consumption to 83%. Maybe I'll try the bootable-USB-stick thing tomorrow, but I don't have a PCI card at hand...

MrS

Scanning for our furry friends since Jan 2002
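The 83% figure follows from the quadratic voltage term in the dynamic-power model above:

```python
# Undervolting from 1.15 V stock at unchanged clocks: dynamic power
# scales with (V_new / V_stock)^2.
for v in (1.10, 1.05, 1.00):
    print(f"{v:.2f} V -> {(v / 1.15) ** 2:.0%} of stock dynamic power")
# 1.10 V -> 91%, 1.05 V -> 83%, 1.00 V -> 76%
```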