Message boards : Graphics cards (GPUs) : real GPU power consumption
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
A very nice and detailed article. There's some nice background information (e.g. the temperature dependence of power consumption), and the main advantage is that they measure the GPU's power consumption directly, not just that of the entire system. They test with FurMark and 3DMark, though; both should generate a higher load than GPU-Grid.

MrS
Scanning for our furry friends since Jan 2002
Joined: 9 Oct 08 · Posts: 50 · Credit: 12,676,739 · RAC: 0
Thanks for sharing the article, ET. Interesting reading. I've wondered lately how much power my GPUs pull from the PSU. They put out quite a bit of heat, and I'm not sure the AC in the room will be able to keep them cool enough this summer. During the winter it's easy enough to open a window for cooling! :-)

Crunching for the benefit of humanity and in memory of my dad and other family members.
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
If you don't have a power meter, there may be a way to estimate GPU-Grid power consumption:

1. Set your fan to a fixed speed. Not a very low one; we're going to need 3D mode.
2. Note your idle temperature and check that the clock speeds are the same as mentioned in the article.
3. Do some 3DMark runs for which the article lists the real power consumption of your card. Several loops may be necessary until the temperature stabilizes.
4. Log the temperature (e.g. GPU-Z with "refresh while in background").
5. Now look up the idle and 3DMark power consumption of your card in the article.
6. Note your temperature while running GPU-Grid, giving it some time to stabilize.
7. Interpolate linearly.

To give an example:
Sure, it's only an approximation... but we're not going to argue over 10 W here, are we?

MrS
Scanning for our furry friends since Jan 2002
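The interpolation in step 7 above can be sketched in a few lines of Python. This is a minimal sketch of the idea, not anything from the article: it assumes that at a fixed fan speed the temperature rise above idle is roughly proportional to the power drawn, and all wattage and temperature figures below are invented placeholders.

```python
def estimate_gpu_power(t_idle, p_idle, t_3dmark, p_3dmark, t_gpugrid):
    """Linearly interpolate GPU power draw from temperature.

    Assumes a fixed fan speed, so the temperature rise above idle
    scales roughly linearly with the power dissipated in the GPU.
    """
    # Fraction of the idle-to-3DMark temperature rise reached under GPU-Grid
    scale = (t_gpugrid - t_idle) / (t_3dmark - t_idle)
    return p_idle + scale * (p_3dmark - p_idle)

# Placeholder numbers, NOT from the article:
# idle 45 C at 30 W, 3DMark 80 C at 130 W, GPU-Grid temperature 73 C
print(estimate_gpu_power(45, 30, 80, 130, 73))
```

With these made-up figures the GPU-Grid load sits 80% of the way between idle and the 3DMark temperature, so the estimate comes out at about 110 W, consistent with GPU-Grid loading the card somewhat less than 3DMark.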
Joined: 25 Sep 08 · Posts: 111 · Credit: 10,352,599 · RAC: 0
A few weeks ago I put my 2x GPU rig (GPUs only) on F@H for about 5 days, just to take my account there over the 1 mil mark. What I noticed with my kill-a-watt is that the F@H WUs were pulling 30 W more than when I ran GpuGrid. On that rig: GpuGrid = 360 W, F@H = 390 W. Same setup, OCs, everything... I just switched to F@H for a few days, and when I switched back to GpuGrid the power draw went back down to 360 W. Some of the F@H guys noticed a while back that some of their WUs draw more power and create higher temps. They definitely work the cards harder than some of their other WUs.
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0
And if you run seti@home on your GPUs, you should see less power draw than under GPU-Grid.

MrS
Scanning for our furry friends since Jan 2002
©2025 Universitat Pompeu Fabra