Credit per € / $

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 8566 - Posted: 18 Apr 2009, 14:27:32 UTC
Last modified: 29 Apr 2009, 20:53:39 UTC

Last update: 29 April 2009

A list compiled by loki giving an overview of the supported Nvidia cards and their credit-per-€ and credit-per-$ ratings.

The performance figure is theoretical GFLOPS, taken from wikipedia.de. € prices are from idealo.de, US$ prices from pricegrabber.com.

Prices: cheapest found, incl. tax and excl. shipping. Check your local prices, as they vary a lot, often from day to day.
Prices in [] are from eBay (Buy It Now, incl. shipping) and are only listed if at least 10 €/$ cheaper than the shops.

Exchange rate 16 Apr: EURUSD=X 1.3171 - 1 € = 1.32 $; USDEUR=X 0.7593 - 1 $ = 0.76 €

Power consumption and minimum system power requirements are from nvidia.com.
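
To re-derive the ratings, each row is just theoretical GFLOPS divided by price. A minimal Python sketch (numbers from the 8800 GT row below; rounding may differ in the last digit):

# Recompute the price/performance columns of the tables below.
def rating(gflops, price):
    """Price/performance in GFLOPS per currency unit."""
    return gflops / price

# Example row: Geforce 8800 GT (504 GFLOPS, 62 EUR, 99 USD)
print(f"GFLOPS/EUR: {rating(504, 62):.2f}")  # -> 8.13
print(f"GFLOPS/USD: {rating(504, 99):.2f}")  # -> 5.09 (table: 5.10)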

---------------------------------------------------------------------------------------------------------------------------------------
G92, G92b, G94, G94b

Model                   GFLOPS   €          GFLOPS/€     $          GFLOPS/$      Load     req. Power Supply
Geforce 8800 GT         504      62€        8.13         $99[80]    5.10[6.30]    105 W    400 W
Geforce 8800 GTS(512)   624      117€       5.33         $159[90]   3.92[6.93]    140 W    450 W

Geforce 9600 GSO 512    234      70€        3.34         $80        2.93          90 W     400 W
Geforce 9600 GT         312      71€        4.39         $75        4.16          59 W     300 W
Geforce 9600 GSO        396      80€        4.95         $80[70]    4.95[5.66]    105 W    400 W
Geforce 9800 GT         508      87€        5.84         $99        5.13          105 W    400 W
Geforce 9800 GTX        648      126€[107]  5.14[6.01]   $135       4.80          140 W    450 W
Geforce 9800 GX2        1052     250€[233]  4.21[4.52]   $250[205]  4.21[5.13]    197 W    580 W

Geforce GTS 250         705      100€       7.05         $137       5.15          150 W    450 W

(note:
- 9800GTX+ is similar to GTS 250
- 8800GS is similar to 9600GSO 384 MB)


GT200, GT200b - optimization bonus 41% **1

Model                  GFLOPS(+41%)   €      GFLOPS/€     $           GFLOPS/$                 Load     req. Power Supply
Geforce GTX 260        715.4(1009)    141€   5.07(7.15)   $179        4.00(5.64)               182 W    500 W
Geforce GTX 260(216)   805(1135)      156€   5.16(7.28)   $189        4.26(6.00)               190 W    500 W
Geforce GTX 275        1010.9(1424)   212€   4.77(6.73)   $250        4.04(5.70)               219 W    550 W
Geforce GTX 280        933.1(1316)    289€   3.23(4.55)   $265        3.52(4.96)               236 W    550 W
Geforce GTX 285        1062.7(1498)   287€   3.70(5.22)   $340[330]   3.13(4.41)[3.22(4.54)]   204 W    550 W
Geforce GTX 295        1788.4(2522)   406€   4.40(6.20)   $533[510]   3.36(4.74)[3.51(4.95)]   289 W    680 W


Nvidia Tesla

C1060 Computing Processor    1936(2730)   1508€   1.28(1.80)   $1300   1.49(2.10)   188 W   500 W
S1070 1U Computing Server    4320(6091)   6682€   0.65(0.91)                                800 W


The 100 series is not available for individual purchase.
---------------------------------------------------------------------------------------------------------------------------------------

**1
Let's put some numbers in here and compare these 2 WUs (1 & 2) with pretty similar names:

1. Me with 9800GTX+: 89917 s, 1944 MHz, 128 shaders -> 746.5 GFlops
2. GTX 260 Core 216: 48517 s, 1512 MHz, 216 shaders -> 979.8 GFlops

-> I need 1.853 times as long with 0.761 times the GFlops. That means for this WU each "GT200-Flop" is worth 1.41 "G92-Flops", or put another way: GT200 is 41% faster per theoretical FLOP.
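
The same derivation as a small Python sketch, assuming the usual theoretical-peak formula for these cards (3 FLOP per shader per clock):

# Reproduce the GT200-vs-G92 per-FLOP comparison above.
def gflops(shaders, clock_mhz):
    # Theoretical peak: 3 FLOP per shader per clock (MADD + MUL).
    return shaders * clock_mhz * 3 / 1000

g92_time, g92_flops = 89917, gflops(128, 1944)    # 9800GTX+     -> 746.5 GFlops
g200_time, g200_flops = 48517, gflops(216, 1512)  # GTX 260-216  -> 979.8 GFlops

advantage = (g92_time / g200_time) * (g92_flops / g200_flops)
print(f"GT200 advantage per theoretical FLOP: {advantage:.2f}x")  # -> 1.41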

ExtraTerrestrial Apes wrote:

1. The speedup of G200:
- GDF just said 10 - 15%
- based on fractal's numbers it's ~90%
- when the G200 optimizations were introduced I estimated a >30% performance advantage for G200 at the same theoretical FLOPS
- it may well be that the G200 advantage is not constant and scales with WU size, which may explain why we're seeing a much higher advantage now than in the past (with smaller WUs)
- my 9800GTX+ (705 GFLOPS) was good for ~6400 RAC, whereas a GTX 260 (805 GFLOPS) made 10 - 11k RAC prior to the recent credit adjustments
- G200 is more future-proof than G92. Both are DX 10.0, but G200 has additional features which may or may not be needed for future CUDA clients.

-> Concluding from these observations, I can say that the advantage of G200-based cards is >50% at the same theoretical FLOPS.


It would still be nice if someone could provide such performance numbers for other WUs.
Scanning for our furry friends since Jan 2002
ID: 8566

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester

Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 8567 - Posted: 18 Apr 2009, 14:58:24 UTC

And some more points to consider:

- the GT200-based cards have a higher hardware capability level, i.e. they can run future code which the G9x series cannot [it is unknown at this point whether GPU-Grid will require this capability in the future]

- more smaller cards are not always better than fewer high-end cards, even if the "purchase cost per €/$" is better: there is a certain overhead required to run a GPU, i.e. you need to provide a PC and a PCIe slot. So if you go for several 9600GSOs instead of GTX 260s you'll need about 3 times as much supporting PC hardware, which adds to the power bill and may add to the purchase cost.

- smaller cards do not always consume proportionally less power: e.g. the GTS 250 looks good in flops/€, but consumes about as much power as the (much faster) GTX 260

- under GPU-Grid the cards consume much less power than the typical power draw quoted by nVidia. As a rough guideline, the additional power draw at the wall has been measured at ~130 W for a GTX 280 and ~100 W for a GTX 260 (see the rough running-cost sketch below).

- I'm not being paid by NV for this, nor do I own any of their stock; I just want to help people make smart decisions when they're getting a new GPU :)
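
A rough running-cost sketch for those wall-power numbers; the 0.20 €/kWh tariff is a made-up example, not a figure from this thread:

# Back-of-envelope running cost for 24/7 crunching.
PRICE_PER_KWH = 0.20  # EUR - assumed example tariff; use your local one

def yearly_cost_eur(extra_watts):
    kwh_per_year = extra_watts / 1000 * 24 * 365
    return kwh_per_year * PRICE_PER_KWH

print(f"GTX 280 (~130 W at the wall): {yearly_cost_eur(130):.0f} EUR/year")  # ~228
print(f"GTX 260 (~100 W at the wall): {yearly_cost_eur(100):.0f} EUR/year")  # ~175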

MrS
Scanning for our furry friends since Jan 2002
ID: 8567

Martin Chartrand

Joined: 4 Apr 09
Posts: 13
Credit: 17,030,367
RAC: 0
Message 9401 - Posted: 6 May 2009, 21:40:41 UTC - in response to Message 8567.  

I cannot find the chart anymore, but on it my 8800GTX was said to have a G80 core, so not good for GPU crunching. But it was crunching on the GPU. Any particular reason for this behaviour?

Martin
ID: 9401

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester

Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 9448 - Posted: 7 May 2009, 20:24:17 UTC - in response to Message 9401.  

During the first months (about 3?) the G80 did run GPU-Grid, using a separate code path to work around the missing features. Later CUDA versions broke something.

MrS
Scanning for our furry friends since Jan 2002
ID: 9448

Martin Chartrand

Joined: 4 Apr 09
Posts: 13
Credit: 17,030,367
RAC: 0
Message 9449 - Posted: 7 May 2009, 20:30:44 UTC - in response to Message 9448.  
Last modified: 7 May 2009, 20:31:40 UTC

Ah OK, thanks a lot.
I now run a GTX285.
For the hardware/software maniacs out there: in my NVIDIA control panel I added seti@home enhanced 6.08, BOINC.exe and GPUGRID.
Can you actually tweak the NVIDIA control panel to maximize those three programs, or is that completely irrelevant?

Martin
ID: 9449

Martin Chartrand

Joined: 4 Apr 09
Posts: 13
Credit: 17,030,367
RAC: 0
Message 9450 - Posted: 7 May 2009, 20:36:24 UTC - in response to Message 9449.  

Hmm, there's no thread about this. Should I start a new thread, ExtraTerrestrial Apes, about maximizing software through the NVIDIA control panel?

Martin
ID: 9450

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester

Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 9523 - Posted: 9 May 2009, 11:05:29 UTC - in response to Message 9450.  

Not sure what you mean by maximising, but this is certainly the wrong thread for that. Generally there's nothing the control panel can do for CUDA. Maybe if your card runs GPU-Grid in 2D mode (check the clocks with GPU-Z), but this is not generally the case (it switches to 3D clocks automatically).

MrS
Scanning for our furry friends since Jan 2002
ID: 9523

Jonathan Figdor

Joined: 8 Sep 08
Posts: 14
Credit: 425,295,955
RAC: 0
Message 10621 - Posted: 17 Jun 2009, 5:34:01 UTC - in response to Message 9523.  

Can we update this? I have a friend building a new PC who is trying to figure out what card to get for part gaming, part crunching, in the $100-300 range.
ID: 10621

Paul D. Buck

Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Message 10626 - Posted: 17 Jun 2009, 13:52:55 UTC - in response to Message 10621.  

Can we update this? I have a friend building a new PC who is trying to figure out what card to get for part gaming, part crunching, in the $100-300 range.

What needs to be updated?

The only part that changes is the prices ... find the most productive card for the money he wants to spend. For that range any of the 200 series cards is possible.

I have 260, 280 and 295 cards and there is not a huge difference in the throughput on these cards... though there is a detectable improvement as you go up in capacity ... any will be decent performers...
ID: 10626

Jonathan Figdor

Joined: 8 Sep 08
Posts: 14
Credit: 425,295,955
RAC: 0
Message 10629 - Posted: 17 Jun 2009, 15:55:20 UTC - in response to Message 10626.  

Which is the best bang for the buck, the 275 or the 260 core 216? Or does it make sense to scale up to a 285/295? Should he wait for the GT300 architecture?
ID: 10629

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester

Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 10632 - Posted: 17 Jun 2009, 20:14:03 UTC - in response to Message 10629.  

We can't be sure about GT300 yet. It looks like an expensive monster.. certainly impressive, but we cannot assess its bang for the buck (yet).

Otherwise.. yes, we could update this. Just give me the updated numbers and I'll put them in :)
Sorry, I don't have time to search for them myself.

MrS
Scanning for our furry friends since Jan 2002
ID: 10632

Skip Da Shu

Joined: 13 Jul 09
Posts: 64
Credit: 2,922,790,120
RAC: 98
Message 11128 - Posted: 13 Jul 2009, 6:25:35 UTC
Last modified: 13 Jul 2009, 6:34:09 UTC

I found a couple of posts where a person was saying their video card could not meet the GPUGrid WU deadlines, but he doesn't say which card that is.

Does a 9600GT have sufficient power to finish the WUs in time?
- da shu @ HeliOS,
"A child's exposure to technology should never be predicated on an ability to afford it."
ID: 11128

Bymark

Joined: 23 Feb 09
Posts: 30
Credit: 5,897,921
RAC: 0
Message 11129 - Posted: 13 Jul 2009, 8:17:26 UTC - in response to Message 11128.  

I found a couple of posts where a person was saying their video card could not meet the GPUGrid WU deadlines, but he doesn't say which card that is.

Does a 9600GT have sufficient power to finish the WUs in time?


Yes, a 9600GT can, but slowly: 28+ hours for a 93-GIANNI one....

<core_client_version>6.4.7</core_client_version>
<![CDATA[
<stderr_txt>
# Using CUDA device 0
# Device 0: "GeForce 9600 GT"
# Clock rate: 1600000 kilohertz
# Total amount of global memory: 536543232 bytes
# Number of multiprocessors: 8
# Number of cores: 64
MDIO ERROR: cannot open file "restart.coor"
# Time per step: 204.018 ms
# Approximate elapsed time for entire WU: 102008.765 s
called boinc_finish

</stderr_txt>
]]>
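
The elapsed-time estimate in that log is just the per-step time scaled by the step count; a quick sketch using the two log lines:

# Sanity-check the log above: elapsed time = time per step * number of steps.
time_per_step_s = 0.204018  # "Time per step: 204.018 ms"
elapsed_s = 102008.765      # "Approximate elapsed time for entire WU"
steps = elapsed_s / time_per_step_s
print(f"~{steps:,.0f} steps, {elapsed_s / 3600:.1f} hours")  # ~500,000 steps, 28.3 h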


"Silakka"
Hello from Turku > Åbo.
ID: 11129

skgiven
Volunteer moderator
Volunteer tester

Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 12196 - Posted: 29 Aug 2009, 0:00:15 UTC - in response to Message 8566.  
Last modified: 29 Aug 2009, 0:12:26 UTC

Would still be nice if someone could provide such performance numbers for other WUs.


Recently replaced a Palit GTS 250 with a Palit GTX 260 (216), so I have some performance numbers. Details & Specs:

The GTX260 has two fans rather than the one on the GTS250. Although the GTX260 is louder, its temperature is a bit lower: 71 °C (running GPUGrid) rather than 76 °C.
The GTX260 is a bit longer too, but both would make a BIOS reset awkward, so no messing.
The 250 has VGA, DVI and HDMI, but is only Compute Capability (CC) 1.1, using the G92 core.
The 1.3 CC (G200 core) GTX260 only has two DVI ports, but I have a DVI-to-HDMI converter and a DVI-to-VGA adapter, should I ever need them.
Although the GTX260 has 27 Multiprocessors and 216 Shaders, compared to the GTS250’s 16 Multiprocessors and 128 Shaders, my system’s power usage is surprisingly similar, perhaps even slightly less for the GTX260! Surprising until I looked at the clock rates; GTS250 1.85GHz, GTX260 1.35GHz!
http://www.techpowerup.com/gpuz/6y4yp/
http://www.techpowerup.com/gpuz/hkf67/
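
A quick sketch of why the power draw can be so similar: the lower clock largely offsets the extra shaders (3 FLOP per shader per clock, clocks as read from GPU-Z above):

# Theoretical shader throughput from the GPU-Z readings above.
def gflops(shaders, clock_ghz):
    return shaders * clock_ghz * 3  # 3 FLOP per shader per clock

print(f"GTS250:     {gflops(128, 1.85):.0f} GFlops")  # ~710
print(f"GTX260-216: {gflops(216, 1.35):.0f} GFlops")  # ~875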

Apart from changing the cards, the system is identical, and granted credit was the same for both WUs (5664.88715277777):

On the GTS250 I completed the Work Unit 48-GIANNI_BINDTST001-7-100-RND7757_2 in 53268.082 s

On the GTX260 I completed the Work Unit 334-GIANNI_BIND001-7-100-RND2726_1 in 31902.258 s

The GTS250 has a Boinc GFlops rating of 84 while the GTX260 is 104, which would make the GTX almost 20% faster, going by the Boinc GFlops rating.

However, the similar work unit did not complete in 80% of the time it took the GTS250 (which would have been 42614.4656 s); it completed in 59.89% of the time. So, to turn that around, the new card was between 40 and 41% faster overall, and about 20% faster than Boinc predicted with its GFlops rating (explained by the better 1.3 CC G200 core, compared to the 1.1 CC G92 core of the GTS250).

So for the above conditions, I would say the GTX260 has a rating of 104 (125) Boinc GFlops, with the number in brackets representing the 1.3 CC/G200-comparable value (a 1.2 correction factor; +20%) relative to a 1.1 CC G92 core rated at 104 Boinc GFlops.

Perhaps these results vary with different tasks and cards?
FAQs

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 12196

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester

Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 12206 - Posted: 29 Aug 2009, 14:34:30 UTC - in response to Message 12196.  

Thanks for reporting. Your numbers actually confirm exactly what I found; you just got it wrong at the end ;)

GTS250: 84 BOINC GFlops, 53268s
GTX260: 104 BOINC GFlops, 31902s

Based on the GFlops rating the GTX 260 should have needed 53268 s * 84/104 = 43024 s. Real performance is 43024 / 31902 = 1.35 times (= +35%) faster per GFlop. So the GTX 260 would deserve a BOINC GFlops rating of 140 to represent this. These 35% are comfortably close to the 41% I determined earlier this year :)
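
The same check in a few lines of Python (times and BOINC GFlops ratings from the two posts):

# Per-GFlop comparison of the two cards.
gts250_gflops, gts250_time = 84, 53268
gtx260_gflops, gtx260_time = 104, 31902

expected_time = gts250_time * gts250_gflops / gtx260_gflops  # ~43024 s
speedup = expected_time / gtx260_time                        # ~1.35 -> +35% per GFlop
print(f"expected {expected_time:.0f} s, actual {gtx260_time} s, {speedup:.2f}x per GFlop")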

BTW: I don't recommend using BOINC GFlop ratings, as they depend on a totally arbitrary choice of code. Right now this rating scales linearly with theoretical maximum GFlops, but the numbers could change anytime, as soon as the Devs decide to use different code. On the other hand the theoretical maximum values (times correction factors, which may depend on driver and app versions) don't change over time.

MrS
Scanning for our furry friends since Jan 2002
ID: 12206

liveonc

Joined: 1 Jan 10
Posts: 292
Credit: 41,567,650
RAC: 0
Message 14814 - Posted: 30 Jan 2010, 0:16:27 UTC

If the math is done, it's a question of getting the maximum credit per $/€ invested plus the cost of powering the beast. I'm not in this for the credits, nor do I have any special interest, other than hoping that I can help the collective, because I might need some help myself someday.

If the efficiency of BOINC can be increased without running the risk of getting tonnes of errors, that would help out even more.

I try to focus on projects related to medical studies, entertainment, & non-ET research. I really hope it's not just a waste of money & power. If I wanted that, I'd rather use my PCs for games.
ID: 14814

liveonc

Joined: 1 Jan 10
Posts: 292
Credit: 41,567,650
RAC: 0
Message 14819 - Posted: 30 Jan 2010, 5:05:18 UTC - in response to Message 8566.  

Efficiency & total potential are factors, but what about initial purchase cost vs running cost? Prices swing, some can be lucky and get a good deal on a GPU, & the price of power differs from country to country. That some of us use an 85+ PSU, & that in winter the beast warms up the room, are also factors.

ID: 14819

skgiven
Volunteer moderator
Volunteer tester

Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 14844 - Posted: 30 Jan 2010, 22:22:48 UTC - in response to Message 14819.  

There are lots of factors to consider: initial purchase costs, running costs, actual contribution (points being a good indicator), reliability, even aesthetics perhaps, certainly noise, heat and other unwanted/wanted side effects!
Until the new NVidia GPUs are released, the most efficient is the GT 240. It is fairly inexpensive to purchase, cheap to run, and delivers excellent performance given its small power consumption. It does not require any extra power connectors, fits most standard cases, and is also very reliable!
Tomorrow my GT 240 will be migrating from an Opteron @ 2.2 GHz to an i7 at a presently undetermined frequency. It should see a bit more.
ID: 14844

liveonc

Joined: 1 Jan 10
Posts: 292
Credit: 41,567,650
RAC: 0
Message 14848 - Posted: 31 Jan 2010, 0:19:43 UTC - in response to Message 14844.  

24/7 operation shortens the lifespan, although over 3 years that will not be an issue if it's credits you want to generate. Mine's factory OC'd, plus slightly more. One PC running several GPUs is overall more efficient than several PCs running just one GPU.

I haven't checked out the GT 240, but does it OC well?

GPUGRID.net doesn't run SLI or Crossfire, & mixing different GPUs (as far as this n00b understands) is not an issue.
ID: 14848

skgiven
Volunteer moderator
Volunteer tester

Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 14907 - Posted: 1 Feb 2010, 18:48:50 UTC - in response to Message 14848.  

Just moved the GT 240 to an i7 to see how much it benefits from the added CPU power. I have not tried to OC it yet, as I was mainly concerned with reliability. I did underclock the Opteron from 2.2 GHz down to about 800 MHz and the GPU's task completion times rose significantly, so giving it faster CPU support should reduce task turnover times. Its low temperatures should allow it to clock reasonably well. I will try to OC it in a few days, after I have a better i7 heatsink, as the i7 is running too hot and could interfere with overclocking the GPU. It's also good to get some stock results in first.
ID: 14907