Performance of 3D Graphic @ PS3GRID

TomaszPawel

Joined: 18 Aug 08
Posts: 121
Credit: 59,836,411
RAC: 0
Message 2013 - Posted: 2 Sep 2008, 11:09:19 UTC - in response to Message 1997.  
Last modified: 2 Sep 2008, 11:11:12 UTC

Wolfram1

So you have a 9800GTX+, a Q6600 @ 3 GHz, and Vista x64 - 38000 sec.

What driver do you use? Are you crunching something on the other cores?

What FSB have you set?

ExtraTerrestrial Apes

So you have a 9800GTX+, a Q6600 @ 3 GHz, and XP SP2 - 44000 sec.

What driver do you use? Are you crunching something on the other cores?

What FSB have you set?
ID: 2013
Thamir Ghaslan

Joined: 26 Aug 08
Posts: 55
Credit: 1,475,857
RAC: 0
Message 2024 - Posted: 2 Sep 2008, 15:52:46 UTC - in response to Message 1970.  

Performance of 3D Graphic @ PS3GRID
Hi!
This is a performance guide for NVIDIA graphics cards on PS3GRID:
1. GeForce 280GTX: 25000 sec/WU
2. GeForce 260GTX: 28000 sec/WU
3. GeForce 9800GTX+: 44000 sec/WU
4. GeForce 9800GTX: 47000 sec/WU
5. GeForce 8800GTS512: 50000 sec/WU
6. GeForce 8800GT: 58000 sec/WU
7. GeForce 9600GT: 70000 sec/WU
8. GeForce 8800GS: 74000 sec/WU
These are the estimated times (+/- 2000 seconds) you should expect from these cards when they run at normal clocks. The data was taken from the statistics of users who crunch. The comparison is not 100% accurate due to differing CPU and RAM clocks, but it sheds some light on GeForce performance.
I will update and correct this as more WUs are completed with the 6.43 application.

It seems that you get the same points for every WU; the difference is how much time the computer needs to complete it. Overclocking helps :)

Also, different drivers = different performance.


I wasn't able to fully gauge how much difference there was between my 8800 GS and 280 GTX on this project, since NVIDIA support is new here, but on Folding@home I was getting close to 1000 frames per second on my 8800 and 3000 frames per second on my 280. Time to completion was also cut to a third.

I'm not a big believer in FLOPS as the ultimate truth; driver optimization, bandwidth and bus width all play a role!

I'm getting 26,000 seconds, 7 hours 20 minutes per task!
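
For anyone who wants to see the spread at a glance, here is a minimal Python sketch that just normalizes the sec/WU figures quoted above to the GTX 280. It uses only TomaszPawel's estimates (nothing new is measured) and reproduces the roughly 3x gap between the 8800 GS and the 280 GTX mentioned in this post:

# Estimated sec/WU from TomaszPawel's guide above (6.43 application)
times = {
    "GTX 280": 25000,
    "GTX 260": 28000,
    "9800 GTX+": 44000,
    "9800 GTX": 47000,
    "8800 GTS 512": 50000,
    "8800 GT": 58000,
    "9600 GT": 70000,
    "8800 GS": 74000,
}

fastest = min(times.values())  # the GTX 280
for card, sec in sorted(times.items(), key=lambda kv: kv[1]):
    # how many times longer each card needs per WU than the GTX 280
    print(f"{card:13s} {sec:6d} s/WU  {sec / fastest:.2f}x the GTX 280 time")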


ID: 2024
[SETI.USA]Tank_Master
Joined: 8 Jul 07
Posts: 85
Credit: 67,463,387
RAC: 0
Message 2025 - Posted: 2 Sep 2008, 16:25:50 UTC - in response to Message 2024.  

I'm getting 26,000 seconds, 7 hours 20 minutes per task!

On which card?

My 8800 GTS 512 takes 10.76 h with Server 2008 x64, 4 GB RAM and a Q9450 @ 3.05 GHz
ID: 2025
koschi
Joined: 14 Aug 08
Posts: 127
Credit: 913,858,161
RAC: 13
Message 2026 - Posted: 2 Sep 2008, 16:38:33 UTC

It's a GTX280, according to the information in stderr out...

@Thamir

So it looks as if the ratio here is also ~1:3 when you compare your GTX280 to the times Tomasz collected for an 8800GS, give or take a few percent...
ID: 2026
Thamir Ghaslan

Joined: 26 Aug 08
Posts: 55
Credit: 1,475,857
RAC: 0
Message 2029 - Posted: 2 Sep 2008, 18:06:40 UTC - in response to Message 2025.  

I'm getting 26,000 seconds, 7 hours 20 minutes per task!

On which card?

My 8800 GTS 512 takes 10.76 h with Server 2008 x64, 4 GB RAM and a Q9450 @ 3.05 GHz


GTX 280, Win XP 32-bit, 4 GB RAM, a Q6600 at the default 2.4 GHz, though on some days I'll overclock it to close to 3 GHz.

I could lower that 26,000 value with a little CPU and GPU overclocking, but I prefer stability over speed!

Additionally, I watch a lot of movies on my connected TV, which doesn't really use much GPU, and I occasionally spend an hour or two playing games while PS3GRID tasks run in the background.

No major performance degradation in terms of FPS. Most of my games are a year old and run at full settings @ 1024x768.

Would like to see how Crysis feels about it :P
ID: 2029
Thamir Ghaslan

Joined: 26 Aug 08
Posts: 55
Credit: 1,475,857
RAC: 0
Message 2030 - Posted: 2 Sep 2008, 18:09:24 UTC - in response to Message 2026.  

It's a GTX280, according to the information in stderr out...

@Thamir

So it looks as if the ratio here is also ~1:3 when you compare your GTX280 to the times Tomasz collected for an 8800GS, give or take a few percent...


Yes, seems that way!

Honestly, I wish people would stop whining about the price of the 280! I think it's worth it with the latest price wars!

Give or take a few bucks, if you buy three 8800 GS cards the price is close to one 280 GTX!
ID: 2030
TomaszPawel

Joined: 18 Aug 08
Posts: 121
Credit: 59,836,411
RAC: 0
Message 2035 - Posted: 2 Sep 2008, 19:24:05 UTC - in response to Message 2030.  

It's a GTX280, according to the information in stderr out...
@Thamir
Give or take a few bucks, if you buy three 8800 GS cards the price is close to one 280 GTX!


I agree. 2x 8800GTS >= 1x 280GTX, but I prefer the 280GTX... so I'm collecting $...
ID: 2035
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 2045 - Posted: 2 Sep 2008, 21:43:26 UTC - in response to Message 2013.  

Wolfram1

So you have a 9800GTX+, a Q6600 @ 3 GHz, and Vista x64 - 38000 sec.

What driver do you use? Are you crunching something on the other cores?
What FSB have you set?

ExtraTerrestrial Apes

So you have a 9800GTX+, a Q6600 @ 3 GHz, and XP SP2 - 44000 sec.

What driver do you use? Are you crunching something on the other cores?
What FSB have you set?


On average he is actually faster than 38,000. We're both on 177.92 & 6.3.10.
I'm running 3 x QMC on my other cores. FSB is set to 334 MHz with the memory at DDR2-800 5-4-4 and turbo sub-timings.

But the latter two should really play no role, as CPU speed itself is not critical for GPU-Grid, and the FSB & memory would only account for ~5% of overall CPU speed anyway. We're talking about a 20% difference here! And what runs on the other cores *should* be pretty irrelevant, because GPU-Grid gets a dedicated core. I wouldn't bet my life on it, though.

MrS
Scanning for our furry friends since Jan 2002
ID: 2045
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 2046 - Posted: 2 Sep 2008, 22:26:43 UTC

I'm not a big believer in FLOPS as the ultimate truth; driver optimization, bandwidth and bus width all play a role!


It's not a question of belief, and no one is saying they'd be the "ultimate truth".

Look, >99.9% of what happens in GPU-Grid happens in the shaders (this number is just made up by me). In all chips of the G80/G90 class the shaders are almost identical (apparently apart from this CUDA 1.0 / 1.1 issue). The code is compute bound; memory bandwidth plays a very minor role. Therefore the only thing that matters is how many shaders you have and how many clock cycles they get per second, assuming we're all running the same software. That's why FLOPS are a good guideline for comparing GPUs - they reflect the key performance factors directly.

Do I know this to be true? Well, GDF said so [that performance scales linearly with FLOPS]. But all of this rests on the assumption that the shader core is basically identical between GPUs. The developers tested with 8800GTs and a 9800GX2, where this condition is met.

However, for GT200 things are different. The shader core has been tweaked quite a bit, registers have been added, etc. That's why I asked about the performance of the GTX260: it has the same raw FLOPS as a 9800GTX+. I got the answer that performance was basically identical.

As it seems now - maybe only after the devs switched to CUDA 2.0 - the GT200-based chips are indeed way faster than their FLOPS rating would imply. Guys, this is a major finding and greatly improves the value of the GTX260/280!!

This is not a failure of FLOPS in general - you just have to understand what you're doing. It's perfectly normal for a different architecture to make better real-world use of its theoretical maximum FLOPS. It just means that GT200 and G80/90 cannot be compared based on FLOPS. But all G80/90 cards can still be compared with each other, as can all GT200-based GPUs with each other.
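
To make the raw-FLOPS guideline concrete, here is a small sketch of the usual counting for these chips (shaders x shader clock x 3 ops per clock for MAD + MUL). The shader counts and reference clocks below are my own assumptions rather than project data, but they show why the GTX260 and 9800GTX+ land at nearly the same theoretical number even though the GT200 turns out to use its FLOPS better:

# Theoretical single-precision GFLOPS = shaders * shader clock (GHz) * 3 (MAD + MUL per clock)
# Shader counts and reference shader clocks are assumed values, not measured here.
cards = {
    "9800 GTX+ (G92b)": (128, 1.836),
    "GTX 260 (GT200)":  (192, 1.242),
}

for name, (shaders, clock_ghz) in cards.items():
    gflops = shaders * clock_ghz * 3
    print(f"{name:17s} ~{gflops:4.0f} GFLOPS theoretical")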

Honestly, I wish people would stop whining about the price of the 280! I think it's worth it with the latest price wars!


I don't hear anyone whining here.

Give or take a few bucks, if you buy three 8800 GS cards the price is close to one 280 GTX!


You don't buy an 8800GS for GPU-Grid. A worthy opponent for the GTX260 would be a 9800GX2. Price-wise they're about the same, whereas the GX2 has the higher power consumption. Performance should be 50,000 - 52,000 s for each WU, with two WUs running at once. So you'd be a bit faster than the GTX260, but you'd need to sacrifice 2 CPU cores.
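
Rough throughput arithmetic behind that comparison, using the times mentioned in this thread (the GX2 per-WU figure is the estimate above, so treat it as approximate):

# Effective time per WU = wall-clock time per WU / WUs crunched in parallel
gx2_per_wu, gx2_parallel = 51000, 2        # ~50,000-52,000 s each, two GPU cores -> two WUs at once
gtx260_per_wu, gtx260_parallel = 28000, 1  # from TomaszPawel's table

print("9800 GX2:", gx2_per_wu / gx2_parallel, "s effective per WU, using 2 CPU cores")
print("GTX 260 :", gtx260_per_wu / gtx260_parallel, "s effective per WU, using 1 CPU core")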

This makes the GTX260/280 look much better than before. Thanks Tomasz!

MrS
Scanning for our furry friends since Jan 2002
ID: 2046
Pigu
Joined: 1 Sep 08
Posts: 2
Credit: 4,544,099
RAC: 0
Message 2049 - Posted: 3 Sep 2008, 5:38:18 UTC
Last modified: 3 Sep 2008, 5:38:46 UTC

How many WUs can I crunch simultaneously? Always one, or can I crunch more on better cards? Can I crunch 4x more WUs with quad SLI, or just one, faster?
ID: 2049
koschi
Joined: 14 Aug 08
Posts: 127
Credit: 913,858,161
RAC: 13
Message 2051 - Posted: 3 Sep 2008, 6:52:55 UTC

You should be able to crunch as many units in parallel as you have GPU cores. Since every GPU core currently consumes one whole CPU core, you also need at least as many CPU cores as GPUs.

So if you have a normal SLI config on a quad-core processor, you can crunch two units in parallel, leaving two CPU cores free for traditional projects.

If you can afford 3- or even 4-way SLI, say via 2 x 9800GX2, you will need all four cores of your quad-core processor to utilize the four GPU cores, but then you are crunching four units in parallel :)
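
A tiny sketch of that capacity rule as stated here (one WU per GPU core, each running WU tying up one full CPU core under the current application); the helper name is just for illustration:

def ps3grid_capacity(gpu_cores, cpu_cores):
    # One WU per GPU core, and each running WU pins one CPU core.
    parallel_wus = min(gpu_cores, cpu_cores)
    free_cpu_cores = cpu_cores - parallel_wus
    return parallel_wus, free_cpu_cores

print(ps3grid_capacity(gpu_cores=2, cpu_cores=4))  # normal SLI pair on a quad: (2, 2)
print(ps3grid_capacity(gpu_cores=4, cpu_cores=4))  # 2 x 9800GX2 on a quad: (4, 0)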
ID: 2051
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 2054 - Posted: 3 Sep 2008, 7:48:41 UTC

And don't forget to turn SLI off for GPU-Grid.

MrS
Scanning for our furry friends since Jan 2002
ID: 2054
TomaszPawel

Joined: 18 Aug 08
Posts: 121
Credit: 59,836,411
RAC: 0
Message 2097 - Posted: 5 Sep 2008, 7:48:18 UTC - in response to Message 2051.  


If you can afford 3- or even 4-way SLI, say via 2 x 9800GX2, you will need all four cores of your quad-core processor to utilize the four GPU cores, but then you are crunching four units in parallel :)


Hmmm, is anyone crunching on a 9800GX2????
ID: 2097
[AF>HFR>RR] Jim PROFIT
Joined: 3 Jun 07
Posts: 107
Credit: 31,331,137
RAC: 0
Message 2098 - Posted: 5 Sep 2008, 12:55:10 UTC - in response to Message 2097.  


If you can afford 3- or even 4-way SLI, say via 2 x 9800GX2, you will need all four cores of your quad-core processor to utilize the four GPU cores, but then you are crunching four units in parallel :)


Hmmm, is anyone crunching on a 9800GX2????


Maybe in a week... I hope!

Jim PROFIT
ID: 2098
GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist

Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Message 2099 - Posted: 5 Sep 2008, 15:28:18 UTC - in response to Message 2098.  


If you can afford 3- or even 4-way SLI, say via 2 x 9800GX2, you will need all four cores of your quad-core processor to utilize the four GPU cores, but then you are crunching four units in parallel :)


Hmmm, is anyone crunching on a 9800GX2????


Maybe in a week... I hope!

Jim PROFIT


CPU cores will soon no longer be necessary.

g
ID: 2099
Sandro
Joined: 19 Aug 08
Posts: 22
Credit: 3,660,304
RAC: 0
Message 2100 - Posted: 5 Sep 2008, 15:47:44 UTC - in response to Message 2099.  



CPU cores will soon no longer be necessary.

g

That is very good news!
Any idea how long it will take? ;)
ID: 2100
Thamir Ghaslan

Joined: 26 Aug 08
Posts: 55
Credit: 1,475,857
RAC: 0
Message 2108 - Posted: 5 Sep 2008, 21:26:32 UTC - in response to Message 1970.  

Performance of 3D Graphic @ PS3GRID
Hi!
This is a performance guide for NVIDIA graphics cards on PS3GRID:
1. GeForce 280GTX: 25000 sec/WU
2. GeForce 260GTX: 28000 sec/WU
3. GeForce 9800GTX+: 44000 sec/WU
4. GeForce 9800GTX: 47000 sec/WU
5. GeForce 8800GTS512: 50000 sec/WU
6. GeForce 8800GT: 58000 sec/WU
7. GeForce 9600GT: 70000 sec/WU
8. GeForce 8800GS: 74000 sec/WU
These are the estimated times (+/- 2000 seconds) you should expect from these cards when they run at normal clocks. The data was taken from the statistics of users who crunch. The comparison is not 100% accurate due to differing CPU and RAM clocks, but it sheds some light on GeForce performance.
I will update and correct this as more WUs are completed with the 6.43 application.

It seems that you get the same points for every WU; the difference is how much time the computer needs to complete it. Overclocking helps :)

Also, different drivers = different performance.


One more benchmark!

I ran this application 24/7 on my work computer, which is equipped with an 8400 GS.

http://www.ps3grid.net/workunit.php?wuid=38165

538,666 seconds - just over 6 days. Bad idea :P It went past the deadline, so no credits!

Will detach my work machine once I get to the office.
ID: 2108
[SETI.USA]Tank_Master
Joined: 8 Jul 07
Posts: 85
Credit: 67,463,387
RAC: 0
Message 2109 - Posted: 5 Sep 2008, 22:25:16 UTC

Of the 6 WUs I have completed so far, my average time is 37,819.59 seconds (min 37k, max 40k).

I have:
BFG GeForce 8800 GTS 512 OC (675/972) with 177.84 drivers
Server 2008 x64
4GB RAM
BOINC 6.3.10 x64
ID: 2109
koschi
Joined: 14 Aug 08
Posts: 127
Credit: 913,858,161
RAC: 13
Message 2113 - Posted: 6 Sep 2008, 0:59:25 UTC

The card is too slow; with only 16 shaders the times are as expected... It is not recommended for this project, as it has too few shaders and is therefore far too slow to crunch a work unit within the deadline.

http://www.gpugrid.net/forum_thread.php?id=316
ID: 2113
Yeti
Joined: 20 Jul 08
Posts: 3
Credit: 5,450,108,679
RAC: 3,240
Message 2115 - Posted: 6 Sep 2008, 8:12:50 UTC

It looks like I have one more card that works, but it seems to be too slow:

a Quadro NVS 290 will take 175 hours :-((


Supporting BOINC, a great concept!
ID: 2115