GPU grid??! Which GPU is supported?

Message boards : Graphics cards (GPUs) : GPU grid??! Which GPU is supported?

Profile [XTBA>XTC] ZeuZ
Joined: 15 Jul 08
Posts: 60
Credit: 108,384
RAC: 0
Message 1229 - Posted: 16 Jul 2008, 11:59:44 UTC - in response to Message 1227.  


Fedora 7 64bit
Asus 8800GS
NVIDIA-Linux-x86_64-169.09-pkg2.run


Thanks

I tested two drivers, 173.14 and 177.13; no difference.

I will test 169.09 ... maybe


Temujin
Joined: 12 Jul 07
Posts: 100
Credit: 21,848,502
RAC: 0
Message 1230 - Posted: 16 Jul 2008, 12:50:51 UTC - in response to Message 1229.  

I tested two drivers, 173.14 and 177.13, no difference

I initially had 173.14 installed (from here) but wasn't sure if that was a CUDA driver or not,
so I followed the http://www.nvidia.com/object/cuda_get.html link and it suggested the 169.09 package for my setup.

Westsail and *Pyxey*
Joined: 12 Feb 08
Posts: 11
Credit: 3,194,461
RAC: 0
Message 1235 - Posted: 17 Jul 2008, 14:25:27 UTC

So what do you guys think? It appears my older Nvidia cards don't support CUDA. :(
Looking for a card to put in my dedicated cruncher for this project.
I think I have settled on this:

ZOTAC ZT-98XES2P-FSP GeForce 9800 GTX
Profile Stefan Ledwina
Joined: 16 Jul 07
Posts: 464
Credit: 298,573,998
RAC: 0
Message 1237 - Posted: 17 Jul 2008, 14:38:36 UTC
Last modified: 17 Jul 2008, 14:40:37 UTC

The 9800 GTX shouldn't be a bad cruncher; you can see the runtimes per WU of my card in the other thread you started...

But if you would like to, or are able to, spend a little bit more money on the graphics card, one of the new GTX 260 or GTX 280 cards would be much faster!

But for a GTX 280 you would probably also need a new PSU, because it needs an 8-pin plus a 6-pin power connector - the GTX 260 only needs two 6-pin connectors...

pixelicious.at - my little photoblog
Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Message 1239 - Posted: 17 Jul 2008, 15:31:42 UTC - in response to Message 1237.  
Last modified: 17 Jul 2008, 15:32:04 UTC

The 9800 GTX shouldn't be a bad cruncher; you can see the runtimes per WU of my card in the other thread you started...

But if you would like to, or are able to, spend a little bit more money on the graphics card, one of the new GTX 260 or GTX 280 cards would be much faster!

But for a GTX 280 you would probably also need a new PSU, because it needs an 8-pin plus a 6-pin power connector - the GTX 260 only needs two 6-pin connectors...



To judge which card gives the most performance for the money, compute the peak Gflops of the card:

Peak Gflops = (shader clock) x (number of stream processors) x (3 flops)

The higher, the better.
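Worked out for the cards mentioned in this thread - a rough sketch only: the shader clocks and stream-processor counts below are the published reference specs for these cards, and the flops-per-clock factor is the one GDF quotes (ZeuZ argues for 2 instead, so it is a parameter here):

```python
def peak_gflops(shader_clock_ghz, stream_processors, flops_per_clock=3):
    """Theoretical peak: shader clock (GHz) x stream processors x flops per clock."""
    return shader_clock_ghz * stream_processors * flops_per_clock

# Reference specs (shader clock in GHz, stream processors):
cards = {
    "9800 GTX": (1.688, 128),
    "GTX 260":  (1.242, 192),   # original 192-SP version
    "GTX 280":  (1.296, 240),
}
for name, (clock, sps) in cards.items():
    print(f"{name}: {peak_gflops(clock, sps):.0f} peak Gflops")
```

Swap in `flops_per_clock=2` if you only count the MAD issue; the ranking of the cards comes out the same either way.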


GDF
Profile [XTBA>XTC] ZeuZ
Message 1241 - Posted: 17 Jul 2008, 16:11:48 UTC - in response to Message 1239.  


Peak Gflops = (shader clock)x(number of stream processors)x(3 flop)

The highest the better.
GDF


It's x2 flops, I think.
Westsail and *Pyxey*
Message 1242 - Posted: 17 Jul 2008, 16:30:03 UTC
Last modified: 17 Jul 2008, 16:38:53 UTC

Thanks guys!
What I would like to know is: how CPU-intensive is this? I imagine the GPU is doing most of the work - a core is tied up with it, but not running at 100% like we do with "traditional" projects.
So my conjecture/question is whether the same RAC, with whatever card, can be achieved with a slightly slower (read: cheaper) CPU.
I am very keen to see whether it will work to run multiple tasks on multiple cards simultaneously.
If all my assumptions above are correct, then rather than spending more on a faster card that ties up one of my good crunching cores, wouldn't it be cool to build a budget dedicated twin or even quad GPU cruncher?
I envision two of the $200 cards (although if one had the funds, the GTX 280 is an amazing machine), a basic $50 ATX board and a $50 processor, probably a 4200+. Using Stefan's numbers that would be like 8-10k RAC a day for the price of a PS3.
Really exciting development you guys have here. I will do whatever I can to encourage this technology. And to think I thought the Cell BE was cool. ;)
Profile Stefan Ledwina
Message 1244 - Posted: 17 Jul 2008, 16:56:34 UTC - in response to Message 1242.  

Thanks guys!
What I would like to know is: how CPU-intensive is this? I imagine the GPU is doing most of the work - a core is tied up with it, but not running at 100% like we do with "traditional" projects.


Actually the application also uses one core of the CPU at 100%, plus the GPU as a coprocessor...
I really can't say how important CPU speed is for the GPU application, but earlier tests have shown that if the app uses only 50% of a CPU core, the WUs were a good bit slower.

Would be interesting to see some comparisons with the same graphics cards but other CPUs in the thread you started...

pixelicious.at - my little photoblog
Profile GDF
Message 1245 - Posted: 17 Jul 2008, 17:11:48 UTC - in response to Message 1244.  
Last modified: 17 Jul 2008, 17:17:16 UTC

Thanks guys!
What I would like to know is: how CPU-intensive is this? I imagine the GPU is doing most of the work - a core is tied up with it, but not running at 100% like we do with "traditional" projects.


Actually the application also uses one core of the CPU at 100%, plus the GPU as a coprocessor...
I really can't say how important CPU speed is for the GPU application, but earlier tests have shown that if the app uses only 50% of a CPU core, the WUs were a good bit slower.

Would be interesting to see some comparisons with the same graphics cards but other CPUs in the thread you started...



The CPU is not important at all. It appears to be using 100% of resources just because it is polling, waiting for the accelerated kernel to finish.
So, any CPU should do.
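The polling behaviour GDF describes can be illustrated with a toy sketch (hypothetical code, not the actual GPUGRID application): the host thread spins on a completion flag, so the OS reports that core at 100% even though the real work happens elsewhere.

```python
import threading
import time

done = threading.Event()

def fake_gpu_kernel():
    """Stand-in for a GPU kernel: the real work happens off the host thread."""
    time.sleep(0.05)
    done.set()

threading.Thread(target=fake_gpu_kernel).start()

# Busy-poll, as the client does: check the flag in a tight loop.
# This pegs one CPU core at ~100% but picks up the result immediately.
polls = 0
while not done.is_set():
    polls += 1

print(f"kernel done after {polls} polls")
# A blocking wait - done.wait() - would leave the core idle instead,
# at the cost of a little wake-up latency.
```

This is why the 100% figure in the task manager overstates how much CPU the science actually needs.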

We have built a machine with 6 GPU cores by putting together 3 Geforce 9800 GX2, a power supply of 1200W, an Nvidia 780i motherboard with 3 PCI-E 16x slots and a quad core CPU.

We are looking into building another one with Geforce 280.

GDF
Profile Stefan Ledwina
Message 1255 - Posted: 17 Jul 2008, 20:15:56 UTC - in response to Message 1245.  



The CPU is not important at all. It appears to be using 100% of resources just because it is polling, waiting for the accelerated kernel to finish.
So, any CPU should do.

We have built a machine with 6 GPU cores by putting together 3 Geforce 9800 GX2, a power supply of 1200W, an Nvidia 780i motherboard with 3 PCI-E 16x slots and a quad core CPU.

We are looking into building another one with Geforce 280.

GDF


Wow! This sounds like a nice machine! :D
Want to give it away to me as a present? ;-)




pixelicious.at - my little photoblog
MacDitch
Joined: 9 Jul 07
Posts: 1
Credit: 0
RAC: 0
Message 1258 - Posted: 18 Jul 2008, 1:57:18 UTC - in response to Message 1244.  

Actually the application also uses one core of the CPU at 100%, plus the GPU as a coprocessor...


Please excuse my ignorance, but does that mean the remaining cores are left free to do 'traditional' BOINC projects?
Profile PaladinRPG
Joined: 1 Apr 08
Posts: 2
Credit: 4,460,665
RAC: 0
Message 1259 - Posted: 18 Jul 2008, 6:20:27 UTC

Here's to hoping this can get ported to windows soon. I'd love to donate my new 8800 GT to the cause. :)

The "100%" resource use of one CPU core doesn't interfere with other traditional BOINC projects, does it?
Profile GDF
Message 1260 - Posted: 18 Jul 2008, 6:44:55 UTC - in response to Message 1258.  

Actually the application also uses one core of the CPU at 100%, plus the GPU as a coprocessor...


Please excuse my ignorance, but does that mean the remaining cores are left free to do 'traditional' BOINC projects?



Yes, they are.

gdf
Profile GDF
Message 1261 - Posted: 18 Jul 2008, 6:47:15 UTC - in response to Message 1259.  

Here's to hoping this can get ported to windows soon. I'd love to donate my new 8800 GT to the cause. :)

The "100%" resource use of one CPU core doesn't interfere with other traditional BOINC projects, does it?


No, it should not.

gdf
Profile Krunchin-Keith [USA]
Joined: 17 May 07
Posts: 512
Credit: 111,288,061
RAC: 0
Message 1267 - Posted: 18 Jul 2008, 19:41:50 UTC - in response to Message 1245.  

We have built a machine with 6 GPU cores by putting together 3 Geforce 9800 GX2, a power supply of 1200W, an Nvidia 780i motherboard with 3 PCI-E 16x slots and a quad core CPU.

GDF

Could you clarify this?

Does a GPU task use only 1 CPU core and all available GPU cores (in this case 6)? Or does each GPU task use 1 CPU core and 1 graphics card (2 GPU cores), or just 1 GPU core?

So for your rig above, how many GPU tasks could run at the same time to use all cores, how many CPU cores would that use, and how many would be left for other BOINC projects?
Profile GDF
Message 1270 - Posted: 19 Jul 2008, 1:22:39 UTC - in response to Message 1267.  

We have built a machine with 6 GPU cores by putting together 3 Geforce 9800 GX2, a power supply of 1200W, an Nvidia 780i motherboard with 3 PCI-E 16x slots and a quad core CPU.

GDF

Could you clarify this?

Does a GPU task use only 1 CPU core and all available GPU cores (in this case 6)? Or does each GPU task use 1 CPU core and 1 graphics card (2 GPU cores), or just 1 GPU core?

So for your rig above, how many GPU tasks could run at the same time to use all cores, how many CPU cores would that use, and how many would be left for other BOINC projects?


This is tunable. At the moment we are using 1 CPU core for each GPU core. Regarding the machine, we are using it mainly outside BOINC.

gdf
Profile [FVG] bax
Joined: 18 Jun 08
Posts: 29
Credit: 17,772,874
RAC: 0
Message 1272 - Posted: 19 Jul 2008, 13:10:38 UTC - in response to Message 1210.  

news ?


For Linux64:
1) Install the latest Nvidia driver.....
GDF


yes sir :-))

thx for your efforts ;-)

©2025 Universitat Pompeu Fabra