Message boards : Graphics cards (GPUs) : GPU grid??! Which GPU is supported?
**[XTBA>XTC] ZeuZ** (Joined: 15 Jul 08, Posts: 60, Credit: 108,384, RAC: 0)
Thanks. I tested two drivers, 173.14 and 177.13, no difference. I will test 169.09... maybe.
(Joined: 12 Jul 07, Posts: 100, Credit: 21,848,502, RAC: 0)
> I tested two drivers, 173.14 and 177.13, no difference

I initially had 173.14 installed (from here) but wasn't sure whether that was a CUDA driver or not, so I followed the http://www.nvidia.com/object/cuda_get.html link and it suggested the 169.09 package for my setup.
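For anyone unsure whether their installed driver actually exposes CUDA, a minimal check like the sketch below reports the driver and runtime versions. It uses the standard CUDA runtime calls cudaDriverGetVersion and cudaRuntimeGetVersion; these come from later CUDA toolkits than the 169.xx-era packages discussed above, so treat it as a general-purpose sketch rather than something specific to those drivers.

```cpp
// cuda_version_check.cu - minimal sketch; build with: nvcc cuda_version_check.cu -o cuda_version_check
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int driverVersion = 0, runtimeVersion = 0;

    // CUDA version supported by the installed display driver
    // (stays 0 if no CUDA-capable driver is loaded).
    cudaDriverGetVersion(&driverVersion);

    // CUDA version of the runtime this program was built against.
    cudaRuntimeGetVersion(&runtimeVersion);

    printf("CUDA driver version:  %d.%d\n", driverVersion / 1000, (driverVersion % 100) / 10);
    printf("CUDA runtime version: %d.%d\n", runtimeVersion / 1000, (runtimeVersion % 100) / 10);
    return 0;
}
```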
(Joined: 12 Feb 08, Posts: 11, Credit: 3,194,461, RAC: 0)
So what do you guys think? It appears my older Nvidia cards don't support CUDA. :( I'm looking for a card to put in my dedicated cruncher for this project. I think I have settled on this: ZOTAC ZT-98XES2P-FSP GeForce 9800 GTX.
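Whether a particular GeForce supports CUDA can be checked directly rather than guessed from the model number. The following is a cut-down sketch along the lines of the deviceQuery sample from the CUDA SDK (not this project's code): it lists each CUDA-capable device with its compute capability, and cards without CUDA support simply won't appear.

```cpp
// list_cuda_devices.cu - hedged sketch in the spirit of the SDK's deviceQuery sample
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // Compute capability (major.minor) identifies the hardware generation;
        // anything that shows up here at all can run CUDA kernels.
        printf("Device %d: %s, compute capability %d.%d, %d multiprocessors\n",
               i, prop.name, prop.major, prop.minor, prop.multiProcessorCount);
    }
    return 0;
}
```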
**Stefan Ledwina** (Joined: 16 Jul 07, Posts: 464, Credit: 298,573,998, RAC: 0)
The 9800 GTX shouldn't be a bad cruncher; you can see the runtimes per WU of my card in the other thread you started... But if you would like to, or are able to, spend a little bit more money on the graphics card, one of the new GTX 260 or GTX 280 cards would be much faster! For a GTX 280 you would probably also need a new PSU, because it needs an 8-pin plus a 6-pin power connector; the GTX 260 only needs two 6-pin connectors...

pixelicious.at - my little photoblog
**GDF** (Joined: 14 Mar 07, Posts: 1958, Credit: 629,356, RAC: 0)
> The 9800 GTX shouldn't be a bad cruncher; you can see the runtimes per WU of my card in the other thread you started...

A way to know which card is better for the money: compute the peak GFLOPS of the card.

Peak GFLOPS = (shader clock) × (number of stream processors) × (3 flops)

The higher, the better.

GDF
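Plugging published shader clocks into that formula gives roughly 1.688 GHz × 128 × 3 ≈ 648 GFLOPS for a 9800 GTX, 1.242 GHz × 192 × 3 ≈ 715 GFLOPS for a GTX 260 and 1.296 GHz × 240 × 3 ≈ 933 GFLOPS for a GTX 280 (take the clocks as approximate; the factor of 3 versus 2 flops per shader per clock is debated in the next post). The same arithmetic can be applied to whatever card is installed with a small sketch like this one, which assumes the 8 stream processors per multiprocessor of the G8x/G9x/GT200 generation discussed here:

```cpp
// peak_gflops.cu - rough sketch applying the peak-GFLOPS formula to device 0
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);

    // Compute capability 1.x parts (G8x/G9x/GT200) have 8 stream processors
    // per multiprocessor; newer GPUs have more, so adjust accordingly.
    int streamProcessors = prop.multiProcessorCount * 8;

    // prop.clockRate is the shader clock in kHz; use 3 flops per shader per
    // clock as in the formula above (2 is the more conservative count).
    double peakGflops = (prop.clockRate * 1e-6) * streamProcessors * 3.0;

    printf("%s: %d SPs at %.0f MHz -> ~%.0f GFLOPS peak\n",
           prop.name, streamProcessors, prop.clockRate * 1e-3, peakGflops);
    return 0;
}
```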
**[XTBA>XTC] ZeuZ** (Joined: 15 Jul 08, Posts: 60, Credit: 108,384, RAC: 0)
It's ×2 flops, I think.
(Joined: 12 Feb 08, Posts: 11, Credit: 3,194,461, RAC: 0)
Thanks guys! What I would like to know is how CPU-intensive this is. I am imagining the GPU is doing most of the work: a core is tied up with it but not running at 100% like we do with "traditional" projects. So my conjecture/question is whether the same RAC, with whatever card, can be achieved with a slightly slower (read: cheaper) CPU. I am also very keen to see whether it will work to run multiple tasks on multiple cards simultaneously.

If all my assumptions above are correct, then rather than spend more to get a faster card that ties up one of my good crunching cores, wouldn't it be cool to build a budget dedicated twin or even quad GPU cruncher? I envision two of the $200 cards (although if one had the funds, the GTX 280 is an amazing machine), a basic $50 ATX board and a $50 processor, probably a 4200+. Using Stefan's numbers that would be like 8-10k RAC a day for the price of a PS3.

Really exciting development you guys have here. I will do whatever I can to encourage this technology. And to think I thought the Cell BE was cool. ;)
**Stefan Ledwina** (Joined: 16 Jul 07, Posts: 464, Credit: 298,573,998, RAC: 0)
> Thanks guys!

Actually, the application also uses one core of the CPU at 100%, plus the GPU as a coprocessor... I really can't say how important the CPU speed is for the GPU application, but earlier tests have shown that if the app uses only 50% of a CPU core, the WUs were a good bit slower. It would be interesting to see some comparisons with the same graphics cards but other CPUs in the thread you started...

pixelicious.at - my little photoblog
**GDF** (Joined: 14 Mar 07, Posts: 1958, Credit: 629,356, RAC: 0)
> Thanks guys!

The CPU is not important at all. It appears to be using 100% of its resources just because it is polling, waiting for the accelerated kernel to finish. So any CPU should do.

We have built a machine with 6 GPU cores by putting together 3 GeForce 9800 GX2s, a 1200 W power supply, an Nvidia 780i motherboard with 3 PCI-E 16x slots and a quad-core CPU. We are looking into building another one with the GeForce 280.

GDF
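The 100% core usage GDF describes is the classic busy-wait pattern: the host thread keeps asking the GPU whether the kernel has finished instead of sleeping. The sketch below only illustrates that idea (the kernel is a stand-in, not GPUGRID's actual code):

```cpp
// busy_poll.cu - illustration of why the CPU core feeding the GPU shows 100% load
#include <cuda_runtime.h>

__global__ void dummyKernel() { /* stand-in for the real simulation kernel */ }

int main() {
    cudaStream_t stream;
    cudaStreamCreate(&stream);

    dummyKernel<<<1, 1, 0, stream>>>();   // kernel launches are asynchronous

    // Busy-wait: cudaStreamQuery returns cudaErrorNotReady while the kernel
    // is still running, so this loop spins and the core reports 100% usage
    // even though it does no useful work.
    while (cudaStreamQuery(stream) == cudaErrorNotReady) {
        /* spin */
    }

    // Alternatively, cudaStreamSynchronize combined with the blocking-sync flag
    // (cudaSetDeviceFlags(cudaDeviceScheduleBlockingSync)) lets the core sleep,
    // at the cost of a little extra latency when the kernel finishes.
    cudaStreamDestroy(stream);
    return 0;
}
```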
**Stefan Ledwina** (Joined: 16 Jul 07, Posts: 464, Credit: 298,573,998, RAC: 0)
Wow! This sounds like a nice machine! :D Want to give it away to me as a present? ;-)

pixelicious.at - my little photoblog
(Joined: 9 Jul 07, Posts: 1, Credit: 0, RAC: 0)
> Actually, the application also uses one core of the CPU at 100%, plus the GPU as a coprocessor...

Please excuse my ignorance, but does that mean the remaining cores are left free to do 'traditional' BOINC projects?
**PaladinRPG** (Joined: 1 Apr 08, Posts: 2, Credit: 4,460,665, RAC: 0)
Here's hoping this can get ported to Windows soon; I'd love to donate my new 8800 GT to the cause. :) The "100%" resource use of one CPU core doesn't interfere with other traditional BOINC projects, does it?
**GDF** (Joined: 14 Mar 07, Posts: 1958, Credit: 629,356, RAC: 0)
> Does that mean the remaining cores are left free to do 'traditional' BOINC projects?

Yes, they are.

gdf
**GDF** (Joined: 14 Mar 07, Posts: 1958, Credit: 629,356, RAC: 0)
> The "100%" resource use of one CPU core doesn't interfere with other traditional BOINC projects, does it?

No, it should not.

gdf
**Krunchin-Keith [USA]** (Joined: 17 May 07, Posts: 512, Credit: 111,288,061, RAC: 0)
> We have built a machine with 6 GPU cores by putting together 3 GeForce 9800 GX2s, a 1200 W power supply, an Nvidia 780i motherboard with 3 PCI-E 16x slots and a quad-core CPU.

Could you clarify this? Does a GPU task use only 1 CPU core and all available GPU cores (in this case 6), or does each GPU task use 1 CPU and 1 graphics card (2 GPU cores), or just 1 GPU core? So for your rig above, how many GPU tasks could run at the same time to use all the cores, how many CPUs would that use, and how many CPUs would be left for other BOINC projects?
**GDF** (Joined: 14 Mar 07, Posts: 1958, Credit: 629,356, RAC: 0)
> We have built a machine with 6 GPU cores by putting together 3 GeForce 9800 GX2s, a 1200 W power supply, an Nvidia 780i motherboard with 3 PCI-E 16x slots and a quad-core CPU.

This is tunable. At the moment we are using 1 CPU core for each GPU core. Regarding the machine, we are using it mainly outside BOINC.

gdf
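The 1-CPU-core-per-GPU mapping GDF describes is the natural way to drive several CUDA devices: each host thread binds to one device with cudaSetDevice and feeds it independently. A minimal sketch of that pattern follows; the structure and names are assumptions for illustration, not the project's actual scheduler.

```cpp
// one_thread_per_gpu.cu - sketch of one host thread driving each GPU
#include <cstdio>
#include <thread>
#include <vector>
#include <cuda_runtime.h>

__global__ void workKernel() { /* stand-in for one work unit's kernels */ }

void driveGpu(int device) {
    // All CUDA calls made on this thread after cudaSetDevice target this device,
    // so each host thread owns exactly one GPU.
    cudaSetDevice(device);
    workKernel<<<1, 1>>>();
    cudaDeviceSynchronize();   // this wait is where the feeding core spends its time
    printf("device %d finished its work unit\n", device);
}

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);   // a box with 3x 9800 GX2 reports 6 devices here

    std::vector<std::thread> workers;
    for (int i = 0; i < count; ++i)
        workers.emplace_back(driveGpu, i);
    for (auto &w : workers)
        w.join();
    return 0;
}
```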
**[FVG] bax** (Joined: 18 Jun 08, Posts: 29, Credit: 17,772,874, RAC: 0)
> news?

yes sir :-)) thx for your efforts ;-)