What card?

Message boards : Graphics cards (GPUs) : What card?

HTH

Message 1888 - Posted: 29 Aug 2008, 9:20:56 UTC

Hi!

a) What is the cheapest possible graphics card for crunching PS3grid.net workunits?
b) What is the best/fastest possible graphics card for crunching PS3grid.net workunits?

Thanks!

koschi
Message 1889 - Posted: 29 Aug 2008, 9:39:41 UTC - in response to Message 1888.  

> a) What is the cheapest possible graphics card for crunching PS3grid.net workunits?
> b) What is the best/fastest possible graphics card for crunching PS3grid.net workunits?

a) GeForce 8400GS - but you'd better run it in a single-core PC; otherwise the other units won't finish before the deadline, because the card is so slow. It has only a few shaders at a low clock speed.
b) GeForce GTX280 - if you can afford it.

HTH
Message 1890 - Posted: 29 Aug 2008, 9:54:09 UTC - in response to Message 1889.  

> a) GeForce 8400GS - but you'd better run it in a single-core PC; otherwise the other units won't finish before the deadline, because the card is so slow. It has only a few shaders at a low clock speed.
> b) GeForce GTX280 - if you can afford it.

Thanks.

Ok. c) What is the best possible (not too slow, not too expensive) graphics card for crunching PS3grid.net workunits?

Henri.

GDF (project administrator)
Message 1894 - Posted: 29 Aug 2008, 10:14:18 UTC - in response to Message 1890.  

> Ok. c) What is the best possible (not too slow, not too expensive) graphics card for crunching PS3grid.net workunits?

8800GT 512MB

gdf

HTH
Message 1896 - Posted: 29 Aug 2008, 10:24:36 UTC - in response to Message 1894.  
Last modified: 29 Aug 2008, 10:36:55 UTC


> 8800GT 512MB
> gdf

Thanks! :)

So, is this the correct model?

Does this card support CUDA 2? Will it work fast enough on PCI Express 1.1? I only have a PCI Express 1.1 motherboard.

Where can I get the latest drivers for that card? I have no experience with NVIDIA cards (I have an ATI card at the moment).

Henri.

TomaszPawel
Message 1901 - Posted: 29 Aug 2008, 12:00:39 UTC - in response to Message 1896.  
Last modified: 29 Aug 2008, 12:01:04 UTC


> Does this card support CUDA 2? Will it work fast enough on PCI Express 1.1? I only have a PCI Express 1.1 motherboard.
>
> Where can I get the latest drivers for that card?

I recommend the 8800 GTS 512 - 128 processors.

It is a PCI-E 2.0 card, but it should work without problems on PCI-E 1.0.

drivers: http://www.nvidia.com/object/cuda_get.html

Krunchin-Keith [USA]
Message 1903 - Posted: 29 Aug 2008, 12:17:24 UTC

I run three 8800GTs, one in each computer. They work very well, and I see little slowdown in normal Windows operation. There is some, depending on what I'm running, but mostly it's not too bad; some people have reported they can't even use their computer while it crunches here. The two at work I use all day, heavily, with little noticeable slowdown, and they are both P4-HT machines running BOINC fully on both CPU and GPU.

These cards were well priced for me, not too expensive like the high-end cards. Mine are XFX brand and single-slot wide, an important thing to consider: some computers, including all of mine, cannot take double-wide cards without sacrificing another PCI slot, which I could not do, as I have other boards and no empty slots to move them to. These are PCIe x16 2.0 cards, but that is backward compatible with PCIe x16 1.1 slots; they worked fine in my PCIe x16 1.1 slots.

Mine came with a double life-time warranty. I guess that means when I die, I get to take it with me to the afterlife ;)

The number of stream processors (shown as cores) is important; fewer processors means the application takes longer.

512MB memory would be good.

There is 1 device supporting CUDA

Device 0: "GeForce 8800 GT" (640MHz version)
Major revision number: 1
Minor revision number: 1
Total amount of global memory: 536543232 bytes
Number of multiprocessors: 14
Number of cores: 112
Total amount of constant memory: 65536 bytes
Total amount of shared memory per block: 16384 bytes
Total number of registers available per block: 8192
Warp size: 32
Maximum number of threads per block: 512
Maximum sizes of each dimension of a block: 512 x 512 x 64
Maximum sizes of each dimension of a grid: 65535 x 65535 x 1
Maximum memory pitch: 262144 bytes
Texture alignment: 256 bytes
Clock rate: 1.62 GHz
Concurrent copy and execution: Yes

Test PASSED

There is 1 device supporting CUDA

Device 0: "GeForce 8800 GT" (600MHz Version x 2)
Major revision number: 1
Minor revision number: 1
Total amount of global memory: 536543232 bytes
Number of multiprocessors: 14
Number of cores: 112
Total amount of constant memory: 65536 bytes
Total amount of shared memory per block: 16384 bytes
Total number of registers available per block: 8192
Warp size: 32
Maximum number of threads per block: 512
Maximum sizes of each dimension of a block: 512 x 512 x 64
Maximum sizes of each dimension of a grid: 65535 x 65535 x 1
Maximum memory pitch: 262144 bytes
Texture alignment: 256 bytes
Clock rate: 1.51 GHz
Concurrent copy and execution: Yes

Test PASSED
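
(For reference, the listings above are the output format of NVIDIA's deviceQuery SDK sample. A minimal sketch of the same query against the CUDA runtime API - field names are from the real cudaDeviceProp struct, error handling omitted for brevity:)

#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    int count = 0;
    cudaGetDeviceCount(&count);
    printf("There are %d device(s) supporting CUDA\n", count);

    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        printf("Device %d: \"%s\"\n", dev, prop.name);
        printf("  Compute capability: %d.%d\n", prop.major, prop.minor);
        printf("  Global memory:      %lu bytes\n", (unsigned long)prop.totalGlobalMem);
        /* G80/G92 parts have 8 cores per multiprocessor, hence 14 x 8 = 112 above */
        printf("  Multiprocessors:    %d\n", prop.multiProcessorCount);
        printf("  Warp size:          %d\n", prop.warpSize);
        printf("  Shader clock:       %.2f GHz\n", prop.clockRate / 1.0e6);  /* clockRate is in kHz */
    }
    return 0;
}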

See some of the other reports by users in other threads.

Links to downloads are on the front page. NVIDIA supports their product well on their website. Visit the CUDA Zone section for the CUDA drivers. See the FAQ section for a list of cards that are supported here.

Wolfram1
Message 1905 - Posted: 29 Aug 2008, 13:19:03 UTC - in response to Message 1901.  
Last modified: 29 Aug 2008, 13:19:31 UTC


> I recommend the 8800 GTS 512 - 128 processors.

I think the 9800GTX+ 512MB is a good (or better) solution. You have to pay 10 euros more, but you get more power, don't you? Or am I wrong?

TomaszPawel
Message 1906 - Posted: 29 Aug 2008, 13:30:51 UTC - in response to Message 1905.  


> I think the 9800GTX+ 512MB is a good (or better) solution. You have to pay 10 euros more, but you get more power, don't you? Or am I wrong?

To be honest, you can buy the 8800GTS 512 much cheaper than the 9800GTX+.

Both cards use the same G92 chip; the real difference between the two is that the 9800GTX+ has a smaller chip built on 55 nm while the 8800 is on 65 nm, and the 9800GTX+ is slightly faster thanks to additional MHz on core and memory. So on price/performance the 8800GTS is better, but on thermals the 9800GTX+ wins.

HTH
Message 1908 - Posted: 29 Aug 2008, 14:09:44 UTC

Thanks again!

One more thing: I want the card to be as silent as possible. It would be very annoying to crunch ~24/7 if the GPU fan howls all the time. So, what exact make and model is quiet enough?

Henri.

MJH
Message 1909 - Posted: 29 Aug 2008, 14:25:48 UTC - in response to Message 1908.  

> So, what exact make and model is quiet enough?

There do exist 8800-series cards with passive cooling systems (i.e. no fan), but expect to pay a premium for them!

So long as the devices conform to Nvidia's reference designs, we'd expect them to be OK for GPUGRID.

MJH

ExtraTerrestrial Apes
Message 1919 - Posted: 29 Aug 2008, 19:04:53 UTC

Regarding noise I'll refer to the thread I just created.

MrS
Scanning for our furry friends since Jan 2002

ExtraTerrestrial Apes
Message 1921 - Posted: 29 Aug 2008, 19:40:09 UTC

And let's get serious about the "most effective card" question.

Buying anything smaller than a G92-based card is not the way to go - they are not that much cheaper, but they are much slower. I'd like to know how fast a GTX260 is in the real world, because it's considerably more expensive than the 9800GTX+, the most expensive G92 card, but has about the same maximum GFlops.

Let's consider these cards: 8800GT, 8800GTS 512, 9800GTX+ and GTX280. I collected current pricing for Germany and the GFlops from the wiki (here, here and here).

Doing that, the
- 8800GT has 504 GFlops for 110€ -> 4.58 GFlops/€
- 8800GTS 512 has 624 GFlops for 130€ -> 4.80 GFlops/€
- 9800GTX+ has 705 GFlops for 155€ -> 4.54 GFlops/€
- GTX280 has 933 GFlops for 350€ -> 2.66 GFlops/€
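
To check the arithmetic, here is the list as a tiny C program (prices and GFlops exactly as quoted above; output rounds to two decimals):

#include <stdio.h>

int main(void) {
    const char  *card[]   = { "8800GT", "8800GTS 512", "9800GTX+", "GTX280" };
    const double gflops[] = { 504, 624, 705, 933 };
    const double price[]  = { 110, 130, 155, 350 };  /* EUR, German street prices, Aug 2008 */

    for (int i = 0; i < 4; ++i)
        printf("%-12s %4.0f GFlops / %3.0f EUR -> %.2f GFlops/EUR\n",
               card[i], gflops[i], price[i], gflops[i] / price[i]);
    return 0;
}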

So the 8800GTS 512 seems to be the most efficient card of these. However, you have to take into account that you also need a PC to run the card in and one CPU core. I'll use my main rig to give an example of what I mean by that.

On average my 9800GTX+ needs 44569 s/WU with the 6.41 client. That means running 24/7 it earns 3850 credits/day, or 6265 with the 6.43 client. Since I sacrifice one CPU core, I lose about 1000 credits/day (Q6600 @ 3 GHz running QMC). Therefore the net gain from running GPU-Grid is 2850 or 5265 credits/day. Assuming linear performance scaling with the GFlops rating, a 8800GTS 512 would earn 5545 credits/day, which is a net win of 4545 credits/day.
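
The same trade-off as a small C program - all figures are the ones from the paragraph above, including the linear-scaling assumption:

#include <stdio.h>

int main(void) {
    const double gtx_gflops  = 705.0, gts_gflops = 624.0;
    const double gtx_credits = 6265.0;  /* 9800GTX+ credits/day running 24/7 with the 6.43 client */
    const double cpu_cost    = 1000.0;  /* one Q6600 core lost to feeding the GPU */

    /* Assume performance scales linearly with the GFlops rating */
    const double gts_credits = gtx_credits * gts_gflops / gtx_gflops;

    printf("9800GTX+    net %.0f credits/day -> %.1f credits/day/EUR at 155 EUR\n",
           gtx_credits - cpu_cost, (gtx_credits - cpu_cost) / 155.0);
    printf("8800GTS 512 net %.0f credits/day -> %.1f credits/day/EUR at 130 EUR\n",
           gts_credits - cpu_cost, (gts_credits - cpu_cost) / 130.0);
    return 0;
}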

Therefore the 8800GTS 512 gives you 35 credits/day/€ and the 9800GTX+ 34 credits/day/€, so the 8800GTS 512 is the efficiency winner. However, are 700 credits/day worth a one-time investment of 25€ to you? Your choice... I certainly made mine ;)

Of course you could always overclock either card... but I don't think the software is that stable yet, and I'd rather have the additional speed guaranteed. Going with a 55 nm chip doesn't help much, but it doesn't hurt either.

MrS
Scanning for our furry friends since Jan 2002

Wolfram1
Message 1924 - Posted: 29 Aug 2008, 20:12:27 UTC - in response to Message 1921.  

> And let's get serious about the "most effective card" question.

It is very interesting what you wrote. I also have a Q6600 overclocked to 3 GHz, and I will buy a 9800GTX+ tomorrow.

The quota of 1 WU per CPU per day seems very small to me. In your calculation it should be 2 WUs.

ExtraTerrestrial Apes
Message 1925 - Posted: 29 Aug 2008, 20:29:09 UTC

Yes, actually I have a hard time keeping a two-day cache... but the GPU has not run dry yet :)

MrS
Scanning for our furry friends since Jan 2002

[FVG] bax
Message 1926 - Posted: 29 Aug 2008, 20:57:18 UTC - in response to Message 1925.  

Once upon a time... two weeks ago... we were all happy owners of:

# GeForce 8800 GTS - 320/640MB - 96 shader units
# GeForce 8800 GTX - 768MB - 128 shader units
# GeForce 8800 Ultra - 768MB - 128 shader units

crunching the 6.25 application on a happy Linux OS...

Do you think it is possible to make us happy again in the future? Right now we can't help the project, but we want to!

Sorry, but... I was so happy two weeks ago :-))

Stefan Ledwina
Message 1927 - Posted: 29 Aug 2008, 21:17:52 UTC - in response to Message 1921.  
Last modified: 29 Aug 2008, 21:18:13 UTC

> ...I'd like to know how fast a GTX260 is in the real world, because it's considerably more expensive than the 9800GTX+, the most expensive G92 card, but has about the same maximum GFlops.

Well, I was able to run a few tasks on my GTX 260 with an earlier app version in the first tests under Linux64, but I couldn't crunch more than one WU in a row because of driver problems, so I switched it to the Vista box...

But as for the speed comparison: my EVGA GTX 260 was as fast as my EVGA 9800 GTX SC (super clocked) - actually a little bit slower!

pixelicious.at - my little photoblog

ExtraTerrestrial Apes
Message 1929 - Posted: 29 Aug 2008, 21:40:15 UTC - in response to Message 1927.  

> But as for the speed comparison: my EVGA GTX 260 was as fast as my EVGA 9800 GTX SC (super clocked) - actually a little bit slower!

Thx! So the architectural fine-tuning of the GT200 (more registers etc.) doesn't yield any benefit for GPU-Grid (yet), and these cards therefore offer rather bad performance for the money.

If I put the numbers in for the GTX280, I get 2000 credits/day more than a 9800GTX+ for 200€ more. Not a terrible deal, but I wouldn't recommend it.

And I forgot the 9800GX2! 1 TFlops for 260€ -> 8900 credits/day, which is 1600 credits/day more than the 9800GTX+ (assuming 1000 cr/day for both CPU cores) for 100€ more. The downside of this card is that it needs both a 6-pin and an 8-pin power plug, and aftermarket cooling solutions likely won't work due to its two-chip architecture.

MrS
Scanning for our furry friends since Jan 2002

Thamir Ghaslan
Message 1931 - Posted: 30 Aug 2008, 5:15:49 UTC - in response to Message 1921.  

> Let's consider these cards: 8800GT, 8800GTS 512, 9800GTX+ and GTX280. [...] So the 8800GTS 512 seems to be the most efficient card of these.

I just sold my 8800 GS and went for a 280. I don't regret the upgrade, despite many complaints about its affordability!

Real-world benchmarks with Folding@home, PS3grid and Futuremark showed me a 3x gain since the upgrade. I sold my 8800 GS for a third of the price of the 280.

"Flops" are misleading; I think the number of stream processors plays a bigger role, and frankly, I was never a big fan of SLI.

ExtraTerrestrial Apes
Message 1932 - Posted: 30 Aug 2008, 9:17:36 UTC - in response to Message 1931.  

"Flops" are misleading, I think the number of stream processors plays a bigger role, and frankly, I was never a big fan of SLIs.



Well... no. Flops are calculated as "number of shaders" x "shader clock" x "instructions per clock per shader". The latter can be 2 (one MADD) or 3 (one MADD + one MUL), but it's constant across all G80/G90/GT200 chips. So Flops are a much better performance measure than "number of shaders" alone, because they also take the clock frequency into account.
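
As a sanity check, this formula reproduces the GFlops figures quoted earlier in the thread (shader counts and shader clocks are NVIDIA's published specs; 3 instructions per clock, i.e. MADD + MUL):

#include <stdio.h>

int main(void) {
    struct { const char *name; int shaders; double clock_ghz; } cards[] = {
        { "8800GT",      112, 1.500 },
        { "8800GTS 512", 128, 1.625 },
        { "9800GTX+",    128, 1.836 },
        { "GTX280",      240, 1.296 },
    };
    const double ipc = 3.0;  /* flops per shader per clock: dual-issue MADD (2) + MUL (1) */

    for (int i = 0; i < 4; ++i)  /* e.g. 112 x 1.5 GHz x 3 = 504 GFlops for the 8800GT */
        printf("%-12s %3d x %.3f GHz x %.0f = %4.0f GFlops\n",
               cards[i].name, cards[i].shaders, cards[i].clock_ghz, ipc,
               cards[i].shaders * cards[i].clock_ghz * ipc);
    return 0;
}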

And SLI.. yeah, just forget it for games. And for folding you'd have to disable it anyway.

MrS
Scanning for our furry friends since Jan 2002