Message boards : Graphics cards (GPUs) : Anyone tried the superclocked GT240?

MarkJ
Message 16168 - Posted: 5 Apr 2010 | 10:40:21 UTC

I notice EVGA has a superclocked version of the GT240 out. About the only difference seems to be the memory clocks and the price tag. Has anybody tried one of these?

I was looking at replacing a couple of old GTX260s (65nm) with, say, 3 of these. Not as fast, I know, with only 96 CUDA cores compared to the 216 of the GTX260, but the GTX260 is getting rather hard to find these days. Also, the NVidia drivers seem to have issues with more than 3 GPUs under Win7.
____________
BOINC blog

[boinc.at] Nowi
Message 16170 - Posted: 5 Apr 2010 | 10:55:24 UTC
Last modified: 5 Apr 2010 | 10:55:38 UTC

I don't know more about the GT240 than what I have read in articles, but I think it must be much slower than my old 8800 GT, which is the companion to my GTX260 (55nm). Crunching a "normal" WU with the 6.03 app takes about 13 to 14 hours each, compared to my GTX260 which needs 5.5 to 6.5 hours.

It's up to you whether that speed is enough.

Beyond
Message 16172 - Posted: 5 Apr 2010 | 14:26:10 UTC - in response to Message 16170.

Here are my 2 GT 240 cards, which are OCed a bit:

625MHz in Win7-64:

http://www.gpugrid.net/results.php?hostid=55407&offset=0&show_names=1&state=0

600MHz in WinXP-32:

http://www.gpugrid.net/results.php?hostid=63615&offset=0&show_names=1&state=0

That might give you an idea of the speed. Both are the GDDR5 version which is faster than GDDR3. Notice that WinXP is faster even though the card is clocked lower. Win7 is just slower in GPUGRID than WinXP. Here's one of the threads about the problem:

http://www.gpugrid.net/forum_thread.php?id=1729#14527

One thing is that Win7 runs the GPU at a lower percentage of utilization for some reason.

skgiven
Message 16174 - Posted: 5 Apr 2010 | 17:01:56 UTC - in response to Message 16172.
Last modified: 5 Apr 2010 | 17:15:42 UTC

The only thing EVGA overclocked was the memory,
http://www.legitreviews.com/article/1144/1/

I would not call 1800MHz superclocked!
1700MHz is normal. All my GT240 cards with GDDR5 are clocked at 1800MHz or more. My GPUs are clocked to between 610MHz and 640MHz (depending on the card) and the shaders are linked (about 1490 to 1500MHz). At these speeds they operate at slightly better than half the speed of a GTX260. So three would do about 55% more work than one GTX260 and use about the same power - they might cost slightly more though.
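
As a rough sanity check of that arithmetic (a minimal sketch only; the speed ratio is the estimate above and the wattages are approximate published TDPs, not measurements from these hosts):

# Three GT240s vs one GTX260, assuming a GT240 at these clocks does a bit
# over half the work of a GTX260, and using approximate board-power TDPs.
gt240_relative_speed = 0.52      # fraction of a GTX260's throughput (estimate)
gt240_tdp, gtx260_tdp = 69, 182  # watts, approximate TDPs

cards = 3
work = cards * gt240_relative_speed   # ~1.56x one GTX260
power = cards * gt240_tdp             # ~207 W
print(f"{cards} x GT240: {work:.2f}x the work of a GTX260 at ~{power} W (a GTX260 is ~{gtx260_tdp} W)")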

The setup is also important. A good CPU backing the GPU makes some difference. On my i7 I only found a 3 or 4% difference between GDDR5 and GDDR3 (when both are overclocked), but on other systems the difference was greater. I think a good CPU hides some of the shortcomings of GDDR3.
By the way, there is DDR3 as well as GDDR3! If anyone gets a DDR3 card it will be noticeably slower than a GDDR5 card.

Two GT240s in an i-7 (W7 x64) (GPU 612MHz, GDDR5 1800MHz, Shaders 1491MHz) RAC 28K.


GT240 (GPU 615, GDDR5 1824, Shaders 1498) on a Quad Opteron @ 2.1GHz

Tasks take around 11h, but there are a few that finish around 9h30min.

fractal
Message 16178 - Posted: 5 Apr 2010 | 19:47:10 UTC

The MSI GT240 I picked up for $60 after rebate a few weeks ago is also factory overclocked at the same 550/1800. I have not tried to overclock it further. The GT240 is compute capability 1.2, so it gets the 40% architecture bonus over compute capability 1.1 cards like the 8800 GT. They are good little cards, probably the best value for GPUGRID.
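
A back-of-the-envelope way to see why the GT240's 96 shaders can still beat a CC 1.1 card (a sketch only; it reads the 40% figure as a per-shader throughput advantage and ignores clock and memory differences; the 112-shader count for the 8800 GT is from NVidia's published specs):

# "Effective" shader count of the GT240 relative to CC 1.1 cards,
# using the ~40% per-shader advantage mentioned above.
gt240_shaders = 96        # compute capability 1.2
g8800gt_shaders = 112     # compute capability 1.1
cc12_bonus = 1.40

effective = gt240_shaders * cc12_bonus   # ~134 "CC 1.1 equivalent" shaders
print(f"GT240: ~{effective:.0f} effective shaders vs {g8800gt_shaders} on an 8800 GT")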

As has been mentioned, make sure you get the gddr5 version and avoid the gddr3 version.

Also, the cards lack a PCIe power plug. This is both a plus and a minus. The plus is that you don't need to wire them for power. The bad news is that they get all their power from the motherboard. Be careful that your motherboard can feed multiple PCIe slots at near full load. I seem to recall the PCIe spec saying you can feed 75 watts from a single PCIe slot and 200W total. You should be OK with three if your board is built to spec, your power supply has enough capacity on +12V1, and my memory is correct.
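
A minimal budget check along those lines (the per-slot and total limits are the recollection quoted above, and the per-card draw is an assumption, roughly a GT240's typical load draw rather than its ~69 W TDP):

# Slot-power budget check for N bus-powered GT240s.
per_slot_limit = 75       # watts per PCIe x16 slot (recalled figure)
board_total_limit = 200   # watts total from the board (recalled figure)
per_card_draw = 60        # watts, assumed typical load draw per card

cards = 3
total = cards * per_card_draw
ok = per_card_draw <= per_slot_limit and total <= board_total_limit
print(f"{cards} cards draw ~{total} W from the board -> within budget: {ok}")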

ExtraTerrestrial Apes
Message 16180 - Posted: 5 Apr 2010 | 22:34:54 UTC - in response to Message 16170.

but I think it must much more slower than my old 8800 GT


It's much faster due to the newer architecture (CUDA compute capability 1.2)

MrS
____________
Scanning for our furry friends since Jan 2002

Betting Slip
Message 16187 - Posted: 6 Apr 2010 | 9:30:55 UTC - in response to Message 16178.
Last modified: 6 Apr 2010 | 9:42:21 UTC

As has been mentioned, make sure you get the gddr5 version and avoid the gddr3 version.



Why is everyone against the DDR3 version of this card for GPUGRID?

Take a look at my machines: 3 are DDR3 and 1 is GDDR5 (quad CPU).

2 are OCed: the DDR3 one (Core2 6300) at 630 core, 740 memory and 1580 shaders as per GPU-Z, and the same for the GDDR5 one (the quad) except the memory is 2000 as per GPU-Z.
The other 2 are stock and DDR3. Don't take their RAC as proof of anything as they're all used differently.
____________
Radio Caroline, the world's most famous offshore pirate radio station.
Great music since April 1964. Support Radio Caroline Team -
Radio Caroline

skgiven
Message 16189 - Posted: 6 Apr 2010 | 13:25:46 UTC - in response to Message 16187.

Any GT240 is a good card for crunching here, but it comes down to their relative value.
If you can get a GT240 DDR3 for 20 or 30% less than a GT240 GDDR5 then it's a good deal. If there is not much price difference then the GDDR5 card would be the one to go for. But some of the cards are better designed than others, so that has to be considered too.
The best-designed card I have seen uses DDR3; it is a Gigabyte GBG016. It has a large fan compared to many cards and the GPU is steady at 640MHz. It shipped with a factory overclock of 600MHz (up from 550). I have not seen another GT240 reach that for GPU speed, but I am sure a few people will have managed it.

The Gigabyte's factory overclock offsets much of the performance loss it would otherwise have by not using GDDR5, especially when compared to a natively clocked GDDR5 card. That said, it is slightly slower than my overclocked GT240s with GDDR5.

It is presently very difficult to compare the cards when they are in different systems crunching one of many different tasks, and I think performance varies from WU to WU anyway.
All in all, I think we are only talking about a 15% performance difference between natively clocked DDR3 and GDDR5 cards. Even a highly overclocked GDDR5 card might only be 30% faster than a natively clocked DDR3 card. In my opinion the top GDDR5 cards that are 30% more expensive are not worth the money.

I picked up my last GT240 for £50 – a real bargain given that my first card was £65 and used DDR3.
It has GDDR5 and overclocks reasonably well (GPU 615MHz, GDDR5 1800MHz, Shaders 1498MHz).

Perhaps someday when I am really bored I will strip a GT240 and put silly heatsinks and fans on it (rather than the cheap tin that comes with most of them) just to see what can be got out of the card. I think that just using better adhesive and putting small heatsinks on the RAM could make a difference. In the past I found that positioning a 12cm case fan to blow directly onto a card made a lot of difference; temps dropped by over 10 deg C.
When I am overclocking in this way I try to get the temps down and then up the clocks until I come close to (but stay under) the previous temps, then tweak. For GPUGrid a good margin of error is required: there is no point overclocking a card by 15% only to find that 10% of tasks fail. You would be better off overclocking by only 10% and getting no failures.
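
The trade-off in that last paragraph is easy to put rough numbers on (a minimal sketch; it simply treats a failed task as lost GPU time and ignores resends and credit effects):

# Effective throughput of an overclock once task failures are counted,
# treating each failed task as wasted GPU time (a simplification).
def effective_speedup(oc_fraction, failure_rate):
    return (1 + oc_fraction) * (1 - failure_rate)

print(effective_speedup(0.15, 0.10))  # 15% OC, 10% failures -> ~1.035x stock
print(effective_speedup(0.10, 0.00))  # 10% OC, no failures  -> 1.10x stock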

Beyond
Message 16190 - Posted: 6 Apr 2010 | 14:27:18 UTC - in response to Message 16187.
Last modified: 6 Apr 2010 | 14:27:39 UTC

Why is everyone against DDR3 version of this card for GPUGRID?

It's just that the GDDR5 version is faster. Simply look at similar WUs on your machines and you can see that your GDDR5 version is noticeably quicker at completing them.

skgiven
Message 16238 - Posted: 10 Apr 2010 | 19:45:02 UTC - in response to Message 16190.

http://www.gigabyte.com.tw/Products/VGA/Products_Overview.aspx?ProductID=3183

- a superclocked GTX260, 96MHz faster GPU core; performs as well as a GTX275!

I guess they have stopped manufacturing GTX275 and GTX285 GPU cores so there will be no such versions?

If it only uses the same power, then in terms of power usage vs performance it would be on a par with a GT240 (but crunch around 2.5 times as many tasks)!

Pity none of the manufacturers managed to use GDDR5 (to reduce the heat & power usage) - GDDR5 prices are no longer too high. Such cards might have been useful over the last 6 months, as would GTX275 and GTX285 versions!
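
Putting rough numbers on the power-usage-vs-performance point (a minimal sketch; the 2.5x figure is the estimate above, and the wattages are approximate published TDPs rather than measured draw):

# Relative tasks-per-watt, assuming the superclocked GTX260 does ~2.5x
# a GT240's output, with approximate TDPs (GTX260 ~182 W, GT240 ~69 W).
gtx260_perf, gtx260_tdp = 2.5, 182
gt240_perf, gt240_tdp = 1.0, 69

print(gtx260_perf / gtx260_tdp)  # ~0.0137 (relative units)
print(gt240_perf / gt240_tdp)    # ~0.0145 - roughly on a par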

ExtraTerrestrial Apes
Message 16240 - Posted: 10 Apr 2010 | 20:50:17 UTC - in response to Message 16238.

GT200 doesn't know what GDDR5 is ;)

MrS
____________
Scanning for our furry friends since Jan 2002

skgiven
Message 16242 - Posted: 11 Apr 2010 | 12:49:28 UTC - in response to Message 16240.

I thought GDDR5 might have been usable if it could mimic GDDR3; with 25% higher clocks, so the card would benefit from reduced heat & power usage?

But on second thoughts, even if GT200 could be introduced to GDDR5 (perhaps by GT215), the limited heat and power benefits might not be worth the effort - higher GDDR5 latency at only 125% GDDR3 rates would actually reduce performance. The latency is only outweighed at much higher rates.

ExtraTerrestrial Apes
Message 16243 - Posted: 11 Apr 2010 | 13:14:50 UTC

NVidia would have to redesign the memory controller to support GDDR5. That's not something GPU manufacturers generally do, since they normally update their chips quite frequently anyway (refresh in 6 months, new in 1 year .. slowing down recently). If they update the silicon and card design, they can just as well update the architecture.. which, for nVidia, is Fermi. And remember: Fermi was not intended to be hot, slow and late ;)

MrS
____________
Scanning for our furry friends since Jan 2002

MarkJ
Message 16249 - Posted: 12 Apr 2010 | 12:28:27 UTC

Well I ordered a few today. Not the overclocked version, but the normal (supposedly GDDR5) version. Hopefully the motherboard can cope with 3 in the one box.
____________
BOINC blog

skgiven
Message 16285 - Posted: 13 Apr 2010 | 16:32:14 UTC - in response to Message 16249.
Last modified: 13 Apr 2010 | 16:34:00 UTC

What did you order and which board is it?
- I'm guessing you mean 3 GT240 cards.
Hope you read the Motherboard manual, and that you actually have three PCIE x16 slots.

MarkJ
Message 16296 - Posted: 14 Apr 2010 | 11:12:50 UTC - in response to Message 16285.

What did you order and which board is it?
- I'm guessing you mean 3 GT240 cards.
Hope you read the Motherboard manual, and that you actually have three PCIE x16 slots.


Yes 3 x GT240's. Evga part number 512-P3-1240-LR.

It's an ASUS P6T motherboard. It has 3 slots, each double-spaced. I gather they drop to x8 speed when there are more than 2 devices. The power supply is a 1000W Corsair.
____________
BOINC blog

Beyond
Message 16298 - Posted: 14 Apr 2010 | 12:54:12 UTC - in response to Message 16296.
Last modified: 14 Apr 2010 | 12:55:26 UTC

Yes 3 x GT240's. Evga part number 512-P3-1240-LR.

Its an ASUS P6T motherboard. It has 3 slots each double spaced. I gather they drop to x8 speed when there are more than 2 devices. Power supply is a 1000w Corsair.

The drop to 8x speed has no consequence for DC. The 1000W power supply is WAY more than you need, so you're good there. The 3 cards together won't draw much more than 200 watts. My problem in trying to put 3 GPUs on a board such as this (PCIe slots double-spaced) is that the fans are so close to the adjacent cards that airflow is dramatically reduced and the temps skyrocket. Of course waterblock cooling would solve the problem, but that has its own set of troubles. A few MBs have triple-spaced slots but they're unusual. If anyone has a solution I'd appreciate knowing about it.

skgiven
Message 16300 - Posted: 14 Apr 2010 | 15:20:58 UTC - in response to Message 16296.

You should be OK with that setup.
I tried a board with 3, but the third PCIe slot dropped to x1 and prevented any from working. Two ran on the same board, but it beeped on restart every time and occasionally restarted itself (but it was an MSI K9A2 Plat board, 790FX & SB600)! I have two on a different system working well.
Good Luck,

skgiven
Message 16303 - Posted: 14 Apr 2010 | 18:58:34 UTC - in response to Message 16300.

Beyond,
I noticed that the top card I have, which sits very close to a chipset heatsink, is about 10 degrees warmer than the lower card. I confirmed it is the heatsink/airflow by swapping the cards. Essentially the chipset heatsink is radiating heat onto the back of the card (GT240).

As the GT240 fan is on the front, it cannot cool the back down, so the chipset is heating up the GPU from the rear.

So in one respect having a fan at the back of the GPU is preferable, even if the fan is that of another GPU!

Beyond
Message 16305 - Posted: 14 Apr 2010 | 22:15:08 UTC - in response to Message 16303.

Beyond,
I noticed that the top card I have, that sits very close to a chipset heatsink, is about 10 degrees warmer than the lower card. Confirmed it is the heatsink/airflow by swapping the cards. Essentially the chipset heatsink is radiating heat onto the back of the card (GT240).

As the GT240 fan is on the front this cannot cool the back down, so the chipset is heating up the GPU from the rear.

So in one respect having a fan at the back of the GPU is therefore preferable, even if the fan is that of another GPU!

Have you actually tried 3 cards in a system where the cards are double-spaced? I have, and the temps of the middle and upper cards increased dramatically. Removing the middle card dropped the temps back to normal. I tried different cards of varied lengths and fan configurations to try to alleviate the problem, to no avail. What you're talking about is entirely different. It has nothing to do with my post / question.


skgiven
Message 16306 - Posted: 15 Apr 2010 | 10:17:50 UTC - in response to Message 16305.

As I said in my earlier post, I did try 3 but it did not work.
My K9A2 can in theory take 4 PCIe cards, but the system struggled with two GT240s, never mind four. At the time I did not have four cards to play with, but I might give it a go sometime.
I have however had 2 GPU cards plus another non-GPU card just underneath the lower GPU, and a GPU sandwiched between 2 other cards in a tight midi case.
The problem with three or more is likely to be the exhaust air being brought back around and over the cards, especially the middle card(s). This would occur when the case fans are pushing too much air over the cards and the cards cannot expel all that air, so the warm air is brought back over the cards.
I alleviated this by reversing the airflow on a side case fan (turning it around so that it blows air out from the GPUs), after installing a big front case fan. I also found that leaving back plates off helps the airflow hugely. When I had my power-greedy HD450 sandwiched between a sound card and a SATA RAID card in a midi case, I had to put a large fan blowing in from the front, and remove the back plates just to get it stable. The case had no side fan, and a standard front fan alone was not enough to run the card stably. When I left the door off the temps plummeted, but it was too noisy, so I took the back blanking plates off (a happy medium).

skgiven
Message 16314 - Posted: 15 Apr 2010 | 20:17:56 UTC - in response to Message 16306.

Tried to put 4 GT240 cards into my K9A2 motherboard.
The system boots and BOINC loads, but it only sees 2 cards. GPU-Z can see all 4 cards, as can EVGA Precision and the system, just not BOINC!

Tried the latest drivers and the BOINC beta.
Edited cc_config to use all GPUs.

Still only sees 2 of the 4 cards!

Anyone?

15/04/2010 21:05:28 Starting BOINC client version 6.10.45 for windows_x86_64
15/04/2010 21:05:28 Config: report completed tasks immediately
15/04/2010 21:05:28 Config: use all coprocessors
15/04/2010 21:05:28 log flags: file_xfer, sched_ops, task
15/04/2010 21:05:28 Libraries: libcurl/7.19.7 OpenSSL/0.9.8l zlib/1.2.3
15/04/2010 21:05:28 Data directory: C:\ProgramData\BOINC
15/04/2010 21:05:28 Running under account X
15/04/2010 21:05:28 Processor: 4 AuthenticAMD AMD Phenom(tm) II X4 940 Processor [Family 16 Model 4 Stepping 2]
15/04/2010 21:05:28 Processor: 512.00 KB cache
15/04/2010 21:05:28 Processor features: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 htt pni cx16 syscall nx lm svm sse4a osvw ibs skinit wdt page1gb rdtscp 3dnowext 3dnow
15/04/2010 21:05:28 OS: Microsoft Windows Vista: Ultimate x64 Edition, Service Pack 1, (06.00.6001.00)
15/04/2010 21:05:28 Memory: 4.00 GB physical, 8.21 GB virtual
15/04/2010 21:05:28 Disk: 931.51 GB total, 700.11 GB free
15/04/2010 21:05:28 Local time is UTC +1 hours
15/04/2010 21:05:28 NVIDIA GPU 0: GeForce GT 240 (driver version 19745, CUDA version 3000, compute capability 1.2, 475MB, 257 GFLOPS peak)
15/04/2010 21:05:28 NVIDIA GPU 1: GeForce GT 240 (driver version 19745, CUDA version 3000, compute capability 1.2, 475MB, 257 GFLOPS peak)

15/04/2010 21:12:16 Re-reading cc_config.xml
15/04/2010 21:12:16 Re-read config file
15/04/2010 21:12:16 Config: report completed tasks immediately
15/04/2010 21:12:16 Config: use all coprocessors
15/04/2010 21:12:16 log flags: file_xfer, sched_ops, task

<cc_config>
<options>
<report_results_immediately>1</report_results_immediately>
<ncpus>0</ncpus>
<use_all_gpus>1</use_all_gpus>
</options>
</cc_config>

Thanks,

skgiven
Message 16321 - Posted: 16 Apr 2010 | 9:41:05 UTC - in response to Message 16314.

Removed 1 card, leaving 3, but BOINC only sees 2. The system sees 3, as do GPU-Z, EVGA Precision and the NVidia control panel.

As it was doing nothing I removed the 3rd card as well.

Pity, as 4 GT240 cards would do the same work as a GTX295.

Any ideas and I will put them back in.

skgiven
Message 16411 - Posted: 18 Apr 2010 | 18:20:49 UTC - in response to Message 16321.

Anyone?

Betting Slip
Message 16413 - Posted: 18 Apr 2010 | 20:08:31 UTC - in response to Message 16411.

Have you tried putting a dummy load on two cards? It's all I can think of, sorry.
____________
Radio Caroline, the world's most famous offshore pirate radio station.
Great music since April 1964. Support Radio Caroline Team -
Radio Caroline

skgiven
Message 16414 - Posted: 18 Apr 2010 | 20:48:33 UTC - in response to Message 16413.

I don't really know what you mean by a dummy load?
Two cards presently work, but when I add a third or fourth they are not seen by BOINC.
Thanks,

Beyond
Message 16416 - Posted: 18 Apr 2010 | 23:02:26 UTC - in response to Message 16414.

I don't really know what you mean, dummy load?

You need to either attach them to monitors or use a dummy plug. Some KVMs also work. If they're not attached to something that looks remotely like a monitor they won't crunch WUs. Seems like a strange requirement, but true.


skgiven
Message 16433 - Posted: 19 Apr 2010 | 15:47:01 UTC - in response to Message 16416.

I will have another play about with it. Fortunately I have 2 KVMs to try. I don't have a dummy VGA plug/terminator, though I might be able to make one.
Reminds me of having to set up a dummy network port for bus networking.
Odd that the 2nd card works without any dummy connector.

skgiven
Message 16445 - Posted: 19 Apr 2010 | 23:15:49 UTC - in response to Message 16433.

Tried the KVM switches (both), but no joy.
The computer (Device Manager) can see all 4 cards, and that they have the same driver. GPU-Z can see them, as can the NVidia Control Panel, but BOINC is blind.

Paul D. Buck
Message 16446 - Posted: 20 Apr 2010 | 1:42:40 UTC - in response to Message 16445.

Tried the KVM switches (both), but no joy.
The computer (device manager) can see all 4 cards and that they have the same driver. GPUZ can see them as can NVidia Control Panel, but Boinc is blind.

When you right-click on the desktop and look at the display settings, are they all shown as active?

BOINC will not see them if they are not "active"... not only do you need the dummy or KVM, but you also need to make sure Windows "extends" the desktop to all 4 displays. If you don't like the mouse disappearing (I just learned this trick), move the second display up so that only the corner is touching display 1... (or down) ...

When I installed one secondary GPU I had this same problem: I could "see" the card, Windows said it was working, and so did the display panels and tools ... but BOINC is far more picky, in the sense that ALL the ducks have to be in a row before you can use the card.

Oh, and if they are not all identical you have to set the switch in cc_config to use all GPUs or DA's smart code will lock some of them out ...

skgiven
Message 16449 - Posted: 20 Apr 2010 | 8:57:16 UTC - in response to Message 16446.
Last modified: 20 Apr 2010 | 9:32:43 UTC

Thanks Paul, I forgot about extending the monitors after I wired up the KVM.
BOINC is now able to see 3 of the 4 cards; an improvement.

I use Vista, so for anyone else with this problem on Vista/Win7:
right-click on the desktop, Personalize, Display Settings, then extend your monitors to the other cards.

I have 7 monitors listed here! (Device Manager correctly shows 4 GT240s.)
In Display Settings there are
four "(Default Monitor) on NVIDIA GeForce GT 240",
and three "Generic Non-PnP Monitor on NVIDIA GeForce GT 240".

So, to get the fourth card working...

Paul D. Buck
Message 16458 - Posted: 20 Apr 2010 | 11:53:59 UTC

Ok, did you update cc config?

<use_all_gpus>1</use_all_gpus>

IIRC to get this flag to work you have to stop and restart BOINC...

What does BOINC show in the log for the GPU messages?
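
If it helps, a throwaway way to count what the client reported (a hypothetical snippet: boinc_startup.txt is just a text file you paste the startup messages into, not a standard BOINC file):

import re

# Count the "NVIDIA GPU n:" lines in a saved copy of the startup messages.
with open("boinc_startup.txt") as f:
    gpus = re.findall(r"NVIDIA GPU (\d+):", f.read())

print(f"BOINC reported {len(gpus)} NVIDIA GPU(s): {sorted(set(gpus))}")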

skgiven
Message 16459 - Posted: 20 Apr 2010 | 12:35:29 UTC - in response to Message 16458.

I uninstalled the drivers, restarted and reinstalled the drivers, restarted again, and then looked at the monitor extensions. Initially I could see all four cards in the list, but when I extended the monitors one of the listed devices vanished. I think this suggests that one of my connections is not working, so I probably need to get a different video cable or a dummy VGA plug.

I did restart Boinc again.

20/04/2010 11:48:51 Starting BOINC client version 6.10.45 for windows_x86_64
20/04/2010 11:48:51 Config: report completed tasks immediately
20/04/2010 11:48:51 Config: use all coprocessors
20/04/2010 11:48:51 log flags: file_xfer, sched_ops, task
20/04/2010 11:48:51 Libraries: libcurl/7.19.7 OpenSSL/0.9.8l zlib/1.2.3
20/04/2010 11:48:51 Data directory: C:\ProgramData\BOINC
20/04/2010 11:48:51 Running under account X
20/04/2010 11:48:51 Processor: 4 AuthenticAMD AMD Phenom(tm) II X4 940 Processor [Family 16 Model 4 Stepping 2]
20/04/2010 11:48:51 Processor: 512.00 KB cache
20/04/2010 11:48:51 Processor features: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 htt pni cx16 syscall nx lm svm sse4a osvw ibs skinit wdt page1gb rdtscp 3dnowext 3dnow
20/04/2010 11:48:51 OS: Microsoft Windows Vista: Ultimate x64 Edition, Service Pack 1, (06.00.6001.00)
20/04/2010 11:48:51 Memory: 4.00 GB physical, 8.17 GB virtual
20/04/2010 11:48:51 Disk: 931.51 GB total, 677.85 GB free
20/04/2010 11:48:51 Local time is UTC +1 hours
20/04/2010 11:48:51 NVIDIA GPU 0: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 257 GFLOPS peak)
20/04/2010 11:48:51 NVIDIA GPU 1: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 257 GFLOPS peak)
20/04/2010 11:48:51 NVIDIA GPU 2: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 257 GFLOPS peak)


20/04/2010 13:27:12 Re-reading cc_config.xml
20/04/2010 13:27:12 Re-read config file
20/04/2010 13:27:12 Config: report completed tasks immediately
20/04/2010 13:27:12 Config: use all coprocessors


Thanks,

Paul D. Buck
Message 16461 - Posted: 20 Apr 2010 | 13:33:47 UTC

All you really need is three 75 ohm resistors per dummy connection. You can make a dummy plug out of the DVI-to-VGA adapter included with the video cards. There are articles on the net on the pin-out. For each resistor, bend one of the leads down so both come out on the same side, clip them off even to start, find the depth of the connector and re-clip so the resistor is just above the connector when inserted.

Insert the resistor so that the body is on the "top"/wide part of the connector, because that is the "hot" side; the middle pins are ground ...

Insert the plugs, restart, and with luck the last one comes alive ...
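
For reference, the pin-out usually quoted for this trick (not from this thread, so double-check it against one of those articles before inserting anything) is a 75 ohm resistor across each colour signal and its return:

VGA pin 1 (red)   to pin 6 (red return)
VGA pin 2 (green) to pin 7 (green return)
VGA pin 3 (blue)  to pin 8 (blue return)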

I have to admit I have not tried the 4 GPU configs yet, mostly because I don't have a 4 GPU MB (yet) ... I hope in a month or so to be there ... but not yet ... :(

skgiven
Message 16464 - Posted: 20 Apr 2010 | 16:03:30 UTC - in response to Message 16461.

Finally, another cable brought Success!

20/04/2010 16:45:05 Starting BOINC client version 6.10.45 for windows_x86_64
20/04/2010 16:45:05 Config: report completed tasks immediately
20/04/2010 16:45:05 Config: use all coprocessors
20/04/2010 16:45:05 log flags: file_xfer, sched_ops, task
20/04/2010 16:45:05 Libraries: libcurl/7.19.7 OpenSSL/0.9.8l zlib/1.2.3
20/04/2010 16:45:05 Data directory: C:\ProgramData\BOINC
20/04/2010 16:45:05 Running under account X
20/04/2010 16:45:05 Processor: 4 AuthenticAMD AMD Phenom(tm) II X4 940 Processor [Family 16 Model 4 Stepping 2]
20/04/2010 16:45:05 Processor: 512.00 KB cache
20/04/2010 16:45:05 Processor features: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 htt pni cx16 syscall nx lm svm sse4a osvw ibs skinit wdt page1gb rdtscp 3dnowext 3dnow
20/04/2010 16:45:05 OS: Microsoft Windows Vista: Ultimate x64 Edition, Service Pack 1, (06.00.6001.00)
20/04/2010 16:45:05 Memory: 4.00 GB physical, 8.17 GB virtual
20/04/2010 16:45:05 Disk: 931.51 GB total, 677.68 GB free
20/04/2010 16:45:05 Local time is UTC +1 hours
20/04/2010 16:45:05 NVIDIA GPU 0: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 281 GFLOPS peak)
20/04/2010 16:45:05 NVIDIA GPU 1: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 281 GFLOPS peak)
20/04/2010 16:45:05 NVIDIA GPU 2: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 281 GFLOPS peak)
20/04/2010 16:45:05 NVIDIA GPU 3: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 281 GFLOPS peak)

Hardly superclocked (from 550MHz to 600MHz, with linked shaders and modest GDDR increases, 1800MHz), but I think it's fair to say the GT 240 has now been tried ;)

Thanks guys,

ExtraTerrestrial Apes
Message 16466 - Posted: 20 Apr 2010 | 19:39:00 UTC - in response to Message 16464.

I suppose you can crank up the shaders much more than the core and GPU-Grid reacts well to shader clock :)

MrS
____________
Scanning for our furry friends since Jan 2002

skgiven
Message 16467 - Posted: 20 Apr 2010 | 20:03:11 UTC - in response to Message 16466.

Thanks for the tip.
I will leave them as is for today, and have a go at upping the shaders tomorrow, after I pass my milestone :)
Although the case door is off, temps are between 54 and 64 deg C, so I should have a bit of leeway yet. The door also has a fan, in a very nice place, and I can add another fan to the front of the Antec, nice and low, just opposite the GPUs.
On single-card and dual-card systems I can usually get slightly over 600MHz for these cores with the shaders tied, about 615MHz or 620MHz (though some can reach 640MHz), so if the shaders clock well then perhaps they will go past 1550MHz with the core kept at 600MHz?

I was actually considering changing that system (not the case), no chance of that now!

ExtraTerrestrial Apes
Message 16483 - Posted: 21 Apr 2010 | 21:46:28 UTC - in response to Message 16467.

I wouldn't be surprised if your shaders hit 1.7 GHz :)

MrS
____________
Scanning for our furry friends since Jan 2002

skgiven
Message 16485 - Posted: 21 Apr 2010 | 23:51:34 UTC - in response to Message 16483.

For now I am keeping the shaders at a slightly more modest 1.6GHz on the system with the four GTX240 cards.

I have other systems to test on, but when I tried a DDR3 Gigabyte GT240 I could not hit 1.7 without an almost immediate failure. Mind you the GPU on that card is at 640MHz and the Shaders now seem stable at 1.65. Not bad considering the NVidia ref. rates are 550MHz and 1340MHz.

The real test is to run the cards without any task failures.

skgiven
Message 16487 - Posted: 22 Apr 2010 | 8:02:08 UTC - in response to Message 16485.

- Typo, should read GT 240 (no X).

The 4 cards ran overnight without error :)
GPU 600MHz, Shaders 1600MHz.

ExtraTerrestrial Apes
Message 16489 - Posted: 22 Apr 2010 | 9:59:58 UTC

I've got an 8600GT which runs at 540 / 1180 MHz by default. It's a 65 nm chip and easily reaches about 650 / 1700 MHz :p

MrS
____________
Scanning for our furry friends since Jan 2002

skgiven
Message 16496 - Posted: 22 Apr 2010 | 14:13:13 UTC - in response to Message 16489.

Alas the Gigabyte GT 240 failed a task when the shaders were at 1650MHz, and after 6h! The GPU was only 51 deg C, so I think the overall card cooling (capacitors and smaller chips) might have been the issue. The system was running 4 CPU tasks, so I'm sure that would not have helped; the case temps would have been a fair bit warmer as a result. I think I will leave all the cards at 1625MHz for now, to see how they get on for a few days. I may have a go at a different card in another system when I get the time.

On your 8600GT comparison,
you are comparing a seasoned GPU with 32 shaders to one with 96 shaders.
Although the 8600GT has a default clock of 540MHz with shaders at 1180, its big brother, the 8600GTS, sports a 675MHz GPU and 1450MHz shaders; so there was lots of potential there!
Was it stable crunching over long periods of time at 1.7GHz?
Did you use non-standard cooling?

If my 1625MHz shader rate stands the test of time, that is still a 21% increase :)

MarkJ
Message 16515 - Posted: 24 Apr 2010 | 8:28:16 UTC
Last modified: 24 Apr 2010 | 8:31:00 UTC

Well, the machine is finally back home. It's got 3 x GT240s in it from EVGA. Only two were recognised by Win 7 initially, until I shoved a spare KVM cable into the back of number 3.

I asked the computer shop if they have any dummy VGA plugs, but they had never heard of them. A quick google turns up plans on the net for how to make them, but I can't find anyone selling them in Australia.

Anyway it's going for now, after reinstalling Windows and setting it up the way I have all the others. I'll post some pics to the blog later.
____________
BOINC blog

Paul D. Buck
Message 16523 - Posted: 24 Apr 2010 | 17:34:53 UTC - in response to Message 16515.

I asked the computer shop if they have any dummy vga plugs, but they have never heard of them. A quick google and while there are plans on how to make them on the net, I can't find anyone selling them in Australia.

If you can get 75 ohm resistors (I got 25 for $2.00 US; 1/4 watt are fine), you can bend one of the leads over (so they both stick out the same way), clip the leads to the same length about 3/8" from the bottom of the resistor, and then plug them into the DVI-to-VGA adapter that likely came with the cards (unless they were OEM cards).

The exposed lead should go into the center-line pins (the grounds); though the likelihood of a "short" is small, there's no need to take an unnecessary risk ...

If you use 1/2 watt resistors you can make it work and the leads seat better, but the bodies of the resistors are larger and you have to kinda cram them in, so they don't fit well ...

ExtraTerrestrial Apes
Message 16540 - Posted: 25 Apr 2010 | 18:48:43 UTC

Temperature is fine, as is yours. It's actually happily running Collatz.
And, sure, it's a smaller chip.. but that doesn't really matter. What's important is the design (pretty similar), the process node (65 vs 40 nm) and the chip voltage. NVidia appears to have been very generous here on the 8600GT; it could have saved ~5W under load (with a lower voltage) and probably still reached a similar yield. On the GT240 they probably didn't "overshoot" that much with the voltage, because here the power consumption matters: they need to stay below the limit of the PCIe slot.

MrS
____________
Scanning for our furry friends since Jan 2002

skgiven
Message 16555 - Posted: 26 Apr 2010 | 12:15:13 UTC - in response to Message 16540.

For now I'm reasonably happy with what I am getting out of these cards.
As you suggest, I suspect I am being limited by the voltage, but measuring GPU heat does not tell me anything about board temperatures, so perhaps if I can reduce the heat the cards can be tweaked slightly better.

I do have one open system that I will play with further. First I would like to put heat spreaders onto the RAM, but this may not be physically possible due to the GPU heatsink. I still have to add a system fan and possibly another fan towards the back of the card, or onto the motherboard's chipset heatsink, as this would be radiating heat onto the card. I will keep an eye on the temperatures for improvement. If they drop I will try to up the clocks again and test for stability.

After that I may turn the voltage up ever so slightly and see if I can get some more from the cards' shaders, but there is no way I am going to up the voltage on the 4 cards in the same system - that's just asking for trouble!

ExtraTerrestrial Apes
Message 16570 - Posted: 26 Apr 2010 | 20:50:49 UTC - in response to Message 16555.

Don't expect much from lowering the temps - it does not affect the maximum clock speed much and your temperature is already low, i.e. there's not much room for improvement anyway. Similar for RAM heat sinks - it's seldom that they lead to any measurable improvement (logic: if they needed cooling, they'd already have it). Improved case airflow never hurts, though!

And strictly speaking: if 100% GPU fan speed keeps the chip temperature below 90°C any GPU OC is voltage limited ;)
But that doesn't mean I'd suggest increasing voltages, as it also reduces lifetime. Not sure what I'd do if I could increase it via software on my cards..

but there is no way I am going to up the voltage on the 4 cards in the same system - that's just asking for trouble!


Definitely agreed!

MrS
____________
Scanning for our furry friends since Jan 2002
