Anyone tried the superclocked GT240?

Message boards : Graphics cards (GPUs) : Anyone tried the superclocked GT240?
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 16489 - Posted: 22 Apr 2010, 9:59:58 UTC

I've got an 8600GT which runs at 540 / 1180 MHz by default. It's a 65 nm chip and easily reaches about 650 / 1700 MHz :p

MrS
Scanning for our furry friends since Jan 2002
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 16496 - Posted: 22 Apr 2010, 14:13:13 UTC - in response to Message 16489.  

Alas, the Gigabyte GT 240 failed a task with the shaders at 1650 MHz, and after 6 hours! The GPU was only 51°C, so I think the overall card cooling (capacitors and smaller chips) might have been the issue. The system was also running 4 CPU tasks, which would not have helped; the case temperature would have been a fair bit warmer as a result. I'll leave all the cards at 1625 MHz for now to see how they get on over a few days, and I may have a go at a different card in another system when I get the time.

On your 8600GT comparison:
You are comparing a seasoned GPU with 32 shaders to one with 96 shaders.
Although the 8600GT has a default core clock of 540 MHz with shaders at 1180 MHz, its big brother, the 8600GTS, sports a 675 MHz GPU and 1450 MHz shaders; so there was lots of headroom in that design!
Was it stable crunching over long periods of time at 1.7GHz?
Did you use non-standard cooling?

If my 1625MHz shader rate stands the test of time, that is still a 21% increase :)
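As a quick sanity check on the figures quoted in this post (a sketch only: the stock shader clock of 1340 MHz is the GT240 default that the quoted 21% implies, not a number stated explicitly in the thread):

```python
# Overclock headroom as a percentage increase over the stock clock,
# assuming a stock GT240 shader clock of 1340 MHz.
def oc_percent(new_mhz, stock_mhz):
    """Percentage increase of new_mhz over stock_mhz."""
    return (new_mhz / stock_mhz - 1.0) * 100.0

print(f"{oc_percent(1625, 1340):.0f}%")  # prints 21% - the stable clock above
print(f"{oc_percent(1650, 1340):.0f}%")  # prints 23% - the clock that failed after 6 h
```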
MarkJ
Volunteer moderator
Volunteer tester

Joined: 24 Dec 08
Posts: 738
Credit: 200,909,904
RAC: 0
Message 16515 - Posted: 24 Apr 2010, 8:28:16 UTC
Last modified: 24 Apr 2010, 8:31:00 UTC

Well, the machine is finally back home. It's got 3 x GT240s in it, from EVGA. Win 7 only recognised two of them initially, until I shoved a spare KVM cable into the back of number 3.

I asked the computer shop if they had any dummy VGA plugs, but they had never heard of them. A quick Google turned up plans for making one, but I can't find anyone selling them in Australia.

Anyway, it's going for now after reinstalling Windows and setting it up the way I have all the others. I'll post some pics to the blog later.
BOINC blog
Paul D. Buck

Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Message 16523 - Posted: 24 Apr 2010, 17:34:53 UTC - in response to Message 16515.  

I asked the computer shop if they had any dummy VGA plugs, but they had never heard of them. A quick Google turned up plans for making one, but I can't find anyone selling them in Australia.

If you can get 75 Ohm resistors (I got 25 for $2.00 US; 1/4 watt is fine), you can bend one of the leads over so both stick out the same way, clip the leads to the same length about 3/8" from the bottom of the resistor, and then plug them into the DVI-to-VGA adapter that likely came with the cards (unless they were OEM cards).

The exposed leads should go into the centre-row pins (the grounds). Though the likelihood of a short is small, there's no need to take a risk you don't have to ...

If you use 1/2 watt resistors you can make it work and the leads seat better, but the bodies of the resistors are larger; you have to kinda cram them in and they don't fit well ...
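For reference, the usual dummy-plug recipe bridges each analogue colour channel with one 75 Ohm resistor, from its signal pin to its return (ground) pin. A small sketch of that standard DE-15 pairing (pin numbers come from the VGA connector pinout, not from the post above):

```python
# Standard VGA (DE-15) dummy-plug wiring: one 75-ohm resistor per
# colour channel, from the signal pin to its return (ground) pin.
OHMS = 75
PAIRS = {
    1: 6,  # Red   signal -> Red   return
    2: 7,  # Green signal -> Green return
    3: 8,  # Blue  signal -> Blue  return
}

for signal, ground in PAIRS.items():
    print(f"{OHMS} ohm resistor: pin {signal} -> pin {ground}")
```

The 75 Ohm value matches the analogue video line impedance, which is why the card sees what looks like a connected monitor.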
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 16540 - Posted: 25 Apr 2010, 18:48:43 UTC

Temperature is fine, as is yours. It's actually happily running Collatz.
And, sure, it's a smaller chip.. but that doesn't really matter. What's important is the design (pretty similar), the process node (65 vs 40 nm) and the chip voltage. NVidia appears to have been very generous here on the 8600GT: it could have saved ~5 W under load with a lower voltage and probably still reached similar yields. On the GT240 they probably didn't overshoot as much with the voltage, because here the power consumption matters: they need to stay below the limit of the PCIe slot.
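As a rough illustration of that ~5 W figure (a back-of-envelope sketch only: dynamic power is taken as proportional to f·V², and the voltages and wattage below are made-up example values, not NVidia's actual numbers):

```python
# At a fixed clock f, dynamic power scales roughly with V^2, so the
# saving from a voltage drop is P * (1 - (V_new / V_old)^2).
def savings_w(power_w, v_old, v_new):
    """Dynamic-power saving (watts) from lowering core voltage at fixed clock."""
    return power_w * (1.0 - (v_new / v_old) ** 2)

# Hypothetical example: a chip drawing ~40 W dynamic, dropped from 1.15 V to 1.05 V.
print(f"{savings_w(40.0, 1.15, 1.05):.1f} W")  # prints 6.7 W - same ballpark as the ~5 W above
```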

MrS
Scanning for our furry friends since Jan 2002
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 16555 - Posted: 26 Apr 2010, 12:15:13 UTC - in response to Message 16540.  

For now I'm reasonably happy with what I am getting out of these cards.
As you suggest, I suspect I am being limited by the voltage, but measuring GPU heat does not tell me anything about board temperatures, so perhaps if I can reduce the heat the cards can be tweaked slightly further.

I do have one open system that I will play with further. First I would like to put heat spreaders onto the RAM, but this may not be physically possible due to the GPU heatsink. I still have to add a system fan, and possibly another fan towards the back of the card or onto the motherboard's chipset heatsink, as that would be radiating heat onto the card. I will keep an eye on the temperatures for improvement; if they drop I will try to up the clocks again and test for stability.

After that I may turn the voltage up ever so slightly and see if I can get some more from the card's shaders, but there is no way I am going to up the voltage on the 4 cards in the same system - that's just asking for trouble!
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 16570 - Posted: 26 Apr 2010, 20:50:49 UTC - in response to Message 16555.  

Don't expect much from lowering the temps - temperature does not affect the maximum clock speed much, and yours is already low, i.e. there's not much room for improvement anyway. The same goes for RAM heat sinks - they seldom lead to any measurable improvement (logic: if the chips needed cooling, they'd already have it). Improved case airflow never hurts, though!

And strictly speaking: if 100% GPU fan speed keeps the chip temperature below 90°C, any GPU OC is voltage limited ;)
But that doesn't mean I'd suggest increasing voltages, as it also reduces lifetime. Not sure what I'd do if I could increase it via software on my cards..

but there is no way I am going to up the voltage on the 4 cards in the same system - that's just asking for trouble!


Definitely agreed!

MrS
Scanning for our furry friends since Jan 2002


©2026 Universitat Pompeu Fabra