GTX 750Ti Questions


tomba

Message 38344 - Posted: 7 Oct 2014, 13:59:07 UTC - in response to Message 38334.  


My factory OCed models (PNY & EVGA) are mostly running at 1307 - 1320 MHz core

Using Precision X 16, and noting the very useful "how to" in the fourth post of this link, I gradually increased the GPU clock offset till I got to +169, which gave me 1305 MHz, for a temperature increase of just 3°C plus a little more fan noise. For 3+ hours it ran and looked like it would complete in about 10 hours but it errored. I dropped the offset by 13 and it's off again.

I do bump the memory speed to 6000 MHz on the EVGAs

Now I need some help please. Precision X 16 does have a mem clock offset slider but it does not display a value. What can I use to see this value while adjusting?
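(If Precision won't show the number, one option is to read the clocks straight from the driver while you move the slider. A minimal sketch, assuming the nvidia-ml-py (pynvml) Python bindings are installed; the device index is illustrative, and NVML may report the memory clock with a different convention than Precision does:)

    import time
    import pynvml  # nvidia-ml-py; assumed installed, e.g. "pip install nvidia-ml-py"

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # 0 = first GPU; adjust if the 750Ti is not device 0

    try:
        while True:
            core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_SM)   # core clock, MHz
            mem = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)   # memory clock, MHz
            temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
            print("core %d MHz | mem %d MHz | %d C" % (core, mem, temp))
            time.sleep(1)
    except KeyboardInterrupt:
        pynvml.nvmlShutdown()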

And I just discovered a major problem with Precision X 16: it does not remember settings across boots!!
tomba
Message 38345 - Posted: 7 Oct 2014, 14:15:12 UTC - in response to Message 38344.  

And I just discovered a major problem with Precision X 16: it does not remember settings across boots!!

Found it!!!
tomba
Message 38348 - Posted: 7 Oct 2014, 14:53:28 UTC - in response to Message 38344.  

For 3+ hours it ran and looked like it would complete in about 10 hours but it errored.

Oops. It was not the 750Ti that errored! Seems that as I was upping the GPU clock offset, and doing lots of BOINC suspend/resume, the WU was given to the 770, and it was the 770 that errored! See here.
tomba
Message 38352 - Posted: 7 Oct 2014, 18:50:28 UTC - in response to Message 38344.  

Now I need some help please. Precision X 16 does have a mem clock offset slider but it does not display a value. What can I use to see this value while adjusting?

Ah! Now I see it:

[screenshot]
But the slider goes nowhere near your 6000 MHz. The max is 4752 MHz...
Beyond
Message 38353 - Posted: 7 Oct 2014, 19:05:16 UTC - in response to Message 38352.  
Last modified: 7 Oct 2014, 19:13:31 UTC

Now I need some help please. Precision X 16 does have a mem clock offset slider but it does not display a value. What can I use to see this value while adjusting?

Ah! Now I see it: But the slider goes nowhere near your 6000 MHz. The max is 4752 MHz...

In Precision it reports the memory speed as 3000 instead of 6000, just a different way of looking at it. DDR means double data rate: data is transferred on both clock edges, so the effective rate is twice the reported clock:

http://en.wikipedia.org/wiki/Double_data_rate
tomba
Message 38366 - Posted: 8 Oct 2014, 13:58:15 UTC - in response to Message 38353.  

Now I need some help please. Precision X 16 does have a mem clock offset slider but it does not display a value. What can I use to see this value while adjusting?

Ah! Now I see it: But the slider goes nowhere near your 6000 MHz. The max is 4752 MHz...

In Precision it reports the memory speed as 3000 instead of 6000, just a different way of looking at it. DDR: double data rate:

Thanks, Beyond, for putting me right again!

The first WU with GPU clock at 1305 MHz came in at 98.5% of a 660. The current WU also has mem clock at 3000/6000. Hoping to beat a 660!

ExtraTerrestrial Apes
Volunteer moderator
Message 38372 - Posted: 8 Oct 2014, 21:24:45 UTC

Hi Tomba,

you're doing well for your first attempts, and the GTX750/Ti is the perfect card for this. I suppose 1305 MHz core clock is the highest you can set? It's common for the GTX750Ti to be limited by the maximum value the software allows - most chips could go even higher. That limit is a setting in the card's BIOS, so changing it is only for advanced users.

Regarding the memory: 3000 / 6000 MHz should still be reasonable, but as others have said, the memory clock is not all that important for this card. And memory errors are hard to detect, as sometimes the data transfer rate will actually slow down instead of increasing with higher clocks (because the error-correction logic on the memory bus detects and corrects the errors). I'd first find the optimal core clock (likely the maximum one) and let it run for ~1 week like this. Afterwards you'll know the card well enough to increase the memory clock and watch for anything suspicious.
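(One concrete way to "watch for anything suspicious" is to note the run time of each completed WU together with the memory clock it ran at, then compare averages. A rough sketch, assuming a hand-maintained CSV; the file name and column names are made up for illustration:)

    import csv
    from collections import defaultdict
    from statistics import mean

    # Hypothetical hand-kept log, one row per finished WU:
    #   mem_clock_mhz,run_time_sec
    #   2700,44650
    #   3000,44890
    times = defaultdict(list)
    with open("wu_times.csv", newline="") as f:
        for row in csv.DictReader(f):
            times[int(row["mem_clock_mhz"])].append(float(row["run_time_sec"]))

    for clock in sorted(times):
        avg = mean(times[clock])
        print("%d MHz: %.2f h average over %d WUs" % (clock, avg / 3600, len(times[clock])))
    # If the higher memory clock shows a *longer* average, the bus is probably
    # correcting transfer errors: back the clock off.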

MrS
Scanning for our furry friends since Jan 2002
tomba
Message 38379 - Posted: 9 Oct 2014, 18:31:41 UTC - in response to Message 38372.  

I suppose 1305 MHz core clock is the highest you can set?

I just upped it another 13 MHz. EVGA Precision now switches every second or so between 1318, 1305, 1280, 1311 and 1305. Should I go for more?? I don't want to fry it!!

Regarding the memory: 3000 / 6000 MHz should still be reasonable

Beyond tells us that's the PNY factory clocked number, and that he's applied it to the EVGA 750Ti, so that's what I've gone with. We shall see...

Happy to have reached a milestone: 1M credits in one day :)
ExtraTerrestrial Apes
Volunteer moderator
Message 38380 - Posted: 9 Oct 2014, 18:56:15 UTC - in response to Message 38379.  

I just upped it another 13 MHz. EVGA Precision now switches every second or so between 1318, 1305, 1280, 1311 and 1305. Should I go for more?? I don't want to fry it!!

You are hitting either the power or the temperature limit. How hot is the chip? If it's too hot you can use Precision to increase the fan speed a bit, until either the noise becomes unpleasant or the clock stabilizes. If it's the power limit (I suspect it is) you could increase it by a few %, depending on the card's BIOS (Precision won't let you set more than the BIOS allows). To play it completely safe, don't do this. Realistically speaking, though, it's no problem: your card is set to 60 W maximum by default, whereas the PCIe slot alone has to be able to deliver 75 W. I think you can add at most ~6% power, or maybe nothing at all.
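(To see which limit is actually biting, one option is to ask the driver for the current throttle reasons and power draw. A minimal sketch, again assuming the nvidia-ml-py (pynvml) bindings and a driver new enough to expose these counters:)

    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # adjust the index if needed

    reasons = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(gpu)
    power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0          # NVML reports milliwatts
    limit = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0  # NVML reports milliwatts
    temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)

    print("power %.1f W of %.1f W limit, %d C" % (power, limit, temp))
    if reasons & pynvml.nvmlClocksThrottleReasonSwPowerCap:
        print("boost is being capped by the power limit")
    if reasons & pynvml.nvmlClocksThrottleReasonHwSlowdown:
        print("hardware slowdown (thermal or power brake) is active")

    pynvml.nvmlShutdown()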

If it's the power limit you should be able to push the card higher by setting even higher clock offsets. The card may not immediately switch to the higher maximum boost bin, but the voltage for the lower clock speeds will be lower, so the card will consume less power running them and should be able to clock higher on average within the same power budget. Have fun :D
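(For reference, the reasoning above follows the usual first-order rule of thumb for dynamic power, P ≈ C × V² × f: power scales with the square of the voltage but only linearly with the clock, so shaving a little voltage off each boost bin frees a disproportionate share of the ~60 W budget, which the card can then spend on a higher average clock.)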

MrS
Scanning for our furry friends since Jan 2002
Beyond
Message 38483 - Posted: 14 Oct 2014, 3:20:59 UTC - in response to Message 38338.  

Surprisingly I have found mem OC on the 750ti to have negligible impact (due to the large 2MB cache) - was only something in the order of 120 sec per 100 MHz increase for long runs of 45k sec.

Thought I'd give this another look. On an EVGA ACX OC 750Ti, going from the stock 2700 MHz memory clock to 3000 MHz consistently dropped the SDOERR_BARNAS WU times from ~12:28 to ~12:13, a 15 minute drop on average, or about 2%. Didn't measure power draw, but fan speed and GPU temperature increased by 1% and 1 °C respectively. No big deal.
skgiven
Volunteer moderator
Message 38553 - Posted: 16 Oct 2014, 21:56:49 UTC - in response to Message 38483.  
Last modified: 16 Oct 2014, 21:57:57 UTC

If you OC you should see some improvement unless you are encountering recoverable errors. With the GTX750Ti there isn't a Memory Controller bottleneck, so you're probably not going to get much out of the GDDR5. It's likely you would see a greater improvement from OC'ing the GPU core.
For cards with Memory Controller bottlenecks it's the exact opposite.
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
Trotador
Message 38564 - Posted: 18 Oct 2014, 13:07:09 UTC

A true 750Ti lover :)

http://www.gpugrid.net/show_user.php?userid=101769

tomba
Message 38575 - Posted: 20 Oct 2014, 6:51:22 UTC - in response to Message 38564.  

A true 750Ti lover :)

http://www.gpugrid.net/show_user.php?userid=101769

Blimey - 25!!

What mobo lets you run seven double-wide GPUs??
eXaPower
Message 38579 - Posted: 20 Oct 2014, 8:25:07 UTC - in response to Message 38575.  
Last modified: 20 Oct 2014, 8:25:22 UTC

A true 750Ti lover :)

http://www.gpugrid.net/show_user.php?userid=101769

Blimey - 25!!

What mobo lets you run seven double-wide GPUs??



An ASRock H61 Pro BTC was used in this six-GPU setup.


http://www.geeks3d.com/20140502/geforce-gtx-titan-vs-a-render-farm-with-six-gtx-750/