Gigabyte GTX 780 Ti OC (Windforce 3x) problems

Message boards : Graphics cards (GPUs) : Gigabyte GTX 780 Ti OC (Windforce 3x) problems


Profile Retvari Zoltan
Joined: 20 Jan 09
Posts: 2380
Credit: 16,897,957,044
RAC: 0
Level: Trp
Message 34569 - Posted: 3 Jan 2014, 16:23:53 UTC - in response to Message 34565.  

From your perspective it is the design of the card and I agree with your perspective. If you ask the manufacturer and present all the evidence you have uncovered, their response might be like "It's a problem with the application, that card is for gaming applications where a few errors won't be noticed. It's not for data crunching that requires high precision and reliability."

I'm aware of (and accept) the manufacturer's perspective. However, none of my previous OC cards showed such a flaw, including theirs. It's very strange that the memory clock is the one that had to be reduced to 77% of its original frequency to fix this problem. It will also be much harder to make the RMA staff at the shop where I bought this card accept this error condition, since I'm sure they test graphics cards only with games and 3D accelerator benchmarks (which show no problem at all).

Do you plan to RMA it?

No - it is working now, and the replacement card would probably have the same flaw. A better option is to sell this card to a gamer and buy a different OC card (from a different manufacturer, or the new revision of this card).

IIUC, you have had to downclock the memory below the frequency used on the standard model (not OC), yes?

Yes - the OC and the non-OC cards originally have the same memory frequency (3500MHz).
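A quick check of that 77% figure, using the clocks reported in this thread (3500MHz stock, and the roughly 2700MHz several posters ended up at) - a minimal sketch:

```python
stock_mhz = 3500   # factory memory clock of both the OC and non-OC cards
stable_mhz = 2700  # roughly where the card had to end up to run error-free

ratio = stable_mhz / stock_mhz
print(f"{stable_mhz} MHz is {ratio:.0%} of the stock {stock_mhz} MHz")
```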
ID: 34569
288larsson
Joined: 15 Apr 10
Posts: 2
Credit: 1,584,667,975
RAC: 0
Level: His
Message 34805 - Posted: 24 Jan 2014, 16:28:16 UTC

Hello Retvari Zoltan, thanks for the solution. I can run mine at 3100MHz.
ID: 34805
Profile Retvari Zoltan
Joined: 20 Jan 09
Posts: 2380
Credit: 16,897,957,044
RAC: 0
Level: Trp
Message 34810 - Posted: 24 Jan 2014, 23:34:46 UTC - in response to Message 34805.  

Hello Retvari Zoltan, thanks for the solution.

You're welcome!
Thanks go to skgiven as well.

I can run mine at 3100MHz

You've had better luck with your card than I had with mine, but that could be because you run it under Win8.1. If you ran it under WinXP, you would probably have to reduce the memory clock a little further.
If I have the time (and the guts), I'll try to replace the power-buffering capacitors around the RAM chips on my card with higher-capacity ones.
ID: 34810
a1kabear
Joined: 19 Oct 13
Posts: 15
Credit: 578,770,199
RAC: 0
Level: Lys
Message 34811 - Posted: 25 Jan 2014, 1:27:16 UTC
Last modified: 25 Jan 2014, 1:27:43 UTC

I finally got around to flashing my card down to 2700MHz memory, and now it works fine under Linux. It's a little slower than yours (by about 1,500 seconds), but I am also running World Community Grid on every other thread, and the ambient temperature here has been about 25°C recently.

So thanks for the solution :)
ID: 34811
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level: His
Message 34830 - Posted: 27 Jan 2014, 1:00:32 UTC - in response to Message 34811.  
Last modified: 27 Jan 2014, 9:46:43 UTC

When you had your GPU stripped down, did you notice what type of GDDR5 your Gigabyte Windforce 3X card was using? If it uses Hynix GDDR5 it should be R2C, but there are other types it might be: R0C, T2C, or T0C.
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 34830
Profile Retvari Zoltan
Joined: 20 Jan 09
Posts: 2380
Credit: 16,897,957,044
RAC: 0
Level: Trp
Message 34833 - Posted: 27 Jan 2014, 17:05:40 UTC - in response to Message 34830.  
Last modified: 27 Jan 2014, 17:13:09 UTC

When you had your GPU stripped down, did you notice what type of GDDR5 your Gigabyte Windforce 3X card was using? If it uses Hynix GDDR5 it should be R2C, but there are other types it might be: R0C, T2C, or T0C.

It's using 12 pieces of Hynix H5GQ2H24AFA R2C. There are 6 groups of 2 (adjacent) RAM chips. Each group has a FET along with 5 resistors and 2 capacitors between the two RAM chips it belongs to. One of the capacitors is bigger. I suspect that either this bigger capacitor is not big enough for 2 RAM chips, or the capacitors around the whole memory array are not big enough for the array. It would be nice to have the electrical schematic of this board, or at least a recommended circuit diagram from Hynix.
ID: 34833
Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level: Ile
Message 34834 - Posted: 27 Jan 2014, 18:37:08 UTC - in response to Message 34833.  

Have you checked Gigabyte's website for an errata sheet and/or correction sheet dealing with this issue? I suspect many hundreds of their customers are having the same issue for exactly the same reason (a wrong or failed component such as a cap or resistor), and I would think Gigabyte is aware of the problem by now. If not, then the more people who report it and the workaround (downclocking the memory), the sooner they will become aware that they have a big problem, and a potential big blow to their reputation, and act on the issue.

BOINC <<--- credit whores, pedants, alien hunters
ID: 34834
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level: His
Message 34849 - Posted: 30 Jan 2014, 21:55:49 UTC - in response to Message 34833.  
Last modified: 1 Feb 2014, 19:17:41 UTC

H5GQ2H24AFA R2C or,
H5GQ2H24AFR R2C ?

Assuming H5GQ2H24AFR R2C, these require 1.6V to support 3.5GHz.

Excluding the possibility of bad GDDR5 and bad circuitry (which we can do nothing about anyway), my guess is that the card isn't supplying the necessary 1.6V, and is either supplying 1.5V or 1.35V - possibly 1.5V for some people and 1.35V for others; with 1.35V perhaps being sufficient for 2.7GHz and 1.5V sufficient for 3.1GHz. This ties in with what has been reported here and suggests a firmware, driver or OS issue.
This might be related to the performance levels of the GPU (0, 1, 2, 3, 4) which might behave differently for GK110 than GK104.
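skgiven's hypothesis above can be summarized as a voltage-to-clock table. This is a sketch only - the pairings are his guesses from this thread, not measured values:

```python
# Hypothesized GDDR5 supply voltage -> highest stable memory clock,
# as guessed in the post above (not measured values)
voltage_to_clock_mhz = {
    1.60: 3500,  # what the H5GQ2H24AFR R2C chips are specified to need
    1.50: 3100,  # would explain 288larsson's stable 3100MHz
    1.35: 2700,  # would explain the ~2700MHz others had to drop to
}

for volts, mhz in sorted(voltage_to_clock_mhz.items()):
    print(f"{volts:.2f} V -> stable up to ~{mhz} MHz")
```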
ID: 34849
Profile Gattorantolo [Ticino]
Joined: 29 Dec 11
Posts: 44
Credit: 251,211,525
RAC: 0
Level: Asn
Message 34909 - Posted: 5 Feb 2014, 6:11:23 UTC - in response to Message 34849.  
Last modified: 5 Feb 2014, 6:12:18 UTC

How is it possible that your crunching times, Zoltan, are about 16,000 sec on the last WUs? I have the same 4 GPUs as you (GTX780Ti), but my crunching time is 24,000 sec. Why?
Member of Boinc Italy.
ID: 34909
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level: His
Message 34911 - Posted: 5 Feb 2014, 11:56:26 UTC - in response to Message 34909.  
Last modified: 5 Feb 2014, 11:57:30 UTC

Your GTX780Ti temperatures look very cool - too cool. If you are not using water cooling, your GPUs may be downclocking.

Win XP vs Win8.1 - XP and Linux were ~12.5% faster for a GTX770 last time I looked, and the gap might be larger for a GTX780Ti.

48 CPU threads vs 8 threads - less resource conflict; HT doesn't scale well for some CPU/GPU project combinations.

i7-4770K CPU @ 3.50GHz vs E5-2695 @ 2.4GHz - 46% faster stock cores (and his might be overclocked).

2 GPUs vs 4 GPUs - less demand on the PCIE bus and CPU, so the cards are likely to run cooler and hold higher clocks. They might be overclocked too.

Your use of the CPU is somewhat unknown, as are your settings.

He probably has faster RAM too.
ID: 34911
TJ
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level: Met
Message 34916 - Posted: 5 Feb 2014, 15:51:49 UTC - in response to Message 34909.  

How is it possible that your crunching times, Zoltan, are about 16,000 sec on the last WUs? I have the same 4 GPUs as you (GTX780Ti), but my crunching time is 24,000 sec. Why?

You have the same hardware I have. To get Zoltan's times you need XP or Linux.
There is a thread about it: http://www.gpugrid.net/forum_thread.php?id=3580
Greetings from TJ
ID: 34916
Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level: Ile
Message 34918 - Posted: 5 Feb 2014, 16:22:37 UTC - in response to Message 34911.  
Last modified: 5 Feb 2014, 16:26:11 UTC

48 CPU threads vs 8 threads - less resource conflict; HT doesn't scale well for some CPU/GPU project combinations.

I gasped when I saw 48 too, but it turns out his CPU has 12 real cores (24 virtual), and I suspect he might have 2 CPUs. Also, that's a socket 2011 CPU with 40 PCIe lanes, so I doubt there is congestion on the PCIe bus unless his mobo is lane-restricted.

BTW, I saw ads for that CPU... $2,500 US!!!

Gattorantolo... get with the penguin :-)
ID: 34918
Profile Retvari Zoltan
Joined: 20 Jan 09
Posts: 2380
Credit: 16,897,957,044
RAC: 0
Level: Trp
Message 34923 - Posted: 5 Feb 2014, 17:44:49 UTC - in response to Message 34918.  

48 CPU threads vs 8 threads - less resource conflict; HT doesn't scale well for some CPU/GPU project combinations.

I gasped when I saw 48 too, but it turns out his CPU has 12 real cores (24 virtual), and I suspect he might have 2 CPUs. Also, that's a socket 2011 CPU with 40 PCIe lanes, so I doubt there is congestion on the PCIe bus unless his mobo is lane-restricted.

BTW, I saw ads for that CPU... $2,500 US!!!

Gattorantolo... get with the penguin :-)

The memory bandwidth could limit the performance of the GPU tasks, as the Xeon E5-2695v2 and E5-2697v2 processors are basically two i7-4960X processors within a single package: they have a lowered clock frequency (to stay within the 130W TDP) and some (safety) features turned on, but only the same 4-channel memory interface as the i7-4960X, capable of 59.7GB/s. So the two CPU chips inside the physical CPU share its memory interface (and its bandwidth).
If the two physical processors also share this memory interface, that could reduce the bandwidth further. But I think the two physical processors have separate physical memory, yet they can access each other's memory through QPI (or some similar link), and this method can also reduce the throughput of the memory interface. If CPU tasks are running on all cores (virtual + real) of both physical CPUs, this impact could be big enough to reduce the performance of the GPU tasks by 10-20%. These hosts run Windows 8(.1), which is not a server OS, so I'm sure it is not aware of aligning an application's memory allocation to the physical memory of the CPU it is running on. The application can even be switched over to the other physical CPU, which is a time-consuming process and forces the CPU to handle the application's data transfers through (and with the help of) the other physical CPU. I think only the Datacenter editions of the MS server OSes can handle this complex task. I don't know Linux, so there may be such an edition of that OS as well.
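A back-of-the-envelope version of the contention argument above, using the 59.7GB/s figure from the post (the uniform-demand assumption is purely illustrative):

```python
# Crude model: all concurrent CPU tasks share one quad-channel memory
# interface. The 59.7 GB/s peak is the i7-4960X-class figure quoted
# above; assuming uniform per-task demand is purely illustrative.
peak_bw_gbs = 59.7
cpu_tasks = 48  # all real + virtual cores busy, as on the host discussed

share_gbs = peak_bw_gbs / cpu_tasks
print(f"~{share_gbs:.2f} GB/s per task if demand were uniform")
```

With only ~1.2 GB/s nominally available per task, it is at least plausible that GPU task feeding competes with the CPU tasks, as the post argues.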
ID: 34923
Profile Gattorantolo [Ticino]
Joined: 29 Dec 11
Posts: 44
Credit: 251,211,525
RAC: 0
Level: Asn
Message 34926 - Posted: 5 Feb 2014, 21:33:49 UTC - in response to Message 34911.  

Your GTX780Ti temperatures look very cool - too cool. If you are not using water cooling, your GPU's may be downclocking.

Water cooling of course :-)
Thank you for your help, GPUGRID cruncher ;-) - now I know the "problem"!

ID: 34926
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level: His
Message 34927 - Posted: 5 Feb 2014, 21:57:36 UTC - in response to Message 34923.  
Last modified: 5 Feb 2014, 22:01:43 UTC

Gattorantolo, what are your BOINC processor usage settings, GPU clocks, and RAM frequency?

The PCIE controller is on-die, but if there are only 40 PCIE lanes in total, that's 20 per CPU, or at most 8 per GPU - massive contention. How much contention there is could be assessed by suspending all CPU tasks for a full GPUGrid run.

Generally, the best CPUs for GPUGrid crunching have enough cores and PCIE lanes to support the number of GPUs, but also high clocks (at least until Maxwell arrives). I expect a 4th-generation i7 has an edge over the LGA2011 processors.

The 2695v2 has a TDP of 115W, which is really sweet for CPU crunching.

As an aside, Linux scales VERY well, which is why it's used in data centers, including Microsoft's.
ID: 34927
Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level: Ile
Message 34929 - Posted: 5 Feb 2014, 23:06:03 UTC - in response to Message 34927.  

As an aside, Linux scales VERY well, which is why it's used in data centers, including Microsoft's.


Oh quit pulling my leg.

ID: 34929
Jozef J
Joined: 7 Jun 12
Posts: 112
Credit: 1,140,895,172
RAC: 0
Level: Met
Message 34932 - Posted: 6 Feb 2014, 18:43:49 UTC

So I was able to complete this one task: gluilex2x33-NOELIA_DIPEPT1-0-2-RND9057_2
GPU [GeForce GTX 780 Ti] Platform [Windows] Rev [3203M] VERSION [55]
# SWAN Device 0 :
# Name : GeForce GTX 780 Ti
# ECC : Disabled
# Global mem : 3072MB
# Capability : 3.5
# PCI ID : 0000:01:00.0
# Device clock : 1084MHz
# Memory clock : 3500MHz
# Memory width : 384bit
# Driver version : r331_82 : 33221
# GPU 0 : 52C
# GPU 0 : 53C
# GPU 0 : 54C
# GPU 0 : 55C
# GPU 0 : 56C
# GPU 0 : 57C
# GPU 0 : 58C
# GPU 0 : 59C
# GPU 0 : 60C
# GPU 0 : 61C
# GPU 0 : 62C
# Time per step (avg over 12500000 steps): 1.657 ms
# Approximate elapsed time for entire WU: 20717.930 s
18:44:22 (7780): called boinc_finish

Unfortunately, the task right behind this successful one failed:
970x-SANTI_MAR420cap310-0-32-RND2207_1
<core_client_version>7.2.33</core_client_version>
<![CDATA[
<message>
(unknown error) - exit code -97 (0xffffff9f)
</message>
<stderr_txt>
# GPU [GeForce GTX 780 Ti] Platform [Windows] Rev [3203M] VERSION [55]
# SWAN Device 0 :
# Name : GeForce GTX 780 Ti
# ECC : Disabled
# Global mem : 3072MB
# Capability : 3.5
# PCI ID : 0000:01:00.0
# Device clock : 1084MHz
# Memory clock : 3500MHz
# Memory width : 384bit
# Driver version : r331_82 : 33221
# GPU 0 : 52C
# The simulation has become unstable. Terminating to avoid lock-up (1)

220x-SANTI_MAR423cap310-0-84-RND2313_1
(unknown error) - exit code -97 (0xffffff9f)

201x-SANTI_MARwtcap310-6-32-RND4585_0
(unknown error) - exit code -97 (0xffffff9f)

883x-SANTI_MARwtcap310-2-32-RND9284_0
same sh**

I cannot download and try other tasks... GPUGRID has blocked my client.
Why did this one task run with no problems...??


ID: 34932
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level: His
Message 34944 - Posted: 8 Feb 2014, 11:50:36 UTC - in response to Message 34932.  

The successful task was a NOELIA_DIPEPT work unit. Typically these WUs utilize the GPU to a lesser extent; they are quite different from the SANTI_MAR tasks.

Can I suggest you shut down your system, start it up again, and underclock your GPU. Start by underclocking the GDDR5 memory to 3000MHz. Should that fail, try 2700MHz and then 2600MHz. If you are still unsuccessful, try reducing the GPU clocks too.

When you continuously fail tasks, the server stops sending them. This is to protect the server's available Internet bandwidth - it's essential.
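For Linux hosts, the stepwise downclock above can sometimes be done without a BIOS flash. This is a hypothetical sketch using nvidia-settings (it requires Coolbits enabled in xorg.conf, and the attribute name, performance-level index, and offset units vary by driver generation - verify against your driver's documentation before use):

```python
# Hypothetical Linux-only sketch: step the memory clock down via
# nvidia-settings. Offsets are memory *transfer rate* deltas, so
# dropping the reported 3500MHz clock to ~3000MHz needs roughly -1000;
# try the larger offsets only if errors persist at the smaller ones.
import subprocess

for offset in (-1000, -1600, -1800):
    cmd = ["nvidia-settings", "-a",
           f"[gpu:0]/GPUMemoryTransferRateOffset[3]={offset}"]
    print("would run:", " ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment on a real host
```

Test one offset at a time with a full GPUGrid run before trying the next, as suggested above.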
ID: 34944
valterc
Joined: 21 Jun 10
Posts: 21
Credit: 10,863,142,443
RAC: 2,984
Level: Trp
Message 35031 - Posted: 14 Feb 2014, 10:59:56 UTC - in response to Message 34944.  

Just for information: I went to the Gigabyte site, and they are now selling a card called GV-N78TOC-3GD (Rev. 1.0). Is it the same as the one you described as having problems? Another question: does this problem also occur with the similar GV-N78TGHZ-3GD?
ID: 35031
a1kabear
Joined: 19 Oct 13
Posts: 15
Credit: 578,770,199
RAC: 0
Level: Lys
Message 35032 - Posted: 14 Feb 2014, 11:08:38 UTC

GV-N78TOC-3GD (Rev. 1.0) is the same card as the one I got in Thailand, which has the problems. It works OK now with the memory downclocked, but it runs at the same speed as my overclocked GTX 780, which is disappointing (and which is also the same speed as my Titan, which is sitting unused :/).
ID: 35032


©2025 Universitat Pompeu Fabra