
Message boards : Graphics cards (GPUs) : NVIDIA GeForce GTX 760

Profile skgiven
Volunteer moderator
Project tester
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 101,116
Message 30848 - Posted: 14 Jun 2013 | 16:23:39 UTC

An NVIDIA GeForce GTX 760 is expected to be released on the 25th of June 2013.

The speculated specs are as follows:

    GK104
    GPU Clock: 915 MHz
    Boost Clock: 980 MHz (though I have also seen 1032 suggested)
    Memory Clock: 6008 MHz
    Memory Size: 2048 MB
    Memory Type: GDDR5
    Memory Bus: 256 bit
    Bandwidth: 192 GB/s
    Shading Units: 1152
    96TMUs, 32ROPs, 6SMs,
    2,108 GFLOPS
    PCIe 3.0 x16
    160W TDP


If these specs are correct (or close enough) then it could well be the new GTX660 (in terms of best performance for price)...
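As a sanity check, the 2,108 GFLOPS figure in the spec list can be reproduced from the shader count and base clock alone; the only assumption is Kepler's usual 2 single-precision FMA operations per shader per clock:

```python
# Peak single-precision throughput for a Kepler card:
# GFLOPS = 2 ops per shader per clock * shader count * clock in GHz.
shaders = 1152
base_clock_ghz = 0.915

peak_gflops = 2 * shaders * base_clock_ghz
print(f"{peak_gflops:.0f} GFLOPS")  # 2108 GFLOPS, matching the quoted figure
```

(Using the 980 MHz boost clock instead would give about 2,258 GFLOPS.)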
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2688
Credit: 1,172,901,099
RAC: 176,610
Message 30854 - Posted: 15 Jun 2013 | 11:30:41 UTC - in response to Message 30848.

Very interesting! This looks like a "balanced GTX660Ti", i.e. what would have been the logical configuration for that card from the beginning. Fewer shaders (cheaper for nVidia to produce) but more bandwidth (same gaming performance, higher board costs for nVidia's partners). Depending on price it might be the new sweet spot. But it's got to be cheaper than the GTX660Ti, because that card will still be faster when bandwidth doesn't limit. It will be interesting to see how performance works out at GPU-Grid!

MrS
____________
Scanning for our furry friends since Jan 2002

Vagelis Giannadakis
Joined: 5 May 13
Posts: 187
Credit: 349,254,454
RAC: 0
Message 30858 - Posted: 15 Jun 2013 | 20:16:44 UTC

I guess a nice consequence of the introduction of the 760 will be that, for a (short) period, 660s in stock will probably sell cheaper!
____________

Profile skgiven
Volunteer moderator
Project tester
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 101,116
Message 30860 - Posted: 15 Jun 2013 | 21:49:56 UTC - in response to Message 30858.
Last modified: 15 Jun 2013 | 22:38:50 UTC

It's usually the case that when new GPUs arrive, not only does the choice improve, but the competition (between AMD and NVidia, and new vs old generations) drives prices down.

The GTX760 is still a rumor (until it's official), and the GTX660 (if the 760's specs are right) will have a 20W lower TDP. Who knows, but when the 760 turns up it might drive GTX660Ti prices down to the point where that card becomes the best for performance/price. Perhaps there will be a GTX760Ti too?
One thing's for sure: the more cards there are, the better the choice.

After seeing the GT630 Rev. 2 (a 25W TDP card that performs like a 49 to 75W GT640), I'm also looking forward to some of the mid-range and entry-level GeForce 700 cards... I expect this will be NVidia's best ever range of cards.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2688
Credit: 1,172,901,099
RAC: 176,610
Message 30862 - Posted: 16 Jun 2013 | 7:45:30 UTC

GTX660 is built on a much smaller chip (GK106) than the rumored GTX760 (GK104), so I won't bet on it taking the same price spot. I don't think nVidia would introduce a new chip for this either, as it would be too close to GK104. But I'll be happy to be surprised here ;)

MrS
____________
Scanning for our furry friends since Jan 2002

flashawk
Joined: 18 Jun 12
Posts: 297
Credit: 3,550,610,997
RAC: 1,104,807
Message 30865 - Posted: 16 Jun 2013 | 18:29:23 UTC

I've been running an EVGA GTX770 with the ACX cooling (not many water blocks out yet) and it's about 15 minutes faster than a GTX680 on the same model type. This was after 4 runs at the same GPU clock; it shot right to 1201MHz and 1.187v right out of the box. I'm using EVGA's Precision X to keep full functionality. When I clicked on voltage control, I got a EULA that I had to agree to, and it allows the GPU to be taken all the way up to 1.212 volts. I didn't want to try that on air; besides, I don't think that high a voltage is very good for its life span.

Another thing I noticed with the dual-fan ACX cooling is that it blows all the hot air right into the case. The bezel is open on the top and bottom and virtually no air comes out the back of the computer. My GTX670 FTWs and my GTX680 SC Signatures all have blowers and are like little hair dryers at the rear of my computers. I was so concerned about the heat that I bundled the wires in the rear to keep the heat away from them. It works great for heating up a tube of thermal paste in just a few minutes, or coffee, or a Danish with butter........mmmmmmm

One odd thing, though no big deal: it doesn't show up as a GTX770 in my account, it shows as a GTX680. Maybe BOINC goes by shader count.

Dylan
Joined: 16 Jul 12
Posts: 98
Credit: 366,975,962
RAC: 239,679
Message 30887 - Posted: 19 Jun 2013 | 22:08:34 UTC

Looks like the GTX 760 is the equivalent of a rebranded GTX 660ti.

http://videocardz.com/43048/nvidia-geforce-gtx-760-final-specs-unveiled

matlock
Joined: 12 Dec 11
Posts: 34
Credit: 86,423,547
RAC: 0
Message 30888 - Posted: 20 Jun 2013 | 4:52:16 UTC - in response to Message 30887.

Looks like the GTX 760 is the equivalent of a rebranded GTX 660ti.


Maybe worse. The 660 Ti has 1344 cores. It looks to be a 660 OEM with a much higher clock speed.

flashawk
Joined: 18 Jun 12
Posts: 297
Credit: 3,550,610,997
RAC: 1,104,807
Message 30890 - Posted: 20 Jun 2013 | 9:53:37 UTC

They say it's official now: the GTX760 will have 1152 CUDA cores, it will use the GK104-225 chip, a 256 bit bus, 6GHz memory (1502MHz quad pumped, not the faster 7GHz type), 96 TMUs, 32 ROPs and a TDP of 170 watts, for right around $300.00 USD.

It's not like the GTX670 as we thought it would be, which I think is really strange, but now they're saying this will be the last 700 series release until 2014, so they want to keep the GTX670 viable for a while at least.
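Those official numbers are internally consistent, by the way: the 192 GB/s bandwidth claimed earlier in the thread falls straight out of the 256 bit bus and the 6008 MHz effective (quad-pumped) memory clock:

```python
# Memory bandwidth = bytes per transfer * effective transfer rate.
# GDDR5 at a 1502 MHz base clock moves data 4x per clock ("quad pumped").
bus_width_bits = 256
effective_clock_mhz = 4 * 1502  # = 6008 MT/s effective

bandwidth_gbs = (bus_width_bits / 8) * effective_clock_mhz / 1000
print(f"{bandwidth_gbs:.1f} GB/s")  # ~192.3 GB/s, matching the spec sheet
```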

Dylan
Joined: 16 Jul 12
Posts: 98
Credit: 366,975,962
RAC: 239,679
Message 30898 - Posted: 20 Jun 2013 | 14:55:16 UTC

Maybe we can count on AMD to drive prices down with their newer cards, assuming they're released any time soon.

http://wccftech.com/codenames-generation-amd-volcanic-islands-gpus-detailed/

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2688
Credit: 1,172,901,099
RAC: 176,610
Message 30922 - Posted: 22 Jun 2013 | 12:55:35 UTC

Since the GTX660Ti is partly held back by its 192 bit memory interface at GPU-Grid, the GTX760 might be just as fast. It's surely not much of an improvement, though, if it costs at least the same.

MrS
____________
Scanning for our furry friends since Jan 2002

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2688
Credit: 1,172,901,099
RAC: 176,610
Message 30924 - Posted: 22 Jun 2013 | 13:29:55 UTC - in response to Message 30865.

flashawk wrote:
I didn't want to try that on air; besides, I don't think that high a voltage is very good for its life span.

You could also say: we know that a high voltage affects its life span. What we don't know is how severe this effect will be, i.e. will the chip fail within the usable lifetime of the card? I can't answer that, but I'd probably stay at the maximum default voltage. The gains are really small at the top end (of the operating range) and efficiency gets blown away by high voltage.
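The efficiency point can be illustrated with the usual first-order CMOS model, where dynamic power scales roughly with frequency times voltage squared. The clock and voltage numbers below are illustrative assumptions for a Kepler card, not measurements:

```python
# First-order model: dynamic power ~ f * V^2. A small clock gain bought with
# extra voltage costs disproportionately more power. Baseline values here are
# hypothetical, chosen only to illustrate the scaling.
def relative_power(freq_mhz, volts, base_freq=1100.0, base_volts=1.162):
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

# Suppose the 1.212 V ceiling buys ~50 MHz over a 1100 MHz / 1.162 V baseline:
gain = 1150 / 1100 - 1
cost = relative_power(1150, 1.212) - 1
print(f"+{gain:.1%} clock for +{cost:.1%} power")  # +4.5% clock, ~+14% power
```

That asymmetry is why staying at or below the default maximum voltage is usually the efficient choice for 24/7 crunching.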

Regarding the cooler: that's the old question of "blower vs. open air cooler". The blower keeps your case cooler, but the open air cooler can be much more quiet, if you've got the case cooling to get rid of that heat. I couldn't use a blower in my main machine due to its noise.

One odd thing though and no big deal, it doesn't show up as a GTX770 in my account, it shows as a GTX680, maybe BOINC goes by shader count.

BOINC should go by whatever the card reports, I assume. Otherwise it would be a nightmare to keep track of all the different card versions (including OEM versions) popping up here and there.

Or... do you mean what BOINC shows under "my computers"? There, only one card is listed for each type present (1 AMD, 1 nVidia, 1 Intel). If you've got more than 1, the number is adjusted but not the type. Don't ask how it's determined which card's type to show... probably the one in slot 0 or something :D

MrS
____________
Scanning for our furry friends since Jan 2002

Profile Beyond
Joined: 23 Nov 08
Posts: 1104
Credit: 6,101,732,079
RAC: 0
Message 30928 - Posted: 22 Jun 2013 | 18:20:59 UTC - in response to Message 30924.

Or... do you mean what BOINC shows under "my computers"? There, only one card is listed for each type present (1 AMD, 1 nVidia, 1 Intel). If you've got more than 1, the number is adjusted but not the type. Don't ask how it's determined which card's type to show... probably the one in slot 0 or something :D

I think it's what BOINC sees as the more capable card.

Profile skgiven
Volunteer moderator
Project tester
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 101,116
Message 30934 - Posted: 23 Jun 2013 | 7:52:47 UTC - in response to Message 30928.
Last modified: 23 Jun 2013 | 8:39:55 UTC

That's what I thought, but -

i7-3770K1 work 467,620.92 56,952,725 7.0.64 GenuineIntel
Intel(R) Core(TM) i7-3770K CPU @ 3.50GHz [Family 6 Model 58 Stepping 9]
(8 processors) [3] NVIDIA GeForce GTX 650 Ti BOOST (2047MB) driver: 320.14 Microsoft Windows 7
Professional x64 Edition, Service Pack 1, (06.01.7601.00) 23 Jun 2013 | 7:34:31 UTC
http://www.gpugrid.net/show_host_detail.php?hostid=139265
It's actually got a GTX660Ti, a GTX660 and a GTX650TiBoost in it.

The first problem I face when working out issues is which card is doing what. In the Tasks pane, Boinc just says NVidia GPU (device 0), or device 1 or 2. These are logical devices and don't represent the PCIE slots. It's not that Boinc doesn't know what each logical device is; it says what these devices are in the Boinc event log:

20/06/2013 18:01:08 | | CUDA: NVIDIA GPU 0: GeForce GTX 660 Ti (driver version 320.14, CUDA version 5.50, compute capability 3.0, 2048MB, 1960MB available, 2985 GFLOPS peak)
20/06/2013 18:01:08 | | CUDA: NVIDIA GPU 1: GeForce GTX 660 (driver version 320.14, CUDA version 5.50, compute capability 3.0, 2048MB, 1968MB available, 1982 GFLOPS peak)
20/06/2013 18:01:08 | | CUDA: NVIDIA GPU 2: GeForce GTX 650 Ti BOOST (driver version 320.14, CUDA version 5.50, compute capability 3.0, 2048MB, 1973MB available, 1646 GFLOPS peak)
20/06/2013 18:01:08 | | OpenCL: NVIDIA GPU 0: GeForce GTX 660 Ti (driver version 320.14, device version OpenCL 1.1 CUDA, 2048MB, 1960MB available, 2985 GFLOPS peak)
20/06/2013 18:01:08 | | OpenCL: NVIDIA GPU 1: GeForce GTX 660 (driver version 320.14, device version OpenCL 1.1 CUDA, 2048MB, 1968MB available, 1982 GFLOPS peak)
20/06/2013 18:01:08 | | OpenCL: NVIDIA GPU 2: GeForce GTX 650 Ti BOOST (driver version 320.14, device version OpenCL 1.1 CUDA, 2048MB, 1973MB available, 1646 GFLOPS peak)
20/06/2013 18:01:08 | | OpenCL: Intel GPU 0: Intel(R) HD Graphics 4000 (driver version 9.17.10.2932, device version OpenCL 1.1, 1496MB, 1496MB available, 45 GFLOPS peak)
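For anyone who wants to automate this lookup, the device-number-to-card mapping can be pulled out of the event log with a few lines. This is just a sketch against the log format shown above; the sample text is abbreviated:

```python
# Map BOINC's logical CUDA device numbers to card names by parsing
# "CUDA: NVIDIA GPU n: <name> (" lines from the event log.
import re

log = """20/06/2013 18:01:08 | | CUDA: NVIDIA GPU 0: GeForce GTX 660 Ti (driver version 320.14, ...)
20/06/2013 18:01:08 | | CUDA: NVIDIA GPU 1: GeForce GTX 660 (driver version 320.14, ...)
20/06/2013 18:01:08 | | CUDA: NVIDIA GPU 2: GeForce GTX 650 Ti BOOST (driver version 320.14, ...)"""

devices = dict(re.findall(r"CUDA: NVIDIA GPU (\d+): (.+?) \(", log))
for num, name in sorted(devices.items()):
    print(f"device {num} -> {name}")
```

In practice you would read the log from stdout or BOINC's data directory instead of a string literal.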
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Profile Beyond
Joined: 23 Nov 08
Posts: 1104
Credit: 6,101,732,079
RAC: 0
Message 30948 - Posted: 23 Jun 2013 | 19:18:00 UTC - in response to Message 30934.

That's really interesting. A while ago I was running several boxes with 2 ATI cards each, one 4770 and one 5850. No matter where I put the cards, on the project websites it always reported 2 x 5850 cards. So has BOINC changed, is it different for ATI & NVidia, or what? Another BOINC mystery...

flashawk
Joined: 18 Jun 12
Posts: 297
Credit: 3,550,610,997
RAC: 1,104,807
Message 30951 - Posted: 23 Jun 2013 | 20:09:32 UTC

My GTX770 is, at present, in a box with a GTX670, and in my account it says there are 2 670's in that box. The 770 is in the first PCIe x16 slot, and I'm having major issues with it: spontaneous reboots. I actually saw it do it last night. I was sitting at the table working on another computer when, out of the corner of my eye, I saw the monitor pop up with a green checkerboard pattern; I immediately tried to suspend BOINC but wasn't fast enough. I'm convinced it's a driver issue; NVidia rushed the release of the 700 series before the drivers were ready (320.18, 320.11, 320.08). I couldn't get the 320.18 to install in XP x64, so I settled on the 320.08. So far it's hosed 5 CPDN models, one with over 300 hours. When I look in Event Viewer under applications, it always reads something like "Faulting application nvxxxx.dll" (x being a different part of the driver).

I read over at the GeForce forums where some were having good luck modifying the 314.22 and 314.07 inf string, adding the GTX770 to it, and finding it much more stable. At this point, I've dropped my RAM speed to 1333, relaxed the RAM timings and underclocked the GTX770, and I'm also rebooting when that card finishes a WU. I hope this will hold me over until the end of the month when they release the new driver. Manual (NVidia employee) at the NVidia forums says they are incorporating many new fixes and that's why it's taking so long; I think he mentioned there would not be many game optimizations, just bug fixes. I was thinking that these current drivers might be having an impact on the way the Titan and GTX780 are failing WUs here in beta testing, but I don't have either card and couldn't say for sure, just a thought.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2688
Credit: 1,172,901,099
RAC: 176,610
Message 30953 - Posted: 23 Jun 2013 | 20:49:16 UTC - in response to Message 30951.

Having to resort to a beta driver could surely cause such issues. Otherwise the power supply could be a problem. Instead of simply downclocking the GPU, it would be more energy efficient to lower the power target of the GPU, so it will underclock and undervolt itself. It doesn't really matter though, if it's just for a short test.

MrS
____________
Scanning for our furry friends since Jan 2002

flashawk
Joined: 18 Jun 12
Posts: 297
Credit: 3,550,610,997
RAC: 1,104,807
Message 30955 - Posted: 23 Jun 2013 | 21:49:35 UTC - in response to Message 30953.

My PSU is a Seasonic X850 Gold and it's drawing 603 watts at the plug, measured through a Kill A Watt device; my CyberPower software gives an identical reading, and I have it set to log power spikes and dips and there is nothing out of the ordinary there. Do you think an 850 watt unit is enough? I almost got a 1000 watt PSU to make the computer more future proof, but I figured an 850 would be enough.

All my machines are identical except for GPUs, and one has slightly less RAM. It really sucks when I have to babysit a machine; I have a GTX670 slowly dying in another computer and I need to swap in my spare (don't worry Beyond, it's not the one I'm sending to you). Other than the driver issues and not much in the way of water blocks, it seems like it will turn out to be a good card. It's currently cheaper than the GTX680 model I like; I got the 770 for $419.00 US (EVGA GeForce GTX 770 SC ACX 02G-P4-2774-KR), free shipping and no taxes, and even with all its problems it's my fastest card.

Dylan
Joined: 16 Jul 12
Posts: 98
Credit: 366,975,962
RAC: 239,679
Message 30956 - Posted: 23 Jun 2013 | 23:16:39 UTC - in response to Message 30948.

I heard that Boinc reports either the best GPU or the primary GPU, regardless of there being different GPUs in one system.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2688
Credit: 1,172,901,099
RAC: 176,610
Message 30970 - Posted: 24 Jun 2013 | 16:39:26 UTC - in response to Message 30955.

That PSU sounds like a really good fit to me. 603 W at the wall means about 550 W consumed by the PC, which puts the PSU at 65% load - well within the green range.
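The arithmetic behind that estimate, for anyone repeating it: assume roughly 90% conversion efficiency for an 80 Plus Gold unit at this load (a figure taken from the 80 Plus Gold certification range, not a measurement of this particular PSU):

```python
# Wall draw -> DC load estimate for a PSU sizing check.
# DC watts = wall watts * efficiency; load = DC watts / PSU rating.
wall_watts = 603
efficiency = 0.90   # assumed, typical for 80 Plus Gold at mid load
psu_rating = 850

dc_watts = wall_watts * efficiency
load = dc_watts / psu_rating
print(f"~{dc_watts:.0f} W DC, {load:.0%} of rating")  # ~543 W, 64%
```

That lands right around MrS's rounded figures of 550 W and 65%, comfortably inside the PSU's efficiency sweet spot.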

MrS
____________
Scanning for our furry friends since Jan 2002

flashawk
Joined: 18 Jun 12
Posts: 297
Credit: 3,550,610,997
RAC: 1,104,807
Message 30976 - Posted: 24 Jun 2013 | 17:23:31 UTC - in response to Message 30970.

When I decided to get the GTX680's, I had a little bit of an anxiety attack; then I did some reading and most folks said it (the 850) was enough. I almost got X750's; in fact, I had OCZ bronze 750's when I came back to crunching last year.

Profile skgiven
Volunteer moderator
Project tester
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 101,116
Message 30983 - Posted: 24 Jun 2013 | 19:23:20 UTC - in response to Message 30956.
Last modified: 24 Jun 2013 | 19:26:45 UTC

I heard that Boinc reports either the best GPU or the primary GPU, regardless of there being different GPUs in one system.

In my case, this explains why Boinc says I have three 650TiBoost GPUs:
    GPU1
    ________________________________________________________________________________
    Display device : GeForce GTX 650 Ti BOOST on GK106 GPU
    BIOS : 80.06.59.00.0c
    GUID : VEN_10DE&DEV_11C2&SUBSYS_11C21569&REV_A1&BUS_1&DEV_0&FN_0
    Multi-GPU role : master

    GPU2
    ________________________________________________________________________________
    Display device : GeForce GTX 660 on GK106 GPU
    BIOS : 80.06.28.00.04
    GUID : VEN_10DE&DEV_11C0&SUBSYS_11C01569&REV_A1&BUS_2&DEV_0&FN_0

    GPU3
    ________________________________________________________________________________
    Display device : GeForce GTX 660 Ti on GK104 GPU
    BIOS : 80.04.4b.00.26
    GUID : VEN_10DE&DEV_1183&SUBSYS_35521458&REV_A1&BUS_5&DEV_0&FN_0



It's set as the Multi-GPU master because it's in the first PCIE slot; I'm not using an NVidia GPU for display, I'm using the Intel HD4000 on my i7-3770K. For anyone planning to do this: I suggest you configure for best performance while the NVidia GPU is still driving the monitor, because once you switch to the HD4000 for display, the NVidia Control Panel is no longer available.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Profile skgiven
Volunteer moderator
Project tester
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 101,116
Message 31021 - Posted: 25 Jun 2013 | 19:49:41 UTC - in response to Message 30983.
Last modified: 25 Jun 2013 | 20:06:03 UTC

NVidia GeForce GTX760 Review by Ryan Smith (June 25, 2013)

- Compute part :)

In the UK the GTX760 is just £10 cheaper than a GTX660Ti: ~£210 vs ~£220!
For here there is probably not going to be much difference in performance (+/- 10% maybe). My guess is that the GTX660Ti would just pip it for outright performance and performance/watt, but performance/£ might be very close. Just a guess though!
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

werdwerdus
Joined: 15 Apr 10
Posts: 123
Credit: 1,004,473,861
RAC: 0
Message 31035 - Posted: 26 Jun 2013 | 6:02:01 UTC
Last modified: 26 Jun 2013 | 6:03:02 UTC

http://www.youtube.com/watch?v=0yUL54FOR9w

newegg.com is giving away a GTX 760 OC to a US resident (sorry, everybody else); just leave a comment on the video to enter by July 5, 2013.
____________
XtremeSystems.org - #1 Team in GPUGrid

Profile skgiven
Volunteer moderator
Project tester
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 101,116
Message 31663 - Posted: 19 Jul 2013 | 17:32:16 UTC - in response to Message 31035.

Again, if anyone has actual power usage info for the GTX760 let us know.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
