Big Maxwell GM2*0

Retvari Zoltan
Message 40608 - Posted: 25 Mar 2015, 0:39:25 UTC - in response to Message 40604.  
Last modified: 25 Mar 2015, 0:41:21 UTC

When will the GPUGrid app support BigMaxwell?

Soon, but not imminently. AFAIK, no one's attached one yet.
(It's working here in the lab).

FYI, it's working now (on Windows 8.1).
Here is a short workunit processed successfully on a Titan X. (Thanks to eXaPower for pointing it out)

I'm planning to sell my old cards (GTX670s and GTX680s),

High roller! I'd have thought 980s would be more cost effective?

They are.
But as I will apparently lose my 2nd place on the overall toplist, I'd like to do it in a stylish manner.
So I shall continue to have the fastest GPUGrid host on the planet at least. ;)
[CSF] Thomas H.V. DUPONT

Message 40609 - Posted: 25 Mar 2015, 7:11:15 UTC

eXaPower

Message 40613 - Posted: 25 Mar 2015, 15:05:10 UTC - in response to Message 40608.  
Last modified: 25 Mar 2015, 15:08:06 UTC

--- NOELIA_1mg --- (980) began work -- Titan X finished it. Host# 196801 resultid=14020924
TJ

Message 40617 - Posted: 25 Mar 2015, 19:08:42 UTC - in response to Message 40613.  
Last modified: 25 Mar 2015, 19:08:54 UTC

--- NOELIA_1mg --- (980) began work -- Titan X finished it. Host# 196801 resultid=14020924

And we see the Windows limitation of four again... only 4GB of that awesome 12GB is recognized and used.

I think I will wait for the "real" big Maxwell with its own CPU. 1178 Euro for an EVGA one is too much for my wallet at the moment.
Greetings from TJ
ExtraTerrestrial Apes
Message 40618 - Posted: 25 Mar 2015, 22:25:29 UTC - in response to Message 40604.  

MJH wrote:
I'd have thought 980s would be more cost effective?

Yep, and even more so the GTX970's (see SK's post).

@TJ: don't worry about the memory, ~1 GB per GPU-Grid task is still fine.

And don't wait for any miraculous real big Maxwell. GM200 is about as big as they can go on 28 nm, and on the Titan X it's already fully enabled. I'd expect a cut-down version of GM200 at some point, but apart from that the next interesting chips from nVidia will very probably be Pascals.

And about this "integrated CPU" talk: the rumor mill may have gotten Tegra K1 and X1 wrong. These are indeed Kepler and Maxwell combined with ARM CPUs, but as complete mobile SoCs. Anything else wouldn't make much sense in the consumer range (and AMD isn't putting them under any pressure anyway), so if they experiment with a GPU plus a closely coupled CPU, I'd expect it to arrive first together with NVLink and OpenPOWER servers for HPC. And priced accordingly.

MrS
Scanning for our furry friends since Jan 2002
TJ

Message 40624 - Posted: 26 Mar 2015, 1:01:42 UTC - in response to Message 40618.  


@TJ: don't worry about the memory, ~1 GB per GPU-Grid task is still fine.

MrS

Thanks for the explanation. Haha, I don't worry about the memory, but crunchers on Windows would be paying quite a lot for 8GB that cannot be used with this new card.

It depends on my finances, but I think I will wait for a GTX980Ti, and not before fall, as summer gets too warm for 24/7 crunching (without AC).


Greetings from TJ
eXaPower

Message 40628 - Posted: 26 Mar 2015, 20:36:50 UTC - in response to Message 40608.  
Last modified: 26 Mar 2015, 20:43:24 UTC

But as I will apparently lose my 2nd place on the overall toplist, I'd like to do it in a stylish manner. So I shall continue to have the fastest GPUGrid host on the planet at least. ;)

ROBtheLionHeart's overclocked 780Ti (XP) NOELIA times are a hair faster than your mighty 980 (XP). It's a rare sight not to see you (RZ) with the fastest times! The Performance tab is an engaging comparative tool for crunchers, and a new way to learn about work units and expected GPU performance.

Current Long run only Maxwell RAC per day (crunching 24/7) including OS factors:

1.15-1.3 million for [1] Titan X (future Ti GM200 version - unknown SMM count and release date)
750-850k for [1] GTX 980
600-700k for [1] GTX 970
350-450k for [1] GTX 960
225-300k for [1] GTX 750Ti
175-250k for [1] GTX 750

skgiven's throughput performance and performance/Watt chart -- relative to a GK110 GTX Titan.
    Performance | GPU                     | GPU Power | GPUGrid Performance/Watt
    211%        | GTX Titan Z (both GPUs) | 375W      | 141%
    116%        | GTX 690 (both GPUs)     | 300W      | 97%
    114%        | GTX Titan Black         | 250W      | 114%
    112%        | GTX 780Ti               | 250W      | 112%
    109%        | GTX 980                 | 165W      | 165%
    100%        | GTX Titan               | 250W      | 100%
    93%         | GTX 970                 | 145W      | 160%
    90%         | GTX 780                 | 250W      | 90%
    77%         | GTX 770                 | 230W      | 84%
    74%         | GTX 680                 | 195W      | 95%
    64%         | GTX 960                 | 120W      | 134%
    59%         | GTX 670                 | 170W      | 87%
    55%         | GTX 660Ti               | 150W      | 92%
    53%         | GTX 760                 | 130W      | 102%
    51%         | GTX 660                 | 140W      | 91%
    47%         | GTX 750Ti               | 60W       | 196%
    43%         | GTX 650TiBoost          | 134W      | 80%
    37%         | GTX 750                 | 55W       | 168%
    33%         | GTX 650Ti               | 110W      | 75%

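For what it's worth, the Performance/Watt column can be reproduced from the other two columns: it is the relative performance rescaled by each card's board power against the 250W GK110 GTX Titan reference. A minimal Python sketch (a few rows copied from the chart above; the 250W baseline is inferred from the GTX Titan row):

    # Performance/Watt relative to a GK110 GTX Titan (100% performance, 250 W baseline assumed).
    TITAN_POWER_W = 250.0

    cards = {
        # name: (relative performance %, board power W) -- values from the chart above
        "GTX 980":   (109, 165),
        "GTX 970":   ( 93, 145),
        "GTX 750Ti": ( 47,  60),
    }

    for name, (perf_pct, power_w) in cards.items():
        perf_per_watt = perf_pct * TITAN_POWER_W / power_w
        print(f"{name}: {perf_per_watt:.0f}% performance/W relative to the GTX Titan")
    # Prints ~165%, ~160% and ~196%, matching the chart.
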
skgiven
Message 40635 - Posted: 26 Mar 2015, 23:05:13 UTC - in response to Message 40628.  
Last modified: 26 Mar 2015, 23:23:13 UTC

Roughly 56% faster than a Titan,

https://www.gpugrid.net/forum_thread.php?id=1150

What are the boost clocks?

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-titan-x-gm200-maxwell,4091-6.html
Suggests it boosts to 1190MHz but quickly drops to 1163MHz (during stress tests) and that it's still dropping or increasing in steps of 13MHz, as expected.
1190 is ~15% shy of where I can get my GTX970 to boost, so I expect it will boost higher here (maybe ~1242 or 1255MHz).
Apparently the back gets very hot. I've dealt with this in the past by blowing air directly onto the back of a GPU and by using a CPU water cooler (as that reduces radiating heat).
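
For reference, the ~1242/1255MHz guesses are just a few 13MHz boost bins above the 1190MHz seen in the review. A quick Python sketch of that arithmetic; how far any individual card actually boosts is of course only an assumption:

    # GPU Boost moves in 13 MHz steps; list a few bins above the reviewed 1190 MHz boost clock.
    base_boost_mhz = 1190
    step_mhz = 13

    bins = [base_boost_mhz + n * step_mhz for n in range(1, 7)]
    print(bins)  # [1203, 1216, 1229, 1242, 1255, 1268] -- 1242 and 1255 are 4 and 5 bins up
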
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
eXaPower

Message 40638 - Posted: 26 Mar 2015, 23:43:56 UTC - in response to Message 40635.  
Last modified: 27 Mar 2015, 0:43:42 UTC

Roughly 56% faster than a Titan.

As you expected for GPUGRID ACEMD: an increase of ~60%

Suggests it boosts to 1190MHz but quickly drops to 1163MHz (during stress tests) and that it's still dropping or increasing in steps of 13MHz, as expected. 1190 is ~15% shy of where I can get my GTX970 to boost, so I expect it will boost higher here (maybe ~1242 or 1255MHz). Apparently the back gets very hot. I've dealt with this in the past by blowing air directly onto the back of a GPU and by using a CPU water cooler (as that reduces radiating heat).

12GB of memory running at over 100°C is really hot for prolonged use. A fan blowing on the back helps, but will a back plate lower temps with such density, or hinder by holding more heat against the outer memory chips (opposite the heatsink)? A custom water block back plate? (A full CPU/GPU water-cooling loop?) With air cooling only, at 110°C, downclocking the GDDR5 with a voltage drop could help. Longevity is a concern: can GDDR5 sustain 100°C+ temps? I think the TechPowerUp Titan X review shows the memory model number to reference. Is the GDDR5 rated for 70°C/80°C/90°C? Out-of-spec temperatures will certainly impact long-term overclocking and boost prospects.
skgiven
Message 40642 - Posted: 27 Mar 2015, 8:40:49 UTC - in response to Message 40638.  
Last modified: 27 Mar 2015, 9:18:54 UTC

Cooling 12GB appears to be a big issue and it's likely hindering boost and power consumption, but without the 12GB it wouldn't be a Titan, seeing as it doesn't have good DP.
Hopefully a GTX980Ti with 6GB will appear soon - perhaps 2688 CUDA cores, akin to the original Titan, but 35 to 40% faster here; at <£800, if not <$700, it would make an attractive alternative to the Titan X.

For the Titan X, excellent system cooling would be essential for long term crunching, given the high memory temps. I would go for direct air cooling on the back of one or possibly 2 cards, but if I had 3+ cards I would want a different setup. The DEVBOX's just use air cooling, with 2 large front case fans, but I suspect the memory still runs a bit hot. Just using liquid wouldn't be enough, unless it included back cooling. I guess a refrigerated thin-oil system would be ideal, but that would be DIY and Cha-Ching!

In my experience high GDDR temps affect the performance of lesser cards too, and I found stability by cooling the back of some cards. While tools such as MSI Afterburner allow you to cool the GPU via the fans, they don't even report the memory temps. It's often the case that the top GPU (closest to the CPU) in an air-cooled case runs hot. While this is partially from heat radiation, it's mostly because the airflow over the back of the card comes from the CPU, so it's already warm/hot. A basic CPU water cooler is sufficient to remove this issue, and at only about twice the cost of a good CPU heatsink and fan it's a lot cheaper than a GPU water cooler.
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
eXaPower

Message 40648 - Posted: 27 Mar 2015, 12:21:09 UTC - in response to Message 40642.  
Last modified: 27 Mar 2015, 12:35:13 UTC


While tools such as MSI Afterburner allow you to cool the GPU via the fans they don't even report the memory temps.

Many voltage controllers are dumb, with VID lines only; the Titan X has no I2C support, so it's driver read-only rather than readable by software. There are a few Maxwell PCBs with the ability to manually read temps or voltages on the back of the PCB: the 980 Strix and all gun-metal Zotac cards. PNY also allows manual measurements on one of its dual-fan OC models with I2C support. All Asus Strix cards have advanced I2C support, and a few Zotac models support I2C. Most others (MSI/EVGA/Gigabyte) don't. Advanced I2C support on a PCB is helpful.

http://i2c.info/i2c-bus-specification
eXaPower

Message 40659 - Posted: 27 Mar 2015, 23:42:48 UTC
Last modified: 27 Mar 2015, 23:56:27 UTC

https://www.gpugrid.net/forum_thread.php?id=3551

For the Titan X, excellent system cooling would be essential for long term crunching, given the high memory temps. I would go for direct air cooling on the back of one or possibly 2 cards, but if I had 3+ cards I would want a different setup. The DEVBOX's just use air cooling, with 2 large front case fans, but I suspect the memory still runs a bit hot. Just using liquid wouldn't be enough, unless it included back cooling. I guess a refrigerated thin-oil system would be ideal, but that would be DIY and Cha-Ching!


Cooling 12GB appears to be a big issue and it's likely hindering boost and power consumption, but without the 12GB it wouldn't be a Titan, seeing as it doesn't have good DP. Hopefully a GTX980Ti with 6GB will appear soon - perhaps 2688 CUDA cores, akin to the original Titan, but 35 to 40% faster here; at <£800, if not <$700, it would make an attractive alternative to the Titan X.

Recent reports point to a full GM200 980Ti with 6GB - June/July launch.
ExtraTerrestrial Apes
Message 40661 - Posted: 28 Mar 2015, 14:44:27 UTC

The easiest way to cool the memory on the back side would be small passive aluminum RAM coolers. They're obviously not as strong as full water cooling, but cost almost nothing and given some airflow could easily shave off 20 - 30°C. There's not enough space for this in tightly packed multi-GPU configurations, though.

And as far as I know the boost mode doesn't deal with memory at all. There's only a secondary interaction via the power draw. But the memory chips don't consume much power anyway, so the effect of their temperature is going to be negligible.

Regarding longevity: 100°C is hot for regular chips, but fine for e.g. voltage regulators. I'm not sure about memory chips. One would guess nVidia has thought this through, as they can't afford Titans failing left and right after a short time. Did they consider continuous load? I'm not sure, but I hope they expect people paying $1000 for a GPU to use it for more than the occasional game.

MrS
Scanning for our furry friends since Jan 2002
TJ

Message 40670 - Posted: 28 Mar 2015, 23:32:45 UTC - in response to Message 40661.  

You would think so, but I have a colleague who bought four 30-inch screens just to play a flight simulator!

So perhaps the Titan X builders aim mostly at gamers, who seem willing to invest heavily in hardware, and forget the 24/7 crunchers. They make Tesla-like boards for computation (hence no monitors can be attached) for dedicated crunching. These cards are a lot more expensive but are built for heavy use with the best possible parts.

But I guess the Titan X is heavily tested, and some brands use the best parts in their cards too. I am a big fan of EVGA stuff, but you know that of course. Their FOC's are more expensive but have extra high-grade components.
Greetings from TJ
Retvari Zoltan
Message 40672 - Posted: 29 Mar 2015, 1:13:33 UTC - in response to Message 40659.  
Last modified: 29 Mar 2015, 1:14:25 UTC

Cooling 12GB appears to be a big issue and it's likely hindering boost and power consumption, but without the 12GB it wouldn't be a Titan, seeing as it doesn't have good DP. Hopefully a GTX980Ti with 6GB will appear soon - perhaps 2688 CUDA cores, akin to the original Titan, but 35 to 40% faster here; at <£800, if not <$700, it would make an attractive alternative to the Titan X.

Recent reports point to a full GM200 980Ti with 6GB - June/July launch.

That is very good news indeed.
Perhaps I'll wait for that card then.
6GB is ten times more than a GPUGrid task needs, so it's overkill.
Then 12GB would be... a complete waste of money.
Retvari Zoltan
Message 40678 - Posted: 29 Mar 2015, 14:05:56 UTC
Last modified: 29 Mar 2015, 14:09:02 UTC

As more data aggregates, I get more confused.

e1s6_3-GERARD_FXCXCL12_LIG_11631322-0-1-RND5032_0 : Titan X, Win 8.1, 28591 sec
e2s13_e1s6f79-GERARD_FXCXCL12_LIG_10920801-0-1-RND3888_0 : GTX 980, Win XP, 25607 sec
e1s16_7-GERARD_FXCXCL12_LIG_14907632-0-1-RND4211_0 : GTX 780Ti, Win XP, 29397 sec

e11s54_e4s195f119-NOELIA_27x3-1-2-RND1255_0 : GTX 980, Win XP, 11936 sec
e7s6_e4s62f37-NOELIA_27x3-1-2-RND3095_0 : GTX 780Ti, Win XP, 12019 sec
e7s18_e4s62f162-NOELIA_27x3-1-2-RND1763_0 : GTX 980, Win 8.1, 14931 sec

e2s1_792f101-NOELIA_3mgx1-1-2-RND1677_0 : GTX 980, Win 7, 18160 sec (30.7% more time)
e3s12_47f95-NOELIA_3mgx1-1-2-RND3015_0 : Titan X, Win 8.1, 13893 sec (23.5% faster)

e5s155_2x164f180-NOELIA_S1S4adapt4-2-4-RND5279_1 : GTX 780Ti, Win XP, 12530 sec
e8s83_e1s20f75-NOELIA_S1S4adapt4-3-4-RND3849_0 : Titan X, Win 8.1, 12774 sec
e7s13_e5s186f149-NOELIA_S1S4adapt4-3-4-RND3162_0 : GTX 980, Win XP, 13822 sec
e4s4_e1s27f81-NOELIA_S1S4adapt4-2-4-RND4550_0 : GTX 980, Win XP, 14080 sec
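
The two NOELIA_3mgx1 percentages are the same comparison expressed both ways (extra time needed by the 980 on Win 7 versus time saved by the Titan X on Win 8.1). A quick check in Python with the runtimes quoted above:

    # NOELIA_3mgx1 runtimes from the list above, in seconds
    gtx980_win7  = 18160
    titanx_win81 = 13893

    more_time = (gtx980_win7 / titanx_win81 - 1) * 100   # extra time the 980/Win7 needed
    faster    = (1 - titanx_win81 / gtx980_win7) * 100   # time saved by the Titan X/Win 8.1
    print(f"{more_time:.1f}% more time, {faster:.1f}% faster")  # 30.7% more time, 23.5% faster
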
ExtraTerrestrial Apes
Message 40679 - Posted: 29 Mar 2015, 14:38:27 UTC - in response to Message 40678.  

There may be a pattern:

NOELIA_3mgx1: 75919 atoms, performs very well
NOELIA_S1S4adapt4: 60470 atoms, performs OK
GERARD_FXCXCL12: 31849 atoms, performs badly

The fewer atoms a WU contains, the less time each step takes, the more often CPU intervention is needed, and the more difficult it is to make good use of many shaders. Direct evidence for this is the low GPU usage of the GERARD WUs with "few" atoms.

Maybe the CPU support of that Titan X is not configured as well as for the other GPUs? Maybe too many other CPU tasks are running, and/or he's not using "swan_sync=0", on top of the WDDM overhead.
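
One way to picture that argument: if every time step carries a fixed per-step cost (CPU hand-off, WDDM latency) on top of a GPU cost that scales with the atom count, the fixed part dominates as WUs get smaller, so a wide GPU like the Titan X loses more of its advantage on low-atom tasks. A toy model only, not project code; the constants below are invented purely for illustration:

    # Toy model: step_time = fixed per-step overhead + GPU compute time proportional to atom count.
    # The coefficients are illustrative only, not measured ACEMD values.
    def step_time_ms(atoms, gpu_us_per_atom, overhead_ms):
        return overhead_ms + atoms * gpu_us_per_atom / 1000.0

    for atoms in (31849, 60470, 75919):           # GERARD / NOELIA_S1S4adapt4 / NOELIA_3mgx1
        wide   = step_time_ms(atoms, 0.010, 0.4)  # hypothetical wide GPU with high fixed overhead (WDDM)
        narrow = step_time_ms(atoms, 0.015, 0.1)  # hypothetical narrower GPU under XP, low overhead
        print(atoms, f"wide/narrow step-time ratio: {wide/narrow:.2f}")
    # The ratio drops as the atom count grows: per-step overhead dominates small WUs
    # and hides the wide GPU's extra shaders.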

MrS
Scanning for our furry friends since Jan 2002
eXaPower

Message 40686 - Posted: 29 Mar 2015, 16:37:48 UTC - in response to Message 40679.  
Last modified: 29 Mar 2015, 16:49:57 UTC

There may be a pattern:

NOELIA_3mgx1: 75919 atoms, performs very well
NOELIA_S1S4adapt4: 60470 atoms, performs OK
GERARD_FXCXCL12: 31849 atoms, performs badly

The fewer atoms a WU contains, the less time each step takes, the more often CPU intervention is needed, and the more difficult it is to make good use of many shaders. Direct evidence for this is the low GPU usage of the GERARD WUs with "few" atoms.

Maybe the CPU support of that Titan X is not configured as well as for the other GPUs? Maybe too many other CPU tasks are running, and/or he's not using "swan_sync=0", on top of the WDDM overhead.

2% [4% total] hyper-thread usage for each ACEMD process; [2] physical cores for AVX DP at ~60% total CPU (the OS accounts for 1%). A GK107 exhibits this behavior: 98% of a core for NOELIA > 94% for GERARD > 87% for SDOERR_villinpubKc (13,773 atoms). More powerful GPUs are affected even more.

Comparing the top two times for GERARD_FXCXCL12_LIG_11543841 (31,843 atoms):
- Titan X (Win 8.1) time per step [1.616 ms] is 8.8% faster than the GTX780Ti (XP) [1.771 ms].

The GTX780Ti (XP) delivers 91% of the Titan X's (Win 8.1) output for this particular unit.

Would the Titan X without the WDDM tax lower its time per step and total runtime by another 10%?

GERARD_FXCXCL12_LIG_11543841 (31,843 atoms): [28,216.69 / 30,980.94] a 9% runtime difference, Titan X (Win 8.1) ahead of the 780Ti (XP).

The Titan X is possibly ~20% faster under XP at similar power consumption. The Titan X's core clocks are unknown (overclocked?), further complicating the actual differences among the variables.
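
To show where the 8.8%, 91% and ~20% figures come from (the extra 10% saving without WDDM is the guess made above, not a measurement):

    # Time per step for GERARD_FXCXCL12_LIG_11543841, taken from the comparison above (ms)
    titanx_win81 = 1.616
    gtx780ti_xp  = 1.771

    print(f"Titan X is {(1 - titanx_win81 / gtx780ti_xp) * 100:.1f}% faster")       # ~8.8%
    print(f"780Ti delivers {titanx_win81 / gtx780ti_xp * 100:.0f}% of its output")  # ~91%

    # If removing the WDDM overhead cut the Titan X step time by a further ~10% (pure guess):
    titanx_xp_guess = titanx_win81 * 0.9
    print(f"then ~{(gtx780ti_xp / titanx_xp_guess - 1) * 100:.0f}% faster than the 780Ti")
    # Comes out around 22%, consistent with the ~20% estimate above.
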
skgiven
Message 40687 - Posted: 29 Mar 2015, 16:56:45 UTC - in response to Message 40679.  

Over the last couple of years the performance variation between task types has increased significantly. This has made it awkward to compare cards of different generations, or even sub-generations. Task performances vary significantly because different tasks challenge the GPU in different ways, even exposing the limitations of more subtle design differences.

While DooKey's 5820K system (with the Titan X) may be using lots of CPU, it's a 6-core/12-thread system, and while he has a high RAC at Universe@home (a CPU project), he does not have that system hooked up there:
http://boincstats.com/signature/-1/user/21224/sig.png
http://boincstats.com/en/stats/-1/user/detail/21224/projectList
http://universeathome.pl/universe/hosts_user.php?userid=2059

The CPU could be dropping into a lower power state, or he could be gaming or playing with the setup/config:

3855MHz GDDR seems like a heavy OC for 3500MHz/7GHz memory.
While SKHynix have 4GB memory chips rated at 8GHz, I don't think the Titan X uses them?

Several errors here,
https://www.gpugrid.net/results.php?hostid=196801
and looking at some of the tasks, there are a lot of recoveries, before the failures,
https://www.gpugrid.net/result.php?resultid=14040943

The WDDM overhead could well be greater with the bigger cards, and it might not be constant; it could vary by task type (atom count/CPU demand). Last time we had a look there did seem to be some variation with GPU performance.

A significant overall performance gain might come from running more than one WU at a time, as with the GTX970 and GTX980 on W7, Vista, W8 & W10 preview (WDDM 2.0). Perhaps 3 WUs on a Titan X would be optimal?
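
On running several WUs per card: in BOINC that is done with an app_config.xml in the GPUGrid project folder, setting gpu_usage below 1 (0.33 allows up to three tasks per GPU). A minimal sketch that writes such a file; the app name acemdlong is an assumption here, so check the project's application list (or client_state.xml) for the exact name before using it:

    # Sketch: write a BOINC app_config.xml that allows 3 tasks per GPU (gpu_usage = 0.33).
    # The <name> value below is an assumption; verify it against the project's app list.
    APP_CONFIG = """<app_config>
      <app>
        <name>acemdlong</name>
        <gpu_versions>
          <gpu_usage>0.33</gpu_usage>
          <cpu_usage>1.0</cpu_usage>
        </gpu_versions>
      </app>
    </app_config>
    """

    with open("app_config.xml", "w") as f:
        f.write(APP_CONFIG)
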
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
eXaPower

Message 40695 - Posted: 30 Mar 2015, 11:31:57 UTC - in response to Message 40686.  
Last modified: 30 Mar 2015, 12:16:05 UTC

GERARD_FXCXCL12 update:

Zoltan's 980 reclaimed the fastest runtime [25,606.92 s] and time per step [1.462 ms].

RZ's 980 is 9% faster than DooKey's Titan X [1.616 ms] and 11-19% better than the GTX780Ti [1.771 ms].