Maxwell now
| Author | Message |
|---|---|
|
Joined: 18 Jun 12 Posts: 297 Credit: 3,572,627,986 RAC: 0
|
There's a rumor going around that Maxwell is coming out next month. I wonder if this was planned, or if AMD's sales are hurting them? |
|
Joined: 28 Jul 12 Posts: 819 Credit: 1,591,285,971 RAC: 0
|
It looks more like a delaying action to hold off AMD until the 20 nm process arrives, probably later than they had originally hoped. A GTX 750 Ti won't set the world on fire in performance, and won't make them a ton of money. But it gives them a chance to see how well the design works in practice, and to give the software developers a head start before the real Maxwell arrives. |
|
Joined: 17 Aug 08 Posts: 2705 Credit: 1,311,122,549 RAC: 0 |
It's likely just a false rumor. No proof has been shown that these cards use Maxwell chips, even though fairly complete benchmarks have already appeared. It's probably just GK106 with 768 shaders.

MrS
Scanning for our furry friends since Jan 2002 |
skgiven Joined: 23 Apr 09 Posts: 3968 Credit: 1,995,359,260 RAC: 0 |
Producing a Maxwell on the 28nm process would be a complete change of direction for NVidia, so I agree this is likely a false rumor. There are two revision models (Rev. 2) of GPUs in the GF600 lineup (the GT630 and GT640), so perhaps NVidia wants to fill out its GF700 range with a lower-end card; if a Rev. 2 version of the GTX650Ti is en route, it makes more sense to shift it to the GF700 range.

The idea of constructing a Maxwell on 28nm does make a lot of sense, however: GM could be tested directly against GK, and they could produce more competitive entry- to mid-range cards earlier. Small cards are the largest part of the GPU market, so why produce a big, immature card first? As GMs will have a built-in CPU (of sorts), it would be better to test these (and their usefulness/scalability) on smaller cards first - no point producing a fat GM with an insufficient CPU to support it. I've always wondered why they produced the larger cards first. It's just been a flag-waving exercise IMO. While that might be marketable, it makes no sense when dealing with savvy buyers and other businesses (OEMs), especially for supercomputers.

NVidia could also have produced GF cards at 28nm, and they would have had a market. Perhaps they did - just for engineering, design and testing purposes - and managed to keep those chips completely in-house. While such designs might have been/will be marketable, from a business point of view they would primarily be competing against other NVidia products - probably a bad thing. Better to focus your development skill set on one controllable, forward-looking objective than on tweaking.

The eventual 40% reduction in die size will probably make for cooler GPUs. In the main, GK temperatures are significantly less of an issue than GF temps were, but with several high-end cards in one system heat is still a problem. So while NVidia doesn't have temperature licked now, it should fall into place at 20nm. In the meantime, entry- to mid-range 28nm cards are easy to produce and easy to cool, and 28nm Maxwells might be easier to work with now and as a basis for early 20nm products.

When moving to 20nm, yields will inevitably be low (they always are), so it would make sense to start at the small end, where you actually get enough samples to test with and enough product to release. The lesser bins tend to go to OEMs anyway, so it might be better to start there and get a product out that competes with AMD's and Intel's integrated GPUs ASAP. Let's face it, this is where the competition is highest and NVidia is weakest. So the first 28nm Maxwells could well be for laptops and other mobile devices.

ARM can already be used to support an OS, so IMO it's inevitable that ARM will bolster their CPUs with an NVidia GPU. That's what the market really wants: sufficient CPU processing power to start up and run an OS, and a high-end GPU for the video interface, gaming... isn't it?

FAQ's HOW TO: - Opt out of Beta Tests - Ask for Help |
|
Joined: 16 Mar 11 Posts: 509 Credit: 179,005,236 RAC: 0
|
> ARM can already be used to support an OS, so IMO it's inevitable that ARM will bolster their CPUs with an NVidia GPU. That's what the market really wants: sufficient CPU processing power to start up and run an OS, and a high-end GPU for the video interface, gaming... isn't it?

I guess a fairly large part of the market wants that. I would be happy with a motherboard with a BIOS that can boot from PXE; no SuperIO (USB, RS-232, parallel port, PS/2); an I2C bus for onboard temperature sensors; no IDE or SATA (no disks); just lots of RAM, an RJ-45 connector and gigabit ethernet; no wifi; and enough CPU processing power to start up and run a minimal OS that has a good terminal and SSH and can run the BOINC client and project apps. No desktop or anything to do with a GUI, no TWAIN or printer drivers/daemons, no PnP or printer service, no extra fonts (just a decent terminal and 1 font), only the network services required. Python or some other scripting language would be nice, but not much more. If they could fit all that onto a slightly larger video card I'd be happy; otherwise put it on a 2" x 5" board with a PCIe slot and power connectors and call it a cruncher. Something so no-frills IKEA would consider stocking it. What else would be unnecessary... no RTC (get the time off the LAN), no sound, no disk activity LED.

BOINC <<--- credit whores, pedants, alien hunters |
|
Joined: 17 Aug 08 Posts: 2705 Credit: 1,311,122,549 RAC: 0 |
Seems like the cat is out of the bag... and we were all wrong, as usual for a new generation ;) It's still not official, but far more solid than any rumors before this:

- at least 1, probably 2 small chips in 28 nm soon
- the bigger ones later in 20 nm
- architectural efficiency improvements
- a much larger L2 cache
- the compute-to-texture ratio increases from 12:1 to 16:1 (like AMD's)
- the SMX goes down to 128 shaders (192 in Kepler) -> that could mean they're going back to non-superscalar (i.e. just scalar)

If the latter is true, this could mean significant per-clock, per-shader performance improvements here and in many other BOINC projects :)

MrS
Scanning for our furry friends since Jan 2002 |
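For reference, a minimal sketch of the ratio arithmetic in that list; the per-SM shader and texture-unit counts are assumptions taken from the published GK104/GM107 block diagrams, not from this thread:

```python
# Compute-to-texture ratios, derived from assumed per-SM configurations:
# Kepler SMX (GK104): 192 shaders, 16 TMUs; Maxwell SMM (GM107): 128 shaders, 8 TMUs.

sm_configs = {
    "Kepler SMX (GK104)":  {"shaders": 192, "texture_units": 16},
    "Maxwell SMM (GM107)": {"shaders": 128, "texture_units": 8},
}

for name, cfg in sm_configs.items():
    ratio = cfg["shaders"] / cfg["texture_units"]
    print(f"{name}: {cfg['shaders']} shaders / {cfg['texture_units']} TMUs "
          f"= {ratio:.0f}:1")

# Output:
# Kepler SMX (GK104): 192 shaders / 16 TMUs = 12:1
# Maxwell SMM (GM107): 128 shaders / 8 TMUs = 16:1
```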
|
Joined: 16 Mar 11 Posts: 509 Credit: 179,005,236 RAC: 0
|
Sounds like a big performance-per-watt increase will be coming too. I think I'll put planned purchases on hold, build savings, and see what the picture looks like 4 months from now.

BOINC <<--- credit whores, pedants, alien hunters |
|
Joined: 17 Aug 08 Posts: 2705 Credit: 1,311,122,549 RAC: 0 |
That's not what nVidia would like you to do... but I agree ;)

MrS
Scanning for our furry friends since Jan 2002 |
skgiven Joined: 23 Apr 09 Posts: 3968 Credit: 1,995,359,260 RAC: 0 |
For GPUGrid, performance touting is premature - we don't even know if it will work with the current app. It could take 6 months of development and debugging; it took ages before the Titans worked.

As the GTX750Ti will only have 640 CUDA cores, the 128-bit bus probably won't be an issue. The CUDA-core-to-bus-lane ratio is about the same as a GTX670's. However, the 670 is super-scalar, and the GTX480 had a 384-bit bus. Suggesting a 60W GTX750Ti will be slightly faster than a GTX480 still sounds unrealistic, but assuming the non-super-scalar CUDA cores aren't 'semi-skimmed' it might be powerful enough. I suspect they will not be 'full fat' in the way GF110 was, and there could be additional bottlenecks, driver bugs... So it's wait and see.

Having a 6-pin power connector means the GTX750Ti could be powered directly from the PSU rather than through the motherboard. This is good if you want to use the card on a riser in, say, a 3rd slot (which might not actually be capable of supplying 75W). Avoid cards with small fans - they don't last. I still say stay clear of the GTX750 if it's only got 1GB of GDDR5.

FAQ's HOW TO: - Opt out of Beta Tests - Ask for Help |
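A back-of-the-envelope check of the core-to-bus-lane comparison above, using the published reference core counts and bus widths:

```python
# CUDA cores per bit of memory bus width, for the cards compared above.
cards = {
    "GTX 750 Ti": {"cores": 640,  "bus_bits": 128},
    "GTX 670":    {"cores": 1344, "bus_bits": 256},
    "GTX 480":    {"cores": 480,  "bus_bits": 384},
}

for name, c in cards.items():
    print(f"{name}: {c['cores'] / c['bus_bits']:.2f} cores per bus lane")

# GTX 750 Ti: 5.00 cores per bus lane
# GTX 670:    5.25 cores per bus lane  (about the same ratio)
# GTX 480:    1.25 cores per bus lane  (far more bandwidth per core)
```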
|
Joined: 8 Mar 12 Posts: 411 Credit: 2,083,882,218 RAC: 0
|
I still feel like I'm stuck between a rock and a hard place. Haswell-E will have an 8-core variant in Q3, so that is definitely going to be bought. However, I would like this to be my last system build for more than a year, as pumping 5k in annually is something I cannot continue. Every other year, sure. But with Volta and its stacked DRAM... I'm very cautious about dropping 1.8k+ on GPUs that most likely won't be that large of a change. We'll see, I suppose. |
|
Joined: 17 Aug 08 Posts: 2705 Credit: 1,311,122,549 RAC: 0 |
> But with Volta and its stacked DRAM... I'm very cautious about dropping 1.8k+ on GPUs that most likely won't be that large of a change. We'll see, I suppose.

Volta will still take some time. GPUs have matured quite a bit (compared to the wild early days of a new chip every 6 months!) and progress is generally slower. That's actually not so bad, because we can keep GPUs longer and the software guys have some time to actually think about using those beasts properly.

If you still have Fermis or older running, get rid of them while you can still find (casual) gamers willing to pay something for them. If you're thinking about upgrading from Kepler to Maxwell and don't want to spend too much, I propose the following: replace 2 Keplers with 1 Maxwell for about the same throughput, which should hopefully be possible with 20 nm and the architectural improvements. This way you don't have to spend as much, and you reduce power usage significantly (further savings). Your throughput won't increase, but so what? If you feel like spending again you could always add another GPU.

MrS
Scanning for our furry friends since Jan 2002 |
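To put rough numbers on the 2-for-1 swap, a hypothetical sketch; every figure here is an illustrative assumption, not a measurement:

```python
# Illustrative running-cost estimate for "replace 2 Keplers with 1 Maxwell
# at equal throughput". All numbers below are assumptions for the sketch.

kepler_watts = 170     # assumed load draw of one mid-range Kepler
maxwell_watts = 180    # assumed load draw of one 20 nm Maxwell replacement
hours_per_day = 24     # 24/7 crunching
price_per_kwh = 0.15   # assumed electricity price, $/kWh

saved_watts = 2 * kepler_watts - maxwell_watts
saved_kwh_per_year = saved_watts / 1000 * hours_per_day * 365

print(f"~{saved_watts} W saved at the wall, "
      f"~${saved_kwh_per_year * price_per_kwh:.0f}/year at ${price_per_kwh}/kWh")
# -> ~160 W saved, ~$210/year under these assumptions
```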
Retvari Zoltan Joined: 20 Jan 09 Posts: 2380 Credit: 16,897,957,044 RAC: 0
|
There is no ARM CPU on the block diagram of the GM107. After reading the article, it seems to me that this is only a half step towards the new generation: it has a better performance/watt ratio because of the evolution of the 28nm process and because of the architectural changes (these two aspects are probably bound together: this architecture can achieve a higher CUDA-core-to-chip-area ratio than the GK architecture). As its performance is expected to be similar to the GTX480's, perhaps there is no need for an on-chip CPU to fully utilize this GPU. Also, it's possible that no big changes in the GPUGrid application will be needed for it to work with this GPU. |
|
Joined: 17 Aug 08 Posts: 2705 Credit: 1,311,122,549 RAC: 0 |
As far as I remember, this "ARM on chip" story was still a complete rumor. Could well be that someone mistook material about future nVidia server chips with integrated GPU (Project Denver) for material about the regular GPUs.

MrS
Scanning for our furry friends since Jan 2002 |
skgiven Joined: 23 Apr 09 Posts: 3968 Credit: 1,995,359,260 RAC: 0 |
I would be a bit concerned about the 1306 GFLOPS rating for the GTX750Ti - that's actually below the GTX650Ti (1420). The 750Ti also has a 128-bit bus and 86.4GB/s of bandwidth. While the theoretical SP efficiency is 21.8 GFLOPS/W, it's still an entry-level card; it would take 4 of these cards to match the overall performance of a GTX780Ti. There should be plenty of OC models, and potential for these GPUs to boost further. There may also be a 1GB version of the GTX750Ti (avoid).

My confusion over ARM came from fudge reports which presumed Maxwell and ARM are joined at the hip. Just because Teslas might get an ARM this decade does not mean any other card will - it hasn't even been announced that Maxwell-based Teslas will; that's just been interpreted that way. The use of ARM doesn't require the Maxwell architecture either: the Tegra K1 is based on Kepler and uses a quad-core ARM Cortex-A15 R3, and previous Tegras also used ARM. It is the case that NVidia wants to do more on the discrete GPU and be less reliant on the underlying system, but that doesn't in itself require an ARM processor.

The only really interesting change is that the Shader Model is 5.0 - so it's CC5.0. The non-super-scalar architecture probably helped throw people into thinking these GPUs would come with ARM processors, but when you think about it, there is no sense putting a discrete processor onto an entry-level GPU. A potential obstacle to the use of ARM might be Windows licences, as these typically limit your software use to 2 CPUs (which would make a second card a no-no).

FAQ's HOW TO: - Opt out of Beta Tests - Ask for Help |
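Those figures check out with the usual peak-FLOPS formula (2 FLOPs per CUDA core per clock); a quick sketch using the published reference (base) clocks:

```python
# Peak single-precision GFLOPS: cores * 2 FLOPs/clock * clock (GHz).
def peak_sp_gflops(cores, clock_mhz):
    return cores * 2 * clock_mhz / 1000

gtx750ti = peak_sp_gflops(640, 1020)   # ~1306 GFLOPS
gtx650ti = peak_sp_gflops(768, 925)    # ~1421 GFLOPS (the "1420" above)
gtx780ti = peak_sp_gflops(2880, 875)   # ~5040 GFLOPS

print(f"GTX 750 Ti: {gtx750ti:.0f} GFLOPS, "
      f"{gtx750ti / 60:.1f} GFLOPS/W at a 60 W TDP")
print(f"GTX 780 Ti / GTX 750 Ti = {gtx780ti / gtx750ti:.1f}x")
# -> ~1306 GFLOPS, ~21.8 GFLOPS/W, and ~3.9x (hence "4 of these cards")
```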
skgiven Joined: 23 Apr 09 Posts: 3968 Credit: 1,995,359,260 RAC: 0 |
I see EVGA are selling a GTX750Ti with a 1268MHz Boost. In theory that's 16.8% faster than the reference model, though I would expect the reference card to boost higher than the quoted 1085MHz (if it works)!

FAQ's HOW TO: - Opt out of Beta Tests - Ask for Help |
MJH Joined: 12 Nov 07 Posts: 696 Credit: 27,266,655 RAC: 0
|
I have some GTX750Tis on order; should have them in my hands next week. It's not yet clear whether we'll need to issue a new application build. Stay tuned! Matt |
skgiven Joined: 23 Apr 09 Posts: 3968 Credit: 1,995,359,260 RAC: 0 |
I read that the 128-bit bus is a bottleneck, but as the card uses 6GHz-rated GDDR5, a 10% memory OC is a given. The GPU also OC's well (as the temps are low). So these cards could be tweaked to be significantly more competitive than the reference model. Compute performance is a bit mixed going by Anandtech, so it's wait and see on performance here (if they work): http://anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/22

FAQ's HOW TO: - Opt out of Beta Tests - Ask for Help |
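The OC-headroom arithmetic, sketched below; the 86.4GB/s reference figure comes from earlier in the thread, and the assumption is that the fitted GDDR5 chips are rated for 6 Gbps (the "6GHz" above) while the card ships at 5.4 Gbps effective:

```python
# Memory bandwidth: (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gb_s(bus_bits, effective_gbps):
    return bus_bits / 8 * effective_gbps

stock = bandwidth_gb_s(128, 5.4)   # 86.4 GB/s, the reference spec
rated = bandwidth_gb_s(128, 6.0)   # 96.0 GB/s at the chips' assumed rating

print(f"stock: {stock:.1f} GB/s, rated: {rated:.1f} GB/s "
      f"(+{(rated / stock - 1) * 100:.0f}% headroom)")
# -> stock: 86.4 GB/s, rated: 96.0 GB/s (+11% headroom)
```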
|
Joined: 17 Aug 08 Posts: 2705 Credit: 1,311,122,549 RAC: 0 |
Don't be fooled by the comparatively low maximum FLOPS. We got many of those with Kepler, and complained initially that we couldn't make proper use of them, as the performance per shader per clock was significantly below the non-superscalar Fermis. Now we're going non-superscalar again and gain some efficiency through that, as well as through other tweaks. And this shows in the compute benchmarks at Anandtech: the GTX750Ti beats the GTX650Ti easily and consistently, often hangs with the GTX650Ti Boost and GTX660, and sometimes performs more than twice as fast as the GTX660! None of those benchmarks is GPU-Grid, but this bodes well for Maxwell here, since GPU-Grid never really liked the super-scalarity all that much. Let's wait for Matt's test... but I expect Maxwell to do pretty well.

The 128-bit memory bus on GM107 is somewhat limiting, but mitigated by the far larger L2 cache. To what extent for GPU-Grid... I don't know.

And those chips seem to clock ridiculously high: I've seen up to almost 1.3 GHz at stock voltage (1.13 - 1.17 V). I wish the testers had lowered the voltage to see what the chips really can do, instead of being limited by the software sliders. The bigger chips naturally won't clock as well, but 20 nm should shake things up anyway.

Bottom line: don't rush to buy those cards, since they're only mainstream models after all. But don't buy any other cards for GPU-Grid until we know how good Maxwell really is over here.

MrS
Scanning for our furry friends since Jan 2002 |
|
Joined: 16 Mar 11 Posts: 509 Credit: 179,005,236 RAC: 0
|
OK, no purchases. But I would rather a professional or a Ph.D. test the pretend Maxwells, so we can be sure of what we're looking at ;-)

BOINC <<--- credit whores, pedants, alien hunters |
|
Joined: 17 Aug 08 Posts: 2705 Credit: 1,311,122,549 RAC: 0 |
Professional enough? Or shall I search for a review written by someone with a PhD in ancient Greek history? ;)

MrS
Scanning for our furry friends since Jan 2002 |