NVidia GPU Card comparisons in GFLOPS peak
Retvari Zoltan · Joined: 20 Jan 09 · Posts: 2380 · Credit: 16,897,957,044 · RAC: 0

Sorry, I put a bad link in my previous post (and based my reasoning on it): the Intel Core i7-3770K is actually socket 1155 too, so it has only one PCIe 3.0 x16 lane. That's my fault. Socket 2011 CPUs (Core i7-3820, Core i7-3930K, Core i7-3960X) have two x16 lanes and one x8 lane. Here is a PDF explaining socket 2011; see page 10.
Joined: 8 Mar 12 · Posts: 411 · Credit: 2,083,882,218 · RAC: 0

Unfortunately, socket 2011 (with Sandy Bridge-E) is no longer officially PCIe 3.0. Apparently there's a problem with the controller timing between different batches of chips. What this means is that it's down to luck whether or not your CPU + GPU combination will accept the hack. This has been confirmed by NVIDIA. I would post the link, but their forums have been down for some time now for maintenance.
Joined: 28 May 12 · Posts: 63 · Credit: 714,535,121 · RAC: 0

Is it time to add the 600 series cards to this list? 690s are EXPENSIVE and difficult to find in stock at some dealers. 680s and 670s seem to be readily available, and folks are reporting success running all these cards on GPUGrid. As we have discussed in private messages, going from a used GTX 480 at about $200 to a new GTX 670 at about $400, or a new GTX 680 at about $500, is an extremely expensive upgrade that, until we can quantify the performance increase, MAY NOT SEEM WORTH THE PRICE.
Joined: 28 May 12 · Posts: 63 · Credit: 714,535,121 · RAC: 0

OK, I went back and saw your earlier reply to this same question, and I somewhat understand your reluctance to generate and maintain the data, since several factors besides the GPU card itself are involved. Still, an attempt at this, with the caveats that PCIe 3.0 vs PCIe 2.0, older versus newer machine, relative CPU performance levels, etc. all affect the results in both directions, could benefit everyone who is considering an upgrade.
robertmiles · Joined: 16 Apr 09 · Posts: 503 · Credit: 769,991,668 · RAC: 0

Why not list the GTX 6nn data available currently, but clearly mark it preliminary? That should at least be better than no data at all.
JStateson · Joined: 31 Oct 08 · Posts: 186 · Credit: 3,578,903,157 · RAC: 0

The update is just to add three cards (GTX 560 Ti 448, GTX 560 and GT 545). If I understand the above, would it be fair to state that a dual GTX 460 like this would not be as efficient as a single GTX 570? i.e. 816 x 2 is less than 1896. So if the GTX 460 has 336 cores, then only 256 cores are used at GPUGrid. The GTX 570 has 480 cores; are all 480 used? Why is there a statement in this thread that only 256 cores can be used? Both EVGA boards are about the same price. I was thinking about getting that dualie 460, but not if the GTX 570 is clearly better. By better I mean 2x as fast according to your cc-corrected performance. How do your statistics compare to PrimeGrid or Milkyway? Thanks for looking!
skgiven · Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0

A single reference GTX 570 was around 16% faster than two reference GTX 460s when I did the table. The situation hasn't changed much, though we are using a different app and running different tasks now. I don't like the look of that extra-long bespoke GTX 460 dualie; it would be restricted to EATX cases and the like. As far as I know, none of the other projects, including PG and MW, suffer from the same issues with CC 2.1 cards; ATI/AMD cards are much better for MW anyway. A GTX 570 uses all 480 cores here (CC 2.0). If a GTX 670 is too expensive, wait a couple of weeks and get a GTX 660 Ti if it checks out. Even though the GTX 570 is a good card and you might be able to get it for a good price, in a couple of weeks the GTX 660 Ti should prove to be a better card in terms of performance per outlay and running cost.

FAQ's · HOW TO: Opt out of Beta Tests · Ask for Help
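For illustration only (not taken from the thread's table), here is a minimal sketch of the peak-GFLOPS arithmetic behind the comparison above. It assumes 2 FLOP per shader per clock (one FMA), reference shader clocks of 1.35 GHz (GTX 460) and 1.464 GHz (GTX 570), and a 2/3 effective-shader penalty for CC 2.1 cards; the penalty factor and clocks are my assumptions, and the resulting figures are not the thread's cc-corrected numbers.

```python
# Rough peak-GFLOPS comparison: dual GTX 460 (CC 2.1) vs a single GTX 570 (CC 2.0).
# Assumption: peak SP GFLOPS = shaders * shader clock (GHz) * 2 (one FMA per clock),
# with CC 2.1 cards only keeping ~2/3 of their shaders busy (as discussed in this thread).

def peak_gflops(shaders, shader_clock_ghz, cc21=False):
    effective_shaders = shaders * 2 / 3 if cc21 else shaders
    return effective_shaders * shader_clock_ghz * 2

dual_gtx460 = 2 * peak_gflops(336, 1.35, cc21=True)   # two GPUs on the dualie
single_gtx570 = peak_gflops(480, 1.464)

print(f"dual GTX 460   ~ {dual_gtx460:.0f} effective GFLOPS")
print(f"single GTX 570 ~ {single_gtx570:.0f} effective GFLOPS")
print(f"GTX 570 advantage: {(single_gtx570 / dual_gtx460 - 1) * 100:.0f}%")
```

Under these assumptions the single GTX 570 comes out roughly 16% ahead of the dual GTX 460, consistent with the figure quoted above.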
JStateson · Joined: 31 Oct 08 · Posts: 186 · Credit: 3,578,903,157 · RAC: 0

I had a GTX 670 for 24 hours before passing it to my son. It would not run DVDFab's Blu-ray copy program, unlike the GTX 570, the GTX 460, or any earlier CUDA boards. The NVIDIA forum has been down for almost a month, so I am in the dark as to why it would not work, and DVDFab had no time frame for getting their product to work with Kepler. I have a single-slot "Galaxy Razor" GTX 460 that overheats badly in an EATX case even after underclocking it all the way down. EVGA's dualie is going for $230 after rebate, so I may go for it. Thanks!
Retvari Zoltan · Joined: 20 Jan 09 · Posts: 2380 · Credit: 16,897,957,044 · RAC: 0

> I have a single-slot "Galaxy Razor" GTX 460 that overheats badly in an EATX case even after underclocking it all the way down. EVGA's dualie is going for $230 after rebate, so I may go for it.

If your single-GPU GTX 460 overheats, a dual-GPU GTX 460 will overheat even faster, because both cards exhaust the hot air inside the case and the latter has a higher TDP. I think DVDFab will support the GTX 6xx very soon, so don't buy outdated technology for crunching (or you'll pay the price difference in electricity costs).
JStateson · Joined: 31 Oct 08 · Posts: 186 · Credit: 3,578,903,157 · RAC: 0

A single-slot GPU is not the same as a single GPU. The "Galaxy Razor" takes up exactly one PCIe slot and its fan simply blows air around the inside of the case. The dual-slot ones exhaust the air through the second slot (usually, but not always) and have much more efficient cooling. I once saw my Galaxy Razor selling for 4x what I paid for it on eBay. It is also possible to plug it into a notebook's "ExpressCard" slot, as shown here. It works fine for gaming, but 24/7 CUDA is a problem with PrimeGrid even at normal clock speed. The 2x one I may replace it with will put out more heat, as you say, but I suspect it has more efficient cooling. I do have a 3-slot GTX 570, made by Asus, and it runs really cool 24/7 on any BOINC project. However, it does take 3 slots. I have an extensive collection of movies, mostly Blu-ray, and the NVIDIA GPUs make a significant difference in copy speed.
Retvari Zoltan · Joined: 20 Jan 09 · Posts: 2380 · Credit: 16,897,957,044 · RAC: 0

> A single-slot GPU is not the same as a single GPU.

I know.

> The dual-slot ones exhaust the air through the second slot (usually, but not always) and have much more efficient cooling.

As you say, this is true for radial (blower) type coolers like the EVGA GeForce GTX 570's, which blow the air along the PCB and directly out of the case. The EVGA GeForce GTX 460 2Win, by contrast, has 3 axial fans blowing air through the heatsink's fins at right angles to the PCB. In addition, it has a very limited rear exhaust grille (more than the single-slot card has, but you need positive air pressure inside your case to move the hot air through that little grille, because the 3 fans do not push the air towards it).

> The 2x one I may replace it with will put out more heat, as you say, but I suspect it has more efficient cooling.

Of course the dual-slot cooler removes heat from the chip more efficiently than a single-slot cooler, but not from the case (if it isn't the radial type). After a short time it can't cool itself with the hot air emitted into the case. You have to have fans blowing cool air from outside directly at this GPU's 3 coolers to make them work well, because they don't move the hot air out of the case. It will also make your CPU and PSU run hotter.

> I do have a 3-slot GTX 570, made by Asus, and it runs really cool 24/7 on any BOINC project. However, it does take 3 slots.

This should be the ENGTX570 DCII, which has a far better cooler fin surface to power ratio, and a rear air exhaust grille 3 times larger than the EVGA GeForce GTX 460 2Win's.
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0

Don't buy a GTX 460 for GPU-Grid now; it's simply not efficient enough any more for 24/7 crunching. I'd rather think about replacing the cooler on your current card with some aftermarket design. That will probably also use 3 slots, but be strong and quiet.

MrS
Scanning for our furry friends since Jan 2002
Joined: 11 Nov 10 · Posts: 9 · Credit: 53,476,066 · RAC: 0

Currently, the best price/performance/watt would be the GTX 660 Ti, which has the same 1344 shaders as the GTX 670 but is around 20% cheaper...
Joined: 5 Dec 11 · Posts: 147 · Credit: 69,970,684 · RAC: 0

Would the lower memory bandwidth of the 660 Ti (144 GB/s, 192-bit) compared to the 670 (192 GB/s, 256-bit) make a difference to performance on GPUGrid?
Joined: 13 Jul 09 · Posts: 64 · Credit: 2,922,790,120 · RAC: 60

> After a short time it can't cool itself with the hot air emitted into the case. You have to have fans blowing cool air from outside directly at this GPU's 3 coolers to make them work well, because they don't move the hot air out of the case. It will also make your CPU and PSU run hotter.

Cases are dumb... http://www.skipsjunk.net/gallery/texas12.html

- da shu @ HeliOS, "A child's exposure to technology should never be predicated on an ability to afford it."
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0

So far GPU-Grid hasn't depended much on memory bandwidth, so the performance difference should be a few percent at most, probably less. I don't have hard numbers at hand, though.

MrS
Scanning for our furry friends since Jan 2002
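As a side note (mine, not MrS's), the theoretical bandwidth figures quoted a few posts above follow directly from bus width and effective memory clock. A quick sketch, assuming the 6008 MHz effective GDDR5 reference clock for both cards:

```python
# Theoretical memory bandwidth = (bus width in bits / 8 bytes) * effective memory clock.
# Assumes both cards run GDDR5 at the 6008 MHz effective reference clock.

def bandwidth_gb_per_s(bus_width_bits, effective_mem_clock_mhz):
    return bus_width_bits / 8 * effective_mem_clock_mhz * 1e6 / 1e9

print(f"GTX 660 Ti (192-bit): {bandwidth_gb_per_s(192, 6008):.1f} GB/s")  # ~144 GB/s
print(f"GTX 670    (256-bit): {bandwidth_gb_per_s(256, 6008):.1f} GB/s")  # ~192 GB/s
```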
Joined: 28 Mar 09 · Posts: 490 · Credit: 11,732,395,728 · RAC: 71,755

> Currently, the best price/performance/watt would be the GTX 660 Ti, which has the same 1344 shaders as the GTX 670 but is around 20% cheaper...

So in the real world, with everything else being equal, will the GTX 660 Ti finish the same type of WU as fast as the GTX 670? How is that for a dumb and simple question?
Joined: 5 Dec 11 · Posts: 147 · Credit: 69,970,684 · RAC: 0

> Currently, the best price/performance/watt would be the GTX 660 Ti, which has the same 1344 shaders as the GTX 670 but is around 20% cheaper...

Not dumb at all. If not the same or faster (the 660 Ti is clocked faster than a 670), then it will be very close to it, especially if you overclock the memory a bit to get some of the bandwidth back. My understanding is that GPUGrid does not pass that much information over the PCIe lanes, so I would be surprised if a 660 Ti ends up being much, if any, slower than a 670. http://www.gpugrid.net/result.php?resultid=6043519 - my only result with my 660 Ti so far. I note, though, that the CPU time and GPU time are the same; not sure what is going on with that...?
Joined: 17 Aug 08 · Posts: 2705 · Credit: 1,311,122,549 · RAC: 0

> So in the real world, with everything else being equal, will the GTX 660 Ti finish the same type of WU as fast as the GTX 670?

A very valid question, but like Simba I'll only go as far as stating "very close". The problem is finding configurations which are similar enough to judge this. OS, driver, PCIe speed, CPU speed and even other running applications affect the actual runtimes (to a varying degree). And the real elephant in the room: GPU clock speed, which isn't reported by BOINC.

MrS
Scanning for our furry friends since Jan 2002
skgiven · Joined: 23 Apr 09 · Posts: 3968 · Credit: 1,995,359,260 · RAC: 0

The reporting is also somewhat down to the app; it's reported, albeit badly, by 3.1 but not by 4.2.

A 3.1 run, Stderr output:

    <core_client_version>7.0.28</core_client_version>
    <![CDATA[
    <stderr_txt>
    # Using device 0
    # There is 1 device supporting CUDA
    # Device 0: "GeForce GTX 470"
    # Clock rate: 1.36 GHz
    # Total amount of global memory: 1341718528 bytes
    # Number of multiprocessors: 14
    # Number of cores: 112
    MDIO: cannot open file "restart.coor"
    # Time per step (avg over 1500000 steps): 22.644 ms
    # Approximate elapsed time for entire WU: 33966.235 s
    called boinc_finish
    </stderr_txt>
    ]]>

A 4.2 run, Stderr output:

    <core_client_version>7.0.28</core_client_version>
    <![CDATA[
    <stderr_txt>
    MDIO: cannot open file "restart.coor"
    # Time per step (avg over 3750000 steps): 7.389 ms
    # Approximate elapsed time for entire WU: 27707.531 s
    called boinc_finish
    </stderr_txt>
    ]]>

I had a look at Zoltan's GTX 670 times, as he reported his clocks, and compared them to my 660 Ti times over several different task types. His system was consistently 9.5% faster. The first thing I would look at here is XP vs W7; I'm not sure what the difference is now. It used to be >11% better for XP. Assuming that is still the case, and there is no noticeable performance gain from PCIe 3 over PCIe 2 x16, then I would say the cards perform very closely here; the GTX 670 is about 2% faster after adjusting for the clocks. So the 670 would just about edge it in overall throughput, but the 660 Ti uses slightly less power and costs a fair amount less. Of course, if XP isn't >11% faster than W7, then the 670 is significantly faster...

FAQ's · HOW TO: Opt out of Beta Tests - Ask for Help
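One way to read skgiven's adjustment above, sketched as plain arithmetic (my reconstruction, not necessarily his exact method): divide the observed advantage by the assumed OS factor and by the clock ratio. The 9.5% and >11% figures come from the post; the boost clocks below are hypothetical placeholders, since the actual reported clocks are not quoted here.

```python
# Normalize an observed runtime advantage by OS and clock-speed differences
# to estimate the architecture-level difference between the two cards.
# 9.5% observed advantage and the ~11% XP-over-W7 factor are from the post above;
# the boost clocks are hypothetical placeholders, not the actual reported values.

observed_advantage = 1.095   # Zoltan's GTX 670 host finished tasks ~9.5% faster
xp_over_w7 = 1.11            # assumed OS advantage of XP over Windows 7
clock_gtx670 = 0.980         # hypothetical GTX 670 average boost clock, GHz
clock_gtx660ti = 1.013       # hypothetical GTX 660 Ti average boost clock, GHz

adjusted = observed_advantage / xp_over_w7 / (clock_gtx670 / clock_gtx660ti)
print(f"OS- and clock-adjusted GTX 670 advantage: {(adjusted - 1) * 100:+.1f}%")
```

With these placeholder clocks the adjusted advantage comes out around +2%, in line with the conclusion above; with different clocks or a different OS factor the sign could flip, which is why the raw 9.5% alone doesn't settle the question.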