Message boards : Graphics cards (GPUs) : GeForce GTX Titan launching the 18th
Joined: 18 Jun 12 | Posts: 297 | Credit: 3,572,627,986 | RAC: 0

There's an interesting article here: http://videocardz.com/39536/nvidia-geforce-gtx-titan-to-be-released-on-february-18th I guess it will be a paper launch, but maybe some benchmarks with the NDAs lifted.
Joined: 16 Mar 11 | Posts: 509 | Credit: 179,005,236 | RAC: 0

Click this instead: http://videocardz.com/39536/nvidia-geforce-gtx-titan-to-be-released-on-february-18th

BOINC <<--- credit whores, pedants, alien hunters
Joined: 18 Jun 12 | Posts: 297 | Credit: 3,572,627,986 | RAC: 0

Here are some pictures of it: http://www.techpowerup.com/180297/NVIDIA-GeForce-GTX-Titan-Graphics-Card-Pictured-in-Full.html Sorry for being lazy, just got excited.
Joined: 16 Mar 11 | Posts: 509 | Credit: 179,005,236 | RAC: 0

Looks nice, love the window. I think I'd put it on a hinge or a slider, like a sunroof, to make dust blow-outs easier.
Joined: 18 Jun 12 | Posts: 297 | Credit: 3,572,627,986 | RAC: 0

Now they're saying it's delayed until the 19th. Here's a whole new batch of pictures: http://videocardz.com/39618/nvidia-geforce-gtx-titan-pictured
Retvari Zoltan | Joined: 20 Jan 09 | Posts: 2380 | Credit: 16,897,957,044 | RAC: 0

According to its specifications, it will be ~39.5% faster than a standard GTX 680, and it will probably be more power efficient. I've read that the supply of the GeForce Titan will be very limited (10,000 cards).
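A back-of-envelope sketch of where a spec-sheet estimate like this can come from. The core counts and clocks below are the rumoured launch figures, assumed here for illustration; real speedups depend entirely on the workload.

```python
# Rough throughput proxy: CUDA cores x clock speed.
# Specs are the rumoured launch figures (assumptions, not confirmed):
#   Titan:   2688 cores at an 837 MHz base clock
#   GTX 680: 1536 cores at a typical 1058 MHz boost clock
titan_cores, titan_mhz = 2688, 837
gtx680_cores, gtx680_mhz = 1536, 1058

speedup = (titan_cores * titan_mhz) / (gtx680_cores * gtx680_mhz)
print(f"Estimated speedup over GTX 680: {(speedup - 1) * 100:.1f}%")  # ~38.4%
```

Depending on which clocks you compare (base vs. boost on either side), this proxy lands anywhere from the high 30s to the mid 40s percent, so a figure like ~39.5% is plausible, but it is spec-sheet arithmetic only.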
Joined: 17 Aug 08 | Posts: 2705 | Credit: 1,311,122,549 | RAC: 0

At that price point, 10k might actually be enough :D

MrS
Scanning for our furry friends since Jan 2002
Joined: 16 Mar 11 | Posts: 509 | Credit: 179,005,236 | RAC: 0

Good looks and a limited edition aren't worth that much to me; maybe in a car/truck, but not in a video card.
Joined: 15 Apr 10 | Posts: 123 | Credit: 1,004,473,861 | RAC: 0

Mostly agree; it's a component, so specs matter the most. Sometimes it's worth it for peripherals (keyboard, mouse, things you actually SEE and TOUCH on a regular basis :P). But for a GPU/CPU/etc. that's just going to be outdated in a year or less, why pay the premium?

XtremeSystems.org - #1 Team in GPUGrid
skgiven | Joined: 23 Apr 09 | Posts: 3968 | Credit: 1,995,359,260 | RAC: 0

NVIDIA's GeForce GTX Titan, Part 1: Titan For Gaming, Titan For Compute, by Ryan Smith, anandtech.com

I guess disabling FP64 (running at 1/24) would be better for here, but that depends on the apps and WUs. What, if anything, would Dynamic Parallelism, Hyper-Q and more registers per thread bring to this project? I see it's CC 3.5.

FAQ's HOW TO: - Opt out of Beta Tests - Ask for Help
Joined: 17 Aug 08 | Posts: 2705 | Credit: 1,311,122,549 | RAC: 0

> I guess disabling FP64 (running at 1/24) would be better for here, but that depends on the apps and WUs.

GPU-Grid uses exactly 0 FP64, so disabling it won't hurt. And not disabling Turbo (by leaving FP64 at 1/24) will benefit GPU-Grid, depending on what other manual OC one might have in place instead. Crippling FP64 to get Turbo s*cks hard for people who want to run a DP backup project, though.

Generally I think nVidia is being stupid forcing this decision on us. Sure, the chip will consume more power if fully utilized, but they've got power consumption monitoring and temperature control in place, as well as fine-grained Turbo modes with voltage adjustment. I can't see any harm in leaving Turbo and FP64 at 1/3 rate active and just throttling down to base clock & voltage if really necessary. That sure beats starting at base clock & voltage regardless of the actual workload.

> What if anything would Dynamic Parallelism, Hyper-Q and more Registers/Thread bring to this project?

More registers per thread avoid slowing down more complex code. However, if the code is written to perform well on regular Keplers (and older), then you should automatically stay out of the region where this would matter. The other two features have to be explicitly programmed for, so they are not going to be used here unless all GPUs get them.

MrS
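To put rough numbers on the 1/24-vs-1/3 trade-off, here is a sketch of the theoretical peaks. The core count and base clock are the rumoured Titan specs, assumed here; measured throughput will be lower.

```python
# Theoretical peak throughput, assuming 2688 CUDA cores at an 837 MHz
# base clock and 2 FLOPs per core per cycle (fused multiply-add).
cores = 2688
clock_ghz = 0.837

sp_gflops = 2 * cores * clock_ghz   # single-precision peak
dp_full = sp_gflops / 3             # GK110's full 1/3-rate FP64
dp_capped = sp_gflops / 24          # FP64 capped at 1/24 (Turbo enabled)

print(f"SP peak:           {sp_gflops:.0f} GFLOPS")
print(f"DP at 1/3 rate:    {dp_full:.0f} GFLOPS")
print(f"DP capped at 1/24: {dp_capped:.0f} GFLOPS")
```

Since GPU-Grid uses no FP64 at all, the ~190 GFLOPS cap costs it nothing, while a DP backup project would lose roughly 8x its double-precision throughput.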
skgiven | Joined: 23 Apr 09 | Posts: 3968 | Credit: 1,995,359,260 | RAC: 0

Unless the researchers can find something special hidden within this card, I'm writing it off as a niche GPU; not for general users, gamers and 'normal' GPU crunchers. It's better suited to other things, mainly due to the price. While the Titan is essentially a Tesla for ~£830 (which will be excellent for some researchers), I cannot see GPUGrid needing 6GB GDDR5, and I can't see the uptake of this card being high enough for GPUGrid to develop specifically for it, especially if there will only be 10,000 of them. If GPUGrid started down that route, the project would need about 7 different apps at any one time (probably based on CC).

We really need to see what the performance actually is for GPUGrid (and other projects) before drawing any solid conclusions. That said, it's clear that the price is excessively high. 2688 cores was a little unexpected; perhaps lesser (and more reasonably priced) versions will turn up (2496, 2304 and/or possibly 2112). If they don't, that suggests this is the end of the line for GK110. So this might be the launch of a different range of GPUs, or the end of a high-end line. Either way, I would be extremely surprised if NVidia were not already well into the design of a new supercomputer GPU.

If the Titan turns out to be 40% faster than a GTX 680 here, then it's still ~36% slower than a GTX 690, which costs ~£750 (presently £80 less). At only 50W more (TDP), that would make the GTX 690 cheaper to buy and more economical to run, in terms of performance per Watt. However, as the Titan really has two profiles (SP and DP), the actual performance per Watt would need to be assessed on a project-by-project basis. At 250W I think it would be a tough ask to squeeze two such GPUs onto the one card, but if you dropped the CUDA core count and limited it to SP it might be more doable. More likely in a revamp: the GTX 480 never emerged in such a format, but did turn up in the revamped GTX 590.

The biggest advantage I see for the GTX Titan is the exhaust system. It would better facilitate 3 or 4 GPUs in the one case: 3 GTX Titans would be faster and cooler than 2 GTX 690s (4 GK104 GPU cores), so better for crunching and gaming (so long as you are into £4K+ main system units). While 4 GTX Titans would roughly match 3 GTX 690s, the Titans' exhaust cooling would be much better. In fact, 3 GTX 690s is a bit impractical.
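The performance-per-Watt comparison can be sketched numerically. Every input here is an assumption for illustration: Titan taken as 1.4x a GTX 680, the GTX 690 as ~1.9x given imperfect dual-GPU scaling, and TDPs of 250 W and 300 W.

```python
# Back-of-envelope performance per Watt, relative to a GTX 680 = 1.0.
# All inputs are assumptions for illustration, not measured results.
cards = {
    "GTX Titan": {"perf": 1.40, "tdp_w": 250},
    "GTX 690":   {"perf": 1.90, "tdp_w": 300},  # ~2x GK104, imperfect scaling
}

for name, c in cards.items():
    perf_per_watt = c["perf"] / c["tdp_w"]
    print(f"{name}: {perf_per_watt * 1000:.2f} relative perf per kW")
```

Under these assumptions the GTX 690 comes out ahead (about 6.3 vs 5.6 per kW); swap in measured GPUGrid throughputs for the perf figures and the picture could change, particularly once the Titan's SP/DP profiles are taken into account.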
Joined: 16 Mar 11 | Posts: 509 | Credit: 179,005,236 | RAC: 0

> The biggest advantage I see for the GTX Titan is the exhaust system. It would better facilitate 3 or 4 GPU's in the one case:

What makes the Titan's exhaust system better? Why are 3 GTX 690s impractical? Impractical from a cooling/exhaust perspective? If so, why? (I plan on ordering a mobo and a single GTX 690 at month's end, and plan on eventually adding 3 more GTX 690s to the same mobo. If you think it's a bad idea from an exhaust/cooling perspective, or any other perspective, I'd appreciate hearing about it now, before I make the initial purchase.)
Joined: 15 Apr 10 | Posts: 123 | Credit: 1,004,473,861 | RAC: 0

The GTX 690 exhausts inside the PC case (at least half of it, anyway); most other designs (including the 680/Titan) exhaust outside the PC case. This only refers to the reference designs, though; many 3rd-party manufacturers change the cooling systems to something completely different. So if you have multiple 690s in one case, the exhaust will heat up the inside of the case and the other cards quite noticeably, and you would need a lot more airflow in and out of the PC case, whereas multiple 680s/Titans would exhaust out the rear of the case and less attention need be paid to case airflow.
Joined: 17 Aug 08 | Posts: 2705 | Credit: 1,311,122,549 | RAC: 0

Sure, this card is a niche product. It's for people who can't afford (or don't want) a Tesla, make a living from crunching things on the card, and either need massive FP64 or can't parallelize their task well. Or who just want an über-gaming rig and don't care about the money :p

Regarding the stock cooler vs. the GTX 690: the latter exhausts lots of hot air towards the case front, so you'll need a huge exhaust fan there and some other intakes, probably best in the side panel. Or run caseless if the crunching monster is in some distant room anyway.

MrS
Joined: 22 Nov 09 | Posts: 114 | Credit: 589,114,683 | RAC: 0

NVIDIA's GeForce GTX Titan, Part 1: Titan For Gaming, Titan For Compute
NVIDIA's GeForce GTX Titan Review, Part 2: Titan's Performance Unveiled - compute performance benchmarks.

Well written by a CS PhD student with a specialization in parallel computing and GPUs. Unrivaled compute performance here in both DP and SP. IMHO, nVidia got wise with this card. I bet it flies off the shelves to people who want this kind of compute performance at a fraction of the cost of a Tesla. If only my wallet were large enough...
Joined: 22 Nov 09 | Posts: 114 | Credit: 589,114,683 | RAC: 0

> Unless the researchers can find something special hidden within this card I'm writing it off as a niche GPU; not for general users, gamers and 'normal' GPU crunchers.

Agreed.
microchip | Joined: 4 Sep 11 | Posts: 110 | Credit: 326,102,587 | RAC: 0

Gattorantolo [Ticino] | Joined: 29 Dec 11 | Posts: 44 | Credit: 251,211,525 | RAC: 0
Send message Joined: 31 May 12 Posts: 8 Credit: 12,361,387 RAC: 0 Level ![]() Scientific publications ![]() ![]()
|
I might end up saving up and grabbing one for gaming. I have a single GTX 670 right now and it's not quite cutting it. Some of the games I play don't really scale very well with multiple GPUs, and upgrading to a 680 isn't worth the time or trouble. If I was patient I'd just wait for the new chips to come out in 2014 but that's not the case. Getting insane compute performance out of it is just an added bonus! |
©2025 Universitat Pompeu Fabra