Message boards : Graphics cards (GPUs) : Nvidia GT300
Joined: 29 Aug 08 · Posts: 3
GPU specifications: Guru3d, Brightsideofnews
Joined: 24 Dec 08 · Posts: 738
It all sounds good, but no news yet on when it will be out, other than the "in the next few weeks" line.
BOINC blog
Joined: 29 Aug 08 · Posts: 3
GDF · Joined: 14 Mar 07 · Posts: 1958
Some more technical data on the G300 from Nvidia:
http://www.nvidia.com/content/PDF/fermi_white_papers/NVIDIAFermiArchitectureWhitepaper.pdf
It is likely to be at least 3 times faster than a GTX 285 for GPUGRID. We will probably compile specifically for it, with multiple application versions, so that people can get the maximum performance.
gdf
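A minimal sketch of that multi-version idea, assuming nothing about GPUGRID's actual code: a host program can query the device's compute capability at run time and report which build would apply (Fermi/GT300 parts report 2.0, GT200 parts report 1.3).

```cpp
// Illustrative only, not GPUGRID code: query the compute capability of the
// first CUDA device and report which class of application build would apply.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    cudaError_t err = cudaGetDeviceProperties(&prop, 0);   // device 0
    if (err != cudaSuccess) {
        std::fprintf(stderr, "CUDA error: %s\n", cudaGetErrorString(err));
        return 1;
    }
    std::printf("%s: compute capability %d.%d\n", prop.name, prop.major, prop.minor);

    if (prop.major >= 2) {
        std::printf("Fermi-class device: the Fermi-specific build would apply.\n");
    } else if (prop.major == 1 && prop.minor >= 3) {
        std::printf("GT200-class device: the GT200 build would apply.\n");
    } else {
        std::printf("Older device: the baseline build would apply.\n");
    }
    return 0;
}
```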
Joined: 13 Jul 09 · Posts: 64
How big of a loan would one have to apply for to get one of these things? Do I need to be looking at a ten-year term?
- da shu @ HeliOS, "A child's exposure to technology should never be predicated on an ability to afford it."
skgiven · Joined: 23 Apr 09 · Posts: 3968
It is the 25 percent deposit that I am worried about! |
Joined: 24 Dec 08 · Posts: 738
> How big of a loan would one have to apply for to get one of these things? Do I need to be looking at a ten-year term?

I think they will be looking at similar pricing to the HD 5870, as that's their main competition. The GT200-based cards will probably fall in price in the short term, so they can compete against ATI until the GT300 gets into volume production.
BOINC blog
GDF · Joined: 14 Mar 07 · Posts: 1958
I have heard 20% higher than the current price of a GTX 295.
gdf
Joined: 22 Jul 09 · Posts: 21
The best article is from realworldtech.com. It's a good read; they know what they are talking about. Will GPUGRID take advantage of the double-precision capabilities? Does scientific computing cache well, i.e. data locality?
robertmiles · Joined: 16 Apr 09 · Posts: 503
> The best article is from realworldtech.com. It's a good read; they know what they are talking about. Will GPUGRID take advantage of the double-precision capabilities? Does scientific computing cache well, i.e. data locality?

You'll have to wait for someone on the GPUGRID project team to check whether their program, and the underlying science, even has a significant need for double precision, which would probably double the amount of GPU board memory needed to make full use of the same number of GPU cores.

In case you're interested in the Milkyway@home project, I've already found that the GPU version of their application requires a card with double precision, and therefore a 200-series chip if the card uses an Nvidia GPU. It looks worth checking whether that's a good use for the many GTX 260 cards with a rather high error rate under GPUGRID, though. They've already said that their underlying science needs double precision to produce useful results.

As for caching well, I'd expect that to depend highly on how the program was written, and not to be the same for all scientific computing.
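To make the memory point concrete, here is a rough sketch with purely hypothetical buffer sizes (GPUGRID's real data layout is not described in this thread); it only shows that the same number of values stored in double precision needs twice the device memory of single precision.

```cpp
// Purely hypothetical sizes, to illustrate the memory point above: the same
// number of values stored as double takes twice the device memory of float.
#include <cstdio>

int main() {
    const size_t n_values  = 64u * 1024u * 1024u;  // hypothetical 64M values per buffer
    const size_t n_buffers = 4;                    // e.g. positions, velocities, forces, energies

    double mb_single = double(n_values * n_buffers * sizeof(float))  / (1024.0 * 1024.0);
    double mb_double = double(n_values * n_buffers * sizeof(double)) / (1024.0 * 1024.0);

    std::printf("single precision: %.0f MB\n", mb_single);  // 1024 MB
    std::printf("double precision: %.0f MB\n", mb_double);  // 2048 MB
    return 0;
}
```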
GDF · Joined: 14 Mar 07 · Posts: 1958
We will take advantage of the 2.5 TFLOPS of single precision.
gdf
Joined: 2 Mar 09 · Posts: 159
> We will take advantage of the 2.5 TFLOPS of single precision.

With those flops, that card would finish a current WU in a little under 2 hours, if my math is correct, which it may not be.
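A back-of-the-envelope version of that estimate, with the assumptions spelled out: 2.5 TFLOPS is GDF's figure above, ~1.06 TFLOPS is the GTX 285's published single-precision peak, and the 4.5-hour baseline per work unit is only a placeholder, since real throughput never scales perfectly with peak FLOPS.

```cpp
// Back-of-the-envelope only; the WU baseline is an assumed placeholder.
#include <cstdio>

int main() {
    const double gtx285_tflops = 1.06;  // GTX 285 theoretical single-precision peak
    const double gt300_tflops  = 2.5;   // figure quoted by GDF above
    const double wu_hours_285  = 4.5;   // assumed current WU runtime on a GTX 285

    const double speedup  = gt300_tflops / gtx285_tflops;  // ~2.4x
    const double wu_hours = wu_hours_285 / speedup;        // ~1.9 h

    std::printf("naive speedup: %.1fx, estimated WU time: %.1f hours\n", speedup, wu_hours);
    return 0;
}
```

Under those assumptions the poster's "a little under 2 hours" lands in the same range.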
Hydropower · Joined: 3 Apr 09 · Posts: 70
My confidence in NVidia has dropped significantly. They presented a mockup Fermi-Tesla card as 'the real thing' without telling anyone beforehand that it was a fake card. It took this article to make them confirm it was a mockup. This would suggest they do not have a real card.
http://www.semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc/
skgiven · Joined: 23 Apr 09 · Posts: 3968
Scientific accuracy can be obtained and validated in more than one way, so double precision is not the only method!
PS. I doubt that the amount of GDDR5 will make any difference to GPUGRID, at least not any time soon. If a card is released with 512 MB that costs 30% less than one with, say, 2 GB, get the 512 MB card. If you want to go from A to B fast and not spend too much getting there, buy the car without the caravan attached!
Joined: 24 Dec 08 · Posts: 738
> My confidence in NVidia has dropped significantly.

Apparently the GTX 260 and GTX 275 have been killed off. Better get your last orders in. They expect the GTX 295 will be next. Pity they haven't got anything to replace them with (i.e. the GT300-based cards).
BOINC blog
Beyond · Joined: 23 Nov 08 · Posts: 1112
> My confidence in NVidia has dropped significantly.

Doesn't sound too good for NVidia at the moment. Nvidia kills GTX285, GTX275, GTX260, abandons the mid and high end market:
http://www.semiaccurate.com/2009/10/06/nvidia-kills-gtx285-gtx275-gtx260-abandons-mid-and-high-end-market/
Current NV high-end cards are dead, Fermi is so far vapor, they are stopping development on all chipsets, and there are legal problems with Intel. Add that to the Sony PS3 no-Linux debacle, and I'd think GPUGRID might want to accelerate the development of an ATI client.
Paul D. Buck · Joined: 9 Jun 08 · Posts: 1050
I have run across some other discussion threads that don't seem to see things quite so direly ... only time will tell if all this is true and how it will shake out. Maybe Nvidia is dead or will be dead ... but it's not as if all the current cards are going to die tomorrow ... so there will be plenty of support for months to come.

In the PS3 case it was more that any machine updated with the new firmware would no longer be able to run Linux, and no new PS3 would be able to either ... I don't know how much of the total work was being done by the PS3, but if it was low enough then it is a logical decision to shut it down, because the cost was not worth the benefit, along with the fact that it would be a rapidly shrinking pool ... The cases, thusly, are not parallel. Superficially similar, but nowhere near on the same timescale ...

Showing a mock-up that is not functional at a press briefing? Makes sense to me ... who in their right mind wants to risk one of the few working prototypes to butterfingered execs or press flacks?
GDF · Joined: 14 Mar 07 · Posts: 1958
Guys, the G300 is simply amazing! I want to test it before saying so, but it could really be the reference architecture for two years. Once it is out, nobody would want to buy a GTX 295; that is why the factories are not producing them anymore. It is different for the GTX 285, which I am sure will just become a low-end card, maybe under another name.

Nvidia has always, as a practice, advertised something before they have it on the market. In this case it is just a few months away, so quite in time. The G300 should be out between Christmas and New Year, and it should then be the fastest GPU out there, including ATI's (which, however, will be almost there with the 5870, amazing as well).

When we support ATI does not depend on us, but on ATI themselves providing a working OpenCL implementation. I believe that this is close now.
gdf
Hydropower · Joined: 3 Apr 09 · Posts: 70
> Showing a mock-up that is not functional at a press briefing? Makes sense to me ... who in their right mind wants to risk one of the few working prototypes to butterfingered execs or press flacks?

The issue is that the mockup was presented as the real item, and it was not at a press briefing; it was at the GPU Technology Conference, for developers and investors. If they had a real card, it would have made sense to actually show it, because at that time rumours were already spreading that they did not have one. A few smudges on a real card, or a damaged one if it were dropped, would be nothing compared to a stain on the NVidia image.

It reminds me of a (freely reproduced) quote by an Intel exec many years ago, when they introduced the microchip: "Someone asked me 'how are they going to service such a small part when one breaks?' I replied, 'if one breaks, there are plenty more where that one came from'; they just did not understand the concept of a replaceable chip."

If you drop the shell of a real GT300, you replace the shell. If you cannot even show a real shell, what does that say about the egg? Sounds like the chicken won here. (And believe me, I'd love to have a nest of GT300s today.)
Paul D. Buck · Joined: 9 Jun 08 · Posts: 1050
> Showing a mock-up that is not functional at a press briefing? Makes sense to me ... who in their right mind wants to risk one of the few working prototypes to butterfingered execs or press flacks?

Still sounds like a dog-and-pony show ... And I am sorry, why is waving around a "real" card that much more impressive than waving about a "fake" one? To be honest, I just want to see it work in a machine. All else is meaningless ...

As to GDF's comment: yes, I would like to buy a GTX 300, but if the only option is to buy at the top, I would rather buy a "slower" and less capable GTX 295 so I can get the multi-core aspect for processing two tasks at once in the same box ... At times throughput is not just measured by how fast you can pump out the tasks, but by how many you can have in flight ...

If Nvidia takes out too many levels in the structure, they may lose me on my upgrade path, as productivity-wise the HD 4870s I have beat all the Nvidia cards I have, and they are cheaper too ... Should OpenCL hit and several projects move to support it ... well, I would not hesitate to consider replacing failed Nvidia cards with ATI versions ...