Nvidia GT300

STE\/E
Joined: 18 Sep 08
Posts: 368
Credit: 4,174,624,885
RAC: 0
Message 13882 - Posted: 11 Dec 2009, 21:24:16 UTC - in response to Message 13874.  

Please nobody buy a G315 ever.

gdf


I have some on Pre-Order ... ;)
ID: 13882
robertmiles
Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 13885 - Posted: 12 Dec 2009, 1:39:03 UTC - in response to Message 13882.  

I've noticed that some of the recent high-end HP computers offer Nvidia boards that I don't remember seeing on your list of what is recommended and what is not.

notebook:

GT230M 1 GB

GT220 1 GB

desktop:

G210 512 MB

also an older card, already on your list:

GTX 260 1.8 GB

Nothing else they offer is already on the list of what you now recommend, and there's no information about whether the higher-end GTX cards will even fit.

Since some of us NEED to buy computers only from a computer company that will unpack them and do the initial setup, you might want to make sure that the first three are added to your list of what is suitable and what is not.

Also, Nvidia is rather slow about updating their notebook-specific drivers to the latest capabilities; for example, their web site for downloading drivers says that their 190.* family of drivers is NOT a suitable replacement for the driver for the G105M board in my notebook computer. The closest I've been able to find is the 186.44 driver, and it did NOT come from the Nvidia site. That one provides CUDA 2.2, but not CUDA 2.3.

I did find two sites that offer drivers for SOME Nvidia notebook cards, apparently NOT including a general-purpose one for all such cards:

http://www.nvidia.com/object/notebook_winvista_win7_x64_186.81_whql.html

http://www.laptopvideo2go.com/drivers
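
As an aside, the CUDA version a given driver actually exposes can be checked programmatically rather than guessed from driver numbering. A minimal sketch, assuming the CUDA toolkit is installed (the file name is arbitrary, and on a 2.x-era toolkit this needs CUDA 2.2 or later):

// cudaver.cu - report the CUDA versions of the installed driver and runtime.
// Build and run with: nvcc cudaver.cu -o cudaver && ./cudaver
#include <cstdio>
#include <cuda_runtime.h>

int main(void) {
    int drv = 0, rt = 0;
    cudaDriverGetVersion(&drv);     // 0 here means no CUDA-capable driver
    cudaRuntimeGetVersion(&rt);
    // Versions are encoded as 1000*major + 10*minor, e.g. 2030 = CUDA 2.3
    printf("Driver supports CUDA %d.%d\n", drv / 1000, (drv % 100) / 10);
    printf("Runtime built as CUDA %d.%d\n", rt / 1000, (rt % 100) / 10);
    return 0;
}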

ID: 13885
CTAPbIi
Joined: 29 Aug 09
Posts: 175
Credit: 259,509,919
RAC: 0
Message 13891 - Posted: 12 Dec 2009, 4:03:39 UTC - in response to Message 13885.  

robertmiles
"don't worry, be happy" (c) Bob Marley

All drivers are universal, so I don't think the 19x.xx series will fail to work on your video card. How could it be that an older driver supports a new video card but a newer version does not?

And besides, what's stopping you from trying 195.xx? :-)

About the GF100: the latest rumours say it will be available in March.
ID: 13891
robertmiles
Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 13895 - Posted: 12 Dec 2009, 13:02:25 UTC - in response to Message 13891.  

The Nvidia site said otherwise when I asked it which driver was suitable.

My guess is that the 190.* Nvidia drivers are universal ONLY for desktop graphics cards; Nvidia has separate series (186.* and 191.*) for laptop graphics cards, and mentions that even within those series there are likely to be manufacturer-specific requirements for specific members.
ID: 13895
robertmiles
Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 13898 - Posted: 12 Dec 2009, 15:58:18 UTC

Now, Nvidia has changed their driver downloads page since the last time I looked at it. It now says that 195.62 will work on MOST, but not all, of their laptop video cards, at least on most laptops.

They are not very clear on just which board-laptop combinations it will work properly on, but that looks hopeful enough that my laptop is installing 195.62 now.
ID: 13898
CTAPbIi
Joined: 29 Aug 09
Posts: 175
Credit: 259,509,919
RAC: 0
Message 13899 - Posted: 12 Dec 2009, 16:35:59 UTC

just try :-)
ID: 13899
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 13904 - Posted: 13 Dec 2009, 12:25:25 UTC - in response to Message 13899.  

Here is my opinion on the GT210, GT220 and GT315.

The G210 is a resounding "No, don't go there" card!
The GT220, GT230M and GT315 are not cards I would buy for this project, so stay away from them; they are not good value for money if you want to participate here. That said, the GT220 / GT315 might get through about one task a day if the system is on 24/7.

Mobile devices tend not to be powered on for long, and they use power-saving features by default, so the system goes to sleep after a very short time.
I would expect a GT230M to overheat quickly and make the system very noisy.
I found that the graphics card in my laptop caused system instability when running GPUGrid, and rarely finished a job. It was not worth it, and may have caused the project more bother than it was worth.

As for the GTX 260 1.8GB, the amount of RAM makes no difference. What matters is the core.
ID: 13904
robertmiles
Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 13919 - Posted: 14 Dec 2009, 7:51:08 UTC - in response to Message 13899.  
Last modified: 14 Dec 2009, 8:04:00 UTC

just try :-)


So far, it's worked successfully for both Collatz and Einstein.

The G105M board in that laptop is listed as not suitable for GPUGRID, so I haven't tried it for GPUGRID.


Recently, I looked at what types of graphics boards are available for the high-end HP desktop computers, using the option to have one built to your choice of specifications. As far as I could tell, the highest-end graphics boards they offer are the G210 (not the same as a GT210), the GTX 260 (with no indication of which core), and, for ATI, part of the HD4800 series. That looks like a good reason NOT to buy an HP computer now, even though they look good for the CPU-only BOINC projects.

Even that much wasn't available until I entered reviews for several of the high-end HP computers, saying that I had thought of buying them until I found that they weren't available with sufficiently high-end graphics cards to use with GPUGRID.

Anyone else with an HP computer (and therefore eligible for an HP Passport account) want to enter more such reviews, to see if that will persuade them even more?
ID: 13919
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 13930 - Posted: 14 Dec 2009, 20:56:22 UTC - in response to Message 13919.  

Seriously, notebooks and GPU-crunching don't mix well. Desktop GPUs aren't really expected to run under constant load, and mobile cards even less so. Crunching will challenge the cooling system heavily if the GPU has any horsepower at all (i.e. anything beyond a G210).

MrS
Scanning for our furry friends since Jan 2002
ID: 13930
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 13933 - Posted: 14 Dec 2009, 21:51:50 UTC - in response to Message 13930.  

MrS, as always you are quite correct, but perhaps he will be fine with Einstein, for a while at least.
It barely uses the GPU. It's not even 50/50 CPU/GPU; more like 90% CPU.
Einstein GPU+CPU tasks only improve turnover by about 30 min compared to CPU-only tasks; from about 8 h to 7 h 30 min, and so on. When I tested it, albeit on a desktop, my temps did not even rise above normal usage. Not even by one degree! GPUGrid puts on about 13 degrees (mind you, my GTX 260 has 2 fans, and the system has a front fan, side fan, exhaust fan, and PSU fan).

I would not recommend using it with Collatz however!

RM, keep an eye on the temps in case they manage to improve on the design; you can use GPU-Z to see if there are any notable changes.

I used to use a USB laptop cooler that just sat under the laptop blowing cold air around it. I think I bought it for around £10. At the time I was just crunching on the CPU, but it did pull the temps down to a reasonable level.
ID: 13933
CTAPbIi
Joined: 29 Aug 09
Posts: 175
Credit: 259,509,919
RAC: 0
Message 13970 - Posted: 17 Dec 2009, 17:50:58 UTC - in response to Message 13919.  
Last modified: 17 Dec 2009, 17:55:50 UTC

robertmiles

While you may have a REALLY strong reason to deal with HP (and other stuff of that kind), I personally prefer to keep miles away from them. Just look inside an HP: the cheapest components money can buy, very often mATX mobos (with no overclocking - can you believe it?), the worst cases in terms of airflow, the worst PSUs, stock CPU coolers, the cheapest RAM... The worst thing is that you can do nothing about it, because it is sealed and under warranty. I could continue, but it's clear to me that it's just a piece of crap for a huge amount of money. Imagine: you're paying up to 50% extra just for the brand...

I personally built my rig early this fall:
- i7-920 D0
- TR IFX-14 (polished base + 2 Scythe Slip Stream running on 1200rpm)
- OCZ Freeze eXtreme thermal interface
- Asus P6T6 WS Revolution
- 6GB Mushkin 998691 running at 6-7-6-18 @ 1600
- eVGA GTX275 OCed 702/1584/1260
- Enermax Infinity 720W
- CM HAF932

Can you imagine how much you'd pay if you could get that from HP (and you can't at all)?

This rig easily runs at 4200 MHz (200×21) @ 1.4 V, but I was not lucky enough to get a better CPU, so now I'm running at 4009 MHz (191×21) @ 1.325 V, rock solid stable.
ID: 13970
robertmiles
Joined: 16 Apr 09
Posts: 503
Credit: 769,991,668
RAC: 0
Message 13997 - Posted: 19 Dec 2009, 21:21:58 UTC - in response to Message 13970.  

robertmiles

While you may have a REALLY strong reason to deal with HP (and other stuff of that kind), I personally prefer to keep miles away from them. Just look inside an HP: the cheapest components money can buy, very often mATX mobos (with no overclocking - can you believe it?), the worst cases in terms of airflow, the worst PSUs, stock CPU coolers, the cheapest RAM... The worst thing is that you can do nothing about it, because it is sealed and under warranty. I could continue, but it's clear to me that it's just a piece of crap for a huge amount of money. Imagine: you're paying up to 50% extra just for the brand...


From what I've read, Dell is even worse. Significant reliability problems, even compared to HP.

I'm no longer capable of handling a desktop well enough to build one myself, or even unpack one built elsewhere and then shipped here, so I'll need to choose SOME brand that offers the service of building it and unpacking it for me. Want to suggest one available in the southeast US? There's a local Best Buy, but they do not have any of the recommended Nvidia boards in stock; that's the closest I've found yet.

For the high-end HP products I've been looking at lately, I've found that they offer you SOME choices in what to include, at least for the custom-built models, just not enough; and they don't send them sealed. The rest of your description could fit, though; I don't have a good way of checking.

I sent email to CyberPower today asking whether they offer the unpacking service, so there's a possibility that I may just have to check more brands to find one that meets my needs.
ID: 13997
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 14001 - Posted: 20 Dec 2009, 12:52:51 UTC - in response to Message 13997.  
Last modified: 20 Dec 2009, 12:54:10 UTC

Ask around and find someone who can build a system for you. Pay them $200 and you will still save money compared to buying an OEM HP, Dell...
Alternatively, contact a local dealer (shop) and get them to build one.
If you must get a system with a low-spec card, get a GT240. There is not much difference in price between a GT220 and a GT240, but the GT240 will do twice the work. You could even have two GT240s instead of a GTX 260 sp216 or GTX 275.
The GT240 cards do not require additional power connectors, so you will not need an expensive PSU.
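
For rough context, the "twice the work" claim lines up with the shader counts as I recall them (48 CUDA cores on a GT220 versus 96 on a GT240, at near-identical shader clocks; treat the exact clocks as from memory):

\[
\frac{96 \times 1340\ \text{MHz}}{48 \times 1360\ \text{MHz}} \approx 1.97 \approx 2
\]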
ID: 14001
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 14256 - Posted: 20 Jan 2010, 21:28:11 UTC
Last modified: 20 Jan 2010, 21:28:35 UTC

The gaming-related features of the GT300 (now GF100) have been revealed (link). Impressive raw power, and more flexible than previous chips.

MrS
Scanning for our furry friends since Jan 2002
ID: 14256
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 14263 - Posted: 21 Jan 2010, 16:19:49 UTC - in response to Message 14256.  
Last modified: 21 Jan 2010, 16:20:14 UTC

Good link, MrS.
The limited article I read this morning had little to offer; that one was much better.

For what it’s worth, this is what I would speculate:
It will be about 512/240 × 1.4 times as fast as a GTX 285 (about 3 times as fast).
This would make it over 65% faster than the GTX 295 (see the arithmetic below).
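
Spelling that arithmetic out (the GTX 295 comparison assumes its two GPUs scale perfectly - 480 shaders at 1242 MHz against the GTX 285's 240 shaders at 1476 MHz; treat the clocks as from memory):

\[
\frac{512}{240} \times 1.4 \approx 2.99, \qquad
\frac{480 \times 1242}{240 \times 1476} \approx 1.68, \qquad
\frac{2.99}{1.68} \approx 1.78
\]

That is, roughly 3 times a GTX 285 and, on these assumptions, some 78% faster than a GTX 295 - comfortably "over 65%".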

The 1.4 is based on a guess that the cores will be faster due to the 40 nm GF100 architecture and the use of speedier GDDR5 (which is not confirmed).
Assuming it uses GDDR5, the frequency increase will be advantageous despite the narrower memory bus. The memory temperatures should be lower. I would say 2GB will be the new standard.

With the higher transistor count, performance should be higher. With the 40 nm process (and given the low temperatures of cards such as the GT240 and GT220), core temperatures should also be lower.

Unfortunately I cannot see these cards being sold at low prices. I would speculate that they will be well over the $500 mark; around $720 to $850 – I hope I am wrong! If they are released in March at that price I don't think I will be rushing out to buy one, though many will.
I doubt that they will be significantly more power-hungry than a GTX 295, but again I would want to know before I bought one. I don't fancy paying £400 per year to run one.

Perhaps someone else would care to speculate, or correct me.
I expect that over the coming few weeks many more details will be released.
ID: 14263
GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Message 14267 - Posted: 21 Jan 2010, 19:53:11 UTC - in response to Message 14263.  

Unrelated: we are doing the last tests on the new application. After a lot of work, it is at the moment 60% faster than the previous one on the same hardware.
The Linux version is ready; we are fixing it for Windows.

gdf
ID: 14267
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 14273 - Posted: 21 Jan 2010, 22:53:40 UTC - in response to Message 14263.  

Perhaps someone else would care to speculate


Well.. :D

- 512/240 = 2.13 times faster per clock is a good starting point

- 2x the performance also requires 2x the memory bandwidth
- they'll get approximately 2x the clock speed from GDDR5 compared to GDDR3, but the bus width drops from 512 to 384 bit -> ~1.5 times more bandwidth (see the arithmetic below)
-> so GDDR5 is not going to speed things up, but I'd expect it to be fast enough not to hold performance back
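
In numbers, taking the doubled effective GDDR5 data rate as given:

\[
\frac{2 \times 384\ \text{bit}}{512\ \text{bit}} = 1.5
\]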

- the new architecture is generally more efficient per clock and has the following benefits: better cache system, much more double precision performance, much more raw geometry power, much more texture filtering power, more ROPs, more flexible "fixed function hardware"
-> the speedup due to these greatly depends on the application

- 40 nm alone doesn't make it much faster: the current ATIs aren't clocked much higher than their 55 and 65 nm brothers
- neither does the high transistor count add anything more: it's already included in the 512 "CUDA cores"

- and neither does 40 nm guarantee a cool card: look at RV790 vs RV870: double the transistor count, same clock speed, almost similar power consumption and significantly reduced voltage
-> ATI needed to lower the voltage considerably (1.3V on RV770 to 1.0V on RV870) to keep power in check

- nVidia is more than doubling transistors (1.4 to 3.0 Billion) and has already been at lower voltages of ~1.1V before (otherwise GT200 would have consumed too much power) and paid a clock speed penalty compared to G92, even at the 65 nm node (1.5 GHz on GT200 compared to 2 GHz on G92)
-> nVidia can't lower their voltage as much as ATI did (without choosing extremely low clocks) and needs to power even more transistors
-> I expect GT300 / GF100 to be heavily power limited, i.e. to run at very low voltages (maybe 0.9V) and to barely reach the same clock speeds as GT200 (it could run faster at higher voltages, but that would blow the power budget)
-> I expect anywhere between 200 and 300W for the single chip flagship.. more towards 250W than 200W
-> silent air cooling will be a real challenge and thus temperatures will be very high.. unless people are willing to risk going deaf

- definitely high prices.. 3 Billion transistors is just f*cking large and expensive

- I think GF100's smaller brother could be a real hit: give it all the features and half the crunching power (256 shaders) together with a 256-bit GDDR5 interface. That's 66% of the bandwidth and 50% of the performance per clock. However, since you'd now be at 1.6 - 1.8 Billion transistors, it'd be a little cheaper than RV870 and consume less power. RV870 is already power constrained: it needs 1.0V to keep power in check and thus can't fully exploit the clock speed headroom the design and process have. With fewer transistors the little GF100 could hit the same power envelope at ~1.1V. Compare this to my projected 0.9V for the big GF100 and you'll get considerably higher clock speeds at a reasonable power consumption (by today's standards..) and you'd probably end up at ~66% of the performance of a full GF100 at half the die size (rough numbers below).
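
A rough sanity check on those ratios (the one-third clock increase is an assumption for illustration, not a datasheet number):

\[
\frac{256}{512} = 0.5\ \text{per clock}, \qquad
\frac{256\ \text{bit}}{384\ \text{bit}} \approx 0.67\ \text{bandwidth}, \qquad
0.5 \times 1.33 \approx 0.66
\]

i.e. half the shaders at roughly a third higher clock lands at about two thirds of full GF100 performance.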

MrS
Scanning for our furry friends since Jan 2002
ID: 14273
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 14275 - Posted: 22 Jan 2010, 2:29:44 UTC - in response to Message 14273.  

My rough guess of 512/240 = 2.13, multiplied by an improved-architecture factor of 1.4 to give about 3 times the performance of a GTX 285, is only meant as a guide for anyone reading this and interested in one of these cards, as are my price speculation and power consumption guess!

I broadly lumped what you more accurately described as architecture improvements into what I thought would be GPU and RAM system gains from their new supporting architectures (not that I understand them well); I was just looking (stumbling about) for an overall ballpark figure.

I take your point about the 40 nm core being packed with 3B transistors; the performance vs power consumption trade-off is tricky, so there may not be any direct gain there.
Where the RAM is concerned, I spotted the narrower bus, but with GDDR5 being faster and with the cache improvements I guessed there might be some overall gain.
Although my methods are not accurate, they might suffice at this stage.
Do you think the new architecture will improve GPUGrid performance by itself by about 40% (my guesstimate of 1.4)?

I like the sound of your little GF100. Buying one of those might just about be possible at some stage. Something around the performance of a GTX 295 or perhaps 25% faster would go down very well, especially if it uses less power.

O/T
GDF – your new, 60% faster application sounds like an excellent achievement. Does it require CUDA compute capability 1.3 cards, or can 1.1 and 1.2 cards also benefit to this extent?
ID: 14275
GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Message 14276 - Posted: 22 Jan 2010, 9:37:08 UTC - in response to Message 14275.  


O/T
GDF – your new, 60% faster application sounds like an excellent achievement. Does it require CUDA compute capability 1.3 cards, or can 1.1 and 1.2 cards also benefit to this extent?

It is slightly slower if compiled for 1.1. We are trying to optimize it; the only other solution is to release an application for 1.3 cards alone.
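
For illustration only (this is not GPUGRID's actual code), here is the kind of single-source split that makes one build behave differently from another: compute capability 1.3 added hardware double precision, so the same kernel can accumulate in double when built for sm_13 and fall back to single precision when built for sm_11, e.g. nvcc -arch=sm_13 dot.cu versus nvcc -arch=sm_11 dot.cu, with a toolkit old enough to target those parts:

// dot.cu - per-thread partial dot products; precision depends on the
// compute capability the file is compiled for.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void dot_partial(const float *a, const float *b,
                            float *partial, int n) {
#if defined(__CUDA_ARCH__) && __CUDA_ARCH__ >= 130
    double acc = 0.0;    // compute 1.3+: accumulate in hardware doubles
#else
    float acc = 0.0f;    // compute 1.1/1.2: single precision only
#endif
    int idx = blockIdx.x * blockDim.x + threadIdx.x;
    int stride = gridDim.x * blockDim.x;
    for (int i = idx; i < n; i += stride)
        acc += a[i] * b[i];
    partial[idx] = (float)acc;   // one partial sum per thread
}

int main(void) {
    const int n = 1 << 20, threads = 128, blocks = 64;
    float *a, *b, *partial;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));
    cudaMalloc(&partial, threads * blocks * sizeof(float));
    cudaMemset(a, 0, n * sizeof(float));   // dummy inputs; the point is
    cudaMemset(b, 0, n * sizeof(float));   // the build-time split above
    dot_partial<<<blocks, threads>>>(a, b, partial, n);
    printf("kernel: %s\n", cudaGetErrorString(cudaGetLastError()));
    cudaFree(a); cudaFree(b); cudaFree(partial);
    return 0;
}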

gdf
ID: 14276
Quinid
Joined: 11 Jan 10
Posts: 1
Credit: 3,791,364
RAC: 0
Message 14282 - Posted: 22 Jan 2010, 17:40:17 UTC - in response to Message 14276.  

I can't remember which article I read a couple of days ago, but Nvidia admitted the new cards will run VERY hot. They claimed an average PC case and cooling will NOT handle more than one of these new cards. Just FYI.....

If that's the case, I wonder if waterblock versions will be more common this time around. My GTX 260 (216) already spews air hotter than a hairdryer just running GPUGRID or Milkyway.
ID: 14282