Which graphic card

Message boards : Graphics cards (GPUs) : Which graphic card


ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 30591 - Posted: 31 May 2013, 15:44:21 UTC - in response to Message 30588.  

Don't know the exact layout, but any GTX550 should have at least 2 digital outputs. Just take a look at the b*tt of the card ;)

A GTX285 for Milkyway? Yikes... my HD6970 did those WUs in less than a minute! The GTX285 runs DP at 1/8th the SP speed - better than current nVidias (1/24), but no match for (former) high-end AMDs at 1/5th or 1/4.

MrS
Scanning for our furry friends since Jan 2002
ID: 30591
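To put those DP:SP ratios in perspective, here is a quick sketch of peak double-precision throughput. The SP GFLOPS figures are approximate vendor specs (assumptions on my part); the DP:SP ratios are the ones quoted above:

```python
# Peak DP throughput = peak SP throughput * DP:SP ratio.
# SP GFLOPS figures are approximate vendor specs (assumptions);
# the DP:SP ratios are those quoted in the post above.
cards = {
    "GTX 285": (708.0, 1 / 8),    # older nVidia: DP at 1/8th SP
    "HD 6970": (2703.0, 1 / 4),   # former high-end AMD: DP at 1/4 SP
    "GTX 660": (1881.0, 1 / 24),  # current nVidia (Kepler): DP at 1/24 SP
}
for name, (sp_gflops, ratio) in cards.items():
    print(f"{name}: ~{sp_gflops * ratio:.0f} DP GFLOPS")
```

Even with its better 1/8 ratio, the GTX 285 ends up far behind the HD 6970 in raw DP throughput, which matches the Milkyway runtimes described above.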
TJ

Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 30592 - Posted: 31 May 2013, 16:11:16 UTC - in response to Message 30591.  
Last modified: 31 May 2013, 16:12:50 UTC

Don't know the exact layout, but any GTX550 should have at least 2 digital outputs. Just take a look at the b*tt of the card ;)

A GTX285 for Milkyway? Yikes... my HD6970 did those WUs in less than a minute! The GTX285 runs DP at 1/8th the SP speed - better than current nVidias (1/24), but no match for (former) high-end AMDs at 1/5th or 1/4.

MrS

I know the card in my PC works fine with 2 monitors connected to it.
But what I mean is: can I place a graphics card in a system without any monitor attached to it? Will it then crunch?
Okay, the GTX285 is no longer okay, I get the message now :).

What about a GTX580 or 560? They have some speedy results I see at times with wingmen.
It would be nice to have a second card besides the GTX550Ti, one that doesn't draw too much wattage from the PSU (770 Watt), doesn't run too warm, and isn't too expensive.
Greetings from TJ
ID: 30592
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 30593 - Posted: 31 May 2013, 16:16:10 UTC - in response to Message 30592.  

As far as I know, under Win you'd either need to extend the desktop to the 2nd card, attach a monitor to it (or at least a 2nd cable to an existing monitor), or use a VGA dummy plug. But I'm no multi-GPU expert; maybe there's a way around this by now?

A GTX580 or 560 is still fine if you already have one (well, not at Milkyway), but I wouldn't get another one for crunching (not even a relatively cheap used one), since the Keplers are far more power efficient (i.e. they'll pay for themselves after some time). The GTX660 seems to be the sweet spot right now.

MrS
Scanning for our furry friends since Jan 2002
ID: 30593
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 30594 - Posted: 31 May 2013, 17:12:31 UTC - in response to Message 30593.  

All Windows drivers that include CUDA 4.2 or newer support a second GPU without the need to attach a monitor, dummy plug or omnicube. This was introduced well over a year ago. Unless there is some oddity with one GPU driving 2 monitors while the other GPU isn't driving any, I don't think there should be any issue, even with older GPUs.
The GTX780 and GTX770 arrivals brought price drops throughout the NVidia GF600 range. Mid-range prices are good and might dip further. These cards are reasonably future-proofed.
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 30594
TJ

Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 30613 - Posted: 1 Jun 2013, 16:09:30 UTC

Okay, my plan didn't work. The PSU is only 375 Watt, and the GTX660 needs at least a 450 Watt PSU.
The SSD didn't arrive, but I made one partition as skgiven suggested and installed the OS (WinVista) as a new installation, formatting the disk.
I did not get Win7, as skgiven suggested that makes no sense; ETA would prefer Win7 or 8.
It took almost my entire Saturday to get it installed, with all the updates and reboots. Booting goes faster, but that is about it.
Kernel times are the same, as is CPU usage, and everything is still slow: IE often saying "not responding", driver installation for keyboard and mouse sometimes not responding, and such.
So does this mean the MOBO is kaput?

I will not buy a new one, because then it could be the memory, then a controller, then the PSU. Then I have to wait for money to buy a refurbished Dell, or buy parts and build me a new one.
The GTX550Ti will then be the only contribution from me to the project for a while.
Greetings from TJ
ID: 30613
TJ

Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 30642 - Posted: 2 Jun 2013, 21:41:19 UTC

As I mentioned in my previous message, the i7 was completely reinstalled with a new OS, but everything was very slow with the new GTX660 installed; even browsing to NASA.gov took ages. FF downloading wasn't even possible.

Today I removed the GTX660 and put the old heater, the GTX285, back in. Now it is working like a speedboat. Everything opens directly on mouse click.

Could it be that the GTX660 is too new for the MOBO, or that jumpers are set wrong?
I used the first slot, the one closest to the processor, so that should be okay.
Greetings from TJ
ID: 30642
Vagelis Giannadakis

Joined: 5 May 13
Posts: 187
Credit: 349,254,454
RAC: 0
Message 30645 - Posted: 3 Jun 2013, 9:11:59 UTC - in response to Message 30642.  

I did a little searching about your motherboard. I couldn't find much information about it though - no official info, support site needs registration...

The fact that everything works as expected with your other card inevitably leads to a single conclusion: the 660 is not fully compatible with your setup.

Now, "setup" is a broad term, meaning both your hardware and software:

    Maybe the 660 stresses your PCIe 2.0 slot / bus too much and exposes some minor incompatibility / BIOS bug your motherboard may have.
    Your motherboard supports triple SLI; maybe this causes some trouble.
    Maybe you have to install additional, motherboard-specific drivers to your Vista installation to make it communicate correctly with your PCIe bus and the card.
    Maybe the Nvidia drivers you're using aren't fully compatible, or have a bug.

As you see, there are many "maybes"... Hardware problems are like that, unfortunately: unless you have specialized tools, it's a hide-and-seek "game" you have to play. :(

There are a number of things you can do:

    1. Try your 660 with another system; this will validate the card.
    2. Experiment with your BIOS settings around GPUs, PCIe, SLI, etc. Reset to defaults. Disable any overclocking.
    3. Make sure all system hardware is detected in Windows and there are no question marks in Device Manager. Use the latest drivers you can get.
    4. If you have a spare hard drive, set up another version of Windows (XP or 7), install ONLY basic drivers to get you going (probably just for your network card, and ONLY if Windows doesn't recognize it by itself), fully update Windows, THEN install motherboard-specific drivers, THEN install the Nvidia driver.
    5. As a last resort, just use your 660 with another system.

Come to think of it, you could repeat step 4 above for Vista as well.

I hope all this helps!


ID: 30645
TJ

Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 30647 - Posted: 3 Jun 2013, 10:25:57 UTC
Last modified: 3 Jun 2013, 10:26:50 UTC

Thanks for your information, Vagelis Giannadakis.

It does help a bit.
I searched the XFX site for the MOBO; several fora report problems with it and no new BIOS. The company seems to be "deaf" to comments. They don't make MOBOs any more, if I understand correctly.
I installed the drivers from the CD that came with the MOBO, and there are no question marks.
After the new install of Vista, Windows installed more than 150 updates.
All works fine now, fast, and the kernel times are very low.

So either the GTX660 is faulty, or, as you said, it doesn't get along with my setup.
I guess the latter. I put it back in, did a cold boot, and the graphics experience was directly affected, with slower responses when opening windows, slower browser response and such.
I have opened all my cases, and there is no system with a PSU of 450 Watt or more, which the GTX660 needs. Thus I cannot test it.
Greetings from TJ
ID: 30647
Vagelis Giannadakis

Joined: 5 May 13
Posts: 187
Credit: 349,254,454
RAC: 0
Message 30648 - Posted: 3 Jun 2013, 10:44:48 UTC - in response to Message 30647.  

What about a friend then? You must have a friend or acquaintance with a system that can handle a 660 and test it for you, no? It's not that we're talking about some experimental, next-gen prototype GPU requiring an internal Thunderbolt port! :D

This way, you can test the card and make sure it is fully functional, before going ahead and dishing out cash to upgrade / buy stuff in attempts to make it work.
ID: 30648
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 30649 - Posted: 3 Jun 2013, 11:06:36 UTC - in response to Message 30648.  
Last modified: 3 Jun 2013, 11:08:10 UTC

Just move the PSU as well as the GTX660 into the other system. That way you will be able to test whether the GPU is faulty, and you can give the PSU a clean too.

Check for a motherboard chipset update. Make sure the BIOS isn't configured to use all PCIe slots if you only want to use one. Also try turning SLI off, if it's on. If two monitors are plugged in, remove one.
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 30649
Profile Beyond
Joined: 23 Nov 08
Posts: 1112
Credit: 6,162,416,256
RAC: 0
Message 30650 - Posted: 3 Jun 2013, 13:52:12 UTC - in response to Message 30647.  

I have opened all my cases, and there is no system with a PSU of 450 Watt or more, which the GTX660 needs. Thus I cannot test it.

This certainly isn't set in stone. A quality PSU with a lower rating will easily run your GTX 660. I think you said you had a 375 watt PSU in another box. Try it. I've run more powerful GPUs than a 660 on 350 watt power supplies with no problems, as have many others.
ID: 30650
TJ

Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 30651 - Posted: 3 Jun 2013, 14:22:23 UTC - in response to Message 30650.  

I have opened all my cases, and there is no system with a PSU of 450 Watt or more, which the GTX660 needs. Thus I cannot test it.

This certainly isn't set in stone. A quality PSU with a lower rating will easily run your GTX 660. I think you said you had a 375 watt PSU in another box. Try it. I've run more powerful GPUs than a 660 on 350 watt power supplies with no problems, as have many others.

That is good information, Beyond!
I was tempted to do so, but I checked nVidia's website once more and there it was, the 450 Watt figure.
I will set BOINC to "no new work" and try this.
Greetings from TJ
ID: 30651
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 30657 - Posted: 4 Jun 2013, 20:40:01 UTC - in response to Message 30651.  

Yeah, those recommendations are pure BS or... overcautious, depending on your point of view. I think they assume something along these lines:

- a big GPU means he's got to be running a really big gas-guzzling CPU too
- there may be lots of HDDs, peripherals etc.
- he's probably got some dirt-cheap Chinese firecracker PSU which can't output even 300 W

In practice the GTX660 won't consume more than ~130 W, because that's the target power consumption for these cards. Maybe less, depending on application and GPU load.

MrS
Scanning for our furry friends since Jan 2002
ID: 30657
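Putting rough numbers on that argument (a back-of-envelope sketch; the CPU and "rest of system" figures are assumptions, only the ~130 W GPU power target comes from the post above):

```python
# Back-of-envelope PSU headroom estimate. All component figures are
# assumptions except the ~130 W GTX660 power target quoted above.
psu_w = 375    # the PSU TJ mentioned earlier in the thread
gpu_w = 130    # ~GTX660 power target under full load
cpu_w = 95     # typical quad-core TDP (assumption)
rest_w = 50    # motherboard, drives, fans (assumption)
total_w = gpu_w + cpu_w + rest_w
print(f"estimated draw: {total_w} W, headroom: {psu_w - total_w} W")
```

Even this pessimistic sum leaves roughly 100 W of headroom on a 375 W unit, which is why a quality PSU below nVidia's 450 W recommendation can still be fine.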
TJ

Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 30719 - Posted: 7 Jun 2013, 19:32:19 UTC

Hi guys, I would like to update you.

I put the new GTX660 in another PC and it is now doing a short run from Nathan.
GPU load is a steady 91%, the temperature is 82°C, and the GPU clock is 1084 MHz.

It has done 10% in 20 minutes.
Greetings from TJ
ID: 30719
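For what it's worth, a linear extrapolation from those figures (a trivial sketch, assuming progress stays linear):

```python
# Linear ETA extrapolation from the reported progress above.
done_frac = 0.10    # 10% complete
elapsed_min = 20    # after 20 minutes
total_min = elapsed_min / done_frac          # projected total runtime
remaining_min = total_min - elapsed_min      # minutes still to go
print(f"~{total_min:.0f} min total, ~{remaining_min:.0f} min remaining")
```

That works out to roughly 200 minutes total, i.e. a bit over 3 hours.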
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 30722 - Posted: 7 Jun 2013, 20:33:37 UTC - in response to Message 30719.  
Last modified: 7 Jun 2013, 20:34:13 UTC

Great, except for the temperature - download MSI Afterburner and use it to increase the fan speed so that it stays below 70°C ;)
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 30722
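What Afterburner (or Precision X) does in user-defined mode is essentially apply a temperature-to-fan-speed curve. A minimal sketch of that idea, with made-up curve points (assumptions, not Afterburner's defaults):

```python
# Sketch of a fan curve: map GPU temperature (°C) to fan duty cycle (%)
# by linear interpolation between curve points. The points are made up.
def fan_speed(temp_c, points=((40, 30), (60, 50), (70, 80), (80, 100))):
    """Return the fan percentage for a given temperature."""
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]  # above the last point: run the fan flat out

print(fan_speed(65))  # mid-curve: 65.0 %
```

A steeper curve around the 70°C mark is what keeps a crunching card from creeping into the low 80s.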
TJ

Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 30724 - Posted: 7 Jun 2013, 22:10:05 UTC - in response to Message 30722.  

Great, except for the temperature - download MSI Afterburner and use it to increase the fan speed so that it stays below 70°C ;)

I used EVGA Precision X to set fan speed on auto. Temperature is now 70-71°C.
Greetings from TJ
ID: 30724
TJ

Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 30741 - Posted: 9 Jun 2013, 0:02:14 UTC

I set the "old heater" (GTX285) to work again, to increase my RAC ;-)
Only short runs and then I can experiment with MSI Afterburner.
GPU load is a steady 95%; estimated runtime 6 hours.
Greetings from TJ
ID: 30741
TJ

Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 31155 - Posted: 2 Jul 2013, 14:30:49 UTC

Today my second EVGA GTX660 arrived. I installed it beneath the first one in the Alienware. It has no monitor connected, and the SLI bridge is not mounted either.
I requested new work but got only one task; BOINC does indeed see only one GPU.
So here are the questions:
1. How can I get the card working? Do I need the SLI bridge, or to connect a monitor to it? Both monitors are now on the first card.
2. More worrying is that the card is now running at 78°C with EVGA Precision (like Afterburner) regulating temperature/fan speed, and I have set it to auto. How can this be?
Yesterday one GPU (the only one in) ran at 69°C with the same EVGA software and settings. What has happened, and how can I resolve this?

Thanks, I am hoping for a quick answer this time so that I can get both GPUs crunching.
ID: 31155
Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 31156 - Posted: 2 Jul 2013, 14:34:08 UTC - in response to Message 31155.  
Last modified: 2 Jul 2013, 14:38:11 UTC

1. http://www.gpugrid.net/forum_thread.php?id=3156&nowrap=true#31007 or read the FAQ's.

2. Configure a profile in Afterburner; Settings (bottom right corner), Fan tab, enable user defined software automatic fan control. You may have to click Auto and User define after doing this.

Quick enough for you?
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 31156
TJ

Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 31157 - Posted: 2 Jul 2013, 15:01:25 UTC - in response to Message 31156.  

1. http://www.gpugrid.net/forum_thread.php?id=3156&nowrap=true#31007 or read the FAQ's.

2. Configure a profile in Afterburner; Settings (bottom right corner), Fan tab, enable user defined software automatic fan control. You may have to click Auto and User define after doing this.

Quick enough for you?

Yes, very quick skgiven, thanks.

I placed it in the BOINC data directory. It was read, as you can see:
7/2/2013 4:52:55 PM | | Re-reading cc_config.xml
7/2/2013 4:52:55 PM | | Not using a proxy
7/2/2013 4:52:55 PM | | Config: use all coprocessors
7/2/2013 4:52:55 PM | | log flags: file_xfer, sched_ops, task
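
For reference, the "use all coprocessors" line in that log corresponds to BOINC's documented <use_all_gpus> option; a minimal cc_config.xml enabling it looks like this:

```xml
<cc_config>
  <options>
    <!-- tell the BOINC client to use every GPU, not just the fastest one -->
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>
```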

But still no extra task.
I have indeed set EVGA Precision (the same as Afterburner but from EVGA, with the same menus) to automatic fan control by software and all. In fact I did not change anything; the only thing I did was add the extra GTX660, and now one card runs hot. There is not a lot of space between the two cards, though.

Do I need to restart the system? Will that kill the part of Nathan's WU already done?

Another question: when the two AMDs (5870) were in the Alienware, they were both recognised by BOINC and there was no need for a cc_config. How about that?

Greetings from TJ
ID: 31157

©2026 Universitat Pompeu Fabra