Building a Desktop for GPUGrid

Dagorath

Message 34399 - Posted: 19 Dec 2013, 19:13:40 UTC - in response to Message 34396.  

The connectors on the ends of the cables usually have names such as USB, HDD LED, PWR, RESET, etc. stenciled on them. Your mobo's manual will tell you where each connector goes. The connectors are usually keyed, which means they can fit only one way. Where there is any doubt about polarity, the mobo manual will probably show + and - on a diagram. Red is +, black is -, white can also be +, and green is sometimes -.
BOINC <<--- credit whores, pedants, alien hunters
ID: 34399
tomba

Message 34421 - Posted: 21 Dec 2013, 19:17:04 UTC - in response to Message 34399.  

The connectors on the ends of the cables usually have names such as USB, HDD LED, PWR, RESET, etc. stenciled on them. Your mobo's manual will tell you where each connector goes. The connectors are usually keyed, which means they can fit only one way. Where there is any doubt about polarity, the mobo manual will probably show + and - on a diagram. Red is +, black is -, white can also be +, and green is sometimes -.

Thanks for that, Dagorath. You were right. The mobo manual told me exactly what to do with all those pretty cables!

The earth strap arrived at 11:00 this morning, so I continued the build. Google was a great help in sorting out information missing from some of the Mickey Mouse instructions. Only the mobo instructions were comprehensive. Thanks, ASUS!

I chickened out on the Hyper 212 Evo CPU cooler and installed the stock cooler that came with the CPU. Time will tell if that was a bad decision.

The build is done. Five hours, allowing for a quick lunch and my standard two-hour nap! I decided to leave the christening till the morning, after a thorough check that everything's in the right place. Fingers crossed...

Tom




ID: 34421
Dagorath

Message 34422 - Posted: 21 Dec 2013, 20:10:43 UTC - in response to Message 34421.  
Last modified: 21 Dec 2013, 20:14:51 UTC

You're welcome.

I bet the stock CPU cooler will be adequate when the rig crunches GPUGrid tasks, because GPUGrid tasks don't work the CPU extremely hard; they work the GPU hard instead. For that reason I would watch the GPU temperature very carefully.

The stock/default GPU cooling mode for my 660 Ti is Automatic, and it sucks because it allows the temperature to hover between 80C and 84C, which is far too high. I have to put the fan control in manual mode and set it to a high RPM to keep the temp no higher than 65C. Actually, I have a script that takes care of that for me, but the point is that you might want to have your GPU temperature monitoring software installed and running before you start the first GPU task, and be prepared to either suspend the task if the temperature soars too high or set up manual fan control, or whatever method your software allows.
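
Something along these lines would do it. This is only a rough sketch, not my actual script: it assumes Linux with nvidia-smi and nvidia-settings on the PATH and Coolbits enabled so the fan is controllable, the settable attribute is GPUCurrentFanSpeed rather than GPUTargetFanSpeed on older drivers, and the temperatures and duty cycles are just example targets.

#!/usr/bin/env python
# Rough sketch of a GPU fan-control loop (see assumptions above).
import subprocess
import time

CEILING_C = 65   # keep the GPU at or below this temperature
HIGH_FAN = 85    # fan duty cycle (%) when running hot
LOW_FAN = 50     # duty cycle when comfortably below the ceiling

def gpu_temp():
    # nvidia-smi prints one line per GPU; this sketch watches the first.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().split()[0])

def set_fan(percent):
    # Take manual control of the fan and set its duty cycle.
    subprocess.call(
        ["nvidia-settings",
         "-a", "[gpu:0]/GPUFanControlState=1",
         "-a", "[fan:0]/GPUTargetFanSpeed=%d" % percent])

while True:
    set_fan(HIGH_FAN if gpu_temp() >= CEILING_C else LOW_FAN)
    time.sleep(30)   # poll every half minute

On Windows, a tool like MSI Afterburner with a custom fan curve does the same job.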
BOINC <<--- credit whores, pedants, alien hunters
ID: 34422
tomba

Message 34423 - Posted: 21 Dec 2013, 20:42:15 UTC - in response to Message 34422.  

I would watch the GPU temperature very carefully.

you might want to have your GPU temperature monitoring software installed and running before you start the first GPU task, and be prepared to either suspend the task if the temperature soars too high or set up manual fan control, or whatever method your software allows.

Another good tip. Thanks!

Currently copying my GPU temperature monitoring software onto a USB stick.
ID: 34423
tomba

Message 34427 - Posted: 22 Dec 2013, 10:14:38 UTC - in response to Message 34421.  

I decided to leave the christening till the morning, after a thorough check that everything's in the right place. Fingers crossed...

With some trepidation I powered up this morning. POST screen, followed by the BIOS screen. Checked that the first boot device was the CD drive, popped in the Win 7 CD, and it booted.

Win 7 installed and now installing a humongous number of Windows updates.

I'm amazed! It works!! :)

Tom
ID: 34427
tomba

Message 34428 - Posted: 22 Dec 2013, 11:32:07 UTC - in response to Message 34422.  

I would watch the GPU temperature very carefully.

Both PCs are running Santi WUs on GTX 660s.

          Temp (C)   Fan (RPM)
Old          65         2130
New          60         1470

Looking good for New, methinks...




ID: 34428
Dagorath

Message 34431 - Posted: 22 Dec 2013, 12:03:25 UTC - in response to Message 34428.  
Last modified: 22 Dec 2013, 12:04:41 UTC

It looks very good for New. Congratulations! Your first build appears to be a success :-)

Now, in your opinion:

1) was it a difficult process?
2) did you save enough money to justify the effort?
3) would you build your own again in the future and would you recommend it to others?
BOINC <<--- credit whores, pedants, alien hunters
ID: 34431
John C MacAlister

Message 34432 - Posted: 22 Dec 2013, 13:21:54 UTC

And congratulations from me, too: I may even try such a project in 2014....
ID: 34432
tomba

Message 34434 - Posted: 22 Dec 2013, 17:15:50 UTC - in response to Message 34431.  

The new beast is now running GPUGrid with both GTX 660s, so I retired the old PC with its GTX 460. I had planned to include the 460 in the new rig, but I had forgotten that it, too, is double width. There's no room in this case for three double-width GPUs.

1) was it a difficult process?
2) did you save enough money to justify the effort?
3) would you build your own again in the future and would you recommend it to others?


1) Stepping into the unknown is always intimidating, so I took my time, double-checking as I went along and using web searches for clarification when in doubt. No, it was not a difficult process, even for this old guy!

2) Oh yes. Up to now I had always bought Dell, and my opening bid was to check their top-of-the-line desktop: €2000. I spent a tad over half that for better function and expandability. She who holds the purse strings is delighted :)

3) Definite 'Yes' on both counts. If I can do it, anyone can.

Tom
ID: 34434
TJ

Message 34436 - Posted: 22 Dec 2013, 18:35:14 UTC - in response to Message 34434.  

Congratulations Tom!

You now have the same rig as I have: two 660s on a Sabertooth mobo and an FX-8350. I also still use the stock CPU cooler. However, it becomes noisy if the ambient temperature rises and six cores are doing Rosetta.

It is indeed nice to build your own rigs; you can put in whatever you like.
What you can also do is watch for offers at PC shops and buy parts as they come up; that way you can build a second one over time. It took me five months to collect everything I needed for a new rig, including a deal on a 1 TB WD Black HDD for €40. That's more than half off the normal price.

One thing though, and a lot of people will not agree: Dell builds very good, silent cooling systems.
Greetings from TJ
ID: 34436
tomba

Message 34439 - Posted: 23 Dec 2013, 10:59:32 UTC

Had a fright last night. Got home to find the new PC off!

Found that the default power setting in Windows 7 is to sleep after 30 minutes.
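
I set it to Never in Control Panel > Power Options. For anyone else who hits this, I believe the same change can be made from an administrator command prompt; this should set the sleep timeout on mains power to 'never':

powercfg -change -standby-timeout-ac 0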
ID: 34439
tomba

Message 34444 - Posted: 23 Dec 2013, 16:47:02 UTC

BOINC Notices tells me:



My cc_config.xml file contains:

<cc_config>
<use_all_gpus>1</use_all_gpus>
</cc_config>

I'm happily using two GPUs. Perhaps this parameter is now redundant?

Tom
ID: 34444
Richard Haselgrove

Message 34446 - Posted: 23 Dec 2013, 17:04:10 UTC - in response to Message 34444.  

Always check with the documentation - in this case, client configuration.

I think your tag would be valid if you put it inside an <options> block.

By default, BOINC will use the 'better' of two dissimilar GPUs. Since your two GPUs are the same, there is no 'better' or 'worse'. BOINC will use both of them without complaint.
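
In other words, something like this (a minimal sketch following the documented layout; have the client re-read its config files, or restart it, so the change is picked up):

<cc_config>
   <options>
      <use_all_gpus>1</use_all_gpus>
   </options>
</cc_config>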
ID: 34446
tomba

Message 34449 - Posted: 23 Dec 2013, 18:59:09 UTC - in response to Message 34446.  

I think your tag would be valid if you put it inside an <options> block.

That fixed it. Thank you! Tom
ID: 34449
skgiven
Volunteer moderator
Volunteer tester

Message 34462 - Posted: 24 Dec 2013, 15:23:31 UTC - in response to Message 34446.  

By default, BOINC will use the 'better' of two dissimilar GPUs.

The BOINC manual merely says:

    <use_all_gpus>0|1</use_all_gpus>
    If 1, use all GPUs (otherwise only the most capable ones are used).


That isn't enough information, and it's vague.
While not applicable to this situation, it would be useful to clarify what you mean by dissimilar GPUs (Fermi vs Kepler, or AMD vs NVIDIA vs Intel), as that's not mentioned in the manual.

If you just have NVIDIA cards, then BOINC reads what's in the NVIDIA library and proceeds to report the 'lesser' card(s). Also, if it reports 2 GPUs, then BOINC will use both of them - you don't need <use_all_gpus>1</use_all_gpus> in a cc_config.xml file.
It's been like that since around the time of NVIDIA's 175 drivers (years ago).

IIRC, when you have an AMD card attached to the monitor and an NVIDIA card in another slot, both are used, but not the other way around without using cc_config. That might depend on which is 'better', and it's not clear whether one ATI card or two NVIDIA cards would be used in the case where each NVIDIA card is only slightly less powerful than the AMD, and vice versa...

While BOINC might use the better of two dissimilar cards, it doesn't report the better of two similar cards to the project:

Intel(R) Xeon(R) CPU E3-1265L V2 @ 2.50GHz [Family 6 Model 58 Stepping 9]
(8 processors) [2] NVIDIA GeForce GTX 670 (2048MB) driver: 331.93 Microsoft Windows 7
Professional x64 Edition, Service Pack 1, (06.01.7601.00) 24 Dec 2013 | 10:01:16 UTC
http://www.gpugrid.net/show_host_detail.php?hostid=139265

A Stderr output from that system.

<core_client_version>7.2.33</core_client_version>
<![CDATA[
<stderr_txt>
# GPU [GeForce GTX 770] Platform [Windows] Rev [3203] VERSION [55]
# SWAN Device 0 :
# Name : GeForce GTX 770

http://www.gpugrid.net/result.php?resultid=7582795

Before encouraging people to use the manual, wouldn't it be wise to make sure it is up to date and contains enough information to be of use to crunchers?

http://boinc.berkeley.edu/wiki/GPU_computing

"GPUgrid.net (Linux 32 & 64bit and Windows) Bit slow but a GT220 works."

Firstly, GPUGrid is Linux x64 only, not Linux x86.
Secondly, it's Windows x64 or x86.
Thirdly, it's CC1.3 and above only (GT220 is CC1.2), and a GT220 was never recommended.


FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 34462
Dagorath

Message 34463 - Posted: 24 Dec 2013, 17:06:13 UTC - in response to Message 34462.  

@skgiven,

I fixed the info regarding Linux 64-bit, Windows, and CC1.3 at http://boinc.berkeley.edu/wiki/GPU_computing#Attach_to_projects_with_GPU_applications. If you wish to see other changes, PM me.

BOINC <<--- credit whores, pedants, alien hunters
ID: 34463
skgiven
Volunteer moderator
Volunteer tester

Message 34464 - Posted: 24 Dec 2013, 17:18:34 UTC - in response to Message 34463.  

Excellent!
Thank you,
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
ID: 34464
tomba

Message 34468 - Posted: 24 Dec 2013, 18:28:56 UTC

Well, my new rig is busy making a bigger contribution to GPUGrid. I even have the old rig upstairs running with a GTX 460…

I have a problem. The new rig is fan-noisy with just two CPU cores active, running the acemd.814 code. An ear inside the case tells me it's the stock CPU cooler. I guess that chickening out on the EVO cooler was not a good idea!

The ASUS doc tells me the CPU warranty is void with a non-stock cooler. What to do?...

Tom
ID: 34468
Richard Haselgrove

Message 34470 - Posted: 24 Dec 2013, 21:27:40 UTC - in response to Message 34462.  

By default, BOINC will use the 'better' of two dissimilar GPUs.

The BOINC manual merely says:

    <use_all_gpus>0|1</use_all_gpus>
    If 1, use all GPUs (otherwise only the most capable ones are used).


That isn't enough information, and it's vague.
While not applicable to this situation, it would be useful to clarify what you mean by dissimilar GPUs (Fermi vs Kepler, or AMD vs NVIDIA vs Intel), as that's not mentioned in the manual.


Sorry, I tend to vary my answers - both style of writing, and depth of technical detail included - according to my perception of the needs of the questioner (and how energetic I'm feeling at the time...). Here are a couple of previous attempts at the same subject - feel free to grab either of them.

SETI message 1085712 (technical, quotes source)
BOINC message 42194 (interpretation)
ID: 34470
Dagorath

Message 34473 - Posted: 24 Dec 2013, 22:12:24 UTC - in response to Message 34468.  

Try ducting cool air from outside the case directly to the CPU fan. Here are a bunch of pictures of ducts in use. Notice the pics of a pop bottle with both ends cut off. Now that's my kind of modding... cheap, and it recycles stuff. I'll give you a link to a tool you can use to cut holes easily in cases; the tool is called a nibbler.

Check out The Effectiveness of Air Ducts in CPU Cooling.

Google for pc cooling duct for more links. I don't know how expensive those ducts are these days but I used one years ago and it gave amazing results.

Or just put the other cooling solution on and don't tell ASUS. They'll never know. Keep the original, and if you ever need warranty service, put the original cooler back on and send it in. However, if you keep the CPU cool and power it through a good surge protector, the chances of it failing are very slim.

BOINC <<--- credit whores, pedants, alien hunters
ID: 34473