Message boards : Number crunching : What to build in 2014?
We've been having some excellent discussions about building our own systems lately. Thanks to the advice and feedback we receive here from our intrepid gurus, my own thoughts on what to build have changed several times. My goals are unchanged:
```
|           |
|     #     |
|____   ____|
```
The # represents a single PSU that feeds both mobo/GPU combos. Let's call it an "LPSUL module". Now imagine several LPSUL modules side by side in a "rack". If the 2 L's and the PSU are spaced far enough apart, and/or the ambient air temp is kept low enough, they'll cool well. Before I go into details of the rest of the grand plan... what do you guys think of this mobo and CPU combo? The CPU has integrated Intel HD 2000 graphics, which I think is NOT the kind we can use with OpenGL, correct? One thing I don't like is that its single PCIe x16 slot runs at x8, but I'm convinced that if the CPU can keep up, the PCIe bandwidth will be sufficient for GPUGrid for some time to come. Is the CPU going to be too weak? Opinions, please?

Unless the mobo gets bad ratings here, in January I think I'll buy one, add 1 GB of RAM and one of my GTX 670s, and see how it performs. If it works I'll try a GTX 780 or perhaps a 780Ti later in the year, maybe even a Maxwell, maybe a dual-GPU card (GTX 690, for example). If the mobo works as well as I hope, it would let me buy many small, inexpensive pieces slowly over time rather than requiring a big initial expenditure to get up and running.

For the "rack" I have an old wooden (particle board) bookshelf with adjustable shelves. It was headed to the recycle bin, so I am free to cut and modify it any way I want, or toss it in the bin if it doesn't pan out. The shelves are 72 cm long x 30 cm deep (28.5" x 11.75"). One mobo/GPU combo will attach to the top of a shelf while its sister hangs like a bat from the shelf above. In my mind I see 4 of the double-L modules per shelf. The unit has 6 shelves at the moment, and I see myself pulling a 30 amp feed into the room in the near future.

I already have the bookshelf cum "cruncher rack" positioned up against a window, and I'm measuring it up for custom add-on ducts, doors and fans, all of which will be top-quality junk I find here and there, preferably free. I'll take pics and put them in an online gallery as Dag's Mongo Cruncher Rack progresses. I promised pics of my previous "wind tunnel" thing, but after I realized it was a good start but flawed, I scrapped the idea and the pics. I'll definitely share pics of the mongo cruncher rack.

____________
BOINC <<--- credit whores, pedants, alien hunters
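A quick back-of-the-envelope check of a fully populated rack against that 30 amp feed (a sketch only; the wattages, PSU efficiency and mains voltage below are assumed figures for illustration, not measurements from this thread):

```python
# Back-of-the-envelope power budget for a fully populated rack.
# All wattages and the mains voltage are assumptions for illustration.
shelves = 6
modules_per_shelf = 4          # "LPSUL" modules: 2 mobo/GPU combos + 1 PSU each
gpus_per_module = 2

watts_per_gpu = 200            # assumed draw for a GTX 670-class card crunching
watts_per_board = 60           # assumed mobo + low-end CPU + RAM draw
psu_efficiency = 0.90          # assumed PSU efficiency at load

gpus = shelves * modules_per_shelf * gpus_per_module
wall_watts = gpus * (watts_per_gpu + watts_per_board) / psu_efficiency

volts = 240                    # assumed voltage of the planned 30 amp feed
print(f"{gpus} GPUs -> ~{wall_watts:,.0f} W at the wall, ~{wall_watts / volts:.0f} A")
# 48 GPUs -> ~13,867 W at the wall, ~58 A
```

Even under these assumptions, a full 6-shelf rack would want roughly twice what a single 30 amp feed supplies, so it would have to be populated a few shelves at a time.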
ID: 34458
On the plus side this could be very power efficient, with a good PSU, but my gut feeling is that it's more suited to a GTX650Ti than a GTX770 or above.
ID: 34460
Thanks, those are all good points. After Googling around I find several reports of people running Linux on that mobo/CPU combo, no problems there.
ID: 34492
While I would not be comfortable getting a Biostar board, ITX boards are really not designed for discrete GPUs and would be more suited to the likes of a GT620 than a GTX770. Some don't even have a PCIE slot. I don't think they are up to running 24/7 for years, and that's without a GTX770. The solder tracks are probably cheap as anything, but so is everything else, including capacitor quality and count, and it's not worth rebuilding a board when you can simply get another, better board. I'm not even sure it's possible to retrace the solder or rewire; some tracks may be hidden within the motherboard (which is basically about 6 layers of plenum-grade resin stuck together).
ID: 34554
OK, scrap the Biostar/Celeron combo board. It was a bad idea, plus they're all sold out. Now here's the bomb...
ID: 34658
> An extended ATX mobo that has 4 PCIe gen 3.0 slots that all run at x16 speeds when all 4 slots are occupied sells for far more than $400, plus you need a very expensive CPU to run all the slots at x16.

You don't need to think in extremes. Look for the Gigabyte GA-Z87X-OC motherboard; it should be around $220. There is a more expensive version called the GA-Z87X-OC Force (around $400), which can run two PCIe 3.0 cards at x16, or four at PCIe 3.0 x8. PCIe 3.0 x8 is sufficient for the current GPUGrid client. I don't recommend putting four cards in a single system, though.

> The Celeron G1610 may not have enough lanes for x16 but for another $20 I can get an i3 Ivy Bridge that I am quite sure does.

Ivy Bridge i3 processors don't have PCIe 3.0, they are only PCIe 2.0; see the i3-3240. Only the i5 and i7 series have PCIe 3.0 in the Ivy Bridge line. The Haswell i3 processors, however, do have PCIe 3.0; see the i3-4130.
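To keep those CPU-side differences straight, a tiny lookup sketch (the generations and lane counts just restate the post above; the per-lane rates are the usual theoretical figures after encoding overhead):

```python
# CPU-side PCIe support for the chips mentioned above, with rough
# total bandwidth from the CPU's 16 lanes.
GBPS_PER_LANE = {"2.0": 0.500, "3.0": 0.985}  # approx. GB/s per lane

cpus = {
    "Celeron G1610 (Ivy Bridge)": ("2.0", 16),
    "i3-3240 (Ivy Bridge)":       ("2.0", 16),
    "i3-4130 (Haswell)":          ("3.0", 16),
}

for name, (gen, lanes) in cpus.items():
    bw = lanes * GBPS_PER_LANE[gen]
    print(f"{name}: PCIe {gen} x{lanes} -> ~{bw:.1f} GB/s total")
```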
ID: 34664
You recommended the Gigabyte GA-Z87X-OC to tomba a few days ago, so I checked it out. All the vendors I like to deal with are out of stock on that board and don't plan to stock more. The reason seems to be that it has trouble running more than 2 PCIe cards. It sounds like the GA-Z87X-OC Force is intended to correct the deficiencies in the GA-Z87X-OC. Well, I want at least 3 more GPUs, so the GA-Z87X-OC is crossed off my list.

> I don't recommend putting four cards in a single system, though.

Why? Cooling issues? I have the heat beat; not a problem here. Over a year ago you said there were driver issues with 4 cards on one mobo, but I read in the NVIDIA forums that the newer drivers don't have a problem with 4. Is there another reason?

____________
BOINC <<--- credit whores, pedants, alien hunters
ID: 34669
> I don't recommend putting four cards in a single system, though.

It depends on what kind of GPUs we are talking about.

Four non-high-end GPUs: it's better to have two high-end cards than four non-high-end ones, even though they cost more, because of cooling problems, and because every GPU processes a separate WU, so a high-end card (GTX780Ti) is more future-proof. Today's fastest GPU could still finish a long workunit in under 24 hours even when long workunits become 5 times longer than today's.

Four high-end GPUs (~250W per GPU): it is difficult to silently dissipate ~1kW from a PC case, so it's recommended not to put the PC in a case at all, or to use water cooling (but that can have reliability problems).

Power problems: four high-end GPUs could draw 75W each from the PCIe slots, that is 300W (in the worst case even more, if you overclock them). A regular motherboard powers its PCIe slots through the 24-pin ATX power connector, which has only two 12V pins. 300W/12V = 25A, that is 12.5A on each pin (plus the power for the chipset, the memory, and the coolers). I assure you, those pins will burn. Not in flames, hopefully. In the end the OS will crash, and a GPU could be damaged as well (my GTX 690 broke that way). So if there are four high-end GPUs in a single MB, that MB should have extra power connectors for its PCIe slots, and they must be connected to the PSU (without converters). It is highly recommended not to skimp on any component when building a PC for crunching with four high-end GPUs. And such a host should not run Windows Vista, 7, 8 or 8.1, because of the WDDM overhead (maybe it won't be that bad with the Maxwell GPUs, but we don't know that yet).

> Over a year ago you said there were driver issues with 4 cards on one mobo, but I read in the NVIDIA forums that the newer drivers don't have a problem with 4. Is there another reason?

You are right about that, but as far as I can recall, that discussion was about putting four GTX 690s in a single MB, that is 8 GPUs in a single system.
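The pin-current arithmetic a few paragraphs up, written out as a sketch (the 75W slot limit and the two 12V pins are from the post; the pin rating in the comment is a typical figure, not a measured one):

```python
# Per-pin current on the 24-pin ATX connector's 12 V rail, following
# the arithmetic in the post above.
gpus = 4
slot_watts = 75            # max power a PCIe x16 slot supplies to a card
atx_12v_pins = 2           # 12 V pins on a standard 24-pin ATX connector

total_watts = gpus * slot_watts                 # 300 W
amps_per_pin = total_watts / 12 / atx_12v_pins  # 12.5 A
print(f"{total_watts} W -> {total_watts / 12:.0f} A total, "
      f"{amps_per_pin:.1f} A per 12 V pin")
# 300 W -> 25 A total, 12.5 A per 12 V pin
# A single Mini-Fit Jr. style pin is typically rated for well under that,
# hence the extra PCIe power connector on boards meant for this load.
```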
ID: 34673
> So if there are four high-end GPUs in a single MB, that MB should have extra power connectors for its PCIe slots

Aha, now I understand. Now that I think about it, I have a very old mobo with only 2 PCIe slots, and it has an extra 4-pin Molex connector just for PCIe slot power. Yes, the discussion was about putting 4 GTX 690s on one mobo. I've scaled that ambition back to 4 GTX 780Ti. Now that you've done the math for me (thanks), I agree 12.5A on those pins is a lot, and that is probably why I have read elsewhere that the non-Force Gigabyte board is inadequate for 4 high-demand cards. I know Gigabyte's literature for the Force model says something like "it meets PCIe slot power requirements in extreme conditions better than previous models". I'll definitely investigate that before putting 4 high-end cards on a Force board. I think I'll email Gigabyte and ask whether it will handle 4 high-end cards and what they estimate the life expectancy to be.

I still like the 4 mini-ATX mobo scheme. It allows a lot of flexibility for physically arranging all the components into a compact yet easily cooled custom case. Also, if 1 board fails it's a smaller loss compared to a $220 board.

I agree about dissipating 1kW from a case silently, but I don't want the expense and problems that come with water-block, pump and radiator style liquid cooling. I know I can do it with air alone, and I know I can make it very quiet as long as I think outside the box. I am still considering liquid submersion cooling, but not with mineral oil; it's too thick. My friend who works for an electric utility told me he will bring me a free 20 liter pail of the cooling oil they use in big transformers. He guarantees it will not affect any of the components on a motherboard, and it's environmentally safe. He says the fumes are safe to breathe, but the aroma is not the kind you want in your home. He says all it would need is a reasonably air-tight seal on the container (cheap and easy to do) and a small (10mm) vent hose to the outdoors. Most important is the fact that it is thinner than mineral oil, more like diesel fuel, so it will circulate well by convection alone. If it circulates well enough, it might not need a pump, just a properly designed convection system.

____________
BOINC <<--- credit whores, pedants, alien hunters
ID: 34686
> So if there are four high-end GPUs in a single MB, that MB should have extra power connectors for its PCIe slots, and they must be connected to the PSU (without converters).

It turns out both the Gigabyte GA-Z87X-OC and the GA-Z87X-OC Force have that extra power connector for the PCIe slots; Gigabyte's name for the feature is OC Peg. Another nice feature both boards have is OC Brace, which appears to be a metal bracket that holds up to 4 cards in place, perpendicular to the mobo, in case-less or custom-case applications like mine will be. Excellent feature! I had just finished laying out such a bracket on a piece of sheet metal and was about to start drilling and cutting. Instead I'll order a GA-Z87X-OC Force and use their bracket. Unless someone points out some negative aspect, I'll be ordering a GA-Z87X-OC Force and a modest i5 Haswell at month's end.

BTW, I don't see cooling 4 cards side by side on 1 board as a problem. I'll handle that by populating the 4 PCIe slots with 4 cards that have radial fans. Unlike axial fans, which come in all sorts of configurations, radial fans are very similar in size and location regardless of brand or model, which will make it easy to design and fabricate a custom duct to carry cool air from outside the case directly into the cards' fan intakes. The fans will line up one behind the other almost perfectly, give or take a few millimeters, even closer if the cards are all one brand and model. The radial fans will then push the hot exhaust into a duct leading outdoors, keeping the case temp and ambient temp low.

____________
BOINC <<--- credit whores, pedants, alien hunters
ID: 34753
Pounced on the GA-Z87X-OC Force mobo at Newegg Canada. It's priced at $419.99, but with a Newegg discount I'm getting it for $394.99 CDN, plus $10 shipping, plus 5% tax on (price of the board + price of shipping). Then I get a $100 manufacturer's rebate, so the board itself nets out to $294.99.
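Running those figures end to end (a sketch that only restates the numbers above):

```python
# The purchase, end to end (figures from the post).
board    = 394.99   # CDN, after the Newegg discount (list $419.99)
shipping = 10.00
tax      = 0.05 * (board + shipping)   # 5% on board + shipping
rebate   = 100.00   # manufacturer's mail-in rebate

print(f"tax ${tax:.2f}, out of pocket ${board + shipping + tax:.2f}, "
      f"after rebate ${board + shipping + tax - rebate:.2f}")
# tax $20.25, out of pocket $425.24, after rebate $325.24
# ($294.99 is the board price net of the rebate, before shipping and tax.)
```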
ID: 34841
Good price... that's what changes *my* game and makes *my* jaw drop. What makes me do the forehead palm and say "Doh!" is when I realize that, in spite of being warned 101 times by skgiven and possibly others, I fell into the trap anyway. I got a great deal on the mobo, and it's up to the job with respect to slot spacing and power capacity, but it's socket 1150. CPUs for socket 1150 have only 16 PCIe lanes, when I want a minimum of 32 and preferably 64. Doh!

That's why I like tables and charts, and precisely why I should have made one to summarize all the advice I was hearing, along with its implications. That's what you have to do when you get this old, or you forget.

Now my jaw-droppingly "inexpensive for the features it has" mobo will run at most 3 video cards, if I'm lucky, and they will run at (x16, x4, x4) if they run at all. It is guaranteed to run 2 cards at (x8, x8). I don't think it can do 4 cards at (x4, x4, x4, x4), though if it did, that config just might be fast enough since it is PCIe gen 3.0. If 3 cards work, then, if the GTX 690 has some sort of on-card bus arbitration, perhaps 3 GTX 690s (or 790s, if they ever appear) are a good plan B now that I've screwed up my own game.

Would someone please pass the salt; this crow tastes flat.

____________
BOINC <<--- credit whores, pedants, alien hunters
ID: 34851
ouch :( i was under the impression that the oc force board could do 4 x16 lanes too :/
ID: 34858
It's not the board itself or the chipset that is limited. The limitation is in the CPUs that can be used on the board. The board and chipset are capable of providing 8 lanes to each of 4 slots, a total of 32 lanes. I would be quite content with that as they are PCIe 3.0, which would provide data throughput roughly equivalent to 16 lanes on each of 4 slots at PCIe 2.0 speed. An x16 2.0 slot is said to be sufficient even for a GTX 690.
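That equivalence holds on paper, using the usual theoretical per-lane rates (a sketch; the rates assume 8b/10b encoding for PCIe 2.0 and 128b/130b for PCIe 3.0):

```python
# Per-slot throughput: PCIe 3.0 x8 vs PCIe 2.0 x16.
# Theoretical usable rates after encoding overhead:
#   2.0: 5 GT/s with 8b/10b   -> 0.500 GB/s per lane
#   3.0: 8 GT/s with 128b/130b -> ~0.985 GB/s per lane
GBPS_PER_LANE = {"2.0": 0.500, "3.0": 0.985}

for gen, lanes in (("3.0", 8), ("2.0", 16)):
    print(f"PCIe {gen} x{lanes}: ~{lanes * GBPS_PER_LANE[gen]:.2f} GB/s per slot")
# PCIe 3.0 x8:  ~7.88 GB/s per slot
# PCIe 2.0 x16: ~8.00 GB/s per slot
```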
ID: 34859
> I should return it [the GA-Z87X-OC Force] and pick something more suitable...

As you said before in this post, the PCIe lanes are CPU limited. So if you want something more suitable, you should look for a Socket 2011 board (and a Socket 2011 CPU), but that would be overkill. There is no significant GPU crunching performance gain between 4x PCIe 3.0 x8 and 4x PCIe 3.0 x16. You can spend your money more wisely if you want the best bang for the buck, so I still recommend a Socket 1150 based system. You won't find a better S1150 board than this one, because a better one would need three PCIe switches, and the other high-end S1150 motherboards also have only one. For example: the ASUS Maximus VI Extreme, or the ASUS Z87WS.

> I should have gone with 4 mini ATX boards. I would have ended up with more total PCIe capacity for less money.

PCIe capacity is not the Holy Grail of GPU computing. You need a decent CPU to properly utilize the bandwidth of PCIe 3.0, so in the end you would spend more money on CPUs than is optimal.

> On the other hand this mobo might do very well with 3 Maxwell cards on it. It might even be sufficient for the 2 GTX 670 and 1 GTX 660Ti I now have.

True. I think it could do very well even with 4 GTX780Ti cards.
ID: 34860
Thanks for the reassurance, Retvari. I will keep it. I like the way it's laid out; it's going to work very well in my custom case.
ID: 34863
Nope, this board was a mistake. I'm glad I'm getting a $100 rebate, because it's going to take at least 2 hours of additional labor and $50 in parts to make it work. The problem is that the cards are too close together to allow sufficient air into the fans. I'll solve it by removing the 4 fans and replacing them with 1 big fan and a duct that pushes air into the back of all 4 cards. Actually, maybe that's a good thing, as it will allow me to crop 25 or 30 mm off the tail of each card, which will make room for a thicker sound/heat barrier on the cabinet door.
ID: 34886