Help with 3 GPU motherboard

Message boards : Graphics cards (GPUs) : Help with 3 GPU motherboard

mymbtheduke

Message 39040 - Posted: 29 Nov 2014, 13:58:16 UTC

I want to give myself an early Xmas present. I currently have an AMD 8-core processor with a GeForce GTX 660 Ti. I love this rig, but what if I did the following?

Upgrade to an i7 with 4 cores and Hyper-Threading (8 threads).
Get this motherboard with 3 PCI Express slots:
http://www.newegg.com/Product/Product.aspx?Item=N82E16813131977

Will this motherboard run 3 GPUs? It looks like it. I know I will need to work out spacing and heat; I'd probably need to go with blower-style (enclosed heatsink) GPUs so they push the heat outside the case. I am looking at getting two 660 Tis and a lower-end, sub-$100 card that generates less heat.

Any problems with trying to run 3 cards? I have a 700 watt modular Cooler Master power supply. Any advice would be appreciated.
TJ

Message 39041 - Posted: 29 Nov 2014, 15:30:53 UTC - in response to Message 39040.

It depends on what you want to spend and what you want to achieve.
The new GTX 970 is at least twice as fast as a 660 Ti and consumes less energy.
And if you need to buy a new CPU, motherboard and a 660 Ti, you'll pay more or less the same.

This motherboard will work with two or three GPUs, but there is almost no space between the two lower slots. You would end up with two cards using 8 lanes and one using 4 lanes; that is no problem for crunching here. Someone whose English is better than mine can explain the PCIe lanes in more detail. The CPU also has to support those lanes, otherwise they don't run at maximum; this mainly affects bus speed, though.

I expect you will get several pieces of advice; read them all and decide what feels best to you. If more questions come up later, you will get answers.

Two 660 Tis on a 700 W PSU will work. Three perhaps, perhaps not: the other components use power as well, and you might end up with too little headroom, which can cause errors while crunching. Too much heat can cause errors as well.
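As a rough way to sanity-check that headroom (a sketch only; the figures below are rounded TDP-based estimates for the parts mentioned in this thread, not measurements):

```python
# Rough PSU headroom estimate for the proposed 3-card build.
# All draws are approximate TDP-based guesses, not measured values.
psu_watts = 700

estimated_draw = {
    "AMD FX 8-core CPU":              125,
    "GTX 660 Ti (card 1)":            150,
    "GTX 660 Ti (card 2)":            150,
    "low-end third GPU":               75,
    "motherboard, RAM, drives, fans":  75,
}

total = sum(estimated_draw.values())        # ~575 W at full load
headroom = psu_watts - total
print(f"Estimated load: {total} W of {psu_watts} W "
      f"({total / psu_watts:.0%}); headroom: {headroom} W")
```

A commonly cited rule of thumb is to keep sustained load below roughly 80% of the PSU rating; on these estimates a three-card build would already be slightly past that, which matches the "perhaps, perhaps not" above.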

I myself will only use GPUs with a single blower fan that exhausts the hot air out of the case. This is based on experience with a GTX 770 and a GTX 780 Ti, both with two fans, and several temperature measurements inside and outside the case.
Greetings from TJ
ID: 39041 · Rating: 0 · rate: Rate + / Rate - Report as offensive     Reply Quote
mymbtheduke

Send message
Joined: 3 Sep 12
Posts: 40
Credit: 186,780,650
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwat
Message 39042 - Posted: 29 Nov 2014, 15:55:13 UTC - in response to Message 39041.  

Thanks for the quick response. The Gigabyte board, AMD processor and GPU consume about 325 W at full power. I have a new laptop from work with an i7 in it, and it completes WCG cancer work 50% quicker at 700 MHz less clock speed. I figure I can do twice the amount of WCG work and at least twice the amount of GPU work (by adding another GPU) for 50-75 more watts. Not bad.
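A quick check of that arithmetic (a sketch only; the wattages and the doubling factor are the rough figures quoted above, not measurements):

```python
# Work-per-watt comparison for the planned upgrade, using the approximate
# numbers from this post (325 W today, +50-75 W, roughly double the output).
current_watts = 325
extra_watts   = 75            # upper end of the estimate
work_factor   = 2.0           # about twice the WCG and GPU output

power_factor = (current_watts + extra_watts) / current_watts
print(f"~{work_factor / power_factor:.2f}x more work per watt")   # about 1.6x
```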

I hadn't thought about replacing the 660 with a 9xx. I am getting worried about the total cost, though; newer GPUs can cost a lot. So many options here.
Beyond

Message 39043 - Posted: 29 Nov 2014, 17:27:06 UTC - in response to Message 39040.
Last modified: 29 Nov 2014, 17:51:35 UTC

Will this motherboard run 3 GPUS? Looks like it. I know I will need to work out spacing and heat. Probably need to go with enclosed heat sink GPUs so they push the heat outside. I will look at getting 2 660ti and a lower end sub $100 card that generates less heat.

It will not even run 2 GPUs effectively. For proper GPU spacing, the top and bottom slots should be used, except that the bottom slot is only x4, which will constrict a 660 Ti. If you wedge the 2 GPUs into the top 2 slots they will run at x8, but cooling for the top GPU will be very poor, and the x4 slot will then be covered as well. I'll leave MB suggestions to those who are more familiar with Intel boards (I run mostly AMD). Also, if buying GPUs for GPUGRID now, I'd suggest sticking to the 750 Ti, 970 and 980 for efficiency's sake.

Edit: Here's an inexpensive MB that should work better:

http://www.newegg.com/Product/Product.aspx?Item=N82E16813157503

It says it has 3 PCI Express 3.0 x16 slots. The top two give plenty of spacing; the bottom one is a bit tight, as is usual on these boards. Manufacturer link:

http://www.asrock.com/mb/Intel/Z97%20Extreme4/
Beyond

Message 39044 - Posted: 29 Nov 2014, 19:15:17 UTC - in response to Message 39040.

I want to give myself a Xmas present early. I currently have an AMD 8 core proc with a Gforce GTX 660Ti. Love this rig

Rereading your post, apparently your current AMD MB doesn't support multiple GPUs well. Here's an inexpensive one that has 2 properly spaced x16 slots and will run any GPU at full speed. There is also a bottom x4 slot that would be OK for a slower card (although, as in the Intel example above, it would compromise cooling). I'm running this exact board in one of my latest builds and it works great. FWIW, you'll gain NOTHING in GPUGRID crunching speed by switching from your AMD X8 to an Intel CPU. You would gain a little in processing speed on most, but not all, CPU projects, but not on the GPUGRID NVIDIA WUs.
Beyond

Message 39045 - Posted: 29 Nov 2014, 21:30:07 UTC - in response to Message 39044.

I want to give myself a Xmas present early. I currently have an AMD 8 core proc with a Gforce GTX 660Ti. Love this rig

Rereading your post, apparently your current AMD MB doesn't support multiple GPUs well. Here's an inexpensive one that has 2 properly spaced x16 slots that will run any GPU at full speed. There is also the bottom x4 slot that would be OK for a slower card (although as in the Intel example above would compromise cooling). I'm running this exact board on one of my latest builds and it works great. FWIW you'll gain NOTHING in GPUGRID crunching speed by switching from your AMD X8 to an Intel CPU. You would gain a little in CPU project processing speed on most but not all CPU projects, but not on the GPUGRID NVidia WUs.

Oops, forgot to post the link:

GIGABYTE GA-990FXA-UD3 AM3+ AMD 990FX + SB950 SATA 6Gb/s USB 3.0 ATX AMD Motherboard

http://www.newegg.com/Product/Product.aspx?Item=N82E16813128514
Retvari Zoltan

Message 39046 - Posted: 30 Nov 2014, 0:14:19 UTC - in response to Message 39043.
Last modified: 30 Nov 2014, 0:17:37 UTC

Also if buying GPUs for GPUGRID now, I'd suggest sticking to the 750Ti, 970 and 980 for efficiency's sake.

That's right.
I would like to add: never buy two (or more) lesser cards instead of one high-end card. At the moment the GTX 970 is the best bang for the buck.
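To put rough numbers on that (a sketch only: the TDPs are NVIDIA's reference figures, while the relative-performance factors are loose guesses based on comments in this thread, not GPUGRID benchmarks):

```python
# Very rough performance-per-watt comparison, GTX 660 Ti = 1.0 baseline.
# TDPs are NVIDIA reference figures; perf factors are guesses, not benchmarks.
cards = {
    #             (relative perf, TDP in watts)
    "GTX 750 Ti": (0.6,  60),
    "GTX 660 Ti": (1.0, 150),
    "GTX 970":    (1.9, 145),
    "GTX 980":    (2.2, 165),
}

for name, (perf, tdp) in cards.items():
    print(f"{name:10s}  ~{perf / tdp * 100:.2f} perf per 100 W  ({tdp} W TDP)")
```

On these assumptions the Maxwell cards deliver roughly twice the work per watt of a 660 Ti, which is the efficiency argument behind the 750 Ti / 970 / 980 recommendation above.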

BTW, for multiple-card configurations I suggest the Gigabyte Z87X-OC or Z97X-SOC motherboards.

To have two real PCIe 3.0 connectors at x16 speed on a Socket 1150 motherboard: Z87X-OC Force, Z97X-Gaming GT, Z97X-Gaming G1
(the Z97X-SOC Force doesn't have the PLX PEX 8747 PCIe splitter chip, so it's not recommended).
These motherboards support up to 4 graphics cards.
Simba123

Message 39048 - Posted: 30 Nov 2014, 4:41:38 UTC - in response to Message 39043.

Whilst technically correct, the difference between running at x4 and x8 in GPUGrid is so small that it is not worth worrying about.
Simba123

Message 39049 - Posted: 30 Nov 2014, 4:47:42 UTC

It depends on how much money you really want to spend.

Minimal --> get a 970 or 980 and plug it into your existing rig.

More --> you could get a new motherboard that uses your existing CPU/RAM etc. but also supports proper SLI / 3 slots. That way you can install your existing 660 Ti in the secondary slot and use it to play games, surf the web, watch videos, etc. while your new card crunches in the primary slot.

That's actually how I have mine set up.

Link: http://www.modsrigs.com/detail.aspx?BuildID=31030
Beyond

Message 39051 - Posted: 30 Nov 2014, 6:03:29 UTC - in response to Message 39048.
Last modified: 30 Nov 2014, 6:07:40 UTC

Whilst technically correct, The difference running in x4 vs x8 in GPUGrid is so low that it is not worth talking about.

The difference between PCIe 2.0 x4 and x8 is quite noticeable here (the faster the GPU, the bigger the loss). However, there is no real difference between x8 and x16, at least on my GPUs (my fastest GPU is a GTX 670). PCIe 3.0 x4 would probably not cause a slowdown, though.
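The lane arithmetic behind that (a sketch; the per-lane rates are the usual spec figures after encoding overhead):

```python
# Approximate one-direction bandwidth per PCIe lane, after encoding overhead:
# PCIe 2.0: 5 GT/s with 8b/10b   -> ~500 MB/s per lane
# PCIe 3.0: 8 GT/s with 128b/130b -> ~985 MB/s per lane
MB_PER_LANE = {"2.0": 500, "3.0": 985}

for gen in ("2.0", "3.0"):
    for lanes in (4, 8, 16):
        print(f"PCIe {gen} x{lanes}: ~{MB_PER_LANE[gen] * lanes / 1000:.1f} GB/s")
```

PCIe 3.0 x4 (~3.9 GB/s) lands close to PCIe 2.0 x8 (~4.0 GB/s), which is why a 3.0 x4 slot is unlikely to be the bottleneck that a 2.0 x4 slot can be.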
mymbtheduke

Message 39059 - Posted: 1 Dec 2014, 15:30:19 UTC - in response to Message 39051.

This has been very helpful. I will probably just upgrade to a GTX970 for now. My wife needs a root canal which will cost $1000. Fun.
Betting Slip

Message 39060 - Posted: 1 Dec 2014, 17:13:07 UTC - in response to Message 39041.

The new GTX970 is at least twice as fast as a 660Ti and consumes less energy.


Not quite twice as fast, I have found.

Simba123

Message 39064 - Posted: 2 Dec 2014, 1:52:36 UTC - in response to Message 39051.

I run my 660 Ti in PCIe 2.0 x4, and last time I checked it was less than 4% slower than running it in x8.

mymbtheduke

Message 39066 - Posted: 3 Dec 2014, 19:46:24 UTC - in response to Message 39060.

Getting almost twice the speed for the same watts is compelling. I upgraded from my Phenom X6 to an FX X8 and got two more cores for the same wattage, but the fan noise is starting to annoy me. When I get more money, I may replace it with an i7 at 3.6 GHz and 8 cores. Twice the number of WUs for 50 fewer watts.
Retvari Zoltan

Message 39067 - Posted: 3 Dec 2014, 23:04:15 UTC - in response to Message 39066.
Last modified: 3 Dec 2014, 23:04:54 UTC

To get almost twice the speed for the same watts is compelling. I upgraded from my Phenom X6 to an FX X8 and got two more cores for the same wattage. But the fan noise is starting to annoy me. When I get more money, I may replace it with an i7 at 3.6Ghz and 8 cores. Twice the amount of WUs for 50 less watts.

The AMD FX 8320 has 8 full CPU cores (with FPUs), while the i7 you're planning to buy has 4 hyper-threaded cores, so it has only 4 FPUs. Depending on what you will crunch, the AMD FX 8320 could give you more credit per day than a socket 115x i7.
mymbtheduke

Message 39068 - Posted: 4 Dec 2014, 17:50:39 UTC - in response to Message 39067.

OK, I altered my plan a bit. I purchased an i7, a motherboard and a GTX 750. I will still be able to run 8 threads at 3.6 GHz but will save 45-55 watts. The GTX 750 uses 55 watts, so I will be net zero for power usage. I will run both the 660 Ti and the 750 on my new board.

After I get some more $$$$, I can replace my 660 Ti with a 970, and my new rig will process twice as many WCG WUs and three times as many GPUGrid units, roughly for the same watts.
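A back-of-the-envelope check of that plan (a sketch only; the deltas are the approximate figures quoted in this thread, and the 660 Ti/970 TDPs are NVIDIA reference numbers):

```python
# Rough net change in power draw for the plan described above.
cpu_change     = -50   # W: i7 in place of the FX chip (estimated 45-55 W saved)
gtx750_added   = +55   # W: GTX 750 added as a second card
gtx970_for_660 = -5    # W: GTX 970 (145 W TDP) replacing a GTX 660 Ti (150 W TDP)

net = cpu_change + gtx750_added + gtx970_for_660
print(f"Estimated net change in draw: {net:+d} W")   # roughly a wash
```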

Oh, and I will have less fan noise. If the Gigabyte 750's fan is as quiet as my Gigabyte 660 Ti's, I should be in good shape.

Thoughts????

Simba123

Message 39071 - Posted: 5 Dec 2014, 2:04:40 UTC - in response to Message 39068.

Ok, I altered my plan a bit. I purchased an i7, motherboard and GTX 750. I will be able to still run 8 cores at 3.6Ghz but will save 45-55 watts. The GTX 750 uses 55 watts. So I will be net 0 for power usage. I will run both the 660ti and 750 on my new board.

After I get some more $$$$, I can replace my 660ti with a 970 and my new rig will process twice as many WCG WUs and three times as many GPUgrid units. Roughly for the same watts.

Oh and I will have less fan noise. If the Gigabyte 750 fan is as quiet as my Gigabyte 660ti, I should be in good shape.

Thoughts????

Sounds good.
How does the 970 compare to the 660 Ti on GPUGrid? For that plan to work out, it will have to be about on par.
Beyond

Message 39072 - Posted: 5 Dec 2014, 3:40:04 UTC - in response to Message 39068.

After I get some more $$$$, I can replace my 660ti with a 970 and my new rig will process twice as many WCG WUs and three times as many GPUgrid units. Roughly for the same watts.

As Retvari mentions above, your CPU performance won't always be better; it depends on the project. On average the i7 should be faster, but not always and not by as much as you think. On the GPU side I'd estimate you'll increase your output by about 2.5x. Overall watts used, I'd guess, will be a little higher. Of course there are now a number of FX-83xx CPUs rated at 95 watts TDP. I just bought an FX-8310 to test: 8 cores at 3.4 GHz base and 95 watts. They're very cheap right now at TD, so it'll be interesting to see how the performance compares to the i7 and the Phenom X6. The Phenom X6 CPUs are available on the aftermarket only, and used ones on eBay often sell for more than they originally cost new.
TJ

Message 39074 - Posted: 5 Dec 2014, 11:08:20 UTC - in response to Message 39068.

Ok, I altered my plan a bit. I purchased an i7, motherboard and GTX 750. I will be able to still run 8 cores at 3.6Ghz but will save 45-55 watts. The GTX 750 uses 55 watts. So I will be net 0 for power usage. I will run both the 660ti and 750 on my new board.

After I get some more $$$$, I can replace my 660ti with a 970 and my new rig will process twice as many WCG WUs and three times as many GPUgrid units. Roughly for the same watts.

Oh and I will have less fan noise. If the Gigabyte 750 fan is as quiet as my Gigabyte 660ti, I should be in good shape.

Thoughts????


My suggestion is to keep your current rig and put a GTX 970 in it. Save the rest of your money and start collecting parts for a second rig; if shops have an offer, buy then. I guess there is a lot more to choose from in the US than elsewhere. This is the way I build my rigs, but in the Netherlands I have only a few shops to choose from, and I have my preferred brands to buy from.
Greetings from TJ
mymbtheduke

Message 39077 - Posted: 5 Dec 2014, 15:12:41 UTC - in response to Message 39074.

Thanks to all for the replies. I have a new Lenovo laptop with an i7 running at 2.9 GHz; the WCG WUs run twice as fast as on my FX 8320, even with hyper-threading. After 2.5 years of 24x7 crunching I want to be as efficient as I can with one rig. The fan noise is around 50 dB for the rig, and it puts out some heat. Frankly, I can live with the heat; it is the noise that gets old.

I will sell my 8320 and motherboard on eBay and get my money back or more; my Phenom sold for $10 more than I paid. Interestingly, when I switched from an ASUS motherboard (AMD) to a Gigabyte one, the power draw dropped by 30 watts. I was surprised.