Message boards : Graphics cards (GPUs) : AMD mainboard running NVIDIA cards in SLI
Joined: 5 Jan 12 | Posts: 117 | Credit: 77,256,014 | RAC: 0
This should be a simple question. Since I'm only planning to run Diablo 3, I don't need SLI, and AMD motherboards don't come with SLI bridges anyway (AMD boards are what I'm used to building with). So: do I *need* an SLI bridge to run NVIDIA cards on an AMD board, or can I run them independently of each other, which is what GPUGRID requires anyway?

Thanks, Ben
skgiven | Joined: 23 Apr 09 | Posts: 3968 | Credit: 1,995,359,260 | RAC: 0
In the past I had four NVIDIA cards on an AMD board, all crunching GPUGRID tasks, so yes.

FAQ's HOW TO: - Opt out of Beta Tests - Ask for Help
Joined: 5 Jan 12 | Posts: 117 | Credit: 77,256,014 | RAC: 0
Everything I read says there's a performance drop, but I don't think those reports take into account that people will be crunching on these cards independently. Do you see much of a performance decrease, or are you unsure?
Joined: 8 Mar 12 | Posts: 411 | Credit: 2,083,882,218 | RAC: 0
AMD chips are slower than Intel chips, so I would assume the performance difference is CPU related.
Joined: 17 Aug 08 | Posts: 2705 | Credit: 1,311,122,549 | RAC: 0
For GPUGRID, SLI or CrossFire doesn't matter at all, since a different task runs on each chip. There's no need for any communication between these tasks, so the bridges don't matter either. What matters a little in multi-GPU configurations is the PCIe speed: drop very fast cards into slots with less bandwidth (e.g. x4) and performance will drop.

MrS
Scanning for our furry friends since Jan 2002
Joined: 5 Jan 12 | Posts: 117 | Credit: 77,256,014 | RAC: 0
Well, I'm deducing that since an SLI bridge provides communication between cards, and independent usage (as you say) works without that communication, there should be minimal performance issues. Just a guess. I recently read somewhere that there could be a 30% drop in performance, but I'm probably taking that out of context; I think they meant SLI. That's the biggest performance drop I've seen quoted, even for an x4 PCIe slot. So if you happen to know, I would still appreciate the advice.
Joined: 5 Jan 12 | Posts: 117 | Credit: 77,256,014 | RAC: 0
Well, I would like to recommend the GA-990FXA-UD7 from buy.com. I already have a Gigabyte board with an 8150 and it works like a charm.
skgiven | Joined: 23 Apr 09 | Posts: 3968 | Credit: 1,995,359,260 | RAC: 0
On the GA-990FXA-UD7 you can run two cards at PCIe 2.x x16, and up to four cards at PCIe 2.x x8. If you just had two GPUs, you would lose nothing in terms of speed compared to another PCIe 2 motherboard. If you had three or four cards, you would lose three or four times the difference between x8 and x16. Obviously this depends on the cards; the bigger/faster they are, the more you would lose. If you had four high-end GPUs (say GTX 580s) and, for the sake of argument, each lost 7.5% performance, then in total you would lose 7.5% × 4 GPUs = 30% of one GPU. At present the real picture is unclear; we are not yet running GTX 680 (or similar) work units, and we don't actually know what the difference is here between PCIe 2.x and PCIe 3.0. Then there is the issue that no AMD boards support PCIe 3, and boards for Intel processors vary in their support; some offer one or two PCIe 3 x16 slots, or possibly up to three at x8.
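The back-of-the-envelope estimate above can be sketched in a few lines. Note the 7.5% per-card figure is purely an illustrative assumption from the post, not a measured value:

```python
# Hypothetical per-card penalty for running a fast GPU at x8 instead of x16.
# The 7.5% figure is an assumption for illustration, not a measurement.
PER_CARD_LOSS = 0.075

def aggregate_loss(num_gpus: int, per_card_loss: float = PER_CARD_LOSS) -> float:
    """Total throughput lost across all cards, expressed in units of one GPU."""
    return num_gpus * per_card_loss

# Four high-end cards, each losing 7.5%, forfeit 30% of one GPU's output in total;
# two cards at full x16 would forfeit nothing under this model.
print(f"{aggregate_loss(4):.0%}")
```

The point of the model is that each task runs independently, so the losses simply add; there is no SLI-style coupling between cards.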
Joined: 8 Mar 12 | Posts: 411 | Credit: 2,083,882,218 | RAC: 0
Actually, the PCIe issue is why I got the Sabertooth X79. Boards with four slots would run two at x16, but once you populated a third it would drop to x16/x8/x8. Since the Sabertooth only has three, it's locked at x16/x16/x8.
Joined: 5 Jan 12 | Posts: 117 | Credit: 77,256,014 | RAC: 0
http://www.gigabyte.com/products/product-page.aspx?pid=3880#sp

That link says two slots at x16, two at x8, and two at x4. But yeah, I don't know of any other board with that kind of performance for the price. Also, I might end up just buying Windows 7 and keeping the knowledge for later, because I have a lot to pay for this month anyway; the difference is about 100 dollars, plus the downclocking is killing my current 570. But yes, I can vouch for how insanely easy and compatible the eight-core 8150 is with Gigabyte.
Joined: 17 Aug 08 | Posts: 2705 | Credit: 1,311,122,549 | RAC: 0
> how insanely easy and compatible the 8 core 8150 is with gigabyte.

I'm all for Gigabyte boards, but this is really just the behaviour you should expect from any CPU and board combination; anything less is unacceptable.

BTW: if you're tight on money, don't spend it on distributed crunching. It's a cool hobby, but only if the cost doesn't hurt you. And the electricity bill for your hardware is going to be... interesting. Your best investment so far may be a medium-sized 80 Plus Gold PSU, if you don't own one already. Depends on your current PSU, though.

MrS
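The warning about the electricity bill is easy to quantify. A rough sketch, where the wattage and price per kWh are illustrative assumptions rather than figures from this thread:

```python
# Rough annual electricity cost for a crunching rig.
# All inputs below are illustrative assumptions, not measurements.
def annual_cost(watts: float, price_per_kwh: float, hours_per_day: float = 24.0) -> float:
    """Return the yearly electricity cost for a machine drawing `watts` on average."""
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# e.g. a two-GPU box averaging an assumed 500 W, crunching 24/7,
# at an assumed 0.25 per kWh: 4380 kWh/year.
print(f"{annual_cost(500, 0.25):.2f}")  # yearly cost in the chosen currency
```

Even modest assumptions put the running cost well above the price of a good PSU, which is why a high-efficiency (80 Plus Gold) unit can pay for itself.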
Joined: 8 Mar 12 | Posts: 411 | Credit: 2,083,882,218 | RAC: 0
Agreed, especially considering this graph: http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/10

Great hobby. Expensive, but great.
Joined: 1 Mar 10 | Posts: 147 | Credit: 1,077,535,540 | RAC: 0
> http://www.gigabyte.com/products/product-page.aspx?pid=3880#sp

See here: this is one of my boards with an AMD FX-8150, and it runs like a charm with two GPUs at PCIe 2.0 x16: http://www.asus.com/Motherboards/AMD_AM3Plus/SABERTOOTH_990FX/

Cheers

Lubuntu 16.04.1 LTS x64
Joined: 24 Jan 09 | Posts: 42 | Credit: 16,676,387 | RAC: 0
As many users have already said, you can use two NVIDIA cards as long as the motherboard has two PCIe connectors. Today I installed a GTX 470 to assist my GTX 260. GPUGRID.net detects the cards wrongly, claiming that I have two GTX 470s in one machine, but I do not worry about it.
Joined: 17 Aug 08 | Posts: 2705 | Credit: 1,311,122,549 | RAC: 0
That's the way BOINC reports the GPUs, not GPUGRID. As you say, this has no consequences. :)

MrS
©2025 Universitat Pompeu Fabra