Message boards : Graphics cards (GPUs) : Amd mainboard running nvidia cards in SLI

]{LiK`Rangers`
Message 24622 - Posted: 28 Apr 2012 | 20:59:31 UTC
Last modified: 28 Apr 2012 | 21:24:30 UTC

Should be a simple question: since I'm only planning on running Diablo 3, I don't need SLI. AMD motherboards don't come with SLI bridges anyway, and that's what I'm used to building. So do I need an SLI bridge to run NVIDIA cards on an AMD board, or can I run them independently of each other, which is what GPUGrid requires anyway? Thanks, Ben

skgiven
Volunteer moderator
Volunteer tester
Message 24623 - Posted: 28 Apr 2012 | 21:35:12 UTC - in response to Message 24622.

In the past I had 4 NVidia cards on an AMD board, all crunching GPUGrid tasks, so yes.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

]{LiK`Rangers`
Message 24624 - Posted: 28 Apr 2012 | 21:37:16 UTC

Everything I read says there's a performance drop, but I don't think those sources take into account that people will be crunching on these cards independently. Do you see much of a performance decrease, or are you unsure?

5pot
Message 24625 - Posted: 28 Apr 2012 | 21:43:27 UTC

AMD is slower than Intel chips, so I would assume the performance difference is CPU related.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 24626 - Posted: 28 Apr 2012 | 21:44:26 UTC - in response to Message 24624.

For GPU-Grid, SLI or CrossFire doesn't matter at all, since a different task runs on each chip. There's no need for any communication between these tasks, so the bridges don't matter either. What does matter a little bit in multi-GPU configurations is the PCIe speed: drop very fast cards into slots with less bandwidth (e.g. 4x) and performance will drop.

MrS
____________
Scanning for our furry friends since Jan 2002

]{LiK`Rangers`
Message 24627 - Posted: 28 Apr 2012 | 21:53:45 UTC - in response to Message 24624.

Well, I'm deducing that since the SLI bridge provides communication between cards, independent usage as you describe works without that communication, so there should be minimal performance impact. Just a guess, but that's my view. I read somewhere recently that there could be a 30% drop in performance, but I think I'm taking that out of context; I believe they meant SLI. That's the biggest performance drop I've seen quoted, even for a 4x PCIe slot. So if you happen to know, I would still appreciate the advice.

]{LiK`Rangers`
Message 24628 - Posted: 28 Apr 2012 | 21:56:45 UTC - in response to Message 24627.
Last modified: 28 Apr 2012 | 22:38:48 UTC

Well, I'd like to recommend the GA-990FXA-UD7 from buy.com. I already have a Gigabyte board with an FX-8150 and it works like a charm.

skgiven
Volunteer moderator
Volunteer tester
Message 24629 - Posted: 28 Apr 2012 | 23:23:11 UTC - in response to Message 24628.

On the GA-990FXA-UD7 you can run two cards at PCIe 2.x x16, and up to four cards at PCIe 2.x x8.
If you just had two GPUs, you would lose nothing in terms of speed compared to another PCIe 2 motherboard. If you had 3 or 4 cards, you would lose 3 or 4 times the difference between x8 and x16. Obviously this depends on the cards; the bigger/faster they are, the more you would lose. If you had four high-end GPUs (say GTX 580s) and, for the sake of argument, each lost 7.5% performance, then in total you would lose 7.5% × 4 GPUs = 30% of one GPU.
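The arithmetic above can be sketched in a couple of lines; the 7.5% figure is the same illustrative assumption used in the post, not a measured value:

```python
# Rough model: each GPU running in a narrower slot loses some fraction of its
# throughput; the aggregate loss, expressed in units of one GPU's output, is
# simply the per-GPU loss multiplied by the number of affected GPUs.
def total_loss_in_gpus(num_gpus, per_gpu_loss=0.075):
    """Combined loss as a fraction of a single GPU's output."""
    return num_gpus * per_gpu_loss

print(total_loss_in_gpus(4))  # 4 GPUs at 7.5% each -> ~0.3, i.e. 30% of one GPU
```

Note this is loss spread across four cards; no single card slows by 30%.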

At present the real picture is unclear; we are not yet running GTX 680 (or similar) work units, and we don't actually know what the difference is here between PCIe 2.x and PCIe 3.0. Then there is the issue that no AMD boards support PCIe 3, and boards for Intel processors vary in their support; some offer 1 or 2 PCIe 3 x16 slots, or possibly up to 3 at x8.

5pot
Message 24630 - Posted: 28 Apr 2012 | 23:58:55 UTC

Actually, the PCIe issue is why I got the Sabertooth X79: boards with four slots would run two at x16, but once you populated a third it would drop to 16/8/8. Since the Sabertooth only has three, it's locked at 16/16/8.
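For reference, the raw per-slot numbers behind these x16/x8/x4 comparisons can be sketched as below, assuming PCIe 2.0's roughly 500 MB/s of usable bandwidth per lane per direction (real throughput is somewhat lower after protocol overhead):

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding ~= 500 MB/s usable per lane.
PCIE2_MB_PER_LANE = 500

def slot_bandwidth_mb(lanes, per_lane=PCIE2_MB_PER_LANE):
    """Approximate one-direction bandwidth of a PCIe 2.0 slot, in MB/s."""
    return lanes * per_lane

for lanes in (16, 8, 4):
    print(f"x{lanes}: {slot_bandwidth_mb(lanes)} MB/s")  # 8000, 4000, 2000
```

So an x8 slot still moves 4 GB/s each way, which is why crunching tasks that rarely touch the bus lose so little there.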

]{LiK`Rangers`
Message 24631 - Posted: 29 Apr 2012 | 1:22:13 UTC - in response to Message 24630.

http://www.gigabyte.com/products/product-page.aspx?pid=3880#sp

That link says 2 at x16, 2 at x8, and 2 at x4, but yeah, I don't know of any other board with that kind of performance for the price. Also, I might end up just buying Windows 7 and keeping the knowledge, because I have a lot to pay for this month anyway; it's a difference of about 100 dollars, plus the downclocking is killing my current 570. But yeah, I can vouch for how insanely easy and compatible the 8-core 8150 is with Gigabyte.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 24633 - Posted: 29 Apr 2012 | 8:44:34 UTC - in response to Message 24631.
Last modified: 29 Apr 2012 | 8:47:26 UTC

how insanely easy and compatible the 8 core 8150 is with gigabyte.

I'm all for Gigabyte boards, but this is really just the behaviour you should expect from any CPU/board combination. Anything less is unacceptable.

BTW: if you're tight on money, don't spend it on distributed crunching. It's a cool hobby, but only if the cost doesn't hurt you. And the electricity bill for your hardware is going to be... interesting. Your best investment so far may be a medium-sized 80+ Gold PSU, if you don't own one already. Depends on your current PSU, though.

MrS
____________
Scanning for our furry friends since Jan 2002

5pot
Message 24644 - Posted: 29 Apr 2012 | 17:33:20 UTC
Last modified: 29 Apr 2012 | 17:38:19 UTC

Agreed. Especially considering this graph:

http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/10

Great hobby. Expensive, but great.

jlhal
Message 24679 - Posted: 2 May 2012 | 9:43:18 UTC - in response to Message 24631.

http://www.gigabyte.com/products/product-page.aspx?pid=3880#sp

... i dont know of any other board really with that kinda performance ...


See here, this is one of my boards; with an AMD FX-8150 it runs like a charm with 2 GPUs at PCIe 2.0 x16:

http://www.asus.com/Motherboards/AMD_AM3Plus/SABERTOOTH_990FX/

Cheers
____________
Lubuntu 16.04.1 LTS x64

Old man
Message 24682 - Posted: 2 May 2012 | 11:02:25 UTC

As many users have already said, you can use two NVIDIA cards as long as the motherboard has two PCIe connectors. Today I put in a GTX 470 to assist the GTX 260 cards. GPUGrid.net detects the cards wrongly, claiming that I have two GTX 470s in one machine. I don't worry about it.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 24729 - Posted: 4 May 2012 | 21:20:28 UTC - in response to Message 24682.

That's the way BOINC reports the GPUs, not GPU-Grid. As you say, this has no consequences :)

MrS
