Message boards : Graphics cards (GPUs) : Select GPU for F@H

UnknownPL1337
Joined: 22 Mar 15
Posts: 12
Credit: 530,700
RAC: 0
Message 40629 - Posted: 26 Mar 2015 | 21:22:18 UTC

I'm using an A10-7850K for playing games on the APU; it has a "dedicated" graphics card inside the CPU, for those that don't know this :p

I bought a GT630 with a GK208 chip that is passively cooled (fanless). It uses only 25 W and produces about 692 GFlops. It's not much... but the GFlops-per-watt is better than the Titan X's and less than 0.1 GFlops/W below the GTX 980's. It delivers about 27 GFlops per watt (692 / 25 ≈ 27.7).

So basically, I want to use my APU for gaming, and a game should not even try to run on or get any help from the GT630. The GT630 should ONLY ONLY ONLY be for F@H, nothing else. Can I somehow disable it so that a game doesn't even see that there is a GT630? :D

Faster help = better for me and the world :)

eXaPower
Joined: 25 Sep 13
Posts: 293
Credit: 1,897,601,978
RAC: 0
Message 40632 - Posted: 26 Mar 2015 | 22:04:20 UTC - in response to Message 40629.
Last modified: 26 Mar 2015 | 22:14:26 UTC

> I'm using an A10-7850K for playing games on the APU; it has a "dedicated" graphics card inside the CPU, for those that don't know this :p

If you like, a hybrid CPU/GPU GPUGRID (Molecular Dynamics) application designed for this hardware exists for testing.

> I bought a GT630 with a GK208 chip that is passively cooled (fanless). It uses only 25 W and produces about 692 GFlops. It's not much... but the GFlops-per-watt is better than the Titan X's and less than 0.1 GFlops/W below the GTX 980's. It delivers about 27 GFlops per watt.

The passive Zotac GT630 comes in [3] memory configurations. A GT630 (800 MHz DDR3 on a 64-bit memory interface, 901 MHz core clock) will complete long runs in over [2] days, or 3. There are also passively cooled (heatsink-only) Maxwell 750s.

Total performance is nowhere near a Titan. What's similar is the wattage per SMX/SMM: GT630 (2 SMX × 12.5 W = 25 W total), Titan X (24 SMM × 12.5 W = 250 W total), although an overclocked Titan can go over 250 W. Look at total cores per watt: 25 W for 384 CUDA cores versus 250 W for 3072, so each core in an SMX/SMM consumes roughly 0.065 W (25 / 384) or lower. Many performance-analysis metrics exist for energy. Explanations of CPU/GPU technology advancements and requirements can be found at Anandtech, in articles of interest and depth.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 40633 - Posted: 26 Mar 2015 | 22:06:04 UTC - in response to Message 40629.
Last modified: 26 Mar 2015 | 22:09:05 UTC

Your GK208 GPU is a Kepler. With its older architecture it is less efficient than Maxwells under most real world workloads, despite the comparable raw GFlops/W rating.

Kepler can use 128 of its 192 shaders in each SMX without problems. Under some circumstances it can use the additional 64. Maxwell has no such restrictions on using all 128 shaders in each SMM. Hence an SMM does about 90% of the work of an SMX, says nVidia, and for many apps (GPU-Grid, Einstein, SETI, games) this holds roughly true.

The theoretical GFlops assume the chip runs in the optimal way, i.e. that a Kepler uses all 192 shaders in each SMX on every clock. If you want to compare real-world performance between Kepler and Maxwell using GFlops and GFlops/W, you first need to find the conversion factor between those architectures (from the figures above: per rated GFlop, a Maxwell delivers roughly 0.9 × 192/128 ≈ 1.35 times the real work of a Kepler).

BTW: we're running GPU-Grid here, not Folding@Home!

Edit: you can use both GPUs under BOINC. You can also define exclusive apps, so that BOINC stops working (or stops GPU work, I'm not sure which) when that game is detected. Your AMD GPU could e.g. run Einstein@Home.

MrS
____________
Scanning for our furry friends since Jan 2002

UnknownPL1337
Joined: 22 Mar 15
Posts: 12
Credit: 530,700
RAC: 0
Message 40636 - Posted: 26 Mar 2015 | 23:15:07 UTC - in response to Message 40633.
Last modified: 26 Mar 2015 | 23:29:28 UTC

> Your GK208 GPU is a Kepler. With its older architecture it is less efficient than Maxwells under most real world workloads, despite the comparable raw GFlops/W rating.
>
> Kepler can use 128 of its 192 shaders in each SMX without problems. Under some circumstances it can use the additional 64. Maxwell has no such restrictions on using all 128 shaders in each SMM. Hence an SMM does about 90% of the work of an SMX, says nVidia, and for many apps (GPU-Grid, Einstein, SETI, games) this holds roughly true.
>
> The theoretical GFlops assume the chip runs in the optimal way, i.e. that a Kepler uses all 192 shaders in each SMX on every clock. If you want to compare real-world performance between Kepler and Maxwell using GFlops and GFlops/W, you first need to find the conversion factor between those architectures.
>
> BTW: we're running GPU-Grid here, not Folding@Home!
>
> Edit: you can use both GPUs under BOINC. You can also define exclusive apps, so that BOINC stops working (or stops GPU work, I'm not sure which) when that game is detected. Your AMD GPU could e.g. run Einstein@Home.
>
> MrS


You don't understand me; I wrote that I DON'T want to use my AMD iGPU for GPUGrid. I want to use ONLY the GT630 for GPUGrid, and the AMD would be used only for gaming and other stuff.

GT630 = GPUGrid
AMD iGPU = games, browsing and other stuff.
I also DON'T want a game to start using the GT630 when I launch it.
A game should ONLY be able to use the AMD iGPU.

To put it another way: I don't want the GT630 to be used by any program or game or whatever; it should ONLY work for GPUGRID... ONLY.

The AMD iGPU should work as if the GT630 NEVER existed...

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 40640 - Posted: 27 Mar 2015 | 7:04:57 UTC - in response to Message 40636.

Your title is "Select GPU for F@H", and now you are saying you want to use your GT630 for GPUGrid: "GT630 = GPUgrid".

Assuming you can actually use both (which might depend on the motherboard), you can exclude the A10-7850K iGPU from being used by BOINC (or GPUGrid) by creating a cc_config.xml file.
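As a minimal sketch, something along these lines would hide the AMD iGPU from the project (the <device_num> and the exact <type> string depend on your system, so check the "Coprocessors" lines in the BOINC event log at startup, and use the project URL exactly as it appears in BOINC Manager):

<cc_config>
  <options>
    <exclude_gpu>
      <url>http://www.gpugrid.net/</url>
      <type>ATI</type>
      <device_num>0</device_num>
    </exclude_gpu>
  </options>
</cc_config>

Restart the BOINC client, or tell it to re-read its config files, after editing.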

It's possible that the discrete GPU will take precedence for games (motherboard/BIOS) and you won't be able to do anything about that.

You might need the monitor to be plugged into the iGPU to have any chance.
The use of iGPUs has been discussed before; try searching for it.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

UnknownPL1337
Joined: 22 Mar 15
Posts: 12
Credit: 530,700
RAC: 0
Message 40653 - Posted: 27 Mar 2015 | 21:22:55 UTC - in response to Message 40640.
Last modified: 27 Mar 2015 | 21:23:56 UTC

> Your title is "Select GPU for F@H", and now you are saying you want to use your GT630 for GPUGrid: "GT630 = GPUgrid".
>
> Assuming you can actually use both (which might depend on the motherboard), you can exclude the A10-7850K iGPU from being used by BOINC (or GPUGrid) by creating a cc_config.xml file.
>
> It's possible that the discrete GPU will take precedence for games (motherboard/BIOS) and you won't be able to do anything about that.
>
> You might need the monitor to be plugged into the iGPU to have any chance.
> The use of iGPUs has been discussed before; try searching for it.


Yes, I want to SELECT the GT630 for F@H, not the iGPU. What happens if I want to run F@H and play games? I get like 1 FPS -.-
Or can I do something like this: when I browse or watch YouTube, the iGPU and the GT630 both work for GPUGrid, but when I start a game the iGPU STOPS working for GPUGrid and starts rendering the game?

It would be awesome if the people who make BOINC would add some more features, like selecting GPUs and that kind of stuff.

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 40654 - Posted: 27 Mar 2015 | 21:56:45 UTC - in response to Message 40653.
Last modified: 27 Mar 2015 | 21:58:23 UTC

GPUgrid is a Boinc project.
Folding@home is not a Boinc project :|
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

UnknownPL1337
Joined: 22 Mar 15
Posts: 12
Credit: 530,700
RAC: 0
Message 40658 - Posted: 27 Mar 2015 | 22:57:50 UTC - in response to Message 40654.

> GPUgrid is a Boinc project.
> Folding@home is not a Boinc project :|

Didn't know that :)
How can I run 2 GPUs (not SLI) in BOINC?
I've got a GTX460 and a GT630 :)
The GTX580 is sold :(

If I make suggestions here, will they be added to BOINC Manager?

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 40688 - Posted: 29 Mar 2015 | 17:07:00 UTC - in response to Message 40658.
Last modified: 29 Mar 2015 | 17:08:26 UTC

> How can I run 2 GPUs (not SLI) in BOINC?
> I've got a GTX460 and a GT630 :)

Install them and add the 'use all GPUs' instruction in a cc_config.xml file:
http://boinc.berkeley.edu/wiki/Client_configuration
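That option is the documented <use_all_gpus> flag; without it, BOINC uses only the most capable GPU. A minimal sketch:

<cc_config>
  <options>
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>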

> If I make suggestions here, will they be added to BOINC Manager?

No. Berkeley maintains and publishes the BOINC software:
http://boinc.berkeley.edu/
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

mikey
Joined: 2 Jan 09
Posts: 297
Credit: 6,452,540,163
RAC: 20,171,539
Message 40694 - Posted: 30 Mar 2015 | 11:31:12 UTC - in response to Message 40653.

> Or can I do something like this: when I browse or watch YouTube, the iGPU and the GT630 both work for GPUGrid, but when I start a game the iGPU STOPS working for GPUGrid and starts rendering the game?
>
> It would be awesome if the people who make BOINC would add some more features, like selecting GPUs and that kind of stuff.


It's already in there! Go to the BOINC Manager and click on Tools, computing options, then click on the exclusive applications tab and put the file names of your game and YouTube programs in there. Then any time those files run, BOINC will stop crunching until you close them.
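The same thing can also be done directly with the documented exclusive-app options in cc_config.xml. A sketch, where "game.exe" is a placeholder for your game's actual executable name:

<cc_config>
  <options>
    <!-- suspends ALL BOINC computing while game.exe is running -->
    <exclusive_app>game.exe</exclusive_app>
    <!-- suspends only GPU computing while game.exe is running -->
    <exclusive_gpu_app>game.exe</exclusive_gpu_app>
  </options>
</cc_config>

Use one or the other per program: give just the file name, not the full path.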

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 40706 - Posted: 30 Mar 2015 | 17:06:29 UTC - in response to Message 40694.
Last modified: 30 Mar 2015 | 17:07:32 UTC

On Boinc Manager 7.4.42 it's Tools, Computing Preferences (not options), exclusive applications.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

mikey
Joined: 2 Jan 09
Posts: 297
Credit: 6,452,540,163
RAC: 20,171,539
Message 40723 - Posted: 31 Mar 2015 | 12:16:42 UTC - in response to Message 40706.

> On Boinc Manager 7.4.42 it's Tools, Computing Preferences (not options), exclusive applications.


You're right, I was looking right at it and STILL typed it wrong. Thanks for fixing it!!
