
Message boards : Graphics cards (GPUs) : What Pascal brand&model card are you crunching with?

Author Message
Trotador
Send message
Joined: 25 Mar 12
Posts: 103
Credit: 13,920,977,393
RAC: 9,448,390
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 44999 - Posted: 29 Oct 2016 | 11:36:45 UTC

Hi,

This is a kind of survey to help potential buyers make a decision. :)

Please provide the full model name so the exact card can be identified. I mean not just "EVGA 1080" but, for example, "EVGA GTX 1080 SC GAMING ACX 3.0".

Any additional comments on your experience with the card (noise, boost behaviour, temperatures) are of course welcome.

Thanks!

3de64piB5uZAS6SUNt1GFDU9d...
Avatar
Send message
Joined: 20 Apr 15
Posts: 285
Credit: 1,102,216,607
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwat
Message 45000 - Posted: 29 Oct 2016 | 11:57:39 UTC
Last modified: 29 Oct 2016 | 11:58:22 UTC

I have switched from MSI to Palit/Gainward, as they make the most efficient and quietest Pascal cards at present.

The review is in German only... sorry:
https://www.computerbase.de/2016-07/geforce-gtx-1080-partnerkarten-vergleich-test/

Whichever Palit or Gainward card you choose, they all have huge heatsinks and therefore need a lot of space: one triple slot per GPU. I have a Palit GameRock GTX 1080 (non-Premium), which has only a mild factory OC and runs stable at full load at 66°C, around the clock. The product code is NEB1080T15P2-1040G.

The other GPU I have is a Palit Super JetStream GTX 1070, which uses the same cooler but is overclocked even further. Stable and quiet too; that said, it heats up to 70°C, which is OK but noticeably higher than the GameRock. The product code is NE51070S15P2-1041J.

If there is a good offer, you could take the equivalent Gainward instead, as the two are largely identical. The Phoenix is the mild OC (equal to the GameRock/JetStream), whereas the GS and GLH use much higher clock rates (comparable to the GameRock Premium and Super JetStream).
____________
I would love to see HCF1 protein folding and interaction simulations to help my little boy... someday.

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2343
Credit: 16,206,655,749
RAC: 261,147
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 45024 - Posted: 29 Oct 2016 | 20:35:39 UTC

I'm using a Gainward GTX 1080 Phoenix GLH (Goes Like Hell) edition card.

Greger
Send message
Joined: 6 Jan 15
Posts: 76
Credit: 24,192,102,249
RAC: 13,992,829
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwat
Message 45032 - Posted: 29 Oct 2016 | 23:45:51 UTC
Last modified: 29 Oct 2016 | 23:52:25 UTC

Gigabyte GeForce GTX 1080 Waterforce Xtreme Gaming HDMI 3xDP 8GB
Clock: +50 offset, boosting to 2063 MHz.
Overall very good cooling, and it's nice that everything is cooled by water (AIO).
There is some potentially annoying noise from the water pump, but it's not bad. Temperatures are good, reaching at most 55°C.

https://www.gpugrid.net/show_host_detail.php?hostid=194568
http://www.gigabyte.com/fileupload/product/3/5919/20160617180225_big.png

Two MSI GeForce GTX 1080 Sea Hawk HDMI 3xDP 8GB
Clock: +90 offset, 2063 MHz.
The AIO and blower work fine with low noise, even with the blower fan at 90%. I managed to reach the same clock as on the Gigabyte, which seems to be the maximum clock with BOINC projects.

https://www.gpugrid.net/show_host_detail.php?hostid=195187
https://asset.msi.com/resize/image/global/product/product_5_20160527170716_57480e4441fe3.png62405b38c58fe0f07fcef2367d8a9ba1/600.png

Both get a hot backplate, so I have attached a separate fan to cool the card down. Temperatures are good, 40-55°C under normal work.

3de64piB5uZAS6SUNt1GFDU9d...
Avatar
Send message
Joined: 20 Apr 15
Posts: 285
Credit: 1,102,216,607
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwat
Message 45068 - Posted: 31 Oct 2016 | 15:04:06 UTC
Last modified: 31 Oct 2016 | 15:07:03 UTC

To throw in my two cents: Pascal is generally a good choice because of its power efficiency. There is no disastrous board-partner design on the market, so you cannot go wrong buying a GTX 1080. My suggestion is to let purpose, individual preference and available space decide. See the review below (German only) from ComputerBase, a comparison of several air-cooled custom axial designs.

In brief: the reviewed models from Palit, Gainward, Gigabyte and Inno3D have about the same power efficiency. But there are four "outliers": the Zotac, the ASUS, the MSI and the EVGA.

https://www.computerbase.de/2016-07/geforce-gtx-1080-partnerkarten-vergleich-test/

One outlier at the low end is (to my surprise) the EVGA FTW, which needs 20-30 W more power than the midfield, produces more heat (up to 80°C at full load) and is still neither very quiet nor fast. The MSI Gaming X is also somewhat inefficient and even slower than the EVGA, most likely due to its lower clock rates; as a result, its temperatures are slightly better after all. Neither the MSI nor the EVGA is a bad product, but they do not really sound like perfect crunching cards to me. As far as I know, EVGA uses the genuine Nvidia PCB, whereas ASUS (and some other board partners) modify it or even use their own designs; you can see that in the reduced power draw.

The Zotac AMP! Extreme seems to be the card if you want to play games and crunch only now and again. It is long, bulky, heavily overclocked and pulls 40 W more than the Palit/Gainward while being only slightly faster. Fortunately the heatsinks do a good job, and the card is relatively quiet. That inefficient configuration doesn't make much sense for crunchers at first view; having said that, things could look a little better if you lower the Zotac's clocks. Unlike the MSI, the Zotac has a lot of down-clocking headroom.

The ASUS Strix OC is the outlier at the high end: the most power-efficient card in the review, a superbly modified board, low temperatures; at first view it's the perfect crunching card. But alas, it is somewhat loud, twice as loud as the winning Palit/Gainward. That can probably be improved by choosing a different fan profile and accepting a slightly higher operating temperature.

According to the ComputerBase review, the Gigabyte G1 Gaming and Inno3D iChill X3 are well-balanced cards with a few weak points. The graphs show that the Gigabyte's default configuration is not as power-efficient as the ASUS, but it is a little quieter. The Inno3D, on the other hand, performs well and is even less noisy than the Gigabyte, but it is a true hothead (82°C). As it doesn't have the down-clocking headroom of the Zotac, you could only increase the fan speed of the Inno3D, leading to more noise.

Finally, the Palit and Gainward cards are for people (like me) who run them 24/7 and even sleep near their machines. They are very fast (2nd place in the review), the quietest (33 dB) and maintain low temperatures as well; that is something I can confirm and appreciate. The only real con is their bulky heatsinks. So if you don't have the three slots, the ASUS or Gigabyte would be my next choices.

The above is just a summary of the ComputerBase article and therefore not a perfect match for real-world conditions, so it would be interesting to read some first-hand reports from owners of the above cards.
____________
I would love to see HCF1 protein folding and interaction simulations to help my little boy... someday.

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2343
Credit: 16,206,655,749
RAC: 261,147
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 45076 - Posted: 31 Oct 2016 | 19:05:24 UTC - in response to Message 45068.
Last modified: 31 Oct 2016 | 19:10:06 UTC

... They are very fast, most silent and maintain low temperatures as well. The only real CONs are their bulky heatsinks.

There are no miracles in cooling: the larger the heatsink, the lower the temperature will be at a given power output; therefore two-slot cards (with axial fans) will always be louder and/or hotter than three-slot cards.
I have good experience with a Zotac GTX 980 Ti AMP Omega and an ASUS Matrix Platinum GTX 980 Ti. I prefer the Zotac's fin alignment, in which the heatpipes run parallel to the length of the card and the fins parallel to its height (the ASUS Strix 1080 uses this alignment now). This way the hot air can be removed at the top of the card(s) by fans blowing outwards (upwards).
The ASUS GTX980Ti Strix OC is loud. The Gigabyte Windforce GTX980Ti G1 Gaming is a bit better.

About efficiency: an OC card will always be less efficient than a non-OC one. The card's overall power consumption could be lowered by tweaking its power-delivery circuitry: more power phases than standard, FETs with lower switching time and lower Rds(on), better capacitors. Lower operating temperatures also reduce power consumption by a fraction.

My Gainward GTX 1080 Phoenix GLH (Goes Like Hell) edition card is quiet, but a WDDM OS can't utilize this card fully.
Hopefully I can make it work with Windows XP x64; I will report back then.

3de64piB5uZAS6SUNt1GFDU9d...
Avatar
Send message
Joined: 20 Apr 15
Posts: 285
Credit: 1,102,216,607
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwat
Message 45080 - Posted: 31 Oct 2016 | 20:11:11 UTC - in response to Message 45076.

There are no wonders in cooling: the larger the heatsink, the lower the temperature will be at a given power output; therefore two slot cards (with axial fans) will be always louder and / or hotter than 3-slot cards.


Correct, the two-slot card has to use a higher flow rate to remove the same quantity of heat from the (smaller) cooling fins.
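A back-of-the-envelope sketch of that trade-off, using textbook air properties and illustrative numbers (the 180 W and 15 K figures are assumptions, not measurements):

```python
# Rough estimate of the airflow a cooler needs to carry away a given
# heat load. Illustrative only; real heatsinks are also limited by
# fin-to-air heat transfer, which is where a larger 3-slot sink wins.

CP_AIR = 1005.0   # specific heat of air, J/(kg*K)
RHO_AIR = 1.2     # density of air at ~20 C, kg/m^3

def airflow_m3s(power_w, delta_t_k):
    """Volumetric airflow (m^3/s) needed to remove power_w watts
    if the air warms by delta_t_k kelvin while crossing the fins."""
    mass_flow = power_w / (CP_AIR * delta_t_k)   # kg/s
    return mass_flow / RHO_AIR

# A ~180 W GPU with the air warming by 15 K:
flow = airflow_m3s(180, 15)
print(f"{flow * 3600:.0f} m^3/h")  # ≈ 36 m^3/h
```

Halving the allowed temperature rise doubles the required flow, which is exactly why the smaller sink ends up louder.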

About efficiency: an OC card will always be less efficient than a non-OC one. The overall power consumption of the card could be lowered by tweaking its PSU by applying more power phases than the standard, using FETs with lower switching time and Rds(on), using better capacitors.


Yes... and parasitic capacitances in general, in the passive components but also in the PCB layout, if I may add that to your list. That's also why an architecture shrink is faster than its predecessor: scaling down simply reduces capacitances and improves slew rates.

Lower working temperatures also lower the power consumption by a fraction.


No doubt. If there is heat, there must be a loss of energy on the electrical side, so a voltage decrease is always a good idea, as power falls with the square of the voltage. The only question is how long the chips play along. ;-)
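The square law is just the CMOS dynamic-power relation P ≈ C·V²·f; a tiny sketch with illustrative numbers (the voltages and frequency below are assumptions, not measured values for any card):

```python
# CMOS switching power scales with the square of the core voltage,
# so a small undervolt saves power disproportionately.

def dynamic_power(c_eff, volts, freq_hz):
    """Approximate dynamic (switching) power: P = C_eff * V^2 * f."""
    return c_eff * volts**2 * freq_hz

p_stock = dynamic_power(1.0, 1.05, 1.9e9)   # arbitrary C_eff = 1
p_uv    = dynamic_power(1.0, 0.95, 1.9e9)   # ~0.1 V undervolt
print(f"saving: {1 - p_uv / p_stock:.1%}")  # ≈ 18.1% for ~10% less V
```

In practice leakage current also drops at lower voltage and temperature, so the real saving can be even a bit larger.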
____________
I would love to see HCF1 protein folding and interaction simulations to help my little boy... someday.

Trotador
Send message
Joined: 25 Mar 12
Posts: 103
Credit: 13,920,977,393
RAC: 9,448,390
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 45093 - Posted: 1 Nov 2016 | 12:20:11 UTC

Thanks for the info, Retvari and Joerg.

One more question: are you running one or two WUs per GPU?

Thanks!

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2343
Credit: 16,206,655,749
RAC: 261,147
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 45096 - Posted: 1 Nov 2016 | 15:00:32 UTC - in response to Message 45093.

One question more, are you running one or two wus per GPU?

I'm running one.

3de64piB5uZAS6SUNt1GFDU9d...
Avatar
Send message
Joined: 20 Apr 15
Posts: 285
Credit: 1,102,216,607
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwat
Message 45101 - Posted: 1 Nov 2016 | 21:40:04 UTC
Last modified: 1 Nov 2016 | 21:40:56 UTC

I run two per GTX 1070 or 1080... but that is a habit from POEM, where it kept the GPU above 90% utilization by making sure each task got sufficient CPU support. It worked perfectly there; however, I have not yet managed the same with GPUGRID, neither with short nor with long runs.

Zoltan, what utilization do you have on your 980ti? Do you observe any load oscillations?
____________
I would love to see HCF1 protein folding and interaction simulations to help my little boy... someday.

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2343
Credit: 16,206,655,749
RAC: 261,147
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 45103 - Posted: 1 Nov 2016 | 22:32:19 UTC - in response to Message 45101.

Zoltan, what utilization do you have on your 980ti? Do you observe any load oscillations?

It depends on the workunit, but usually 95-96%, ±1%.

3de64piB5uZAS6SUNt1GFDU9d...
Avatar
Send message
Joined: 20 Apr 15
Posts: 285
Credit: 1,102,216,607
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwat
Message 45104 - Posted: 2 Nov 2016 | 0:12:47 UTC

Strange. My 1070 does not get above 90% and the 1080 is even worse, maybe 75%, no matter how many tasks or what CPU/GPU ratio. As if the algorithm were not yet Pascal-optimized.

But this is getting a little off-topic... maybe I should raise the question somewhere else.
____________
I would love to see HCF1 protein folding and interaction simulations to help my little boy... someday.

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 45111 - Posted: 2 Nov 2016 | 12:57:18 UTC - in response to Message 45104.

A Pascal Settings and Performance thread would be useful to discuss such matters.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

nanoprobe
Send message
Joined: 26 Feb 12
Posts: 184
Credit: 222,376,233
RAC: 0
Level
Leu
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 45113 - Posted: 2 Nov 2016 | 13:20:39 UTC - in response to Message 45104.
Last modified: 2 Nov 2016 | 13:24:54 UTC

Strange. My 1070 does not get better than 90% and the 1080 is even worse, maybe 75%. No matter how many tasks and the CPU/GPU ratio. As if the algorithm is not yet Pascal optimized.

But this gets off topic a little... maybe I should bring that question forward somewhere else.

FWIW, my 1060 runs at 95% with 2 tasks at a time at stock settings. What's truly amazing is that, according to my UPS, it's only pulling 45 watts.

3de64piB5uZAS6SUNt1GFDU9d...
Avatar
Send message
Joined: 20 Apr 15
Posts: 285
Credit: 1,102,216,607
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwat
Message 45133 - Posted: 3 Nov 2016 | 11:16:36 UTC
Last modified: 3 Nov 2016 | 11:17:09 UTC

Back on topic. Unless I am mistaken, Trotador's question was which Pascal card(s) we use; he did not specifically ask about the 1080. Going by price and price/performance, I assume the 1060 6GB and the 1070 8GB will be more popular in the long term anyway.

As I wrote, my Palit Super JetStream GTX 1070 performs very well compared with my 1080, which suffers from low utilization. We still do not know exactly why that is or how to improve it. See

https://www.gpugrid.net/forum_thread.php?id=4413

Are there any other crunchers who can comment on their experience with 1070, 1060 or even 1050 Ti cards? In theory the smaller cards should achieve relatively good utilization. Comments welcome.
____________
I would love to see HCF1 protein folding and interaction simulations to help my little boy... someday.

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 45137 - Posted: 3 Nov 2016 | 12:27:40 UTC - in response to Message 45133.
Last modified: 3 Nov 2016 | 12:28:31 UTC

I should get a GTX 1060 3GB tomorrow and will probably replace a GTX 970 with it in a Linux system, to generate some sort of early like-for-like performance comparison if possible; it's difficult with tasks performing so differently.
Model: Palit Own Design Version NVIDIA GeForce GTX1060 3 GB GDDR5 Graphics Card - Black
(the cheapest dual-fan model I could get).

In the UK the 3GB 1060s are presently the best bang for the buck: £190 compared to £240 for the 6GB version. So roughly 10% less theoretical performance (1152 vs 1280 shaders) for about 21% less money (put the other way, the 6GB costs 26% more).

The GTX 1050 Ti is basically a paper launch as far as the UK is concerned: none in stock. At £140 they are decent value for money at the entry level, and being 14 nm they might perform very competitively (if you can get one).

There are a few GTX 1050s, and at £120 they are comparable value (performance/price) to the £140 Ti version, but I expect most would want the larger version (768 shaders vs 640).

The cheapest GTX 1070 is £395, more than twice the price of a GTX 1060 3GB, which has 60% of its theoretical performance. When you factor in scaling and bus width per core configuration, the 1060 3GB wins hands down in terms of performance/price (taking nothing else into consideration, such as whole-computer cost or the larger caches on some models). The GTX 1080s are £600 upwards, and not suited to 2 GHz entry-level CPU systems or WDDM-impacted Windows OSes.
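The shaders-per-pound arithmetic above can be written out explicitly; a quick sketch using the UK prices quoted (theoretical only, ignoring clocks, memory bandwidth and task scaling):

```python
# Crude performance/price metric: shader count per pound sterling,
# using the prices from the post above.

cards = {
    "GTX 1060 3GB": (1152, 190),
    "GTX 1060 6GB": (1280, 240),
    "GTX 1070":     (1920, 395),
}

def shaders_per_gbp(shaders, price_gbp):
    return shaders / price_gbp

for name, (shaders, price) in cards.items():
    print(f"{name}: {shaders_per_gbp(shaders, price):.2f} shaders/GBP")
# The 3GB 1060 comes out on top at ~6.06 shaders/GBP.
```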
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

ericbe
Send message
Joined: 29 Sep 16
Posts: 6
Credit: 27,120,051
RAC: 0
Level
Val
Scientific publications
watwatwat
Message 45166 - Posted: 4 Nov 2016 | 12:53:20 UTC
Last modified: 4 Nov 2016 | 13:16:28 UTC

I installed an MSI GeForce GTX 1060 GAMING X 6G just a week ago.

The PC (Win10 x64) runs one GPUGRID WU and a couple of Atlas@Home WUs. The card never gets hotter than 60-65°C and the cooler is so quiet I can't even hear it (usually 666 rpm). The load on the card typically hovers around 60%. Power draw according to HWiNFO is 63 W.

As for performance, GPUGRID doesn't really offer good insight into that, or I haven't found it. But I can tell you it's a whole lot faster than my GTX 760. :)

Overall: heartily recommended.

3de64piB5uZAS6SUNt1GFDU9d...
Avatar
Send message
Joined: 20 Apr 15
Posts: 285
Credit: 1,102,216,607
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwat
Message 45176 - Posted: 4 Nov 2016 | 15:19:38 UTC - in response to Message 45166.
Last modified: 4 Nov 2016 | 15:20:00 UTC

The load on the card typically hovers around 60%. Power draw according to hwinfo is 63W.


Sounds as if the 1060 is not fully utilized and could go much faster than that. See also this other topic:

http://www.gpugrid.net/forum_thread.php?id=4413

As far as I know there is no magic bullet yet to correct that, rather a mix of many influencing factors.
____________
I would love to see HCF1 protein folding and interaction simulations to help my little boy... someday.

fuzzydice555
Send message
Joined: 3 Oct 16
Posts: 5
Credit: 212,527,991
RAC: 3,600,352
Level
Leu
Scientific publications
watwatwat
Message 45183 - Posted: 4 Nov 2016 | 21:29:06 UTC

ASUS GeForce STRIX-GTX 1060-6G-GAMING

PROS:
It has a huge 3-fan cooler for a 120 W card. It's whisper-quiet. In a colder room the fans turn off even during full load. At 22°C ambient the card runs at 54°C with 28% fan speed.

This is the only GPU I can run next to me without the noise bothering me.

Build quality is very good, and if you like flashy LEDs the design is unbeatable.

CONS:
I overpaid a LOT for this card ($400 in Hungary). A cheaper 1060 3GB can be had for half this price; I could have bought a 1070 for just a bit more.

I use the 375.70 driver; Folding@home is broken with it, but GPUGRID runs like a dream!

Points per day is around 400,000; utilization is 88-92%.

ericbe
Send message
Joined: 29 Sep 16
Posts: 6
Credit: 27,120,051
RAC: 0
Level
Val
Scientific publications
watwatwat
Message 45194 - Posted: 5 Nov 2016 | 11:39:07 UTC - in response to Message 45113.

FWIW my 1060 runs @ 95% with 2 tasks at a time at stock settings. What's truly amazing is that according to my UPS it's only pulling 45 watts.


Could you use HWiNFO64 (https://www.hwinfo.com/download.php) to check your power usage and report it here? I'm curious, because according to it my 1060 pulls around 65 W when running at only 60% load. HWiNFO will tell you the GPU load, power draw, temperature and many other things.

ericbe
Send message
Joined: 29 Sep 16
Posts: 6
Credit: 27,120,051
RAC: 0
Level
Val
Scientific publications
watwatwat
Message 45195 - Posted: 5 Nov 2016 | 11:42:29 UTC - in response to Message 45183.

ASUS GeForce STRIX-GTX 1060-6G-GAMING

Points/day is around 400.000, utilization is 88-92%.


Same question as in my previous post: could you use HWiNFO64 to report some numbers? I'd like to find out how you get the usage so high; I can't get mine to go over 65%.

Trotador
Send message
Joined: 25 Mar 12
Posts: 103
Credit: 13,920,977,393
RAC: 9,448,390
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 45274 - Posted: 15 Nov 2016 | 22:35:33 UTC

I've finally purchased a Gainward Phoenix GLH, like Retvari. I've installed it in an Ubuntu 16.04.1 host, replacing a 750 Ti, in a dual E5-2690 v1 CPU configuration (3.3 GHz boost). I've left two of the 32 threads unused to make GPU crunching smoother; the rest keep crunching WCG for now.

So far the card tends to boost to around 2 GHz if its temperature stays below 60°C, which does not happen with the default fan profile. I set Coolbits=12 to get fan control only, and the fan is now set to about 65%; driver 367.57 does not allow overclocking.

The maximum GPU clock I've seen is 2042 MHz at a temperature of 40°C. The NVIDIA control panel shows the card in Performance Level 2, which puts the memory at 9524 MHz, below the default setting I think.

The average output seems to be around 950K PPD, except for the 63,750-point WUs, which give much less.
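For the record, once Coolbits is set the fan can also be pinned from a script by shelling out to nvidia-settings. A Python sketch; the attribute names (GPUFanControlState, GPUTargetFanSpeed) are what recent Linux drivers expose, so treat them as assumptions and verify with `nvidia-settings -q fans` on your own system:

```python
import subprocess

def fan_cmd(gpu_index, percent):
    """Build the nvidia-settings call that enables manual fan control
    and pins one GPU's fan to `percent` (requires Coolbits bit 2 set,
    e.g. Coolbits "12" as above)."""
    return [
        "nvidia-settings",
        "-a", f"[gpu:{gpu_index}]/GPUFanControlState=1",
        "-a", f"[fan:{gpu_index}]/GPUTargetFanSpeed={percent}",
    ]

# subprocess.run(fan_cmd(0, 65), check=True)  # uncomment on a real host
print(" ".join(fan_cmd(0, 65)))
```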

will
Send message
Joined: 17 Jan 17
Posts: 6
Credit: 8,770,500
RAC: 0
Level
Ser
Scientific publications
wat
Message 46366 - Posted: 28 Jan 2017 | 1:03:22 UTC

A co-worker got me into compute crunching after I splurged on a Titan X Pascal; pretty cool stuff.

The BNBS WUs I'm doing now take about 6-7 hours depending on ambient temperature, averaging about 1.2M credits per day. I MacGyver'd a closed-loop water cooler onto it, so clock speeds stay well into the upper 1800 MHz range and temps hover around 50°C. Usage is pretty much a constant 85%, so I imagine these runs aren't entirely Pascal-optimized yet.

Rion Family
Send message
Joined: 13 Jan 14
Posts: 21
Credit: 15,415,926,517
RAC: 290
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 46367 - Posted: 28 Jan 2017 | 2:03:33 UTC

I am running

(2)
EVGA GeForce GTX 1080 FTW HYBRID GAMING, 8GB GDDR5X, RGB LED, All-In-One Watercooling with 10CM FAN, 10 Power Phases, Double BIOS, DX12 OSD Support (PXOC) 08G-P4-6288-KR

and (2)
EVGA GeForce GTX 1070 FTW HYBRID GAMING, 8GB GDDR5, RGB LED, All-In-One Watercooling with 10CM FAN, 10 Power Phases, Double BIOS, DX12 OSD Support (PXOC) 08G-P4-6278-KR

Both units are plug-and-play hybrid cooling; they run at ~40-45 degrees under full load.

I see the best results running two work units per card on GPUGRID.
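For anyone wanting to try the same, running two tasks per card is done with BOINC's standard app_config.xml mechanism, placed in the GPUGrid project directory. A minimal sketch; the app name `acemdlong` is an assumption, so check the `<name>` entries in your own client_state.xml:

```xml
<app_config>
  <app>
    <name>acemdlong</name>
    <gpu_versions>
      <!-- 0.5 GPU per task means two tasks share one GPU -->
      <gpu_usage>0.5</gpu_usage>
      <cpu_usage>1.0</cpu_usage>
    </gpu_versions>
  </app>
</app_config>
```

After saving the file, use BOINC Manager's "Read config files" (or restart the client) for it to take effect.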



PappaLitto
Send message
Joined: 21 Mar 16
Posts: 511
Credit: 4,672,242,755
RAC: 0
Level
Arg
Scientific publications
watwatwatwatwatwatwatwat
Message 46368 - Posted: 28 Jan 2017 | 2:59:56 UTC - in response to Message 46366.

A co-worker got me into compute crunching after I splurged on a Titan X Pascal, pretty cool stuff.

The BNBS WU's i'm doing now take about 6-7 hours depending on ambient temperatures, averaging about 1.2 M credits per day. I Macgyver'd a closed-loop watercooler onto it, so clock speeds stay well into the upper 1800 MHz's and temps hover around 50C. Usage is pretty much a constant 85%, so I imagine these runs aren't entirely Pascal-optimized yet.


Make sure the VRMs and MOSFETs stay cool; those are typically what kill a card if left hot. Also make sure the RAM chips are cool.

will
Send message
Joined: 17 Jan 17
Posts: 6
Credit: 8,770,500
RAC: 0
Level
Ser
Scientific publications
wat
Message 46369 - Posted: 28 Jan 2017 | 5:17:46 UTC - in response to Message 46368.



Make sure those VRMs and MOSFETs are cool, those are the things that kill the card typically if left hot. Also make sure the ram chips are cool


I wondered about that too; I left the stock blower on and took the heatsink off (per Gamers Nexus):



The blower cools the baseplate, which sinks heat from the VRM and transistors. Previously the stock heatsink also cooled the RAM chips around the GPU, but apparently GDDR5X uses much less voltage than memory chips of the past and doesn't get very hot. The plate is just cool enough to touch under full load; I'd say around 60°C.

Your/anyone's thoughts?

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2343
Credit: 16,206,655,749
RAC: 261,147
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 46374 - Posted: 28 Jan 2017 | 16:41:38 UTC - in response to Message 46366.
Last modified: 28 Jan 2017 | 16:45:13 UTC

... on a Titan X Pascal ... The BNBS WU's i'm doing now take about 6-7 hours depending on ambient temperatures, averaging about 1.2 M credits per day. Usage is pretty much a constant 85%, so I imagine these runs aren't entirely Pascal-optimized yet.
Well, these BNBS WUs are the tasks with the highest GPU utilization here at GPUGrid (I see 99% GPU usage on my GTX 980 Tis under Windows XP with SWAN_SYNC). You can optimize your system for such a high-end card by setting the SWAN_SYNC environment variable, which makes the GPUGrid app use a full CPU thread to feed the GPU. It is also recommended to crunch CPU tasks on only one CPU thread, or not to crunch CPU tasks at all.

To set the SWAN_SYNC environment variable:
Click the Start button, type systempropertiesadvanced and press [Enter].
Click [Environment Variables].
In the lower section, "System variables", click the [New] button below the list.
Type SWAN_SYNC in the name field.
Type 1 in the value field.
Click [OK] three times.
Exit BOINC Manager, stopping the scientific applications.
Start BOINC Manager again.
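A quick way to verify the variable actually reached the environment is to check from any process started after the change (a sketch; run it in a console opened after applying the steps above):

```python
import os

def swan_sync_set():
    """True if SWAN_SYNC is visible to this process, i.e. to any
    program (such as the GPUGrid app) launched after the change."""
    return os.environ.get("SWAN_SYNC") is not None

print(swan_sync_set())
```

If it prints False, the variable was set at the wrong scope (user instead of system) or the shell predates the change.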

Profile Logan Carr
Send message
Joined: 12 Aug 15
Posts: 240
Credit: 64,069,811
RAC: 0
Level
Thr
Scientific publications
watwatwatwat
Message 46375 - Posted: 28 Jan 2017 | 20:17:44 UTC - in response to Message 46374.

... on a Titan X Pascal ... The BNBS WU's i'm doing now take about 6-7 hours depending on ambient temperatures, averaging about 1.2 M credits per day. Usage is pretty much a constant 85%, so I imagine these runs aren't entirely Pascal-optimized yet.
Well, these BNBS WU's are the most GPU utilizing tasks here at GPUGrid (I see 99% GPU usage on my GTX980Tis under Windows XP & SWAN_SYNC). You can optimize your system to such high-end card by applying the SWAN_SYNC environmental value to make the GPUGrid app to use a full CPU thread to feed the GPU. It is also recommended to crunch CPU tasks only on one CPU thread, or not crunch CPU tasks at all.

To apply the swan_sync environmental value:
Copy systempropertiesadvanced to your clipboard
Click on the Start button, paste and press [enter].
Click on [Environmental Variables]
Look for the lower section called "System Variables", click on the [New] button below the list of System Variables.
Type swan_sync in the name field
Type 1 in the Value field
Click [OK] 3 times.
Exit BOINC manager with stopping scientific applications.
Start BOINC manager.



Zoltan, how did windows XP work with your gtx 1080?
____________
Cruncher/Learner in progress.

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2343
Credit: 16,206,655,749
RAC: 261,147
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 46376 - Posted: 28 Jan 2017 | 21:54:05 UTC - in response to Message 46375.

Zoltan, how did windows XP work with your gtx 1080?
The GPUGrid app v9.14 does not work under Windows XP, even if I hack the latest Windows XP driver to get my GTX 1080 installed there. However, I successfully used the card under Windows XP with other projects (Einstein@home and SETI@home, if I recall correctly). While I'm using the x64 edition of Windows XP, it seems the app thinks it's not an x64 OS, or it simply checks the OS version and exits if it's below v6.x.

will
Send message
Joined: 17 Jan 17
Posts: 6
Credit: 8,770,500
RAC: 0
Level
Ser
Scientific publications
wat
Message 46816 - Posted: 3 Apr 2017 | 21:06:35 UTC - in response to Message 46374.

[To apply the swan_sync environmental value]


Awesome, thanks for the tip. Saw a ~10% increase with that.

One issue though: it seems the app won't use a full thread, at least not consistently. It will very occasionally (for half a second every 2 minutes or so) spike to 91% on a single thread, but the vast majority of the time it hovers at 20-40% across ALL cores. Not sure if it's related, but my GPU still isn't fully utilized at 90% (although that is more than before).

How would I get it to run full-speed on one thread?

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2343
Credit: 16,206,655,749
RAC: 261,147
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 46817 - Posted: 3 Apr 2017 | 23:02:42 UTC - in response to Message 46816.

[To apply the swan_sync environmental value]

Awesome, thanks for the tip. Saw a ~10% increase with that.
You're welcome.

One issue though, it seems the app won't use a full thread, at least not consistently. It'll very occasionally (for .5 secs every 2 mins. or so) spike to 91% on a single thread, but it hovers at 20-40% across ALL cores the vast majority of the time.
This is how every modern OS distributes the workload across all available cores/threads. There are risks in using only one core (thread) all the time, as that core will run hotter than the others, which in the long term could reduce the lifetime of the CPU chip.

Not sure if it's related, but my GPU still isn't fully utilized at 90% (although it is more so than before).
It's related to the Windows Display Driver Model, which cannot be turned off. The Intel(R) Core(TM) i7-4790K CPU has very advanced techniques to make the thread migration of a process as seamless as possible.

How would I get it to run full-speed on one thread?
It's easy:
Start Task Manager by right-clicking the taskbar and clicking "Task Manager". Click "More details" at the bottom if you haven't already. Click the "Details" tab, look for acemd.914-80.exe, right-click it and select "Set affinity". Then select as many threads as you want (an even/odd pair of logical threads resides on the same physical core).
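The even/odd pairing above can be turned into an affinity bitmask programmatically; a sketch (pure arithmetic, assuming as above that logical CPUs 2k and 2k+1 share physical core k):

```python
def affinity_mask(cores):
    """Bitmask selecting both hyper-threads of each physical core in
    `cores`, matching the even/odd pairing described above."""
    mask = 0
    for core in cores:
        mask |= 0b11 << (2 * core)
    return mask

print(hex(affinity_mask([0])))     # 0x3  -> logical CPUs 0 and 1
print(hex(affinity_mask([0, 3])))  # 0xc3 -> cores 0 and 3
```

The resulting value is what Task Manager's "Set affinity" dialog builds under the hood; on Windows it could, for example, be assigned to a process's ProcessorAffinity property from PowerShell (process name depends on the app version).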

Jozef J
Send message
Joined: 7 Jun 12
Posts: 112
Credit: 1,140,895,172
RAC: 384,933
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwat
Message 46826 - Posted: 5 Apr 2017 | 15:28:46 UTC
Last modified: 5 Apr 2017 | 15:30:56 UTC

I've had a GIGABYTE GeForce GTX 1080 Ti Founders Edition for a few weeks and have started testing the GPUGrid apps, first ADRIA_FOLDGREED90_crystal_ss_contacts_100_ubiquitin_Crystal_SS_, at 8.88 ms per step. The 1080 Ti looks very interesting. :))

Revrnd
Send message
Joined: 21 Mar 16
Posts: 6
Credit: 76,105,375
RAC: 0
Level
Thr
Scientific publications
watwat
Message 47192 - Posted: 11 May 2017 | 4:40:58 UTC - in response to Message 46817.

I've been running my Titan X for a couple of nights now and am a little surprised it boosts up to 1.8 GHz. Pretty impressive for the blower cooler on such a massive chip.

I haven't seen too many WUs finish within the 5-7 hour timeframe yet, but at least it's contributing to scientific achievements.

Profile Tuna Ertemalp
Send message
Joined: 28 Mar 15
Posts: 46
Credit: 1,547,496,701
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwat
Message 47200 - Posted: 12 May 2017 | 22:22:57 UTC

Just replaced two Maxwell EVGA Titan X Hybrids with two EVGA 1080 Ti SC + Hybrid kits. Works like a charm. The PC is in a closet, yet the GPUs don't exceed 50°C on GPUGRID and 70°C on PrimeGrid CUDA apps. Using the 382.05 drivers.

Speaking of which, there is a significant usage difference between GPUGRID and PrimeGrid. I have my cards set to a 120% power limit and an 85°C max temperature (using MSI Afterburner), whichever comes first, and it turns out they are not the limiting factors.

In the few workunits I was able to watch with MSI Afterburner, GPUGRID sits at about 65-75% GPU usage, at about 1920-1940 MHz, with 68-72% power and temps at 45-55°C. I am using swan_sync, by the way. Currently running tasks are:

e95s25_e87s22p0f184-PABLO_Q15004_0_IDP-0-1-RND4080
e71s59_e51s11p0f22-PABLO_P01106_0_IDP-0-1-RND9004

PrimeGrid, by contrast, consistently keeps the GPUs at about 96-99% usage, at about 1925-1950 MHz, with 110-115% power and temps at 65-70°C. In other words, PrimeGrid is getting much more bang for my buck...

Should someone at GPUGRID know about this? Or is there a better forum to post it in?

Tuna

____________

Profile Tuna Ertemalp
Send message
Joined: 28 Mar 15
Posts: 46
Credit: 1,547,496,701
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwat
Message 47201 - Posted: 12 May 2017 | 22:47:17 UTC - in response to Message 46367.

Rion Family wrote:
I see the best results running 2 work units per card under GPUGRID.

Iiiiinteresting... Care to share your app_config.xml?

Tuna

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2343
Credit: 16,206,655,749
RAC: 261,147
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 47202 - Posted: 13 May 2017 | 0:46:23 UTC - in response to Message 47201.

You can find it here.

Profile Tuna Ertemalp
Send message
Joined: 28 Mar 15
Posts: 46
Credit: 1,547,496,701
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwat
Message 47206 - Posted: 13 May 2017 | 16:36:53 UTC - in response to Message 47202.

You can find it here.


You are awesome!

Matt
Avatar
Send message
Joined: 11 Jan 13
Posts: 216
Credit: 846,538,252
RAC: 0
Level
Glu
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 47316 - Posted: 23 May 2017 | 17:49:20 UTC
Last modified: 23 May 2017 | 17:52:08 UTC

Just added an EVGA GTX 1080Ti FTW3. The host is here.

Very happy with it so far.

I was previously running an EVGA GTX 1080 SC in that box but moved it to this host. The CPU is slightly slower, and I think my run times have consequently gotten a little longer.
