skgiven (Volunteer moderator, Volunteer tester):
Tips for using coolbits with different versions of Linux...
____________
FAQ's
HOW TO:
- Opt out of Beta Tests
- Ask for Help
This is my experience on the subject; please forgive the Google-translated English.
NVIDIA X Server Settings: from Nvidia driver version 331.17 onwards it also shows the fan speed, if the card supports it.
Adding GPU fan control. Coolbits has several options:
OPTION 1 - allows GPU overclocking
OPTION 4 - allows fan control
The two options cannot be used simultaneously.
Activating Coolbits in Ubuntu 13.10 or similar:
From an Ubuntu terminal (Ctrl+Alt+T) you need to edit the file /etc/X11/xorg.conf, which holds the Nvidia X server configuration.
First make a backup of the file:
sudo cp /etc/X11/xorg.conf /etc/X11/xorgBACK.conf
Then edit the file:
sudo gedit /etc/X11/xorg.conf
In its Section "Screen" add exactly the line: Option "Coolbits" "4"
Section "Screen"
Identifier "Screen0"
Device "Device0"
Monitor "Monitor0"
DefaultDepth 24
Option "Stereo" "0"
Option "Coolbits" "4" <-------------------------------------------<
Option "nvidiaXineramaInfoOrder" "CRT-0"
Option "metamodes" "nvidia-auto-select +0+0"
SubSection "Display"
Depth 24
EndSubSection
EndSection
Save and close the file, exit the terminal and reboot; Coolbits will be active from boot. Open "Nvidia X Server Settings" and under Thermal Settings you will have GPU fan control and can also see the fan RPM.
If there is more than one GPU installed, the Coolbits option only works on GPUs that have a monitor connected. It can be a real or a virtual monitor.
Setting up a virtual monitor is slightly more complex and could be explained in another section; I installed a virtual monitor to control the fan of my second GPU. A rough sketch follows below.
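As a sketch only (the BusID and connector name here are placeholders; adapt them to your own system), the usual trick is to give the second GPU its own Device section in xorg.conf and tell the driver a display is attached:

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    BusID          "PCI:2:0:0"                 # find yours with: lspci | grep -i nvidia
    Option         "ConnectedMonitor" "DFP-0"  # pretend a monitor is plugged in
EndSection

"sudo nvidia-xconfig --enable-all-gpus" can also generate a Device/Screen pair for every installed GPU for you.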
See my Nvidia Settings screen and GKrellM.
Robert Gammon:
In all versions of Ubuntu Linux from 12.04 through 13.10 that I have personally used with GPUs and GPUGrid, only fan control (coolbits=4) is confirmed to work.
Clock control has never worked.
Presently I use Ubuntu 13.10 with Nvidia 331.17 drivers.
The simplest method to add fan control, IMHO, is to open a terminal window and type:
cd /etc/X11
sudo nvidia-xconfig --cool-bits=4
This will place the option for fan control in the proper location in xorg.conf and back up the file automatically before saving the changed file.
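To confirm the option landed, you can check with:
grep -i coolbits /etc/X11/xorg.conf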
You need to log out and log back in, or restart your machine, to make this change effective.
Open nvidia-settings to see fan control under Thermal Settings.
MJH (Project administrator, Project developer, Project scientist):
> OPTION 1 - allows GPU overclocking
> OPTION 4 - allows fan control
> The two options cannot be used simultaneously.
It's a bitmask. To enable both options use "Coolbits 5"
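Each feature is one bit, so the values simply add up:
1 (binary 001) = clock control
4 (binary 100) = fan control
1 + 4 = 5 (binary 101) = both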
MJH
> It's a bitmask. To enable both options use "Coolbits 5"
> MJH
Hello. From the Fermi cores onwards Nvidia does not support Coolbits option 1, so using Coolbits 5 has the same effect as "4". Greetings.
Dagorath:
@ Robert Gammon,
Your tip for setting Coolbits is excellent. It's easier than opening xorg.conf in an editor and it eliminates the possibility of making a typo that prevents X from booting.
@all,
With Coolbits 1 I was able to adjust the clock freqs on my GTX 570 using the following commands:
1) nvidia-settings --assign localhost:0[gpu:0]/GPUAdaptiveClockState=0
2) nvidia-settings --assign localhost:0[gpu:0]/GPUCurrentClockFreqs=x (x = an integer in a valid range)
1) unlocks clock settings, 2) assigns a frequency.
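To see what the driver currently reports before assigning anything, a query of the same attribute should work:
nvidia-settings --query localhost:0[gpu:0]/GPUCurrentClockFreqs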
With Coolbits 5 I was able to adjust both clock and fan speed. Again, that was on the GTX 570.
Now, on my 660ti with 331.17 driver, the clocks cannot be accessed with the above commands. Using either of the above commands gets a response similar to "nope, that setting is read only".
I can adjust fan speed with Coolbits 4 or Coolbits 5 on 660ti, that's not locked.
Windows users are overclocking Kepler cards so I assume the clocks aren't really locked. It appears nvidia-settings simply isn't capable of getting around whatever barriers exist. I am going to install Win XP and BOINC in a VirtualBox VM to see if:
1) it's possible to crunch GPUgrid in a VM
2) I can use utils for Win from EVGA, MSI, Asus, etc. to alter/unlock the BIOS or do whatever needs doing to allow Linux access to the clocks. Has anybody tried that yet, either in real Windows or virtual Windows?
____________
BOINC <<--- credit whores, pedants, alien hunters
skgiven:
> on my 660ti with 331.17 driver, the clocks cannot be accessed with the above commands. Using either of the above commands gets a response similar to "nope, that setting is read only".
> I can adjust fan speed with Coolbits 4 or Coolbits 5 on 660ti, that's not locked.
It might be the same for all GK104 cards.
Ubuntu 13.04, GTX770 + GTX670 (313.30 drivers):
Tried: nvidia-settings --assign localhost:0[gpu:0]/GPUAdaptiveClockState=0
Response was along the lines of 'don't recognize localhost:0'.
So, tried,
nvidia-settings --assign [gpu:0]/GPUAdaptiveClockState=0
Response was the same as you got, 'that file is read only'.
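If the display specification is the problem, pointing nvidia-settings at the X display explicitly sometimes helps (the display number here is an assumption; yours may differ):
DISPLAY=:0 nvidia-settings --assign [gpu:0]/GPUAdaptiveClockState=0
or equivalently with the --ctrl-display=:0 flag.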
> Windows users are overclocking Kepler cards so I assume the clocks aren't really locked. It appears nvidia-settings simply isn't capable of getting around whatever barriers exist. I am going to install Win XP and BOINC in a VirtualBox VM to see if:
> 1) it's possible to crunch GPUgrid in a VM
> 2) I can use utils for Win from EVGA, MSI, Asus, etc. to alter/unlock the BIOS or do whatever needs doing to allow Linux access to the clocks. Has anybody tried that yet, either in real Windows or virtual Windows?
Last time I tried to update a VM (with drivers) it crashed the VM and left it unusable. I suspect you need to build the driver into the VM before opening it in Virtual Box (or whatever you use).
So far I think there are only 3 projects, which you mentioned, (and possibly test projects) that use the virtual environment to run Boinc. These are all CPU projects AFAIK.
To me it makes more sense to launch Boinc within a VM rather than just run tasks within a VM but keep the manager outside. I used Dotschux this way to isolate the CPU tasks, so only so many would run (Boinc wouldn't suspend GPU work to run CPU work), and it also allowed me to run Linux only tasks on a Windows system.
Typically there is an inefficiency with any VM (as it has to go through the host system to some extent), but on the plus side you might find improved stability (or perhaps checkpointing sort-of becomes possible, RNA World). The stability of the VM is a different matter. When I ran Climate models and WCG's Clean Energy tasks, the performance was very poor, but other tasks from other projects were fine.
It would be interesting to see what benefit the latest virtualization brings. Haswell might reduce the reliance on the host operating system, reducing the overhead. There is a bit of a conundrum here: WRT XP, its performance is usually very close to that of Linux, so you probably wouldn't see any performance benefit, but it might facilitate testing for the sake of testing.
I suspect that there would not be any improvement from Vista,W7,W8 due to the WDDM, but it would be interesting to see. You would need to install the drivers and the tools to control fan speed and GPU clocks. Fine on XP (I guess) but I expect you would incur the wrath of the WDDM on Vista/W7/W8. Obviously if you didn't install the drivers on Windows you could not control the clocks.
Anyway, I think you would need to build/use a ready built VM image with NVidia drivers. Maybe Matt can help with this?
____________
FAQ's
HOW TO:
- Opt out of Beta Tests
- Ask for Help
Dagorath:
I'm not sure why that command won't work with "localhost:0" in it. You can also use your system's name in place of localhost. The list of arguments for nvidia-settings is long and complicated with several ways to accomplish the same thing.
If you want/need to become more familiar with nvidia-settings, you might try these old school CLI oriented techniques for viewing help and docs:
1) man nvidia-settings (traditional Linux CLI docs method)
2) nvidia-settings --help
3) nvidia-settings --help | more (more paginates screen output, press spacebar to scroll down)
4) nvidia-settings --query all > nv_sets.txt ; less < nv_sets.txt (or can open nv_sets.txt in a text editor)
More doesn't allow scrolling up, just down. Substitute the newer less for more (no pun intended). Less allows up-arrow to scroll up one line, down-arrow to scroll down one line, page-up to scroll up one page, page-down to scroll down one page. Press q to quit.
BTW, don't pipe nvidia-settings --query all directly to more or less! nvidia-settings won't exit until you exit more/less; it just keeps on querying and dumping more output into the pipe.
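Once the query output is in a file you can also search it instead of scrolling, e.g. to find the fan attributes:
grep -i fan nv_sets.txt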
Thanks for the thoughts and suggestions, Kevin. Good to hear XP is the OS most likely to succeed in this experiment and yes, maybe Haswell's VT-d is what's needed to make it work.
Testing just for the sake of testing sometimes bears fruit and that's one of my motives for the experiment with a virtual XP. I'm also thinking of OS X users. If my experiment works on my Linux system then it might pave the way for OS X users to crunch GPUgrid too, if they don't mind the performance hit imposed by the VM. Also, we don't want them to feel like they're excluded from Noelia's world GPU memory domination agenda. (Still chuckling over that thread.)
The main reason for my experiment is that my system boots Linux only. I don't want to turn it into a dual-boot rig just to see if I can use Win to permanently unlock the clocks, mod the BIOS or do whatever it needs. If I can't do that in a VM, I'll consider going dual-boot but in the end I'll get back to crunching GPUgrid on Linux even if it means I can't OC.
____________
BOINC <<--- credit whores, pedants, alien hunters
yodap:
Thank you Robert. It worked for me in the new LM16 install.
jlhal:
Hi all !
As I was curious about the latest "Coolbits" option values, this is what I found reading the README file for the NVidia 346.16 beta driver. The following is the exact text from that README:
Option "Coolbits": Enables various unsupported features, such as support for GPU clock manipulation in the NV-CONTROL X extension.
This option accepts a bit mask of features to enable.
WARNING: this may cause system damage and void warranties.
This utility can run your computer system out of the manufacturer's design specifications, including, but not limited to: higher system voltages, above normal temperatures, excessive frequencies, and changes to BIOS that may corrupt the BIOS. Your computer's operating system may hang and result in data loss or corrupted images. Depending on the manufacturer of your computer system, the computer system, hardware and software warranties may be voided, and you may not receive any further manufacturer support. NVIDIA does not provide customer service support for the Coolbits option. It is for these reasons that absolutely no warranty or guarantee is either express or implied. Before enabling and using, you should determine the suitability of the utility for your intended use, and you shall assume all responsibility in connection therewith.
When "1" (Bit 0) is set in the "Coolbits" option value, the nvidia-settings utility will contain a page labeled "Clock Frequencies" through which clock settings can be manipulated. On mobile GPUs, limited clock manipulation support is available when "1" is set in the "Coolbits" option value: clocks can be lowered relative to the default settings, but overclocking is not supported due to the thermal constraints of notebook designs. This is allowed on GeForce GPUs before the GeForce GTX 400 series and Quadro GPUs before the Quadro Fermi series.
When "2" (Bit 1) is set in the "Coolbits" option value, the NVIDIA driver will attempt to initialize SLI when using GPUs with different amounts of video memory.
When "4" (Bit 2) is set in the "Coolbits" option value, the nvidia-settings Thermal Monitor page will allow configuration of GPU fan speed, on graphics boards with programmable fan capability.
When "8" (Bit 3) is set in the "Coolbits" option value, the PowerMizer page in the nvidia-settings control panel will display a table that allows setting per-clock domain and per-performance level offsets to apply to clock values. This is allowed on certain GeForce GPUs in the GeForce GTX 400 series and later. Not all clock domains or performance levels may be modified.
When this option is set for an X screen, it will be applied to all X screens running on the same GPU.
The default for this option is 0 (unsupported features are disabled).
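So the values seen earlier in this thread decompose as:
Coolbits 5 = 1 + 4 (clock page plus fan control, on pre-Fermi cards)
Coolbits 12 = 4 + 8 (fan control plus PowerMizer clock offsets, on GTX 400 series and later)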
____________
Lubuntu 16.04.1 LTS x64
jlhal:
Added to the Coolbits option in the 346.16 beta version of the Nvidia driver for Linux, but not documented:
When "16" (Bit 4) is set in the "Coolbits" option value, over/under-volting is allowed, but this can only be done through the CLI.
Found at the link below, but not tested by me:
Over-Volting Your GPU With The New NVIDIA Linux Driver
http://www.phoronix.com/scan.php?page=news_item&px=MTg0MDI
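Going by the documented bit values, fan control plus clock offsets plus voltage control would be 4 + 8 + 16 = 28, e.g. (untested by me, same risks apply):
sudo nvidia-xconfig --cool-bits=28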
USE AT YOUR OWN RISK !
____________
Lubuntu 16.04.1 LTS x64
klepel:
I have tried to install Linux Mint 17.3 64-bit several times, but it does not recognize the GPU in BOINC. When I install the Linux Mint 17.3 32-bit version it does recognize the GPU in BOINC, but it will not work with GPUGRID.
So I decided to install LUBUNTU 16.04 64-bit, and yes, it does recognize the GPU in BOINC.
However, I am not able to activate COOLBITS=12 permanently.
I am able to set it in the terminal once:
cd /etc/X11
sudo nvidia-xconfig --cool-bits=12
After restarting the computer, I open "Nvidia X Server Settings" and I can find the desired fan speed control slide and I can activate it.
At that point the file "xorg.conf" exists in /etc/X11.
However, when I restart the computer again, the fan speed control in "Nvidia X Server Settings" is gone, and in /etc/X11 the file "xorg.conf" has been replaced with "xorg.conf.06092016".
I opened this file with "sudo leafpad /etc/X11/xorg.conf" and the coolbits option 12 is there.
I saved the same file as "xorg.conf" and restarted the computer again, but after the restart it is replaced again with "xorg.conf.06092016".
I tried renaming it as well, but it does not have any effect.
If I open "Nvidia X Server Settings" without restarting, the fan control slider does not appear. If I restart the computer, the file gets renamed as described.
Any suggestions?
jlhal:
Hi Klepel, I haven't used my GPUs for about a year and at the moment I'm using the default 16.04 dist driver, but if I remember well you should put a copy of the Nvidia config file in your home directory. You will easily find something about this elsewhere in this forum or on the web.
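If memory serves, nvidia-settings writes its per-user settings to ~/.nvidia-settings-rc ("Save Current Configuration" in the GUI), and they can be reapplied at each login by adding this to your startup applications:
nvidia-settings --load-config-only
Whether that gets around the xorg.conf renaming I cannot say, but it is worth a try.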
Hope this helps
Regards
____________
Lubuntu 16.04.1 LTS x64
klepel:
Hi jlhal,
I reverted to Nvidia driver 304.131. I tried to put the xorg.conf in /etc/X11; while doing this I restarted the computer several times, and the last time I did it, Linux crashed. Luckily I have the xorg.conf in /etc/X11 now, "Nvidia X Server Settings" does show the fan slider, and I was able to activate it.
I have not restarted the computer since (one never knows what would happen) and let it run PRIMEGRID overnight. Works! I tried to run SETI, but it asks for the newest driver. GPUGRID does not report any work available, which I doubt, as the server stats show 2,600-something tasks ready to send. CLIMATEPREDICTION crashes; I assume it is because of the missing 32-bit libraries on the 64-bit system.
I was not able to find anything specific in the Lubuntu or Ubuntu forums, so something is very strange… I will run it over the weekend to see if it fetches GPUGRID WUs, or I will reinstall Ubuntu or just wait until Linux Mint 18 is ready to ship.
If it is a bitmask, then setting CoolBits to 255 should enable all features.
I am using 255 without problems.
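(255 = 1 + 2 + 4 + 8 + 16 + 32 + 64 + 128, i.e. every bit from 0 through 7, documented or not.)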
klepel:
I would like to report back: I restarted and installed all the updates on my LUBUNTU computer after LINUX MINT 18 became available last week.
I also installed the latest NVIDIA driver: 361.42.
Since then the system runs flawlessly with Coolbits=12. I adjusted the fan speed to the maximum, and Xserver reports the temperature of the video card at around 77 to 80 °C. I would have to down-clock the card a bit to get the temperature lower, but I am not sure how to do that.
The system is:
http://www.gpugrid.net/results.php?hostid=340025
Beyond:
> I would like to report back: I restarted and installed all the updates on my LUBUNTU computer after LINUX MINT 18 became available last week.
Good news. Are you using LUBUNTU or LINUX MINT 18?
klepel:
I am using LUBUNTU! It is slim and starts up way faster than LINUX Mint 17.3. And as I use this particular computer to test LINUX - BOINC - GPUGRID and CLIMATEPREDICTION with Nvidia cards, I will let it run for a while!
I am just very happy that after months of trying it just works! Honestly, the solution came with the Lubuntu 16.04 LTS release and has nothing to do with my computer or acquired Linux skills…
I tried LINUX Mint 17.3 for months; I got it working with Nvidia cards on the 32-bit version, but it never worked with 64-bit. So I was not able to crunch for GPUGRID; it only worked with PRIMEGRID and EINSTEIN. But with this new release (Lubuntu 16.04 LTS) it worked more or less out of the box, apart from the COOLBITS problem mentioned earlier in this thread.
Last note: however, if you like to work on the computer, I would recommend LINUX Mint; it is just pretty and the feel is very Windows-like. So I will test LINUX Mint 18 on another computer which runs LINUX Mint 17.3 with an Nvidia 9800 GT card (I use this one as a space heater…).
_Ryle_:
Hi Klepel,
When installing NVIDIA drivers on Mint 17.3, did you remember to blacklist the Nouveau driver? (It is the built-in open-source driver.) It's an important step when installing NVIDIA's driver.
I made a small guide for myself, because it involves a conf file and editing grub on the command line. Sounds complicated maybe, but if it is done right, it isn't :) The core of it is sketched below.
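Assuming an Ubuntu-based system like Mint (paths may differ elsewhere), the essential steps look something like this:

# /etc/modprobe.d/blacklist-nouveau.conf
blacklist nouveau
options nouveau modeset=0

Then rebuild the initramfs and reboot so the open-source driver never loads:
sudo update-initramfs -u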
I can translate the guide I have into English, if you want. It works on 17.3, but I haven't tried Mint 18 yet. Basically I just made this guide out of tips I found here and there online, so instead of looking around each time, I just have it in my filebox here :)
I also tried Lubuntu 16.04; it worked well with the repository NVIDIA driver. However, I had a problem where the Network Manager would lose its connection randomly until I manually reconnected. Could be my box was the culprit, I don't know. I just went back to Mint 17.3.