Message boards : Graphics cards (GPUs) : A little warning to GTX690 owners
Author | Message |
---|---|
You might be interested in my long story if you run this type of card with the factory-made cooling. | |
ID: 27390 | Rating: 0 | rate: / Reply Quote | |
Cooling problems on graphics cards are always a real pain ^^ wish ya good luck with the new one. | |
ID: 27394 | Rating: 0 | rate: / Reply Quote | |
If it's any consolation I would have done the same thing, and I've used some cryo research kit. I expect you are right, the vapour chamber leaked - it happens. | |
ID: 27395 | Rating: 0 | rate: / Reply Quote | |
Interesting story, thank you. Sometimes I have surges of temptation to buy a GTX 690 (or even two). But, given its price, I am not ready to risk a malfunction, and I would not like to trouble myself with fixing it. On the other hand, any equipment can fail, and a crunching one is at even higher risk, I think. | |
ID: 27397 | Rating: 0 | rate: / Reply Quote | |
Interesting story. Let's see what happens. | |
ID: 27399 | Rating: 0 | rate: / Reply Quote | |
Heat issues always scare me. I figure that now that I am running multiple machines, and crunching so often, I will eventually have to conquer my fears of dealing with thermal paste. | |
ID: 27404 | Rating: 0 | rate: / Reply Quote | |
I expect you are right, the vapour chamber leaked - it happens. Sure it happens, but all of my existing coolers have vapour chambers or heat pipes, and none of them has leaked before, even after two years of operation. I'm sure you checked the fan was turning, heatsink was tight... I did. If the fan weren't rotating, the other GPU would overheat as well. Although you could have RMA'ed it, that's always a pain to deal with. I bought it as a used part from Slovakia (it was quite cheap); it is a replacement card (so the original one was RMA'ed). Although I have its invoice, so I could RMA this one as well, it would be a long and difficult process. I hope the problem isn't anything to do with the VRM. To rule that out, I've checked the power consumption of each GPU on the GTX 690, and they are within 5%, so the hot chip doesn't dissipate significantly more heat than the normal one. So long as it's something to do with cooling, the Arctic GPU cooler should fix it. They make great kit, though I recently had to remove a motherboard to dismount a wide Arctic heatsink (screwed into a CPU backplate on the back of the motherboard), just to replace RAM. That's a bad design. I use a Noctua NH-D14. It has two big screws on the upper side of the MB holding the heatsink to a mount. It's still difficult to dismount, because I have to remove the middle fan before I can access those two big screws, and on some motherboards the GPU in the first PCIe slot is so close to the heatsink that I have to remove the GPU first to access the fan's lever. Anyway, good luck. I keep my fingers crossed that I haven't spent another 100 euros in vain. | |
ID: 27405 | Rating: 0 | rate: / Reply Quote | |
Hi, | |
ID: 27514 | Rating: 0 | rate: / Reply Quote | |
Do you know the power consumed running two acemd instances? | |
ID: 27516 | Rating: 0 | rate: / Reply Quote | |
is it possible to remove the fan from a gtx690? It's possible, but what for? I'm sure that the fan (and the airflow) is good. Would it complain that there is no fan, provided that it is well ventilated? If the temps stay low, it'll work. As far as I understand, the 690 spits air out of both the front and the back, while we would like the air to flow front to back. Yes, but this card has two GPUs with separate heatsinks, and it's not a good idea to cool one GPU with the hot air coming from the other GPU. This card is in the open air, so there's no heat buildup. | |
ID: 27517 | Rating: 0 | rate: / Reply Quote | |
Do you know the power consumed running two acemd instances? The power consumption went up by 260W while both GPUs were crunching (and the fan revved up). I've just received the new cooler, so I've removed this GTX 690 from my host, but I'll put it back as soon as I've finished changing the cooler. Stay tuned. | |
ID: 27518 | Rating: 0 | rate: / Reply Quote | |
Hi, Is this for your own GTX690? I would be inclined to keep the fan and try to modify the casing so that it's blowing out the back/side/top. A couple of days ago I was playing around with trying to better cool a GTX660Ti and a GTX470 in the same case. Both cards have 2 fans and blow the air all over the place. When I put an extra fan at the back of the case, their temps actually got worse. Ditto when I added a fan to the front. I then placed a fan blowing directly onto the cards, and both dropped their temperatures and then their fan speeds. Blasting works best for cooling, and good case fans help. Normally you can remove a fan, but you can't crunch without it; the card will get too hot. I've done this several times with smaller cards when the fans start rattling. That said, I once removed a fan from a Gigabyte GT240 and it prevented the system from starting. The power draw from many GF600 cards when crunching tends to be around 95% of reference TDP, which might prove challenging to a newer, more power-hungry app, though I expect the cards just won't boost as high. ____________ FAQ's HOW TO: - Opt out of Beta Tests - Ask for Help | |
ID: 27519 | Rating: 0 | rate: / Reply Quote | |
So a 1500W power supply should be able to cope with 4 GTX 690s? | |
ID: 27520 | Rating: 0 | rate: / Reply Quote | |
https://www.youtube.com/watch?v=KGXBZvS6qJc | |
ID: 27523 | Rating: 0 | rate: / Reply Quote | |
So a 1500W power supply should be able to cope with 4 GTX 690s? I've seen a video in which a guy was using two GTX 690s with an 800W PSU, so 1500W should be enough for 4 GTX 690s. Is it easy to disassemble the standard single fan as in this picture? The fan is fastened by 3 Phillips-type screws, but they are threadlocked, so you have to press the screwdriver very hard while turning it. We will use external fans. 4 GTX 690s with external fans? It's a very bad idea. You will fry your cards. These dual-GPU cards were designed to have two of them at maximum in a single PC. This results in quad-SLI, so there is no point in putting more than two dual-GPU cards in a single PC from the gamer's point of view. Placing 4 dual-slot GPUs in a single PC with air cooling is very dangerous. | |
ID: 27524 | Rating: 0 | rate: / Reply Quote | |
I'm finished changing the cooler on my GTX 690. | |
ID: 27525 | Rating: 0 | rate: / Reply Quote | |
Zoltan, how did you get on with your heatsink and Fan replacement? OK, slow post! | |
ID: 27526 | Rating: 0 | rate: / Reply Quote | |
So a 1500W power supply should be able to cope with 4 GTX 690s? I've found this video (sorry, it's in Hungarian): Corsair 800W PSU + 2x GTX690 quad-SLI, Core i7 3770K @ 4.4GHz, Corsair GS800, 4x4GB | |
ID: 27527 | Rating: 0 | rate: / Reply Quote | |
Sounds like GDF wants to build a monster cruncher to finish very important or huge jobs faster than GPU-Grid could. That's why he's shooting for the maximum number of cards. The system might be mounted in a server rack, so there will be massive air flow. I can't tell if it's enough, though... 4 x 260 W = 1.04 kW is extreme. | |
ID: 27528 | Rating: 0 | rate: / Reply Quote | |
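To put rough numbers on the PSU question, here is a back-of-the-envelope sizing sketch in Python. The 260 W per-card figure is the measured increase reported earlier in the thread; the platform overhead and the 80% sustained-load rule of thumb are my own assumptions, not figures from this thread.

```python
# Rough PSU sizing sketch for a multi-GPU cruncher.
# CARD_DRAW_W comes from the measured per-card draw reported above;
# PLATFORM_W and the 0.8 loading factor are illustrative assumptions.
CARD_DRAW_W = 260      # measured increase per GTX 690 while crunching (W)
NUM_CARDS = 4
PLATFORM_W = 200       # assumed CPU, motherboard, drives, fans (W)

gpu_total = CARD_DRAW_W * NUM_CARDS        # total GPU draw
system_total = gpu_total + PLATFORM_W      # whole-system draw
# Keeping sustained load at or below ~80% of the PSU rating leaves headroom:
min_psu = system_total / 0.8

print(gpu_total, system_total, round(min_psu))
```

Under these assumptions a 1500 W unit ends up right at the edge of the comfortable range, which matches the "I can't tell if it's enough" hesitation above.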
Zoltan, Good to hear your GTX690 is up and running well. 56°C and 59°C is excellent for a dual card. A good purchase. | |
ID: 27529 | Rating: 0 | rate: / Reply Quote | |
Risers are a good idea. If I were going to build a monster 4-GPU machine, I would plan on using risers to get the GPUs apart from each other. | |
ID: 27531 | Rating: 0 | rate: / Reply Quote | |
Heat issues always scare me. I figure that now that I am running multiple machines, and crunching so often, I will eventually have to conquer my fears of dealing with thermal paste. Thermal paste is NOT a problem; just get some good paste. I personally like Arctic Silver, but there ARE others too. ALSO, it IS possible to put too much of a blob on there. I have seen examples of 'the size of a pea', but it is just doing it several times that will tell you how much is enough and not too much. You want enough to be able to spread out over the whole CPU, but NOT out over the edges! Remember, heat MUST go through the paste and into the heatsink/fan, and too thick a layer is not good for heat transfer. Basically, don't worry; EVERYONE gets it wrong a few times before they sort of 'figure it out'. I have NEVER burnt up a CPU by putting on too much, but when the CPU DOES run hotter than I think it should, I take it apart and redo it. I use alcohol prep swabs to clean the heatsink and CPU, making sure everything is completely dry before redoing it. | |
ID: 27541 | Rating: 0 | rate: / Reply Quote | |
Thermal paste should be spread as thin as possible; although it's designed to transfer heat, being a liquid phase it inevitably can't do it as well as the solid metal on either side. So use just enough to actually cover and link the CPU and heatsink. This is why some people lap both the CPU and heatsink surfaces. | |
ID: 27542 | Rating: 0 | rate: / Reply Quote | |
Some people may go mad about thermal paste.. but it's not that critical. Personally I like to position the heat sink and then twist it a bit back and forth before I fasten it. Doing so squeezes excess paste to the edges (it doesn't hurt there, it's just wasted), and after several repetitions I can feel a stronger resistance to my torque, which I interpret as "sitting tightly". Never had thermal paste related problems.. so just give it a try (if you have to). | |
ID: 27549 | Rating: 0 | rate: / Reply Quote | |
Yes, we would like to try to build an 8-GPU machine made of 690s. | |
ID: 27551 | Rating: 0 | rate: / Reply Quote | |
Some people may go mad about thermal paste.. but it's not that critical. Personally I like to position the heat sink and then twist it a bit back and forth before I fasten it. Doing so squeezes excess paste to the edges (it doesn't hurt there, it's just wasted), and after several repetitions I can feel a stronger resistance to my torque, which I interpret as "sitting tightly". Never had thermal paste related problems.. so just give it a try (if you have to). I give mine a couple of little twists too before clamping it down. GDF is making a Bitcoin machine? | |
ID: 27566 | Rating: 0 | rate: / Reply Quote | |
Definitely not! | |
ID: 27572 | Rating: 0 | rate: / Reply Quote | |
What are the screws to remove (3 or 4)? 3 screws. Not the visible ones (the fan blades are shrouding them). I didn't find any picture of them, so I can take some if you'd like me to. | |
ID: 27577 | Rating: 0 | rate: / Reply Quote | |
I'm considering buying a PC with 4x GTX 690, water-cooled. SLI will be off. Do 4 GTX 690s work well together? The 690 is a dual-GPU card, so on each card will there be 2 WUs processing together? | |
ID: 27657 | Rating: 0 | rate: / Reply Quote | |
I'm considering buying a PC with 4x GTX 690, water-cooled. SLI will be off. Do 4 GTX 690s work well together? The 690 is a dual-GPU card, so on each card will there be 2 WUs processing together? It's very hard to get 8 GPUs working in a single PC. You need a special BIOS, and forget about Windows. | |
ID: 27660 | Rating: 0 | rate: / Reply Quote | |
So couldn't someone then just get 3 690's and then 1 680? | |
ID: 27661 | Rating: 0 | rate: / Reply Quote | |
So couldn't someone then just get 3 690's and then 1 680? More than 4 GPUs is usually problematic, especially with Windows. | |
ID: 27662 | Rating: 0 | rate: / Reply Quote | |
I'm considering buying a PC with 4x GTX 690, water-cooled. SLI will be off. Do 4 GTX 690s work well together? The 690 is a dual-GPU card, so on each card will there be 2 WUs processing together? Is that the reason why you have "only" 3 GTX 690s? Is it also not a problem with 3 GTX 690s and Windows? ____________ Member of Boinc Italy. | |
ID: 27667 | Rating: 0 | rate: / Reply Quote | |
[3] NVIDIA GeForce GTX 690 (2047MB) driver: 310.33 | |
ID: 27669 | Rating: 0 | rate: / Reply Quote | |
[3] NVIDIA GeForce GTX 690 (2047MB) driver: 310.33 That means also 1x 690 and another GTX, right? The maximum number of 690s is also 2? Another question... with one 690 will the PC crunch 2 WUs at the same time? What's the better solution between 2x690 and 4x680? ____________ Member of Boinc Italy. | |
ID: 27680 | Rating: 0 | rate: / Reply Quote | |
[3] NVIDIA GeForce GTX 690 (2047MB) driver: 310.33 Exactly. This host has one GTX 690 (reported by the BOINC manager as two GTX 690s) and a GTX 670. The maximum number of 690s is also 2? Yes. Another question... with one 690 will the PC crunch 2 WUs at the same time? Yes. What's the better solution between 2x690 and 4x680? It's hard to tell. Both configurations have pros and cons.
2x690: Pros: the motherboard needs only two PCIe x16 slots; fewer PCIe power cables required (four 8-pin). Cons: less overclockability; larger heat dissipation into the case (with the factory cooler).
4x680: Pros: higher overclockability; the heat is dissipated only through the rear grille (with the factory cooler). Cons: the motherboard needs four PCIe x16 slots, and the PSU needs four 8-pin and four 6-pin PCIe power cables.
I would choose the 2x690, with a larger-than-factory cooler (or a water cooler), and a motherboard with 4 PCIe x16 slots (on this kind of MB the GPUs sit 1 slot farther apart, allowing better airflow). | |
ID: 27681 | Rating: 0 | rate: / Reply Quote | |
Thank you very much Zoltan :-) | |
ID: 27684 | Rating: 0 | rate: / Reply Quote | |
[3] NVIDIA GeForce GTX 690 (2047MB) driver: 310.33 [6] NVIDIA GeForce GTX 690 (2048MB) driver: 301.42... what's this? 6 GPUs... ____________ Member of Boinc Italy. | |
ID: 27814 | Rating: 0 | rate: / Reply Quote | |
[6] NVIDIA GeForce GTX 690 (2048MB) driver: 301.42... what's this? 6 GPUs... The really interesting part is that this host is running Windows 7 Ultimate x64. I guess this host has a UEFI BIOS. I'm sure that besides crunching there is no use for a 3rd dual-GPU card in a PC, because you can't connect the 3rd card with an SLI cable to the other two, since dual-GPU cards have only a single SLI connector. | |
ID: 27816 | Rating: 0 | rate: / Reply Quote | |
Not new to GPUGrid, but I recently completed my new rig. It includes 2 GTX 690s (intended for GPUGrid) and one GT 640 (intended for SETI@home). I have one problem with GPU utilization; I hope to get advice from the masters ;)
25/12/2012 03:21:49 p.m. | | NVIDIA GPU 0: GeForce GTX 690 (driver version 310.70, CUDA version 5.0, compute capability 3.0, 2048MB, 8382371MB available, 3132 GFLOPS peak)
25/12/2012 03:21:49 p.m. | | NVIDIA GPU 1: GeForce GT 640 (driver version 310.70, CUDA version 5.0, compute capability 3.0, 1024MB, 836MB available, 692 GFLOPS peak)
25/12/2012 03:21:49 p.m. | | NVIDIA GPU 2: GeForce GTX 690 (driver version 310.70, CUDA version 5.0, compute capability 3.0, 2048MB, 1955MB available, 3132 GFLOPS peak)
25/12/2012 03:21:49 p.m. | | NVIDIA GPU 3: GeForce GTX 690 (driver version 310.70, CUDA version 5.0, compute capability 3.0, 2048MB, 1955MB available, 3132 GFLOPS peak)
25/12/2012 03:21:49 p.m. | | NVIDIA GPU 4: GeForce GTX 690 (driver version 310.70, CUDA version 5.0, compute capability 3.0, 2048MB, 1955MB available, 3132 GFLOPS peak)
25/12/2012 03:21:49 p.m. | | OpenCL: NVIDIA GPU 0: GeForce GTX 690 (driver version 310.70, device version OpenCL 1.1 CUDA, 2048MB, 8382371MB available)
25/12/2012 03:21:49 p.m. | | OpenCL: NVIDIA GPU 1: GeForce GT 640 (driver version 310.70, device version OpenCL 1.1 CUDA, 1024MB, 836MB available)
25/12/2012 03:21:49 p.m. | | OpenCL: NVIDIA GPU 2: GeForce GTX 690 (driver version 310.70, device version OpenCL 1.1 CUDA, 2048MB, 1955MB available)
25/12/2012 03:21:49 p.m. | | OpenCL: NVIDIA GPU 3: GeForce GTX 690 (driver version 310.70, device version OpenCL 1.1 CUDA, 2048MB, 1955MB available)
25/12/2012 03:21:49 p.m. | | OpenCL: NVIDIA GPU 4: GeForce GTX 690 (driver version 310.70, device version OpenCL 1.1 CUDA, 2048MB, 1955MB available)
25/12/2012 03:21:49 p.m. | | NVIDIA library reports 5 GPUs
25/12/2012 03:21:49 p.m. | | No ATI library found.
I have correctly (I think) configured the cc_config file to use the 690s in GPUGrid and the 640 in SETI. This is my config file:
<cc_config>
  <log_flags>
    <coproc_debug>1</coproc_debug>
    <sched_op_debug>1</sched_op_debug>
  </log_flags>
  <options>
    <use_all_gpus>1</use_all_gpus>
    <ncpus>0</ncpus>
    <report_results_immediately>1</report_results_immediately>
    <exclude_gpu>
      <url>http://www.gpugrid.net/</url>
      <device_num>1</device_num>
    </exclude_gpu>
    <exclude_gpu>
      <url>http://setiathome.berkeley.edu/</url>
      <device_num>0</device_num>
    </exclude_gpu>
    <exclude_gpu>
      <url>http://setiathome.berkeley.edu/</url>
      <device_num>2</device_num>
    </exclude_gpu>
    <exclude_gpu>
      <url>http://setiathome.berkeley.edu/</url>
      <device_num>3</device_num>
    </exclude_gpu>
    <exclude_gpu>
      <url>http://setiathome.berkeley.edu/</url>
      <device_num>4</device_num>
    </exclude_gpu>
    <max_file_xfers_per_project>4</max_file_xfers_per_project>
  </options>
</cc_config>
and this is the response I get from BOINC:
25/12/2012 03:21:49 p.m. | | Config: report completed tasks immediately
25/12/2012 03:21:49 p.m. | | Config: use all coprocessors
25/12/2012 03:21:49 p.m. | GPUGRID | Config: excluded GPU. Type: all. App: all. Device: 1
25/12/2012 03:21:49 p.m. | SETI@home | Config: excluded GPU. Type: all. App: all. Device: 0
25/12/2012 03:21:49 p.m. | SETI@home | Config: excluded GPU. Type: all. App: all. Device: 2
25/12/2012 03:21:49 p.m. | SETI@home | Config: excluded GPU. Type: all. App: all. Device: 3
25/12/2012 03:21:49 p.m. | SETI@home | Config: excluded GPU. Type: all. App: all. Device: 4
The GT 640 is crunching WUs from SETI, as expected. But my problem is that I only have 2 WUs for GPUGrid. And the curious thing is that one WU is on one card (say the card in PCIe slot 1) and the other is on another card (say the one in PCIe slot 2). This leaves 2 GPU cores idle, one on each card.
I can confirm this by using MSI Afterburner; these are the temps on the cards:
GPU1 (GTX690) 41 C
GPU2 (GTX690) 82 C
GPU3 (GTX690) 81 C
GPU2 (GTX690) 35 C
GPU2 (GT 640) 43 C
and by manually feeling the exhaust air temps on the cards: the first 690 is expelling hot air only on the "inner" side (inside the case), while the other 690 is expelling hot air on the "outer" side (away from the system grille). What am I doing wrong here? Thanks for your help! | |
ID: 27835 | Rating: 0 | rate: / Reply Quote | |
errata:
GPU1 (GTX690) 41 C
GPU2 (GTX690) 82 C
GPU3 (GTX690) 81 C
GPU4 (GTX690) 35 C
GPU5 (GT 640) 43 C | |
ID: 27836 | Rating: 0 | rate: / Reply Quote | |
Using a cc_config file to limit GPU use, it is possible that the BOINC manager stops fetching new WUs. I had the same experience with POEM and GPUGRID... after a little while my cache was empty and the BOINC manager didn't get new WUs! Removing the cc_config file solved the problem :-) | |
ID: 27837 | Rating: 0 | rate: / Reply Quote | |
Mystery solved. I am now into the second batch on my new rig, and all four coprocessors are now running CUDA tasks. It might have been some kind of "testing" of the cards from the project side. | |
ID: 27838 | Rating: 0 | rate: / Reply Quote | |
Zoltan; | |
ID: 28138 | Rating: 0 | rate: / Reply Quote | |
I have never understood why Nvidia thought having the airflow going in two directions (front and back) was ever going to work properly in cases with fans blowing directly into the exhaust at the front of the GPU (like my Dell Precision). Maybe they never did? Reverse the front fans (and have some intake somewhere) and the problem is gone. MrS ____________ Scanning for our furry friends since Jan 2002 | |
ID: 28162 | Rating: 0 | rate: / Reply Quote | |
Problem is on the Precision chassis (as with most, if not all, Dell tower chassis) the airflow goes in the front (usually assisted by one or more fans) and out the back. | |
ID: 28164 | Rating: 0 | rate: / Reply Quote | |
I know now that it's easier to build a "killer cruncher" by buying components and doing all the engineering yourself than it is to try to take an off-the-shelf workstation and adapt it to the task of devouring WUs. Easier, better and less expensive... | |
ID: 28441 | Rating: 0 | rate: / Reply Quote | |
My only GTX 690 (with the failed, then replaced, cooling) failed yesterday while it was crunching. There was no environmental hazard of any kind. | |
ID: 29575 | Rating: 0 | rate: / Reply Quote | |
My only GTX 690 (with the failed, then replaced, cooling) failed yesterday while it was crunching. There was no environmental hazard of any kind. What temperature were you running it at, mate? I never let mine pass 75°C, and my 6 have been OK for almost a year now. | |
ID: 29583 | Rating: 0 | rate: / Reply Quote | |
The temperatures of the GTX 690 were 62°C and 65°C. Everything was OK. This host had a GTX 670 beside the GTX 690. The GTX 670 kept on crunching fine after the GTX 690 had failed. I didn't overclock the GTX 690. It simply broke down in the middle of crunching. This host has an Enermax MaxRevo 1500W power supply and an ASUS Rampage III Extreme motherboard. | |
ID: 29584 | Rating: 0 | rate: / Reply Quote | |
what was the cause of the error? | |
ID: 29585 | Rating: 0 | rate: / Reply Quote | |
what was the cause of the error? It wasn't an error from what he wrote; the video card died. I had a GTX670 die 2 months ago for no reason (water cooled). It was an eVGA video card with a 3-year warranty, and I have it as a spare now after getting the replacement back. | |
ID: 29587 | Rating: 0 | rate: / Reply Quote | |
Sometimes hardware just breaks. If there was any observable, even remotely possible reason in his case he probably would have mentioned it. | |
ID: 29590 | Rating: 0 | rate: / Reply Quote | |
what was the cause of the error? The workunits didn't fail, only the card. The GTX 670 finished the two workunits which the GTX 690 could only partially process. | |
ID: 29592 | Rating: 0 | rate: / Reply Quote | |
It seems that one of my 690s in the 3x690 rig had a leaked vapor chamber. Luckily enough, I was able to catch the temperature increase at the exact moment and underclock it. Just ordered an Accelero cooler for this unit. | |
ID: 30767 | Rating: 0 | rate: / Reply Quote | |
It seems that one of my 690s in the 3x690 rig had a leaked vapor chamber. Luckily enough, I was able to catch the temperature increase at the exact moment and underclock it. Just ordered an Accelero cooler for this unit. Yes. It's HUGE in every direction :). It's 5 cm taller and 2 cm longer than the card itself. | |
ID: 30805 | Rating: 0 | rate: / Reply Quote | |
Man, it seems a pain to assemble. When it was all ready, I figured out that I didn't have the little wrench to remove the original screws from the back of the 690. Is it a micro Allen wrench? It's so small that I can't even see it right! | |
ID: 31026 | Rating: 0 | rate: / Reply Quote | |
Man, it seems a pain to assemble. It takes a while. However, I've made my job easier by using nail polish (enamel) instead of the insulating tapes. When it was all ready, I figured out that I didn't have the little wrench to remove the original screws from the back of the 690. Is it a micro Allen wrench? No, it's a Torx (TX5, if I recall correctly, but I will check it tomorrow for you). It's so small that I can't even see it right! They are. I have reading glasses for that purpose :) | |
ID: 31031 | Rating: 0 | rate: / Reply Quote | |
LOL thanks mate. I will look for that wrench... I really need it to get my hands on the card again. In the meantime, it's crunching at 915 MHz, 0.975 V, 64°C... | |
ID: 31045 | Rating: 0 | rate: / Reply Quote | |
LOL thanks mate. I will look for that wrench... It's a Torx for sure; the only question is its size. It could be a TX6 or a TX5. I've found my screwdriver set, and the separate screwdriver I bought for disassembling video cards (it's a TX6), but I didn't have time to check which one fits exactly in those screws. (I was busy selling one of my old MB+CPU combos and buying a new one) | |
ID: 31122 | Rating: 0 | rate: / Reply Quote | |
I will buy an entire set, don't bother.... | |
ID: 31128 | Rating: 0 | rate: / Reply Quote | |
I will buy an entire set, don't bother.... After months of operation? I don't think so. That is where I've used nail polish instead of the insulating tape, because I was afraid the tape would peel off when the heatsink gets hot. | |
ID: 31130 | Rating: 0 | rate: / Reply Quote | |
I will buy an entire set, don't bother.... That's what I was thinking! I will make a double insulation there... ty for your thoughts! | |
ID: 31132 | Rating: 0 | rate: / Reply Quote | |
I installed the beast today. It's really huge. It was a Torx 6 tool that was needed; really necessary to mess with a 690 cooler. Once the original cooler is off, the assembly of the Accelero is not that hard. The performance, on the other hand, is kinda worse than I thought. I don't think it's even on par with the original cooler. I can't tell about the noise, because the machine has 2 other original 690s, but the cooling is not that good. | |
ID: 31173 | Rating: 0 | rate: / Reply Quote | |
Once the original cooler is off, the assembly of the Accelero is not that hard. The performance, on the other hand, is kinda worse than I thought. I don't think it's even on par with the original cooler. It's much better than the original cooler; however, it needs more fresh (cool) air than the original (that is how it can be better), so the more other GPUs there are in the system, the less gain in cooling performance. I can't tell about the noise, because the machine has 2 other original 690s, but the cooling is not that good. It's much less noisy. But its performance is best when there is at least 1 slot of space between the GPUs. So if you have a leaked chamber (a kinda common flaw, it seems), it's a must-have. But if not, don't mess with your 690... I would say: if you have only 1 GTX 690, it's worth the mess anyway :) | |
ID: 31175 | Rating: 0 | rate: / Reply Quote | |
It seems that the long story of my late GTX 690 isn't over. At first one of the vapor chambers failed, later the whole card. | |
ID: 33511 | Rating: 0 | rate: / Reply Quote | |
Thanks for sharing your experience. Sounds like the issue was a lack of quality in the modular power cables/PSU rails, or a connection issue (loose connections cause burn marks). Did you ever OC that card? | |
ID: 33525 | Rating: 0 | rate: / Reply Quote | |
Message boards : Graphics cards (GPUs) : A little warning to GTX690 owners