Message boards : Graphics cards (GPUs) : Various GPU's Performance
I've been perusing the forums lately and haven't come across any posts comparing the performance of different GPUs.
ID: 4816
Well, I just built a new i7 and put into it an NVIDIA 9800 GT from PNY with 1 GB VRAM, and it takes about 6 to 9 hours to run one task. The difficulty is that the time reported (now) is the CPU time and not the GPU time it takes to process a task.
ID: 4817
We did have a comparison prior to the speed-up for GT200-based cards, so it's not valid any more. The key points were:
ID: 4828
I won't be able to add much here for a month or two because I only have the two cards to run on and one of them is still idle ... though it is looking like I am able to fetch and run work normally ... just got another task while taking a nap and I did not do a thing!
ID: 4835
Hey Paul,
ID: 4858
Hey Paul, But some of us "poor" old guys have to make do with a 9600 :D mike
ID: 4859
Hmmm, GT200 rules ... well I can find GTX280, GTX260 ... but no GT200 ... unless you mean the GT200 series ...
ID: 4864
That's easy:
ID: 4866
I have 1 GTX280 on Vista 64 running. The card is too fast for 4+1 on Windows; I have to work with 3+1 or I lose over 30% of the performance. Only good for gaming. This card would run better in a Linux machine for crunching, but it's in my gaming box.
ID: 4868
Hi Paul, And sadly, I am not sure that if at this moment I want to spend $500 to support basically one project that I am not really all that interested in ... Hey, that's absolutely fine! Everyone contributes as much as he / she wants to. And the original idea of DC is to use spare resources and not to buy hardware for DC. So nobody is going to get mad at you for only buying 1 new card instead of 3 ;) And generally, what you're talking about (in your last post) is how to best use your systems. That's fine for you, but what this thread is about is "if I want to spend money on GPU power, what's the best way to do it now?" It's clear that in 6 months cards will be better and/or cheaper.. that's always the case and doesn't help much if I want to buy now. BTW, your current plan of buying 1 fast card and keeping the 9800GT makes a lot of sense. I guess I'll take that route as well shortly.. GTX 260², I'm watching you! Regards, MrS ____________ Scanning for our furry friends since Jan 2002
ID: 4872
Hi Paul, Got today's paper... no Frys ad ... sigh ... Anyway, I kinda thought that was what you meant, the GPU chip. I was looking on Tiger Direct just now and what a mess. There are about 20-some versions of the 280 card from 4 or 5 "manufacturers"... Yes, the intent of BOINC was to use idle CPU time, as was SaH Classic's, but it was mostly used by hard-core anal-retentive autistic people like me ... :) Well, maybe most people are not autistic ... but, sadly, getting "real" people to use BOINC has been, I think, an abject failure. In the past I have laid out my opinion of some of the reasons why and was greeted for those opinions as warmly as, um, well ... I better not say, so as not to cause offense ... But, by and large, the reason is that the BOINC *SYSTEM* as a whole is user hostile. Efforts like my past effort to document it founder on the shoals of project indifference, lack of support (or outright hostility and sabotage), and continual roadblocks. I was looking at BOINC View last night and I see that it has not had an update in over a year; the originator has vanished even though the site is still up and the old version is still available. But GPU processing and other incremental changes are slowly making BV unusable. But the core of BOINC is done by hard timers like me... you can see that with the position percentages ... run any project for a week, even on only one computer, of almost any speed, and at the end of the week you will have done more work than 50% or more of the people that joined that project. Heck, I just joined this project a week ago, don't have all that special a card, and I am at ~30K CS and 42% ... WEP-M+2 I am at 4,497 CS and 70%!!!! as another example. I do almost no SaH anymore and I am still above 98% ... But I digress... Anyhow, I am still dithering about that card .... :) {edit}what is this 260 2 thing you are watching? Dual SLI?{/edit}
ID: 4875
Paul I think he means 216 shaders vs 192 shaders .... 2 different types of GTX260.
ID: 4877
Paul I think he means 216 shaders vs 192 shaders .... 2 different types of GTX260. And the Buck stops here ... :) Well, I fell off the wagon and ordered this which rumor says has 240 shaders. Well, I am not sure I got the best deal, or the best card, but wading through the options got to the point of nonsense ... So, choice made, now we wait for delivery ... Even if it is not the best card, or best for the price, it is certainly better than what I have. It will give me 3 cards going, though the weakest one looks to take two days to do one task here. What the heck, it is work done with no extra cost ... The best news will be if the number of projects using CUDA rises rapidly ... When that happens it will be sad news for this project (at least as far as I am concerned as a participant) in that I will be redirecting almost all of my effort, though I will likely continue working for the project as a tier 3 or 4 project (my ranking system) ...
ID: 4883
Well, about the EVGA GeForce GTX 280 Video Card - Superclocked Edition I can say just 1 thing: you have 'overpaid'. You should have chosen, as everybody suggested, the GTX 260 Core 216. Half the price and just 10% slower.
ID: 4884
Well, about the EVGA GeForce GTX 280 Video Card - Superclocked Edition I can say just 1 thing: you have 'overpaid'. You should have chosen, as everybody suggested, the GTX 260 Core 216. Half the price and just 10% slower. Well, it is only money ... I will keep that in mind and maybe the next one will be as suggested ...
ID: 4890
Yes, the GTX 260² is an unofficial synonym for GTX 260 Core 216, which is a clumsy name. In this thread it was used by Kokomiko before, so I figured it'd be fine.
ID: 4896
And I don't think you overpaid that much, if any. The style of Tigerdirect is horrible, and to me it looks like you either pay $380 or $350 for the card, and this $450 number is just there to make people feel better when buying it.. without any connection to the price of the product in the last few months. My walk-out price was $409 with 2-day shipping and taxes, height and weight ... What the heck is a 260²? I did not see them, or am misunderstanding the reference. Perhaps I might keep an eye out and get one as my next upgrade and make my own personal test ...
ID: 4911
What the heck is a 260²? -> the GTX 260² is an unofficial synonym for GTX 260 Core 216 So it's a GTX 260 with an additional shader cluster: 216 shaders instead of 192 at the same clocks and with a similar memory configuration. That's 1/8th more raw horsepower, which is why we prefer it over the regular GTX 260. MrS ____________ Scanning for our furry friends since Jan 2002
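If you want to see where the 1/8th comes from, here's a quick Python sketch (assuming equal clocks and that raw throughput scales linearly with shader count, which GPU-Grid comes close to):

```python
# Back-of-the-envelope comparison: GTX 260 vs. GTX 260 Core 216.
# Assumes identical clocks and perfect scaling with shader count.
cards = {
    "GTX 260":          192,  # stream processors
    "GTX 260 Core 216": 216,
}
base = cards["GTX 260"]
for name, shaders in cards.items():
    print(f"{name}: {shaders} shaders -> {shaders / base:.3f}x")
# GTX 260 Core 216: 216 / 192 = 1.125x, i.e. 1/8th more raw horsepower
```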
ID: 4915
What the heck is a 260²? I will only answer once, not squared, if that is alright ... :) UPS has my package now ... estimated delivery on Tuesday by Tiger ... though I can hope they move it over the weekend and I get an early present ... And I found what you were suggesting ... well, I will have to consider this ... too early to buy one now ... I have the overpriced one on the way ...
ID: 4921
Wait for 1st quarter 2009 and the GTX 295 should hit the stores. It's two 280s sandwiched together and should sell for $500. Naturally, the 280 should drop in price by then?! http://www.tomshardware.com/reviews/geforce-gtx-295,2107.html
ID: 4933
Now you tell me ... :) No big deal ... fact of buying parts ... if I waited until then, the next card would be better and I should wait for it ... I kinda learned early on: buy a computer and then don't look at prices for at least 6 months (after you leave the price-challenge period)... otherwise you will find you can never buy anything because there is NEVER the right time ... something better, faster, cheaper is always just around the corner. Oh, and I have waited for the next release only to find it delayed for months ... not saying that will happen here ... just too easy to never get anything because of waiting for ....
ID: 4934
It's two 280s sandwiched together and should sell for $500. I hear it's going to be two 260s, which makes a lot of sense considering power consumption, cooling / noise and price. MrS ____________ Scanning for our furry friends since Jan 2002
ID: 4937
I've also read it's two 260s, but also two 280s. Here's the quote:
ID: 4939
Tell me about it! :) I figured this out almost two decades ago and lived through XTs, 286s, 386s, etc., and various peripherals, and Moore's law still holds true to some extent today! But I guess I keep my budget under control; I always try to sell my current hardware, stay ahead of the curve, and jump to the next gen before my current gen becomes obsolete. And I just might do that with my 6-month-old 280 to grab the 295...
ID: 4950
I still remember buying cache chips that cost more than the whole of my RAM ... And the 8087 co-processors ... My first computer was an Ohio Scientific based on the 6502 ... I couldn't afford an Apple at the time so I built mine on bare boards ... my introduction to Frys actually, I used to drive from San Diego to Anaheim to shop for parts there ... My cabinet was a wood dresser kit I modified to hold the boards ... Video was on a modified TV set when the wife got tired of me monopolizing the main set ... Frys did run an ad in today's paper, thankfully no card of note listed. Though my current 9800 GT is still showing for $120 ... UPS has my order in Kentucky ... if they put it on a plane today it is possible I could see the card Monday (not likely, but possible) ... sometimes I think they actually hold the boxes shortstopped to try to extort the higher shipping fees ... even though they COULD get it here Monday, I won't hold my breath ... For those not tuned in to the BOINC Dev mailing list, there have been a number of comments on Dr. Anderson's proposal to fix some of the issues with the fetch policy. I wish there were more ... alas ...
ID: 4954
Stop!! Stop!! You guys are doing two things:
ID: 4959
Stop!! Stop!! You guys are doing two things: Who better to date? I know exactly how to please myself ... :) Besides, I don't mind you knowing that I am over 21 and under 92 ... Want more? ... I remember where I was when President Kennedy was shot and the news came over the radio.
ID: 4961
While the debate here is focused on 'bang for the buck' in general, it is also important to note that many systems require some significant upgrades to run the GT200 series... especially since no stock power supply can handle any of them (not to mention that the various 9800 cards can also overload stock PSUs). For those stuck with stock equipment, I would suggest that the older 9600 GSO (with 96 shaders - 384 MB or 768 MB) might be the best at about 84 watts. If one's system is really under-powered, a significantly factory-OC'd 9500GT (at only 50 watts) might be the best. Of course, this assumes a single-card install (since SLI increases PSU requirements considerably).
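For anyone eyeballing whether a stock box can take one of these cards, here's a rough Python sketch of the budgeting (the 150 W system figure and the 80% derating are my assumptions; the TDPs are the ones mentioned in this thread, plus NVIDIA's 182 W spec for the GTX 260):

```python
# Rough PSU headroom check before dropping a GPU into a stock box.
# A sketch only -- real draw varies with rails, efficiency and load.
PSU_WATTS   = 250   # typical stock OEM supply
DERATE      = 0.80  # don't plan on sustaining more than ~80% of the label
SYSTEM_DRAW = 150   # assumed: CPU, board, drives under load

gpu_tdp = {"9500 GT": 50, "9600 GSO": 84, "9600 GT": 95, "GTX 260": 182}

budget = PSU_WATTS * DERATE
for card, tdp in gpu_tdp.items():
    verdict = "fits" if SYSTEM_DRAW + tdp <= budget else "too much"
    print(f"{card}: {SYSTEM_DRAW + tdp} W of {budget:.0f} W usable -> {verdict}")
```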
ID: 4963
My main interest being WUs per $$$ (no offense to our Euro-spending brethren). Can't cut and paste from the spreadsheet, so please forgive errors. All prices from pricegrabber as of today, 28-Dec-2008; time/WU from an earlier forum thread, and they match my experiences.
GTX 280 : 11k PPD, $350, 31.91 PPD/$
GTX 260 : 9.9k PPD, $220, 45.33 PPD/$
9800 GTX+ : 6.3k PPD, $200, 31.73 PPD/$
9800 GTX : 5.9k PPD, $160, 37.13 PPD/$
9800 GTS : 5.5k PPD, $150, 37.23 PPD/$
8800 GT : 4.8k PPD, $185, 26.02 PPD/$ (price is nuts, I bought mine for half that)
9600 GT : 3.9k PPD, $120, 33.24 PPD/$
9600 GSO : 3.7k PPD, $75, 50.31 PPD/$
So, if all one cares about is the incremental PPD for the incremental $, the 9600 GSO STILL reigns. IF you have a box with a spare PCIe slot that is already running BOINC, then the 9600 GSO is the best value per dollar. BUT, if you factor in the cost of the system ... the GTX 260 looks pretty good. I myself am waiting for the 55 nm 2xx GTXs to hit the street. I suspect the GTX 290 will be joined by a sub-$200 retail ($150 street) 55 nm double-precision card. At least I am hoping.
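If anyone wants to redo the ranking with current prices, here's a quick Python sketch (the PPD numbers and prices are the 28-Dec-2008 figures above, so recomputing gives slightly different ratios than the ones I typed; swap in your own):

```python
# Rank cards by credits-per-day per dollar, using the figures quoted above.
cards = [
    # (name, ppd, price_usd)
    ("GTX 280",   11_000, 350),
    ("GTX 260",    9_900, 220),
    ("9800 GTX+",  6_300, 200),
    ("9800 GTX",   5_900, 160),
    ("9800 GTS",   5_500, 150),
    ("8800 GT",    4_800, 185),
    ("9600 GT",    3_900, 120),
    ("9600 GSO",   3_700,  75),
]
for name, ppd, price in sorted(cards, key=lambda c: c[1] / c[2], reverse=True):
    print(f"{name:9s}  {ppd:6,d} PPD  ${price:3d}  {ppd / price:5.1f} PPD/$")
# The 9600 GSO tops the list at ~49 PPD/$, with the GTX 260 next at ~45.
```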
ID: 5029
Well, my card is in Oakland now (according to the UPS site) so it is looking good for Tuesday still (sadly) ... though it is still possible that Oakland will have it up to Sacrodemented in time for Monday (not holding my breath) ...
ID: 5031
Fractal,
ID: 5036
All prices from preistrend.de as of today, 28-Dec-2008; time/WU from an earlier forum thread, and they match my experiences. Only available cards are listed, to exclude "too good to be true" prices which you may never get because the cards are always out of stock.
ID: 5037
Fractal, Try Saudi! Many shops will easily demand 2x the price for any high-end and newly released stuff! The workaround I always take is a Visa card and DHL from the US. :P
ID: 5038
While the debate here is focused on 'bang for the buck' in general, it is also important to note that many systems require some significant upgrades to run the GT200 series... especially since no stock power supply can handle any of them (not to mention that the various 9800 cards can also overload stock PSUs). For those stuck with stock equipment, I would suggest that the older 9600 GSO (with 96 shaders - 384 MB or 768 MB) might be the best at about 84 watts. If one's system is really under-powered, a significantly factory-OC'd 9500GT (at only 50 watts) might be the best. Of course, this assumes a single-card install (since SLI increases PSU requirements considerably). I'd have to 2nd that. I bought myself a 9800GT card thinking all I had to do was put it in the box, but nooo. The power supply wasn't grunty enough and it only had SATA power connectors. Ended up replacing the power supply as well.
ID: 5039
Another benefit of slower cards with less power consumption is that they're easier to cool. The stock coolers are probably still too loud and the cards get hotter than necessary.. but any aftermarket cooler should handle them easily.
ID: 5040
Well, the card is here, plugged in ... and processing ... pull is now 450-some watts from the wall ... up from 290 with just the 9800 GPU ...
ID: 5054
I'm still wondering about this 450 watts. My self-built AMD Phenom 9950 BE (2.6 GHz stock frequency) takes 185 W in idle mode, 250 W crunching on 3 cores without GPU, and 330 W with GPU. So the GTX280 (stock frequencies 602/1296/1107) takes only 80 W more for GPUGrid crunching, a lot less than the data sheet would make you believe. I have built in an efficient PSU and 2 HDs (2 x Samsung HD403LJ). The case also has a 25 cm fan in the left cover, two 12 cm fans front and rear, and a 12 cm fan on the CPU cooler (Noctua). With my 8800GT the same system used 310 W, so the GTX280 uses only 20 W more.
ID: 5060
My rig runs a QX9650 and 2x XFX GTX280 XXXs, as well as a WD Raptor and 4x 1 TB Seagates in RAID5. The PSU is a PC Power and Cooling 1200 ESA. With both GPUs and 3 cores crunching my draw is around 520 watts. The GPUs run at about 82°C and the CPU at 56°C. I can complete a WU in about 4-6 hrs on each GPU.
ID: 5063
GPU power draw is much more differentiated than "it draws xxx watts". There are different "levels" of power draw specified, and the actual power strongly depends on the software.
ID: 5138
@Paul & Koko: these are interesting numbers, thanks! Even more interesting is that the draw is now 380 W from the wall ... note this is an i7 Asus MB and now only the GTX280 installed. The PSU is 680 W capable. I have not remeasured the Q9300 where I put the 9800 GT (where it shows really bad ghosting on the VGA cable connection; DVI looks better, but then I can't use the KVM I have to monitor the 4 Windows systems I am running here). Note that I do run 25/7 with BOINC at 99% load to allow me some room to get into and out of the system (the older version seemed to act better getting out of the way than the later versions for some reason; not sure what it is and too ill to spend much time trying to figure it out, and of course even if I did figure it out ... well, that is another story). Anyway, I post these numbers for some to consider on their systems. Both seem to not be running too warm (hand-wave test) ... the last time I looked ... let's see .. the GTX 280 is running at 76 °C, 32%, 621 MHz and 1134 MHz. I just upped the PSU in the Q9300 (to see if the ghosting was the fault of the PS) from a 400-something to a 500 W ... like I said, I did not check the draw on that system recently ... of course I suspended GPU Grid there because of the high CPU use from the 6.55 application. Maybe when the new app comes out and the CPU load goes back down I may restart GPU Grid there... New thought ... I wonder if the high pull of the 6.55 application is affecting overall pull because the CPU, though running full speed, is not heavily loaded ... hard to believe that it would be a major draw but I have seen odder things in my life ... food for thought anyway ...
ID: 5145
Note that I do run 25/7 *watching you enviously* ... of course I suspended GPU Grid there because of the high CPU use from the 6.55 Application. I wouldn't say "of course". As we concluded somewhere here in this thread, a 9800GT delivers almost 5k credits/day, with the recent drop due to the 1888-credit WUs maybe 4k. I don't know about your Q9300, but my Q6600@3GHz can certainly not make up for such an amount with just half a core ;) New thought ... I wonder if the high pull of the 6.55 application is affecting overall pull because the CPU though running full speed is not heavily loaded Yes, it does affect power draw. Although the load is 100%, it's less stressful than real calculations (GPU-Grid currently just polls the GPU). When I switched from 4+0 to 4+1 I saw a slight drop in my CPU temperature. MrS ____________ Scanning for our furry friends since Jan 2002
ID: 5150
I wouldn't say "of course". As we concluded somewhere here in this thread a 9800GT delivers almost 5k credits/day, with the recent drop due to the 1888 credit WUs maybe 4k. I don't know about your Q9300, but my Q6600@3GHz can certainly not make up for such an amount with just half a core If I were all about credits and credits alone ... well, then my project mix would be a lot different and I would of course (to steal your words) be running all GPUs 24/7... I cannot display my signature here because of the problem with the preference page I noted in the appropriate forum lo these many days ago ... but, as you can see were you to click through, I have a few projects undergoing work ... if you look at this you can see that I dump some work into almost all live projects going ... My only restriction is that I tend not to do pure Alpha test projects that will never get out of alpha state. That is a TEND .. as is obvious ... Anyway, my year's goals include getting the bulk of the production projects and a few Beta projects above the totals I built up for SaH when there were few projects and it seemed like the thing to do. Now, I have choices and I tend to lean towards projects that are actually doing SCIENCE ... not that the exploration of the universe and looking for others is not something that needs to be done, it does, I just don't think it deserves so much computing power ... just me ... SO, I have this long-term goal I have been working on for YEARS and have only managed 3 projects to this point ... so ... it is painful to surrender some power when I could be using it to do other things ... plenty of time to rev up GPU Grid when the next generation application comes out ... so ... WCG, Cosmology, LHC, ABC, Genetic Life, Malaria Control, SIMAP, Prime Grid, Mind Modeling, GPU Grid, AI are all targets to get above 388,000 CS ... or as many as possible ... besides, the 9800 only does one, at best two (of the short tasks) a day ... and with so much trouble getting them queued up ... I only have to babysit one system instead of two ... And, as I have said before, Bio stuff does not really lift my skirts ... something about cutting up frogs biased me forever ... so, I lean towards Physics as my PREFERENCE ... Not that I am not willing to support frog-killer types when that is the choice ...
ID: 5157
I guess I dropped out of SETI a long time ago for similar reasons as you did. I also like to support other projects, preferably from the realm of physics. And I also like to contribute as much as I can without spending too much. That means I want to make good use of my hardware, i.e. only machines with a 64-bit OS run ABC etc. And this means I'd always want to use that 400 GFlops coprocessor, even if that means losing 10 GFlops somewhere else. And about the babysitting.. well, if the GPU runs dry I wouldn't lose anything compared to not running GPU-Grid in the first place.
ID: 5160
I guess I dropped out of SETI a long time ago for similar reasons as you did. I also like to support other projects, preferably from the realm of physics. And I also like to contribute as much as I can without spending too much. That means I want to make good use of my hardware, i.e. only machines with a 64-bit OS run ABC etc. And this means I'd always want to use that 400 GFlops coprocessor, even if that means losing 10 GFlops somewhere else. And about the babysitting.. well, if the GPU runs dry I wouldn't lose anything compared to not running GPU-Grid in the first place. I knew you understood all along ... :) But it is nice to "talk" about it ... I kinda do want to keep that GPU busy ... but, it has been idle so long that, well, what does a couple more days mean ... :) As to the rest, well, home-bound as I am, I have to find something to amuse myself ... were there another project that used the GPU, be sure that it would be turning and burning as we speak ... and when that next project comes out, be assured that even the 8500 card I have will likely be mounted again until I can afford to go out and buy again ... :) That aside, my GTX 280 is more than pulling in the credits to make me smile ... Even better, I took a nap (another side effect of my illness is I seem to only get to sleep ~3 hours at a pop) and was down to my last task ... lo and behold, it had finished that one and had sneaked out another and was working on it when I woke. When I reported the one in work when I went to bed I got two more ... almost as if it was working right ... :)
ID: 5162
Sorry for a delayed response... have been stuck at the relatives' for the holidays with poor internet access and just got back home. Anyway, I'd generally agree with what you have said, but since I did say "really" under-powered systems I thought I'd give an example of mine that may make some sense. Specifically, I have a second system (a freebie from my father's company that upgraded to a quad-core) which is a woefully underpowered stock HP desktop (Athlon X2 3800+ with a 250 W PSU). I use it as a machine for my kids (educational software mostly) and to web browse, etc. when I can't get my wife off our main PC. Upgrading the PSU (or just about any other component) makes no sense given the "budget construction" of such a desktop. With the 250 W PSU there isn't much headroom, so the extra 30-35 watts that a 9600 GSO could use very well might exceed what such a system can handle. A similar scenario might also be evident with some of the media / home-theater PC builds. Anyway, I just installed the OC'd 9500GT and will report back with some run times (I'd estimate around 2 days based on the few other 32-shader cards I've noticed running here). As for the power of the OC, I think I have some evidence that suggests that shader clock may be more important than you are suggesting. Specifically, my 9600 GSO (96 shaders - GPUGRID-reported shader speed of 1674000) is flatly faster than an FX3700 Quadro that I got to test in December (112 shaders - GPUGRID-reported shader speed of 1242000), with about a 2-hour-per-workunit advantage. I wonder if anyone has been able to do comparisons on their cards at different shader speeds to see how much of a difference can be obtained? Last, the "new" 9600 GSO (with 512 MB rather than 384 MB or 768 MB) is actually a scaled-back 48-shader card. Performance should be just shy of the 9600GT. I would guess that the card is lower powered, but I have not seen any hard figures yet on the generalized wattage. If comparable to the power usage of the 9500GT, then this would of course nullify my arguments regarding the slower card.
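The shader-clock point checks out if you take shaders × shader clock as a crude throughput proxy. A quick Python sketch (the proxy is an assumption; it ignores memory bandwidth and everything else):

```python
# Crude throughput proxy: shader count x shader clock.
# Clocks in GHz, from the GPUGRID-reported speeds quoted above.
cards = {
    "9600 GSO (96 sh)":       (96,  1.674),
    "FX3700 Quadro (112 sh)": (112, 1.242),
}
for name, (shaders, ghz) in cards.items():
    print(f"{name}: {shaders * ghz:.1f} shader-GHz")
# 9600 GSO: 160.7 vs FX3700: 139.1 -- ~15% in the GSO's favour despite
# 16 fewer shaders, consistent with its ~2 h/WU advantage.
```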
ID: 5183
Anyway, I just installed the OC'd 9500GT and will report back with some run times (I'd estimate around 2 days based on the few other 32-shader cards I've noticed running here). I have a factory-OC'd 9500GT (700 core, 1750 shader, 800 memory <no OC>). If just that card works on a WU, it will take about 2 days 6 hours or so on the big 3232 WUs. The 24xx WUs take < 2 days. I do have two 8800GTs as well, so if I close BOINC they can take over working on the WU the 9500 was previously working on. This will lower the overall time of a single WU, but my numbers above were for a WU processed strictly with the 9500.
ID: 5193
11 hours into its first workunit with a bit over 27% done. That equates to about 40-41 hours for the full unit (I'd guess it is a 24xx one). It will be an interesting comparison to your card, since the only difference between the two (assuming, like me, that you are not heavily gaming, etc.) is the memory clock (mine is 700 core, 1750 shader, 2000 memory... all factory OC).
ID: 5195
Hi Scott,
ID: 5224
Thanks for the reply, MrS... these types of conversations are the ones that I really enjoy on the forum. :)
Not running any CPU projects is a good point for power savings. Still, it is those spikes that worry me most, especially on the stock power supply (I am really doubtful that HP in particular went with any real quality on the PS).
Unfortunately, even those educational games for the kids have some rather taxing video activity at times, so when they are on, the GPU wattage can spike fairly high. BTW, you keep referring to the 64-shader 9600GT, but I thought that the max wattage for it was 100-105?
ID: 5239
That sounds pretty much dead-on with my experience. My OC'd 9500GT is about 3.3x slower than my 8800GTs (based on several WUs completed solely with the 9500). I've also seen about 40-hour runtimes on the 18xx-credit WUs on the 9500GT, which equates to just over 12 hours on an 8800GT (just like I have experienced). This is good because it shows you can process two of these (small) WUs in 48 hours. However, while the 24xx WUs only took me a couple hours longer, if you get a 29xx or 3232-size WU, both will not be able to finish in 96 hours (4-day deadline). I advise you keep an eye on this and abort any that are nowhere close to finishing on time. If it will be within 6-8 hours or so late based on your estimates, then you might as well let it finish, because only a 260 or 280 card will receive it (after the 96-hour deadline passes) and be able to process / return that fast.
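A quick way to do that estimate from the progress bar; a Python sketch (the helper is hypothetical, and the 27%-after-11-hours example is the 9500GT figure quoted earlier in this thread):

```python
# Extrapolate total runtime from progress so far and compare to the deadline.
def makes_deadline(pct_done: float, hours_elapsed: float,
                   deadline_hours: float = 96.0) -> bool:
    est_total = hours_elapsed / (pct_done / 100.0)
    print(f"estimated total: {est_total:.0f} h of {deadline_hours:.0f} h")
    return est_total <= deadline_hours

# 27% done after 11 hours on the 9500GT -> ~41 h total, well inside 96 h:
print(makes_deadline(27.0, 11.0))
```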
ID: 5286
You may want to keep an eye on those, and if they get re-issued send a note to your wingman to monitor the situation so that they don't process it halfway through and have it canceled by the server ...
ID: 5295
I've read/heard that the memory clock does not make much difference, as you said. I have overclocked the memory slightly (4%) on my 8800GTs, but given the very small increase any benefit might not be apparent anyhow.
ID: 5298
Since the run times are nearly identical, it also looks like memory clock is irrelevant for the GPUGRID apps... Just a quick reply.. see here plus the next couple of posts. MrS ____________ Scanning for our furry friends since Jan 2002
ID: 5310
Well, after deciding a GTX 260 Core 216 would be my best bet (price-to-performance wise), a recent post on sale prices (thx JAMC) led to a rather hasty purchase of an overclocked EVGA GTX 280 (at only $211)...
ID: 5460
Hi Scott, Unfortunately, even those educational games for the kids have some rather taxing video activity at times, so when they are on the GPU wattage can spike fairly high. Ah, now that's a different point. Games have the bad habit of using 100% (of at least one core) and 100% GPU, no matter if they need them or not. So in that case you'd be drawing considerably more than during GPU-Grid crunching, especially if you're kind to the CPU. When I still had the stock cooler on my 9800GTX+ I could hear the air movement due to the fan when GPU-Grid was running, but when I launched 3DMark the fan started to scream.. so that's a different kind of load. BTW, you keep referring to the 64 shader 9600GT, but I thought that the max wattage for it was 100 - 105? Not sure where I got my numbers from; I think it was in the beginning of this thread. I'm using 9600GT / 9600GSO (old version) and 64 / 96 shaders almost synonymously for the faster cards. That goes along with "somewhat more than 80W". Taking a look here I find a TDP of 84 W for the 9600GSO and 95 W for the 9600GT. So, yes, the 9600GT draws a bit more (and is a bit slower) than the old 9600GSO, but not that much. MrS ____________ Scanning for our furry friends since Jan 2002
ID: 5464