Message boards : Graphics cards (GPUs) : Various GPU's Performance
Priapismo
Message 4816 - Posted: 24 Dec 2008 | 5:00:28 UTC
Last modified: 24 Dec 2008 | 5:00:55 UTC

I've been perusing the forums lately and haven't come across any posts comparing different GPUs' performance.

My main interest is WUs per $$$ (no offense to our Euro-spending brethren). Obviously card prices fluctuate, but a chart showing how different GPUs perform (at stock speeds as a minimum) would be useful for those looking at building or upgrading their box. One could combine it with current price data to calculate the most efficient use of the money.
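
For illustration, here is a minimal sketch (in Python) of the calculation I have in mind; the card names, ppd figures, and prices are placeholders, not measured data:

# Rank cards by credits-per-day (ppd) per dollar.
# All numbers below are hypothetical; plug in real benchmark
# results and current street prices.
cards = {
    "GTX 260":   (9900, 220.0),   # (credits/day, price in $)
    "9800 GTX+": (6300, 200.0),
    "9600 GSO":  (3700,  75.0),
}

for name, (ppd, price) in sorted(cards.items(),
                                 key=lambda kv: kv[1][0] / kv[1][1],
                                 reverse=True):
    print(f"{name:10s} {ppd:6d} ppd  ${price:6.2f}  {ppd / price:5.1f} ppd/$")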

Just a thought.

Paul D. Buck
Message 4817 - Posted: 24 Dec 2008 | 5:26:47 UTC

Well, I just built a new i7 and put into it an NVidia 9800 GT from PNY with 1 GB VRAM, and it takes about 6 to 9 hours to run one task. The difficulty is that the time reported (now) is the CPU time, not the GPU time it takes to process a task.

So, not sure how to answer the question for sure ... maybe someone else has several different GPU cards running and would be able to say. I will say that the PNY card was on special and was just over $100 at Frys ... I have not been able to find its like on the web, with most cards there having only 512 MB VRAM.

Also not sure if the extra VRAM helps or not ...

I may have more later, as I do have another computer with a GPU card that I have not started processing; I was one of those having trouble getting work, and I did not want to babysit two computers until it was working better. I have had one cycle where the system seems to have worked as it should, but I still need to wait a day or so to see if the trend continues or if I just had a streak of working once ...

My *OWN* personal take on this (at the moment) is to buy what you would normally buy to populate the system for normal usage.

If you want to maximize processing power, then by all means get the meanest, most expensive card going.

If you only have one slot, get a better card with the most memory and speed you can afford.

If you have several slots (like I do, the MB can host up to 3 GPU cards) ... well, not at all sure what I am going to do... :)

For me, the project selection is still too thin to interest me that much, in that I "like" projects in the realm of physics (MWay, Cosmology, Einstein, LHC, etc.) and none of them are GPU capable yet. HOWEVER, I may be watching the ads and getting new GPU cards even if they are not the absolute best, and for the moment putting them on GPU Grid. I can run up my score here while filling out my dance card (as it were), and when more GPU-capable projects arrive I will be ready ... :)

And Frys puts some very nice cards on sale ... :)

ExtraTerrestrial Apes
Message 4828 - Posted: 24 Dec 2008 | 15:52:33 UTC

We did have a comparison prior to the speed-up for GT200-based cards, so it's no longer valid. The key points were:

- the amount of vid mem does not influence speed (256 MB could hinder at some point in the future, though)
- core, shader and memory speed all help GPU-Grid speed
- 8800GTS 512 was about the sweet spot in price/performance
- but there's another argument.. would you spend e.g. 30€ more to get 500 more credits per day, even if this makes price/performance a bit worse? e.g. upgrade to a 9800GTX+? I certainly would, because I only have a limited number of systems I can stuff GPUs into.
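
To spell that trade-off out, a rough sketch; the 30€ and 500 credits/day figures are just the example numbers above:

# Marginal cost of stepping up to the faster card.
extra_cost    = 30.0    # EUR more for the upgrade (example figure)
extra_credits = 500.0   # extra credits per day it buys (example figure)

# Cost per additional 1000 credits/day of sustained output:
print(extra_cost / (extra_credits / 1000.0), "EUR per 1000 credits/day")
# Spread over a year of crunching:
print(extra_cost / (extra_credits * 365.0), "EUR per extra credit")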

Now we have an app which improves performance on GT200 a lot, whereas on G92 (and similar) it remained the same. Now a GTX 260 should be the most effective card to buy. I'd choose the Core 216, which is almost twice as fast as my slightly OC'ed 9800GTX+, which represents the high-end G92s.

I'd prefer a single GT200 card over 2 or 3 smaller G92-based cards because of:
- future-proofing: higher CUDA capability (64 bit, etc.) & more memory
- unnoticeable delay in interactive use; you can even game while crunching
- better power efficiency than several slower cards
- only occupying one PCIe slot

I'm not totally sure how GTX 280 fits in here. It should have a worse price/performance ratio, the entry price is a bit steep and power consumption is higher. Still wouldn't be a bad buy, though.

MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Message 4835 - Posted: 25 Dec 2008 | 0:00:02 UTC

I won't be able to add much here for a month or two because I only have the two cards to run on, and one of them is still idle ... though it is looking like I am able to fetch and run work normally ... just got another task while taking a nap, and I did not do a thing!

But I imagine I will be able to afford a couple of cards in the not-too-distant future, and then maybe I could run a small comparison, though it would be a very limited test.

Though maybe the post-Christmas sales???

ExtraTerrestrial Apes
Message 4858 - Posted: 25 Dec 2008 | 22:16:30 UTC - in response to Message 4835.

Hey Paul,

there's no need for you to buy several cards just for test purposes. That would mean you'd have to get some with a non-ideal price/performance ratio ;)
I think the data which we have is clear enough (GT200 rules).

MrS
____________
Scanning for our furry friends since Jan 2002

mike047
Message 4859 - Posted: 25 Dec 2008 | 22:51:37 UTC - in response to Message 4858.

Hey Paul,

there's no need for you to buy several cards just for test purposes. That would mean you'd have to get some with a non-ideal price/performance ratio ;)
I think the data which we have is clear enough (GT200 rules).

MrS


But some of us "poor" old guys have to make do with a 9600 :D

mike

Paul D. Buck
Message 4864 - Posted: 26 Dec 2008 | 8:27:32 UTC

Hmmm, GT200 rules ... well I can find GTX280, GTX260 ... but no GT200 ... unless you mean the GT200 series ...

It is always a question of price, performance, availability, what I have already, what the machines can take for installs, etc.

So, the i7 box can hold 3 cards, but I did not anticipate this, so I only bought a 680 W power supply. Not sure it could take a suite of 3 GTX 280s without sucking the wall dry. Besides, that may generate too much heat for the box.

For the moment I was just thinking to buy one new card, drop it into the box, let it rip, and keep the slower/older 9800 GT running as a bonus ...

At the moment the question is moot in that I am waiting to see what the Frys ad has to offer (if anything). And sadly, I am not sure that at this moment I want to spend $500 to support basically one project that I am not really all that interested in ...

And if I buy cards over time, I am still going to get a cross-section of power. For example, if I get mid-high end now, the same price point 6 months from now will likely buy higher capability, even if only a little. For the nonce, assume one per quarter and my last new system (for a while) in 6 months; by then I will have 4 GPU cards, plus a new system with a GPU card making 5, with slots for 7 ... so, not wanting to waste anything, I will be using the two cards, and whatever I buy, for some time ... it will only be at the end of next year that I start to fill all slots and a new buy will bump off the slowest card.

And if I buy them on sale, well, who can say what I will get. :)

[BOINC@Poland]AiDec
Message 4866 - Posted: 26 Dec 2008 | 9:12:33 UTC

That's easy:

Got 'more' money - buy a GTX 260 Core 216
Got 'less' money - buy a 9800GTX+

And I absolutely agree with ExtraTerrestrial Apes.

Kokomiko
Message 4868 - Posted: 26 Dec 2008 | 11:58:33 UTC

I have one GTX 280 running on Vista 64. The card is too fast for 4+1 on Windows; I have to run 3+1 or I lose over 30% of the performance. Only good for gaming. This card would do better crunching in a Linux machine, but it's in my gaming box.

One GTX260² is running on Linux 64, it's running fine with 4+1.

One GTX260² is running on XP 32, it's running fine with 4+1 (Account Cebion, my wife).

One GTX260² is running on Vista 64, it's running fine with 4+1.

One 8800GT is running on Vista 64, it's running fine with 4+1.

One 8800GT is running on XP 32, it's running fine with 4+1.

The 8800GT is similar to the 9800GT.

The best choice was to buy the GTX260² cards. I will never buy a GTX280 again; too dear and not really any advantage here.

ExtraTerrestrial Apes
Message 4872 - Posted: 26 Dec 2008 | 12:56:18 UTC

Hi Paul,

sorry, by "GT200" I mean the chip which is used for GTX 260 / 260² / 280.. maybe should have made that more clear.

And sadly, I am not sure that at this moment I want to spend $500 to support basically one project that I am not really all that interested in ...


Hey, that's absolutely fine! Everyone contributes as much as he / she wants to. And the original idea of DC is to use spare resources, not to buy hardware for DC. So nobody is going to get mad at you for only buying 1 new card instead of 3 ;)

And generally, what you're talking about (in your last post) is how to best use your systems. That's fine for you, but what this thread is about is "if I want to spend money on GPU-power, what's the best way to do it now?" It's clear that in 6 months cards will be better and/or cheaper.. that's always the case and doesn't help much if I want to buy now.

BTW, your current plan of buying 1 fast card and keeping the 9800GT makes a lot of sense. I guess I'll take that route as well shortly.. GTX 260², I'm watching you!

Regards,
MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Message 4875 - Posted: 26 Dec 2008 | 13:16:12 UTC - in response to Message 4872.
Last modified: 26 Dec 2008 | 13:17:51 UTC

Hi Paul,

sorry, by "GT200" I mean the chip which is used for GTX 260 / 260² / 280.. maybe should have made that more clear.

And sadly, I am not sure that at this moment I want to spend $500 to support basically one project that I am not really all that interested in ...


Hey, that's absolutely fine! Everyone contributes as much as he / she wants to. And the original idea of DC is to use spare resources, not to buy hardware for DC. So nobody is going to get mad at you for only buying 1 new card instead of 3 ;)

And generally, what you're talking about (in your last post) is how to best use your systems. That's fine for you, but what this thread is about is "if I want to spend money on GPU-power, what's the best way to do it now?" It's clear that in 6 months cards will be better and/or cheaper.. that's always the case and doesn't help much if I want to buy now.

BTW, your current plan of buying 1 fast card and keeping the 9800GT makes a lot of sense. I guess I'll take that route as well shortly.. GTX 260², I'm watching you!

Regards,
MrS


Got today's paper ... no Frys ad ... sigh ...

Anyway, I kinda thought that was what you meant, the GPU chip. I was looking on Tiger Direct just now and what a mess. There are some 20 versions of the 280 card from 4 or 5 "manufacturers" ...

Yes, the intent of BOINC was to use idle CPU time, as was SaH Classic's, but it was mostly used by hard-core, anal-retentive, autistic people like me ... :)

Well, maybe most people are not autistic ... but, sadly, getting "real" people to use BOINC has been, I think, an abject failure. In the past I have laid out my opinion of some of the reasons why, and was greeted for those opinions as warmly as, um, well ... I had better not say, so as not to cause offense ...

But, by and large, the reason is that the BOINC *SYSTEM* as a whole is user hostile. Efforts like my past documentation effort founder on the shoals of project indifference, lack of support (or outright hostility and sabotage), and continual roadblocks. I was looking at BOINC View last night and I see that it has not had an update in over a year; the originator has vanished, even though the site is still up and the old version is still available. But GPU processing and other incremental changes are slowly making BV unusable.

But the core of BOINC is done by hard-timers like me ... you can see that in the position percentages ... run any project for a week, even on only one computer of almost any speed, and at the end of the week you will have done more work than 50% or more of the people that joined that project. Heck, I joined this project just a week ago, don't have all that special a card, and I am at ~30K CS and 42% ...

WEP-M+2 I am at 4,497 CS and 70%!!!! as another example.

I do almost no SaH anymore and I am still above 98% ...

But I digress...

Anyhow, I am still dithering about that card .... :)

{edit}What is this 260² thing you are watching? Dual SLI?{/edit}

Jayargh
Message 4877 - Posted: 26 Dec 2008 | 15:23:23 UTC

Paul, I think he means 216 shaders vs 192 shaders ... 2 different types of GTX 260.

I believe the 260-216 is the best bang for the "Buck" atm hehe

Paul D. Buck
Message 4883 - Posted: 26 Dec 2008 | 17:08:12 UTC - in response to Message 4877.

Paul, I think he means 216 shaders vs 192 shaders ... 2 different types of GTX 260.

I believe the 260-216 is the best bang for the "Buck" atm hehe


And the Buck stops here ... :)

Well, I fell off the wagon and ordered this, which rumor says has 240 shaders. I am not sure I got the best deal, or the best card, but wading through the options got to the point of nonsense ...

So, choice made, now we wait for delivery ...

Even if it is not the best card, or the best for the price, it is certainly better than what I have. It will give me 3 cards going, though the weakest one looks to take two days to do one task here. What the heck, it is work done at no extra cost ...

The best news will be if the number of projects using CUDA rises rapidly ...

When that happens it will be sad news for this project (at least as far as I am concerned as a participant) in that I will be redirecting almost all of my effort, though I will likely continue working for the project as a tier 3 or 4 project (my ranking system) ...

[BOINC@Poland]AiDec
Message 4884 - Posted: 26 Dec 2008 | 17:54:11 UTC

Well, about the EVGA GeForce GTX 280 Superclocked Edition I can say just one thing: you have overpaid. You should have chosen, as everybody suggested, the GTX 260 Core 216. Half the price and just 10% slower.

Paul D. Buck
Message 4890 - Posted: 26 Dec 2008 | 21:52:45 UTC - in response to Message 4884.

Well, about the EVGA GeForce GTX 280 Superclocked Edition I can say just one thing: you have overpaid. You should have chosen, as everybody suggested, the GTX 260 Core 216. Half the price and just 10% slower.

Well, it is only money ...

I will keep that in mind, and maybe the next one will be as suggested ...

ExtraTerrestrial Apes
Message 4896 - Posted: 26 Dec 2008 | 22:56:28 UTC

Yes, the GTX 260² is an unofficial synonym for GTX 260 Core 216, which is a clumsy name. In this thread it was used by Kokomiko before, so I figured it'd be fine.

And I don't think you overpaid that much, if at all. The Tigerdirect listing style is horrible; it looks to me like you either pay $380 or $350 for the card, and the $450 number is just there to make people feel better when buying.. without any connection to the price of the product in the last few months.
I suppose you won't get a 260² for $175? Otherwise I'd have to start crying now, because ours are still at 250+€..

MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Message 4911 - Posted: 27 Dec 2008 | 0:41:59 UTC - in response to Message 4896.

And I don't think you overpaid that much, if at all. The Tigerdirect listing style is horrible; it looks to me like you either pay $380 or $350 for the card, and the $450 number is just there to make people feel better when buying.. without any connection to the price of the product in the last few months.
I suppose you won't get a 260² for $175? Otherwise I'd have to start crying now, because ours are still at 250+€..
MrS


My walk-out price was $409 with two-day shipping and taxes, height and weight ...

What the heck is a 260²?

I did not see them, or am misunderstanding the reference.

Perhaps I might keep an eye out and get one as my next upgrade and run my own personal test ...

ExtraTerrestrial Apes
Message 4915 - Posted: 27 Dec 2008 | 1:03:27 UTC - in response to Message 4911.

What the heck is a 260²?


->

the GTX 260² is an unofficial synonym for GTX 260 Core 216


So it's a GTX 260 with an additional shader cluster: 216 shaders instead of 192 at the same clocks and with a similar memory configuration. That's 1/8 more raw horsepower, which is why we prefer it over the regular GTX 260.

MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Message 4921 - Posted: 27 Dec 2008 | 6:42:18 UTC - in response to Message 4916.

What the heck is a 260²?


->

the GTX 260² is an unofficial synonym for GTX 260 Core 216


So it's a GTX 260 with an additional shader cluster: 216 shaders instead of 192 at the same clocks and with a similar memory configuration. That's 1/8 more raw horsepower, which is why we prefer it over the regular GTX 260.

MrS

I will only answer once, not squared, if that is alright ... :)

UPS has my package now ... estimated delivery on Tuesday per Tiger ... though I can hope they move it over the weekend and I get an early present ...

And I found what you were suggesting ... well, I will have to consider this ... too early to buy one now ... I have the overpriced one on the way ...

Thamir Ghaslan
Message 4933 - Posted: 27 Dec 2008 | 11:58:30 UTC - in response to Message 4828.
Last modified: 27 Dec 2008 | 12:02:50 UTC



I'm not totally sure how GTX 280 fits in here. It should have a worse price/performance ratio, the entry price is a bit steep and power consumption is higher. Still wouldn't be a bad buy, though.

MrS


Wait for the 1st quarter of 2009, and the GTX 295 should hit the stores.

It's two 280s sandwiched together and should sell for $500.

Naturally, the 280 should drop in price by then?!



http://www.tomshardware.com/reviews/geforce-gtx-295,2107.html

Paul D. Buck
Message 4934 - Posted: 27 Dec 2008 | 12:19:48 UTC - in response to Message 4933.



I'm not totally sure how GTX 280 fits in here. It should have a worse price/performance ratio, the entry price is a bit steep and power consumption is higher. Still wouldn't be a bad buy, though.

MrS


Wait for the 1st quarter of 2009, and the GTX 295 should hit the stores.

It's two 280s sandwiched together and should sell for $500.

Naturally, the 280 should drop in price by then?!



http://www.tomshardware.com/reviews/geforce-gtx-295,2107.html


Now you tell me ... :)

No big deal ... fact of buying parts ... if I waited until then, the next card would be better and I should wait for it ...

I kinda learned early on: buy a computer and then don't look at prices for at least 6 months (after you leave the price challenge period) ... otherwise you will find you can never buy anything, because there is NEVER the right time ... something better, faster, cheaper is always just around the corner. Oh, and I have waited for the next release only to find it delayed for months ... not saying that will happen here ... it is just too easy to never get anything because of waiting for ....

ExtraTerrestrial Apes
Message 4937 - Posted: 27 Dec 2008 | 13:30:32 UTC - in response to Message 4933.

It's two 280s sandwiched together and should sell for $500.


I hear it's going to be two 260s, which makes a lot of sense considering power consumption, cooling/noise, and price.

MrS
____________
Scanning for our furry friends since Jan 2002

Nognlite
Message 4939 - Posted: 27 Dec 2008 | 13:46:22 UTC - in response to Message 4933.

I've read it's two 260's in some places, and two 280's in others. Here's the quote:

"As we briefly mentioned, the GPUs strapped on to this beast aren't your stock GTX 260 or GTX 280 parts. These chips are something like a GTX 280 with one memory channel disabled running at GTX 260 clock speeds. I suppose you could also look at them as GTX 260 ICs with all 10 TPCs enabled. Either way, you end up with something that has higher shader performance than a GTX 260 and lower memory bandwidth and fillrate (remember that ROPs are tied to memory channels, so this new part only has 28 rops instead of 32) than a GTX 280. This is a hybrid part."

Here's the URL: GTX295

Pat

Thamir Ghaslan
Message 4950 - Posted: 27 Dec 2008 | 17:04:04 UTC - in response to Message 4934.



I kinda learned early on: buy a computer and then don't look at prices for at least 6 months (after you leave the price challenge period) ... otherwise you will find you can never buy anything, because there is NEVER the right time ... something better, faster, cheaper is always just around the corner. Oh, and I have waited for the next release only to find it delayed for months ... not saying that will happen here ... it is just too easy to never get anything because of waiting for ....


Tell me about it! :)

I figured this out almost two decades ago, having lived through XTs, 286s, 386s, etc., and various peripherals, and Moore's law still holds true to some extent today!

But I guess I keep my budget under control; I always try to sell my current hardware and stay ahead of the curve, jumping to the next gen before my current gen becomes obsolete. And I just might do that with my 6-month-old 280 to grab the 295 ...

Paul D. Buck
Message 4954 - Posted: 27 Dec 2008 | 18:02:38 UTC - in response to Message 4950.



I kinda learned early on: buy a computer and then don't look at prices for at least 6 months (after you leave the price challenge period) ... otherwise you will find you can never buy anything, because there is NEVER the right time ... something better, faster, cheaper is always just around the corner. Oh, and I have waited for the next release only to find it delayed for months ... not saying that will happen here ... it is just too easy to never get anything because of waiting for ....


Tell me about it! :)

I figured this out almost two decades ago, having lived through XTs, 286s, 386s, etc., and various peripherals, and Moore's law still holds true to some extent today!

But I guess I keep my budget under control; I always try to sell my current hardware and stay ahead of the curve, jumping to the next gen before my current gen becomes obsolete. And I just might do that with my 6-month-old 280 to grab the 295 ...




I still remember buying cache chips that cost more than the whole of my RAM ... And the 8087 co-processors ...

My first computer was an Ohio Scientific based on the 6502 ... I couldn't afford an Apple at the time, so I built mine on bare boards ... my introduction to Frys, actually; I used to drive from San Diego to Anaheim to shop for parts there ... My cabinet was a wood dresser kit I modified to hold the boards ... Video was on a modified TV set, once the wife got tired of me monopolizing the main set ...

Frys did run an ad in today's paper; thankfully no card of note listed. Though my current 9800 GT is still showing for $120 ...

UPS has my order in Kentucky ... if they put it on a plane today it is possible I could see the card Monday (not likely, but possible) ... sometimes I think they actually hold the boxes shortstopped to try to extort the higher shipping fees ... even though they COULD get it here Monday, I won't hold my breath ...

For those not tuned in to the BOINC Dev mailing list, there have been a number of comments on Dr. Anderson's proposal to fix some of the issues with the fetch policy. I wish there were more ... alas ...

Nognlite
Message 4959 - Posted: 27 Dec 2008 | 18:58:40 UTC

Stop!! Stop!! You guys are doing two things:

Dating yourselves (age not orientation)

and bringing back old memories. ;P

Pat

Paul D. Buck
Message 4961 - Posted: 27 Dec 2008 | 19:09:12 UTC - in response to Message 4959.

Stop!! Stop!! You guys are doing two things:

Dating yourselves (age not orientation)

and bringing back old memories. ;P

Pat


Who better to date? I know exactly how to please myself ... :)

Besides, I don't mind you knowing that I am over 21 and under 92 ...

Want more ... I remember where I was when President Kennedy was shot and the news came over the radio.

Scott Brown
Message 4963 - Posted: 27 Dec 2008 | 20:02:25 UTC

While the debate here is focused on 'bang for the buck' in general, it is also important to note that many systems require significant upgrades to run the GT200 series ... especially since no stock power supply can handle any of them (not to mention that the various 9800 cards can also overload stock PSUs). For those stuck with stock equipment, I would suggest that the older 9600 GSO (with 96 shaders, 384 MB or 768 MB) might be the best at about 84 watts. If one's system is really under-powered, a significantly factory-OC'd 9500GT (at only 50 watts) might be the best. Of course, this assumes a single-card install (since SLI increases PSU requirements considerably).

fractal
Message 5029 - Posted: 29 Dec 2008 | 4:35:57 UTC - in response to Message 4816.
Last modified: 29 Dec 2008 | 4:37:39 UTC

My main interest is WUs per $$$ (no offense to our Euro-spending brethren).

I can't cut and paste from my spreadsheet, so please forgive errors. All prices from Pricegrabber as of today, 28 Dec 2008. Time per WU is from an earlier forum thread, and matches my experience.

280gtx : 11k ppd, $350, 31.4 ppd/$
260gtx : 9.9k ppd, $220, 45.0 ppd/$
9800gtx+ : 6.3k ppd, $200, 31.5 ppd/$
9800gtx : 5.9k ppd, $160, 36.9 ppd/$
9800gts : 5.5k ppd, $150, 36.7 ppd/$
8800gt : 4.8k ppd, $185, 25.9 ppd/$ (price is nuts, I bought mine for half that)
9600gt : 3.9k ppd, $120, 32.5 ppd/$
9600gso : 3.7k ppd, $75, 49.3 ppd/$

So, if all one cares about is the incremental PPD for the incremental $, the 9600gso STILL reigns. If you have a box with a spare PCIe slot that is already running BOINC, then the 9600gso is the best value per dollar.

BUT, if you factor in the cost of the system ... the 260gtx looks pretty good.

I myself am waiting for the 55 nm GTX 2xx's to hit the street. I suspect the 290gtx will be joined by a sub-$200 retail ($150 street) 55 nm double-precision card. At least I am hoping.
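
To make the "factor in the cost of the system" point concrete, a minimal sketch assuming a hypothetical $400 host with one free slot (all numbers illustrative):

# ppd per dollar with and without the host system's cost included.
system_cost = 400.0   # hypothetical price of a cheap one-slot host

for name, ppd, price in (("260gtx", 9900, 220.0), ("9600gso", 3700, 75.0)):
    alone   = ppd / price                  # card added to an existing box
    bundled = ppd / (price + system_cost)  # card plus a new host
    print(f"{name}: {alone:.1f} ppd/$ alone, {bundled:.1f} ppd/$ with host")

# -> the 9600gso wins as an add-in card, but once the host must be
#    bought too, the faster 260gtx pulls ahead in total value.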

Paul D. Buck
Message 5031 - Posted: 29 Dec 2008 | 7:28:53 UTC

Well, my card is in Oakland now (according to the UPS site), so it is looking good for Tuesday still (sadly) ... though it is still possible that Oakland will have it up to Sacrodemented in time for Monday (not holding my breath) ...

As for the dual GTX, well, I will just wait to see if it is on the shelf when I build my next system. I suspect that I will top that one out, as I likely will not replace my first newest one (W1), a quad (Q9300) ...

So I will suffer with that and the i7, Mac Pro and the two Dell Dual Xeons I have now ...

When the new card arrives I am not sure how I will arrange things ... at the moment I am leaning towards putting it into the i7 and leaving the 9800 GT there too ... the 8500 (or 8800, I forget) is too wimpy for GPU Grid, so I may use it for SaH or the next GPU project that comes along and see how it fares there ...

I have 79 hours on the clock on the one task I have on it, with 16 hours to go ... unless it is a "hanger" like the one I just finished on the i7 that took more than the expected amount of time to finish. It looks like I am going to blow the deadline; sadly, I guess I will have to look to see if it gets issued to someone else and send them a warning if so ...

ExtraTerrestrial Apes
Message 5036 - Posted: 29 Dec 2008 | 10:30:49 UTC - in response to Message 5029.

Fractal,

your performance data looks right and already takes the GT200 speed-up into account. The prices look a bit strange, but so what.. everyone can check against their local prices. Compared to Germany, all cards except the 9600GSO and GTX 260/280 look too expensive, whereas the GTX 260 is cheaper. Guess they know we like to buy the good stuff.. ;)

And is your 9800GTS the 8800GTS 512?

MrS
____________
Scanning for our furry friends since Jan 2002

ExtraTerrestrial Apes
Message 5037 - Posted: 29 Dec 2008 | 10:54:02 UTC - in response to Message 5029.

All prices from preistrend.de as of today, 28 Dec 2008. Time per WU is from an earlier forum thread, and matches my experience. Only cards in stock are listed, to exclude "too good to be true" prices which you may never get because the cards are always out of stock.

GTX 280 : 11k ppd, 330€, 33.3 ppd/€
GTX 260 216 : >9.9k ppd, 230€, >45 ppd/€
GTX 260 192 : 9.9k ppd, 220€, 45.0 ppd/€
9800gtx+ : 6.3k ppd, 140€, 45.0 ppd/€
9800gtx : 5.9k ppd, 105€, 56.2 ppd/€
8800gt : 4.8k ppd, 100€, 48.0 ppd/€
9600gt : 3.9k ppd, 85€, 45.9 ppd/€
9600gso : 3.7k ppd, 65€, 56.9 ppd/€

Interesting how different these prices are! Either the GF 9 series is already being phased out in the US, or you're being ripped off for them. The GTX 260 Core 216 is still king, although the 230€ price is an exception (but available right now) .. others start at 250€.

MrS
____________
Scanning for our furry friends since Jan 2002

Thamir Ghaslan
Message 5038 - Posted: 29 Dec 2008 | 11:22:01 UTC - in response to Message 5036.

Fractal,

Compared to Germany, all cards except the 9600GSO and GTX 260/280 look too expensive, whereas the GTX 260 is cheaper. Guess they know we like to buy the good stuff.. ;)

MrS


Try Saudi! Many shops will easily demand 2x the price for any high-end or newly released stuff! The workaround I always use is a Visa card and DHL from the US. :P

MarkJ
Message 5039 - Posted: 29 Dec 2008 | 12:12:43 UTC - in response to Message 4963.

While the debate here is focused on 'bang for the buck' in general, it is also important to note that many systems require significant upgrades to run the GT200 series ... especially since no stock power supply can handle any of them (not to mention that the various 9800 cards can also overload stock PSUs). For those stuck with stock equipment, I would suggest that the older 9600 GSO (with 96 shaders, 384 MB or 768 MB) might be the best at about 84 watts. If one's system is really under-powered, a significantly factory-OC'd 9500GT (at only 50 watts) might be the best. Of course, this assumes a single-card install (since SLI increases PSU requirements considerably).


I'd have to second that. I bought myself a 9800GT card thinking all I had to do was put it in the box, but nooo. The power supply wasn't grunty enough and only had SATA power connectors. Ended up replacing the power supply as well.

ExtraTerrestrial Apes
Message 5040 - Posted: 29 Dec 2008 | 12:54:31 UTC

Another benefit of slower cards with lower power consumption is that they're easier to cool. The stock coolers are probably still too loud and the cards get hotter than necessary.. but any aftermarket cooler should handle them easily.

An OC'ed 9500GT is a really bad option, though! With 32 shaders even overclocking is not going to help it that much. 9600GT with 64 shaders at 1.7/1.8 GHz already needs one day per WU. I'd rather OC a 9600GSO than get a 9500GT. If the PS can't handle the additional 30 W it probably wouldn't last long with a 9500GT either.

MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Message 5054 - Posted: 29 Dec 2008 | 21:48:55 UTC

Well, the card is here, plugged in ... and processing ... pull is now 450-some watts from the wall ... up from 290 with just the 9800 GPU ...

Now to see how it performs doing work ...

I may still move the 9800 to another machine, but, for the moment I will see how it goes ...

Kokomiko
Message 5060 - Posted: 30 Dec 2008 | 1:48:00 UTC

I'm still wondering about those 450 watts. My self-built AMD Phenom 9950 BE (2.6 GHz stock frequency) takes 185 watts in idle mode, 250 watts crunching on 3 cores without the GPU, and 330 watts with the GPU. So the GTX280 (stock frequencies 602/1296/1107) takes only 80 watts more for GPUGrid crunching, a lot less than the data sheet would have you believe. I have an efficient PSU built in and 2 HDs inside (2 x Samsung HD403LJ). The case also has a 25 cm fan in the left cover, 12 cm fans front and rear, and a 12 cm fan on the CPU cooler (Noctua). With my 8800GT the same system used 310 watts. So the GTX280 uses only 40 watts more.

Nognlite
Message 5063 - Posted: 30 Dec 2008 | 2:32:23 UTC - in response to Message 5060.

My rig runs a QX9650 and 2 x XFX GTX280 XXX's, as well as a WD Raptor and 4 x 1 TB Seagates in RAID5. The PSU is a PC Power and Cooling 1200 ESA. With both GPUs and 3 cores crunching, my draw is around 520 watts. The GPUs run at about 82°C and the CPU at 56°C. I can complete a WU in about 4-6 hrs on each GPU.

If I were to believe the specs, just my GPUs alone at 100% should be 500 watts or over, so I think the specs are high.

Pat

ExtraTerrestrial Apes
Message 5138 - Posted: 1 Jan 2009 | 15:02:36 UTC

GPU power draw is much more nuanced than "it draws xxx watts". There are different "levels" of power draw specified, and the actual power strongly depends on the software.

- nVidia specifies maximum power draw, about 230 W for a GTX 280 .. this is what a "thermal virus" could achieve: software which uses as many transistors at the same time as possible. The heat sinks have to be able to deal with such a load.

- there's a typical power draw under load, e.g. in games. That's about 170 W for a GTX 280, depending on the load (resolution, FSAA, detail settings etc.). That's what the user usually has to deal with.

- GPU-Grid does not use parts of the GPU which games do use (e.g. texture units), therefore it draws less power than games.. how much less is open to debate (or measurement).

- there is also "instantaneous power draw", which could be higher than the maximum specified in point (1), but which can only be sustained over ns or micro seconds. The power circuitry has to be designed to buffer such load.

@Paul & Koko: these are interesting numbers, thanks!

Koko, you say your system drew 310 W with the 8800GT and 330 W with the GTX 280, yet call that 40 W more. Did you mean only 20 W more, or 350 W?

Anyway, from your numbers we see that loading the GTX 280 with GPU-Grid adds 330 - 250 = 80 W (or 350 - 250 = 100 W) to the system's draw. That's already influenced by power supply efficiency. From Paul's numbers we see that adding a GTX 280 increased load power draw by 160 W (= ~130 W for the card at 80% PSU efficiency). From the measurements at XBit Labs I would have expected a lower idle power draw, but these numbers are not directly comparable anyway, because we don't know the exact power supply efficiencies. But they're good guidelines to get a feeling for the numbers.
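
For reference, the arithmetic behind those estimates as a minimal sketch; the 0.80 efficiency is an assumed figure, and real values vary with load:

# Estimate a card's DC power draw from wall-socket measurements.
def card_power_w(wall_without, wall_with, psu_efficiency=0.80):
    # The wall-measured delta includes PSU conversion losses,
    # so scale it down by the assumed efficiency.
    return (wall_with - wall_without) * psu_efficiency

# Paul's i7: 290 W before adding the GTX 280, 450 W with it crunching:
print(card_power_w(290, 450))   # 160 W at the wall -> ~128 W for the card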

MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Message 5145 - Posted: 1 Jan 2009 | 16:42:11 UTC - in response to Message 5138.

@Paul & Koko: these are interesting numbers, thanks!


Even more interesting is that the draw is now 380 W from the wall ... note this is an i7 on an Asus MB with now only the GTX280 installed. The PSU is rated 680 W.

I have not re-measured the Q9300 where I put the 9800 GT (where it shows really bad ghosting over the VGA cable connection; DVI looks better, but then I can't use the KVM I have to monitor the 4 Windows systems I am running here).

Note that I do run 25/7, with BOINC at 99% load to allow me some room to get into and out of the system. (The older version seemed to act better about getting out of the way than the later versions; not sure why, and I am too ill to spend much time trying to figure it out. Of course, even if I did figure it out ... well, that is another story.)

Anyway, I post these numbers for others to consider on their systems. Both seem to not be running too warm (hand-wave test) ... the last time I looked ... let's see .. the GTX 280 is running at 76 degrees C, 32% memory, 621 MHz and 1134 MHz.

I just upped the PSU in the Q9300 (to see if the ghosting was the fault of the PSU) from a 400-something to a 500 W ... like I said, I have not checked the draw on that system recently ... of course I suspended GPU Grid there because of the high CPU use from the 6.55 Application. Maybe when the new app comes out and the CPU load goes back down I may restart GPU Grid there ...

New thought ... I wonder if the high pull of the 6.55 application is affecting overall pull, because the CPU, though running at full speed, is not heavily loaded ... hard to believe it would be a major draw, but I have seen odder things in my life ... food for thought anyway ...

ExtraTerrestrial Apes
Message 5150 - Posted: 1 Jan 2009 | 17:03:33 UTC - in response to Message 5145.

Note that I do run 25/7


*watching you enviously*

... of course I suspended GPU Grid there because of the high CPU use from the 6.55 Application.


I wouldn't say "of course". As we concluded somewhere here in this thread a 9800GT delivers almost 5k credits/day, with the recent drop due to the 1888 credit WUs maybe 4k. I don't know about your Q9300, but my Q6600@3GHz can certainly not make up for such an amount with just a half core ;)

New thought ... I wonder if the high pull of the 6.55 application is affecting overall pull, because the CPU, though running at full speed, is not heavily loaded


Yes, it does affect power draw. Although the load is 100%, it's less stressful than real calculations (GPU-Grid currently just polls the GPU). When I switched from 4+0 to 4+1 I saw a slight drop in my CPU temperature.

MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Message 5157 - Posted: 1 Jan 2009 | 21:38:52 UTC - in response to Message 5150.

I wouldn't say "of course". As we concluded somewhere here in this thread a 9800GT delivers almost 5k credits/day, with the recent drop due to the 1888 credit WUs maybe 4k. I don't know about your Q9300, but my Q6600@3GHz can certainly not make up for such an amount with just a half core


If I were all about credits and credits alone ... well, then my project mix would be a lot different and I would, of course (to steal your words), be running all GPUs 24/7 ...

I cannot display my signature here because of the problem with the preferences page I noted in the appropriate forum lo these many days ago ... but, as you could see were you to click through, I have a few projects undergoing work ... if you look you can see that I dump some work into almost all live projects going ...

My only restriction is that I tend not to do pure alpha-test projects that will never get out of the alpha state. That is a TEND .. as is obvious ...

Anyway, my year's goals include getting the bulk of the production projects and a few beta projects above the totals I built up for SaH back when there were few projects and it seemed like the thing to do. Now I have choices, and I tend to lean towards projects that are actually doing SCIENCE ... not that the exploration of the universe and looking for others is not something that needs to be done, it does; I just don't think it deserves so much computing power ... just me ...

SO, I have this long-term goal I have been working on for YEARS and have only managed 3 projects to this point ... so ... it is painful to surrender some power when I could be using it to do other things ... plenty of time to rev up GPU Grid when the next-generation application comes out ...

so ...

WCG
Cosmology
LHC
ABC
Genetic Life
Malaria Control
SIMAP
Prime Grid
Mind Modeling
GPU Grid
AI

are all targets to get above 388,000 CS ... or as many as possible ... besides, the 9800 only does one, at best two (of the short tasks) a day ... and with so much trouble getting them queued up ... I only have to babysit one system instead of two ...

And, as I have said before, bio stuff does not really lift my skirts ... something about cutting up frogs biased me forever ... so, I lean towards physics as my PREFERENCE ... not that I am not willing to support the frog-killer types when that is the choice ...

ExtraTerrestrial Apes
Message 5160 - Posted: 1 Jan 2009 | 22:25:43 UTC - in response to Message 5157.

I guess I dropped out of seti a long time ago for reasons similar to yours. I also like to support other projects, preferably from the realm of physics. And I also like to contribute as much as I can without spending too much. That means I want to make good use of my hardware, i.e. only machines with a 64-bit OS run ABC etc. And this means I'd always want to use that 400 GFlops coprocessor, even if that means losing 10 GFlops somewhere else. And about the babysitting.. well, if the GPU runs dry I wouldn't lose anything compared to not running GPU-Grid in the first place.

But that's just me (who, by the way, absolutely doesn't care about credit goals). So if what you're doing is what you want, by all means go for it :)

MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Message 5162 - Posted: 2 Jan 2009 | 3:55:49 UTC - in response to Message 5160.

I guess I dropped out of seti a long time ago for reasons similar to yours. I also like to support other projects, preferably from the realm of physics. And I also like to contribute as much as I can without spending too much. That means I want to make good use of my hardware, i.e. only machines with a 64-bit OS run ABC etc. And this means I'd always want to use that 400 GFlops coprocessor, even if that means losing 10 GFlops somewhere else. And about the babysitting.. well, if the GPU runs dry I wouldn't lose anything compared to not running GPU-Grid in the first place.

But that's just me (who, by the way, absolutely doesn't care about credit goals). So if what you're doing is what you want, by all means go for it :)

MrS


I knew you understood all along ... :)

But it is nice to "talk" about it ...

I kinda do want to keep that GPU busy ... but, it has been idle so long that, well, what does a couple more days mean ... :)

As to the rest, well, homebound as I am, I have to find something to amuse myself ... were there another project that used the GPU, be sure that it would be turning and burning as we speak ... and when that next project comes out, be assured that even the 8500 card I have will likely be mounted again, until I can afford to go out and buy again ... :)

That aside, my GTX 280 is more than pulling in the credits to make me smile ...

Even better, I took a nap (another side effect of my illness is that I seem to only get ~3 hours of sleep at a pop) and was down to my last task ... lo and behold, it had finished that one, sneaked out another, and was working on it when I woke. When I reported the one in work before I went to bed, I got two more ...

almost as if it was working right ... :)

Scott Brown
Message 5183 - Posted: 3 Jan 2009 | 5:12:43 UTC - in response to Message 5040.


An OC'ed 9500GT is a really bad option, though! With 32 shaders even overclocking is not going to help it that much. 9600GT with 64 shaders at 1.7/1.8 GHz already needs one day per WU. I'd rather OC a 9600GSO than get a 9500GT. If the PS can't handle the additional 30 W it probably wouldn't last long with a 9500GT either.

MrS



Sorry for the delayed response ... I have been stuck at the relatives' for the holidays with poor internet access and just got back home.

Anyway, I'd generally agree with what you have said, but since I did say "really" under-powered systems, I thought I'd give an example from my own setup that may make some sense.

Specifically, I have a second system (a freebie from my father's company when it upgraded to quad-cores) which is a woefully underpowered stock HP desktop (Athlon X2 3800+ with a 250 W PSU). I use it as a machine for my kids (educational software mostly) and to web browse, etc., when I can't get my wife off our main PC. Upgrading the PSU (or just about any other component) makes no sense given the "budget construction" of such a desktop. With the 250 W PSU there isn't much headroom, so the extra 30-35 watts that a 9600 GSO would use might very well exceed what such a system can handle. A similar scenario might also apply to some media/home-theater PC builds.

Anyway, I just installed the OC'd 9500GT and will report back with some run times (I'd estimate around 2 days based on the few other 32 shader cards I've noticed running here).

As for the power of the OC, I think I have some evidence suggesting that shader clock may be more important than you are suggesting. Specifically, my 9600 GSO (96 shaders, GPUGRID-reported shader speed of 1674000) is flatly faster than an FX3700 Quadro that I got to test in December (112 shaders, GPUGRID-reported shader speed of 1242000), with about a 2-hour-per-workunit advantage. I wonder if anyone has been able to compare their cards at different shader speeds to see how much of a difference can be obtained?

Last, the "new" 9600 GSO (with 512mb rather than 384mb or 768mb) is actually a scaled back 48 shader card. Performance should be just shy of the 9600GT. I would guess that the card is lower powered, but I have not seen any hard figures yet on the generalized wattage? If comparable to the power usage of the 9500GT, then this would of course nullify my arguments regarding the slower card.


K1atOdessa
Message 5193 - Posted: 3 Jan 2009 | 14:18:38 UTC - in response to Message 5183.

Anyway, I just installed the OC'd 9500GT and will report back with some run times (I'd estimate around 2 days based on the few other 32 shader cards I've noticed running here).


I have a factory-OC'd 9500GT (700 core, 1750 shader, 800 memory <no OC>). If just that card works on a WU, it will take about 2 days 6 hours or so for the big 3232 WUs. The 24xx WUs take < 2 days. I do have two 8800GTs as well, so if I close BOINC they can take over the WU the 9500 was previously working on. This lowers the overall time of a single WU, but my numbers above were for a WU processed strictly with the 9500.

Scott Brown
Message 5195 - Posted: 3 Jan 2009 | 15:35:03 UTC - in response to Message 5193.

11 hours into its first workunit with a bit over 27% done. That equates to about 40-41 hours for the full unit (I'd guess it is a 24xx one). It will be an interesting comparison to your card, since the only difference between the two (assuming, like me, that you are not heavily gaming, etc.) is the memory clock (mine is 700 core, 1750 shader, 2000 memory ... all factory OC).

ExtraTerrestrial Apes
Message 5224 - Posted: 3 Jan 2009 | 21:24:01 UTC

Hi Scott,

it's not that shader clock doesn't matter, but the 9600GT is clocked quite high already and the GSO is.. at least not slow. The 9500GT is produced in the same 65 nm process, so you can't expect any miracles from it. Maybe 1.9 GHz is possible, but I doubt much more. That gives you an advantage of 1.9/1.67 = 1.14 -> 14%. To make up for the lack of shaders you'd need 100 to 200%. There's no way an OC can give you this; the two numbers are in completely different leagues.

In your example the difference in the numbers of shaders is rather small (112/96 = 1.17 -> 17%). That's why the clock speed difference of 1.67/1.24 = 1.34 -> 34% does matter.
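
In code form, the first-order scaling rule I'm applying here; it is a rough estimate only, since core and memory clocks matter too:

# First-order GPU-Grid throughput estimate: shaders x shader clock.
def relative_speed(shaders_a, ghz_a, shaders_b, ghz_b):
    return (shaders_a * ghz_a) / (shaders_b * ghz_b)

# Scott's 9600 GSO (96 shaders @ 1.674 GHz) vs. his FX3700
# (112 shaders @ 1.242 GHz):
print(relative_speed(96, 1.674, 112, 1.242))   # ~1.16 -> GSO ~16% faster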

And to consider your example: an X2 3800+ with a 250 W PSU. The CPU is specified at 95 W and runs at unnecessarily high voltages. If you lower the voltage (any of them will do 1.2 V, most 1.1 V) you should get the entire system below 100 W. If you're unlucky and got one which runs at the highest stock voltage and really draws 95 W, then your system should draw something in the range of 130-150 W without a GPU.
If we assume the worst (150 W) and add an 80 W GPU we're at 230 W - certainly not a comfortable load, even if the PSU is a quality unit and can handle it. We don't want to run our PSUs at more than 80% load, as this decreases their life span. Ideally the load should sit around 50%, not much more than 60%.
80% load is just what we'd get in this example by going for a 9500GT instead of a 9600GT / old GSO (250 W * 0.8 = 200 W). Still.. I'm not really convinced that this is a good idea. A system which can't run an 80 W card will still be pushed to its limits by a 50 W card. A difference of 30 W is just 12% of 250.. that's reducing PSU load from 92% to 80%..
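
The same sums, sketched out; the 150 W base is the assumed worst case from above:

# PSU load check for the assumed worst-case 250 W stock supply.
psu_rating  = 250.0   # W
system_base = 150.0   # assumed worst-case draw without a GPU

for card, gpu_w in (("9500GT", 50.0), ("9600GSO", 80.0)):
    total = system_base + gpu_w
    print(f"{card}: {total:.0f} W -> {total / psu_rating:.0%} PSU load")

# -> 9500GT: 200 W -> 80%; 9600GSO: 230 W -> 92%.
#    Neither leaves the comfortable 50-60% headroom.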

I can see your point, but still I wouldn't want to give up that much performance by going with 32 shaders instead of 96. I'd rather use the somewhat bigger card and not run any CPU projects. That should get CPU power consumption below 30 W, and the PSU would be fine. Of course, the PSU would have to be able to handle an occasional spike in CPU usage.

Oh, and don't forget: the cards draw less power in GPU-Grid than in games, so the 80 / 50W figures are too high.. which makes the difference between them smaller ;)

MrS
____________
Scanning for our furry friends since Jan 2002

Scott Brown
Message 5239 - Posted: 3 Jan 2009 | 22:35:11 UTC - in response to Message 5224.

Thanks for the reply, MrS ... these types of conversations are the ones I really enjoy on the forum. :)


I can see your point, but still I wouldn't want to give up that much performance by going with 32 shaders instead of 96. I'd rather use the somewhat bigger card and not run any CPU projects. That should get CPU power consumption below 30 W, and the PSU would be fine. Of course, the PSU would have to be able to handle an occasional spike in CPU usage.


Not running any CPU projects is a good point for power savings. Still, it is those spikes that worry me most, especially on the stock power supply (I really doubt that HP in particular went with any real quality on the PSU).


Oh, and don't forget: the cards draw less power in GPU-Grid than in games, so the 80 / 50W figures are too high.. which makes the difference between them smaller ;)


Unfortunately, even those educational games for the kids have some rather taxing video activity at times, so when they are running, GPU wattage can spike fairly high.

BTW, you keep referring to the 64-shader 9600GT, but I thought that its max wattage was 100-105W?

Scott Brown
Send message
Joined: 21 Oct 08
Posts: 144
Credit: 2,973,555
RAC: 0
Level
Ala
Scientific publications
watwatwatwatwatwat
Message 5282 - Posted: 4 Jan 2009 | 23:12:34 UTC


Just to update...the 9500GT completed the first workunit (an 18xx one) in about 40.72 hours.

Profile K1atOdessa
Send message
Joined: 25 Feb 08
Posts: 249
Credit: 395,402,681
RAC: 1,594,520
Level
Asp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 5286 - Posted: 5 Jan 2009 | 1:44:11 UTC - in response to Message 5282.


Just to update...the 9500GT completed the first workunit (an 18xx one) in about 40.72 hours.


That sounds pretty much dead on with my experience. My OC'd 9500GT is about 3.3x slower than my 8800GT's (based on several WU's completed solely with the 9500). I've also seen about 40-hour runtimes on the 18xx credit WU's on the 9500GT, which equates to just over 12 hours on an 8800GT (just like I have experienced).

This is good because it shows you can process two of these (small) WU's in 48 hours. However, while the 24xx WU's only took me a couple of hours longer, if you get a 29xx or 3232 size WU, the 9500GT will not be able to finish either of them in 96 hours (the 4-day deadline). I advise you to keep an eye on this and abort any that are nowhere close to finishing on time. If it will only be 6-8 hours or so late based on your estimates, then you might as well let it finish, because only a 260 or 280 card could receive it (after the 96-hour deadline passes) and process / return it that fast.
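If you want to automate that check, here's a minimal Python sketch of the rule of thumb above. The function name and the 8-hour slack are just my example, and it assumes roughly linear progress:

DEADLINE_HOURS = 96   # the 4-day deadline

def worth_keeping(elapsed_hours, fraction_done, slack_hours=8):
    # linear extrapolation of the runtime; keep the WU if it finishes
    # on time or only slightly (~6-8 h) late
    estimated_total = elapsed_hours / fraction_done
    return estimated_total <= DEADLINE_HOURS + slack_hours

print(worth_keeping(40, 0.35))   # ETA ~114 h -> False: abort it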

Scott Brown
Send message
Joined: 21 Oct 08
Posts: 144
Credit: 2,973,555
RAC: 0
Level
Ala
Scientific publications
watwatwatwatwatwat
Message 5288 - Posted: 5 Jan 2009 | 2:20:28 UTC - in response to Message 5286.


Since the run times are nearly identical, it also looks like memory clock is irrelevant for the GPUGRID apps...

Profile Paul D. Buck
Send message
Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Level
Val
Scientific publications
watwatwatwatwatwatwatwatwatwat
Message 5295 - Posted: 5 Jan 2009 | 4:34:52 UTC - in response to Message 5286.


Just to update...the 9500GT completed the first workunit (an 18xx one) in about 40.72 hours.


That sounds pretty much dead on with my experience. My OC'd 9500GT is about 3.3x slower than my 8800GT's (based on several WU's completed solely with the 9500). I've also seen about 40-hour runtimes on the 18xx credit WU's on the 9500GT, which equates to just over 12 hours on an 8800GT (just like I have experienced).

This is good because it shows you can process two of these (small) WU's in 48 hours. However, while the 24xx WU's only took me a couple of hours longer, if you get a 29xx or 3232 size WU, the 9500GT will not be able to finish either of them in 96 hours (the 4-day deadline). I advise you to keep an eye on this and abort any that are nowhere close to finishing on time. If it will only be 6-8 hours or so late based on your estimates, then you might as well let it finish, because only a 260 or 280 card could receive it (after the 96-hour deadline passes) and process / return it that fast.


You may want to keep an eye on those, and if they get re-issued, send a note to your wingman to monitor the situation so that they don't process it halfway through and have it canceled by the server ...

Profile K1atOdessa
Send message
Joined: 25 Feb 08
Posts: 249
Credit: 395,402,681
RAC: 1,594,520
Level
Asp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 5298 - Posted: 5 Jan 2009 | 5:03:11 UTC - in response to Message 5288.


Since the run times are nearly identical, it also looks like memory clock is irrelevant for the GPUGRID apps...



I've read/heard that the memory clock does not make much difference, as you said. I have overclocked the memory slightly (4%) on my 8800GT's, but given the very small increase, any benefit might not be apparent anyhow.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 5310 - Posted: 5 Jan 2009 | 17:18:28 UTC - in response to Message 5288.

Since the run times are nearly identical, it also looks like memory clock is irrelevant for the GPUGRID apps...


Just a quick reply.. see here plus the next couple of posts.

MrS
____________
Scanning for our furry friends since Jan 2002

Priapismo
Send message
Joined: 26 Oct 08
Posts: 6
Credit: 443,525
RAC: 0
Level

Scientific publications
watwat
Message 5460 - Posted: 10 Jan 2009 | 18:17:57 UTC - in response to Message 5310.

Well, after deciding a 260GTX Core 216 would be my best bet (price-to-performance-wise), a recent post on sale prices (thx JAMC) led to a rather hasty purchase of an overclocked EVGA 280GTX (at only $211)...

I don't think I'll be able to beat that price-to-performance ratio for a while.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 5464 - Posted: 10 Jan 2009 | 18:50:44 UTC

Hi Scott,

sorry for the late response, work got me again..

Unfortunately, even those educational games for the kids have some rather taxing video activity at times, so when they are on the GPU wattage can spike fairly high.


Ah, now that's a different point. Games have the bad habit of using 100% CPU (at least one core) and 100% GPU, whether they need it or not. So in that case you'd be drawing considerably more than during GPU-Grid crunching, especially if you're kind to the CPU.

When I still had the stock cooler on my 9800GTX+ I could hear the air movement from the fan when GPU-Grid was running, but when I launched 3DMark the fan started to scream.. so that's a different kind of load.

BTW, you keep referring to the 64 shader 9600GT, but I thought that the max wattage for it was 100 - 105?


Not sure where I got my numbers from; I think it was at the beginning of this thread. I've been using 9600GT and 9600GSO (old version) and 64 and 96 shaders almost synonymously for the faster cards. That goes along with "somewhat more than 80W". Taking a look here, I find a TDP of 84W for the 9600GSO and 95W for the 9600GT. So, yes, the 9600GT draws a bit more (and is a bit slower) than the old 9600GSO, but not by that much.
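Just for fun, here's the shaders-per-watt comparison from those TDPs in a tiny Python sketch. It ignores the clock difference between the two cards, so take it as a rough indicator only:

for name, shaders, tdp_watts in [("9600GSO (old, 96 shaders)", 96, 84),
                                 ("9600GT (64 shaders)", 64, 95)]:
    print(f"{name}: {shaders / tdp_watts:.2f} shaders per watt")
# -> ~1.14 vs. ~0.67; the old GSO comes out ahead on this crude metric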

MrS
____________
Scanning for our furry friends since Jan 2002
