
Message boards : Graphics cards (GPUs) : Performance of 3D Graphic @ PS3GRID

Author Message
TomaszPawel
Send message
Joined: 18 Aug 08
Posts: 121
Credit: 59,836,411
RAC: 0
Level
Thr
Message 1970 - Posted: 1 Sep 2008 | 12:31:13 UTC
Last modified: 1 Sep 2008 | 13:23:27 UTC

Performance of 3D Graphic @ PS3GRID
Hi!
This is a performance guide for nVidia graphics cards on PS3GRID:
1. GeForce 280GTX: 25000 sec/WU
2. GeForce 260GTX: 28000 sec/WU
3. GeForce 9800GTX+: 44000 sec/WU
4. GeForce 9800GTX: 47000 sec/WU
5. GeForce 8800GTS512: 50000 sec/WU
6. GeForce 8800GT: 58000 sec/WU
7. GeForce 9600GT: 70000 sec/WU
8. GeForce 8800GS: 74000 sec/WU
These are estimated times (+/- 2000 seconds) of what you should expect from these cards when they operate at normal clocks. The data was taken from the statistics of users who crunch. The comparison is not 100% accurate due to different CPU and RAM clocks, but it sheds some light on GeForce performance.
I will update and correct this when more WUs have been completed using the 6.43 application.

It seems that you get the same points for every WU; the difference is in how much time the computer needs to complete it. Overclocking helps :)

Also, different drivers = different performance.
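Not from the original post, but the table above converts directly into throughput. A quick sketch in Python, using the thread's +/- 2000 s estimates, so the output is approximate:

```python
# WUs/day and relative speed derived from the sec/WU estimates above.
SECONDS_PER_DAY = 86_400

times = {  # card -> estimated seconds per work unit (from the table above)
    "GTX 280": 25_000,
    "GTX 260": 28_000,
    "9800 GTX+": 44_000,
    "9800 GTX": 47_000,
    "8800 GTS 512": 50_000,
    "8800 GT": 58_000,
    "9600 GT": 70_000,
    "8800 GS": 74_000,
}

for card, sec_per_wu in times.items():
    wu_per_day = SECONDS_PER_DAY / sec_per_wu
    relative = times["GTX 280"] / sec_per_wu  # 1.0 = fastest card in the table
    print(f"{card:>13}: {wu_per_day:4.2f} WU/day ({relative:.0%} of a GTX 280)")
```

The spread is roughly 3x between the top and bottom of the table, which matches the shader-count differences discussed later in the thread.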

Profile [XTBA>XTC] ZeuZ
Send message
Joined: 15 Jul 08
Posts: 60
Credit: 108,384
RAC: 0
Level

Message 1971 - Posted: 1 Sep 2008 | 13:00:33 UTC

An 8800GS/9600GSO is more powerful than a 9600GT

Wolfram1
Send message
Joined: 24 Aug 08
Posts: 45
Credit: 3,431,862
RAC: 0
Level
Ala
Message 1972 - Posted: 1 Sep 2008 | 13:03:57 UTC - in response to Message 1970.

Performance of 3D Graphic @ PS3GRID
Hi!
This is a performance guide for nVidia graphics cards on PS3GRID:
1. GeForce 280GTX: 25000 sec/WU
2. GeForce 260GTX: 28000 sec/WU
3. GeForce 9800GTX: 47000 sec/WU
4. GeForce 8800GTS512: 50000 sec/WU
5. GeForce 8800GT: 58000 sec/WU
6. GeForce 9600GT: 70000 sec/WU
7. GeForce 8800GS: 74000 sec/WU
These are estimated times (+/- 2000 seconds) of what you should expect from these cards when they operate at normal clocks. The data was taken from the statistics of users who crunch. The comparison is not 100% accurate due to different CPU and RAM clocks, but it sheds some light on GeForce performance.
I will update and correct this when more WUs have been completed using the 6.43 application.

It seems that you get the same points for every WU; the difference is in how much time the computer needs to complete it. Overclocking helps :)


The idea of having a table with performance values is very good, but so far I think we have too few results.
I have a GeForce 9800GTX+ and I have 3 WUs at 36,230, 38,266 and 35,527 seconds. It seems I am out of your range?

[AF>HFR>RR] Laxou
Send message
Joined: 15 Aug 08
Posts: 9
Credit: 1,973,745
RAC: 0
Level
Ala
Message 1975 - Posted: 1 Sep 2008 | 13:34:16 UTC

The GTX+ is faster than a normal GTX, so it's normal that it sits between the normal GTX and the GTX260!

TomaszPawel
Send message
Joined: 18 Aug 08
Posts: 121
Credit: 59,836,411
RAC: 0
Level
Thr
Message 1977 - Posted: 1 Sep 2008 | 13:40:49 UTC - in response to Message 1972.


I have a GeForce 9800GTX+ and I have 3 WUs at 36,230, 38,266 and 35,527 seconds. It seems I am out of your range?

What driver do you use?

TO ADMIN! I need the ability to edit my first post :)
Thanks

Profile Phil Klassen
Send message
Joined: 6 Sep 07
Posts: 18
Credit: 14,764,147
RAC: 0
Level
Pro
Message 1978 - Posted: 1 Sep 2008 | 14:06:11 UTC

Hmmm, what am I doing wrong? I have two factory-overclocked 9600GTs running on two Q6600 2.4 GHz quads (Vista 32), and it takes them 19+ hours to complete.

____________

Wolfram1
Send message
Joined: 24 Aug 08
Posts: 45
Credit: 3,431,862
RAC: 0
Level
Ala
Message 1980 - Posted: 1 Sep 2008 | 14:19:14 UTC - in response to Message 1977.


I have a GeForce 9800GTX+ and I have 3 WUs at 36,230, 38,266 and 35,527 seconds. It seems I am out of your range?

What driver do you use?

TO ADMIN! I need the ability to edit my first post :)
Thanks



I have 6.43 and the nVidia 177.92 driver.

Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Send message
Joined: 14 Mar 07
Posts: 1957
Credit: 629,356
RAC: 0
Level
Gly
Message 1981 - Posted: 1 Sep 2008 | 14:38:36 UTC - in response to Message 1977.


I have a GeForce 9800GTX+ and I have 3 WUs at 36,230, 38,266 and 35,527 seconds. It seems I am out of your range?

What driver do you use?

TO ADMIN! I need the ability to edit my first post :)
Thanks


Are you able to do it? I am able to edit my own posts.

gdf

TomaszPawel
Send message
Joined: 18 Aug 08
Posts: 121
Credit: 59,836,411
RAC: 0
Level
Thr
Message 1983 - Posted: 1 Sep 2008 | 14:57:33 UTC - in response to Message 1981.


I have a GeForce 9800GTX+ and I have 3 WUs at 36,230, 38,266 and 35,527 seconds. It seems I am out of your range?

What driver do you use?

TO ADMIN! I need the ability to edit my first post :)
Thanks


Are you able to do it? I am able to edit my own posts.

gdf



I don't have an "Edit" button, and it said that after 60 minutes I am not able to edit....

Profile [XTBA>XTC] ZeuZ
Send message
Joined: 15 Jul 08
Posts: 60
Credit: 108,384
RAC: 0
Level

Message 1985 - Posted: 1 Sep 2008 | 15:24:17 UTC - in response to Message 1978.
Last modified: 1 Sep 2008 | 15:24:34 UTC

Hmmm, what am I doing wrong? I have two factory-overclocked 9600GTs running on two Q6600 2.4 GHz quads (Vista 32), and it takes them 19+ hours to complete.


17 hours for me: 9600GT OC'd to 1900 MHz + E4300 @ 2.4 GHz on Vista 64 + 177.84

Profile Stefan Ledwina
Avatar
Send message
Joined: 16 Jul 07
Posts: 464
Credit: 221,007,857
RAC: 4,333,521
Level
Leu
Message 1986 - Posted: 1 Sep 2008 | 15:26:43 UTC - in response to Message 1981.
Last modified: 1 Sep 2008 | 15:27:10 UTC


Are you able to do it? I am able to edit my own posts.

gdf



Hi G!

Normal users can only edit their posts within a 60-minute time frame.
Only admins/moderators are able to always edit their own posts.
____________

pixelicious.at - my little photoblog

Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Send message
Joined: 14 Mar 07
Posts: 1957
Credit: 629,356
RAC: 0
Level
Gly
Message 1987 - Posted: 1 Sep 2008 | 15:29:36 UTC - in response to Message 1986.


Are you able to do it? I am able to edit my own posts.

gdf



Hi G!

Normal users can only edit their posts within a 60-minute time frame.
Only admins/moderators are able to always edit their own posts.


Do you know if this can be changed?

gdf

Profile Stefan Ledwina
Avatar
Send message
Joined: 16 Jul 07
Posts: 464
Credit: 221,007,857
RAC: 4,333,521
Level
Leu
Message 1988 - Posted: 1 Sep 2008 | 15:33:22 UTC - in response to Message 1987.


Are you able to do it? I am able to edit my own posts.

gdf



Hi G!

Normal users can only edit their posts within a 60-minute time frame.
Only admins/moderators are able to always edit their own posts.


Do you know if this can be changed?

gdf


No, sorry, I don't know. And I've never heard of any project that has changed this from the default...
____________

pixelicious.at - my little photoblog

Profile koschi
Avatar
Send message
Joined: 14 Aug 08
Posts: 124
Credit: 792,979,198
RAC: 17,226
Level
Glu
Message 1989 - Posted: 1 Sep 2008 | 16:18:50 UTC - in response to Message 1978.

Hmmm, what am I doing wrong? I have two factory-overclocked 9600GTs running on two Q6600 2.4 GHz quads (Vista 32), and it takes them 19+ hours to complete.


You are not doing anything wrong; it's just that your card has only 64 shader units at a clock speed of 1700 MHz.

But it's amazing to see that the 9600GT from [AF>XTC] ZeuZ, which is clocked just 200 MHz higher, is 2 hours faster and only 40 min slower than my 9800GT with 112 shaders at 1512 MHz.
I think my times are still too slow, but I have no clue why...

Profile Phil Klassen
Send message
Joined: 6 Sep 07
Posts: 18
Credit: 14,764,147
RAC: 0
Level
Pro
Message 1990 - Posted: 1 Sep 2008 | 16:35:02 UTC - in response to Message 1989.

Hmmm, what am I doing wrong? I have two factory-overclocked 9600GTs running on two Q6600 2.4 GHz quads (Vista 32), and it takes them 19+ hours to complete.


You are not doing anything wrong; it's just that your card has only 64 shader units at a clock speed of 1700 MHz.

But it's amazing to see that the 9600GT from [AF>XTC] ZeuZ, which is clocked just 200 MHz higher, is 2 hours faster and only 40 min slower than my 9800GT with 112 shaders at 1512 MHz.
I think my times are still too slow, but I have no clue why...



OK, thanks. I'll let her run without any more changes.


____________

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Message 1991 - Posted: 1 Sep 2008 | 19:00:16 UTC - in response to Message 1980.

I have a GeForce 9800GTX+ and I have 3 WUs at 36,230, 38,266 and 35,527 seconds. > What driver do you use?
I have 6.43 and the nVidia 177.92 driver.


Is that on Vista / Vista 64? Because the current nVidia driver seems to be faster there. I also run a 9800GTX+ with 6.43 and 177.92 on XP32, and I average about 44,000 s, with none faster than 44,130 s (including 6.41).

@Tomasz: how many samples of the GTX260 did you see? Because Stefan said his was about as fast as a super-clocked 9800GTX here. This could have changed with the new CUDA 2.0 compilers, though, as they may make better use of the tweaks in the GT200 architecture.

MrS
____________
Scanning for our furry friends since Jan 2002

Wolfram1
Send message
Joined: 24 Aug 08
Posts: 45
Credit: 3,431,862
RAC: 0
Level
Ala
Message 1992 - Posted: 1 Sep 2008 | 19:08:51 UTC - in response to Message 1991.

I have a GeForce 9800GTX+ and I have 3 WUs at 36,230, 38,266 and 35,527 seconds. > What driver do you use?
I have 6.43 and the nVidia 177.92 driver.


Is that on Vista / Vista 64? Because the current nVidia driver seems to be faster there. I also run a 9800GTX+ with 6.43 and 177.92 on XP32, and I average about 44,000 s, with none faster than 44,130 s (including 6.41).

@Tomasz: how many samples of the GTX260 did you see? Because Stefan said his was about as fast as a super-clocked 9800GTX here. This could have changed with the new CUDA 2.0 compilers, though, as they may make better use of the tweaks in the GT200 architecture.

MrS


My results above are on Vista 64, and I have an Intel Q6600 overclocked to 3 GHz.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Message 1994 - Posted: 1 Sep 2008 | 21:10:33 UTC
Last modified: 1 Sep 2008 | 21:12:42 UTC

Oh boy, this Vista speed advantage is massive! That's ~1,300 credits/day more for you, with the same hardware.
(edit: I also have a Q6600 @ 3 GHz - the main difference is the XP/Vista driver)

And it's about 20% faster. That's just the amount of speed we lost due to the current drivers + CUDA 2.0 compiler, according to GDF. Remember the initial problems nVidia had with their Vista drivers? They arose because they had to rewrite the driver almost from scratch to fit the new driver model. Seems like that's finally good for something :p

But buying Vista 64 just because of this driver advantage would be a bit rash right now.
GDF, can you tell us anything more about this? Is nVidia aware of the issue, and are they working on a fix?

MrS
____________
Scanning for our furry friends since Jan 2002

Wolfram1
Send message
Joined: 24 Aug 08
Posts: 45
Credit: 3,431,862
RAC: 0
Level
Ala
Message 1997 - Posted: 1 Sep 2008 | 21:25:38 UTC - in response to Message 1994.

Oh boy, this Vista speed advantage is massive! That's ~1,300 credits/day more for you, with the same hardware.
(edit: I also have a Q6600 @ 3 GHz - the main difference is the XP/Vista driver)

And it's about 20% faster. That's just the amount of speed we lost due to the current drivers + CUDA 2.0 compiler, according to GDF. Remember the initial problems nVidia had with their Vista drivers? They arose because they had to rewrite the driver almost from scratch to fit the new driver model. Seems like that's finally good for something :p

But buying Vista 64 just because of this driver advantage would be a bit rash right now.
GDF, can you tell us anything more about this? Is nVidia aware of the issue, and are they working on a fix?

MrS



It is very interesting for me. I am happy to have Vista. :-)

GPUGRID Role account
Send message
Joined: 15 Feb 07
Posts: 134
Credit: 1,349,535,983
RAC: 0
Level
Met
Message 2002 - Posted: 1 Sep 2008 | 22:54:18 UTC - in response to Message 1994.


GDF, can you tell us anything more about this? Is NV aware of the issue and are they working on a fix?


We have reported the performance problem to Nvidia and are hopeful that a fix will be forthcoming soon.

MJH

TomaszPawel
Send message
Joined: 18 Aug 08
Posts: 121
Credit: 59,836,411
RAC: 0
Level
Thr
Message 2013 - Posted: 2 Sep 2008 | 11:09:19 UTC - in response to Message 1997.
Last modified: 2 Sep 2008 | 11:11:12 UTC

Wolfram1

So you have a 9800GTX+, a Q6600 @ 3 GHz, and Vista x64: 38,000 sec.

What driver do you use? Are you crunching something on the other cores?

What FSB have you set?

ExtraTerrestrial Apes

So you have a 9800GTX+, a Q6600 @ 3 GHz, and XP SP2: 44,000 sec.

What driver do you use? Are you crunching something on the other cores?

What FSB have you set?

Thamir Ghaslan
Send message
Joined: 26 Aug 08
Posts: 55
Credit: 1,475,857
RAC: 0
Level
Ala
Message 2024 - Posted: 2 Sep 2008 | 15:52:46 UTC - in response to Message 1970.

Performance of 3D Graphic @ PS3GRID
Hi!
This is a performance guide for nVidia graphics cards on PS3GRID:
1. GeForce 280GTX: 25000 sec/WU
2. GeForce 260GTX: 28000 sec/WU
3. GeForce 9800GTX+: 44000 sec/WU
4. GeForce 9800GTX: 47000 sec/WU
5. GeForce 8800GTS512: 50000 sec/WU
6. GeForce 8800GT: 58000 sec/WU
7. GeForce 9600GT: 70000 sec/WU
8. GeForce 8800GS: 74000 sec/WU
These are estimated times (+/- 2000 seconds) of what you should expect from these cards when they operate at normal clocks. The data was taken from the statistics of users who crunch. The comparison is not 100% accurate due to different CPU and RAM clocks, but it sheds some light on GeForce performance.
I will update and correct this when more WUs have been completed using the 6.43 application.

It seems that you get the same points for every WU; the difference is in how much time the computer needs to complete it. Overclocking helps :)

Also, different drivers = different performance.


I wasn't able to fully gauge how much difference there was between my 8800 GS and 280 GTX on this project, due to nVidia support being new here, but on Folding@home I was getting close to 1000 frames per second on my 8800 and 3000 frames per second on my 280. Time to completion was also cut to a third.

I'm not a big believer in Flops as the ultimate truth; driver optimization, bandwidth and bits all play a role!

I'm getting 26,000 seconds (7 hours 20 minutes) per task!


Profile [SETI.USA]Tank_Master
Avatar
Send message
Joined: 8 Jul 07
Posts: 85
Credit: 67,463,387
RAC: 0
Level
Thr
Message 2025 - Posted: 2 Sep 2008 | 16:25:50 UTC - in response to Message 2024.

I'm getting 26,000 seconds (7 hours 20 minutes) per task!

On which card?

My 8800 GTS 512 takes 10.76 h with Server 2008 x64, 4 GB RAM and a Q9450 @ 3.05 GHz.

Profile koschi
Avatar
Send message
Joined: 14 Aug 08
Posts: 124
Credit: 792,979,198
RAC: 17,226
Level
Glu
Message 2026 - Posted: 2 Sep 2008 | 16:38:33 UTC

It's a GTX280, according to the information in stderr out...

@Thamir

So it looks as if the ratio here is also ~1:3 when you compare your GTX280 to the times that Tomasz collected for an 8800GS, +/- some %...

Thamir Ghaslan
Send message
Joined: 26 Aug 08
Posts: 55
Credit: 1,475,857
RAC: 0
Level
Ala
Message 2029 - Posted: 2 Sep 2008 | 18:06:40 UTC - in response to Message 2025.

I'm getting 26,000 seconds (7 hours 20 minutes) per task!

On which card?

My 8800 GTS 512 takes 10.76 h with Server 2008 x64, 4 GB RAM and a Q9450 @ 3.05 GHz.


GTX 280, Win XP 32-bit, 4 GB RAM, Quad 6600 at the default 2.4 GHz, but on some days I'll overclock to close to 3 GHz.

I could lower that 26,000 value with a little CPU and GPU overclocking, but I prefer stability over speed!

Additionally, I often watch movies on my connected TV, which does not really use much GPU, and occasionally spend an hour or two playing games while PS3GRID tasks run in the background.

No major performance degradation in terms of FPS. Most of my games are a year old and run at full settings @ 1024x768.

Would like to see how Crysis feels about it :P

Thamir Ghaslan
Send message
Joined: 26 Aug 08
Posts: 55
Credit: 1,475,857
RAC: 0
Level
Ala
Message 2030 - Posted: 2 Sep 2008 | 18:09:24 UTC - in response to Message 2026.

It's a GTX280, according to the information in stderr out...

@Thamir

So it looks as if the ratio here is also ~1:3 when you compare your GTX280 to the times that Tomasz collected for an 8800GS, +/- some %...


Yes, it seems that way!

Honestly, I wish people would stop whining about the price of a 280! I think it's worth it with the latest price wars!

+/- a few bucks: if you buy three 8800 GS cards, the price is close to one 280 GTX!

TomaszPawel
Send message
Joined: 18 Aug 08
Posts: 121
Credit: 59,836,411
RAC: 0
Level
Thr
Message 2035 - Posted: 2 Sep 2008 | 19:24:05 UTC - in response to Message 2030.

It's a GTX280, according to the information in stderr out...
@Thamir
+/- a few bucks: if you buy three 8800 GS cards, the price is close to one 280 GTX!


I agree. 2x 8800GTS >= 1x GTX280, but I prefer the GTX280... so I'm collecting $...

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Message 2045 - Posted: 2 Sep 2008 | 21:43:26 UTC - in response to Message 2013.

Wolfram1

So you have a 9800GTX+, a Q6600 @ 3 GHz, and Vista x64: 38,000 sec.

What driver do you use? Are you crunching something on the other cores?
What FSB have you set?

ExtraTerrestrial Apes

So you have a 9800GTX+, a Q6600 @ 3 GHz, and XP SP2: 44,000 sec.

What driver do you use? Are you crunching something on the other cores?
What FSB have you set?


On average he is actually faster than 38,000. We're both on 177.92 & 6.3.10.
I'm running 3 x QMC on my other cores. The FSB is set to 334 MHz with the memory at DDR2-800 5-4-4 and turbo sub-timings.

But the latter two should really play no role, as CPU speed itself is uncritical for GPU-Grid, and the FSB & memory would only account for ~5% of overall CPU speed anyway. We're talking about a 20% difference here! And what runs on the other cores *should* be pretty irrelevant, because GPU-Grid has a dedicated core. I wouldn't bet my life on it, though.

MrS
____________
Scanning for our furry friends since Jan 2002

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Message 2046 - Posted: 2 Sep 2008 | 22:26:43 UTC

I'm not a big believer in Flops as the ultimate truth; driver optimization, bandwidth and bits all play a role!


It's not a question of belief, and no one is saying they'd be the "ultimate truth".

Look, >99.9% of what happens in GPU-Grid happens in the shaders (this number is just made up by me). In all chips of the G80/G90 class the shaders are almost identical (apparently apart from this CUDA 1.0 / 1.1 issue). The code is compute-bound; memory bandwidth plays a very minor role. Therefore, the only things that matter are how many shaders you have and how many clock cycles they get per second, assuming we're all running the same software. That's why Flops are a good guideline for comparing GPUs - they reflect the key performance factors directly.

Do I know this to be true? Well, GDF said so [that performance scales linearly with Flops]. But all of this goes under the assumption that the shader core is basically identical between GPUs. The developers tested with 8800GTs and a 9800GX2, where this condition is met.

However, for the GT200 things are different. The shader core has been tweaked quite a bit, registers have been added, etc. That's why I asked about the performance of the GTX260, because it has the same raw Flops as a 9800GTX+. I got the answer that performance was basically identical.

As it seems now, maybe only after the devs switched to CUDA 2.0, the GT200-based chips are indeed way faster than their Flop rating would imply. Guys, this is a major finding and greatly improves the value of the GTX260/280!!

This is not a failure of Flops in general - you just have to understand what you're doing. It's perfectly normal for a different architecture to make better real-world use of its theoretical maximum Flops. This just illustrates that the GT200 and G80/90 cannot be compared based on Flops. But all G80/90 chips can still be compared with each other, as can all GT200-based GPUs.
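The Flops guideline above can be put in numbers. This is an editorial sketch, not from the thread: the shader counts and clocks are the commonly quoted reference specs for these cards, and peak single-precision Flops for this GPU generation were usually counted as shaders x shader clock x 3 ops per cycle (MAD + MUL).

```python
# Peak theoretical GFlops as a first-order comparison metric.
def peak_gflops(shaders: int, shader_clock_mhz: float, ops_per_cycle: int = 3) -> float:
    """shaders * clock * ops-per-cycle, in GFlops."""
    return shaders * shader_clock_mhz * ops_per_cycle / 1000.0

gtx260 = peak_gflops(192, 1242)    # GT200-based GTX 260, ~715 GFlops
gtx_plus = peak_gflops(128, 1836)  # G92b-based 9800GTX+, ~705 GFlops
print(f"GTX 260:  {gtx260:.0f} GFlops")
print(f"9800GTX+: {gtx_plus:.0f} GFlops")
# Near-identical raw Flops, yet the thread observes GT200 finishing WUs
# much faster: it extracts more real work per theoretical Flop.
```

This is exactly why the metric only works within one architecture: the two cards land within ~2% of each other on paper while differing markedly in measured sec/WU.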

Honestly, I wish people would stop whining about the price of a 280! I think it's worth it with the latest price wars!


I don't hear anyone whining here.

+/- a few bucks: if you buy three 8800 GS cards, the price is close to one 280 GTX!


You don't buy an 8800GS for GPU-Grid. A worthy opponent for the GTX260 would be a 9800GX2. Price-wise they're about the same, whereas the GX2 has higher power consumption. Performance should be 50,000 - 52,000 s per WU. So you'd be a bit faster than the GTX260, but you'd need to sacrifice 2 CPU cores.

This makes the GTX260/280 look much better than before. Thanks, Tomasz!

MrS
____________
Scanning for our furry friends since Jan 2002

Profile Pigu
Send message
Joined: 1 Sep 08
Posts: 2
Credit: 4,544,099
RAC: 0
Level
Ala
Message 2049 - Posted: 3 Sep 2008 | 5:38:18 UTC
Last modified: 3 Sep 2008 | 5:38:46 UTC

How many WUs can I crunch simultaneously? Always 1, or can I crunch more on better cards? Can I crunch 4x more WUs in quad SLI, or only one faster?
____________

Profile koschi
Avatar
Send message
Joined: 14 Aug 08
Posts: 124
Credit: 792,979,198
RAC: 17,226
Level
Glu
Message 2051 - Posted: 3 Sep 2008 | 6:52:55 UTC

You should be able to crunch as many units in parallel as you have GPU cores. As every GPU task consumes one whole CPU core at the moment, you also need at least that many CPU cores.

So if you have a normal SLI config on a quad-core processor, you can crunch two units in parallel, leaving two CPU cores free for traditional projects.

If you can afford 3- or even 4-way SLI, maybe via 2x 9800GX2, you will need all 4 cores of your quad processor to utilize the 4 GPU cores, but you are crunching 4 units in parallel then :)
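The rule above fits in one line. A sketch with a hypothetical helper name, assuming the one-CPU-core-per-GPU-task behaviour described in this thread:

```python
# With the current app, each GPU task pins one full CPU core, so the number
# of WUs you can run in parallel is capped by whichever resource runs out first.
def parallel_wus(gpu_chips: int, cpu_cores: int) -> int:
    return min(gpu_chips, cpu_cores)

assert parallel_wus(2, 4) == 2  # SLI pair on a quad: 2 WUs, 2 cores left free
assert parallel_wus(4, 4) == 4  # 2x 9800GX2 (4 GPU chips) on a quad: all cores busy
```

Note that a 9800GX2 counts as two GPU chips here, which is why two of them saturate a quad-core CPU.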

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Message 2054 - Posted: 3 Sep 2008 | 7:48:41 UTC

And don't forget to turn SLI off for GPU-Grid.

MrS
____________
Scanning for our furry friends since Jan 2002

TomaszPawel
Send message
Joined: 18 Aug 08
Posts: 121
Credit: 59,836,411
RAC: 0
Level
Thr
Message 2097 - Posted: 5 Sep 2008 | 7:48:18 UTC - in response to Message 2051.


If you can afford 3- or even 4-way SLI, maybe via 2x 9800GX2, you will need all 4 cores of your quad processor to utilize the 4 GPU cores, but you are crunching 4 units in parallel then :)


Hmmm, is someone crunching on a 9800GX2????

Profile [AF>HFR>RR] Jim PROFIT
Send message
Joined: 3 Jun 07
Posts: 107
Credit: 31,331,137
RAC: 0
Level
Val
Message 2098 - Posted: 5 Sep 2008 | 12:55:10 UTC - in response to Message 2097.


If you can afford 3- or even 4-way SLI, maybe via 2x 9800GX2, you will need all 4 cores of your quad processor to utilize the 4 GPU cores, but you are crunching 4 units in parallel then :)


Hmmm, is someone crunching on a 9800GX2????


Maybe in a week... I hope!

Jim PROFIT

Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Send message
Joined: 14 Mar 07
Posts: 1957
Credit: 629,356
RAC: 0
Level
Gly
Message 2099 - Posted: 5 Sep 2008 | 15:28:18 UTC - in response to Message 2098.


If you can afford 3- or even 4-way SLI, maybe via 2x 9800GX2, you will need all 4 cores of your quad processor to utilize the 4 GPU cores, but you are crunching 4 units in parallel then :)


Hmmm, is someone crunching on a 9800GX2????


Maybe in a week... I hope!

Jim PROFIT


CPU cores will soon no longer be necessary.

g

Profile Sandro
Send message
Joined: 19 Aug 08
Posts: 22
Credit: 3,660,304
RAC: 0
Level
Ala
Message 2100 - Posted: 5 Sep 2008 | 15:47:44 UTC - in response to Message 2099.



CPU cores will soon no longer be necessary.

g

That is very good news!
Any idea how long it will take? ;)

Thamir Ghaslan
Send message
Joined: 26 Aug 08
Posts: 55
Credit: 1,475,857
RAC: 0
Level
Ala
Message 2108 - Posted: 5 Sep 2008 | 21:26:32 UTC - in response to Message 1970.

Performance of 3D Graphic @ PS3GRID
Hi!
This is a performance guide for nVidia graphics cards on PS3GRID:
1. GeForce 280GTX: 25000 sec/WU
2. GeForce 260GTX: 28000 sec/WU
3. GeForce 9800GTX+: 44000 sec/WU
4. GeForce 9800GTX: 47000 sec/WU
5. GeForce 8800GTS512: 50000 sec/WU
6. GeForce 8800GT: 58000 sec/WU
7. GeForce 9600GT: 70000 sec/WU
8. GeForce 8800GS: 74000 sec/WU
These are estimated times (+/- 2000 seconds) of what you should expect from these cards when they operate at normal clocks. The data was taken from the statistics of users who crunch. The comparison is not 100% accurate due to different CPU and RAM clocks, but it sheds some light on GeForce performance.
I will update and correct this when more WUs have been completed using the 6.43 application.

It seems that you get the same points for every WU; the difference is in how much time the computer needs to complete it. Overclocking helps :)

Also, different drivers = different performance.


One more benchmark!

I ran this application 24/7 on my work computer, which was equipped with an 8400 GS.

http://www.ps3grid.net/workunit.php?wuid=38165

538,666 seconds, give or take: 6 days. Bad idea :P It passed the deadline, so no credits!

Will detach my work machine once I get to the office.

Profile [SETI.USA]Tank_Master
Avatar
Send message
Joined: 8 Jul 07
Posts: 85
Credit: 67,463,387
RAC: 0
Level
Thr
Message 2109 - Posted: 5 Sep 2008 | 22:25:16 UTC

Of the 6 WUs I have completed so far, my average time is 37,819.59 seconds (min 37k, max 40k).

I have:
BFG GeForce 8800 GTS 512 OC (675/972) with 177.84 drivers
Server 2008 x64
4GB RAM
BOINC 6.3.10 x64

Profile koschi
Avatar
Send message
Joined: 14 Aug 08
Posts: 124
Credit: 792,979,198
RAC: 17,226
Level
Glu
Message 2113 - Posted: 6 Sep 2008 | 0:59:25 UTC

The card is too slow;
with only 16 shaders, those times are as expected...
The card itself is not recommended for this project, as it has only a few shaders and hence is way too slow to crunch a work unit within the deadline.

http://www.gpugrid.net/forum_thread.php?id=316

Yeti
Avatar
Send message
Joined: 20 Jul 08
Posts: 3
Credit: 4,116,949,548
RAC: 41,791,318
Level
Arg
Message 2115 - Posted: 6 Sep 2008 | 8:12:50 UTC

It looks like I have one more that works, but it seems to be too slow:

The Quattro GVS 290 will take 175 hours :-((
____________


Supporting BOINC, a great concept !

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Message 2122 - Posted: 6 Sep 2008 | 12:25:27 UTC - in response to Message 2115.

The Quattro GVS 290 will take 175 hours :-((


It's actually difficult to find out what this piece of hardware is, but the wiki says it uses an NVG86, which is basically the same as the slow 16-shader GeForce 8400. Crunching is not what these are made for.

MrS
____________
Scanning for our furry friends since Jan 2002

Thamir Ghaslan
Send message
Joined: 26 Aug 08
Posts: 55
Credit: 1,475,857
RAC: 0
Level
Ala
Message 2133 - Posted: 6 Sep 2008 | 16:59:30 UTC - in response to Message 2122.

The Quattro GVS 290 will take 175 hours :-((


It's actually difficult to find out what this piece of hardware is, but the wiki says it uses an NVG86, which is basically the same as the slow 16-shader GeForce 8400. Crunching is not what these are made for.

MrS


I'd like to see how a Tesla S1070 would perform here: 960 cores, 16 GB of RAM, and 4.3 Tflops, assumed to be based on the 280 GTX.

Less than 2 hours per task?

I never was an ATI fan; even after they released the 2-Tflops 4870X2, after 2 generations of bad experiences with ATI and 4 good experiences with nVidia, I'm sticking with nVidia.

That, and its industry-wide acceptance from running in the PS3 and Xbox and its logo being slapped onto every game!

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Message 2136 - Posted: 6 Sep 2008 | 18:43:20 UTC

The Tesla S1070 uses 4 GT200 chips with more memory than the desktop cards (4 GB per chip). I don't know if the software sees them as 4 separate GPUs or as a single one. In the first case you'd get about 4 WUs every 7 h; in the second, a WU in less than 2 h. Well, if you spend $8,000 you'd better get some serious performance for your money!
(It's got the shaders clocked at 1.5 GHz instead of the 1.3 GHz on the 280GTX.)
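A purely back-of-the-envelope sketch of the 4-separate-GPUs case, using the ~25,000 s GTX280 figure from this thread and assuming run time scales inversely with shader clock (an assumption, not a measurement):

```python
# Rough throughput guess for a Tesla S1070 seen as 4 independent GPUs.
gtx280_sec_per_wu = 25_000     # GTX280 estimate from the table in this thread
clock_scaling = 1.5 / 1.3      # S1070 vs GTX280 shader clock (assumed linear)
sec_per_wu = gtx280_sec_per_wu / clock_scaling
chips = 4
wus_per_day = chips * 86_400 / sec_per_wu
print(f"~{sec_per_wu / 3600:.1f} h per WU per chip, ~{wus_per_day:.0f} WUs/day for the unit")
```

That works out to roughly 6 h per WU per chip, i.e. on the order of 16 WUs a day for the whole unit, consistent with the "about 4 WUs every 7 h" estimate above.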

I never was an ATI fan


It doesn't matter whether your arguments are valid or not; please don't mention this topic here. Experience shows that any sane, technical conversation can be turned into a flame war within seconds if the other company is mentioned ;)

MrS
____________
Scanning for our furry friends since Jan 2002

Profile [XTBA>XTC] ZeuZ
Send message
Joined: 15 Jul 08
Posts: 60
Credit: 108,384
RAC: 0
Level

Scientific publications
wat
Message 2212 - Posted: 9 Sep 2008 | 9:19:30 UTC - in response to Message 2099.



CPU cores will soon not be necessary.

g


Can you say more about this, please? What does "soon" mean for you? Next week? Next month? Next year? xD

Thanks

XTC_ZeuZ

Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Send message
Joined: 14 Mar 07
Posts: 1957
Credit: 629,356
RAC: 0
Level
Gly
Scientific publications
watwatwatwatwat
Message 2214 - Posted: 9 Sep 2008 | 12:04:46 UTC - in response to Message 2212.
Last modified: 10 Sep 2008 | 8:45:20 UTC



CPU cores will soon not be necessary.

g


Can you say more about this, please? What does "soon" mean for you? Next week? Next month? Next year? xD

Thanks

XTC_ZeuZ



It does not depend on me, but on BOINC. We have to wait at least for version 6.3.11. The application and server are ready.

I would say one or two weeks.

gdf

Profile [XTBA>XTC] ZeuZ
Send message
Joined: 15 Jul 08
Posts: 60
Credit: 108,384
RAC: 0
Level

Scientific publications
wat
Message 2215 - Posted: 9 Sep 2008 | 12:05:47 UTC

Ah ok, thank you :D

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 2278 - Posted: 11 Sep 2008 | 19:23:59 UTC

Guys, I'm confused! 2 Questions.

First: how fast is the GT200?

The data which Thomas collected, i.e. 25,000 and 28,000 s/WU, seems to be the norm. That means a GT200 shader does considerably more work per clock cycle than a G9x shader.

However, I just stumbled across these 2 machines (1 from Stefan & Nr 2). The first uses a stock GTX260 under Vista, and with 6.3.10 and 6.43 it needs 35,000 - 36,000 s/WU. That's what would be expected if its shaders were only as fast per clock as G92's and I figured in a 20% bonus for the Vista driver.
The other machine is an OC'ed GTX280 and produced one 35,200 s result. Maybe it's just the odd one out.
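As a sanity check on the per-clock claim, the thread's own timings can be compared directly. The shader counts and stock shader clocks below are assumptions (typical retail values, not stated anywhere in this thread):

```python
# Rough work-per-shader-per-clock comparison from this thread's timings.
cards = {
    "GTX 280":  {"sec_per_wu": 25_000, "shaders": 240, "shader_ghz": 1.296},
    "9800 GTX": {"sec_per_wu": 47_000, "shaders": 128, "shader_ghz": 1.688},
}

def per_shader_clock_throughput(c):
    # Each WU is the same amount of work, so throughput per shader per clock
    # is proportional to 1 / (time * shader count * shader clock).
    return 1.0 / (c["sec_per_wu"] * c["shaders"] * c["shader_ghz"])

ratio = (per_shader_clock_throughput(cards["GTX 280"])
         / per_shader_clock_throughput(cards["9800 GTX"]))
print(round(ratio, 2))  # ~1.31: a GT200 shader doing ~30% more work per clock
```

If those timings are representative, the "considerably more work per clock" reading holds; the 35,000+ s GT200 results would instead be consistent with a ratio near 1.0.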

The other question: is the theory about the Vista driver being ~20% faster really true?

So far there has been nothing that forced me to that conclusion, but nothing contradicted it either, so it was a reasonable thing to assume.

And... damn! I accidentally closed the tab. Well, I found a machine which did not behave as expected, but I forgot the details.

MrS
____________
Scanning for our furry friends since Jan 2002

zombie67 [MM]
Avatar
Send message
Joined: 16 Jul 07
Posts: 209
Credit: 4,095,161,456
RAC: 22,338,324
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 2289 - Posted: 12 Sep 2008 | 13:07:58 UTC

Well, I just finished my first task on a GPU, and am concerned with the result. It took 58437.03 seconds, or 16.23 hours, which is about twice as long as I expected. Am I doing something wrong?

http://www.ps3grid.net/result.php?resultid=56210


stderr out : <core_client_version>6.3.10</core_client_version>
<![CDATA[
<stderr_txt>
# Using CUDA device 0
# Device 0: "GeForce GTX 280"
# Clock rate: 1404000 kilohertz
MDIO ERROR: cannot open file "restart.coor"
called boinc_finish
</stderr_txt>
]]>


Machine: Q6600 (@2.7 GHz), 6 GB RAM, running XP64; BFG GTX 280 BFGEGTX2801024OC2E; BOINC 6.3.10; driver is 177.41_geforce_winxp_64bit_english_whql, freshly downloaded yesterday. I ran this task with no other tasks running. It is a dedicated cruncher, so nothing but BOINC runs on it.


____________
Reno, NV
Team: SETI.USA

Profile Stefan Ledwina
Avatar
Send message
Joined: 16 Jul 07
Posts: 464
Credit: 221,007,857
RAC: 4,333,521
Level
Leu
Scientific publications
watwatwatwatwatwatwatwat
Message 2290 - Posted: 12 Sep 2008 | 14:34:44 UTC
Last modified: 12 Sep 2008 | 14:36:58 UTC

Over 16 hours is a little bit slow for that card...
You could try newer drivers (177.84) from the CUDA download site - direct link to the Win64 177.84 driver: http://www.nvidia.com/object/thankyou.html?url=/compute/cuda/2_0/windows/driver/NVIDIADisplayWin2KAMD64(177_84)Int.exe
____________

pixelicious.at - my little photoblog

Profile GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Send message
Joined: 14 Mar 07
Posts: 1957
Credit: 629,356
RAC: 0
Level
Gly
Scientific publications
watwatwatwatwat
Message 2291 - Posted: 12 Sep 2008 | 14:51:11 UTC - in response to Message 2289.

Well, I just finished my first task on a GPU, and am concerned with the result. It took 58437.03 seconds, or 16.23 hours, which is about twice as long as I expected. Am I doing something wrong?

http://www.ps3grid.net/result.php?resultid=56210


stderr out : <core_client_version>6.3.10</core_client_version>
<![CDATA[
<stderr_txt>
# Using CUDA device 0
# Device 0: "GeForce GTX 280"
# Clock rate: 1404000 kilohertz
MDIO ERROR: cannot open file "restart.coor"
called boinc_finish
</stderr_txt>
]]>


Machine: Q6600 (@2.7 GHz), 6 GB RAM, running XP64; BFG GTX 280 BFGEGTX2801024OC2E; BOINC 6.3.10; driver is 177.41_geforce_winxp_64bit_english_whql, freshly downloaded yesterday. I ran this task with no other tasks running. It is a dedicated cruncher, so nothing but BOINC runs on it.



I have looked at it. The new Windows application coming up will report the time per step, which should tell us what's wrong with your system.
gdf

zombie67 [MM]
Avatar
Send message
Joined: 16 Jul 07
Posts: 209
Credit: 4,095,161,456
RAC: 22,338,324
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 2292 - Posted: 12 Sep 2008 | 15:08:36 UTC - in response to Message 2291.

I have looked at it. The new Windows application coming up will report the time per step, which should tell us what's wrong with your system.


Thanks. When do you expect it to be released?

For what it's worth, I installed the Nvidia tools, and took a look at the GPU settings. It reports everything as expected:

GPU Core: 650 MHz
GPU mem speed: 1163 MHz
GPU shader clock: 1404 MHz

These are all factory settings. I have not changed anything.
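For a rough expectation from those clocks: if WU time scales inversely with the shader clock (a simplifying assumption), a 1404 MHz factory-OC card should beat the ~25,000 s typical stock GTX 280 time from this thread. The 1296 MHz stock shader clock is an assumption, not stated in the thread:

```python
# Expected WU time for a factory-OC GTX 280, scaling by shader clock only.
stock_shader_mhz = 1296             # assumed stock GTX 280 shader clock
oc_shader_mhz = 1404                # reported by the NVIDIA tools above
stock_sec_per_wu = 25_000           # typical stock GTX 280 time in this thread
expected = stock_sec_per_wu * stock_shader_mhz / oc_shader_mhz
print(round(expected))              # ~23077 s, about 6.4 h
```

So a result in the 6-7 hour range would be normal for this card, and the 16+ hour result points at something else (driver, config) rather than the hardware.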
____________
Reno, NV
Team: SETI.USA

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 2294 - Posted: 12 Sep 2008 | 17:27:37 UTC

I'd also say give the new drivers a shot. Either 177.84 or 177.92.

MrS
____________
Scanning for our furry friends since Jan 2002

Wolfram1
Send message
Joined: 24 Aug 08
Posts: 45
Credit: 3,431,862
RAC: 0
Level
Ala
Scientific publications
watwatwatwatwatwat
Message 2295 - Posted: 12 Sep 2008 | 17:37:04 UTC - in response to Message 2294.

I'd also say give the new drivers a shot. Either 177.84 or 177.92.

MrS


The newest is 177.98. It's running fine here under Vista 64.

Profile [SETI.USA]Tank_Master
Avatar
Send message
Joined: 8 Jul 07
Posts: 85
Credit: 67,463,387
RAC: 0
Level
Thr
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 2296 - Posted: 12 Sep 2008 | 18:07:09 UTC

Where did you get 177.98? I still only see 177.92 when I select beta drivers from NVIDIA's site for Vista x64.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 2297 - Posted: 12 Sep 2008 | 18:07:36 UTC

Any performance changes?

MrS
____________
Scanning for our furry friends since Jan 2002

Wolfram1
Send message
Joined: 24 Aug 08
Posts: 45
Credit: 3,431,862
RAC: 0
Level
Ala
Scientific publications
watwatwatwatwatwat
Message 2298 - Posted: 12 Sep 2008 | 20:19:45 UTC - in response to Message 2296.

Where did you get the 177.98? I still only see 177.92 when I select beta drivers from nvidia's site for vista x64.


I took the driver from this site:

http://www.laptopvideo2go.com/forum/index.php?showforum=94

Wolfram1
Send message
Joined: 24 Aug 08
Posts: 45
Credit: 3,431,862
RAC: 0
Level
Ala
Scientific publications
watwatwatwatwatwat
Message 2299 - Posted: 12 Sep 2008 | 20:21:56 UTC - in response to Message 2297.

Any performance changes?

MrS



At the same time I overclocked my card, so I can't answer your question.

zombie67 [MM]
Avatar
Send message
Joined: 16 Jul 07
Posts: 209
Credit: 4,095,161,456
RAC: 22,338,324
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 2300 - Posted: 12 Sep 2008 | 22:58:53 UTC - in response to Message 2294.

I'd also say give the new drivers a shot. Either 177.84 or 177.92.


Good call. Looks like I will be down to a bit over 6 hours when my current task finishes.
____________
Reno, NV
Team: SETI.USA

zombie67 [MM]
Avatar
Send message
Joined: 16 Jul 07
Posts: 209
Credit: 4,095,161,456
RAC: 22,338,324
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 2307 - Posted: 13 Sep 2008 | 13:36:44 UTC - in response to Message 2300.

I'd also say give the new drivers a shot. Either 177.84 or 177.92.


Good call. Looks like I will be down to a bit over 6 hours when my current task finishes.


Well, I am bottoming out around 24,600 seconds, or 6.8 hours, with 177.92. Much better than 16+ hours!

http://www.ps3grid.net/result.php?resultid=56209

I understand that Vista is supposed to be faster than XP. But how does the speed of XP64 compare to Linux64?

Thanks again for all the help!
____________
Reno, NV
Team: SETI.USA

Profile koschi
Avatar
Send message
Joined: 14 Aug 08
Posts: 124
Credit: 792,979,198
RAC: 17,226
Level
Glu
Scientific publications
watwatwatwatwatwatwatwatwatwatwat
Message 2308 - Posted: 13 Sep 2008 | 13:47:27 UTC

The speed is almost the same. One member of our team has a 9800GTX+ (XP64) which is clocked only a little higher than my 8800GTS OC (Linux), and we get nearly the same time, around 44,000 seconds.

zombie67 [MM]
Avatar
Send message
Joined: 16 Jul 07
Posts: 209
Credit: 4,095,161,456
RAC: 22,338,324
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 2309 - Posted: 13 Sep 2008 | 14:18:56 UTC - in response to Message 2308.

The speed is almost the same. One member of our team has a 9800GTX+ (XP64) which is only little better clocked than my 8800GTS OC (Linux) and we have nearly the same time at around 44000 seconds.


Thanks!
____________
Reno, NV
Team: SETI.USA

Message boards : Graphics cards (GPUs) : Performance of 3D Graphic @ PS3GRID