Message boards : Graphics cards (GPUs) : GT 640 DDR3

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25521 - Posted: 5 Jun 2012 | 22:32:22 UTC

http://www.anandtech.com/show/5911/nvidia-announces-retail-gt-640-ddr3,
by Ryan Smith.

I'm just saying they now exist; I'm not saying 'buy one'!
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25723 - Posted: 15 Jun 2012 | 21:23:49 UTC

Hm, I just bought one. lol

Now the question is: will it work well with the CUDA 4.0 and 4.2 apps?

Any results on that? Has it already been tested with the beta apps?

Is there anybody who can tell me about results?

Greetings

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25725 - Posted: 15 Jun 2012 | 21:36:00 UTC - in response to Message 25723.

Should work in the same way as the bigger Kepler cards. Expect ~1/4 the performance of a full GK104 at similar clocks.

MrS
____________
Scanning for our furry friends since Jan 2002

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25726 - Posted: 15 Jun 2012 | 21:44:34 UTC - in response to Message 25725.
Last modified: 15 Jun 2012 | 21:53:29 UTC

http://www.palit.biz/palit/vgapro.php?id=1894

OK, it's this card.

I can't calculate the GFLOPS for it; I have no comparison table.

Edit: OK, I found a comparison table on Wikipedia that says:

12.24 GFLOPS per watt, which means 65 * 12.24 = ~800 GFLOPS. Not bad :)

frankhagen
Joined: 18 Sep 08
Posts: 65
Credit: 3,037,414
RAC: 0
Message 25731 - Posted: 16 Jun 2012 | 16:01:01 UTC - in response to Message 25723.

Hm, I just bought one. lol

Now the question is: will it work well with the CUDA 4.0 and 4.2 apps?

Any results on that? Has it already been tested with the beta apps?

Is there anybody who can tell me about results?


Well, at least you could. ;)

Got that thing up and running?

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25732 - Posted: 16 Jun 2012 | 17:44:51 UTC - in response to Message 25731.
Last modified: 16 Jun 2012 | 17:49:34 UTC

Not yet; I'm still waiting for the order confirmation. It's Saturday, you know. :o/

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25737 - Posted: 17 Jun 2012 | 11:48:20 UTC - in response to Message 25726.

12.24 GFLOPS per watt, which means 65 * 12.24 = ~800 GFLOPS. Not bad :)

384 shaders * 2 operations per clock * 900 MHz = 691 GFLOPS. However, 2 ops/clock per shader is a theoretical maximum which can hardly be achieved in the real world. But we've been calculating nVidia FLOPS like this for a long time, so at least it's consistent for the sake of comparison.
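
For anyone who wants to redo the arithmetic, here is the same convention as a small Python sketch (the 2 ops/clock factor is the usual FMA assumption, a theoretical peak rather than a measured figure):

    # Peak-GFLOPS convention used for nVidia cards in this thread.
    # ops_per_clock=2 assumes one fused multiply-add per shader per clock.
    def peak_gflops(shaders, clock_mhz, ops_per_clock=2):
        return shaders * ops_per_clock * clock_mhz / 1000.0

    print(peak_gflops(384, 900))  # GT 640 DDR3: ~691 GFLOPS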

MrS
____________
Scanning for our furry friends since Jan 2002

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25750 - Posted: 17 Jun 2012 | 17:53:01 UTC - in response to Message 25737.
Last modified: 17 Jun 2012 | 17:54:38 UTC

I expect the GT 640 should just about match a GTX 550 Ti in terms of runtime performance, but maybe the 4.2 app will be favourable to it. Obviously the 65 W TDP is much more attractive than the 116 W of the GTX 550 Ti.
At the end of 2009 the GT 240 entered the scene. It was a bit of a sweet entry-level card: an early 40 nm GPU requiring power only from the PCIe slot (69 W TDP). If the GT 640 offers reasonable performance at <75 W then it might be a good entry-level card and allow more people to contribute. The recommended minimum PSU for such cards is only ~350 W. I would also expect the GT 640 to actually use about 35 to 40 W.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25770 - Posted: 19 Jun 2012 | 11:06:35 UTC - in response to Message 25750.

Bad news for me: the GT 640 I ordered was actually a GT 620. A database error by the vendor.

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25789 - Posted: 20 Jun 2012 | 11:26:19 UTC - in response to Message 25770.
Last modified: 20 Jun 2012 | 11:29:03 UTC

Very good news!

I thought I had ordered a stallion. They sent me a snail (GT 620), but in the end I now have a pony (GT 640). :lol

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25793 - Posted: 20 Jun 2012 | 15:18:52 UTC - in response to Message 25789.

A PAOLA 4.2 task is on its way to completion, about 5 hours in. Has anybody completed this one, or another, with a GT 640? Estimated completion is about 8 hours.

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25800 - Posted: 21 Jun 2012 | 10:47:58 UTC - in response to Message 25793.
Last modified: 21 Jun 2012 | 10:48:48 UTC

OK, an IBUCH task took about 6 hours.

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25805 - Posted: 21 Jun 2012 | 14:33:03 UTC - in response to Message 25800.

Performance is roughly as expected. It would probably match the GTX 550 Ti if it could run the CUDA 3.1 app, but on the CUDA 4.2 app it's actually matching a GTX 460, thanks to the GTX 600 series' performance improvements over the GTX 400/500 series. Cool.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

frankhagen
Joined: 18 Sep 08
Posts: 65
Credit: 3,037,414
RAC: 0
Message 25806 - Posted: 21 Jun 2012 | 16:49:31 UTC - in response to Message 25805.

Performance is roughly as expected. It would probably match the GTX 550 Ti if it could run the CUDA 3.1 app, but on the CUDA 4.2 app it's actually matching a GTX 460, thanks to the GTX 600 series' performance improvements over the GTX 400/500 series. Cool.


Well, REALLY cool!

Considering it's only drawing ~50 W from the wall, it's time to go green...

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25807 - Posted: 21 Jun 2012 | 17:13:33 UTC - in response to Message 25806.
Last modified: 21 Jun 2012 | 17:18:04 UTC

It must be about 65 watts coming out of the wall, but it will draw less power than a 550 Ti (116 W).

Keep in mind that when you overclock/overvolt, the power saving is surely lower ;) So I did; I couldn't hold myself back and overclocked it. So the warranty is void :b Who cares!?

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 25809 - Posted: 21 Jun 2012 | 19:39:57 UTC

Why must it draw 65 W? Because the TDP is stated as 65 W by nVidia? Well... no. Cards usually draw less than TDP at GPU-Grid, and when crunching in general. Consider this: you're probably not even at 100% GPU utilization, let alone using the TMUs etc.

So it seems such a card is good for ~44k RAC here.

Overclocking actually improves power efficiency here. Overvolting doesn't... but I hope you're not pushing the card to 2x the power consumption ;)

MrS
____________
Scanning for our furry friends since Jan 2002

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25810 - Posted: 21 Jun 2012 | 20:17:37 UTC - in response to Message 25809.
Last modified: 21 Jun 2012 | 20:19:47 UTC

but I hope you're not pushing the card to 2x the power consumption ;)


I hope so too (looking at my electricity bill); for the rest of the month I can only run this card :D. I am almost at €50 for this month, and that's quite enough for me.

I have not tried to overvolt this card; I hope others may try it before me, please :D.

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25811 - Posted: 21 Jun 2012 | 21:31:06 UTC - in response to Message 25810.

Long-term overvolting is even less recommended on lesser cards.
These are fairly 'green' cards compared to previous generations, but the bigger cards probably do more per watt, especially if you look at the power consumption of the whole system. Their niche is low-end desktop systems with average PSUs.
As it's matching a GTX 460, it's doing the same work for half the electricity consumed by the GPU.
The ~44k is for normal tasks. This should improve with the longer tasks, especially if the card can return them inside 24 h.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25818 - Posted: 22 Jun 2012 | 8:26:18 UTC

Well... overvolting it still wouldn't cost much.

Stock load voltage seems to be 1.00 V. AMD uses 1.20 V stock on the same process, so this would be the maximum I'd think about using. At 1.10 V power consumption would increase by a factor of 1.21 (1.10^2/1.00^2), pushing the card from an estimated 50 W to 60.5 W and maybe getting you another 100+ MHz core clock (-> ~10% performance increase).

Overall that's not too bad (considering you've got an entire PC up and running to feed the card), but 1.20 V may push the boundaries of PCIe power delivery: 1.20^2/1.00^2 = 1.44, i.e. a 44% increase in power consumption. Here we'd be talking about 72 W for maybe another 50+ MHz; diminishing returns.
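
The V^2 estimate above as a quick sketch (assuming dynamic power scales with the square of voltage at fixed clocks; the 50 W baseline is this thread's estimate, not a measurement):

    # Dynamic power scales roughly with voltage squared.
    def scaled_power(base_watts, v_new, v_old=1.00):
        return base_watts * (v_new / v_old) ** 2

    for v in (1.00, 1.10, 1.20):
        print(f"{v:.2f} V -> {scaled_power(50, v):.1f} W")
    # 1.00 V -> 50.0 W, 1.10 V -> 60.5 W, 1.20 V -> 72.0 W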

MrS
____________
Scanning for our furry friends since Jan 2002

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25822 - Posted: 22 Jun 2012 | 13:36:10 UTC - in response to Message 25818.
Last modified: 22 Jun 2012 | 14:33:53 UTC

I think undervolting may be an option:

maintain the clock rates while lowering the core voltage.

Edit:

OK, that saved me 5 watts; I can't go lower because of a hardware restriction of the card. :/

I have checked all the data and think it was a mistake; I can't actually lower the core voltage.

No software worked correctly. I'm using EVGA Precision and Asus GPU Tweak.

Any guesses?

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25824 - Posted: 22 Jun 2012 | 15:42:03 UTC - in response to Message 25822.

MSI Kombustor perhaps?
I think there are issues with overvolting, and possibly overclocking, the GF600 cards. The software will turn up eventually, if there isn't an overclocking kit already. I don't have a GF600 card so I haven't really looked into this.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25825 - Posted: 22 Jun 2012 | 15:51:24 UTC - in response to Message 25824.
Last modified: 22 Jun 2012 | 15:52:25 UTC

Nope, same results. I'll have to wait until they update all the software. :-/

Tried Win7 64-bit; same results. :-/

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25827 - Posted: 22 Jun 2012 | 16:23:58 UTC - in response to Message 25825.
Last modified: 22 Jun 2012 | 16:26:14 UTC

Apparently the voltage auto-adjusts on reference cards, so you can't set it manually, but it should increase with the GPU clock.
This change makes overclocking the GF600s more like overclocking the i7-2600K and similar CPUs, which automatically adjust the voltage as required.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25830 - Posted: 22 Jun 2012 | 16:35:09 UTC - in response to Message 25827.
Last modified: 22 Jun 2012 | 16:39:10 UTC

Hm, I've experienced that raising the clock further results in errors; there's no automatic voltage raising here on this card :(

It must only happen in 3D mode, in games or benchmarks.

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25831 - Posted: 22 Jun 2012 | 16:39:12 UTC - in response to Message 25830.
Last modified: 22 Jun 2012 | 16:42:09 UTC

If it causes errors then don't increase the clocks.
It's possible that the reported voltage is just a reference value, read from the card's reference voltage or the driver, and not a real live reading.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25836 - Posted: 23 Jun 2012 | 5:24:25 UTC - in response to Message 25831.

Everything's fine now; I think the card needed some "burn-in" time. Now the voltage settings work, but with little benefit.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25842 - Posted: 23 Jun 2012 | 23:31:45 UTC - in response to Message 25836.

Which clock speeds do you reach?

Automatic voltage scaling is linked to turbo mode, which the GT 640 doesn't support.

MrS
____________
Scanning for our furry friends since Jan 2002

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25860 - Posted: 25 Jun 2012 | 15:21:34 UTC

What will happen when the GT 640 starts to run a CUDA 3.1 app?

I'm asking because I'm thinking about putting this card in between my two GTX 460s.

Any knowledge about this out there? Can the furry ones answer this question?

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25865 - Posted: 25 Jun 2012 | 17:05:08 UTC
Last modified: 25 Jun 2012 | 17:56:15 UTC

Oops! I did it :(

My two GTX 460s were excluded, so I removed the GT 640 again.

Now one GTX 460 is calculating its first non-beta CUDA 4.2 app!?

Good or bad!?

Edit:

Could someone please help? If all three cards would run together, that would save me about 100 watts per hour!!!

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25869 - Posted: 25 Jun 2012 | 19:59:53 UTC - in response to Message 25865.
Last modified: 25 Jun 2012 | 20:00:39 UTC

Great! You did well. Now I am crunching only 4.2 tasks. Yippee!

But I still can't overclock the GT 640 with any overclocking software; I have to wait until they update their software.

Running 3 cards on an Asus P5N-T Deluxe...

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25886 - Posted: 26 Jun 2012 | 10:06:50 UTC - in response to Message 25869.

I can't clock the GT 640, but under Windows 7 I can use all 3 cards at the same time. That saves me 100 watts per hour. Great.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25899 - Posted: 26 Jun 2012 | 18:03:03 UTC - in response to Message 25886.

Side note: 100 watts per hour -> 100 W. Power is already divided by time.

MrS
____________
Scanning for our furry friends since Jan 2002

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25900 - Posted: 26 Jun 2012 | 18:05:14 UTC - in response to Message 25899.

Side note: 100 watts per hour -> 100 W. Power is already divided by time.

MrS


OK, then I saved 0.1 kW/h. Greetings to the furry ones :)

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25901 - Posted: 26 Jun 2012 | 18:08:23 UTC - in response to Message 25900.

Each hour you save 0.1 kWh :)
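
In code form, for anyone following the units here (a trivial sketch; 100 W is the saving quoted above):

    # Power (W) is a rate; energy (kWh) is power integrated over time.
    def energy_kwh(watts, hours):
        return watts * hours / 1000.0

    print(energy_kwh(100, 1))   # 0.1 kWh saved each hour
    print(energy_kwh(100, 24))  # 2.4 kWh saved per day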

MrS
____________
Scanning for our furry friends since Jan 2002

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25902 - Posted: 26 Jun 2012 | 18:11:45 UTC - in response to Message 25901.

Ooook, I've managed my overclocking problem so far. I used "Nvidia Inspector" to clock the GT 640.

But I don't know if it's stable, or whether it will crash when it gets a CUDA 3.1 app again.

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25913 - Posted: 26 Jun 2012 | 22:16:00 UTC - in response to Message 25902.

the "overclocking" doesnt work, in real there was no overclocking.

klepel
Joined: 23 Dec 09
Posts: 189
Credit: 4,718,956,245
RAC: 2,154,410
Message 26077 - Posted: 1 Jul 2012 | 15:38:16 UTC

Does anyone have news about the runtimes of long tasks with CUDA 4.2 on GT 640 cards?

Rantanplan: I have seen that it takes about 55,000 seconds per work unit. However, one of your computers, labeled as having a GT 640, had quite a few task failures?

I'm trying to decide whether I should buy an additional GT 640 (two slots) or a GTX 560 Ti 448, which a friend might bring me from the USA next week. The first would be my choice price-wise; it will also get only CUDA 4.2 and represents newer technology. The second still seems expensive to me, as it is an "older" card, and it might also consume much more energy, as I could verify with my current two cards from the same series.

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 26078 - Posted: 1 Jul 2012 | 16:11:01 UTC - in response to Message 26077.

had quite a few task failures?

I overclocked by 250 MHz; now it's only 200 MHz. Just trying.

klepel
Joined: 23 Dec 09
Posts: 189
Credit: 4,718,956,245
RAC: 2,154,410
Message 26083 - Posted: 2 Jul 2012 | 4:44:03 UTC

As I am not very good with computers: would the GT 640 work for GPUGRID in a second slot, with a GTX 670 in the first? My motherboard has two slots, and the computer for the GT 640 does not actually exist yet.

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 26086 - Posted: 2 Jul 2012 | 10:30:51 UTC - in response to Message 26083.
Last modified: 2 Jul 2012 | 10:32:07 UTC

That will work, no problem, I think - but only if the GTX is not a triple-slot graphics card.

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 26087 - Posted: 2 Jul 2012 | 10:40:13 UTC - in response to Message 26083.
Last modified: 2 Jul 2012 | 11:09:20 UTC

The GT640 should work in the second slot, though you might want to check your PCIe slot performance - if it's only PCIe x1 then, even if it works, it would be very poor. If both slots are PCIe x16 and stay x16 when the second slot is occupied, then that's optimal. If the first slot drops to x8 and the second is only x4, then you would probably still do more work, but would be losing a fair bit of performance through bandwidth limitations.

We really need to measure the performance loss when using PCIe 3 vs PCIe 2 and at various lane counts. It was done roughly on PCIe 2 with the old app for previous generations of cards, but things have changed since then. Some apps perform much better with additional PCIe bandwidth, especially PCIe 3 as it's twice as fast as PCIe 2.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 26092 - Posted: 2 Jul 2012 | 11:49:06 UTC
Last modified: 2 Jul 2012 | 11:53:05 UTC

Have the running times changed? A PAOLA long-run WU now takes approximately 21 hours.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 26101 - Posted: 2 Jul 2012 | 19:26:03 UTC
Last modified: 2 Jul 2012 | 19:26:28 UTC

@klepel: the GTX 560 Ti 448-core edition is quite fast at GPU-Grid, way faster than the GT 640. But it's also a totally different beast power-consumption-wise. Could your power supply, case cooling, ears and electricity bill stand such a card? IMO that would be the main question. The GT 640 is completely tame in comparison.

MrS
____________
Scanning for our furry friends since Jan 2002

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 26110 - Posted: 2 Jul 2012 | 21:53:20 UTC - in response to Message 26101.
Last modified: 2 Jul 2012 | 22:14:54 UTC

A reference GTX 560 Ti 448 is almost twice as fast as a GT 640 (~1.85x), but the 65 W vs 210 W TDPs mean the GTX 560 Ti 448 uses ~3.25 times the power of a GT 640.
If you had the slots/PCIe bandwidth you would be better off with two GT 640s. That said, if I were thinking about two, I might wait for a GTX 660 of some description.
Anyway, as MrS said, it's very much down to the PSU.

PS. In the UK a GTX 560 Ti 448 costs ~£190 while a GT 640 costs ~£75, so for crunching here the GT 640 wins hands down: ~37% better crunching performance per initial outlay and ~75% cheaper to run.
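
The same comparison as a quick sketch (the ~1.85x speed ratio, TDPs and UK prices are the figures quoted in this post):

    # Crunching performance per pound of initial outlay, GT 640 vs GTX 560 Ti 448.
    gt640  = {"rel_perf": 1.00, "tdp_w": 65,  "price_gbp": 75}
    gtx560 = {"rel_perf": 1.85, "tdp_w": 210, "price_gbp": 190}

    perf_per_pound = lambda card: card["rel_perf"] / card["price_gbp"]
    print(perf_per_pound(gt640) / perf_per_pound(gtx560))  # ~1.37, i.e. ~37% better
    print(gtx560["tdp_w"] / gt640["tdp_w"])                # ~3.2x the power draw
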
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

klepel
Joined: 23 Dec 09
Posts: 189
Credit: 4,718,956,245
RAC: 2,154,410
Message 26116 - Posted: 2 Jul 2012 | 23:41:10 UTC

Thanks a lot for your advice! Considering also your comments in the forum thread "Lower level 600 release?", I think it will be best to wait for the release of a GTX 660 (Kepler), as I do not need this card urgently and I do have a spare GT8400:

1) As Rantanplan noted, a PAOLA WU would take around 21 hours to complete; with my lousy internet connection on this side of the world, I will miss the 24 h limit most of the time. (I will not start that argument again, but one of my cards sat idle yesterday because the two finished long WUs had not fully uploaded until I did it manually today.)

2) @ExtraTerrestrial Apes: The electricity bill is always a concern; that's why I would opt for a Kepler card. The card would run in a new computer, but until it is built I would try it on my motherboard, which has two slots. And yes, there I only have 650 W, which might be upgraded to 800 W.

3) @skgiven: The two PCIe 2 slots should stay x16 with a second card; as I mentioned, it is only temporary. And yes, I am very concerned about performance versus investment and power consumption, so I might well wait until autumn.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 26144 - Posted: 3 Jul 2012 | 18:24:57 UTC - in response to Message 26116.

Waiting sounds about right for you then. And a quality 650 W PSU won't have any problems pushing a CPU, a GT 640 and a GTX 680 :)

You could use your 8400 at POEM now. I tried it today with an 8400GS @ 1.0 GHz: it's doing an insane ~33k RAC over there!

MrS
____________
Scanning for our furry friends since Jan 2002

Norman_RKN
Joined: 22 Dec 09
Posts: 16
Credit: 23,522,575
RAC: 0
Message 26346 - Posted: 17 Jul 2012 | 19:43:39 UTC - in response to Message 26087.


We really need to measure the performance loss when using PCIe 3 vs PCIe 2 and at various lane counts. It was done roughly on PCIe 2 with the old app for previous generations of cards, but things have changed since then. Some apps perform much better with additional PCIe bandwidth, especially PCIe 3 as it's twice as fast as PCIe 2.


Hello skgiven and ETA,
I put two GT 640s in my PCIe 2 x16 slots and the memory controller load is at 70%.
Does that mean the system is not fully busy (~30% load free), or am I wrong and PCIe 3 is required? GPU load is 96-98%.

Another strange thing:
The first card in its slot:
GPU temp: 53.0 °C
Fan speed: 35%
VDDC: 0.9870 V

The second card in its slot:
GPU temp: 59.0 °C
Fan speed: 47%
VDDC: 1.0120 V

Both cards are running at the same speed (clock and memory). :/

Greetz
____________
http://www.rechenkraft.net

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 26354 - Posted: 17 Jul 2012 | 20:52:39 UTC - in response to Message 26346.

96-98% GPU utilization is high.
I'm sure PCIe 3 will outperform PCIe 2, but I don't know by how much.

It's normal that different cards require different voltages, though task types can vary the temperatures.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Jorge Alberto Ramos Olive...
Joined: 13 Aug 09
Posts: 24
Credit: 156,684,745
RAC: 0
Message 26588 - Posted: 11 Aug 2012 | 6:31:36 UTC

Hello fellow volunteers: I have a 1 GB GT 640 running at PCIe x1 (on my laptop, using the PCI Express slot).

I would like to contribute to benchmarking this card with this setup; what specs might you be interested in?

Preliminary specs (as reported by BOINC):
- driver v301.42
- CUDA version 4.2
- compute capability 3.0
- 692 GFLOPS peak

Completed task 2UY5_45_9-PAOLA_2UY5-0-100-RND0091 in 27,250.00 seconds (20.5 h)

best regards,

Jorge.

Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 26707 - Posted: 26 Aug 2012 | 10:33:30 UTC
Last modified: 26 Aug 2012 | 10:35:31 UTC

With the current PAOLA WUs and a GT 640 you cannot claim the 24-hour bonus; they take about 27 hours to complete.

http://www.gpugrid.net/workunit.php?wuid=3680231

Hype
Joined: 21 Nov 11
Posts: 10
Credit: 8,509,903
RAC: 0
Message 33562 - Posted: 20 Oct 2013 | 11:57:45 UTC

Hey guys,
I'm thinking about getting one of these cards for crunching.
As this thread is over a year old, can you still recommend getting this card?
There's even one from ASUS with only a 25 W TDP; it has pretty slow memory, but that doesn't matter for GPUGrid anyway, does it?

If not the GT 640, which would be a good card at about 50 W TDP?

X1900AIW
Joined: 12 Sep 08
Posts: 74
Credit: 23,566,124
RAC: 0
Message 33568 - Posted: 21 Oct 2013 | 11:49:06 UTC - in response to Message 33562.

If not the GT 640, which would be a good card at about 50 W TDP?

Why TDP and not real wattage?

Some data for the total system, an Intel i7 with a GTX 650 Ti (see first test last year):

    62 W - idle (EIST off, fixed voltage: idle/load 0.996 V) > CPU usage 0%
    135 W - GPUgrid (GPU OC, +100 MHz on GPU and mem clocks, standard voltage) > CPU usage 12%
    152 W - GPUgrid + Einstein@iGPU (HD4000, OpenCL) > CPU usage 15%
    176 W - GPUgrid + Einstein + Docking@CPU (Hyper-Threading on 7 cores) > CPU usage 100%
    160 W - Einstein paused for testing > CPU usage 100%
    135 W - Docking + Einstein paused for testing > CPU usage 12%
    62 W - BOINC paused for testing > CPU usage 0%


Depending on your power supply you may see better efficiency; mine is a simple 80 Plus unit. Roughly, you need ~70 watts for this OC'ed GTX 650 Ti. It runs at ~90 percent usage crunching short workunits (SANTI_MAR, cuda55) at 30% fan speed (1,100 rpm) and reaches 59 °C in my case; the system runs quietly, but the fan of this GTX 650 Ti is powerful enough for smaller cases or setups with fewer fans.
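
The per-component draws fall out of these wall readings by simple subtraction (a rough sketch; the differences between load states approximate each contributor):

    # Wall measurements from the list above, in watts.
    idle, gpugrid, plus_einstein, full_load = 62, 135, 152, 176

    print(gpugrid - idle)             # ~73 W: GPUgrid load (mostly the GTX 650 Ti)
    print(plus_einstein - gpugrid)    # ~17 W: Einstein on the iGPU
    print(full_load - plus_einstein)  # ~24 W: CPU crunching on the remaining cores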

Over the last 3 days, crunching short workunits for ~14 hours per day has given a RAC of ~38k points.

Otherwise, why retire your powerful GTX 570? Is it too noisy?

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 33569 - Posted: 21 Oct 2013 | 14:40:55 UTC - in response to Message 33562.

This project favors faster, mid-range to high-end cards.
The GT 640 is an entry-level card. While it should still work, I wouldn't recommend a GT 640 for crunching here (I never did, I just said it should work).
The GTX 650 Ti 2GB is probably the lowest GPU I would recommend.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Hype
Joined: 21 Nov 11
Posts: 10
Credit: 8,509,903
RAC: 0
Message 33571 - Posted: 21 Oct 2013 | 15:39:50 UTC

I'm using the GTX 570 in my "main" computer, which I only run when I'm at home.
At the moment I'm thinking about building a small 24/7 crunching machine.

The GTX 570 + i5 4670K draw about 300 watts, which is too much for me (24/7).

So I'm trying to find a card with a nice credit/wattage ratio to use in the other machine.

I thought about 100 watts 24/7... which will be difficult?

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 33573 - Posted: 21 Oct 2013 | 17:23:17 UTC - in response to Message 33571.

Your i5 has an 84 W TDP and it's pricey. While it won't consume 84 W, it might use around 60 or 70 W crunching CPU projects. The other components would probably use over 30 W. Even a GeForce GTX 650 Ti would use around 100 W by itself, so you're not going to make 100 W starting from an i5. Around 200 W is doable, but that's still not a good setup.
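
As a back-of-envelope sketch of that budget (all figures are the rough estimates from this post, not measurements):

    # Estimated 24/7 crunching draw for the proposed build, in watts.
    budget = {
        "i5 4670K crunching": 65,   # "around 60 or 70 W"
        "other components":   30,   # "probably use over 30 W"
        "GTX 650 Ti":        100,   # "around 100 W by itself"
    }
    print(sum(budget.values()))     # ~195 W: why ~200 W is doable but 100 W is not
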
It might be better to read and discuss this in the following thread:
http://www.gpugrid.net/forum_thread.php?id=3518&nowrap=true#33570
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

X1900AIW
Joined: 12 Sep 08
Posts: 74
Credit: 23,566,124
RAC: 0
Message 33574 - Posted: 21 Oct 2013 | 17:53:04 UTC - in response to Message 33571.
Last modified: 21 Oct 2013 | 17:54:35 UTC


I thought about 100 watts 24/7... which will be difficult?

0.1 kW * 24 h * 365 days = 876 kWh; at €0.25/kWh (Germany) that's €219 per year, for energy alone.
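
The same annual-cost arithmetic as a one-line sketch (€0.25/kWh is the German price quoted above):

    # Yearly energy cost of a constant draw.
    def annual_cost_eur(watts, eur_per_kwh=0.25, hours_per_year=24 * 365):
        return watts / 1000.0 * hours_per_year * eur_per_kwh

    print(annual_cost_eur(100))  # ~219 EUR/year for a constant 100 W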

If you crunch just one workunit per day with the GTX 570, you still get points and contribute to the project, but you don't have to pay for new hardware, find a place for it, or spend time on service, backups and so on.

Plus you are more flexible when new hardware (minimum) requirements are defined. Maybe today's low-end hardware will be pushed out of the game in a few months.

For my part, I suspended all my 24/7 crunchers and now concentrate on one single cruncher, which is a compromise between energy consumption (in the long run), loudness (low overclocking in summer, higher OC in winter) and performance (overclocking range, undervolting option).

Looking back over the last few years, there has been rapid development in hardware releases, programming and new standards (CUDA). Middle-class components are quickly demoted to low-end, and high-end to middle-class. But high-end cards consume too much energy, so you are forced to replace them when they become inefficient - not in every cycle, but in those with huge efficiency advantages.

Best wishes for your decision!

Hype
Joined: 21 Nov 11
Posts: 10
Credit: 8,509,903
RAC: 0
Message 33576 - Posted: 21 Oct 2013 | 18:01:34 UTC

Yeah, it's difficult with our high energy prices... :/
Maybe I'll build a cheap CPU-only machine and use my main system to crunch a GPU WU every now and then.
I'll have to think about it.

kingcarcas
Joined: 27 Oct 09
Posts: 18
Credit: 378,626,631
RAC: 0
Message 33602 - Posted: 24 Oct 2013 | 11:04:08 UTC

Is the jump to 2 GB on a 650 Ti a big deal? You are talking about $50 more once you factor in the lack of a rebate.

Betting Slip
Joined: 5 Jan 09
Posts: 670
Credit: 2,498,095,550
RAC: 0
Message 33603 - Posted: 24 Oct 2013 | 11:27:07 UTC - in response to Message 33602.
Last modified: 24 Oct 2013 | 11:27:53 UTC

Is the jump to 2 GB on a 650 Ti a big deal? You are talking about $50 more once you factor in the lack of a rebate.

Not much for the speed of the card, but there have been WUs here that have taken over 1 GB, or close to 1 GB, of memory, which makes them either slow or impossible to run on a 1 GB card.

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 33680 - Posted: 30 Oct 2013 | 15:24:53 UTC
Last modified: 30 Oct 2013 | 15:26:38 UTC

As SK said, the GT640 DDR3 was never recommended for GPU-Grid, for these reasons:
- It's memory-bandwidth starved. Mine runs GPU-Grid at 60 - 70% memory controller utilization, with the memory already OC'ed (not much headroom). A GTX660Ti is already limited by memory bandwidth and runs at ~40% utilization. At Einstein the GT640 even runs at 99% memory controller utilization!
- GPU-Grid needs results back fast, hence the bonus for returning WUs early. This makes small cards a bad choice over here.
- For a little more money you can get significantly faster cards.

The new GT640 GDDR5 fares a little better thanks to the extra bandwidth, but compared to the GTX650 and higher it's still pretty imbalanced.

And as others have already said: don't buy cards with 1 GB or less for GPU-Grid. That's not sufficient any more, even today.

Regarding the idea of a small cruncher to reduce energy cost: I don't think it's a particularly good one, for the following reasons:

- every computer has a certain idle power draw, no matter if you use it or not. For a modern "econo-box PC" that's 30 - 40 W, which you always have to pay just for the system to be there, no matter how fast or slow it is. If you're talking about 100 W overall power consumption, that's almost half of the power draw "wasted for nothing"! To summarize: the more of your power budget you actually use for crunching, the more efficient the system becomes.

- PSU efficiency peaks around 40 - 60% load. The smallest high-efficiency PSUs are the 400 W FSP Aurum (80+ Gold) and the 450 W Antec EarthWatts (80+ Platinum). But running these at 100 W load (25%) misses their sweet spot. It's not a catastrophe for the small Aurum (~4% loss), but any bigger PSU really starts to suffer, which reduces system efficiency.

- Starting from low-end GPUs there's a range where every x1% more purchase price gets you x2% more performance - with x2 being larger than x1. The GT640 is borderline in this regard (see the sketch after this list):
GT640 | 691 GFlops | 28.5 GB/s | 65€
GTX650Ti Boost | 1505 GFlops | 144.2 GB/s | 115€

The comparison doesn't look too bad on theoretical GFlops: the GTX650Ti Boost gets you 2.2 times the raw performance for 1.8 times the price. But the GT640 can't make proper use of its raw horsepower because it's bandwidth limited, which the 5 times higher bandwidth of the GTX650Ti Boost hints at (although that card has a bit more bandwidth than it needs - the GTX660Ti has just as much, but wants to sustain 2460 GFlops from it). I'd estimate the overall performance difference between the GT640 and the GTX650Ti Boost in the range of 2.5 - 3.0.

- Faster GPUs will have better resale / reuse value.
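
Here is the sketch referenced above, computing the ratios behind that little table (all figures are the ones quoted in this post):

    # GT640 vs GTX650Ti Boost: raw performance, price and bandwidth ratios.
    cards = {
        "GT640":          {"gflops": 691,  "gbps": 28.5,  "eur": 65},
        "GTX650Ti Boost": {"gflops": 1505, "gbps": 144.2, "eur": 115},
    }
    a, b = cards["GT640"], cards["GTX650Ti Boost"]
    print(b["gflops"] / a["gflops"])  # ~2.2x the raw performance...
    print(b["eur"] / a["eur"])        # ...for ~1.8x the price
    print(b["gbps"] / a["gbps"])      # ~5.1x the memory bandwidth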

What I'd do in your case: pimp your primary rig a bit and you may be able to make it fit for 24/7 crunching, or just a bit more than it's doing now (depending on how much you want to invest). By running the faster rig a bit more often you could get the same throughput without having to buy entirely new parts. A few points to consider:

- What PSU do you currently have? Exchanging it for an 80+ Gold model in the 400 - 500 W range (e.g. that 400 W FSP Aurum) could quickly pay for itself by reducing energy cost, depending on what you're currently using.

- Is your CPU heavily overclocked, since you're using a "K"? If so, energy efficiency obviously suffers, a lot. In this case consider taking it back a few steps. My i7 3770K is running at 4.10 GHz at 1.03 V - that's pretty efficient and, for me, OK to run 24/7. And I wouldn't feel a difference compared to e.g. 4.5 GHz at 1.2+ V anyway, except for the additional heat, noise and electricity bill.
If you don't OC: you could lower your CPU voltage significantly below stock (as I have at 4.1 GHz), which should save ~20 W over the stock configuration.

- You could adjust your GTX 570's clock speed and voltage down a bit. Fermi doesn't have much room for improvement here, though. Increasing fan speeds to lower temperatures could also gain you a few watts - if the noise is OK (probably not).

- You could exchange your GPU for a medium-sized Kepler. Performance would stay about the same, while power consumption may be lowered by up to 100 W under load. Running 24/7 this would save you ~200 €/year, which means a GTX660Ti would pay for itself within one year. And it can be power-tuned well by reducing the power target - down to about 100 - 110 W for the card under load (stock power target: 130 W).

- If you're crunching on the CPU you could stop that altogether to get power consumption under control. This depends on your project choices, of course. For some, credits count (-> GPU), while for others WCG badges (CPU time) are holy.

MrS
____________
Scanning for our furry friends since Jan 2002
