Milkyway@home on ATI cards


ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 6361 - Posted: 3 Feb 2009, 20:48:38 UTC

DoctorNow wrote:
It's now also possible to run a new custom-made MilkyWay@Home app on your GPU, but currently ONLY with an ATI card and a 64-bit Windows system.
You can read more details in this thread.


Thought I'd just inform you, as it surely gets overlooked in the other thread. But be warned: currently it's really in a pre-alpha stage, so buying a card just for that wouldn't be fun. But if you've already got an HD38x0 (64 shader units) or HD48x0 (160 units), you might want to check it out. The speed is ridiculous :)

Paul D. Buck wrote:
If they get it out the door soon I might just get a couple of the lower end ATI cards that can handle it just for the mean time till they get the Nvidia version done


An NV version is not going to happen anytime soon, as they use double precision exclusively. You may remember that NV included 30 double-precision units in GT200 along with the 240 single-precision shaders. Well, ATI's RV770 has 160 5-way VLIW units, and all of them can run 1 or 2 doubles each clock. That's such a massive advantage that it just plain wouldn't make sense to use NV cards here.

MrS
Scanning for our furry friends since Jan 2002
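The unit counts MrS quotes can be turned into a rough per-clock comparison. A minimal sketch (Python; the "1 or 2 doubles each clock" figure is taken from the post above, and clock speeds are deliberately left out since the two chips don't run at the same frequency):

```python
# Peak double-precision operations available per clock, using the post's unit counts.
nv_dp_per_clock = 30           # GT200: 30 dedicated double-precision units
ati_dp_per_clock = 160 * 2     # RV770: 160 VLIW units, up to 2 doubles each

ratio = ati_dp_per_clock / nv_dp_per_clock
print(f"ATI can issue ~{ratio:.1f}x as many double ops per clock")
```

Even before clock speeds enter the picture, the roughly tenfold gap in issue width explains why a CUDA port looked unattractive at the time.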
Paul D. Buck

Message 6368 - Posted: 3 Feb 2009, 21:46:30 UTC

Which may mean that I will have a mix of GPUs in my future ...

I don't mind; I have had good luck with ATI cards (and with Nvidia ... six of one ... two dozen of the other ... or something).

Oh, I got the GTX 295 card today, a day early ... the cards have all been shuffled about ... and I have 4 in flight on the i7, which is nice ...

The wall draw of the pair of 295s is the same as the draw of the 295 and 280 together ... just for your information ...

Now the only thing raining on my life is the disk problem on the Mac Pro ... sigh ... less than 10% free space, the disk utilities are crashing, and there is an error on the disk ...

So I have 6 new 1.5 TB drives on the way ... how long will it take me to fill up a 3-something TB RAID 5 array? ... ah well ... several days of file moving, installing and configuring the OS ... sigh ...
MarkJ
Volunteer moderator
Volunteer tester

Message 6388 - Posted: 4 Feb 2009, 11:11:30 UTC - in response to Message 6361.  



One of the guys mentioned this on BOINC_dev a while back. I recall Dr A asking for details so he could add ATI support to BOINC. Whether he got the details, I don't know.
BOINC blog
localizer

Message 6389 - Posted: 4 Feb 2009, 11:40:59 UTC - in response to Message 6388.  

.... before we get too excited, doesn't the GPU work hit the credit/hour limit? Surely, since you are using a GPU to crunch a CPU WU (as opposed to what happens here), the credits generated will be subject to the same limits.

Nice of Dr A to show an interest in ATI - I hope he has the time to fit that in with all the other items he is currently juggling/fumbling.
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 6404 - Posted: 4 Feb 2009, 20:06:35 UTC - in response to Message 6389.  

One of the guys mentioned this on BOINC_dev a while back. I recall Dr A asking for details so he could add ATI support to BOINC. Whether he got the details, I don't know.


I read about that a few days ago as well. They found something that would help him, but I don't know about any further progress either.

.... before we get too excited, doesn't the GPU work hit the credit/hour limit? Surely, since you are using a GPU to crunch a CPU WU (as opposed to what happens here), the credits generated will be subject to the same limits.


Exactly. That's what I had in mind when I said "Buying a card for that wouldn't be fun." The credit rules could change any day, though.

MrS
Scanning for our furry friends since Jan 2002
pharrg

Message 6637 - Posted: 14 Feb 2009, 17:56:24 UTC

Something to keep in mind when building a new system: there are several motherboards out now that support BOTH nVidia SLI and ATI CrossFire in a single system. You could conceivably have multiple cards of each running simultaneously... if you have the money to buy all those cards...
Paul D. Buck

Message 6638 - Posted: 14 Feb 2009, 18:43:18 UTC - in response to Message 6637.  

Something to keep in mind when building a new system: there are several motherboards out now that support BOTH nVidia SLI and ATI CrossFire in a single system. You could conceivably have multiple cards of each running simultaneously... if you have the money to buy all those cards...


In a system with two PCI-e, one of each ...
Three PCI-e, two of one, one of the other ...
Four PCI-e, two and two ...
Five PCI-e, major electrical fire!
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 6645 - Posted: 14 Feb 2009, 19:26:56 UTC
Last modified: 14 Feb 2009, 19:27:50 UTC

But you wouldn't need SLI or CrossFire for crunching; in fact, you have to disable SLI anyway (I don't know about CrossFire). So the actual limit is rather the amount of power, cooling and space that you can provide.. ;)
Edit: oh, and I don't know what Windows does if you mix multiple cards that require different drivers.

MrS
Scanning for our furry friends since Jan 2002
Paul D. Buck

Message 6648 - Posted: 14 Feb 2009, 21:34:51 UTC - in response to Message 6645.  

Edit: oh, and I don't know what Windows does if you mix multiple cards that require different drivers.


In theory - as I have not tried it yet - you just install the card and the appropriate drivers, and Windows should be happy. Hard to say whether there is a hardware or driver incompatibility that would cause clashes, though.

In my case, I would likely take the conservative position and allocate cards to machines, keeping all ATI in one and all Nvidia in the others...

Though this is all in the future, in that the first ATI application is not really ready for prime time and wide distribution on a variety of systems. As best I can tell, I have neither the right card nor the right OS in my collective.

Unlike some, though, I can wait ... heck, I still have to digest the 4 new GPUs that I have installed in the last two months ...
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 6656 - Posted: 15 Feb 2009, 11:58:43 UTC

Apparently, with some fiddling around, you can run a game on an ATI card and use an nVidia card for PhysX. So.. there's hope for a heterogeneous GPU landscape ;)

MrS
Scanning for our furry friends since Jan 2002
Jeremy

Message 6676 - Posted: 16 Feb 2009, 16:18:24 UTC - in response to Message 6656.  

Mixing video card brands in the same box only really works in Vista and Windows 7 atm. Don't even think of trying it in XP of any flavor: from what I've read, it won't be happy with two different display drivers fighting each other behind the scenes.
Paul D. Buck

Message 6680 - Posted: 16 Feb 2009, 18:12:41 UTC - in response to Message 6676.  

Mixing video card brands in the same box only really works in Vista and Windows 7 atm. Don't even think of trying it in XP of any flavor: from what I've read, it won't be happy with two different display drivers fighting each other behind the scenes.


Well, that answers that ...

I did not have an ATI card of note that would have allowed me to test this, and now I don't have to ...

Though I am tempted to get one for my sole 64-bit machine so that I can take part in the GPU revolution happening at MW ...
nico342

Message 6813 - Posted: 20 Feb 2009, 13:49:42 UTC - in response to Message 6680.  
Last modified: 20 Feb 2009, 13:50:40 UTC

Does anyone know what is required to run the Milkyway@home project on an ATI GPU? I'm asking because I own an ATI HD 2400 Pro.

Thanks
Edboard

Message 6814 - Posted: 20 Feb 2009, 14:07:07 UTC - in response to Message 6813.  

You need an ATI card with an RV670 chip or newer: HD38x0, HD4670 (?) and HD48x0.
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 6818 - Posted: 20 Feb 2009, 19:03:31 UTC - in response to Message 6814.  

Last time I checked, the 46xx series didn't work; only the 38xx and 48xx did.

MrS
Scanning for our furry friends since Jan 2002
Temujin

Message 6863 - Posted: 22 Feb 2009, 10:54:04 UTC - in response to Message 6818.  
Last modified: 22 Feb 2009, 10:55:36 UTC

Would anyone like to comment on the huge difference in credit awarded by the MilkyWay ATI GPU app and the GPUGrid GPU app?

For example, my GTX260-216 returns about 13,000 credits a day at GPUGrid, while my ATI HD4870 returns 77,000 credits a day at MilkyWay.
I don't know exactly how the two cards compare, but I don't imagine they are miles apart in performance.

Possible reasons:
1. Some exceptionally efficient coding of the ATI GPU app by Gipsel
2. Milkyway awarding a higher than average credit return (despite recent adjustments)
3. Inefficient coding of the GPUGrid GPU app
4. GPUGrid having a lower than average credit award
5. ATI cards are just better at Milkyway WUs than NVidia cards are at GPUGrid WUs

I'm not suggesting a change in credit awards; I'm just puzzled by what appears to be a huge difference between what I would think are similar cards.
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 6865 - Posted: 22 Feb 2009, 11:39:55 UTC - in response to Message 6863.  

1. Some exceptionally efficient coding of the ATI GPU app by Gipsel

Yes.
2. Milkyway awarding a higher than average credit return (despite recent adjustments)

Yes.
3. Inefficient coding of the GPUGrid GPU app

No.
4. GPUGrid having a lower than average credit award

Can't say for sure.
5. ATI cards are just better at Milkyway WUs than NVidia cards are at GPUGrid WUs

In some sense.. yes.

OK, that was the short version. Let me elaborate a bit:

Milkyway is exceptional in that its algorithm is perfectly suited to GPUs. The ATIs almost reach their peak FLOPS, a very rare case [if you intend to do anything useful in your code ;) ]. I think MW is still giving out a bit too many credits for CPUs.. now throw in the high-end GPUs, which are at least one order of magnitude faster, and you get a complicated situation.

The main problem is: awarding credits according to the benchmark was an idea which was bound to fail in practice. Now we have FLOP counting.. which leads to another problem: with a small and very well-tuned app like MW, you will automatically extract higher FLOPS from your hardware than with more complex code. You could say that the hardware runs this code more efficiently. So.. should we all only run apps like MW and neglect the rest, because they give many more credits per time? I don't think this is what BOINC is made for, and I don't see a solution yet.

And a side note: running SETI@home on NV, the current draw and temperatures are lower than under GPUGrid, so you can be sure that GPUGrid is not coded inefficiently ;)
And we won't get an apples-to-apples comparison between the ATI and NV cards here, because MW runs double precision, where NV cards are really weak. Their crunching power is roughly comparable at singles, though.

MrS
Scanning for our furry friends since Jan 2002
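The "well-tuned app extracts higher FLOPS" point can be made concrete with toy numbers. A sketch (Python; the peak and efficiency figures below are invented for illustration, not measurements of either project):

```python
def credits_per_hour(peak_gflops, efficiency, credit_per_gflop=1.0):
    """FLOP-counted credit is proportional to FLOPs actually executed."""
    return peak_gflops * efficiency * 3600 * credit_per_gflop

# Same card, same hour -- only the fraction of peak actually extracted differs.
tuned = credits_per_hour(1000, 0.90)     # MW-style hot loop running near peak
complex_ = credits_per_hour(1000, 0.20)  # more complex code, lower utilization
print(f"the tuned app earns {tuned / complex_:.1f}x the credit")
```

Under pure FLOP counting, the ratio of credit earned is exactly the ratio of hardware utilization, which is the dilemma MrS describes: the simplest, most cache-friendly workloads pay best.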
STE\/E

Message 6870 - Posted: 22 Feb 2009, 14:42:04 UTC
Last modified: 22 Feb 2009, 15:20:34 UTC

It's definitely going to draw some people away from the GPUGrid project no matter what. If you can get 60,000 to 70,000 per day versus 10,000 to 13,000 per day here (not counting the GTX 295s), what are you gonna do?

Even the GTX 295s are only capable of about 25,000, yet cost 2 1/2 to 3 times what an ATI 4870 does, so it stands to reason to go with the ATIs and the project that can use them. I won't lessen my participation here at the moment, but I have already ordered 2 ATI 4870s and will order more as needed, and if need be shut the NVidias down over time to save on electricity costs.

The word is that an Nvidia application will be out at the MilkyWay project, but it hasn't shown up yet. The word is also that the NVidia application will be 3-4 times slower, so a single ATI 4870 will be able to produce as much as 2 GTX 295s at a quarter or less of the cost ...
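Taking the figures in this post at face value, the economics are easy to sketch (Python; the dollar prices are assumptions derived from the "2 1/2 to 3 times" remark, not actual quotes, and the credit rates are the rough per-day numbers quoted above):

```python
# (credits per day, assumed street price in USD) per the post's figures.
cards = {
    "HD 4870 @ MilkyWay": (65_000, 200),  # 60-70k/day; price assumed ~$200
    "GTX 295 @ GPUGrid": (25_000, 550),   # ~25k/day; 2.5-3x the 4870's price
}
for name, (credits, price) in cards.items():
    print(f"{name}: {credits / price:.0f} credits/day per dollar spent")
```

Even with generous rounding, the per-dollar gap is large enough that the conclusion of the post does not hinge on the exact prices.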
UBT - Ben

Message 6874 - Posted: 22 Feb 2009, 15:18:45 UTC - in response to Message 6870.  

The only Nvidia cards which will be able to take part, if MW does build a CUDA app, are the GTX 200 series, as they support double precision data, i.e. 12.3984958.

However, even then, as poorboy has said, the GTXs won't be able to get anywhere near the top ATI cards' performance.
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Message 6888 - Posted: 22 Feb 2009, 17:49:41 UTC - in response to Message 6874.  

The only Nvidia cards which will be able to take part, if MW does build a CUDA app, are the GTX 200 series, as they support double precision data, i.e. 12.3984958.


You're right, apart from the fact that your example only shows a floating-point number, not specifically a double-precision one ;)

G80, G92 etc. are fine with single-precision floating point, where numbers are represented by 32 bits. Double precision requires 64 bits, which these chips can't do. If I remember correctly, a GTX 280 can do about 30 GFlops in double precision, whereas an RV770 can do 200+. If MW goes CUDA, the credits will reflect this ratio.

The real problem with FLOP counting is this: if you create an app which just runs a stupid loop in the CPU caches and maximises resource utilisation, you would get the most credits per time for just running this application. MW is such an application, except for the fact that they actually do science in their highly optimized / efficient hot loop. So how are you going to award credits here? Based on FLOPS, you have to give out many more..

MrS
Scanning for our furry friends since Jan 2002
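MrS's correction, that a decimal number like 12.3984958 says nothing about precision by itself, is easy to demonstrate. A small sketch in Python, where floats are already 64-bit doubles and `struct` is used to force the value through a 32-bit single:

```python
import struct

x = 12.3984958  # stored as a 64-bit double by Python

# Round-trip through a 32-bit single-precision float: the 24-bit
# mantissa keeps only ~7 significant decimal digits of the value.
x32 = struct.unpack("f", struct.pack("f", x))[0]

print("as double:", repr(x))
print("as single:", repr(x32))  # slightly off from x
```

The single-precision copy comes back visibly rounded, which is exactly why MW's double-precision math rules out the pre-GT200 chips.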