Message boards : Graphics cards (GPUs) : Restricting use to one GPU in a multi-GPU system
(Joined 2 Jan 09, Posts 40, Credit 16,762,688, RAC 0)

The preferences of the BOINC manager (version 6.4.5) include an option, "On multiprocessor systems, use at most % of the processors." However, this preference has no apparent effect on the number of GPUs used; it seems to affect only the number of CPUs. How can this application be configured to use only one particular GPU, without rudely seizing _all_ the GPUs available in the system for itself?
Stefan Ledwina (Joined 16 Jul 07, Posts 464, Credit 298,573,998, RAC 0)

As of now that's not possible. Maybe one of the BOINC devs will hear you and eventually understand that GPU resources should be treated differently than CPUs, if you also post this question as a feature request to the BOINC forum...

pixelicious.at - my little photoblog
Paul D. Buck (Joined 9 Jun 08, Posts 1050, Credit 37,321,185, RAC 0)

> As of now that's not possible.

He (or she) should by all means ask this question on the BOINC Dev mailing list as well. Sadly, this was one of the points several of us brought up, and as of the last time I looked (yesterday) the design sketch by Dr. Anderson is not going to address this issue at all ... The unfortunate thing is that I doubt he will ever listen to the participants, no matter how many of us complain ... if the PROJECT complains ... well, that might see some results ...
Bender10 (Joined 3 Dec 07, Posts 167, Credit 8,368,897, RAC 0)

Your 'Restricting use to one GPU....' question caught my attention, but your reason (or lack of a reason) for wanting to 'select' or 'single out' one GPU out of many is missing.

> How can this application be configured to use only one particular GPU, without rudely seizing _all_ the GPUs available in the system for itself?

One reason that comes to mind is gaming. Most people (except those with smokin' fast GPUs, I think...) have found that you play better with BOINC suspended or shut down. Anyway... I tried to single out a GPU last month (I'm running 4+2) when I was helping out on another project, and of course it would not work the way I would have liked... but that's another story. I did confirm that I could run two different GPU projects at the same time, on the same GPU(s), although I took a completion time 'penalty' (a completion time increase) on my GPUGRID WU, as I was sharing GPU cycles between two different projects. Some day (multiple) GPU independence (or will that be non-interdependence... lol) will be a reality, maybe... we can hope.

Consciousness: That annoying time between naps...
Experience is a wonderful thing: it enables you to recognize a mistake every time you repeat it.
(Joined 17 Aug 08, Posts 2705, Credit 1,311,122,549, RAC 0)

> Your 'Restricting use to one GPU....' question caught my attention, but your reason (or lack of a reason) for wanting to 'select' or 'single out' one GPU out of many is missing.

I can imagine people wanting to have GPU 1 render their desktop and GPU 2 do the crunching, to avoid lag.

MrS
Scanning for our furry friends since Jan 2002
Paul D. Buck (Joined 9 Jun 08, Posts 1050, Credit 37,321,185, RAC 0)

It also depends on the game ... I used to run EverQuest on three different systems with varying capabilities and I never noticed a problem ... of course, that was then and this is now, with a significantly different BOINC Manager ... so my experience may be out of date ... Though I have run Spore and did not notice any problems ... I suppose if you are playing a high frame rate game (a first-person shooter, I suppose) this may be an issue ... or an urban legend ... :) Anyway, I do know that many always did shut down BOINC, and several would complain days later that they forgot to start it up again when they quit playing ... but that is a different issue ... I have not tried games with BOINC running a CUDA task yet, though ... too many other things on my limited mind ...
(Joined 17 Aug 08, Posts 2705, Credit 1,311,122,549, RAC 0)

Never mind the BOINC CPU load during games; the problem is much more severe when crunching CUDA, though it gets less pronounced the faster the card is.

MrS
Scanning for our furry friends since Jan 2002
Paul D. Buck (Joined 9 Jun 08, Posts 1050, Credit 37,321,185, RAC 0)

> Never mind the BOINC CPU load during games; the problem is much more severe when crunching CUDA, though it gets less pronounced the faster the card is.

Maybe it is a good thing I gave up games for no good reason ... :) Well, I will be trying one soon enough ... though at the moment I am playing too much with BOINC and don't seem to have the spare time ...
koschi (Joined 14 Aug 08, Posts 127, Credit 913,858,161, RAC 15)

The advantage of restricting BOINC to just one GPU applies not only to gaming. Think about recent motherboards based on CUDA-capable NVIDIA IGP chipsets. Those might have 16 shader units at a low clock speed, but as BOINC itself doesn't account for this, it will start a WU on the IGP that will fail to meet the deadline. I had actually thought about buying a new NVIDIA-based µATX board with an integrated GeForce 9300 or 9400. That would be enough to power my Linux desktop with composite effects enabled; right now this is still handled by my 8800GTs, and often enough it's kind of choppy :-/ I have dropped the idea for now...
(Joined 24 Dec 08, Posts 738, Credit 200,909,904, RAC 0)

> The preferences of the BOINC manager (version 6.4.5) include an option, "On multiprocessor systems, use at most % of the processors." However, this preference has no apparent effect on the number of GPUs used; it seems to affect only the number of CPUs.

Over on SETI we were using app_info.xml files and had to specify the CUDA part, because we had optimized apps. Under the app_name we put:

```xml
<coproc>
    <type>CUDA</type>
    <count>1</count>
</coproc>
```

This should limit it to a single GPU. You'll need all the other bits for the app_info.xml if you are going to build one. This is not something for the inexperienced: if you get it wrong, it will usually wipe any work units on your local machine straight away, so it is best to upload and report results before starting.
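For context, "all the other bits" refers to the rest of BOINC's anonymous-platform app_info.xml structure. A minimal sketch is below; the app name, version number, and file names are placeholders and must match the files the project actually distributes, so do not copy it verbatim.

```xml
<!-- Hypothetical sketch of a minimal app_info.xml. The app name,
     version number, and executable file names are placeholders;
     they must match what the project actually ships. -->
<app_info>
    <app>
        <name>example_app</name>
    </app>
    <file_info>
        <name>example_app_6.00.exe</name>
        <executable/>
    </file_info>
    <app_version>
        <app_name>example_app</app_name>
        <version_num>600</version_num>
        <coproc>
            <type>CUDA</type>
            <count>1</count>  <!-- claim one GPU per task -->
        </coproc>
        <file_ref>
            <name>example_app_6.00.exe</name>
            <main_program/>
        </file_ref>
    </app_version>
</app_info>
```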
(Joined 2 Jan 09, Posts 40, Credit 16,762,688, RAC 0)

The system's dedicated GPUs were not obtained for the sole purpose of running BOINC. When the time comes for another program to have full access to a GPU, BOINC need not be killed if it can cooperate by restricting itself to a spare GPU. Also, attempting to complete work units on the relatively weak GPU integrated into the mainboard seems a bit daft, considering that deadlines tend to expire. Thus, restricting usage to certain GPU devices is desirable. If the BOINC application interface provides no help, is there another way to limit GPU usage via another layer of GPUGRID?
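One candidate for such an "other layer" is the CUDA runtime itself: later CUDA releases honor a CUDA_VISIBLE_DEVICES environment variable that hides devices from any CUDA application launched with it set. A hedged sketch, with the caveats that the client path is a placeholder and that 2009-era runtimes may not yet support the variable:

```shell
# Hypothetical sketch: hide all but one GPU from a CUDA application by
# setting CUDA_VISIBLE_DEVICES before launching it. Device indices
# follow the CUDA enumeration order; "./boinc_client" is a placeholder.
# Note: this variable arrived in later CUDA releases and may be ignored
# by 2009-era drivers.
export CUDA_VISIBLE_DEVICES=1   # expose only device 1 to CUDA apps
echo "$CUDA_VISIBLE_DEVICES"    # child processes inherit this value
# ./boinc_client                # the client would now see a single GPU
```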
Paul D. Buck (Joined 9 Jun 08, Posts 1050, Credit 37,321,185, RAC 0)

> Thus, restricting usage to certain GPU devices is desirable.

This and other issues like it have been raised with the developers; now it remains to be seen whether they make any changes. This is where the project types can really make a difference. Participants urging features don't seem to make much of an impact; projects insisting on features do ...
(Joined 2 Jan 09, Posts 40, Credit 16,762,688, RAC 0)

Apparently, no user-friendly method exists to tell BOINC to stay off particular GPUs, aside from shutting the whole process down. Since GPUGRID assigns one process per GPU, manually suspending a process remains an option; the drawback of this workaround is that the associated work unit becomes frozen. Regarding the integrated GPU, disabling it entirely remains an alternative to manually freezing the associated GPU process every single time BOINC is started. Yet disabling the integrated GPU denies an otherwise available GPU to other applications (such as high-definition video), thus reducing overall system performance at full load. Meanwhile, none of these workarounds or annoying trade-offs need be considered for applications that offer a method of excluding specific GPU devices.
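On Linux, the per-process suspension described above can be scripted with standard job-control signals; a minimal sketch, using a `sleep` process as a stand-in for the actual GPUGRID process (whose PID would normally come from `ps` or `top`):

```shell
# Freeze and later resume a process by PID. SIGSTOP halts it in place
# (the work unit stays frozen in memory, not lost); SIGCONT resumes it.
# 'sleep' stands in here for the real GPUGRID process.
sleep 60 &
pid=$!
kill -STOP "$pid"       # freeze the process
ps -o stat= -p "$pid"   # process state starts with 'T' (stopped)
kill -CONT "$pid"       # let it continue where it left off
kill "$pid"             # clean up the stand-in process
```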
Paul D. Buck (Joined 9 Jun 08, Posts 1050, Credit 37,321,185, RAC 0)

> Apparently, no user-friendly method exists to tell BOINC to stay off particular GPUs, aside from shutting the whole process down.

I proposed these controls during the code rewrite going on now, but got a response that seems to indicate we will not be seeing something like this anytime soon. At the moment it seems we are not going to get any realistic GPU controls like the one that lets you turn off one or more CPUs in a multi-CPU system ... What is special about GPUs is that just "blindly" saying "turn one off" means you are still at the mercy of the system, and it will inevitably pick the wrong GPU every time ...
(Joined 17 Aug 08, Posts 2705, Credit 1,311,122,549, RAC 0)

From my point of view, the problems which the BOINC team has tried to address in the last months (with limited success) seem almost trivial compared to this ... so I wouldn't hold my breath for a quick solution either.

MrS
Scanning for our furry friends since Jan 2002
Paul D. Buck (Joined 9 Jun 08, Posts 1050, Credit 37,321,185, RAC 0)

> From my point of view, the problems which the BOINC team has tried to address in the last months (with limited success) seem almost trivial compared to this ... so I wouldn't hold my breath for a quick solution either.

One of the reasons is that Dr. Anderson has a long history of not allowing people to contribute changes ... if it is not in his vision, it won't be allowed in ... which is a mixed bag if he has a good vision ... but controls that are much desired by users can be continually left out because he does not see the need ... Yes, you can download the sources and build your own BOINC Manager, but that requires programming experience, and then you have to maintain the code forever ...
©2025 Universitat Pompeu Fabra