New beta application for Kepler is out

Message boards : Graphics cards (GPUs) : New beta application for kepler is out
jlhal
Joined: 1 Mar 10
Posts: 147
Credit: 1,077,535,540
RAC: 0
Message 25030 - Posted: 13 May 2012, 11:01:37 UTC - in response to Message 25026.  
Last modified: 13 May 2012, 11:23:23 UTC

Xubuntu 12.04, up to date (Linux 3.2.0-24, amd64)
Boinc 7.0.27 (x86)
Driver 295.40
GTX 590

I just noticed that Boinc 7.0.27 from the advanced distribution is a beta x86 version, not an x64 one.
Does it make a difference for Beta tasks?
Normal and Long runs seem to run OK with this version of Boinc.
How can I obtain an x64 version of Boinc for Ubuntu?
Where can I obtain the 295.49 version of the driver?
Lubuntu 16.04.1 LTS x64
Rantanplan
Joined: 22 Jul 11
Posts: 166
Credit: 138,629,987
RAC: 0
Message 25032 - Posted: 13 May 2012, 11:33:42 UTC - in response to Message 25030.  

Where can I obtain the 295.49 version of the driver?


geforce.com
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25033 - Posted: 13 May 2012, 11:38:32 UTC - in response to Message 25018.  

For example, even though NVIDIA says their stock boost frequency is 1058, mine came out of the box at 1110.

There's not a single boost clock. It actually goes up in fine increments, depending on conditions. The number quoted by nVidia is the average you can expect, measured using many games and settings. The actual boost clock depends on temperature and power consumption - which both depend on chip quality, as well as the executed code.

MrS
Scanning for our furry friends since Jan 2002
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25035 - Posted: 13 May 2012, 15:12:40 UTC - in response to Message 25026.  


Well, I think that nvidia-settings doesn't FULLY support the GTX680.
I have 4 levels in nvidia-settings (0 to 3); levels 2 and 3 are at the same clock.

I tried the coolbits option, but the only thing I can change is the fan speed.

BTW, my GTX 680 is an EVGA SC+

I am certain the GTX680 isn't fully supported by the drivers yet; otherwise there wouldn't be continual driver updates, and there always are. Most of these updates improve specific functions, often in an attempt to improve performance in certain games, but occasionally they bring additional CUDA capabilities or improved CUDA performance. It's also possible that the actual speeds aren't being reported correctly!

EVGA say your card's boost clock is 1124 MHz, which is reasonably high.

The purpose of suggesting you use the coolbits option to increase fan speeds was to reduce the chance of downclocking due to overheating or using too much power (cooling can reduce the power/voltage required).
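For anyone following along on Linux, here is a minimal sketch of the coolbits route. The Coolbits value and the attribute names vary by driver series, so treat this as an illustration and verify the names on your own system with `nvidia-settings -q all`:

```shell
#!/bin/sh
# Sketch only: enable manual fan control via Coolbits, then raise the fan.
# Step 1 (one-time): add this to the Device section of /etc/X11/xorg.conf
# and restart X:
#     Option "Coolbits" "4"
#
# Step 2: take manual control of the fan from a terminal.
run() {
    # With DRY_RUN=1 the commands are only printed, so the sketch can be
    # inspected on a machine without an NVIDIA GPU.
    if [ "${DRY_RUN:-0}" = "1" ]; then echo "$@"; else "$@"; fi
}

if command -v nvidia-settings >/dev/null 2>&1 || [ "${DRY_RUN:-0}" = "1" ]; then
    run nvidia-settings -a "[gpu:0]/GPUFanControlState=1"
    run nvidia-settings -a "[fan:0]/GPUCurrentFanSpeed=70"
fi
```

The 70 here is a fan-speed percentage; pick whatever keeps your card comfortably below the throttle point.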
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25036 - Posted: 13 May 2012, 15:33:28 UTC - in response to Message 25030.  

Xubuntu 12.04, up to date (Linux 3.2.0-24, amd64)
Boinc 7.0.27 (x86)
Driver 295.40
GTX 590

I just noticed that Boinc 7.0.27 from the advanced distribution is a beta x86 version, not an x64 one.
Does it make a difference for Beta tasks?
Normal and Long runs seem to run OK with this version of Boinc.
How can I obtain an x64 version of Boinc for Ubuntu?
Where can I obtain the 295.49 version of the driver?


The app being tested here is,
ACEMD beta version 6.43 x86_64-pc-linux-gnu (cuda42)

I know that a 32-bit client does allow 64-bit CPU apps to work on 64-bit Windows, but I'm not sure whether a 32-bit client supports a 64-bit GPU app on Linux.

We are testing Beta apps on Beta clients using untested OS with immature drivers!
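One quick way to see what you actually have installed is to ask `file` what kind of ELF a binary is. This is just a sketch; the example paths are hypothetical and vary by install:

```shell
#!/bin/sh
# Sketch: report whether a binary is a 32-bit or 64-bit ELF, using `file`.
arch_of() {
    case "$(file -b "$1" 2>/dev/null)" in
        *"ELF 64-bit"*) echo x86_64 ;;
        *"ELF 32-bit"*) echo x86 ;;
        *)              echo unknown ;;
    esac
}

# Examples (adjust the paths to your own install):
#   arch_of "$(command -v boinc)"
#   arch_of /var/lib/boinc-client/projects/www.gpugrid.net/acemd_beta_6.43
```

If the client reports x86 but the downloaded app reports x86_64, that mismatch is worth raising with the project.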
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 25037 - Posted: 13 May 2012, 15:55:58 UTC

LOL. Got a good chuckle out of that one. Ah being on the front line!!!
wdiz
Joined: 4 Nov 08
Posts: 20
Credit: 871,871,594
RAC: 0
Message 25038 - Posted: 13 May 2012, 16:47:49 UTC - in response to Message 25035.  


Well, I think that nvidia-settings doesn't FULLY support the GTX680.
I have 4 levels in nvidia-settings (0 to 3); levels 2 and 3 are at the same clock.

I tried the coolbits option, but the only thing I can change is the fan speed.

BTW, my GTX 680 is an EVGA SC+

I am certain the GTX680 isn't fully supported by the drivers yet; otherwise there wouldn't be continual driver updates, and there always are. Most of these updates improve specific functions, often in an attempt to improve performance in certain games, but occasionally they bring additional CUDA capabilities or improved CUDA performance. It's also possible that the actual speeds aren't being reported correctly!

EVGA say your card's boost clock is 1124 MHz, which is reasonably high.

The purpose of suggesting you use the coolbits option to increase fan speeds was to reduce the chance of downclocking due to overheating or using too much power (cooling can reduce the power/voltage required).

Well, I tried increasing the fan speed to bring the GPU temp down, but it had no effect: still stuck at 705 MHz.
Anyway, the GPU temp is low, 78°C with the fan at 55% (with my other GTX 580 I was at 63% and 86°C).
So I tried the nVidia beta drivers (302.07), with no more success; the GPU clock is still at 705 MHz according to nvidia-settings.
For your information, the Beta work failed with this driver.
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 25040 - Posted: 13 May 2012, 18:28:07 UTC - in response to Message 25011.  

Wdiz wrote:
The values I have in nvidia-settings are:
Graphics clock: 705 MHz
Memory clock: 3104 MHz
Processor clock: 1411 MHz

This looks almost certainly like nvidia-settings doesn't know about your card. Kepler doesn't have a "hot clock", i.e. it no longer runs the shaders at 2x the GPU core speed. 705 MHz and 1411 MHz differ by a factor of 2, so I suppose 1411 is not a clock your card actually runs at, but rather the shader clock the tool would report for a Fermi card.

This fits the initial Kepler rumors, where the clock speed was reported as ~700 MHz instead of the actual shipping ~1 GHz.
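The factor-of-two tell described above is easy to check mechanically; a small sketch (awk is only used for the arithmetic, and a 1 MHz rounding slack is allowed):

```shell
#!/bin/sh
# Sketch: print "yes" if the reported processor clock is ~2x the graphics
# clock, i.e. the tool seems to be applying the Fermi hot-clock model.
hot_clock_model() {
    # args: graphics_mhz processor_mhz
    awk -v g="$1" -v p="$2" 'BEGIN {
        d = p - 2 * g
        if (d < 0) d = -d
        if (d <= 1) print "yes"; else print "no"
    }'
}

hot_clock_model 705 1411   # prints "yes": 1411 is ~2x 705
```

A GTX 570's real readings (742 / 1484) also come out "yes", which is correct, because Fermi genuinely has a hot clock; the check only flags the pattern, not the card.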

MrS
Scanning for our furry friends since Jan 2002
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 25041 - Posted: 13 May 2012, 18:54:58 UTC
Last modified: 13 May 2012, 19:23:45 UTC

Kepler throttles at both 70°C and 80°C, but not by that much. Yours is not really running at 700 MHz; you should still increase the fan speed though, probably to 65-70%.

Here is a pic from NVIDIA forums to show the throttling in action:

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25044 - Posted: 13 May 2012, 19:41:53 UTC - in response to Message 25041.  
Last modified: 13 May 2012, 19:42:43 UTC

Note that the fan remained at 30% throughout - well, except for the early blip.
When the temperature reached about 70°C the card was throttled: you can see a slight voltage and core-clock drop at around 70°C; before that, the voltage was 1.162 V.

Any benchmark that fails to manage the fans/temps should be dismissed as amateurish. Anyone who runs a 28 nm GPU that hot (70°C+), with low fan speeds, doesn't know what they're doing!
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 25045 - Posted: 13 May 2012, 19:49:55 UTC
Last modified: 13 May 2012, 19:54:47 UTC

Clearly.

They were just showing the throttling in action for this pic. There have been quite a few discussions on the NVIDIA forums about how GPU Boost is a bad idea, and how NVIDIA's "stock" fan profile is horrendous. But as I stated, they just wanted to see what sort of throttle was being applied.

Also, why did NVIDIA choose 70°C and 80°C? 70°C is not all that hot for gamers (think if Fermi had throttled at 70°C), at least for those who don't adjust fan profiles anyway. The thresholds probably should have been more like 80°C and 85°C.

Like Intel, they seem to be making more products for people that have no idea what they're doing. "Dude, did you see how much better their IGP is!!" LOL
Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Message 25058 - Posted: 14 May 2012, 4:45:46 UTC - in response to Message 25045.  
Last modified: 14 May 2012, 4:48:50 UTC

Like Intel, they seem to be making more products for people that have no idea what they're doing.


That's one recipe for success. It worked very well for Gates and Jobs.

Anyway, the beta app runs fine here on my GTX 570 and Linux though I had one heck of a time updating to the 295.40 driver. It and the 295.49 driver hosed my X server about 7 times before I learned you have to suck on a red M&M while installing, not a blue one. How does it know I hate the red ones?

(my machine never sleeps/hibernates so I won't be affected by the bug)
5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 25059 - Posted: 14 May 2012, 4:53:19 UTC

lol, very true.

Nice run times by the way. Stock clocks?
Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Message 25063 - Posted: 14 May 2012, 8:24:46 UTC - in response to Message 25059.  

Thanks.

I have to admit I don't know if the clocks are stock or not. I have never OC'd it but I think Asus may have? The nvidia-settings utility says:

Graphics: 742 MHz
Memory: 1900 MHz
Processor: 1484 MHz
PCIe link width: x16
PCIe Link Speed: 5.0 GT/s
Fan Speed: 85%
Temp: 41 C

It's on a low-end Asus P8H67-M EVO mobo.

The temp is pretty low because the machine is outdoors and it's about 12 C outdoors right now (evening). It'll get up to 58 or 60 C in the daytime.

It runs on Linux which I have heard runs the science app faster than Windows. Maybe that's what gives the nice run times. It makes up for other projects whose app runs slower on Linux.
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25078 - Posted: 14 May 2012, 18:52:37 UTC - in response to Message 25063.  
Last modified: 16 May 2012, 18:07:13 UTC

Just a driver note:
Boinc 7.0.26(x64) reports that my NVidia 295.73 driver for Windows supports CUDA4.2,
GeForce GTX 470 (driver version 295.73, CUDA version 4.20, compute capability 2.0, 1280MB, 1203MB available, 1219 GFLOPS peak)

however NVidia Control Panel says,
NVCUDA.DLL 6.14.12.9573 NVIDIA CUDA 4.1.1 driver

Control Panel also said the 296.10 driver was CUDA 4.1.1.

The 301.24 driver is reported as being 4.2.1 by NVidia but as 4.20 by Boinc.
Ditto for the 301.32 driver (which causes my monitor to auto-adjust repeatedly).

Same with Boinc 7.0.27(x64)
GeForce GTX 470 (driver version 301.32, CUDA version 4.20, compute capability 2.0, 1280MB, 1174MB available, 1219 GFLOPS peak)
OpenCL: NVIDIA GPU 0: GeForce GTX 470 (driver version 301.32, device version OpenCL 1.1 CUDA, 1280MB, 1174MB available)
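For what it's worth, the driver/CUDA pair Boinc reports can be pulled straight out of that event-log line with a one-line sed filter; a sketch (the log location varies by platform and install, so a sample line is piped in here):

```shell
#!/bin/sh
# Sketch: extract "driver version X, CUDA version Y" from a BOINC
# GPU-detection line, as quoted above.
parse_coproc() {
    sed -n 's/.*driver version \([0-9.]*\), CUDA version \([0-9.]*\).*/driver=\1 cuda=\2/p'
}

printf '%s\n' \
  'GeForce GTX 470 (driver version 301.32, CUDA version 4.20, compute capability 2.0)' |
  parse_coproc   # prints "driver=301.32 cuda=4.20"
```

Handy for comparing what Boinc believes against what the control panel or `nvidia-smi -q` shows.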
Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Message 25085 - Posted: 14 May 2012, 22:38:46 UTC - in response to Message 25078.  

You're seeing weird version numbers too, huh? I tried installing the 295.49 driver on Linux, and when I rebooted I got:

ERROR: API mismatch: the NVIDIA kernel module has version 295.40, but this NVIDIA driver component has version 295.49

The boot stopped dead at that point.

Seems like NVIDIA got some version numbers wrong or something. Another odd thing is that their site claims the Linux 295.49 driver is more recent than the 302.07. The former is dated May 3/12, the latter May 2/12 according to http://www.nvidia.com/object/linux_amd64_display_archive.html. It shakes my confidence in them to the point where I'll stay with 295.40, which is good enough for the beta app, until the next driver release.
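The mismatch check itself is simple to reproduce by hand. A sketch, assuming the standard /proc path the nVidia driver exposes on Linux (the exact wording of the version line can vary between releases):

```shell
#!/bin/sh
# Sketch: compare the loaded kernel module's version with the user-space
# driver's version; they must match exactly, or X fails as above.
kernel_module_version() {
    # The file reads roughly:
    #   NVRM version: NVIDIA UNIX x86_64 Kernel Module  295.40  ...
    [ -r /proc/driver/nvidia/version ] &&
        grep -o '[0-9][0-9]*\.[0-9][0-9]*' /proc/driver/nvidia/version | head -n 1
}

versions_match() {
    # args: module_version driver_version
    if [ -n "$1" ] && [ "$1" = "$2" ]; then echo ok; else echo mismatch; fi
}

versions_match "295.40" "295.49"   # prints "mismatch", as in the boot error
```

In this case the fix is making sure the old kernel module is fully removed (or the machine rebooted into the new one) before X starts with the new user-space libraries.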
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25087 - Posted: 14 May 2012, 23:31:00 UTC - in response to Message 25085.  
Last modified: 14 May 2012, 23:39:01 UTC

Same at GeForce:
    Version 295.49
    Release Date Thu May 03, 2012
    Operating System Linux 64-bit
    Language English (US)
    File Size 55.81 MB


Maybe they mixed up 05/03/2012 with 03/05/2012 (US vs UK date order), or they went back to 295.40 and made an update to that release version?

I used the Ubuntu repository version 295.40 and the same for Boinc (7.0.24 running as a daemon). It worked, despite the black screens and resets, so I'm leaving good enough alone.

- Actually, 302.07 is a Beta while 295.49 is a standard release.


Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Message 25089 - Posted: 15 May 2012, 1:05:44 UTC - in response to Message 25087.  

It worked, despite the black screens and resets, so I'm leaving good enough alone.


I'm not sure why you're getting black screens and needing resets. Are you aware there is a problem with BOINC 7.0.24 from the Ubuntu 12.04 repository? Could that be causing the black screens? A temporary solution is given in http://setiathome.berkeley.edu/forum_thread.php?id=67864&nowrap=true#1229806. Their 7.0.27 works fine here.
skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25094 - Posted: 15 May 2012, 10:02:41 UTC - in response to Message 25089.  

I upgraded from 11.10 and had to reinstall the driver (Ubuntu repo). When I rebooted it went to a black screen a few times, restarted itself a couple of times and I had to manually click the reset button a few times. I've installed updates since then, and I might have removed the driver and reinstalled it again (this fixed similar issues in the past). All seems to be well for now, so I will leave it be, but if things start playing up again I will apply said fix, thanks.
GDF
Volunteer moderator
Project administrator
Project developer
Project tester
Volunteer developer
Volunteer tester
Project scientist
Joined: 14 Mar 07
Posts: 1958
Credit: 629,356
RAC: 0
Message 25127 - Posted: 17 May 2012, 9:58:18 UTC - in response to Message 25094.  

A first tentative version for Windows is out.

Let's see if it works.

gdf


©2025 Universitat Pompeu Fabra