Message boards : Graphics cards (GPUs) : Will the price of Ampere GPUs drive down the value of their predecessors?

Pop Piasa
Joined: 8 Aug 19
Posts: 252
Credit: 458,054,251
RAC: 0
Message 55664 - Posted: 1 Nov 2020 | 18:44:25 UTC

With the RTX 3070 priced at US$500, what will happen to used-card prices? I started this thread to gather opinions.
It also seems to me that NVidia won't be selling many Turings, Pascals, etc. at their current prices. Am I missing some marketing insight?

bozz4science
Joined: 22 May 20
Posts: 110
Credit: 114,275,746
RAC: 195,156
Message 55666 - Posted: 1 Nov 2020 | 18:54:36 UTC

Great question. I was just wondering about the very same thing. I made a post a couple of minutes earlier saying that the RTX 3070 seems rather reasonably priced, IMO. Especially comparing the initial performance benchmarks (vs. the 2080/2080 Ti), I can't help but wonder how those prior-generation cards can still hold their prices. With this year's fierce competition from the recently launched AMD cards, and probably much better availability, which will be an important factor in NVIDIA's demand (Black Friday, Christmas), I only see one way... Prices of prior generations have to come down as soon as NVIDIA's current-gen RTX 30xx series cards reach a steady supply. And competition from AMD will only help drive prices down in the future, which is great for us consumers.

In the end, most cards aren't sold to crunchers, so other clienteles will drive overall demand, which will in turn dictate NVIDIA's prices in accordance with whatever the AMD competition does. Just my two cents...

Ian&Steve C.
Joined: 21 Feb 20
Posts: 1078
Credit: 40,231,533,983
RAC: 27
Message 55669 - Posted: 1 Nov 2020 | 20:31:18 UTC - in response to Message 55664.

Unfortunately, supply and demand still come into play. Prices of the old cards will remain high if you can't even buy the new cards. A new card could be $100, but if there's no stock, it's not really relevant.

But I expect prices to come down when stock levels normalize. I'd say the 2080 Ti will probably settle around the same $500 mark. Even though it's older, it has more VRAM and will therefore retain value for those who need the VRAM.

Pop Piasa
Message 55670 - Posted: 1 Nov 2020 | 22:20:22 UTC - in response to Message 55669.
Last modified: 1 Nov 2020 | 22:56:42 UTC

Ian, I agree that memory is a major factor in the price of a graphics card.

I recently completed a survey from NVidia, and in the comments section I requested that they make cards for science crunchers with 2 GB of 256-bit GDDR6, similar to the GTX 1060 3GB mining cards (no monitor interface), and give us a price break on them.

Probably got ignored.

(edit)
I wish cards had 2 or 3 GB onboard and a slot for additional memory. Is that asking for the moon?

Ian&Steve C.
Message 55671 - Posted: 1 Nov 2020 | 22:38:44 UTC - in response to Message 55670.

Some projects still use a large portion of VRAM. Einstein, for example: some of their gravitational-wave tasks can use upwards of 3 GB per task. And in some cases people like to run multiple WUs at a time. So even though GPUGRID doesn't use a large amount of VRAM, some projects still do.
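As a rough sanity check on the point above, you can estimate how many concurrent tasks fit in a card's VRAM. This is just a sketch of the arithmetic: the ~3 GB per-task figure comes from the Einstein example, while the driver-reserve value is my own assumption.

```python
def max_concurrent_tasks(vram_gb, per_task_gb, reserve_gb=0.5):
    """Rough estimate of how many GPU tasks fit in VRAM at once.

    reserve_gb accounts for memory the driver/display keeps for itself
    (an assumed figure; actual overhead varies by system).
    """
    usable = vram_gb - reserve_gb
    return max(0, int(usable // per_task_gb))

# A 4 GB card running ~3 GB Einstein gravitational-wave tasks:
print(max_concurrent_tasks(4, 3))    # 1 task
# An 11 GB 2080 Ti running the same tasks:
print(max_concurrent_tasks(11, 3))   # 3 tasks
```

This is why the 2080 Ti's 11 GB matters to some crunchers even when GPUGRID itself needs far less.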

Pop Piasa
Message 55672 - Posted: 1 Nov 2020 | 23:39:26 UTC - in response to Message 55671.

Thanks for that info. I tried to join Einstein but never got the promised email response. Maybe because I don't have any cards with more than 4 GB.

I've never yet run short of GPU memory on anything I've run (as an old-guy newbie to GPU computing), so I'm no doubt guilty of generalizing for lack of better knowledge of the memory required by the most intensive apps.

I am always grateful for any tutoring so thanks again.

Keith Myers
Joined: 13 Dec 17
Posts: 1358
Credit: 7,894,622,824
RAC: 6,651,820
Message 55678 - Posted: 2 Nov 2020 | 7:29:55 UTC

If you run Nvidia cards, the drivers come with a utility named nvidia-smi, installed on both Windows and Linux systems.

It is a terminal application that will show you the card's memory usage, fan speed, power state, and temperature, as well as its utilization percentage.

A useful tool to see how hard a project's GPU tasks work the card.
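For scripting, nvidia-smi's CSV query mode is handy for reading the same values Keith lists. Here is a minimal Python sketch: the query field names are real nvidia-smi options, but the parsing helper and the sample line are my own illustration.

```python
import subprocess

# Real nvidia-smi query fields for memory, fan, power, temp, utilization.
FIELDS = "memory.used,fan.speed,power.draw,temperature.gpu,utilization.gpu"

def parse_smi_line(line):
    """Turn one CSV line from nvidia-smi into a dict of floats."""
    keys = FIELDS.split(",")
    values = [float(v) for v in line.split(",")]
    return dict(zip(keys, values))

def query_gpus():
    """Run nvidia-smi in CSV mode; requires an NVIDIA driver install."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [parse_smi_line(l) for l in out.strip().splitlines()]

# Parsing a canned sample line (MiB used, fan %, watts, deg C, util %):
print(parse_smi_line("2048, 55, 180.25, 67, 98"))
```

On a real system you would call `query_gpus()` instead of feeding in a canned line; it returns one dict per installed GPU.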

On the AMD side, there is a very useful utility named gpu-utils, by a BOINC developer, that shows the same things. It can actually now show parameters for both AMD and Nvidia cards. As a bonus, it offers control of power, voltage, clocks, and fan speeds on AMD.

gpu-utils

Pop Piasa
Message 55682 - Posted: 2 Nov 2020 | 17:01:43 UTC - in response to Message 55678.

> If you run Nvidia cards, the drivers come with a utility named nvidia-smi. Installed on both Windows and Linux systems.

Thanks Keith, I'll check that out.
Do you run it instead of the Afterburner hardware monitor?

Keith Myers
Message 55683 - Posted: 2 Nov 2020 | 19:00:34 UTC - in response to Message 55682.

No, I don't run Windows. But I do use the nvidia-smi utility all the time.

I was just commenting that it is available in Windows too; most Windows users have no clue that it is installed by default and quite handy.

The standard Windows GPU monitoring tools provide more functionality, but a terminal program is very useful on a headless system.

Ian&Steve C.
Message 55685 - Posted: 2 Nov 2020 | 19:19:07 UTC - in response to Message 55682.

Keith doesn't run Windows, so he doesn't run Afterburner. I'm in the same boat. I'm pretty averse to installing software I don't need on my crunching systems; I like to keep their software payload as minimal as possible. So I'm content with just using psensor for monitoring the cards, and a custom script calling nvidia-smi to do things like power limiting and overclocking.
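A script of the kind described above might look like the sketch below. The `-pl` (power limit) and `-lgc` (lock GPU clocks) switches are real nvidia-smi options; the helper names, GPU index, wattage, and clock values are made-up examples, and applying them requires root privileges and a recent driver.

```python
import subprocess

def power_limit_cmd(gpu_index, watts):
    """Build the nvidia-smi call that caps a GPU's power draw (-pl)."""
    return ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)]

def lock_clocks_cmd(gpu_index, min_mhz, max_mhz):
    """Build the nvidia-smi call that pins the core clock range (-lgc)."""
    return ["nvidia-smi", "-i", str(gpu_index),
            "-lgc", f"{min_mhz},{max_mhz}"]

def apply(cmd):
    """Actually run one of the commands above; needs the NVIDIA driver
    installed and root privileges."""
    subprocess.run(cmd, check=True)

# Example: cap GPU 0 at 200 W (values are illustrative, not tuned).
print(power_limit_cmd(0, 200))
```

On a crunching box you would call `apply(power_limit_cmd(0, 200))` from a startup script so the limit is reapplied after every reboot.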

But back to the mining cards. I get the impression that it doesn't make financial sense for the big manufacturers to make high-end cards with such small VRAM amounts. There isn't a lot of demand (relatively) for such a product to justify the expense of developing and manufacturing it.

Also note that in conventional designs (excluding HBM), memory bus width scales with the number of memory modules on the board: the RTX 2060 with 6x1GB modules has exactly 75% (6/8) the bus width of the 2070/2080 with 8x1GB modules, and the 2070/2080 have exactly 72.727272% (8/11) the bus width of the 2080 Ti with 11x1GB modules. To make a 2GB card with a 256-bit bus, you'd need 8x256MB modules, and it's very likely that no one even makes a GDDR6 module in that size, because why bother? And if they did, it might not be much cheaper than making a 1GB module, which brings us back to "why bother?"
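The scaling rule in the post can be written down directly: each conventional GDDR6 module sits on its own 32-bit channel, so bus width is just 32 times the module count. This is only a sketch of the arithmetic above, not vendor data.

```python
def bus_width_bits(module_count, bits_per_module=32):
    """Conventional (non-HBM) designs: one 32-bit channel per module."""
    return module_count * bits_per_module

# The configurations from the post:
rtx_2060   = bus_width_bits(6)   # 6x1GB  -> 192-bit
rtx_2080   = bus_width_bits(8)   # 8x1GB  -> 256-bit
rtx_2080ti = bus_width_bits(11)  # 11x1GB -> 352-bit

print(rtx_2060 / rtx_2080)       # 0.75      (6/8)
print(rtx_2080 / rtx_2080ti)     # 0.7272... (8/11)
```

The RTX 3090 is the exception noted next: its 24 modules share 12 channels in pairs, so the rule gives the channel count rather than the module count.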

The RTX 3080 vs. RTX 3090 is a slight exception. You can tell by the numbers that they are running these cards at a 10:12 bus-width ratio, instead of the 10:24 you might expect from the module configs. They are essentially running 2x1GB modules per channel on the 3090; my guess is that the GPU core just doesn't have 24 memory inputs, so they are forced to run it this way. It seems the 16Gb (2GB) GDDR6X modules aren't available yet either.

Pop Piasa
Message 55686 - Posted: 3 Nov 2020 | 2:32:17 UTC - in response to Message 55685.

Thanks for explaining the memory configs, I was clueless about that. Now I'm happily semi-clueless with some learning to do.
