Less than a year after Ethereum's transition to proof of stake sent GPU demand – and therefore prices – plummeting, the new AI boom gives nVidia another reason to gimp the performance and longevity of its new cards while keeping the usual price premium. Why fight on price in the consumer space when the same silicon can be sold at much higher prices to AI companies?
Compared to mining, this new market is far more “legitimate” and stable, so it attracts entirely different players, at a much larger scale and with a completely different business model.
Crypto's fluctuating prices and naturally decentralized model meant that ordinary gaming GPUs were the target: mining rigs made sense even on consumer or only slightly customized hardware (mining-ready-just-in-case desktop motherboards like this B660 DS3H), and investors needed a way to quickly recoup their outlay in case of a crash by selling the cards back to the gaming market.
AI businesses, on the contrary, are naturally run by bigger companies using enterprise-level hardware. This clear segmentation has two key advantages: it commands much higher prices, and it prevents those cards from ever finding their way to budget-conscious gamers (not even Tesla M40 style).
The shaping of a new market
You see, when the crypto market crashes – as it did in 2018 and again in 2022 – a flood of used cards fills the low-end demand from genuinely budget-conscious buyers, leaving brand-new-in-box products to less experienced or simply anxious buyers who were going to buy new at any cost. The natural consequence is that neither AMD nor nVidia has any interest in actually providing value in that segment, with the 3050 and 6500 XT only slightly cheaper yet much slower than their respective tier above.
And the very interesting 2023 trend is the new performance/price segmentation, which is truly astounding to anyone who has been keeping an eye on this metric since the Radeon 7500 was made by ATI. Just look at the meaningless (less than 5% in some instances) performance gaps between the 3060 Ti and the 3070 (it was almost a bug, the two even sharing the same VRAM buffer), or between the 3080 Ti, the 3090 (VRAM aside) and then the 3090 Ti. Historically, in fact, the higher tier in a given generation commanded around a 50% premium for less than a 10% performance increase. nVidia was also very clever in making this “highest-level card” a moving target, releasing the xx80, then the xx80 Ti and finally the Titan version of the same core over an architecture lifespan of around two years.
With the 40x0 generation, we now have a 4090 that is 50% faster than the previous king, the 3090 Ti, but also around 35% faster than the next card down the stack, the 4080.
Meanwhile, the newly released 4060 Ti is just around 10% faster than the 3060 Ti at 1080p and about 5% at 1440p, and the two perform roughly the same at 4K, all at the same MSRP and with the same VRAM buffer. The performance spectrum is now much wider than it used to be.
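To put rough numbers on that tiering claim, here is a minimal sketch of the premium-per-performance arithmetic. The MSRPs are launch list prices as I recall them (my own assumption, not benchmark data), and the ~5% gaps are the estimates quoted above:

```python
# Rough premium-vs-gain arithmetic for adjacent Ampere tiers.
# MSRPs below are launch list prices as I recall them (assumed);
# the ~5% performance gaps are this article's own estimates.
msrp = {"3060 Ti": 399, "3070": 499, "3080 Ti": 1199, "3090": 1499}
perf_gap = {("3060 Ti", "3070"): 0.05, ("3080 Ti", "3090"): 0.05}

for (low, high), gain in perf_gap.items():
    premium = msrp[high] / msrp[low] - 1  # price increase, as a fraction
    print(f"{low} -> {high}: +{premium:.0%} price for ~+{gain:.0%} performance")
```

At these list prices both steps work out to roughly a 25% premium for a ~5% gain, which is exactly why those gaps read as meaningless.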
On the AMD side, a single X appended to the 7900 XT now signals a 15% performance gap that was previously covered by the much more clearly named 6800, 6800 XT and 6900 XT.
The nVidia VRAM issue
It was 2014 when Sapphire released the 8 GB R9 290X at a $430 MSRP. With huge VRAM buffers (save for the Fury's expensive HBM) and the legendary longevity of the GCN architecture, AMD cards have been aging like the finest wine from late 2011 up to now; in the current market the 6800 carries 16 GB, twice as much as the more expensive 3070 Ti, and the 7900 XT carries 20 GB versus the 4070 Ti's 12.
nVidia went from the 1070 to the 2070 to the 3070 Ti, with the last arriving at a $599 MSRP in 2021, seven years after the $430 290X, with the same 8 GB of capacity that GDDR5-era card already offered.
There are four main reasons this was allowed to happen:
1. It was the RAM size of the previous-gen consoles, the PS4 and Xbox One, so developers were hard pressed to fit the best they could into it.
2. During the 2020 mining craze it was just the right amount to fit the Ethereum DAG, which meant anything with that much memory was going to sell anyway (see the sketch after this list).
3. nVidia wants VRAM size as a clear additional segmentation tool for professional cards, so that no driver or resistor hack can ever cannibalize its workstation card sales.
4. nVidia buyers just aren’t very c̶l̶e̶v̶e̶r̶ informed or forward-thinking in general, and will buy any crap in a green box anyway, perhaps hoping to somehow see playable ray-tracing eye candy while their card can’t even load the game’s textures properly.
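For context on point 2: the memory a mining card needed was dictated by the Ethash DAG, a dataset every GPU must hold in VRAM that grows by a fixed step each 30,000-block epoch. A minimal sketch of the approximate size, using the constants published in the Ethash spec (1 GiB initial dataset, ~8 MiB growth per epoch) and skipping the spec's prime-rounding adjustment:

```python
# Approximate Ethash DAG size per epoch (1 epoch = 30,000 blocks).
# Constants are from the Ethash spec; the final "round to a prime
# number of 128-byte rows" step is omitted, so sizes are approximate.
DATASET_BYTES_INIT = 2**30    # 1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23  # ~8 MiB added every epoch

def dag_size_gib(epoch: int) -> float:
    return (DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch) / 2**30

for epoch in (0, 382, 450):
    print(f"epoch {epoch}: ~{dag_size_gib(epoch):.2f} GiB")
```

Around epoch 382 (late 2020) the DAG crossed 4 GiB and knocked 4 GB cards out of Ethereum mining, while 8 GB cards still had years of headroom; that is why 8 GB was “just the right amount”.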
Seriously, whoever said the customer is always right never tried to convince a Warzone-only player gaming at high-refresh 1080p that they would be better off with a 6900 XT than a 3090, putting the savings toward a better CPU/RAM combo.