Unless the AI bubble bursts and sends all those planned data center projects packing, you won’t see an end to memory pricing woes anytime soon. One of the big three memory chipmakers doesn’t expect to be able to meet consumer demand for at least another four years.
The DRAM and NAND memory market is dominated by three major players: Micron, Samsung, and SK Hynix. The latter falls under the global conglomerate SK Group. The South Korea-based memory maker’s chairman, Chey Tae-won, told Bloomberg that the company is expanding memory-making capacity. Unfortunately, it won’t be able to meet demand until “around” 2030.
Chey spoke to reporters outside GTC 2026 earlier this week. He said that capacity for the basic wafers the company uses for its chips is lagging 20% behind demand. All of this inevitably trickles down to consumers. We’ve seen prices increase for practically every gadget, from laptops to smartphones like the recent Samsung Galaxy S26, all the way down to enthusiast-level single-board computers like the Raspberry Pi.
The blame for the memory shortage falls at the feet of the AI boom. The largest AI data center projects, like OpenAI’s multi-state Stargate project, have a massive appetite for high-bandwidth memory (HBM). Semiconductor companies are making such a profit on these high-end memory chips that they have reduced capacity for consumer-level DRAM and SSD storage. Even a major company like Valve is struggling to source affordable RAM for its Steam Machine. One of the company’s staff reportedly joked to industry insiders at GDC 2026, “If you have a line on a bunch of RAM, we are in the market and would like to buy it.”
The reason the major semiconductor companies are moving slowly to increase supply is, ironically, a fear of ending up with too much RAM. Korean outlet Chosun Biz reported, citing anonymous industry sources, that Samsung believes the global semiconductor market may reverse course in 2028. The company reportedly doesn’t want to scale up capacity too aggressively because of “uncertainties in demand forecasting.”
Essentially, Samsung may be concerned about scaling up too fast and being left with a glut of RAM once AI data center demand for high-bandwidth memory falters. Currently, 50% of SK Hynix and Samsung memory output is HBM slated for AI data centers.
SK Hynix is expanding capacity, according to Chosun Biz, building out facilities around South Korea in Icheon, Cheongju, and Yongin. It’s also spending close to $13 billion on a massive new assembly plant dedicated solely to supplying HBM. That work is set to start in April and won’t be finished until the end of 2027, according to Reuters.
A single Nvidia Vera Rubin chip, the company’s latest and most powerful AI training SoC (system on a chip), requires as much as 288GB of HBM. That’s nine times the 32GB you’d find in a typical gaming-ready PC. The older Grace Blackwell-based B300 chips require only slightly less memory. And this AI boom is how Nvidia plans to drive $1 trillion in revenue.
Data centers plan to stack hundreds of these chips together, all for the sake of AI cloud compute. Until demand from AI data centers cools off, we’ll be in for years of inflated gadget prices.
Source: Gizmodo