If you're planning to buy a new laptop, smartphone, or gaming console this year, brace yourself. Prices are climbing 15-20% across the board, and the culprit isn't traditional inflation or supply chain disruptions. It's artificial intelligence. AI data centers are now consuming up to 70% of all memory chips manufactured globally, creating the most severe shortage the semiconductor industry has seen in decades and leaving consumer electronics to fight over what remains.
DRAM prices surged 172% through 2025. DDR5 spot prices have quadrupled since September. And every major analyst firm agrees: the squeeze is going to get worse before it gets better.
The Numbers Behind the Squeeze
The scale of the shortage is staggering. According to TrendForce, HBM (High Bandwidth Memory), the specialized stacked DRAM used in AI accelerators, will consume 23% of total DRAM wafer output in 2026, up from 19% the previous year. Samsung has raised 32GB DDR5 module prices from $149 to $239, a 60% increase. In Tokyo's Akihabara electronics district, shops have imposed purchase limits on memory and storage to prevent hoarding, with some stores capping customers at two SO-DIMMs per visit. A G.SKILL Trident Z5 32GB DDR5 kit now sells for nearly $900 on Amazon.
"This is the most significant disconnect between demand and supply in terms of magnitude as well as time horizon that we've experienced in my 25 years in the industry," said Manish Bhatia, Micron's Executive Vice President of Operations. SK Hynix has declared its entire chip supply for 2026 "essentially sold out." Micron has exited the consumer memory market entirely, shutting down its Crucial brand to focus on enterprise and AI customers.

The three companies that dominate global memory production, Samsung, SK Hynix, and Micron, have collectively pivoted their limited manufacturing capacity toward higher-margin AI products. Gartner projects a 47% DRAM price increase across 2026. Samsung and SK Hynix are planning to raise server memory prices by up to 70% in Q1 alone.
How AI Ate the Memory Supply
The mechanics of the shortage come down to a brutal math problem. Every wafer allocated to an HBM stack for an Nvidia GPU is a wafer denied to the LPDDR5X module in a mid-range smartphone or the DDR5 stick in a consumer laptop. HBM consumes roughly three times the wafer capacity of standard DDR5 per gigabyte, so every gigabyte diverted to AI removes far more manufacturing capacity than a gigabyte sold into the consumer market.
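To see how lopsided that trade-off is, here is a back-of-envelope sketch combining the two figures above: the 3x area-per-gigabyte ratio and TrendForce's projected 23% wafer share for HBM. The model itself (treating all non-HBM output as DDR5-equivalent) is a simplifying assumption for illustration, not a figure from the article's sources.

```python
# Back-of-envelope model of the wafer trade-off described above.
# Simplifying assumption: everything that isn't HBM is DDR5-equivalent.

HBM_WAFER_SHARE = 0.23   # projected share of DRAM wafer output in 2026 (TrendForce)
HBM_AREA_PER_GB = 3.0    # wafer area per gigabyte, relative to DDR5 = 1.0

hbm_gb = HBM_WAFER_SHARE / HBM_AREA_PER_GB   # gigabytes yielded by HBM wafers
ddr_gb = (1 - HBM_WAFER_SHARE) / 1.0         # gigabytes yielded by the rest
hbm_gb_share = hbm_gb / (hbm_gb + ddr_gb)

print(f"HBM share of wafers:    {HBM_WAFER_SHARE:.0%}")   # 23%
print(f"HBM share of gigabytes: {hbm_gb_share:.1%}")      # ~9.1%
```

Under these assumptions, HBM absorbs nearly a quarter of wafer capacity while delivering under a tenth of the industry's gigabytes, which is exactly why a modest-sounding wafer reallocation produces an outsized squeeze on consumer memory.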
The demand side is equally lopsided. Big Tech spending on data centers has exploded from $217 billion in 2024 to an estimated $650 billion in 2026. A single Nvidia Blackwell accelerator includes 192GB of HBM3e, six times the memory in a typical high-end PC. One NVL72 rack system, containing 72 such GPUs, uses memory equivalent to 1,000 smartphones. The OpenAI Stargate Project alone could consume up to 40% of global DRAM output.
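The rack-to-smartphone comparison checks out with simple arithmetic. The per-phone RAM figure below is an assumption for illustration (a 12GB flagship), not a number from the article:

```python
# Sanity-checking the NVL72 rack comparison above.
# Assumption (illustrative): a flagship smartphone ships with ~12 GB of RAM.

GPUS_PER_RACK = 72
HBM_PER_GPU_GB = 192   # HBM3e per Blackwell accelerator
PHONE_RAM_GB = 12      # assumed, not from the article

rack_memory_gb = GPUS_PER_RACK * HBM_PER_GPU_GB
phones_equivalent = rack_memory_gb / PHONE_RAM_GB

print(f"Memory per NVL72 rack: {rack_memory_gb:,} GB")            # 13,824 GB
print(f"Smartphone equivalent: {phones_equivalent:,.0f} phones")  # 1,152 phones
```

At roughly 13.8TB of HBM per rack, a single NVL72 system really does absorb the memory of a four-figure number of phones, consistent with the "1,000 smartphones" comparison.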
Sassine Ghazi, CEO of chip design firm Synopsys, put it plainly in a January CNBC interview: "Most of the memory from the top players is going directly to AI infrastructure, but many other products need memory, so those other markets are starved today because there is no capacity left for them."
The Consumer Fallout
The ripple effects are already hitting shelves. Dell has raised commercial PC prices by up to 20%, with increases ranging from $55 to $765 depending on memory configuration. Lenovo warned that all existing customer quotations expired on January 1, forcing contract resets at higher prices. IDC projects the global PC market could shrink by up to 8.9% in 2026, not because people don't want new computers, but because they can't afford them.

Smartphones aren't spared. IDC forecasts a potential 5.2% contraction in the global smartphone market, with average selling prices rising 6-8%. For low-end devices, memory costs could hit 30% of the total bill of materials, up from 10% in early 2025. Chinese manufacturers including Xiaomi and Oppo have cut shipment forecasts by up to 20%.
Gaming takes a particularly hard hit. Nvidia plans to slash RTX 50-series GPU production by 30-40% in the first half of 2026 due to GDDR7 shortages. Sony is reportedly considering delaying its next PlayStation console to 2028 or 2029. Some PC vendors have resorted to selling pre-built systems without RAM entirely, leaving customers to source memory on their own.
The Hidden AI Tax
What makes this shortage fundamentally different from the COVID-era chip crisis of 2020-2023 is that it isn't accidental. The pandemic shortage resulted from unpredictable supply chain disruptions and a sudden surge in work-from-home demand. It was cyclical, painful, and ultimately self-correcting as factories caught up. The current crisis is a deliberate, profit-driven reallocation by the industry's three dominant players toward customers willing to pay premium prices.
This distinction matters because it reveals what might be called the "hidden AI tax," a structural transfer of costs from AI companies to ordinary consumers. When Samsung, SK Hynix, and Micron redirect their limited manufacturing capacity toward HBM chips that sell at enormous margins to hyperscalers like Microsoft, Google, and Meta, the cost is not borne by those tech giants. It is borne by the person buying a laptop for school, a phone for work, or a gaming console for their kids.
The parallel to energy markets is instructive. For decades, the environmental costs of fossil fuel production were externalized onto communities and ecosystems while producers captured the profits. The AI memory shortage operates on a similar logic: the benefits of AI infrastructure accrue primarily to a handful of trillion-dollar companies, while the costs of constrained supply are distributed across billions of consumer device purchases worldwide. Micron's revenue is expected to more than double this fiscal year. Samsung posted a record quarterly profit of $13.8 billion. Meanwhile, global AI spending is on track to hit $2.52 trillion in 2026, and the companies driving that spending face no pressure to internalize the consumer costs their demand creates.
The irony is particularly sharp for the "AI PC" initiative that Microsoft and Intel have spent the last two years promoting. These devices require a minimum of 16GB of RAM, but the very AI boom that's supposed to make them essential is making that amount of memory prohibitively expensive for budget-conscious buyers.
When Does It End?
Not soon. SK Hynix's new HBM facility in Cheongju, South Korea, won't begin production until 2027. Micron's retooled fab in Taiwan comes online in the second half of 2027 at the earliest. Samsung's new plant in Pyeongtaek targets 2028. Micron's massive DRAM complex in New York won't reach full production until 2030. Building a modern memory fabrication facility requires two to three years of construction and $10-20 billion in capital investment.
Jeff Clarke, Dell's COO, captured the industry's predicament during a November analyst call: "I've never seen memory-chip costs rise this fast." Synopsys CEO Ghazi told CNBC the crunch will continue through 2026 and 2027. Most analysts don't expect meaningful relief until late 2027 at the earliest, and some projections extend the shortage through 2028.
Elon Musk has proposed a characteristically ambitious solution, announcing plans for a "TeraFab" that would combine logic, memory, and packaging under one roof. "We've got two choices," he said. "Hit the chip wall or make a fab." But even Tesla's timeline puts initial production at Q3 2026 for packaging only, with full wafer fabrication years further out.
What This Changes
The AI memory shortage marks the moment when the costs of the AI boom became tangible for ordinary consumers. For the past three years, AI's economic impact was largely confined to boardrooms, stock prices, and enterprise software launches. Now it's showing up in the price of a laptop at Best Buy.
The shortage also exposes a structural vulnerability in how the semiconductor industry is organized. Three companies controlling the global memory supply can effectively decide, through their capital allocation choices, whether the world gets affordable consumer electronics or well-fed AI data centers. Right now, they are choosing the latter, and no market mechanism or policy intervention is positioned to change that calculus before 2028.
The key metric to watch is HBM's share of total DRAM wafer capacity. If it rises above 25% in 2026, as some analysts project, the consumer squeeze will intensify further. For anyone shopping for electronics, the message is concrete: prices in six months will be higher than prices today, and they won't come back down until new fabrication capacity comes online in late 2027 at the earliest.
Sources
- Fortune: Rampant AI Demand for Memory Is Fueling a Growing Chip Crisis
- Tom's Hardware: Data Centers Will Consume 70% of Memory Chips in 2026
- CNBC: Memory Chip Shortage to Last Through 2027
- IDC: Global Memory Shortage Crisis Market Analysis
- Consumer Reports: AI Data Centers Buying Up RAM and Raising Laptop Prices