AI Ate the Cheap Flash

Everybody talks about the AI hardware boom like it begins and ends with giant Nvidia bricks and hyperscalers measuring capex in units of “small nation.” Cute story. But the really annoying part is happening lower in the stack: AI is now eating the boring silicon too.

According to TrendForce, conventional DRAM contract prices are projected to jump 58 to 63 percent quarter over quarter in Q2 2026, while NAND Flash contract prices are expected to rise 70 to 75 percent. That is not a little market wobble. That is the kind of pricing move that makes finance teams start speaking in frightened spreadsheets.
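To put those percentages in concrete terms, here is a minimal sketch of the arithmetic, using an invented index of 100 as the baseline (the baseline and the midpoint rates are illustrative assumptions, not TrendForce data):

```python
def apply_increase(price, pct):
    """Return the new price after a percentage increase."""
    return price * (1 + pct / 100)

# Illustrative baseline: Q1 2026 contract price indexed to 100.
dram_contract = 100.0
nand_contract = 100.0

# TrendForce's projected Q2 2026 quarter-over-quarter jumps (midpoints).
dram_q2 = apply_increase(dram_contract, 60.5)   # midpoint of 58-63%
nand_q2 = apply_increase(nand_contract, 72.5)   # midpoint of 70-75%

print(f"DRAM index after Q2: {dram_q2:.1f}")    # 160.5
print(f"NAND index after Q2: {nand_q2:.1f}")    # 172.5

# Hypothetical: if a similar rate persisted for three quarters,
# compounding stacks multiplicatively, not additively.
nand_compounded = nand_contract * (1 + 0.725) ** 3
print(f"NAND index after three such quarters: {nand_compounded:.1f}")
```

The last line is the part worth internalizing: quarter-over-quarter increases compound, so three quarters at ~72% is roughly a 5x price, not a 3x price.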

The reason is exactly as dumb as you think. Suppliers are reallocating capacity toward HBM, server DRAM, and enterprise SSDs, because that is where the fat margins live. TrendForce says North American cloud providers are accelerating AI inference deployments, high-capacity RDIMMs are the main procurement target, and major customers are signing long-term agreements just to lock supply down. Near-term supply, unsurprisingly, remains tight. If you are a normal buyer who just wanted memory for regular computers instead of a robot word factory, congratulations: you are now standing at the back of the buffet.

The NAND side looks even more cursed. TrendForce says enterprise SSD demand is still climbing as generative AI moves into large-scale deployment, and it explicitly warns that a clear shortage is expected through 2026, with meaningful capacity expansion unlikely until late 2027 or 2028. That is the sort of sentence that should make anyone buying storage for laptops, desktops, phones, cameras, or embedded gear mutter obscenities into a coffee mug.

And yes, regular people are already getting hit. Tom’s Hardware reported this week that memory cards and USB flash drives are up 124 percent on average from 2025 levels, with some products peaking at 261 percent. That is hilarious if you are a NAND executive and less hilarious if you just wanted a stupid microSD card without taking out a second mortgage.

This is why I get cranky when people frame AI infrastructure as some isolated rich-company problem. It is not isolated. It is a resource vacuum. The hyperscalers are not merely buying accelerators; they are pulling on the whole supply chain behind them: wafers, packaging, power, cooling, DRAM, NAND, networking, the whole expensive circus.

You can see the same story at the top of the stack. Reuters reported that TSMC raised its 2026 revenue outlook to more than 30 percent growth, pushed capital spending to the high end of its $52 billion to $56 billion range, and said AI demand remains “extremely robust.” TSMC also said production capacity remains very tight, and that 3 nm chips now account for a quarter of its sales. Translation: the biggest foundry on Earth is still running hot, and the line for advanced silicon is not getting shorter.

My take is simple. AI is becoming a tax on general computing. Not because the models are magical, but because the market keeps rewarding vendors for feeding giant data-center deployments first and letting everybody else fight over scraps later. If your brilliant future requires making ordinary memory and storage weirdly scarce, maybe the industry should stop calling that efficiency and start calling it what it is: collateral damage.

The GPU shortage was the opening act. Now the boring parts are getting expensive, and that is when normal people notice the scam.

Sources: TrendForce | Reuters on TSMC | Tom’s Hardware on DRAM/NAND pricing | Tom’s Hardware on memory card and flash drive price spikes