While browsing our website a few weeks ago, I stumbled upon “How and When the Memory Chip Shortage Will End” by Senior Editor Samuel K. Moore. His analysis focuses on the current DRAM shortage caused by AI hyperscalers’ voracious appetite for memory, a major constraint on the speed at which large language models run. Moore provides a clear explanation of the shortage, particularly for high-bandwidth memory (HBM).
As we and the rest of the tech media have documented, AI is a resource hog. AI electricity consumption could account for as much as 12 percent of all U.S. power by 2028. Generative AI queries consumed 15 terawatt-hours in 2025 and are projected to consume 347 TWh by 2030. Water consumption for cooling AI data centers is expected to double or even quadruple by 2028 compared with 2023.
But Moore’s reporting shines a light on an obscure corner of the AI boom. HBM is a specific type of memory product tailored to serve AI processors. Makers of those processors, notably Nvidia and AMD, are demanding more and more memory for each of their chips, driven by the needs and wants of companies like Google, Microsoft, OpenAI, and Anthropic, which are underwriting an unprecedented buildout of data centers. And some of those facilities are colossal: You can read about the engineering challenges of building Meta’s mind-boggling 5-gigawatt Hyperion site in Louisiana in “What Will It Take to Build the World’s Largest Data Center?”
We realized that Moore’s HBM story was both important and distinctive, so we decided to include it in this issue, with some updates since the original published on 10 February. We paired it with a recent story by Contributing Editor Matthew S. Smith exploring how the memory-chip shortage is driving up the price of low-cost computers like the Raspberry Pi. The result is “AI Is a Memory Hog.”
The big question now is, When will the shortage end? Price pressure caused by AI hyperscaler demand on all kinds of consumer electronics is being masked by stubborn inflation combined with a perpetually shifting tariff regime, at least here in the United States. So I asked Moore what indicators he’s looking for that would signal an easing of the memory shortage.
“On the supply side, I’d say that if any of the big three HBM companies (Micron, Samsung, and SK Hynix) say that they’re adjusting the schedule of the arrival of new manufacturing, that’d be an important signal,” Moore told me. “On the demand side, it will be interesting to see how tech companies adapt up and down the supply chain. Data centers might steer toward hardware that sacrifices some performance for less memory. Startups developing all kinds of products might pivot toward creative redesigns that use less memory. Constraints like shortages can lead to interesting technology solutions, so I’m looking forward to covering those.”
To make sure you don’t miss any of Moore’s analysis of this topic, and to stay current on the full spectrum of technology development, sign up for our weekly newsletter, Tech Alert.

