Memory Chip Giants Report Billions in Q3 Profits Amid AI-Driven Supercycle, $231 Billion DRAM Revenue Projected


Leading memory chip manufacturers Samsung, SK Hynix, and Micron Technology have reported robust third-quarter profits, signaling a significant upswing driven by the surging demand for artificial intelligence (AI) hardware. The industry is experiencing a "super boom cycle," with DRAM revenue projected to reach an unprecedented $231 billion by 2026.

Samsung Electronics announced a net profit of $8.6 billion for the July-September period, with its chip operating profit reaching $4.9 billion, bolstered by firmer pricing and tight supply in the market. Rival SK Hynix reported an $8.8 billion net profit for Q3 and stated that its production capacity through 2026 is already fully booked, underscoring the intense demand. Micron Technology also confirmed the strong market pull, posting a $3.2 billion net profit in its latest quarter.

The primary catalyst for this boom is High Bandwidth Memory (HBM), a critical component for AI accelerators. "High bandwidth memory, or HBM, is the hot product because it stacks multiple DRAM layers close to the processor to move more data per second, which speeds up training runs that push huge batches through GPUs," as Rohan Paul noted. That stacked design provides the data-transfer rates needed to keep GPUs fed during training of large AI models.
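The bandwidth advantage of stacking comes down to simple arithmetic: peak bandwidth is bus width times per-pin data rate. A rough sketch using publicly cited interface figures for HBM3 (1024-bit interface, 6.4 Gb/s per pin) and a DDR5-6400 channel (64-bit, same per-pin rate) illustrates the gap; these numbers are not from the article and are used only for scale:

```python
# Back-of-envelope comparison of memory bandwidth per device.
# HBM3 and DDR5-6400 figures below are illustrative public specs,
# not numbers from the article.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak bandwidth = bus width x per-pin data rate, converted to GB/s."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

hbm3_stack = bandwidth_gb_s(1024, 6.4)  # 1024-bit interface, 6.4 Gb/s per pin
ddr5_dimm = bandwidth_gb_s(64, 6.4)     # 64-bit channel, DDR5-6400

print(f"HBM3 stack: {hbm3_stack:.1f} GB/s")      # 819.2 GB/s
print(f"DDR5 channel: {ddr5_dimm:.1f} GB/s")     # 51.2 GB/s
print(f"Ratio: {hbm3_stack / ddr5_dimm:.0f}x")   # 16x
```

The wide bus is only practical because the DRAM dies sit stacked next to the processor on the same package, which is exactly the design the quote describes.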

Beyond HBM, conventional DRAM is also in tight supply as data centers expand to accommodate AI inference workloads. This has pushed prices up across the memory stack, hitting buyers who did not pre-book supply. Manufacturers have prioritized HBM production lines, so capacity additions for standard DRAM have lagged.

OpenAI has reportedly signed letters of intent with Samsung and SK Hynix for its ambitious "Stargate" project, with discussed demand of up to 900,000 DRAM wafers per month, a figure SK Hynix indicated is more than double current HBM capacity. HBM bit demand is anticipated to grow more than 30% annually over the next five years, and the overall memory shortage is expected to persist through 2026 and potentially into early 2027. While some skeptics view OpenAI's demand forecasts as aggressive, even partial fulfillment would keep supply tight given current capacity plans.
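Compounding makes the scale of that forecast concrete: growth of at least 30% per year for five years implies demand multiplies severalfold. The only input below taken from the article is the 30% annual floor; the rest is plain arithmetic:

```python
# Compound the article's ">30% annual HBM bit-demand growth" over five years.
# The 30% rate is the article's stated floor; everything else is arithmetic.

def compound_growth(annual_rate: float, years: int) -> float:
    """Cumulative demand multiple after compounding a fixed annual rate."""
    return (1 + annual_rate) ** years

multiple = compound_growth(0.30, 5)
print(f"HBM bit demand after 5 years: {multiple:.2f}x today's level")  # 3.71x
```

At the 30% floor alone, HBM bit demand would be roughly 3.7 times today's level by the end of the period, which is why even partial fulfillment of the Stargate figures keeps supply tight.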