Nasdaq 100 falls into correction territory as AI memory chip stocks sell off

    The Nasdaq 100 dropped more than 10% from its recent peak this week, crossing the threshold that defines an official market correction. The selling was concentrated in memory chip stocks, with SK Hynix, Micron, SanDisk, and Western Digital each taking significant hits. The trigger was a piece of research from Google that suggested AI models may require less memory than the market had been pricing in. That single research publication was enough to wipe billions in market value from companies that had spent the past two years riding the AI infrastructure spending wave.
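    The correction threshold is a simple drawdown calculation: the percentage decline of the latest close from the running peak. A minimal sketch of that arithmetic (the price series below is illustrative, not actual Nasdaq 100 closes):

```python
def drawdown(prices):
    """Percent decline of the latest price from the peak of the series."""
    peak = max(prices)
    return (peak - prices[-1]) / peak * 100

# Illustrative index levels only, not real closing values.
closes = [100.0, 104.0, 108.0, 103.0, 97.0, 96.5]
dd = drawdown(closes)
print(f"Drawdown from peak: {dd:.1f}%")  # Drawdown from peak: 10.6%
print("In correction:", dd >= 10)        # In correction: True
```

    Any decline of 10% or more from the peak puts the index in correction territory by the conventional definition; 20% or more is typically labeled a bear market.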

    What the Google research actually said

    Google's research, published earlier in the week, examined memory bandwidth and capacity requirements for running large AI models at scale. The paper found that certain architectural improvements in model design could reduce the memory footprint needed during inference, the phase where a trained model generates responses for users. Inference is where most of the ongoing hardware demand lives after the initial training phase is complete. If inference can be made less memory-intensive, the continuous upgrade cycle that memory chip makers had been counting on weakens considerably.

    The market read this as a direct threat to the demand forecasts that had been built into chip stock valuations. Micron, for instance, had seen its stock price more than double between early 2023 and mid-2024 largely on expectations of sustained AI-driven demand for high-bandwidth memory. SK Hynix had similar momentum, driven by its early lead in producing HBM3 chips used in Nvidia's H100 and H200 GPUs. One research paper does not rewrite demand fundamentals overnight, but investors were already looking for reasons to reduce exposure after a long run-up.

    Memory chip stocks led the Nasdaq 100 sell-off following Google research on AI memory efficiency

    Why memory chips became the center of the AI trade

    Training and running large language models requires moving enormous amounts of data between processors and memory at very high speeds. High-bandwidth memory, or HBM, sits physically close to the GPU and allows data to move much faster than conventional DRAM. The demand for HBM exploded after ChatGPT demonstrated the commercial potential of large language models in late 2022. Every major AI training cluster built since then has required large quantities of HBM, and memory chip makers could not produce it fast enough to meet orders.

    That supply constraint pushed prices higher and gave memory chip companies unusually strong pricing power for an industry that normally cycles through brutal oversupply periods. Investors treated Micron, SK Hynix, and their peers as direct plays on AI infrastructure spending without the concentration risk of betting solely on Nvidia. The Google research introduced doubt about whether that demand would compound at the same rate through 2025 and 2026, which is all it took to start a rotation out of the sector.

    Meta's 8% drop added to the index's pain

    The memory chip sell-off was not the only drag on the Nasdaq 100 this week. Meta Platforms fell 8% in a single session after losing two separate court cases. Meta carries a large weighting in the Nasdaq 100, so an 8% decline in its stock moves the index meaningfully on its own. The combination of the chip sector sell-off and Meta's drop was enough to push the index past the 10% correction threshold without requiring broader weakness across the rest of the index's components.
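    The mechanics of a single heavyweight stock dragging the index are straightforward: a component's contribution to the index move is roughly its weight times its own percentage move. A minimal sketch (the 5% weight below is illustrative; Meta's actual Nasdaq 100 weighting varies with each rebalancing):

```python
def index_impact(weight_pct, stock_move_pct):
    """Approximate contribution of one component's move to the
    index return, in percentage points (weight x move)."""
    return weight_pct / 100 * stock_move_pct

# Assumed 5% weighting for illustration, not Meta's actual figure.
impact = index_impact(weight_pct=5.0, stock_move_pct=-8.0)
print(f"Index impact: {impact:.2f} percentage points")  # Index impact: -0.40 percentage points
```

    Under those assumptions, an 8% single-session drop in one heavily weighted stock shaves roughly 0.4 percentage points off the index on its own, before counting any sector-wide selling.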

    Meta's legal losses came on the same day that sentiment was already fragile from the chip news. Markets tend to amplify moves when multiple negative catalysts land close together, and this week was a clear example of that pattern. Neither event on its own would necessarily have pushed the index into correction territory, but together they were sufficient.

    How far the affected chip stocks have fallen

    Micron was among the hardest hit, with its stock down sharply on heavy volume. SK Hynix, which trades on the Korea Stock Exchange rather than a US exchange, saw parallel losses in Seoul. SanDisk and Western Digital, which focus more on NAND flash memory used in storage rather than HBM, fell as well, though their exposure to AI inference demand is less direct than Micron's or SK Hynix's. The sell-off swept across the memory sector broadly rather than targeting only the companies with the most direct AI exposure.

    For context, Micron's stock had already pulled back from its 2024 highs before this week's move. The company reported strong fiscal Q2 2025 earnings in March 2025, with data center revenue hitting a record, but guidance for the following quarter came in below some analyst estimates. That earlier disappointment had already taken some air out of the stock, making this week's Google-driven sell-off a second leg down rather than a single sharp break from recent highs.

    What a correction means for the index from here

    A 10% decline is the standard definition of a correction, and it does not in itself predict whether the index will recover quickly or continue lower. The Nasdaq 100 entered correction territory twice in 2023 before going on to post strong annual gains. The more relevant question is whether the Google research changes the actual capital expenditure plans of the large cloud providers who buy memory-heavy AI hardware. If Microsoft, Amazon, and Google itself continue to expand their data center buildouts at the pace they have been signaling, memory demand will remain strong regardless of efficiency improvements at the model level.

    Microsoft is scheduled to spend approximately 80 billion dollars on data center infrastructure in fiscal year 2025, a figure the company confirmed in January 2025. Google's own capital expenditure guidance for 2025 is 75 billion dollars. Those numbers suggest the underlying demand for AI hardware has not reversed, even if one research paper raises questions about memory requirements per unit of compute. Memory chip earnings reports over the next two quarters will give a clearer picture of whether the sell-off was an overreaction or an early signal of genuine demand softness.

    Frequently Asked Questions

    Q: What did the Google research say that caused memory chip stocks to fall?

    Google published research suggesting that architectural improvements in AI model design could reduce the memory needed during inference, the phase where models generate responses. This raised concerns that the sustained demand growth memory chip companies had been counting on might be lower than expected.

    Q: Which memory chip companies were most affected by the sell-off?

    SK Hynix, Micron, SanDisk, and Western Digital all saw significant declines. SK Hynix and Micron faced the most direct scrutiny given their heavy investment in high-bandwidth memory production for AI hardware.

    Q: Why did Meta's stock drop 8% in a single day?

    Meta lost two separate court cases on the same day, which sent its stock down 8%. Because Meta carries a large weighting in the Nasdaq 100, that single-session decline contributed meaningfully to the index crossing into correction territory.

    Q: Does a 10% correction in the Nasdaq 100 mean a prolonged downturn is coming?

    Not necessarily. The Nasdaq 100 entered correction territory twice during 2023 and still posted strong annual gains both times. Whether this correction continues depends largely on whether major cloud providers like Microsoft and Google maintain their stated data center spending plans.

    Q: What is high-bandwidth memory and why does it matter for AI?

    High-bandwidth memory, or HBM, is a type of chip that sits physically close to a GPU and allows data to move at much faster speeds than standard DRAM. AI training and inference workloads move enormous amounts of data continuously, making HBM a core component of every major AI server built since 2023.
