Micron starts mass production of its memory chips for use in Nvidia’s AI semiconductors
MICRON Technology has started mass production of its high-bandwidth memory semiconductors for use in Nvidia’s latest chip for artificial intelligence, sending its shares up more than 4 per cent before the bell on Monday (Feb 26).
The HBM3E (High Bandwidth Memory 3E) chip will consume 30 per cent less power than rival offerings, Micron said, and could help the company tap into soaring demand for chips that power generative AI applications.
Nvidia will use the chip in its next-generation H200 graphics processing units, which are expected to start shipping in the second quarter and to overtake the current H100 chip that has powered a massive surge in revenue at the chip designer.
Demand for high-bandwidth memory (HBM) chips for use in AI, a market led by Nvidia supplier SK Hynix, has also raised investor hopes that Micron would be able to weather a slow recovery in its other markets.
HBM is one of Micron’s most profitable products, in part because of the technical complexity involved in its construction.
The company had previously said it expects “several hundred million” US dollars of HBM revenue in fiscal 2024 and continued growth in 2025. REUTERS