High-bandwidth memory (HBM) is a crucial component in artificial intelligence (AI) computing. Nikkei Asia reports that CXMT has ordered and received manufacturing and testing equipment from US and Japanese suppliers to assemble and produce HBM, as Beijing seeks to mitigate the impact of Washington's export restrictions and reduce its reliance on foreign technology.
HBM is not currently on the US export control list, but Chinese companies still lack the capacity to produce the component at scale.
Based in Hefei, eastern China, CXMT is the country's leading manufacturer of dynamic random-access memory chips. Sources indicate that since last year, the company has prioritized developing technologies for stacking DRAM chips vertically to replicate the architecture of HBM chips.
DRAM chips are a critical component in everything from computers and smartphones to servers and connected cars, enabling processors to access data quickly during computation. Stacking them into HBM widens the communication channels between memory and processor, allowing data to move much faster.
HBM is a promising area for accelerated computing and artificial intelligence applications. Nvidia's H100 chip, the processor powering ChatGPT, combines a graphics processor with six HBM stacks to deliver fast, human-like responses.
Founded in 2016, CXMT announced late last year that it had begun domestic production of its first LPDDR5 memory chips, a popular mobile DRAM type used in high-end smartphones. According to the company, Chinese smartphone makers such as Xiaomi and Transsion have already integrated CXMT's mobile DRAM chips into their products.
This progress places CXMT behind only leading US memory chipmaker Micron and South Korea's SK Hynix in terms of technology, and ahead of Taiwan's Nanya Technology. However, CXMT accounted for less than 1% of the global DRAM market in 2023, while the three dominant companies, Samsung, SK Hynix, and Micron, controlled over 97%.
Meanwhile, HBM production is dominated by the world's two largest DRAM chipmakers, SK Hynix and Samsung, which together controlled over 92% of the global market in 2023, according to TrendForce. Micron, which holds roughly 4% to 6% of the market, is also looking to expand its share.
Producing HBM requires not only the ability to manufacture high-quality DRAM but also sophisticated chip-packaging techniques to link those chips together. China still lacks a local chipmaker capable of producing HBM chips to accelerate AI computing.
(According to Nikkei Asia)