The bad year of 2023 is over, and a new boom is brewing in the memory chip market
According to TrendForce statistics, global DRAM and NAND flash prices rose by approximately 3%-8% in the fourth quarter of 2023. Compared with the PC market, the server DRAM market contributed more to this growth.
In 2024, as computing demands rise, the average quarterly installed DRAM and NAND flash capacity in AI applications across smartphones, servers, and notebook computers is expected to increase, with servers showing the highest growth rate.
In smartphones, oversupply drove memory prices down rapidly in 2023. Thanks to the low prices, the average DRAM capacity per smartphone grew 17.5% year on year in 2023. In 2024, with no major new applications expected to launch, the growth rate of average per-device smartphone DRAM capacity is forecast to slow to an estimated 11%.
In servers, demand for AI servers keeps rising and high-end AI chips are being launched one after another. Since training dominates the current AI market, the memory used is mainly DRAM, which supports high-speed computing. Compared with NAND flash, average per-server DRAM capacity is growing faster, with an expected annual increase of 17.3%.
In notebook computers, models equipped with new CPUs are not expected to launch until the second half of the year, which offers limited help in raising memory capacity. In addition, AI PCs require DRAM capacity to be increased to 16GB, and the average per-device DRAM capacity of notebooks is expected to grow by approximately 12.4% for the year.
AI servers drive accelerated adoption of DDR5 and HBM
DDR5 accelerates penetration
Large AI models continue to iterate, and their parameter counts keep growing. The larger a model's parameter scale, the heavier the computing burden, and AI servers are the core of that computing power. In 2023, shipments of AI servers (equipped with high-compute chips such as GPUs, FPGAs, and ASICs) reached nearly 1.2 million units, up 38.4% year on year and accounting for 9% of overall server shipments. If this momentum continues, the share could reach 15% by 2026.
Larger model parameter counts require large-capacity, high-speed memory support. Micron stated at its earnings conference that an AI server uses 6-8 times as much DRAM as an ordinary server. By 2026, the server DRAM market (excluding HBM) is expected to reach US$32.1 billion.
To meet the high performance requirements of AI servers, demand for more capable memory, DDR5, is rising. Compared with DDR4, DDR5 offers higher speed, larger capacity, and lower power consumption. The maximum transfer rate of DDR5 reaches 6.4Gbps, twice that of DDR4.
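The speed figures above translate directly into per-channel bandwidth. A minimal sketch, assuming the standard 64-bit data bus of a DDR memory channel (a background fact not stated in the article):

```python
# Back-of-the-envelope peak bandwidth per memory channel,
# assuming a 64-bit data bus (8 bytes per transfer).
def peak_bandwidth_gbs(transfer_rate_gtps, bus_width_bits=64):
    """Peak bandwidth in GB/s = transfers per second * bytes per transfer."""
    return transfer_rate_gtps * bus_width_bits / 8

ddr4 = peak_bandwidth_gbs(3.2)  # DDR4 tops out at 3.2 GT/s
ddr5 = peak_bandwidth_gbs(6.4)  # DDR5 reaches 6.4 GT/s, twice DDR4

print(f"DDR4: {ddr4:.1f} GB/s, DDR5: {ddr5:.1f} GB/s")
```

The doubled transfer rate thus doubles peak bandwidth per channel, from 25.6 GB/s to 51.2 GB/s.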
The memory interface chip is the core logic device of the server memory module. As the only channel through which the server CPU accesses memory data, its main function is to improve the speed and stability of memory data access, meeting the server CPU's ever-growing requirements for memory module performance and capacity.
As DDR memory technology evolves to DDR5, the use of memory interface chips is also increasing. After Intel and AMD launched new platforms supporting DDR5, server DDR5 ran into PMIC (power management IC) compatibility problems, which kept the DDR5 market from growing quickly and hindered the ramp of the new platforms. DRAM makers and PMIC manufacturers have now begun to resolve the issue.
Although DDR5 supply from DRAM makers is constrained in the short term, the large number of PMIC suppliers means that switching and re-qualification should not bottleneck DDR5 adoption. Server DDR5 penetration is expected to reach about 30% this year.
With the accelerated development of AI servers and the gradual resolution of PMIC yield issues, DDR5 penetration is expected to rise further, reaching 85% in 2026.
Currently, the mainstream GPU memory solutions are GDDR and HBM (High Bandwidth Memory). GDDR places a large number of chips around the SoC as peripherals, and this solution has been upgraded from GDDR5 to GDDR6 to increase bandwidth. However, further bandwidth increases for GDDR come at the cost of disproportionately higher power consumption, which works against overall system performance.
As a typical near-memory technology, HBM alleviates the "memory wall" problems caused by the separation of storage and compute: the bandwidth limits of the memory itself and the energy cost of moving data between memory and compute units. Vertical die stacking in HBM also increases capacity. Therefore, until compute-in-memory is truly practical, HBM is the best available answer to GPUs' demand for more memory and higher bandwidth.
HBM is characterized by large capacity and high bandwidth (bandwidth measures DRAM's data transfer rate and is a core technical indicator). It stacks multiple DRAM dies and packages them together with the GPU, forming a large-capacity, high-bit-width DRAM array.
In 2013, SK Hynix was first to launch HBM1, with a per-stack bandwidth of 128GB/s, support for stacking four DRAM dies, and a capacity of 4GB per stack. In 2017, SK Hynix launched HBM2, doubling both bandwidth and capacity over HBM1. In 2018, JEDEC released the HBM2e specification, which achieves a per-stack bandwidth of 461GB/s.
SK Hynix began mass production of HBM3 in the first half of 2022, with a per-stack bandwidth of 819.2GB/s, support for 12-high DRAM die stacking, and a capacity of 24GB per stack. In 2023, mainstream market demand shifted from HBM2e to HBM3, with HBM3's share of demand rising to 39%. As shipments of accelerator chips using HBM3 continue to grow, HBM3 is expected to reach 60% of market demand in 2024.
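The per-stack bandwidths quoted for each HBM generation follow from one formula, assuming HBM's characteristic 1024-bit-wide interface and the JEDEC per-pin data rates (both background assumptions, not figures from the article; the article's 461GB/s for HBM2e is the rounded form of 460.8GB/s):

```python
# Per-stack HBM bandwidth: per-pin data rate (Gb/s) * 1024 pins / 8 bits per byte.
# Per-pin rates are the commonly cited JEDEC figures (an assumption here).
HBM_PIN_RATES_GBPS = {
    "HBM1": 1.0,
    "HBM2": 2.0,
    "HBM2e": 3.6,
    "HBM3": 6.4,
}

def stack_bandwidth_gbs(pin_rate_gbps, bus_width_bits=1024):
    """Peak bandwidth of one HBM stack in GB/s."""
    return pin_rate_gbps * bus_width_bits / 8

for gen, rate in HBM_PIN_RATES_GBPS.items():
    print(f"{gen}: {stack_bandwidth_gbs(rate):.1f} GB/s per stack")
```

Running this reproduces the progression in the text: 128 GB/s for HBM1, a doubling to 256 GB/s for HBM2, 460.8 GB/s for HBM2e, and 819.2 GB/s for HBM3.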
At the end of 2023, NVIDIA released the DGX GH200, further raising the bar for memory performance in AI servers. The DGX GH200 links 256 Grace Hopper superchips and offers 144TB of shared memory. Each GH200 carries 480GB of LPDDR5 memory and 96GB of HBM, whereas in the previous-generation DGX H100 server each H100 chip corresponds on average to 256GB of memory and 80GB of HBM. Compared side by side, the memory capacity of the GH200 solution is significantly higher.
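The per-chip figures are consistent with the quoted 144TB total. A quick sanity check, assuming 1TB = 1024GB:

```python
# Sanity check on the DGX GH200 figures: 256 superchips, each with
# 480 GB of LPDDR5 plus 96 GB of HBM, should sum to 144 TB of shared memory.
N_SUPERCHIPS = 256
LPDDR5_GB = 480
HBM_GB = 96

total_gb = N_SUPERCHIPS * (LPDDR5_GB + HBM_GB)
total_tb = total_gb / 1024  # using 1 TB = 1024 GB

print(total_gb, total_tb)  # 147456 144.0
```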
TrendForce believes NVIDIA is lining up more HBM suppliers. Micron, SK Hynix, and Samsung all delivered HBM3e 8-hi (24GB) samples in 2023, and Samsung's HBM3 (24GB) completed verification by the end of 2023.
Because HBM's verification process is lengthy, expected to take two quarters, the three memory makers should complete verification in the first quarter of 2024. Each maker's HBM3e verification results will also determine how NVIDIA weights its HBM procurement across suppliers in 2024.
According to TrendForce statistics, in the third quarter of 2023 SK Hynix overtook Samsung Electronics to become the world's largest server DRAM manufacturer. The report shows that SK Hynix's server DRAM sales reached US$1.85 billion in the quarter, a 49.6% market share and firmly the global lead. Second-ranked Samsung Electronics posted server DRAM sales of US$1.313 billion in the same quarter, a 35.2% share. Micron's server DRAM sales in the period were US$560 million, a 15.0% share, ranking third.
It should be noted that the figures above cover only DDR5 memory in traditional servers and exclude HBM used in AI servers. With HBM sales included, SK Hynix's lead over Samsung Electronics would be even larger.
On January 24, 2024, SK Hynix released its FY2023 results: revenue of 32.77 trillion won, down 27% year on year. For FY2023 Q4 alone, revenue was 11.31 trillion won, up 47% year on year and 25% quarter on quarter; DRAM revenue was 7.35 trillion won, up 49.15% year on year and 20.98% quarter on quarter. The company returned to profit in Q4 of FY2023, and its performance began rebounding across the board. SK Hynix said that starting in Q4, demand for its products in AI servers and mobile devices increased, average selling prices rose, and the memory market environment improved. Across FY2023, the DRAM business leveraged its technical strength to meet customer needs, with revenue from its flagship DDR5 and HBM3 products up roughly 4x and 5x year on year, respectively.
In the overall DRAM market, Samsung Electronics still holds the dominant position, but SK Hynix is closing in. According to Omdia statistics, in the third quarter of 2023 Samsung Electronics held a 39.4% share of the DRAM market, SK Hynix ranked second with 35%, and Micron third with 21.5%.
Micron, ranked third in the memory industry, is also stepping up its push into the AI server market. The company plans to mass-produce HBM3e in the first quarter of 2024 to capture opportunities around NVIDIA's DGX GH200 supercomputer. Naga Chandrasekaran, senior vice president of Micron's technology development division, said the company's 1-gamma process, which uses EUV lithography, is in development and is expected to reach mass production in 2025.
Industry insiders note that although Micron trails Samsung and SK Hynix by nearly a year in HBM product development, the company has accelerated the development and mass production of its new-generation HBM3e and is expected to gain ground in the HBM race.
The previous DRAM cycle peaked around December 2017 and bottomed in December 2019; the downturn lasted two years and was followed by an upswing of about a year and a half, so the full cycle ran roughly three to four years. The most recent cycle peaked around April 2021 and bottomed in September 2023, with prices rebounding in October. This downturn lasted two and a half years, and a new DRAM upcycle has now begun.