Global semiconductor revenue is projected to grow 14% in 2025 to total $717 billion, according to the latest forecast from Gartner, Inc. In 2024, the market is forecast to grow 19% and reach $630 billion.
Following a decline in 2023, semiconductor revenue is rebounding and expected to record double-digit growth in 2024 and 2025 (see Table 1). “The growth is driven by a continued surge in AI-related semiconductor demand and recovery in electronics production, while demand from the automotive and industrial sectors continues to be weak,” said Rajeev Rajput, Senior Principal Analyst at Gartner.
Table 1: Semiconductor Revenue Forecast, Worldwide, 2023-2025 (Billions of U.S. Dollars)
|            | 2023  | 2024  | 2025  |
|------------|-------|-------|-------|
| Revenue    | 530.0 | 629.8 | 716.7 |
| Growth (%) | -11.7 | 18.8  | 13.8  |
Source: Gartner (October 2024)
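As a quick arithmetic check (not part of the Gartner forecast itself), the growth rates in Table 1 follow directly from the year-over-year revenue figures. The minimal Python sketch below hardcodes the revenue values from the table and reproduces the 18.8% and 13.8% figures; the 2023 decline cannot be recomputed here because the 2022 base is not shown.

```python
# Year-over-year growth implied by the Table 1 revenue figures
# (billions of U.S. dollars; values hardcoded from the table above).
revenue = {2023: 530.0, 2024: 629.8, 2025: 716.7}

for year in (2024, 2025):
    growth = (revenue[year] / revenue[year - 1] - 1) * 100
    print(f"{year}: {growth:.1f}% growth")  # 2024: 18.8%, 2025: 13.8%
```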
In the near term, the memory market and graphics processing units (GPUs) will bolster semiconductor revenue globally.
Worldwide memory revenue is forecast to grow 20.5% in 2025 to total $196.3 billion. Sustained undersupply in 2024 will push NAND prices up 60% this year, but they are poised to decline by 3% in 2025. With lower supply and a softer pricing landscape in 2025, NAND flash revenue is forecast to total $75.5 billion in 2025, up 12% from 2024.
DRAM supply and demand will rebound on the back of an improving supply-demand balance, unprecedented high-bandwidth memory (HBM) production alongside rising demand, and increases in double data rate 5 (DDR5) prices. Overall, DRAM revenue is expected to total $115.6 billion in 2025, up from $90.1 billion in 2024.
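The memory figures above can be cross-checked with the same kind of arithmetic. The illustrative sketch below uses only the revenue numbers quoted in this section (not additional Gartner data) to derive the DRAM growth rate implied by the 2024 and 2025 totals, and the 2024 NAND revenue implied by the "up 12% to $75.5 billion" forecast.

```python
# Implied figures from the memory forecasts quoted above (billions of USD).
dram_2024, dram_2025 = 90.1, 115.6    # DRAM revenue, 2024 and 2025
nand_2025, nand_growth = 75.5, 0.12   # NAND revenue in 2025 and its growth rate

dram_growth = (dram_2025 / dram_2024 - 1) * 100
nand_2024 = nand_2025 / (1 + nand_growth)

print(f"Implied DRAM growth in 2025: {dram_growth:.1f}%")   # ~28.3%
print(f"Implied NAND revenue in 2024: ${nand_2024:.1f}B")   # ~$67.4B
```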
AI Impact on Semiconductors
Since 2023, GPUs have dominated the training and development of AI models. GPU revenue is projected to total $51 billion in 2025, an increase of 27%. “However, the market is now shifting to a return on investment (ROI) phase where inference revenues need to grow to multiples of training investments,” said George Brocklehurst, VP Analyst at Gartner.
This shift is driving a steep increase in demand for HBM, a high-performance memory solution for AI servers. “Vendors are investing significantly in HBM production and packaging to match next-generation GPU/AI accelerator memory requirements,” said Brocklehurst.
HBM revenue is expected to increase by more than 284% in 2024 and 70% in 2025, reaching $12.3 billion and $21 billion, respectively. Gartner analysts predict that by 2026, more than 40% of HBM chips will serve AI inference workloads, compared with less than 30% today, mainly due to increased inference deployments and the limited scope for repurposing training GPUs.
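The HBM trajectory is internally consistent as well: applying the quoted 70% growth to the $12.3 billion 2024 figure lands at roughly the $21 billion forecast for 2025, as the brief sketch below illustrates (figures taken from the paragraph above, rounding accounts for the small gap).

```python
# HBM revenue trajectory quoted above (billions of USD).
hbm_2024, growth_2025 = 12.3, 0.70

hbm_2025 = hbm_2024 * (1 + growth_2025)
print(f"Implied 2025 HBM revenue: ${hbm_2025:.1f}B")  # ~$20.9B, consistent with the ~$21B forecast
```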