On February 6, the AI industry chain presented a polarized picture. On one hand, demand for AI infrastructure remains strong and Hon Hai's revenue has risen sharply; on the other, a shortage of memory chips has led Nvidia to postpone the launch of its new gaming chips, reflecting structural divergence within the AI industry.
Benefiting from strong growth in global demand for AI infrastructure, Hon Hai Precision reported impressive January revenue. Sales for the month reached NT$730.04 billion, up 35.5% year on year, demonstrating strong operational resilience. Because Hon Hai is a core Nvidia partner, its substantial revenue growth directly reflects robust global demand for data center servers. The company also expects first-quarter sales to rise 28%, further underscoring the strength of the upstream AI supply chain.
In sharp contrast, Nvidia announced that it would postpone the launch of its new gaming chips, marking the first year in nearly three decades without a new gaming chip release. The core reason reported for the delay is the memory chip shortage triggered by the global AI boom: Nvidia has directed scarce memory chip capacity toward its more profitable AI business, sharply reducing gaming GPU output. In addition, Amazon plans to invest $200 billion this year in AI, chips, robotics and related areas, an outlay exceeding Google's, further tightening the supply-demand imbalance in memory chips.
Meanwhile, the AI large-model field continues to see new developments. Anthropic released Claude Opus 4.6, and OpenAI launched GPT-5.3-Codex and Frontier, both targeting enterprise-level agent automation and pushing AI capabilities further ahead. However, amid the broader correction in U.S. equities, related tech stocks generally fell.
Market analysts point out that the current AI industry chain shows a pattern of a strong upstream and a divergent downstream. AI infrastructure and large-model development remain the core segments, while downstream applications still await technological breakthroughs and real-world deployment. Going forward, attention may focus on investment opportunities in upstream AI chips and servers.