CITIC Securities: Domestic Large Models Exceed Expectations, Emphasizing Investment Opportunities in Domestic Computing Power


A CITIC Securities research report states that during the 2026 Spring Festival, token usage of domestic large models surged. As of February 22, the top three models by global token usage in the preceding week were all domestic large models. We believe the explosive growth in tokens fundamentally reflects an exponential increase in AI inference demand. With cost advantages and a continuously improving ecosystem, domestic computing power is expected to gradually dominate at the infrastructure level. We recommend focusing on valuation re-rating opportunities driven by increased super-node interconnection density, including optical communication, high-speed line modules, switching chips and switches, and data centers.

Full Text

Communication | Domestic Large Models Surpass Expectations, Focus on Domestic Computing Power Investment Opportunities

During the 2026 Spring Festival, domestic large model token usage surged. As of February 22, in the preceding week's global large model token usage, MiniMax M2.5 led with 3.02 trillion tokens, Kimi K2.5 ranked second with 1.03 trillion tokens, and GLM 5 was third with 816 billion tokens, all of them domestic large models. We believe the core drivers of this growth are: 1) AI shifting from an optional feature to the default interface for both consumer and enterprise users, driving penetration across the full user base; 2) application scenarios evolving from simple conversational AI to multimodal (text/image/audio/video) workloads and AI agents, with per-task token consumption rising exponentially. In addition, data from OpenRouter shows that over 70% of token consumption comes from internet giants, large and medium-sized enterprises, and professional developers running production workloads, whose token volume per call far exceeds that of individual users or small test projects. The explosive growth in tokens essentially reflects an exponential expansion of AI inference demand.

Hardware Layer: Domestic Computing Power Expected to Gradually Dominate Infrastructure Due to Cost Advantages and Ecosystem Improvements

During the Spring Festival, Alibaba and Tencent both announced major advances in NPO technology. Alibaba Cloud's all-optical Scale-up network architecture, UPN512, connects xPUs and switches via optical interconnects, using a single-layer CLOS topology to achieve full interconnection of 512 xPUs. This design eliminates high-speed copper cables inside the cabinet, significantly reducing wiring complexity and the heat-dissipation, power-delivery, and maintenance burden, with power consumption halved and costs reduced by 30%. We believe domestic computing power infrastructure, leveraging cost and ecosystem advantages, is poised to evolve from serving domestic giants to supporting global users via domestic AI models, raising the demand ceiling for domestic computing power.
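As a rough sketch of the port arithmetic behind a single-layer full-interconnect fabric like the UPN512 design described above (the xPU count matches the article, but the links-per-xPU figure and switch radix below are hypothetical assumptions, not numbers from the report):

```python
# Illustrative port arithmetic for a single-tier full-interconnect Scale-up
# fabric in the style of UPN512. The xPU count matches the article; the
# links-per-xPU and switch radix values are hypothetical assumptions.

def flat_fabric_switches(num_xpus: int, links_per_xpu: int, switch_radix: int):
    """Each xPU attaches one link to each switch 'plane'. A plane needs one
    port per xPU, so the switch radix must be at least the xPU count; the
    number of planes equals the number of links per xPU."""
    if switch_radix < num_xpus:
        raise ValueError("a single-tier design needs radix >= xPU count")
    planes = links_per_xpu            # one switch plane per xPU uplink
    total_ports = planes * num_xpus   # optical ports on the switch side
    return planes, total_ports

planes, ports = flat_fabric_switches(num_xpus=512, links_per_xpu=8, switch_radix=512)
print(planes, ports)  # 8 4096
```

Under these assumed numbers, a flat single-tier design needs a switch radix at least equal to the xPU count, which is exactly the constraint that pushes such architectures toward high-radix optical switching.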

The US-China AI Arms Race Is Heating Up; Focus on Domestic Computing Infrastructure

Major cloud providers in both the US and China are sharply increasing AI-related capital expenditure. According to CNBC on February 20, OpenAI aims to invest approximately $600 billion in computing power by 2030. The Financial Times reported on December 23, 2025 that ByteDance's initial capital expenditure plan for 2026 is 160 billion RMB, up from about 150 billion RMB in 2025. Alibaba also announced at the 2025 Cloud Computing Conference that it will invest beyond the 380 billion RMB already planned for the next three years. For domestic computing power, super-node architectures are the essential path to catching up with and overtaking overseas peers, and cloud providers and equipment makers are accelerating adaptation to open protocols. We recommend focusing on valuation re-rating opportunities driven by increased interconnection density, including optical communication, high-speed line modules, switching chips and switches, and data centers.

  1. Optical Communication: As AI industry prosperity continues, optical interconnects are gradually replacing copper connections, becoming key to high-performance, high-bandwidth, low-latency AI networks. For scale-out, pluggable optical modules remain the undisputed first choice; for scale-up, as a new market, optical module technology is rapidly evolving, with NPO and CPO solutions progressing simultaneously.

  2. High-Speed Line Modules: Responsible for short-distance, high-bandwidth, low-loss connections between boards, such as within cabinets. As super-node scale increases and inter-card communication speeds rise, their value is expected to grow significantly.

  3. Switching Chips and Switches: According to NADDOD and SemiAnalysis estimates, as NVIDIA's super-node scale grows from NVL72 to NVL576 and beyond, the ratio of switching chips to GPUs approaches 1:1 or even exceeds it. AMD's and Huawei's super-node solutions confirm the same trend: as this ratio rises, the value of the switching layer increases substantially.

  4. Data Centers (IDC): IDC is the foundation of computing power, essentially a heavy-asset leasing business with high upfront CapEx. Construction typically takes 1-3 years, with deployment taking another 6-18 months. An occupancy rate of around 50% is roughly break-even; top-tier providers with occupancy above 90% can achieve gross margins over 50% and net margins over 30%. With industry demand rising rapidly, IDC is expected to benefit significantly, especially leading providers with abundant resources.
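The ratio trend in point 3 can be illustrated with simple fat-tree port arithmetic; the radix, links-per-GPU figure, and tier model below are hypothetical assumptions chosen for illustration, not vendor specifications:

```python
# Illustrative sketch of how the GPU : switch-chip ratio compresses as
# super-node scale grows. Radix, links-per-GPU, and the tier model are
# hypothetical assumptions, not vendor specifications.
from math import ceil

def chips_per_tier(gpus: int, links_per_gpu: int, radix: int) -> int:
    # Modeling assumption: in a non-blocking tier, half of each chip's
    # ports face downlinks and half face uplinks, so each chip serves
    # radix // 2 GPU-side links.
    return ceil(gpus * links_per_gpu / (radix // 2))

def gpu_to_chip_ratio(gpus: int, links_per_gpu: int, radix: int, tiers: int) -> float:
    # Approximate each additional tier as adding the same chip count as
    # the leaf tier, since it must carry the same aggregate bandwidth.
    total_chips = chips_per_tier(gpus, links_per_gpu, radix) * tiers
    return gpus / total_chips

# Small single-tier domain vs. a larger two-tier domain (hypothetical figures):
print(gpu_to_chip_ratio(72, 18, 144, 1))    # 4.0  (4 GPUs per switch chip)
print(gpu_to_chip_ratio(576, 18, 144, 2))   # 2.0  (ratio compressing toward 1:1)
```

With these assumed numbers, the GPU-to-chip ratio compresses from roughly 4:1 in a small single-tier domain toward 2:1 with two tiers, and further toward 1:1 as tiers or link counts grow, which is the direction the estimates above describe.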
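The break-even and margin arithmetic in point 4 can be sketched with a toy P&L model; every cost and price figure below is a hypothetical assumption chosen only to reproduce the roughly-50%-occupancy break-even shape described above:

```python
# Toy IDC profit-and-loss model. All capacity, price, and cost figures are
# hypothetical assumptions, picked so that break-even lands near 50%
# occupancy as described in the article.

def idc_pnl(occupancy: float,
            capacity_mw: float = 100.0,
            price_per_mw: float = 1.0,        # annual lease revenue per MW
            fixed_cost: float = 40.0,         # depreciation, land, financing
            variable_cost_per_mw: float = 0.2) -> float:
    revenue = occupancy * capacity_mw * price_per_mw
    cost = fixed_cost + occupancy * capacity_mw * variable_cost_per_mw
    return revenue - cost

# With these assumptions, profit crosses zero near 50% occupancy:
print(idc_pnl(0.5))   # 0.0
print(idc_pnl(0.9))   # positive at high occupancy
```

Because the cost base is dominated by the fixed component, profit is highly sensitive to occupancy: under these assumed numbers the net margin at 90% occupancy lands above 30%, consistent with the ranges cited for top-tier providers.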

Risk Factors:

  • AI technology development and application falling short of expectations
  • Domestic infrastructure development lagging
  • AI-related policies not being implemented as planned
  • Cloud providers and operators’ capital expenditure falling short
  • Geopolitical risks

Investment Strategy:

During the 2026 Spring Festival, domestic large model token usage surged. As of February 22, the top three models by global token usage in the preceding week were all domestic large models. We believe the explosive growth in tokens fundamentally reflects an exponential increase in AI inference demand. With cost advantages and a continuously improving ecosystem, domestic computing power is expected to gradually dominate at the infrastructure level. Focus on valuation re-rating opportunities driven by increased super-node interconnection density, including optical communication, high-speed line modules, switching chips and switches, and data centers.

(Source: People’s Financial News)
