The world of artificial intelligence — across the U.S., China, and the regions beyond these major model-development
centers, with their infrastructure hubs, deployment ecosystems, and differing governance approaches — has undergone
seismic change since DeepSeek surfaced forcefully on January 20, 2025.
We may be in for another major change when DeepSeek releases its new reasoning model. Why has there been no release yet,
besides the launch of new versions of an experimental model? DeepSeek is a different animal, not competing with OpenAI,
Anthropic, Alibaba, Tencent, or Moonshot, but spurring innovation, driving down costs for training and inference, and pushing full-throttle
toward artificial general intelligence. The whale has likely been slowed in its release of R2 by limited access to
compute, but the firm is expected to release its new model soon.
DeepSeek’s initial success in demonstrating the power of open-source and open-weight models has rippled through China’s
AI sector, leading most (but not all) major AI labs to release OS/OW models. Some leading labs, like ByteDance, will
probably not join the trend, and competition will intensify in terms of OS/OW models and benchmarks. The release of
DeepSeek’s new R2 model will likely mean a whole new round of competition at higher levels of performance.
Uptake of AI applications, including those based on OS/OW models from leading players such as DeepSeek, Alibaba,
Tencent, Moonshot AI, and Zhipu AI, has already rapidly increased across China’s private sector, government, and
state-owned enterprises. This will continue into 2026, bolstered by the release of implementing guidance for China’s AI+ initiative.
So far, most of the U.S. government’s “concern” about Chinese AI models has focused on DeepSeek, as described in the
CAISI report. In 2026, that concern will expand to the many other Chinese OS/OW models gaining traction around the
world. Expect a lot more fear, uncertainty, and doubt, and agenda-driven “analysis” of Chinese models and the alleged risks they pose.
As the uptake of Chinese OS/OW models outside China among researchers and companies continues apace, increasingly for
production-based applications, there will be greater scrutiny from the U.S. government, as in the leaked White House
memo targeting Alibaba. There are likely to be more reports like those from Exiger and CrowdStrike, alleging military
ties for Chinese AI firms or flagging other so-called risks around using these models.
AI labs in the U.S. will consider releasing more robust OS/OW models to counter the dominance of Chinese players in this
space. The Olmo models, created by the Allen Institute for AI, are likely to at least get a look from research
organizations in early 2026. We will likely also see the release of more advanced models from DeepSeek, Alibaba,
Moonshot, and others, which will probably outpace the Allen Institute’s ability to keep up.
Despite the government support for OS/OW models promised in the U.S. AI Action Plan in response to DeepSeek, there is little
Washington can do in 2026 to either promote open-source efforts or slow Chinese companies’ efforts to develop more capable models.
Throughout 2025, there was major discussion about which GPUs American firms would be allowed to sell to Chinese
customers. This debate will continue in 2026, as U.S. President Donald Trump considers approving the export of Nvidia’s
H200 Hopper-era GPU. White House AI czar David Sacks and Nvidia CEO Jensen Huang will continue to press for a sliding
scale on GPU exports, possibly tied to GPU generation: Hoppers, yes for now; Blackwell, Rubin, and Feynman, not yet.
With the U.S.-China relationship set to remain on fairly stable ground through the planned April visit by President
Trump to Beijing, look for AI to be an arena that will see both efforts at greater cooperation — potentially in areas
such as AI safety and security and access to GPUs — and contention. China critics in Washington will continue to push
the increasingly tenuous argument that controls on GPUs and semiconductor manufacturing tools would eventually slow the
ability of Chinese companies to develop advanced AI models and applications.
A number of measures and legislative initiatives are under consideration as the U.S. targets the Chinese AI technology
stack. At present, however, both sides appear willing to forgo new measures in order to see the agreement reached in
South Korea in late October implemented, particularly around issues such as rare earths.
In the meantime, expect churn around these issues on both sides. The Department of Justice, for example, last month
charged several individuals for allegedly smuggling “advanced” GPUs to Chinese customers in violation of export control
regulations. Most of the GPUs in question were A100s — hardly advanced by any measure in the industry. The numbers
involved, several hundred, appear insignificant in terms of training frontier AI models, and the action was likely
intended more as a warning to those willing to circumvent current restrictions.
At the same time, it is likely that Chinese model developers will continue to have access to advanced GPUs in data
centers outside China; this access could actually expand significantly in 2026. The U.S.-China trade truce means that any
move to restrict this channel is unlikely in early 2026. The recent approval of some limited licenses for advanced GPUs, the Nvidia
GB200/300s, and likely the AMD GPUs for AI data center build-outs in the UAE and Saudi Arabia signal that the Department
of Commerce is developing a new framework to replace the AI Diffusion Rule, rescinded in May.
The department is attempting to balance Sacks’ push for expedited licensing with the concerns of some national security
officials around potential diversion of GPUs and/or access to these data centers by Chinese companies and researchers.
AI data centers already operating GB200s in Europe and Asia currently have no such restrictions, and can be used by
Chinese AI model developers. This will continue to be a major issue in 2026.
It remains unclear if DeepSeek is using this potential workaround. This is in part because DeepSeek and other leading
Chinese model developers still have access to capable Nvidia GPUs, and in 2026 will ramp up efforts to deploy domestic
alternatives from Huawei and Chinese GPU startups including Moore Threads, Biren, Enflame, and others.