Huawei has announced that the DeepSeek-R1 AI model will be available on its ModelArts Studio platform. The company did not specify which hardware it uses, but media reports indicate the model runs on Ascend 910C chipsets. The move has prompted speculation about whether Huawei's infrastructure matches the hardware on which the AI model was trained.
A tipster on X (formerly Twitter) shared promotional material from Huawei about its “Ascend-adapted” platform. The material confirms that Huawei’s AI infrastructure handles inference for DeepSeek-R1, but Huawei has not revealed whether the same hardware was used for training.
Huawei’s Ascend 910C: Is It an Nvidia H800 Killer?
Huawei’s Ascend 910C has recently been positioned as a challenger to the Nvidia H800, often considered the “gold standard” among GPUs for deep learning workloads. Industry professionals, however, believe the chip carries certain performance penalties.
Training AI models requires an enormous amount of computation, and optimizing models for a specific hardware setup takes time. If Huawei has successfully run DeepSeek-R1 on its Ascend-based platform, it leaves open the possibility that the same hardware was used to train it. At this moment, however, there is no concrete evidence to support that assumption.

The DeepSeek-R1 model remains at the heart of controversy. OpenAI accuses DeepSeek of using its proprietary models without permission to train its version. OpenAI claims to have proof of the unauthorized use but has declined to make it public.
DeepSeek’s release strategy has also drawn criticism. Although the company made the model open-source, it released only the model weights and disclosed no information about its datasets or training process. This lack of transparency has raised questions about how the model was developed.
Industry Concerns Over DeepSeek-R1’s Low Development Cost
DeepSeek AI’s claim that it developed R1 for just $6 million (approximately ₹51.9 crores) has raised many eyebrows. Training advanced language models usually requires vast resources, often running into hundreds of millions of dollars. The unusually low cost has led some experts to question whether DeepSeek leveraged pre-existing AI architectures.
Despite these concerns, DeepSeek-R1 has gained significant attention for its efficiency and open-source nature. The model’s capabilities make it one of the stronger contenders in a highly competitive AI landscape.
US AI Chipset Restrictions Impact China’s AI Growth
Last year, the US government imposed strict restrictions on AI chipset sales to China, preventing American companies from selling high-performance GPUs. This policy was aimed at limiting China’s AI advancements and maintaining the US’s competitive edge in artificial intelligence.
As a result, Chinese firms like Huawei have focused on developing their own AI chipsets. The Ascend 910C serves as a response to these restrictions, providing an alternative to Nvidia’s GPUs. The ongoing chip war has intensified AI development efforts in China, pushing companies to innovate despite regulatory challenges.
What’s Next for DeepSeek-R1 and Huawei?
DeepSeek-R1’s adoption on Huawei’s ModelArts Studio marks a significant step in China’s AI expansion. If Huawei continues refining its AI infrastructure, it could strengthen China’s position in the global AI race.
The controversy surrounding DeepSeek-R1’s training process remains unresolved. Industry experts will closely monitor further developments, particularly any official response from OpenAI. Meanwhile, Huawei’s progress in AI chip development will play a crucial role in shaping China’s AI capabilities.
As AI competition intensifies, the DeepSeek-R1 model highlights both the technological advancements and challenges facing the industry. Whether Huawei’s hardware can rival Western alternatives remains to be seen, but China’s AI ambitions continue to grow despite global restrictions.