DeepSeek, a rising star in the AI industry, has recently claimed a 545% cost-profit margin for its AI models—but only if everyone using them paid. This ambitious projection was made in a GitHub post where the Chinese startup provided details on its financials, revealing the daily running cost and revenue potential of its V3 and R1 models. However, this claim comes with several caveats and assumptions that make it far from guaranteed.
The Numbers Behind DeepSeek’s Claim
According to DeepSeek, the company's daily running cost for these models, based on the use of Nvidia H800 chips, totals $87,072. If all usage were billed at R1 pricing, it would generate $562,027 in daily revenue, which works out to the claimed 545% cost-profit margin.
“In theory, this would translate into over $200 million in annual revenue,” says DeepSeek.
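The reported figures do reproduce both headline numbers when the margin is computed as (revenue − cost) / cost; a minimal sanity check:

```python
# Sanity-check of DeepSeek's reported figures using the standard
# cost-profit margin formula: (revenue - cost) / cost.

daily_cost = 87_072       # reported daily running cost, USD
daily_revenue = 562_027   # theoretical daily revenue at R1 pricing, USD

margin = (daily_revenue - daily_cost) / daily_cost
annual_revenue = daily_revenue * 365

print(f"cost-profit margin: {margin:.0%}")         # 545%
print(f"annualized revenue: ${annual_revenue:,}")  # $205,139,855
```

Annualizing the theoretical daily revenue gives roughly $205 million, consistent with the "over $200 million" claim.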
However, the company clarifies that only a subset of its services are monetized at the moment. DeepSeek’s financial projection assumes that every single user of its AI models switches to a paid plan—something no competitor has yet achieved. This makes the claim highly speculative.
Discounts and Lower Costs in Practice
Adding another layer of complexity, DeepSeek applies off-peak pricing during nighttime hours when demand is lower. The company reallocates its Nvidia chips to research and training during these hours, lowering operational costs, and users benefit from automatic nighttime discounts, which further reduce DeepSeek's actual revenue below the theoretical figure.
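The mechanics of such time-of-day pricing can be sketched as a simple rate function. Note that the specific numbers below are illustrative assumptions, not DeepSeek's published schedule: the GitHub post does not spell out the exact discount rate or window used here.

```python
# Hypothetical illustration of time-of-day pricing like the nighttime
# discount described above. The 50% off-peak discount and the
# 00:00-08:00 window are assumptions for illustration only.

def price_per_million_tokens(hour: int, peak_rate: float = 2.0,
                             off_peak_discount: float = 0.5) -> float:
    """Return the per-million-token rate in USD for a given hour (0-23)."""
    if not 0 <= hour <= 23:
        raise ValueError("hour must be in 0..23")
    is_off_peak = 0 <= hour < 8  # assumed nighttime window
    return peak_rate * (1 - off_peak_discount) if is_off_peak else peak_rate
```

Under this sketch, the same query costs half as much at 3 a.m. as at noon, which is exactly why revenue realized in practice falls short of a projection that bills every request at the full daytime rate.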
Additionally, much of DeepSeek's traffic runs on the lower-priced V3 model, whose cheaper rates are not reflected in the theoretical 545% figure, which assumes R1 pricing across the board. Together, these factors suggest the 545% margin would not apply to all of its services.
Rival Reactions and Skepticism
The ambitious claim has drawn significant skepticism from tech giants and experts in the field. For instance, Google DeepMind’s Demis Hassabis has called DeepSeek’s claims “exaggerated,” suggesting that its low-cost model fails to account for the entire scope of investment required to build such AI systems.
“The $5.6 million figure likely only covers the final training phase of DeepSeek’s models, overlooking the broader infrastructure and development costs,” says Hassabis.
Despite the doubts, DeepSeek’s approach to training AI models with far less financial investment has sparked debates about the future of AI infrastructure costs, challenging the conventional wisdom that billions of dollars in resources are necessary to build competitive AI systems.
Could DeepSeek Disrupt AI’s Cost Structure?
DeepSeek's ability to leverage Nvidia H800 chips—less powerful but significantly cheaper than the flagship chips used by rivals such as OpenAI—has made its low-cost approach intriguing. If this model proves sustainable, it could challenge the billion-dollar infrastructure norms currently dominating the AI industry.
However, with the uncertainty around its monetization strategies and operational assumptions, DeepSeek’s financial projections remain speculative. Industry analysts will be watching closely to see whether DeepSeek can prove its model’s viability.