(CLO) Chinese AI startup DeepSeek has just revealed data on costs and revenue related to its popular V3 and R1 models.
According to the statement, DeepSeek’s theoretical cost-profit margin can reach 545% per day. However, the company notes that actual revenue is significantly lower.
DeepSeek is growing strongly, especially in its home country of China. Photo: X
This is the first time DeepSeek has published figures on the profitability of inference, the stage after training in which a trained AI model performs tasks, such as a chatbot answering user questions.
The revelation could send shockwaves through the market for AI stocks outside China, which plunged in January after chatbots based on DeepSeek's R1 and V3 models gained global popularity.
The sell-off was partly due to DeepSeek’s announcement that it spent less than $6 million on Nvidia H800 chips to train its models, far less than U.S. rivals like OpenAI. The H800 chips DeepSeek uses are also less powerful than those available to OpenAI and other U.S. AI companies, leading investors to question whether those companies’ multibillion-dollar spending on advanced chips is justified.
Assuming the cost of renting an H800 chip is $2 per hour, the total daily inference cost for the V3 and R1 models is $87,072, DeepSeek said in a March 1 GitHub post. Meanwhile, the theoretical daily revenue from the two models is $562,027, implying a theoretical cost-profit margin of 545%. Annualized, the theoretical revenue would exceed $200 million.
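A minimal back-of-the-envelope check, using only the figures quoted from the GitHub post above (variable names are illustrative, not DeepSeek's):

```python
# Figures quoted in DeepSeek's March 1 GitHub post, per the article above.
daily_inference_cost = 87_072        # USD/day, assuming H800 rental at $2/hour
daily_theoretical_revenue = 562_027  # USD/day, theoretical peak

# Cost-profit margin: profit expressed as a fraction of cost.
margin = (daily_theoretical_revenue - daily_inference_cost) / daily_inference_cost
print(f"Theoretical daily cost-profit margin: {margin:.0%}")  # ~545%

# Annualized theoretical revenue.
annual_revenue = daily_theoretical_revenue * 365
print(f"Annualized theoretical revenue: ${annual_revenue:,.0f}")  # ~$205 million
```

The arithmetic reproduces both the 545% margin and the "more than $200 million" annualized figure cited in the post.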
However, DeepSeek emphasizes that actual revenue is much lower because pricing for the V3 model is lower than for R1, and developers pay less during off-peak hours.
In addition to the profit margin figures, DeepSeek has also disclosed technical details about the development of its R1 reasoning model, which matches the performance of OpenAI's o1 at a fraction of the cost. The move is expected to accelerate advances in the field worldwide.
Chinese companies, from chipmakers to cloud providers, are rushing to support DeepSeek’s AI models. Moore Threads and Hygon Information Technology, AI chipmakers with ambitions to compete with Nvidia, say their computer clusters and accelerators can support DeepSeek’s R1 and V3 models.
Huawei Technologies has also partnered with AI infrastructure startup SiliconFlow to deliver DeepSeek models to customers on the cloud, with performance comparable to models running on global high-end chips.
Other major companies like Alibaba, Baidu, and Tencent are also working to make DeepSeek’s models accessible through their services. DeepSeek’s success has turned the startup and its founder, Liang Wenfeng, into celebrities.
Cao Phong (according to CNBC, CNN, SCMP)
Source: https://www.congluan.vn/deepseek-tuyen-bo-ty-le-chi-phi-loi-nhuan-ly-thuyet-la-545-moi-ngay-post336849.html