Building a chatbot like ChatGPT costs billions of dollars. That’s the motivation behind OpenAI’s plan to change its corporate structure.
OpenAI raised $10 billion in early 2023. Just 18 months later, the company had burned through most of that money. So it raised another $6.6 billion and arranged to borrow another $4 billion.
But in 18 months or so, OpenAI will need another infusion of cash, as the startup spends more than $5.4 billion a year. And by 2029, that number is expected to rise to $37.5 billion.
OpenAI’s rapidly rising costs are a big reason its original nonprofit structure may soon change. OpenAI needs to raise billions of dollars in the coming years, and its CEO believes it will be more attractive to investors if it becomes a for-profit.
AI has upended the way computer technology is created. For decades, engineers in Silicon Valley designed new technologies one step at a time.
When they build social networking apps like Facebook or shopping sites like Amazon, they write computer code line by line. With each new line, they carefully define what the app will do.
But AI systems are built differently: engineers feed them huge amounts of data, and the more data the systems ingest, the more powerful they become.
Just as students learn more by reading more books, AI systems can improve their skills by training on larger sets of data. Chatbots like ChatGPT learn their skills by ingesting nearly all the English text on the Internet.
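The idea that more data yields a more capable system can be sketched with a toy next-word predictor. This is an illustration only: real chatbots are neural networks trained on trillions of words, not word-frequency tables, and the tiny corpora here are invented for the example.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count, for each word, which word most often follows it."""
    follows = defaultdict(Counter)
    words = corpus.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def predict(model, word):
    """Return the most likely next word, or None if the word was never seen."""
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

small = "the cat sat"
large = "the cat sat . the dog sat . the cat ran"

m_small = train_bigram(small)
m_large = train_bigram(large)

print(predict(m_small, "dog"))  # None: "dog" never appeared in the small corpus
print(predict(m_large, "dog"))  # "sat": the larger corpus covers more cases
```

The same principle, scaled up by many orders of magnitude, is why chatbot builders keep reaching for ever-larger datasets.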
That requires ever more computing power from data centers. Inside those data centers are computers packed with thousands of specialized computer chips (graphics processing units, or GPUs), which cost more than $30,000 each.
Costs are being pushed higher as the chips, data centers and electricity needed are in short supply.
Sean Holzknecht, CEO of data center operator Colovore, said this new type of data center costs 10 to 20 times more than a traditional data center.
Specialized chips take months to run the calculations that allow ChatGPT to pinpoint patterns in all that data. Each “training run” can cost hundreds of millions of dollars.
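A back-of-envelope calculation shows how the figures above compound. The GPU price comes from the article; the fleet size is a hypothetical round number, since real training clusters vary widely.

```python
# Rough hardware cost of a training cluster.
gpu_price = 30_000   # dollars per GPU (article: "more than $30,000 each")
gpu_count = 10_000   # hypothetical fleet size; real clusters vary widely

hardware_cost = gpu_price * gpu_count
print(f"${hardware_cost:,}")  # $300,000,000 for the chips alone
```

Even before electricity, staffing, and data-center construction, the chips alone reach the hundreds of millions of dollars the article describes.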
“Imagine reading and rereading what’s on the internet,” said David Katz, a managing partner at Radical Ventures, a venture capital firm that invests in AI startups. “It’s the most computationally intensive task the world has ever seen.”
Google, Microsoft, OpenAI and others are working to expand the global pool of data centers needed to build the technology.
They plan to spend hundreds of billions of dollars to increase the number of computer chips they produce each year, install them in facilities around the world and secure the electricity they need to operate.
Those costs are especially steep when companies like OpenAI, Google, and Anthropic offer chatbots for free to users. Even charging $20 a month doesn’t cover the costs.
Since developing the first version of ChatGPT, OpenAI has steadily improved the chatbot, feeding it ever-larger amounts of data, including images and audio as well as text.
The company recently unveiled a version of ChatGPT that “reasons” through math, science, and computer programming problems. OpenAI builds the technology using a technique called reinforcement learning.
Through this process, the system learns additional behavior over months of trial and error. For example, when solving different math problems, it can learn which methods lead to the right answer and which do not.
When people use this system, it “thinks” before responding. When someone asks it a question, it explores many possibilities before giving an answer.
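The trial-and-error process described above can be sketched in miniature. This is a drastic simplification: real reinforcement learning optimizes a neural network's behavior from reward signals, while this toy merely scores two hypothetical "methods" for doubling a number and keeps the one that earns more reward.

```python
import random

def method_a(x):
    return x * 2   # the correct method for doubling

def method_b(x):
    return x + 2   # a wrong method (right only when x == 2)

def train(episodes=200, seed=0):
    """Try both methods at random and score each by whether its answer was right."""
    rng = random.Random(seed)
    scores = {"a": 0, "b": 0}
    for _ in range(episodes):
        x = rng.randint(1, 10)
        name = rng.choice(["a", "b"])                 # explore both methods
        answer = method_a(x) if name == "a" else method_b(x)
        scores[name] += 1 if answer == x * 2 else -1  # reward correct answers
    return scores

scores = train()
best = max(scores, key=scores.get)
print(best)  # "a": the method that earned reward wins out
```

Over many trials, the correct method accumulates reward and the wrong one does not, which is the core of how a system can learn which problem-solving approaches work without being told explicitly.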
OpenAI sees this technology, called OpenAI o1, as the future of its business. But it requires even more computing power.
That's why the company predicts its computing costs will increase sevenfold by 2029 as it pursues the dream of artificial general intelligence, a machine that rivals or surpasses the human brain.
“If you try to pursue science fiction, the costs will continue to increase,” said Nick Frosst, a former Google researcher and co-founder of AI startup Cohere.
(According to NYT)
Source: https://vietnamnet.vn/tai-sao-openai-can-nhieu-tien-nhu-vay-2353669.html