Ilya Sutskever, co-founder and former chief scientist of OpenAI, predicts that AI pre-training as we know it will come to an end.
The AI expert left OpenAI, the company he co-founded, earlier this year to start his own AI lab called Safe Superintelligence Inc.
“Pre-training as we know it will cease to exist,” Sutskever said at the Conference on Neural Information Processing Systems (NeurIPS).
The term "pre-training" refers to the early stage in AI model development, when a large language model learns patterns from large amounts of unlabeled data, often text from the internet, books, and other sources.
Data resource exhaustion
Sutskever said that while existing data can still drive AI progress, the industry is running out of new data to train models on. This, he says, will eventually force a change in how AI models are trained.
He likened the situation to fossil fuels: just as oil fields are a finite resource, the internet contains only a finite amount of human-created content.
“We have reached peak data, and there will be no more,” Sutskever said. “We have to work with the data we have. There is only one internet.”
Sutskever predicts that next-generation models will be “truly agent-like.” “Agent” is a popular term in the field of AI, generally understood as an autonomous AI system that performs tasks, makes decisions, and interacts with software independently.
In addition to being “agent-like,” he said future AI systems will be able to reason. Unlike current AI, which mostly matches patterns it has seen during training, future systems will work through problems step by step, in a way closer to thinking. “The more reasoning, the less predictable the system becomes,” Sutskever said.
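The term “agent” can be made concrete with a short sketch: a loop in which a model repeatedly decides on an action, executes it against some tool or environment, and observes the result. The tools, the toy decision policy, and the task below are hypothetical placeholders for illustration, not part of any system Sutskever described; in a real agent, a language model would choose the tool and its argument.

```python
# Minimal sketch of an "agent" loop: decide -> act -> observe, repeated for a few steps.
# The tools, the toy policy, and the task are illustrative assumptions only.
from typing import Callable

# A couple of stand-in "tools" the agent can call on its own.
def calculator(expr: str) -> str:
    return str(eval(expr, {"__builtins__": {}}))     # toy arithmetic tool, illustration only

def search(query: str) -> str:
    return f"(pretend search results for: {query})"

TOOLS: dict[str, Callable[[str], str]] = {"calculator": calculator, "search": search}

def choose_action(task: str, history: list[str]) -> tuple[str, str]:
    """Hypothetical policy: a real agent would let a language model pick the tool and argument."""
    if not history:
        return "search", task
    return "calculator", "6 * 7"

def run_agent(task: str, max_steps: int = 3) -> list[str]:
    history: list[str] = []
    for _ in range(max_steps):
        tool, arg = choose_action(task, history)              # decide
        observation = TOOLS[tool](arg)                        # act on software independently
        history.append(f"{tool}({arg!r}) -> {observation}")   # observe the result
    return history

for line in run_agent("estimate six times seven"):
    print(line)
```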
AI may find ways to train itself
The expert also compared the development of AI systems to evolutionary biology, citing research showing the relationship between brain and body size in animals.
For example, while most mammals follow a certain proportional pattern, humans have markedly different brain and body proportions.
Just as evolution found a new scaling pattern for the brains of our hominid ancestors, he suggested, AI may find new ways to scale beyond today's approach to training models.
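The brain-body relationship Sutskever pointed to is a power law: on a log-log plot, most mammals fall close to one line, while humans sit well above it. The exponent, coefficient, and masses below are rough, assumed numbers chosen only to illustrate that deviation; they are not figures from his talk.

```python
# Illustrative power-law scaling: brain_mass ~ a * body_mass**b (a straight line in log-log space).
# The coefficients and the animal masses are approximate, assumed values for illustration only.

def predicted_brain_g(body_kg: float, a: float = 10.0, b: float = 0.75) -> float:
    """Generic mammalian trend line (assumed coefficients)."""
    return a * body_kg ** b

animals = {                  # body mass (kg), actual brain mass (g), approximate textbook values
    "cat": (4.0, 30.0),
    "chimpanzee": (50.0, 400.0),
    "human": (70.0, 1350.0),
}

for name, (body, brain) in animals.items():
    trend = predicted_brain_g(body)
    # Humans land far above the generic mammalian line: the "new scaling" the analogy refers to.
    print(f"{name:>10}: trend {trend:7.1f} g, actual {brain:7.1f} g, ratio {brain / trend:4.1f}x")
```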
(According to The Verge, Yahoo Tech)
Source: https://vietnamnet.vn/nha-sang-lap-openai-ai-se-tim-ra-cach-tu-dao-tao-chinh-no-2352692.html