
Scaling laws challenge the endless progress of AI

By Mirage News
Publication Date: 2025-11-26 20:09:00

Photo: Milad Fakurian / Unsplash

OpenAI boss Sam Altman – perhaps the most prominent face of the artificial intelligence (AI) boom that accelerated with the launch of ChatGPT in 2022 – loves scaling laws.

These widely admired rules of thumb, which link an AI model’s size to its capabilities, are largely responsible for the AI industry’s rapid rush to buy up powerful computer chips, build unimaginably large data centers and restart decommissioned nuclear power plants.

As Altman argued in a blog post earlier this year, the “intelligence” of an AI model is assumed to be “approximately equal to the log of resources used to train and run the model” – meaning that each steady gain in performance requires an exponential increase in the data and computing power involved.
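A minimal formalisation of that claim, using symbols of our own choosing rather than Altman’s (I for measured capability, R for the resources used to train and run the model, k and c as unspecified constants), reads:

\[
I \;\approx\; k \log R + c
\qquad\Longrightarrow\qquad
R \;\propto\; e^{I/k}
\]

Inverting the relation makes the cost explicit: each additional fixed increment \(\Delta I\) of capability multiplies the required resources by the same constant factor \(e^{\Delta I / k}\).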

The scaling laws for large language models (LLMs) were first observed in 2020 and further refined in 2022. They were derived by fitting curves to graphs of experimental data. For engineers, they give a simple formula that…
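The article breaks off here, but the 2022 refinement it references (Hoffmann et al.’s “Chinchilla” paper) is commonly summarised by a loss formula of the following shape; this is the published form, not text recovered from the article:

\[
L(N, D) \;=\; E \;+\; \frac{A}{N^{\alpha}} \;+\; \frac{B}{D^{\beta}}
\]

Here \(L\) is the model’s test loss (lower is better), \(N\) is the number of parameters, \(D\) is the number of training tokens, and \(E\), \(A\), \(B\), \(\alpha\), \(\beta\) are constants fit to the experimental curves described above.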
