AI may not need huge training data after all

By ScienceDaily
Publication Date: 2026-01-04 12:00:00

New research from Johns Hopkins University shows that artificial intelligence systems with biologically inspired designs can exhibit activity resembling that of the human brain even before they are trained on any data. The study suggests that the structure of an AI system may be as important as the amount of data it processes.

The results, published in Nature Machine Intelligence, challenge the prevailing strategy in AI development. Rather than relying on months of training, massive datasets, and enormous computing power, the research highlights the value of starting with a brain-like architectural foundation.

Rethinking the data-intensive AI approach

“The way the AI field is currently evolving is by throwing a lot of data at the models and building computing resources the size of small cities. This requires spending hundreds of billions of dollars. Meanwhile, people learn to see with very little data,” said lead author Mick Bonner, an assistant professor of cognitive science at Johns Hopkins University.