Study Finds AI System Spontaneously Adapts to Resemble Human Brain In Self-Organization Process
Technology

Summary: Scientists at the University of Cambridge imposed a simple physical constraint on an artificial intelligence system, and the AI responded by developing traits characteristic of the human brain.


Cambridge researchers applied physical constraints to an AI system that mirror those shaping the development of human and animal brains. The study, published in Nature Machine Intelligence, shows the system developing features characteristic of the brains of complex organisms. The finding could lead to more efficient AI and offer new insight into how the human brain itself works.

In place of real neurons, the system uses computational nodes that perform analogous functions: receiving input, transforming it, and producing output. The physical constraint mimicked the difficulty neurons have communicating over distance: each node was assigned a specific location in a virtual space, and communication between nodes became harder the farther apart they sat.
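The idea of placing nodes in a virtual space and penalizing long-range communication can be sketched as a distance-weighted wiring cost. This is a minimal illustration, not the paper's exact formulation: the node count, 2-D layout, and penalty form are all assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assign each computational node a random position in a 2-D virtual space.
n_nodes = 100
positions = rng.uniform(0.0, 1.0, size=(n_nodes, 2))

# Random initial connection strengths between every pair of nodes.
weights = rng.normal(0.0, 0.1, size=(n_nodes, n_nodes))

# Pairwise Euclidean distances between all node positions.
diff = positions[:, None, :] - positions[None, :, :]
distances = np.linalg.norm(diff, axis=-1)

# Wiring cost: each connection is penalized in proportion to the distance
# it spans. Training that minimizes task loss plus this cost favors short,
# local connections, which is one route to hub-like structure.
wiring_cost = np.sum(np.abs(weights) * distances)
```

Adding `wiring_cost` to the task loss during training is what makes long-distance communication "expensive" for the network.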

Given a maze-navigation task under this constraint, the system initially made errors but improved through feedback and repetition, facing a challenge much like the brain's need to form connections across physical distances. As it learned, it adapted its strategies in ways reminiscent of human brains, developing highly connected hubs. Notably, individual nodes shifted from handling specific tasks to a "flexible coding scheme."

From a single simple constraint, the AI system developed complex characteristics mirroring features found in animal brains. This insight could lead to more efficient AI models, a particular benefit for resource-intensive technologies like OpenAI's GPT. Jascha Achterberg highlighted the prospect of simplifying internal structures so that models run more efficiently on computer chips and can be distributed across many chips in large compute clusters.

Researchers are pursuing two paths: making the model more brainlike using "spiking neural networks," and applying the insights to large-scale AI systems to make processing more energy-efficient in traditionally power-hungry setups.