Scientists at the USC Viterbi School of Engineering and the School of Advanced Computing have created artificial neurons that reproduce the intricate electrochemical behavior of real brain cells, marking a major milestone in neuromorphic computing. This groundbreaking discovery, published in the prestigious journal Nature Electronics, promises to revolutionize artificial intelligence by enabling the development of chips that are orders of magnitude smaller and dramatically more energy-efficient than current technologies, potentially paving the way for artificial general intelligence (AGI). Unlike digital processors or earlier neuromorphic chips that rely on abstract mathematical models to simulate brain activity, these novel neurons physically replicate the fundamental operational principles of biological neurons. This means they don’t just represent brain function symbolically; they tangibly recreate it.
The research, spearheaded by Professor Joshua Yang of USC’s Department of Electrical and Computer Engineering, builds on more than a decade of his pioneering work on artificial synapses. The team’s innovative approach centers on a newly developed device called a "diffusive memristor." Their findings illustrate how these components can give rise to a new generation of computing hardware that not only complements but also significantly enhances traditional silicon-based electronics. While conventional silicon systems process information using the flow of electrons, Yang’s diffusive memristors harness the movement of atoms to perform computations. This atomic-level motion closely mirrors the way biological neurons transmit information, enabling smaller, more energy-efficient chips that process data with brain-like efficiency. This advancement holds immense promise for pushing artificial intelligence toward AGI.
In the complex biological system of the brain, communication between nerve cells is a sophisticated interplay of both electrical and chemical signals. When an electrical impulse reaches the terminal of a neuron at a specialized junction known as a synapse, it undergoes a transformation into a chemical signal. This chemical messenger then travels across the synaptic gap to the next neuron, where it is received and converted back into an electrical impulse, perpetuating the signal transmission. Yang and his colleagues have achieved a remarkable feat by replicating this intricate biological process with astonishing accuracy within their artificial devices. A significant advantage of their design is its extreme compactness; each artificial neuron occupies a footprint equivalent to that of a single transistor, a stark contrast to earlier designs that required tens or even hundreds of transistors for comparable functionality.
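For readers who want a software point of comparison, the integrate-and-fire behavior these devices reproduce physically is often described with simple spiking-neuron models. The sketch below is a textbook leaky integrate-and-fire neuron written in Python, not a model of the USC devices; the time constant, threshold, and input values are arbitrary illustrative choices. It shows the basic dynamic in question: charge accumulates, a spike fires once a threshold is crossed, and the neuron resets.

```python
import numpy as np

# A textbook leaky integrate-and-fire (LIF) neuron, purely illustrative.
# Parameters (tau_m, v_threshold, v_reset) are arbitrary example values,
# not measurements from the USC devices.
def simulate_lif(input_current, dt=1e-3, tau_m=20e-3,
                 v_rest=0.0, v_reset=0.0, v_threshold=1.0):
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating the input.
        v += (-(v - v_rest) + i_in) * (dt / tau_m)
        if v >= v_threshold:      # threshold crossed: emit a spike
            spikes.append(t * dt)
            v = v_reset           # reset after firing
    return spikes

# One second of constant drive, strong enough to make the neuron fire.
current = np.full(1000, 1.5)
print(f"{len(simulate_lif(current))} spikes in 1 s of constant drive")
```

Conventional neuromorphic chips implement this kind of abstraction in circuitry or software; the diffusive-memristor neurons instead obtain the same accumulate, fire, and reset behavior directly from the physics of moving ions.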
The fundamental mechanism behind electrical impulses in biological neurons relies on charged particles called ions. The human nervous system, for instance, utilizes ions such as potassium, sodium, and calcium to generate the electrical signals that enable neural activity. In their latest study, Professor Yang, who also directs the USC Center of Excellence on Neuromorphic Computing, employed silver ions embedded within oxide materials to precisely generate electrical pulses that mimic the dynamic functions of natural brain cells. These emulated functions encompass fundamental cognitive processes such as learning, motor control, and strategic planning.
"Even though it’s not exactly the same ions in our artificial synapses and neurons, the physics governing the ion motion and the dynamics are very similar," Professor Yang explained. He further elaborated on the choice of silver, stating, "Silver is easy to diffuse and gives us the dynamics we need to emulate the biosystem so that we can achieve the function of the neurons, with a very simple structure." The innovative device responsible for enabling this brain-like chip functionality is indeed the "diffusive memristor," a name derived from the crucial role of ion motion and the dynamic diffusion facilitated by the use of silver. Yang’s rationale for harnessing ion dynamics in the construction of artificial intelligent systems is rooted in the unparalleled efficiency of the human brain. He noted, "because that is what happens in the human brain, for a good reason and since the human brain, is the ‘winner in evolution-the most efficient intelligent engine.’" He emphatically concluded, "It’s more efficient."
Professor Yang stressed that the primary limitation of modern computing is not a lack of raw processing power but rather its inherent inefficiency. "It’s not that our chips or computers are not powerful enough for whatever they are doing. It’s that they aren’t efficient enough. They use too much energy," he articulated. This energy consumption is a particularly critical concern given the colossal amounts of power required by today’s large-scale artificial intelligence systems to process vast datasets. Yang further elaborated that unlike the biological brain, "Our existing computing systems were never intended to process massive amounts of data or to learn from just a few examples on their own. One way to boost both energy and learning efficiency is to build artificial systems that operate according to principles observed in the brain."
While electrons offer superior speed for rapid computational operations, Yang explained that "Ions are a better medium than electrons for embodying principles of the brain. Because electrons are lightweight and volatile, computing with them enables software-based learning rather than hardware-based learning, which is fundamentally different from how the brain operates." In stark contrast, he pointed out, "The brain learns by moving ions across membranes, achieving energy-efficient and adaptive learning directly in hardware, or more precisely, in what people may call ‘wetware.’" He provided a compelling example: a young child can learn to recognize handwritten digits after encountering only a few instances of each character. In contrast, a conventional computer typically requires thousands of examples to achieve the same level of proficiency. Remarkably, the human brain accomplishes this feat of learning while consuming a mere 20 watts of power, a minuscule fraction of the megawatts demanded by contemporary supercomputers.
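The scale of that gap is easy to put in numbers. The article cites roughly 20 watts for the brain against "megawatts" for contemporary supercomputers; the short calculation below uses 1 MW as an assumed stand-in figure, since no specific machine is named.

```python
# Back-of-envelope comparison of the power budgets mentioned in the article.
# 1 MW is an assumed stand-in for "megawatts"; real systems vary widely.
brain_watts = 20            # approximate power draw of a human brain
supercomputer_watts = 1e6   # assumed 1 MW machine

ratio = supercomputer_watts / brain_watts
print(f"A 1 MW machine draws about {ratio:,.0f}x the power of a 20 W brain")
# -> A 1 MW machine draws about 50,000x the power of a 20 W brain
```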
Yang and his research team view this technological advancement as a significant stride toward replicating natural intelligence. However, he candidly acknowledged a current challenge: the silver employed in these experiments is not yet fully compatible with the established processes of standard semiconductor manufacturing. Consequently, future research efforts will focus on exploring alternative ionic materials that can achieve similar functional outcomes. The diffusive memristors are exceptionally efficient in terms of both energy consumption and physical size. A typical smartphone, for instance, houses approximately ten chips, each containing billions of transistors that toggle on and off to execute computations.
"Instead [with this innovation], we just use a footprint of one transistor for each neuron. We are designing the building blocks that eventually led us to reduce the chip size by orders of magnitude, reduce the energy consumption by orders of magnitude, so it can be sustainable to perform AI in the future, with similar level of intelligence without burning energy that we cannot sustain," Professor Yang stated. With the successful demonstration of these capable and compact building blocks – artificial synapses and neurons – the critical next phase involves integrating a vast number of these units. The team will then rigorously test their performance to ascertain how closely they can replicate the brain’s remarkable efficiency and sophisticated capabilities. "Even more exciting," Professor Yang concluded, "is the prospect that such brain-faithful systems could help us uncover new insights into how the brain itself works."

