The core innovation is that these new artificial neurons do not merely simulate brain activity through abstract mathematical models, as conventional digital processors and earlier neuromorphic chips do. Instead, they physically reproduce the intricate processes that govern how real neurons operate. In the biological brain, activity is initiated and propagated through a series of electrochemical signals. Similarly, these artificial neurons leverage actual chemical interactions to trigger and manage computational processes. This fundamental difference means they are not simply symbolic representations of neural function but tangible, physical recreations of biological mechanisms.

This research, spearheaded by Professor Joshua Yang of USC’s Department of Electrical and Computer Engineering, builds upon his decade-old pioneering work in the development of artificial synapses. The team’s latest breakthrough centers on a novel device they term a "diffusive memristor." Their findings show how these components can pave the way for a new generation of computing chips that not only complement but also significantly enhance existing silicon-based electronics. While traditional silicon systems rely on the flow of electrons to perform computations, Yang’s diffusive memristors use the movement of atoms, specifically ions, to achieve the same goal. This atomic-level motion more closely mirrors the way biological neurons transmit information, enabling a more brain-like processing of data. The ultimate outcome of this approach is the potential for significantly smaller and more energy-efficient chips that process information in a manner akin to the human brain, potentially unlocking the door to true artificial general intelligence.

The intricate communication network of the brain relies on a sophisticated interplay of both electrical and chemical signals between nerve cells. When an electrical impulse reaches the terminal end of a neuron, at a specialized junction known as a synapse, it undergoes a transformation into a chemical signal. This chemical signal then traverses the synaptic gap to relay information to the next neuron in the chain. Upon reception by the subsequent neuron, this chemical signal is reconverted back into an electrical impulse, which then propagates through that neuron, continuing the computational cascade. Yang and his team have achieved a remarkable feat by replicating this complex biological process with striking fidelity within their artificial devices. A key advantage of their design is its remarkable miniaturization: each artificial neuron can be fabricated within the physical footprint of a single transistor. In stark contrast, previous neuromorphic designs often required tens or even hundreds of transistors to achieve a comparable level of functionality.
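To make that relay concrete, the sketch below simulates two abstract leaky integrate-and-fire neurons joined by a synapse: an "electrical" spike in the first neuron becomes a weighted "chemical" pulse at the synapse, which the second neuron integrates back into a voltage until it fires. This is a textbook abstraction offered only for illustration, not the team’s device physics, and the threshold, leak, and synaptic-weight values are arbitrary assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) relay: an "electrical" spike in the
# presynaptic neuron becomes a weighted "chemical" pulse at the synapse,
# which the postsynaptic neuron integrates back into a voltage.
# All parameters are illustrative assumptions, not measured device values.
DT = 1e-3        # time step (s)
TAU = 20e-3      # membrane leak time constant (s)
V_THRESH = 1.0   # firing threshold (arbitrary units)
W_SYN = 0.9      # synaptic weight: "amount of transmitter" released per spike

def lif_step(v, input_current):
    """Advance one LIF neuron by a single time step; return (new_v, spiked)."""
    v = v + DT * (-v / TAU + input_current)
    if v >= V_THRESH:
        return 0.0, True    # spike, then reset
    return v, False

v_pre, v_post = 0.0, 0.0
drive = 60.0                 # constant stimulus into the presynaptic neuron
for t in range(200):
    v_pre, pre_spiked = lif_step(v_pre, drive)
    # Synapse: the electrical spike is relayed as a chemical pulse,
    # modeled here as a brief weighted current into the next neuron.
    syn_current = W_SYN / DT if pre_spiked else 0.0
    v_post, post_spiked = lif_step(v_post, syn_current)
    if post_spiked:
        print(f"postsynaptic spike at t = {t * DT * 1e3:.0f} ms")
```

In this toy model the downstream neuron fires only after accumulating several upstream spikes, which is the integrate-then-fire behavior the paragraph above describes.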

In biological neurons, charged particles called ions play a crucial role in generating the electrical impulses that underpin all nervous system activity. The human brain, in its remarkable efficiency, relies on the precise movement and concentration of ions such as potassium, sodium, and calcium to facilitate these essential electrochemical processes. These ions act as the fundamental currency of electrical signaling within our neural networks.

The research team’s innovative approach to recreating these intricate brain dynamics involves the strategic use of silver ions. In their new study, Professor Yang, who also holds the directorship of the USC Center of Excellence on Neuromorphic Computing, embedded silver ions within specific oxide materials. This carefully engineered structure allows for the generation of electrical pulses that closely mimic the fundamental functions of natural brain activity, including critical cognitive processes such as learning, motor control, and strategic planning.

"Even though it’s not exactly the same ions in our artificial synapses and neurons, the physics governing the ion motion and the dynamics are very similar," Professor Yang explains. This similarity in underlying physical principles is what allows their artificial constructs to effectively emulate biological neural behavior.

Yang further elaborates on the rationale behind their choice of materials: "Silver is easy to diffuse and gives us the dynamics we need to emulate the biosystem so that we can achieve the function of the neurons, with a very simple structure." The device that enables this brain-like chip functionality is the "diffusive memristor," a name derived from the ion motion and diffusion dynamics that arise when silver is used.
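As a rough illustration of what "diffusive" dynamics mean in practice, the toy model below tracks a volatile conductance that grows while a voltage above some threshold is applied (filament formation) and relaxes back toward its off state once the stimulus is removed (ion diffusion). The constants and the functional form are assumptions chosen only to show the qualitative behavior; they are not taken from the study.

```python
# Toy model of a volatile ("diffusive") memristive element: a voltage above a
# threshold grows the conductance (filament formation); once the stimulus is
# removed, the conductance relaxes back toward its off state (ion diffusion).
# All constants are illustrative assumptions, not values from the study.
DT = 1e-4         # time step (s)
TAU_DECAY = 2e-3  # spontaneous relaxation time constant (s)
V_TH = 0.3        # threshold voltage for filament growth (V)
GROWTH = 0.2      # conductance growth rate above threshold (S per V*s)
G_MAX = 1e-3      # maximum conductance (S)

def step_conductance(g, v_applied):
    """Advance the conductance by one time step under an applied voltage."""
    if abs(v_applied) > V_TH:
        g += DT * GROWTH * (abs(v_applied) - V_TH) * (1 - g / G_MAX)
    g -= DT * g / TAU_DECAY   # volatility: the filament dissolves on its own
    return max(g, 0.0)

g = 0.0
trace = []
for step in range(400):
    v = 0.8 if step < 200 else 0.0   # 20 ms voltage pulse, then stimulus off
    g = step_conductance(g, v)
    trace.append(g)

print(f"conductance at end of pulse:   {trace[199]:.2e} S")
print(f"conductance 20 ms after pulse: {trace[-1]:.2e} S")
```

The conductance collapses by orders of magnitude once the drive is removed, which is the self-resetting, neuron-like volatility that distinguishes a diffusive element from a conventional non-volatile memory cell.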

He emphasizes that the team’s decision to harness ion dynamics for building artificial intelligent systems stems from a fundamental recognition of biological superiority: "because that is what happens in the human brain, for a good reason, and since the human brain is the ‘winner in evolution, the most efficient intelligent engine.’" This evolutionary success of the brain suggests that its fundamental operating principles are highly optimized for intelligence and efficiency.

"It’s more efficient," Professor Yang reiterates, highlighting a critical advantage that has long eluded conventional computing.

Yang is particularly keen to underscore the distinction between computational power and computational efficiency. He argues that the primary limitation of modern computing is not a lack of raw processing power, but rather its inherent inefficiency. "It’s not that our chips or computers are not powerful enough for whatever they are doing. It’s that they aren’t efficient enough. They use too much energy," he states. This issue is exacerbated by the massive energy demands of today’s large-scale artificial intelligence systems, which are tasked with processing colossal datasets.

He further explains that current computing systems were not originally designed for the massive data processing and on-demand learning capabilities that characterize advanced AI. "Our existing computing systems were never intended to process massive amounts of data or to learn from just a few examples on their own. One way to boost both energy and learning efficiency is to build artificial systems that operate according to principles observed in the brain." This suggests a fundamental shift in hardware design philosophy, moving away from brute-force computation towards biologically inspired efficiency.

While electrons are undeniably superior for raw speed in conventional computing operations, Yang posits that "Ions are a better medium than electrons for embodying principles of the brain." He elaborates on this crucial distinction: "Because electrons are lightweight and volatile, computing with them enables software-based learning rather than hardware-based learning, which is fundamentally different from how the brain operates." This implies that current electronic computing relies on complex software algorithms to simulate learning, whereas biological systems achieve learning directly through hardware modifications.

In contrast, Yang explains, "The brain learns by moving ions across membranes, achieving energy-efficient and adaptive learning directly in hardware, or more precisely, in what people may call ‘wetware.’" This direct, hardware-level learning is a key factor in the brain’s remarkable adaptability and efficiency. The comparison is stark: a young child can master the recognition of handwritten digits after encountering only a handful of examples, a feat that typically requires thousands of data points for conventional computers. Yet the human brain achieves this remarkable learning capability while consuming a mere 20 watts of power, compared with the megawatts demanded by supercomputers for similar tasks.
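A back-of-the-envelope calculation makes that gap concrete. The text does not pin down the "megawatts" figure, so 1 MW is assumed below purely for illustration.

```python
# Back-of-the-envelope power comparison for the figures quoted above.
# The brain's ~20 W comes from the text; the supercomputer figure is not
# specified, so 1 MW is assumed here purely for illustration.
BRAIN_POWER_W = 20.0
SUPERCOMPUTER_POWER_W = 1.0e6   # assumed value

ratio = SUPERCOMPUTER_POWER_W / BRAIN_POWER_W
print(f"assumed power ratio: {ratio:,.0f}x")   # 50,000x under this assumption
```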

Yang and his team view this technological advancement as a pivotal step toward accurately replicating natural intelligence. However, they acknowledge a practical hurdle: the silver used in their current experiments is not yet compatible with the established manufacturing processes for standard semiconductors. Consequently, future research will focus on exploring alternative ionic materials that can achieve similar functional outcomes while adhering to industrial standards.

The diffusive memristors exhibit exceptional efficiency in both energy consumption and physical size. A typical smartphone, for instance, houses numerous chips, each containing billions of transistors that toggle on and off to perform calculations. In a striking departure from this norm, Yang’s innovation enables the use of "just a footprint of one transistor for each neuron." This design philosophy aims to "reduce the chip size by orders of magnitude, reduce the energy consumption by orders of magnitude, so it can be sustainable to perform AI in the future, with similar level of intelligence without burning energy that we cannot sustain."

With the successful demonstration of these compact and capable artificial neurons and synapses, the next critical phase involves integrating vast numbers of these building blocks. The team will then rigorously test their ability to replicate the brain’s remarkable efficiency and capabilities. Professor Yang expresses further enthusiasm for the broader implications: "Even more exciting is the prospect that such brain-faithful systems could help us uncover new insights into how the brain itself works." This bidirectional scientific exploration promises to yield a deeper understanding of both artificial and biological intelligence.