Researchers at Skoltech have developed a mathematical model of memory with implications for robotic systems, artificial intelligence, and our understanding of how the human mind encodes information. The study, published in the journal Scientific Reports, suggests that there may be an optimal number of sensory inputs for cognitive processing, and that our familiar five senses may fall short of that optimum.

"Our conclusion is, of course, highly speculative when applied to human senses, although one can never say never: It could be that humans of the future might evolve a sense of radiation or magnetic fields," stated Professor Nikolay Brilliantov, a leading co-author of the study and a prominent figure at Skoltech AI. "However, in any case, our findings hold significant practical importance for the fields of robotics and the theory of artificial intelligence. It appears that when each concept retained in memory is characterized by precisely seven features – as opposed to, for instance, five or eight – the number of distinct objects that can be held in memory is maximized."

The research builds on a line of work dating to the early 20th century on the basic units of memory, known as "engrams." An engram can be thought of as a distributed network of neurons across several brain regions that fire together to represent a specific concept. Each engram is defined by a distinct set of features, which in human cognition are tied to the senses. The concept of a "banana," for example, is not a single data point but a combination of the fruit's visual appearance, characteristic aroma, taste, and other sensory qualities. Within this framework, a "banana" perceived through five senses is represented as a five-dimensional object residing in a vast conceptual space that encompasses all other memories stored in the brain.
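The idea of a concept as a point in a feature space, with one dimension per sensory modality, can be sketched in a few lines. The feature names, values, and the use of Euclidean distance below are illustrative assumptions, not details from the study:

```python
# Toy sketch: a concept as a point in a feature space, one dimension per
# sensory modality. All values here are invented for illustration.
import math

FEATURES = ["sight", "smell", "taste", "touch", "hearing"]

banana = {"sight": 0.9, "smell": 0.7, "taste": 0.8, "touch": 0.5, "hearing": 0.1}
lemon  = {"sight": 0.8, "smell": 0.9, "taste": 0.9, "touch": 0.5, "hearing": 0.1}

def distance(a, b):
    """Euclidean distance between two concepts in the feature space."""
    return math.sqrt(sum((a[f] - b[f]) ** 2 for f in FEATURES))

print(f"banana-lemon distance: {distance(banana, lemon):.3f}")
```

Concepts that are similar across many modalities sit close together in this space, which is the geometric intuition behind treating an engram as a multidimensional object.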

Engrams are not static; they evolve over time. They become sharper and more defined when consistently reinforced by sensory input from the external world, and they grow more diffuse and less accessible, through a process akin to forgetting, when infrequently triggered. This plasticity is the mechanism by which we learn and adapt as we interact with our environment.

Professor Brilliantov further elaborated on the mathematical underpinnings of this phenomenon: "We have mathematically demonstrated that engrams within the conceptual space exhibit a tendency to evolve towards a steady state. This signifies that after an initial period of transition, a ‘mature’ distribution of engrams emerges, which then maintains its stability over time. When we then examine the ultimate storage capacity of a conceptual space defined by a specific number of dimensions, we find, somewhat surprisingly, that the number of distinct engrams that can be stored in memory in this steady state is greatest for a concept space of seven dimensions. This observation is the genesis of our ‘seven senses’ claim."

In the researchers' model, objects and phenomena in the real world are described by a finite set of features, which correspond to the dimensions of an abstract conceptual space. The model asks how to maximize the capacity of this space, measured as the number of distinct concepts that can be associated with real-world objects; a larger capacity means a more fine-grained representation of the world. The capacity turns out to be greatest when the conceptual space has exactly seven dimensions, from which the researchers infer that seven is the optimal number of sensory inputs for an organism to process and store information efficiently.
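The article does not spell out the study's capacity calculation, but the general shape of the argument, an interior optimum in the number of dimensions, can be illustrated with a deliberately simple toy that is not the study's model: if a fixed representational budget is split evenly across d feature dimensions, each dimension resolves fewer levels as d grows, and the total number of distinguishable combinations peaks at an intermediate dimension. The budget value of 19 below is chosen purely so the toy optimum lands at seven:

```python
# Toy illustration only -- NOT the study's model. A fixed "budget" B of
# resolution is split across d dimensions, so each dimension distinguishes
# B/d levels and the space holds (B/d)**d combinations. This trade-off
# peaks at an intermediate d (analytically, near d = B/e).
def capacity(d, budget=19.0):
    return (budget / d) ** d

best = max(range(1, 16), key=capacity)
print(best)  # with budget=19, the toy capacity peaks at d = 7
```

The point of the toy is only that "more dimensions" is not monotonically better once each added dimension dilutes the resolution of the others; the study derives its optimum of seven from the dynamics of engrams, not from this formula.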

According to the researchers, the result is robust: the optimum of seven dimensions does not depend on the specifics of the model, such as the properties of the conceptual space or the nature of the stimuli that provide sensory impressions. Seven appears to be an intrinsic characteristic of memory engrams themselves, independent of the particular sensory modalities involved. The researchers do note one caveat: multiple engrams of varying sizes clustered around a common center are considered to represent conceptually similar information, so for the purpose of calculating memory capacity they are counted as a single concept.
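The counting caveat can be sketched as a simple merging rule: engram centers that fall within some radius of one another collapse into one concept. The greedy single-pass grouping, the merge radius, and the sample points below are illustrative assumptions, not the study's procedure:

```python
# Toy sketch of the counting caveat: engrams whose centers lie within a
# merge radius of an already-counted concept are absorbed into it. The
# centers and radius are invented for illustration.
import math

def count_distinct(centers, merge_radius=0.2):
    """Greedy grouping: keep an engram only if it is at least
    merge_radius away from every previously kept representative."""
    kept = []
    for c in centers:
        if all(math.dist(c, k) >= merge_radius for k in kept):
            kept.append(c)
    return len(kept)

engrams = [(0.10, 0.10), (0.12, 0.11), (0.90, 0.20), (0.50, 0.75)]
print(count_distinct(engrams))  # the two nearby engrams merge: 3 concepts
```

Under such a rule, capacity measures genuinely distinguishable concepts rather than raw engram counts, which is why clustered engrams do not inflate the total.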

Memory, in humans and other living beings, remains one of the least understood aspects of consciousness and cognition. Theoretical models such as the one developed by the Skoltech team are therefore instrumental in probing the complexities of the mind, and they matter for the effort to recreate humanlike memory in artificial intelligence agents. Potential applications range from more capable AI assistants to prosthetic sensory systems that could extend human perception beyond its current limits. The implication that our current sensory apparatus may be suboptimal for peak cognitive function points to further research, both in understanding our own biology and in engineering the next generation of intelligent systems.