The Science of Adaptive Intelligence: A Multidisciplinary Exploration
At Self-Stacked-Systems Co-Lab, we explore artificial intelligence grounded in the fundamental processes of natural intelligence, at both the functional and the structural level. We study the interplay between these two levels and translate it into mathematical rules for constructing more ethical algorithms, with a particular interest in reducing the cost and increasing the efficiency of existing approaches.
A recent article published in Frontiers in Artificial Intelligence put the contrast this way:
“The human brain, a marvel of biological engineering, operates at an astonishingly low power consumption of about 12 watts, roughly equivalent to a dim light bulb. This efficiency stems from its highly optimized neural architecture, developed over millions of years of evolution.
In contrast, current AI systems, such as large-scale neural networks, often require massive computational resources, consuming up to 2.7 billion watts in large data centers. This inefficiency arises from the digital nature of AI, which relies on silicon-based processors and vast arrays of GPUs or TPUs. These systems perform billions of matrix operations, requiring significant electrical power for computation, cooling, and data transfer.”
https://www.frontiersin.org/journals/artificial-intelligence/articles/10.3389/frai.2023.1240653/full
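Taken at face value, the two figures quoted above differ by roughly eight orders of magnitude. The short calculation below simply makes that ratio explicit; it is an illustrative back-of-the-envelope sketch using only the numbers from the quotation.

```python
# Back-of-the-envelope comparison using the figures quoted above.
brain_power_w = 12           # ~12 W, human brain (figure from the quoted article)
datacenter_power_w = 2.7e9   # ~2.7 billion W, large data centers (quoted figure)

ratio = datacenter_power_w / brain_power_w
print(f"Data-center-scale AI draws roughly {ratio:.1e} times the power of a human brain")
# -> roughly 2.2e+08, i.e. about eight orders of magnitude
```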
Cognitive Neuroscience: The Architecture of Thought
Understanding the human mind requires unraveling, stage by stage, the neural mechanisms that underlie perception, learning, and decision-making. This stage-like approach is foundational to our work: it makes self-organizing complex structures tractable to quantify and to design. We investigate how hierarchical processing and distributed networks enable intelligence, from low-level sensory processing to high-level cognition.
Complex Networks: Mapping Intelligence
Natural and artificial intelligence systems alike rely on interconnected networks. Applying network science to both biological and computational models of intelligence, our research examines how network structure shapes the formation of cognition, how self-organization parallels the emergence of adaptation, and how network transitions capture the phenomenon of cost-effective evolution.
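As an illustration of the kind of network-level analysis involved, the sketch below is a hypothetical, minimal example using the NetworkX library rather than lab code; the Watts-Strogatz model and all parameters are arbitrary assumptions. It compares clustering and path length in a ring lattice against a lightly rewired "small-world" version of the same network, a classic signature of cost-effective connectivity in biological systems.

```python
# Minimal sketch: comparing a ring lattice with a small-world rewiring of it.
# Hypothetical illustration only; parameters are arbitrary.
import networkx as nx

n, k = 200, 6  # 200 nodes, each linked to its 6 nearest neighbours
lattice = nx.connected_watts_strogatz_graph(n, k, p=0.0, seed=1)      # no rewiring
small_world = nx.connected_watts_strogatz_graph(n, k, p=0.1, seed=1)  # 10% rewired

for name, g in [("ring lattice", lattice), ("small world", small_world)]:
    clustering = nx.average_clustering(g)           # local redundancy of wiring
    path_len = nx.average_shortest_path_length(g)   # global communication cost
    print(f"{name:12s}  clustering={clustering:.2f}  mean path length={path_len:.2f}")

# A small amount of rewiring keeps clustering high while sharply reducing
# path length: efficient global integration at low wiring cost.
```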
Complex Systems: The Dynamics of Adaptation
Intelligence is more than the sum of its parts: it emerges from the interactions among many components. By studying self-organization, non-linear dynamics, and emergent properties, we seek to understand the intricacies of complex adaptive systems. We are curious not only about how to mimic learning, but about how to distinguish learning from development, so that development, and ultimately evolution itself, can be mimicked with due regard for biological plausibility.
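To make "emergence from interaction" concrete, the sketch below uses a generic Kuramoto-style oscillator model; it is an assumption-laden illustration, not a model from our lab, and the coupling strength and other parameters are arbitrary. It shows global coherence arising from purely pairwise interactions that no single unit encodes.

```python
# Minimal Kuramoto model: global synchrony emerging from pairwise coupling.
# Illustrative sketch only; parameters and coupling strength are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n, coupling, dt, steps = 100, 1.5, 0.05, 400
freqs = rng.normal(0.0, 0.5, n)         # heterogeneous natural frequencies
phases = rng.uniform(0, 2 * np.pi, n)   # random initial phases

def order_parameter(theta):
    """Magnitude of the mean phase vector: 0 = incoherent, 1 = fully synchronized."""
    return np.abs(np.mean(np.exp(1j * theta)))

print(f"initial coherence: {order_parameter(phases):.2f}")
for _ in range(steps):
    # Each oscillator is pulled toward the phases of all the others (mean-field coupling).
    pairwise = np.sin(phases[None, :] - phases[:, None])
    phases += dt * (freqs + (coupling / n) * pairwise.sum(axis=1))
print(f"final coherence:   {order_parameter(phases):.2f}")

# With sufficiently strong coupling, coherence rises sharply even though no unit
# "knows" the collective state: an emergent, self-organized property.
```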
Robotics and Artificial Cognition: Bridging Biology and Technology
How can artificial systems replicate the flexibility and adaptability of human intelligence? Our work in robotics and computational cognition explores biologically inspired models that integrate perception, action, learning, and development, pushing AI and autonomous systems toward a new level of self-evolution.
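A minimal sketch of such a perception-action-learning loop is given below. It is a hypothetical, one-dimensional example rather than one of our robotic models; the proportional controller, the delta-rule update, and all parameters are illustrative assumptions.

```python
# Minimal perception-action-learning loop: a 1-D agent learns how strongly
# to respond to sensory error while tracking a target. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)
position, gain, lr = 0.0, 0.1, 0.01   # state, initial reflex gain, learning rate
target = 5.0

for step in range(200):
    error = target - position + rng.normal(0, 0.05)  # perception: noisy sensing of the goal
    action = gain * error                            # action: proportional response
    new_position = position + action
    new_error = target - new_position
    gain += lr * error * new_error   # learning: reduce the error that remains after acting
    position = new_position
    if step % 50 == 0:
        print(f"step {step:3d}  position={position:5.2f}  gain={gain:.2f}")

# Early on the gain grows quickly while the error is large; as the target is
# approached, both the error and the learning signal shrink, so adaptation
# slows of its own accord: the loop tunes its own sensorimotor coupling.
```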
Hierarchical Processing: From Neurons to Thought
Cognitive systems are structured hierarchically, from neural circuits to abstract reasoning. We study how information flows through these levels of organization, enabling stage-like problem-solving that is both cost-effective and efficient.
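The sketch below illustrates the general idea of stage-wise processing on a toy one-dimensional signal; it is a hypothetical example, and the edge filter, pooling windows, and threshold are arbitrary assumptions, not a model of any specific neural circuit.

```python
# Minimal sketch of hierarchical, stage-wise processing of a 1-D "sensory" signal:
# each level re-describes the output of the level below in a more compact form.
# Hypothetical illustration; filters and thresholds are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
signal = np.zeros(64)
signal[20:40] = 1.0                        # an "object": a plateau in the input
signal += rng.normal(0, 0.05, 64)          # sensory noise

# Level 1 (low-level): local feature detection, here simple edge filtering.
edges = np.abs(np.convolve(signal, [1, -1], mode="same"))

# Level 2 (mid-level): pooling, which discards position detail within each window.
pooled = edges.reshape(8, 8).max(axis=1)   # one value per window of 8 samples

# Level 3 (high-level): a categorical decision over the pooled summary.
decision = "boundary present" if pooled.max() > 0.5 else "no boundary"

print("pooled summary:", np.round(pooled, 2))
print("decision:      ", decision)
# 64 raw samples -> 8 pooled features -> 1 symbolic decision: each stage
# trades detail for abstraction, which is what keeps the hierarchy cheap.
```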
Human Development and Evolution: Intelligence Across Time
To fully understand cognition, we must trace its origins. We explore how evolutionary pressures and developmental processes shape intelligence, offering insights into both natural and artificial cognitive architectures.
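As a toy illustration of adaptation under selective pressure, the sketch below runs a bare mutation-plus-selection loop; it is a generic evolutionary-algorithm example, not a model of biological evolution from our lab, and the fitness function, population size, and mutation scale are arbitrary assumptions.

```python
# Minimal sketch of evolutionary adaptation: mutation plus selection improves a
# genotype with no gradients and no explicit instruction. Illustration only;
# the fitness function and population size are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)
target = rng.uniform(-1, 1, 10)    # the environment's hidden demands
genotype = np.zeros(10)            # initial genotype
pop_size, sigma = 20, 0.1

def fitness(g):
    """Higher is better: closeness to what the environment rewards."""
    return -np.sum((g - target) ** 2)

for generation in range(100):
    offspring = genotype + rng.normal(0, sigma, (pop_size, 10))   # blind variation
    scores = np.array([fitness(o) for o in offspring])            # differential success
    best = offspring[scores.argmax()]
    if fitness(best) > fitness(genotype):                         # selection
        genotype = best
    if generation % 25 == 0:
        print(f"generation {generation:3d}  fitness {fitness(genotype):.3f}")
print(f"final fitness {fitness(genotype):.3f}  (0 would be a perfect fit)")

# Selection acting on blind variation is enough to adapt the genotype to its
# "environment"; development would add a further mapping from genotype to behaviour.
```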
By integrating these perspectives, Self-Stacked-Systems Co-Lab seeks to unravel the deep mechanisms of adaptive intelligence, shaping the future of cognitive science, AI, and beyond.