Intelligence is the ability to accomplish complex goals. These goals could be anything—from growing toward sunlight to coding websites. These goals may emerge through evolution, be programmed, or arise spontaneously. What matters is that the entity behaves in a way that moves toward achieving them.
The three fundamental building blocks of intelligence are computation, memory, and learning.
- Computation – Computation is the transformation of information: you provide an input, and that input gets turned into an output. That's computation. It's not the same as thinking. In principle, any physical matter can compute. For example, your digestive system senses the current acid level and secretes more acid accordingly, and a thermostat reads the room temperature and switches the heater on or off.
- Memory – Memory is the ability to retain information about the past so it can be used for future computation. In simple terms, memory is the storage of information for later use.
- Learning – Learning is the process of updating memory over time. This updating happens when an entity receives feedback on its actions and adjusts accordingly.
An entity that can perform all three—compute, store memory, and learn—is considered intelligent.
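The three building blocks above can be sketched as a toy entity in code. This is a minimal illustration, not a real control system; the class, its parameters, and the feedback rule are all invented for the example:

```python
class LearningThermostat:
    """A toy entity with all three building blocks: computation
    (input -> output), memory (stored state), and learning
    (updating that state from feedback)."""

    def __init__(self, target=21.0, gain=1.0):
        # Memory: information retained for use in later computation.
        self.target = target
        self.gain = gain

    def compute(self, room_temp):
        # Computation: transform an input (temperature) into an
        # output (how hard to drive the heater).
        return self.gain * (self.target - room_temp)

    def learn(self, overshoot, rate=0.1):
        # Learning: adjust memory in response to feedback.
        # If the room overshot the target, dial the gain down.
        self.gain -= rate * overshoot


thermostat = LearningThermostat()
output = thermostat.compute(19.0)  # gain 1.0 * (21.0 - 19.0) = 2.0
thermostat.learn(0.5)              # overshot by 0.5 degrees: gain shrinks
```

The same skeleton fits the earlier examples: swap temperature for acid level and the gain for a secretion rate, and you have the digestive system.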
For example, bacteria compute chemical concentrations in their environment to decide which direction to move. Their memory is genetically encoded in DNA, and learning happens through evolution across generations.
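Under that framing, evolution itself is the learning loop: the gene pool is the memory, differential survival is the feedback, and generational turnover is the update. A minimal sketch, where the one-number "genome" and the fitness function are invented assumptions:

```python
import random

TARGET = 1.0  # assumed: the response value the environment rewards


def fitness(gene):
    # Computation: how well this genome's response fits the environment.
    return -abs(gene - TARGET)


def evolve(population, generations=100, seed=0):
    # Learning across generations: select the fittest genes (the
    # "DNA" memory) and copy them with small random mutations.
    rng = random.Random(seed)
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: len(population) // 2]          # selection
        offspring = [g + rng.gauss(0, 0.05) for g in survivors]  # mutation
        population = survivors + offspring
    return population


final = evolve([-3.0, -1.0, 0.0, 2.0, 4.0, 6.0])
best = max(final, key=fitness)  # drifts toward TARGET over generations
```

No individual bacterium "learns" anything here; the population as a whole does, which is the point of the example.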
In an AGI, computation could mean large-scale multimodal processing that runs millions of simulations in parallel. Memory might store world models, the emotional states of the people it interacts with, and much more. Learning would be continuous, driven by meta-learning, with every component improving at an accelerating pace.
Across scales, the architecture of intelligence remains the same. What differs is speed, scale, and the nature of the goals.