AI’s memory-forming mechanism found to be strikingly similar to the brain’s
Dec. 19, 2023.
Findings suggest a path to low-cost, high-performance AI systems
Researchers at the Institute for Basic Science (IBS) in South Korea have discovered a striking similarity between the memory processing of transformer-based AI models and the hippocampus of the human brain.
The finding offers a novel perspective on memory consolidation, the process that transforms short-term memories into long-term ones, as it applies to AI systems.
The hippocampus's role in memory
The team focused on memory consolidation through the NMDA receptor in the hippocampus, which facilitates learning and memory formation. When the brain chemical glutamate is present, the nerve cell becomes excited, but a magnesium ion acts as a small gatekeeper blocking the channel. Only when this ionic gatekeeper steps aside can substances flow into the cell. This process allows the brain to create and retain memories.
Low-cost, high-performance AI systems
The team discovered that large language models, such as ChatGPT, seem to use a gatekeeping process similar to the brain's NMDA receptor.
In the animal brain, a low magnesium level is known to weaken memory function. The researchers found that long-term memory in the transformer can be improved by mimicking the NMDA receptor. Just as changing magnesium levels affects memory strength, tweaking the transformer's parameters to reflect the gating action of the NMDA receptor led to enhanced memory in the AI model.
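To make the analogy concrete, the gating idea can be sketched in a few lines of code. This is a minimal illustration, not the researchers' exact formulation: it assumes the NMDA-like gate is modeled as a sigmoid multiplying the input, in the spirit of the Swish/SiLU activation sometimes used inside transformer feed-forward layers. The parameter `alpha` here is a hypothetical stand-in for the magnesium-controlled "gatekeeper": a large `alpha` means a decisive gate, while a small `alpha` weakens the gating, loosely mirroring how low magnesium weakens memory.

```python
import numpy as np

def nmda_like_activation(x, alpha=1.0):
    """Gated nonlinearity in the spirit of the NMDA receptor.

    The raw input (the glutamate-driven excitation) is multiplied by a
    sigmoid gate; alpha controls how sharply the gate opens, standing in
    for the magnesium ion's gatekeeping. As alpha grows, the function
    approaches ReLU (a fully decisive gate); as alpha shrinks toward 0,
    the gate becomes leaky and less selective.
    """
    return x / (1.0 + np.exp(-alpha * x))

def feed_forward(x, w1, w2, alpha=1.0):
    """A toy transformer-style feed-forward block with the usual ReLU
    swapped for the NMDA-like gated activation."""
    return nmda_like_activation(x @ w1, alpha) @ w2

# Example: a strong gate passes positive inputs almost unchanged
# and blocks negative ones, much like the magnesium-gated channel.
print(nmda_like_activation(np.array([-5.0, 0.0, 5.0]), alpha=4.0))
```

In this sketch, "tweaking the transformer's parameters" corresponds to tuning `alpha`: the gate's steepness becomes a knob on how selectively information is written into the model's long-term representations.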
CHA Meeyoung, a data scientist on the team, notes: "The human brain is remarkable in how it operates with minimal energy, unlike the large AI models that need immense resources. Our work opens up new possibilities for low-cost, high-performance AI systems that learn and remember information like humans."