Biology-inspired tweaks to a neural network model greatly improve its memory
Inspired by recent biological discoveries, researchers have developed a new model, built by modifying a classic neural network, that shows markedly better memory performance. Computer models play a key role in studying how the brain makes and retains memories and other complex information, but building such models is a delicate task.
The intricate interplay of electrical and biochemical signals, together with the web of connections between neurons and other cell types, creates the basic machinery through which memories form. Even so, because the underlying biology of the brain is only partly understood, encoding that complexity into a computer model for further study has proved difficult.
Researchers at the Okinawa Institute of Science and Technology (OIST) have now improved a widely used computer model of memory, known as a Hopfield network, by incorporating insights from biology. The change yields a network that not only better reflects how neurons and other cells are wired together in the brain, but can also store far more memories.
The added complexity in the network is what makes it more realistic, says Thomas Burns, a PhD student in the group of Professor Tomoki Fukai, who heads OIST's Neural Coding and Brain Computing Unit.
"Why is there so much complexity in biology? Memory may be one reason, "Mr. Burns said.
In a classical Hopfield network (left), each neuron (i, j, k, l) is connected to the other neurons in pairs. In the modified network built by Burns and Professor Fukai, sets of three or more neurons can connect at once. Credit: Thomas Burns (OIST)
Hopfield networks store memories as patterns of weighted connections between the different neurons in the system. The network is "trained" to encode these patterns; researchers can then test its memory of them by presenting a series of blurry or incomplete patterns and seeing whether the network recognizes them as ones it already knows. In classical Hopfield networks, however, the model's neurons connect to one another only in so-called "pairwise" connections.
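To make that storage-and-recall loop concrete, here is a minimal sketch of a classical Hopfield network in Python. It is an illustration built on the standard textbook formulation, not the authors' code, and the network size, pattern count, and corruption level are arbitrary choices for the demo.

```python
import numpy as np

# Minimal classical Hopfield network: memories are written into pairwise
# weights with the Hebbian rule, then recalled from a corrupted cue by
# repeated asynchronous neuron updates.

rng = np.random.default_rng(0)
n_neurons, n_patterns = 100, 5

# Random +/-1 patterns to memorize.
patterns = rng.choice([-1, 1], size=(n_patterns, n_neurons))

# Hebbian rule: the weight between i and j is sum_p x_i^p * x_j^p.
W = patterns.T @ patterns / n_neurons
np.fill_diagonal(W, 0)  # no self-connections

def recall(state, steps=20):
    """Update neurons one at a time until the state settles."""
    state = state.copy()
    for _ in range(steps):
        for i in rng.permutation(n_neurons):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Corrupt a stored pattern by flipping 20% of its bits, then recall it.
cue = patterns[0].copy()
cue[rng.choice(n_neurons, size=20, replace=False)] *= -1
print("overlap with original:", recall(cue) @ patterns[0] / n_neurons)
```

An overlap near 1.0 means the network has recognized the degraded cue as a pattern it already knows; a model's memory capacity is measured by how many patterns it can store before such recall starts to fail.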
A pairwise connection represents a synapse, the junction between two neurons in the brain. In reality, though, neurons have intricate branched structures called dendrites that provide multiple points of connection, so the brain relies on a far more complex arrangement of synapses to get its cognitive work done. In addition, the connections between neurons are modulated by other cell types called astrocytes.
Burns explained: "There are only paired connections between neurons in the brain, which is simply unrealistic. He created an improved Hopfield network, in which not only pairs of neurons, but also three, four or more groups of neurons can be connected, such as astrocytes and dendritic trees in the brain. "
Although the new network allows these so-called "collective" connections, it contains the same total number of connections as before. The researchers found that a mixed network, containing both pairwise and collective connections, performed best and retained the greatest number of memories. They estimate that it works more than twice as well as a traditional Hopfield network.
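The "same number of connections" constraint can be pictured as a fixed budget: the mixed network spends the classical network's C(n, 2) pairwise slots partly on pairs and partly on larger sets. The 50/50 split and the random sampling below are illustrative assumptions, not the recipe used in the study.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
n = 30
budget = n * (n - 1) // 2        # connection count of the classical network
n_triplets = budget // 2         # assumption: half the budget goes to triplets
n_pairs = budget - n_triplets    # the rest stays pairwise

all_pairs = list(combinations(range(n), 2))
all_triplets = list(combinations(range(n), 3))
pairs = [all_pairs[i] for i in rng.choice(len(all_pairs), n_pairs, replace=False)]
triplets = [all_triplets[i]
            for i in rng.choice(len(all_triplets), n_triplets, replace=False)]
print(f"{len(pairs)} pairwise + {len(triplets)} collective = {budget} connections")
```

Holding the budget fixed is what makes the comparison fair: the mixed network's extra memory comes from how the connections are arranged, not from having more of them.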
"It turns out that you actually need a balance of these features in some combination," Burns said. "Individual synapses are necessary, but you should also have some dendritic trees and some astrocytes."
Hopfield networks are important for modeling brain processes, but they have powerful uses elsewhere too. For example, a closely related network type called a Transformer underlies AI-based language tools such as ChatGPT, so the improvements identified by Burns and Professor Fukai may also make such tools more robust.
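One well-known way to make that family resemblance concrete, drawn from the broader literature rather than from this study, is the "modern" continuous Hopfield update, which coincides with a single step of Transformer-style attention in which the stored patterns act as both keys and values.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(3)
patterns = rng.normal(size=(8, 16))              # 8 stored 16-dim patterns
query = patterns[3] + 0.3 * rng.normal(size=16)  # noisy cue

beta = 4.0  # inverse temperature; larger = more selective retrieval
for _ in range(3):
    # One Hopfield retrieval step == one attention step with
    # keys = values = stored patterns and the current state as the query.
    query = patterns.T @ softmax(beta * patterns @ query)

print("retrieved pattern index:", int(np.argmax(patterns @ query)))  # -> 3
```

In a Transformer the patterns would be learned key/value projections of the input, but the retrieval arithmetic is the same, which is why improvements to Hopfield-style memory are of interest well beyond neuroscience.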
Burns and his colleagues plan to keep working with their modified Hopfield networks to make them still more powerful. In the brain, for example, the strength of a connection between two neurons usually differs in the two directions, so the researchers want to know whether this kind of asymmetry can also improve the network's performance. Burns also wants to explore ways of making the network's memories interact with one another, the way they do in the human brain. "Our memories are multifaceted and vast," he said. "We still have a lot to uncover."