In this blog post, we explore whether artificial intelligence can replicate human emotions, intuition, and ultimately the role of the right brain, based on how emotions are formed and how artificial synapses work.
The human brain consists of the left and right hemispheres. Most higher mental functions are carried out by the two hemispheres working together, but each is primarily responsible for different functions. The left hemisphere handles verbal, mathematical, analytical, logical, and rational functions, while the right hemisphere handles nonverbal, spatial-temporal, intuitive, and emotional functions. Brian Christian, author of “The Most Human Human” (2012), identifies the right brain as one of the main differences between humans and artificial intelligence. He argues that computers can already perform the calculation and logic associated with the left brain, but that the emotional work of the right brain is difficult for artificial intelligence to reproduce. However, I believe that if we can figure out how emotions, feelings, and intuition are formed in the brain, we can apply those mechanisms to artificial intelligence. I will examine this possibility through emotions, the most representative function of the right brain and the one most commonly claimed to be beyond the reach of artificial intelligence.
First, the way emotions are generated and expressed in response to situations resembles the way artificial intelligence algorithms work. Emotions are formed on the basis of evolution and memory. Compared with other organisms, mammals have a highly developed limbic system and cerebral cortex, the regions responsible for emotion, and the interaction between these two regions produces the complex reactions we call emotions. Because mammals evolved around survival and reproduction, they share the common emotions of love, fear, anger, and sadness, shaped by behaviors such as mating for reproduction and fleeing from predators for survival. We can see this in a newborn baby who, without any teaching, smiles when content and cries when distressed. Emotions we have without learning are called primary emotions, and they form the foundation on which further emotions are learned.
Let’s take “anger” as an example of how emotions are learned. When a person feels anger in a certain situation, the stimulus triggers the release of hormones, especially adrenaline, and the sympathetic division of the autonomic nervous system responds. We remember this physiological reaction as “anger,” along with the situation that triggered it; this is how information about what causes anger is stored in the brain. By analyzing and applying that stored data, we determine whether a new stimulus should make us feel anger.
Another example comes from brain stimulation experiments. When the part of a patient’s brain responsible for laughter was electrically stimulated, the woman could not suppress her laughter, and she told the doctors, “You’re really funny!”, attributing her laughter to them. What this experiment shows is that the brain makes us feel emotions by sending electrical signals to specific regions in specific situations.
The formation of emotions is similar to the way computers and artificial intelligence algorithms work. A computer stores data in memory, and when a stimulus (a command) arrives, the algorithm routes it to the appropriate part of the processor, retrieves the matching response from memory, and executes it. If human emotions are understood as reactions based on experience and memory, then an artificial intelligence that receives an electrical stimulus matching the stored data for a given emotion and expresses that emotion appropriately is not doing anything fundamentally different from how humans express emotions.
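To make the analogy concrete, here is a minimal sketch of “emotion as memory retrieval.” The situation features, emotion labels, and matching rule are all invented for illustration; this is only a toy picture of the idea, not a description of any real system.

```python
# Toy illustration of the "emotion as memory retrieval" analogy above.
# The situation features and labels are hypothetical.

EMOTION_MEMORY = {
    # (threat_level, goal_blocked, loss): remembered emotional response
    ("high", "no", "no"):  "fear",
    ("low",  "yes", "no"): "anger",
    ("low",  "no", "yes"): "sadness",
    ("low",  "no", "no"):  "contentment",
}

def respond(situation):
    """Return the stored emotional response for a remembered situation,
    falling back to a neutral reaction for anything unfamiliar."""
    return EMOTION_MEMORY.get(situation, "neutral")

print(respond(("low", "yes", "no")))    # anger
print(respond(("high", "yes", "yes")))  # neutral: no matching memory yet
```

A plain lookup like this is obviously far cruder than what either brains or modern AI systems do, but it captures the structure of the argument: a stored association between situations and reactions, retrieved when a matching stimulus arrives.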
One might object that primary emotions are where computers and humans diverge in this process of forming emotions. Of course, artificial intelligence has not evolved from primitive organisms the way mammals have, has not developed a limbic system or cerebral cortex, and has not acquired primary emotions for survival and reproduction through evolution. But what matters is the existence of primary emotions, not the process by which they arose. Even if humans have to supply them, once an artificial intelligence has primary emotions, its ability to gather information about situations and categorize it at exceptional speed makes the limitation of having them supplied by a human seem insignificant. The technologies that make this possible are artificial synapses and deep learning.
Thoughts, emotions, feelings, and memories are all regulated by neurotransmitters in the brain. The brain’s structural and functional units are neurons, and neurotransmitters are secreted at the axon terminals of these neurons. The junction where one neuron connects to another to exchange neurotransmitters is called a synapse. When a neuron receives neural information in the form of an electrical signal, the neurotransmitter stored at its axon terminal is released and the signal is passed to the next neuron across the synapse. The receiving neuron opens its sodium and potassium ion channels, generating an action potential that spreads onward and carries the signal to neighboring cells. This is a greatly simplified account of how thoughts, emotions, moods, and memories are controlled, but these small interactions between neurons add up to form human thought and emotion.
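As a rough illustration of this accumulate-and-fire behavior, the toy “leaky integrate-and-fire” model below sums incoming pulses into a membrane potential that leaks over time and fires when a threshold is crossed. The constants are arbitrary values chosen for illustration, not biological measurements.

```python
# A minimal leaky integrate-and-fire neuron: incoming synaptic pulses are
# summed into a membrane potential that leaks over time; when it crosses a
# threshold the neuron "fires" and resets. All constants are toy values.

def simulate(inputs, leak=0.9, threshold=1.0):
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current   # integrate input, leak charge
        if potential >= threshold:               # action potential
            spikes.append(t)
            potential = 0.0                      # reset after firing
    return spikes

# Prints [3]: only the closely spaced pulses build up enough charge to fire;
# isolated pulses simply decay away.
print(simulate([0.4, 0.0, 0.4, 0.4, 0.4, 0.0, 0.4]))
```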
Artificial synapses are created by mimicking this neuron-synapse system. Hardware that imitates human neurons has enabled artificial intelligence to respond immediately to new situations. Previously, memory and input values were stored in a one-to-one mapping, which limited both reaction speed and the variety of possible responses. Artificial synapses, which mimic the way one neuron interacts with dozens of others, not only overcome these limitations but also push deep learning technology further, producing new hardware that resembles the human brain both morphologically and functionally.
The development of artificial synapse technology has in turn advanced deep learning. Deep learning is used to cluster or classify objects and data, and its core is classification through pattern analysis. The amount of information an artificial intelligence receives is enormous, and analyzing and classifying it required hardware more efficient than what existed before. This led to artificial neural networks, which build on artificial synapses. Such networks run on dozens of CPUs (central processing units) and GPUs (graphics processing units): the CPUs process information quickly, while the parallel processing enabled by GPUs makes it possible to classify astronomical amounts of data in a short time. We are approaching a point where artificial intelligence stores and classifies data on the various situations that trigger emotions, and then outputs an appropriate emotional expression when it encounters a new situation. Classifying and predicting through pattern analysis, and connecting dozens of CPUs and GPUs in artificial neural networks, closely resembles, both structurally and functionally, the interactions among neurons in the limbic system and cerebral cortex of our brains. In fact, in March, scientists from France and the United States announced that they had developed a memristor, a solid-state artificial synapse for artificial brains that can learn on its own and even mimics the plasticity of human synapses.
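To show what “classification through pattern analysis” looks like in code, here is a minimal sketch of a tiny two-layer network trained by gradient descent on invented data. The features, labels, and network sizes are hypothetical; real deep learning systems differ mainly in scale and in the hardware they run on, not in the basic loop shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 4 made-up situation features -> 1 if the situation should
# trigger a "fear" response, 0 otherwise.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float).reshape(-1, 1)

# A tiny network: 4 inputs -> 8 hidden units -> 1 output probability.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(2000):
    h = np.tanh(X @ W1 + b1)          # hidden-layer pattern detectors
    p = sigmoid(h @ W2 + b2)          # predicted probability of "fear"
    grad_out = (p - y) / len(X)       # gradient of the cross-entropy loss
    grad_h = (grad_out @ W2.T) * (1 - h ** 2)
    W2 -= h.T @ grad_out; b2 -= grad_out.sum(0)
    W1 -= X.T @ grad_h;   b1 -= grad_h.sum(0)

print("training accuracy:", ((p > 0.5) == y).mean())
```

The point of the sketch is that the “learning” consists entirely of adjusting connection strengths until the stored patterns reproduce the desired responses, which is the same picture the essay draws for synapses.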
There is a counterargument that human synapses do not merely send and receive signals but also adjust the strength of the signals they transmit, and that this variability is difficult for simple hardware to match; in other words, that the biological characteristics of humans cannot be fully replaced by mechanical parts. The memristor mentioned above, however, is a good example of why this is not a hard limit. A memristor changes its conductance according to the voltage applied across it, and this history-dependent change gives it plasticity. As technology advances, new kinds of hardware appear that did not exist before. These components may not be identical to those of the human body, but they can perform the same structural and functional roles.
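As a rough sketch of what voltage-driven plasticity means, the toy class below strengthens or weakens its “conductance” depending on the pulses applied to it, so the signal it passes on depends on its history. The update rule and constants are invented for illustration and do not describe the specific device reported by the French and US groups.

```python
# Toy model of memristor-like plasticity: the device's conductance ("weight")
# drifts up or down with the voltage pulses applied across it, so the strength
# of the transmitted signal depends on its history. All values are invented.

class ToyMemristor:
    def __init__(self, conductance=0.5, rate=0.05):
        self.g = conductance          # current synaptic strength (0..1)
        self.rate = rate

    def pulse(self, voltage):
        """Apply a voltage pulse; positive pulses potentiate, negative depress."""
        self.g = min(1.0, max(0.0, self.g + self.rate * voltage))
        return self.g

    def transmit(self, signal):
        """The signal passed on is scaled by the learned conductance."""
        return self.g * signal

m = ToyMemristor()
for _ in range(5):
    m.pulse(+1.0)          # repeated potentiation strengthens the "synapse"
print(m.transmit(1.0))     # 0.75: a stronger response than the initial 0.5
```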
Emotions are created and expressed on the basis of evolution and stored memory, which makes them similar to a computer retrieving stored data. Meanwhile, artificial synapses that mimic human synapses have been developed, and they have advanced deep learning. Deep learning enables situations to be classified rapidly, which in turn triggers an appropriate response right away. Putting these three elements together, we can conclude that artificial intelligence has structural and functional units similar to those of the human brain and can use vast amounts of data to respond to situations quickly and accurately enough to produce emotions. We have used emotions as the example, but the other nonverbal, spatiotemporal, intuitive, and emotional functions of the right brain are formed through the same process. Since all of these functions arise from interactions between synapses in the brain, the conclusion that artificial intelligence can have emotions extends to the conclusion that it can perform the role of the right brain.
We have discussed how emotions, the most representative role of the right brain, were formed through evolution and how they are expressed in each situation; what artificial intelligence and human synapses have in common; and how artificial synapses, combined with advanced artificial neural networks, allow rapid classification and therefore rapid responses to situations. In short, the emotions created for survival and the propagation of the species remain in mammalian brains, telling us which emotions to express in which situations, and the same is possible for artificial intelligence. In addition, artificial synapses that mimic the synapses of the human brain have driven the development of deep learning, which lets artificial intelligence respond immediately to situations by rapidly classifying vast amounts of data. I therefore agree with the claim that artificial intelligence can fully take on the role of the right brain.
The conclusion that artificial intelligence can perform the role of the right brain is significant. If an artificial right brain is developed to assist or replace the human right brain, the most anticipated social impact is a new option for patients suffering from right-brain disorders. ADHD (attention deficit hyperactivity disorder), whose incidence has been rising, is commonly described as a right-brain disorder. Its symptoms might be alleviated by using artificial intelligence to reinforce right-brain functions, for example by training an AI to play the role of a healthy right brain or by implanting an AI-equipped chip in the right hemisphere. The same approach might also open up possibilities for treating or alleviating autism and nonverbal learning disorders. In such cases, however, we would need to define the boundaries of the patient’s identity clearly: how much of the person is human and how much is artificial intelligence. In extreme cases, if the brain is replaced with an artificial one, there will be an ongoing debate over whether the patient should be considered a human or a cyborg. Furthermore, if an artificial intelligence appears with the functions of both hemispheres and a body of its own, not merely as an aid to humans, there will be controversy over whether it should be considered a new species.