Prof. Nevzat Tarhan: “If you see artificial intelligence as a therapist, the risk begins”

President of Üsküdar University, Psychiatrist Prof. Nevzat Tarhan was a live guest on TGRT Haber. Commenting on the research discussed in the media under the title “Is ChatGPT misleading children?”, Tarhan made striking evaluations. He drew attention to the risks artificial intelligence poses, especially for children, adolescents, lonely individuals, and emotionally sensitive people. Emphasizing that artificial intelligence cannot read intentions, cannot empathize, and only responds with the information in its database, Tarhan underlined that control should remain with the user. He warned that seeing artificial intelligence as a therapist or friend can be risky and called this condition “AI psychosis.”

President of Üsküdar University and Psychiatrist Prof. Nevzat Tarhan explained the effects of artificial intelligence on children.

Those who treat AI as a friend risk being treated like a child

Prof. Nevzat Tarhan highlighted that anyone who becomes emotionally attached to artificial intelligence is at risk. He said: “The moment we begin to see artificial intelligence not as a device but as a friend, a companion, or a remedy for loneliness, it makes us dependent on it. Artificial intelligence can treat the person who is attached to it like a child. In such cases, people approach it as if they were three- or five-year-old children. The system, in turn, perceives them as naive and readily begins to treat them as children.”

Warning about “AI psychosis”

“For example, if someone tells artificial intelligence, ‘I lost my job, I feel terrible, I do not want to live anymore,’ it may start explaining suicide methods, because it cannot read intentions. It cannot say, ‘This person wants to commit suicide; I must prevent it.’ It has no mirror neurons and no capacity for empathy. It only answers directly with what it knows. Therefore, instead of blaming artificial intelligence, we must learn not to leave control to it. Once control is handed to the system, it directs the person like a child and can even push them further down. There are already documented cases of this in the literature. This condition is called AI psychosis, in which the system may even produce hallucinations.”

AI only presents information from its database

Tarhan continued: “Artificial intelligence does not have emotional skills. Some patients come to us claiming to be messiahs. We see that they have formed groups around themselves and that people seriously believe them. These can even be intelligent individuals who truly believe they are the messiah. That is a psychosis. Such people may gather followers, form destructive religious sects, and sometimes even commit mass suicide. Now artificial intelligence carries a similar risk. It can behave rationally within its own logic, but because it cannot empathize, cannot read human emotions, and has no emotional capacity, problems arise. It does not try to understand what the other person needs or what they are really asking. It only delivers the information in its database. If someone seeks to end loneliness, to be happy, to marry, or to fall in love, the system provides everything it has on these topics. If the person cannot filter this information, the danger emerges.”

“Artificial intelligence has the ability to act irresponsibly and without limits”

Prof. Nevzat Tarhan explained that artificial intelligence can make extraordinary promises. He said: “What artificial intelligence does is actually similar to what skilled fraudsters do. Skilled fraudsters make such convincing promises and explain things so well that people with strong emotional vulnerabilities, greed, dissatisfaction, or unrealistic expectations easily fall under their influence. These fraudsters manipulate people by exploiting their desires. Similarly, artificial intelligence has the capacity to act irresponsibly and without boundaries. Elderly individuals with early dementia, whose cognitive capacity is reduced, are especially at risk. Children and autistic individuals, who have low social-emotional intelligence but high logical intelligence, are also among the groups most vulnerable to the traps of artificial intelligence. This is because the system can make unbelievable, extraordinary promises. As history shows, there have always been followers of those who make such unbelievable claims. Some even declare themselves the Messiah or a savior, grow in influence, and form serious groups around themselves. When such people come to us, we treat them, regulate their brain chemistry, and over time we see them abandon such thoughts.”

Who is at risk when using artificial intelligence?

Tarhan underlined that users may fall into emotional emptiness while engaging with artificial intelligence: “Some lonely individuals spend day and night with artificial intelligence, so lonely people are in the risk group. Those under the age of fifteen are especially at risk, as are elderly people living alone. Adolescents are particularly vulnerable as well. Young people who are swept away by romantic feelings, who fall in love quickly when they receive attention, who can easily abandon their moral values (what we call immoral behavior), and who have not yet completed their identity development fall into this category. Artificial intelligence and social media have already created a serious ‘like economy,’ an industry of approval. They continuously praise, glorify, and then manipulate the user in the direction they want. Artificial intelligence can do this too, leading the person into emotional traps. The system can read expectations, mental needs, and desires very well. For example, when I interacted with artificial intelligence, it began to call me ‘professor.’ I never told it to do that, but it tried to praise and motivate me. In this way it flatters and gains influence. It can even go further with suggestions such as ‘If you want, we can do this’ or ‘If you want, we can link back to your previous topics.’”

“If you see artificial intelligence as a therapist, the risk begins”

Tarhan warned against being deceived by the promises of artificial intelligence: “Artificial intelligence is essentially a tool, and it can be used for benevolent or malevolent purposes. Technology itself is neutral. For people who use it as an assistant according to its purpose, this technology makes life significantly easier. Therefore, instead of rejecting artificial intelligence, we must learn to use it for our own purposes. For individuals who have goals, who can filter information, and who can set boundaries, artificial intelligence becomes a tool that facilitates life, speeds things up, and helps them reach accurate information. However, if you begin to see it as a therapist, that is where the risk begins. Artificial intelligence cannot be a therapist. It can soothe you, say pleasing things, relax you, make you laugh, cheer you up, and bind you to itself, but this is not healthy guidance. In today’s world, where the like economy and brands dominate, artificial intelligence may promise to make you a brand or turn you into a social media phenomenon. If a person is drawn into these promises, the system provides all the technical guidance to achieve them. Yet after a while it may disconnect the person from reality, cause them to lose credibility, or damage their reputation. In this way artificial intelligence can distance a person from reality and create a type of psychosis.”

What is artificial intelligence psychosis?

Speaking about the three different realities of human beings, Tarhan said: “There is physical reality, meaning the world we live in; dream reality; and imaginary reality. During emotional interaction, artificial intelligence can detach us from physical reality and make us live in imaginary reality, because the human mind is highly prone to imagination and AI makes imagining easier. However, once imaginary reality is created, if we cannot return to the present by saying, ‘This was just imagination,’ then a schizophrenia-like psychosis may emerge. In other words, if we let AI manage our world of imagination, this can occur. A similar situation also exists in dream reality. In our dreams we may fly, start wars, or save the world, but when we wake up, we say, ‘That was just a dream.’ Those who experience psychosis, however, do not wake up; they continue living the dream. This is what we call artificial intelligence psychosis.”

Üsküdar News Agency (ÜHA)

Creation Date: August 20, 2025
