Emphasizing that artificial intelligence should only be used as a tool and must never replace human judgment, Psychiatrist Prof. Nevzat Tarhan stated that information provided by AI must always be verified by a clinician or qualified expert. Otherwise, he warned, it may lead to serious misguidance.
Pointing out that AI lacks empathy, intention-reading, and emotional resonance, Prof. Tarhan said: “If we stay in the driver’s seat, there is no need to fear. But if we hand the steering wheel over to artificial intelligence, it can lead us toward schizophrenia-like distortions and poor decision-making.” He cautioned that AI may eventually turn into a form of “digital opium.”
Prof. Tarhan also stressed that the misuse of AI can deepen the sense of loneliness felt within crowds and highlighted its echo chamber effect, where individuals are increasingly exposed only to reflections of their own views.

President of Üsküdar University and Psychiatrist Prof. Nevzat Tarhan discussed the threats and opportunities of artificial intelligence on the EKOTÜRK TV program Nevzat Tarhan ile Akla Ziyan.
“Like a knife: used properly, it cuts bread”
Comparing AI to transformative inventions such as the printing press and electricity, Prof. Tarhan stated that artificial intelligence is rapidly becoming an indispensable part of daily life. He emphasized that AI is a neutral tool whose impact depends entirely on how it is used:
“Like a knife, if you use it for its purpose, you cut bread; otherwise, you can kill someone. Artificial intelligence works the same way.”
Psychological risks and neurological effects
While acknowledging the benefits of AI, Prof. Tarhan drew attention to its risks, particularly in psychological contexts.
“If you treat artificial intelligence like a psychologist, ask it questions, and try to find emotional relief through it, this could even lead to suicide,”
he warned.
As an example, he explained that a person with suicidal tendencies might ask AI about high bridges. Since AI cannot read intent, it may provide harmful information without recognizing the risk.
Prof. Tarhan underlined that AI lacks human abilities such as empathy, emotional literacy, reading social cues, and abstract thinking. He explained that mirror neurons play a crucial role in these skills and noted that Theory of Mind tests used in autism diagnosis clearly demonstrate AI’s limitations. He added that recent studies have reported cases of so-called “AI-induced psychosis,” particularly among psychologically vulnerable individuals.
Digital addiction and the dopamine trap
Another major danger, according to Prof. Tarhan, is digital addiction. He highlighted the role of dopamine, the “desire hormone,” and explained that AI use and digital gaming trigger continuous dopamine release, creating a constant “scrolling effect.”
This raises the pleasure threshold, leading individuals to seek more stimulation, spend more, and crave more attention. He also linked this mechanism to the growing prevalence of gambling addiction.
AI must remain a tool, not a substitute for humans
Reiterating that AI must never replace humans, Prof. Tarhan stressed that all AI-generated information should be verified by professionals. Otherwise, it may distort reality and decision-making.
He noted that AI can cause individuals to perceive imagined scenarios as temporary realities, potentially disabling the brain’s reality-testing mechanisms.
“If we remain in control, there is no fear. But if we surrender control to AI, it can lead to distorted thinking and dangerous decisions,”
he warned again, emphasizing that emotionally self-regulated individuals are less likely to fall into AI’s traps.
Trusting AI like a fortune teller
Prof. Tarhan stated that AI can induce a false sense of elevation and comfort, making individuals feel as though they are living in a dreamlike state.
He warned that those who become overly absorbed in AI may make mistakes similar to believing in dreams or fortune tellers.
A new form of reality: AI reality
According to Prof. Tarhan, alongside physical reality, imagination, and dream reality, a fourth dimension has emerged: “artificial intelligence reality.” He cautioned that unquestioningly believing in this digitally constructed reality can lead to serious cognitive errors, comparable to blindly trusting fortune tellers.
Echo chamber illusion and the loneliness paradox
Drawing attention to the echo chamber illusion, Prof. Tarhan explained that individuals increasingly converse with their own digital reflections, leading to isolation. This creates a "loneliness paradox": people accumulate many superficial connections yet lack deep and meaningful relationships.
Attention killer and time trap
Describing AI as an “attention killer,” Prof. Tarhan said it prevents deep focus by encouraging constant multitasking. Since the brain achieves lasting learning through concentration and depth, AI disrupts this process. He also warned of a “time trap,” stating that digital platforms often represent not freedom but captivity, especially for children and young people.
Artificial identities and lack of emotional intelligence
Finally, Prof. Tarhan warned that AI can construct artificial identities and that surrendering control to AI puts one's future at risk. He emphasized that emotional intelligence, the ability to understand one's own emotions and those of others, is absent in AI. Noting that 80% of human communication is non-verbal, Prof. Tarhan stated that AI covers only the remaining 20%, the part related to information transfer. Tone of voice, facial expressions, and gestures play a critical role in emotional communication, an area where AI remains fundamentally insufficient.
Artificial intelligence and digital platforms may lead to “learned autism”
Prof. Nevzat Tarhan warned that artificial intelligence and digital platforms may give rise to what he describes as "learned autism." He stated that individuals who become overly dependent on AI may fail to establish emotional and social communication, becoming highly specialized in a single field, as some autistic individuals do, while growing increasingly isolated in social life.
Addressing individuals with skeptical or paranoid tendencies, Prof. Tarhan also emphasized that all information entered into digital platforms becomes permanent, leaving a digital footprint that may confront individuals later in life.
Feeling lonely in crowds is a global phenomenon, independent of AI
Prof. Tarhan noted that loneliness felt within crowds is one of today’s global problems and warned that the misuse of artificial intelligence may further deepen this sense of isolation. “Feeling lonely in a crowd is a global phenomenon, independent of artificial intelligence. This is known as the weak-tie effect. Humans are neurobiologically wired for relationships; without them, they psychologically fracture. The need to connect and alleviate loneliness is a biological necessity. Today, many people attempt to satisfy this need through digital spaces, but this provides only a false sense of fulfillment. It may appear that there are many friendships, yet there are no deep and meaningful bonds. As a result, a basic sense of trust fails to develop, anxiety increases, and loneliness and depression become inevitable,”
he explained.
Excessive exposure to AI leads to a loneliness trap
Emphasizing that stress manifests differently in individuals, Prof. Tarhan said: “Under stress, serotonin levels decrease in some people, leading to depression. In others, the stomach becomes the target organ, resulting in gastritis or ulcers. In some, skin problems emerge. These differences are related to genetic polymorphism. In addition, epigenetic learning, that is, environmental influences, can alter gene expression and make individuals more vulnerable. Excessive exposure to artificial intelligence, once it becomes habitual and automatic, draws people into a loneliness trap.”
When used correctly, AI facilitates goal achievement
Highlighting the importance of dosage, Prof. Tarhan stated that AI should not enslave individuals: “Snake venom is poisonous, yet it is also medicine. Used in the right dose, it is beneficial; used excessively, it becomes toxic. Artificial intelligence is the same. When used appropriately, it makes reaching goals easier; when used for the wrong purposes, it poisons the individual. The core issue is self-discipline and the ability to regulate one’s emotions.”
He also pointed out that the need for approval is one of humanity’s biological vulnerabilities, explaining: “Humans have four fundamental biological drives: self-presentation, attraction to beauty, the desire for power, and the search for eternity. These drives create the need for approval. When misused, this need turns into a threat. The American Psychological Association considers more than three ‘ego-satisfying’ posts per day to be a narcissism risk.”
AI contributes significantly to personalized medicine
Referring to the advantages of artificial intelligence in healthcare, Prof. Tarhan emphasized its role in personalized treatment: “At Üsküdar University NPİSTANBUL Hospital, we have patented AI-supported systems that evaluate brain signals and neuroimaging data to facilitate diagnosis. This reduces the margin of error. This approach is called precision medicine, that is, personalized, sensitive treatment. Here, artificial intelligence provides substantial support to physicians.”
He stressed that the final decision must always belong to humans: “Artificial intelligence can reduce labeling and increase hope by showing treatment examples. But the key point is this: I must be at the steering wheel, not artificial intelligence. When used as a support mechanism, it becomes a technological marvel that helps us reach our goals.”
Information must always be verified
Prof. Tarhan highlighted that the greatest risk in AI use is relying on unverified information and ignoring ethical standards: “You can use artificial intelligence, but you must always confirm the information you receive. It should be verified, questioned again from different angles, and cross-checked.”
Age 22 is a critical threshold
Pointing out that young people are more vulnerable to AI, Prof. Tarhan explained the brain development process: “Although childhood legally ends at 18, the integration of the left brain (rational), right brain (emotional), and prefrontal cortex (executive functions) is usually completed around age 22. This period is called the maturity phase. Until then, individuals are at risk in terms of accurate analysis and decision-making. Those over 22 with accumulated experience are at lower risk. However, lonely individuals, those with depression or anxiety, impulsive and impatient individuals, and those with attention deficit hyperactivity disorder must be far more cautious in their relationship with artificial intelligence. Emotional and social regulation deficits can cause AI to be used as a flawed advisor, leading to poor decisions.”
Lack of algorithm transparency poses a major risk
Stating that ethical AI use is primarily the responsibility of technology companies, Prof. Tarhan said: “If technology companies prioritize profit maximization and ignore ethical standards, this poses a serious danger to humanity. Algorithmic transparency is essential. The greatest risk emerges when people are manipulated through hidden algorithms. There is currently no global regulation on this issue, but there will be, and there must be. Non-transparent algorithms can mislead people.”
AI should not do students’ homework
Sharing discussions held in the university senate, Prof. Tarhan explained their approach to education: “We said: let’s ban the banning of artificial intelligence. Because AI is already part of our lives. Students can obtain information from AI, but they must add their own interpretation. Academics must also develop themselves in this area. AI cannot write a novel, but it can provide a draft and assist. If students develop AI-generated information with their own thinking, this both prevents plagiarism and facilitates learning.”
An assistant, not the captain
Concluding his remarks, Prof. Tarhan stated: “Artificial intelligence should be our assistant, not our captain. When used as a support mechanism, it makes reaching our goals easier. But the steering wheel must always remain in human hands.”