Artificial intelligence is not at a point where it can replace psychologists yet!


Artificial intelligence, which has rapidly integrated into every aspect of our lives in recent years, has also raised a question in the world of psychology and therapy: 'Can it replace psychologists?'

Stating that this question is frequently asked today, Asst. Prof. Elif Kurtuluş Anarat, Head of the Psychology (English) Department at Üsküdar University, said, “Artificial intelligence is not currently at a point where it can replace psychologists. The main determinants in therapy are empathy, human-specific intuitions, and the ability to establish a genuine emotional connection. Artificial intelligence can imitate these aspects to a certain extent, but it is not possible for it to establish ‘genuine empathy.’”

Anarat stated, “AI literacy is essential. Future therapists must be individuals who know how to use technology ethically and in compliance with professional codes.”

Asst. Prof. Elif Kurtuluş Anarat, Head of the Psychology (English) Department at Üsküdar University, evaluated the role of artificial intelligence, one of today's most debated topics, in the field of psychology and therapy.

Can artificial intelligence replace psychologists?

Stating that the question 'Can artificial intelligence replace psychologists?' is frequently asked today, Asst. Prof. Anarat said, “Artificial intelligence is not currently at a point where it can replace psychologists. This is because the essence of being a therapist is not merely to convey information or follow a protocol. The main determinants in therapy are empathy, human-specific intuitions, and the ability to establish a genuine emotional connection. Artificial intelligence can imitate these aspects to a certain extent, but it is not possible for it to establish ‘genuine empathy.’ Therefore, it would be more accurate to view artificial intelligence not as a figure replacing the psychologist, but rather as an auxiliary tool supporting the process.”

Human therapists build a bond with clients based on warmth, trust, and empathy

Noting that the biggest difference between therapy conducted with a human psychologist and AI-supported therapy is “emotional depth,” Asst. Prof. Anarat continued:

“A human therapist establishes a bond with their client based on warmth, trust, and empathy. This bond itself creates a healing effect. Artificial intelligence, on the other hand, can be stronger in terms of accessibility and practicality. Chatbots are available 24/7, fast, and low-cost. However, they cannot provide the insight, flexibility, and depth of relationship that a human therapist offers. Indeed, some studies have shown that individuals working with artificial intelligence experienced a reduction in anxiety levels, but the recovery rate in therapy delivered by humans is much higher.”

Artificial intelligence transforms the role of psychologists

Asst. Prof. Anarat also touched upon the transformation that AI-supported psychological applications have created in the profession:

“It must be stated clearly that while artificial intelligence offers some conveniences in psychology, it cannot replace human psychologists. Yes, chatbots can be available 24/7, fast, and advantageous in terms of cost. However, these applications mostly offer superficial support. The real healing comes from the client finding a therapist who genuinely listens, feels their emotions, and can adapt to their needs. Artificial intelligence can imitate this human dimension but cannot genuinely establish it. This transformation actually makes the role of psychologists even more important. Because now, our responsibility is not only to provide therapy but also to guide technology correctly, observe ethical boundaries, and place the data provided by artificial intelligence within a human framework. In other words, instead of pushing the profession of psychology into the background, artificial intelligence actually reminds us that we are ‘indispensable subjects who establish a human connection alongside the client.’”

AI making sole decisions in crisis moments is a risk

Asst. Prof. Anarat also pointed out the tasks that artificial intelligence can undertake in psychological support and therapy, stating, “Currently, artificial intelligence is mostly used in administrative processes. For example, it is quite practical for tasks like appointment scheduling or note-taking. Additionally, there are chatbots that apply cognitive behavioral therapy techniques for anxiety or depression. They can be successful in providing psycho-education, tracking daily mood, and sending personalized reminders. However, in moments of crisis, for instance when encountering a client at risk of suicide, allowing artificial intelligence to make decisions on its own poses a great risk. Therefore, these tools must be used under human supervision.”

AI can imitate empathy through certain patterns, but it is not genuine

Asst. Prof. Anarat stated that artificial intelligence can imitate empathy through certain patterns, adding, “Some studies even show that users perceive AI responses as more empathetic than those of human therapists. But there is a critical difference here: this is ‘reflected empathy,’ meaning responses are generated based on the user's commands and guidance. In therapy, trust and healing are possible when you know the person opposite you genuinely feels your emotions. Artificial intelligence can approach this, but not with the same authenticity.”

People can establish a bond with artificial intelligence, but this must be under therapist supervision

Addressing whether therapeutic trust and connection can be established in human-machine interaction, Asst. Prof. Anarat said, “In the short term, yes, people can establish a bond with artificial intelligence. Sometimes this bond can make a person feel supported. But in the long term, there is a risk of false intimacy forming. This can lead to negative consequences like dependency, loneliness, or avoidance of human relationships. Therefore, even if a trust relationship is established with artificial intelligence, its boundaries must be clearly defined and it must remain under the supervision of a human therapist.”

Artificial intelligence is not a decision-maker, just a tool

Answering the question of who would be responsible if artificial intelligence provides an incorrect therapy recommendation, Asst. Prof. Anarat stated, “Responsibility must always lie with humans. Artificial intelligence is not a decision-maker; it is merely a tool. Therefore, therapists must be aware of the limitations of the systems they use, obtain informed consent from clients, and ensure data security. If an error occurs, the responsibility belongs to the therapist and the institutions that developed the system.”

Future therapists must be 'AI literate'

Asst. Prof. Anarat also noted that students should learn to see this technology not as a threat, but as a supportive tool when used correctly, adding, “For this, AI literacy is essential. This means they should learn not only how artificial intelligence works but also its ethical boundaries, its risks, and how to preserve a human-centered approach. Future therapists must be individuals who know how to use technology ethically and in compliance with professional codes.”
 

Üsküdar News Agency (ÜHA)


Update Date: February 26, 2026
Creation Date: October 17, 2025
