Is Artificial Intelligence Becoming a "Dangerous Alternative" to Psychotherapy?

Written By Mark

Mental health experts in Britain have warned of the growing danger of individuals turning to artificial intelligence for psychological support, as digital chatbots spread widely.

Specialists stress that these tools, however useful for some daily tasks, can neither understand human emotions nor ask the questions necessary for an accurate diagnosis, which makes their use as a substitute for psychotherapy genuinely dangerous.

Artificial intelligence has become part of daily life, with users turning to it for personal and professional advice, travel planning, and even basic health guidance.

But this widespread use has also produced cases of harmful guidance and raised alarm, following reports of people taking their own lives after receiving advice from chatbots rather than from specialists.

Experts who spoke to Anadolu said that relying on artificial intelligence may exacerbate cases of depression and anxiety instead of treating them, especially when the information provided is incomplete or general and does not take into account the patient’s personal context.

They emphasized that handling a psychological crisis requires a human being, not an algorithm.

A bottomless well

Neuropsychologist Alp Tekin Aydin, who runs a clinic in north London, said he uses artificial intelligence only within a narrow professional framework, through a closed system intended for specialists, and stressed that a wide gap separates such professional systems from the tools available to the public.

Aydin explained: “Artificial intelligence is a bottomless well, and if it is trained in a specific field, it can provide good answers, but a general chatbot does not know who you are, does not know your history, nor does it know the circumstances you have been through… so its answers are general, and sometimes misleading and dangerous.”

He pointed out that many patients arrive at his clinic after having consulted chatbots about medications, doses, and complex psychological problems, a pattern that could carry serious consequences for society.


He added that artificial intelligence does not build a comprehensive picture of the patient the way a human therapist does: a therapist listens to the patient's family, social, and medical background and asks questions gradually before offering any guidance.

Greater risk for teens

In Aydin's view, the greatest danger young people face today is treating artificial intelligence's answers as entirely correct, without realizing they are based on incomplete information.

He added: “When you write a short sentence to get treatment advice, you do not provide any details…no history, no relationships, no circumstances…so the advice you get is not applicable to reality.”

He pointed out that the British Ministry of Health advised doctors not to use artificial intelligence to diagnose patients, stressing the need for health data to remain in closed systems to preserve privacy.

Misleading advice

Aydin said he tested a chatbot by posing as a child who was being mocked over poor performance in mathematics, and that in the experiment the AI offered the supposed child a series of unrealistic suggestions.

He noted that the advice provided during the experiment cannot be implemented by a child suffering from social difficulties.

The expert noted that artificial intelligence's solutions may look logical on paper but are entirely removed from psychological and educational reality.

He stressed that the therapist's real role is to dismantle the problem at its roots and help the child develop realistic tools for dealing with his surroundings, not to offer polished but inapplicable responses.

Aydin stressed that psychological problems cannot be addressed through general questions, saying: "How can someone who suffers from depression or an obsessive disorder get a solution through a single question? Psychotherapy is a series of deep questions, while artificial intelligence offers limited options that ignore the details."

He warned that over-reliance on artificial intelligence would erode users' ability to make decisions, much as people stopped memorizing phone numbers once they came to rely on smartphones.

No substitute for human compassion

In turn, Roman Rachka, President of the British Psychological Association, said that artificial intelligence, for all its benefits, cannot replace real human support in mental health care.

He added: "There is a real danger in creating the illusion of communication; artificial intelligence may appear understanding, but it does not have human empathy."

Although it can serve as an auxiliary tool available around the clock, Rachka stressed that it should be integrated as a supportive element within psychological services, not as a substitute for them.

He also called on the government to increase investment in specialized cadres to meet the growing demand for psychological treatment.

Rachka concluded by emphasizing that artificial intelligence, however advanced, will never be a magic wand, but only a tool that helps specialists when used within the right framework and under direct human supervision.