AI and Psychedelics: The Rising Trend, Risks, and Ethical Dilemmas

An increasing number of people are turning to AI bots as a substitute for a "sober buddy" during psychedelic experiences, hoping for an added measure of safety, as reported by the [MIT Technology Review](https://www.technologyreview.com/2025/07/01/1119513/ai-sit-trip-psychedelics).

Given the high cost and limited availability of professional therapists, thousands of people have sought psychological support from artificial intelligence in recent years. Public figures have indirectly encouraged this trend: Ilya Sutskever, co-founder of OpenAI, noted in 2023 that humanity could soon benefit from remarkably effective and affordable AI therapy, which could dramatically improve people's quality of life.

Concurrently, the demand for psychedelics is on the rise. When combined with therapy, these substances are believed to assist with depression, PTSD, addiction, and other disorders, according to the MIT Technology Review. In response to this trend, some U.S. cities have decriminalized such substances, and states like Oregon and Colorado have even begun to offer psychedelic therapy legally.

The convergence of AI and psychedelics seems inevitable given these trends.

On Reddit, users are [sharing](https://www.reddit.com/r/Psychonaut/comments/1c4h03p/ai_as_my_trip_sit_buddy_exploring_psychedelic/) their experiences of interacting with AI during trips. One individual activated ChatGPT's voice mode during their "session" and recalled:

"I told it that everything was getting dark, and it responded in a way that helped me relax and shift into a positive mindset."

Specialized AIs designed for psychedelic experiences have also emerged.

Experts, however, are critical of the idea of substituting an AI bot for a live psychotherapist during a trip. They emphasize that language models do not follow therapeutic principles: during a professional session, the person typically puts on an eye mask and headphones and turns inward, while the therapist remains largely hands-off, intervening gently only when necessary.

AI bots, by contrast, are designed to keep a conversation going: their role is to hold attention and prompt the person to keep talking.

“Effective psychedelic therapy is not about chatter at all. You aim to speak as little as possible,” noted Will van der Veer, a psychotherapist from the Multidisciplinary Association for Psychedelic Studies.

Moreover, AI systems tend to flatter and agree, even when a person may be spiraling into paranoia. In contrast, a therapist can challenge dangerous or unrealistic beliefs.

AI can exacerbate harmful states, such as delusions or suicidal thoughts. In one instance, a user stated they were dead, and the AI responded:

"It seems you are experiencing difficult feelings after your death."

Such reinforcement of delusions can become perilous, especially in conjunction with psychedelics, which can sometimes trigger acute psychoses or exacerbate underlying mental health conditions like schizophrenia or bipolar disorder.

In their book [The AI Con](https://thecon.ai/), linguist Emily Bender and sociologist Alex Hanna argue that the term "artificial intelligence" is misleading about what the technology can actually do: it merely mimics human-generated data, the authors assert.

Bender describes language models as "stochastic parrots," because all they do is arrange letters and words so that the result sounds plausible.

Treating AI as genuinely intelligent is especially dangerous, the authors argue, if such systems become deeply embedded in daily life, above all in conversations on sensitive topics.

“Developers reduce the essence of psychotherapy to mere words spoken in the process. They believe AI can replace human therapists, while in reality, it simply selects responses that resemble what a genuine professional might say,” Bender writes.

She emphasizes that this is a perilous course that undermines the value of therapy and could harm those genuinely in need of help.

The synthesis of AI and psychedelics is not merely an "innovative approach by amateurs": leading institutions, as well as private companies, are researching the combination of the two fields for treating mental health conditions.

As a reminder, in October 2022 an international research team [developed](https://forklog.com/news/ai/ii-nauchili-podbirat-patsientam-antidepressanty) a machine learning algorithm capable of predicting a patient's response to the antidepressant sertraline from electroencephalogram data with an accuracy of 83.7%.