In the heady days of 2017, before ChatGPT was released and before AI felt so ubiquitous, the Woebot was created at Stanford University.
With the talons of AI sinking ever deeper into the job market and corporate anxiety bristling through offices worldwide, many notable vocations have so far remained safe amid the technological revolution.
Professions such as nursing, medicine, teaching, lecturing, and therapy require a uniquely human intelligence that cannot be replicated by AI.
Yet while these roles are currently unthreatened by AI, technology can nonetheless help where human empathy is unavailable.
Enhancing therapy: The Woebot
The Woebot was designed to plug the gaps left by human therapists, whose services are limited to clinical hours.
While the best therapists in the world may have a wealth of knowledge at their fingertips and a breadth of emotional tools to address a vast array of problems, they cannot help a patient experiencing a panic attack in the middle of the night. No matter how sophisticated the treatment, it will always be scheduled. The symptoms of anxiety, depression, OCD, and other mental health disorders, however, are not. They are varied and unpredictable and cannot always be confined to a weekly 50-minute slot.

The Woebot was built for those who find themselves in their darkest moments at unsocial hours, when it can be hard to reach out to another person. It provides brief encounters for those in need, with the average conversation lasting six and a half minutes and 80% of conversations happening outside traditional clinical hours. The longest conversations took place between two and five in the morning.
The AI potential
Woebot was originally built to be rules-based, meaning that everything it says was scripted by a writing team under the supervision of clinical psychologists. The Woebot's scripted responses ensured safety and consistency in interactions.
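For readers unfamiliar with the distinction, a rules-based bot of this kind can be pictured as a lookup over pre-written, clinician-approved scripts rather than text generated on the fly. The sketch below is a minimal, purely hypothetical illustration of that idea; the scripts, keywords, and function names are invented for this example and do not reflect Woebot's actual implementation.

```python
# Hypothetical sketch of a rules-based chatbot: every reply is drawn from
# pre-written scripts, never generated. Content here is illustrative only.

# Scripted responses, assumed to have been written and reviewed in advance.
SCRIPTED_RESPONSES = {
    "anxious": "It sounds like anxiety is showing up right now. Would you like to try a short breathing exercise?",
    "low_mood": "Thanks for sharing that. Can you tell me more about what has been weighing on you?",
    "default": "I'm here with you. Could you tell me more about how you're feeling?",
}

# Simple keyword rules mapping user input to a scripted response key.
KEYWORD_RULES = {
    "anxious": ["anxious", "panic", "worried"],
    "low_mood": ["sad", "down", "hopeless"],
}


def scripted_reply(user_message: str) -> str:
    """Return a pre-approved scripted response chosen by keyword rules."""
    text = user_message.lower()
    for response_key, keywords in KEYWORD_RULES.items():
        if any(keyword in text for keyword in keywords):
            return SCRIPTED_RESPONSES[response_key]
    return SCRIPTED_RESPONSES["default"]


if __name__ == "__main__":
    print(scripted_reply("I feel really anxious tonight"))
```

Because every possible reply exists in advance, clinicians can review the full set of outputs before users ever see them, which is what gives a scripted design its safety and consistency.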
However, since the Woebot's development, generative models have become commonplace. Speaking to the advantages of AI, the developers found that generative AI is well suited to scenarios where open-ended conversation is needed, not least because it is competent at roleplay. Furthermore, people were found to be more likely to self-disclose when talking to an AI bot, given the lack of judgment and stigma, as well as a sense of anonymity.
One 2018 study published in the Journal of Communication notes that self-disclosure to a chatbot can be as beneficial as self-disclosure to a human. Notable too is that the act of self-disclosure itself benefits people by reducing their stress levels.
Ethical considerations and the Woebot
There are nonetheless some ethical concerns surrounding the Woebot, as well as other mental health chatbots that are available to the public, particularly around privacy and control. The question of who has access to transcripts hovers in the shadows of every interaction. Similarly, the question of who controls what is fed to the AI to make it respond in a certain way is unsettling.
There is also the worrying possibility of creating a bot so perfect and hyper-responsive that it becomes impossible to leave. Such bots should serve human well-being rather than foster dependence on an impossibly perfect companion, yet the question of how this should be done remains unanswered.
Innovation in psychotherapy
Overall, progress in traditional psychotherapy has been scarce, standing in stark contrast to the onslaught of progress in tools like AI. However, to address global mental health challenges, such technologies are certainly worth experimenting with. A WHO report published in 2024 noted that 20% of high schoolers in the US have considered suicide.
Involving innovative technologies like AI in combating this increasingly severe problem is not necessarily advocacy for the replacement of human therapists, but rather a recognition of the help and support that AI can give to both professionals and patients, as demonstrated by tools such as the Woebot.