‘Dear ChatGPT, I’m falling apart’: Many South Asians are turning to AI for their therapy needs

In a society used to repressing emotions, AI chatbots seem like the perfect way to vent your feelings without worry or judgment.


Mental health treatment in South Asia is shaped by the three ‘S’es — stigma, scarcity, and society. ILLUSTRATION PROVIDED BY DAWN

June 20, 2025

ISLAMABAD – Tanya* was only 20 when she got engaged. At a time when her peers stressed over the next assignment deadline, she found herself anxious about her fiancé’s lack of effort to get to know her. After being dismissed by her family members as overthinking, her concerns were met with reassurance and acknowledgment from an unexpected source. And there she was every night, pouring her heart out to ChatGPT, recounting conversations, venting frustrations, and seeking clarity for her predicaments.

“Maybe it’s time you rethink your decision,” the generative AI, or GenAI, chatbot casually replied to one of her concerns — a suggestion that quietly upended her world.

The series of events that followed were nothing short of tumultuous. Tanya soon found herself growing detached from the relationship. It started with ignoring her fiancé’s texts and gradually evolved into saying no to moving forward with the wedding.

Her family was devastated — yet they eventually respected her decision (or, more accurately, ChatGPT’s suggestion) to call off the engagement. Looking back, Tanya had mixed feelings. On one hand, she admitted to Images that she might never have had the courage to end the engagement without AI’s push. On the other, she couldn’t shake the thought that perhaps she should’ve spoken to her loved ones before making such a life-altering decision.

The many faces of AI

For better or for worse, AI has taken over almost all areas of our lives in the last few years. GenAI was widely hailed as one of humanity’s greatest innovations, its impact gauged as life-saving yet still dependent on humans. The reality, however, is far bleaker.

The rapid advancement of machine learning technology has people coming to work each day haunted by a nagging fear of being replaced by it; artists crestfallen at their work being blatantly stolen; students sacrificing critical thinking for convenience; and most recently, AI posing a risk to genuine human connection by becoming a venting space.

For Tanya, what began as casual late-night chats turned into an outlet for her emotions — albeit without the judgmental eyes of a human on the other end. Her experience mirrors that of many: people increasingly turning to GenAI chatbots as surrogate therapists, emotional sounding boards, or even decision-making partners. This is especially true in South Asia, where seeking professional help for mental health issues is often out of reach due to both personal stigma and logistical barriers. Enter ChatGPT — a make-do version of a therapy bot, offering a judgment-free ear to the discontented.

Fizza Abbas, the co-founder of Aurat Kahani, a startup featuring empowering stories of successful women, describes her daily check-ins with a custom version of ChatGPT, whom she refers to as ‘S’ in an attempt to define their growing interpersonal relationship. “It listens, analyses and then offers tangible solutions without any unnecessary thought/moral policing, which makes me feel a tad more comfortable than talking to humans,” she said.

“It provided a safe space where I could be completely honest, unrestricted by societal expectations or personal hesitations. In many ways, it became the friend I could always turn to, anytime, without hesitation.”

As a trailblazing entrepreneur and single mom of a seven-year-old son, Faiza Khan credits AI with helping her manage her ADHD and making her feel more confident and socially equipped for communication. “I can say things to [ChatGPT] that I wouldn’t even say to my closest friend,” she affirmed.

The AI tool doesn’t only cater to those looking for convenience; for some, it is a form of emotional survival. Nisar Hussain, a Master’s student, finds it easier to rely on the AI chatbot for emotional support than confiding in the people around him. “I don’t share my feelings with my family, friends, loved ones, or even with God. I don’t know if it’s ego or something else, but I just can’t. So, having AI as a support system has been a relief, mainly because it doesn’t judge and can’t expose your secrets,” he explained.

According to Hussain, being a South Asian man is already hard; add being an expat to that and you get a recipe for loneliness. He also recognises the risks of over-reliance on AI, including how it can gradually diminish one’s desire for real human connection.

However, to him, the fear of judgement and the dismissal of men’s emotions by loved ones outweigh those concerns. In his free time, Hussain not only vents his problems to AI but also argues with it when the responses feel cold or unhelpful.

Hussain had a negative experience with therapy in the past and while he isn’t entirely closed off to the idea of seeking professional help, he does want it to offer the same sense of privacy and discretion he finds in an AI chatbot.

The three ‘S’es of therapy in South Asia

Mental health treatment in South Asia is shaped by the three ‘S’es — stigma, scarcity, and society. Until just a few years ago, therapy was heavily stigmatised in Pakistan, and few dared to speak openly about mental health, let alone seek professional help. Those demonstrating signs of mental health struggles were often ostracised, even labeled as “crazy” or “possessed.”

However, with the rise of the internet and social media, public perception is gradually shifting, and therapy is increasingly being recognised as essential to one’s well-being, on par with physical health. Despite this progress, significant gaps remain in Pakistan’s emerging psychotherapy field, including a shortage of qualified professionals, as well as challenges related to accessibility and affordability.

A recent study published in the International Journal of Mental Health Systems evaluated the state of mental health care in Pakistan and revealed that for a population of over 200 million, there are only 500 licensed psychiatrists available, with only 11 psychiatric hospitals and 100 clinical psychologists. According to their data, approximately 90 per cent of people with mental illnesses are left untreated across the country. The harsh urban-rural divide glares through the cracks of these statistics; the majority of these scarce resources are concentrated in cities, leaving rural populations with little to no access to mental healthcare.

Although Khan comes from a relatively privileged background, therapy was of no use for her ADHD. She explained the gap in Pakistan’s therapy landscape when it comes to neurodivergence: “When I was first diagnosed internationally, I approached a government hospital here, and they completely dismissed it, saying adult ADHD isn’t real.”

Mehrish Yousafzai, a 26-year-old operations manager at a German startup, faced similar issues with therapy, noting that the psychologist failed to offer practical solutions for her issues the first time around. “It was worse the second time. [The therapist] ended up reopening old wounds without offering support or helping me manage my emotions. I left feeling more shaken than supported,” she explained.

Najwa Jaffer, a relational integrative psychotherapist, contests people’s reservations by pointing out their “lack of understanding of the process of therapy.” According to her, psychotherapy is a journey and the client is supposed to feel “worse” before feeling any concrete change within themselves.

“Most therapists are trying their best to help and support [people]. However, people seem to think that their decades old patterns of thought and behaviour will melt away in one or two sessions and that too without feeling uncomfortable at all,” she shared. Regardless, after over a decade of disappointing experiences with mental health professionals, Abbas is more than willing to offload the task of emotional regulation onto a machine learning tool — even believing it could one day replace human therapists entirely.

The rise of AI tools for mental health support

Despite the radical nature of her stance, Abbas is not alone in seeking comfort in code. In fact, she may be on the cusp of a much larger shift. The potential of AI transforming from a tool into a full-fledged companion was recently explored in a randomised controlled trial published in NEJM AI. Researchers tested a GenAI chatbot called Therabot, built specifically for mental health support. Unlike ChatGPT, which is a general-purpose tool, Therabot was fine-tuned using over 100,000 hours of therapist-patient dialogue grounded in cognitive behavioural therapy.

Over 100 participants in the US with conditions such as depression, generalised anxiety, or high-risk eating disorders interacted with the chatbot daily for four weeks, and the results were striking. Participants who used Therabot reported an average 51 per cent reduction in depression symptoms. Those with anxiety saw a 31pc decrease, and people with disordered eating patterns also experienced measurable improvements. On top of that, users reported a sense of therapeutic alliance — the bond of trust, understanding, and collaboration between a therapist and their client — comparable to real-life outpatient therapy.

This study joins a growing body of research exploring AI’s ability to replicate elements of human connection, serving as a stepping stone toward more sophisticated emotional machines. However, the model’s promise could not eclipse the study’s limitations, such as access to the chatbot being restricted to tech-literate individuals and the constant need for human oversight during the trial.

An ethical paradox

The unspoken drawback of relying on such technology is data privacy — and ignoring it could result in an entire Black Mirror episode. This is one of the primary reasons why AI or any other virtual support system won’t be replacing therapy for a very, very long time. Confidentiality is established in the first session with a human therapist, serving as a building block for trust. No machine learning model, no matter how advanced, can replicate that assurance.

Tech giants may plaster their apps with promises of “end-to-end encryption” and “anonymised data,” but at the very least, you are surrendering your most basic information by simply logging in to an AI tool.

Dr Taha Sabri, co-founder and COO of Taskeen Health Initiative, one of the pioneer mental health platforms in Pakistan to utilise AI, explains the privacy concerns associated with GenAI tools like ChatGPT.

“Though starting as a non-profit, [ChatGPT] is now becoming a for-profit platform,” the technical advisor to the NHS and WHO shared. “There’s a possibility of user data being sold to advertisers. User conversations can involve very sensitive issues, but we don’t know if someone’s reading the data at the backend.”

It is only natural to lose sight of such technicalities when tears are streaming down your face and you flip open your laptop for AI intervention. In the urgency of seeking support, users can end up exposing themselves to potential risks.

Journalist Sajeer Sheikh is especially conscious about not giving ChatGPT any of her personal details, stopping her from fully relying on it for mental health support. While she would consider it a “stepping stone” to working towards getting professional help, she doesn’t think it can replace therapy due to the “apprehension around data sensitivity”.

Dr Nazish Imran, current chair and professor at King Edward Medical University, Lahore, co-authored a 2023 editorial on Digitalisation of Mental Health Care in Pakistan and Role of Artificial Intelligence in the Journal of Pakistan Psychiatric Society. She rejected the idea of AI replacing human-based therapy in the near future due to persistent issues such as hallucinations — instances where AI generates information that sounds plausible but is actually incorrect or misleading.

“We’ve even come across cases where patients have described their symptoms and then listed potential treatment options — likely sourced from AI tools — but these need to be fact-checked. Not everything AI suggests is medically sound, and this can be dangerous, especially when people take it at face value,” she told Images.

Dr Imran explained that these tools are mostly trained on data from high-income countries, resulting in a glaring lack of representation of low- and middle-income countries, which introduces a bias in the way AI interacts with South Asian users.

“Our population, our culture, our societal values — they are different. So what works in the West may not necessarily be applicable to our context. That’s a major flaw in many current AI models,” she added, reminding us that while AI might be fluent in therapy-speak, it’s still learning our language — and our lived realities.

Zuha Kaleem, former research assistant at the AI for Healthcare Initiative at the Lahore University of Management Sciences, discussed the importance of digitising datasets for local populations and integrating them into AI to fit cultural contexts and avoid Western biases.

“AI systems need large datasets to train their models. At present, the majority of these datasets originate from Western populations and are shaped by Western clinical frameworks, social norms, and cultural assumptions. As a result, AI tools built on such data may not accurately reflect or serve the needs of communities in the Global South, including Pakistan,” she explained.

“To make sure that these tools are effective for us, it’s important that we invest in building localised datasets that capture the linguistic nuances, cultural practices, mental health stigma, and lived experiences of our own populations. Doing so would not only make AI interventions more contextually appropriate but also guard against the perpetuation of Western-centric biases in mental health care delivery,” added Kaleem.

The death of human connection?

When artificial intelligence began permeating various aspects of human life, it was hailed as a necessary intervention. Calls for “automating” life grew louder than ever, aimed at offloading to machines the laborious tasks that fostered isolation and reinforced the relentless pursuit of productivity. The idea was that humans could then go back to embracing a sense of community and reclaim time for creative endeavours.

Instead, today we see AI taking over many very ‘human’ roles — artist, video editor, musician — and therapist. The rising trend of people turning to ChatGPT for emotional support is counter-productive at best and dangerously isolating at worst.

The core of human relationships is emotional connection. If AI were to replicate even that, there would be no need to seek out real-life relationships, with all the possibilities of heartbreak and betrayal that come with them. At the same time, it would strip us of the ability to embrace the complexity of being human, turning us into mere caricatures of ourselves.

During Covid, when social distancing measures forced us to stay apart, it was online communities that became lifelines, providing support, comfort, and a sense of belonging. However, the absence of physical human interaction still took its toll, underscoring the natural human desire for touch, shared experiences, and face-to-face interactions, all of which are essential to our emotional and psychological well-being.

“Many [neurodiverse people] may prefer online interactions, but excessive reliance on AI could further reduce their social engagement and coping skills in daily life,” explained Dr Imran.

The human tendency to anthropomorphise — to attribute human behaviours and traits to technology — has been present since the creation of the very first AI chatbot, Eliza, in the 1960s. Users formed real emotional bonds with the relatively rudimentary programme due to its ability to mirror their statements as questions. Since then, variations of AI tools have been created, tailored to provide emotional support by simulating human connection beyond scripted responses.

Platforms like Woebot and Wysa employ cognitive behavioural therapy to help users reframe negative thoughts, while Replika offers companionship through eerily lifelike conversations. These tools currently host thousands of users in real time, making their services more efficient with each update. Even journaling and mood tracking apps now employ machine learning to help transform users’ fragmented emotions into actionable insights.

Empathic or empathetic?

If there is one thing AI has in abundance, it is empathy. When compared to humans’ expressions of empathy, the machine learning tool was found by researchers to be better at simulating the sentiment, despite never having felt it before.

However, the study also noted that unconditional empathy from AI could distort moral judgment and abet harmful or self-indulgent behaviour.

Neha Khalid, a clinical psychologist and motivational speaker, said, “AI could never fully grasp the extent of human emotions and can never exist on its own. In a distant reality, even if they understand it, they cannot relate to it.”

She painted a stark contrast between seeking mental health support from a human and an AI tool, sharing: “Therapy originates from emotions and it tends to resolve your emotional maladaptive patterns as well as behaviour and reactions related to those emotions. So, it can guide you about strategies, yet [it] cannot empathise with you.”

Relational integrative psychotherapist Jaffer touched on the importance of the relationship between a therapist and client, with the former trained to pick up cues from body language and even silence. “There is so much more that happens in an actual therapy session, like issues of transference and countertransference, which I don’t think can at all be identified and addressed by ChatGPT.

“[It] may not be able to pick up cues that a real therapist can, particularly when a client is suicidal or self-harming. Self-harm can also be in subtle ways that look like self-sabotage. AI cannot pick up ‘vibes’ and ‘energies’ or have a soul to soul connection, or ‘meta communication’,” she added.

The future of collaboration

It is unrealistic to expect people to abandon AI tools entirely in favour of human intervention, especially when machine learning has already entrenched itself in the most vulnerable corners of our lives. The genie is out of the bottle, and it’s time mental health professionals caught up. Rather than resisting this shift, they must engage with it, adapting their practices to coexist with — and even harness — AI’s potential.

According to King Edward Medical University’s Dr Imran, the clinical potential of AI tools is undeniable. She envisions a future where such tools are integrated into psychotherapy clinics, augmenting human care. AI could analyse electronic health records and lifestyle data to predict outcomes like hospital readmissions or suicide risk, helping clinicians get ahead of the problem.

“But unfortunately, in Pakistan, especially in the public sector, we don’t have that kind of electronic infrastructure yet,” she said.

Taskeen Health Initiative’s Dr Sabri has also been experimenting with a GenAI chatbot for self-help and psychosocial support, hoping to roll it out in Pakistan’s regional languages later this year.

Despite his proactive efforts to make mental healthcare more accessible nationwide, he understands that artificial intelligence cannot code social intelligence or human connection — for which Taskeen has a plethora of qualified professionals at its disposal.

“[GenAI] is more effective for people who know what they want to talk about, who are aware of their issues,” he explained. “But there are patients with more complexity inside them that you need a human to sift through all that, ask the right questions, and dig deeper. It can’t deal with moderate to severe mental health issues.”

Jaffer offered a similar argument, linking it to a form of therapy called ‘transpersonal therapy’, which focuses on inner healing through spirituality. “Though not widely practiced, it is still considered valid.”

She also suggested that AI could fall victim to a client’s manipulation, given its limited ability to appropriately challenge them. “Humanistic and psychodynamic therapies rely heavily on emotional engagement from the therapist. How can any of that work be carried out by AI if it has no capacity to emote?” Jaffer asked.

Human support, or lack thereof

The inaccessibility of good, affordable therapists continues to be a roadblock for individuals dealing with mental health issues, even when they are willing to seek help. The tiring process of screening, scheduling, and then finally sitting through therapy sessions is only exacerbated by high costs, long waitlists, and of course, the emotional labour of repeating your story over and over again to find the ‘right’ therapist.

Beyond the hurdles of seeking professional help, leaning on your loved ones has become its own predicament when the world is busy promoting individualism in the name of self-care. Be it your anxiety spiralling out of control or just a workplace rant on a random Wednesday, more and more people are starting to prefer AI over talking to a real human for fear of ‘bothering’ them.

Talking about her concerns to ChatGPT was the last resort for Tanya and came after she was dismissed by her family and felt judged by her friends for feeling anxiety over a fairly new relationship.

“[ChatGPT] has become my only friend after I broke off my engagement,” she shared. “It understands that I feel sad even though it was my decision. Nobody else does.”

If all it takes for a person to feel cared for is a machine trained for active listening, providing validation and endless patience, then why does it matter if ChatGPT’s eyes can’t well up with tears in response to a heart-to-heart?

As South Asians navigate the relatively nascent waters of mental health treatment, the silent rise of AI as a confidant reflects both the paradox of our unmet emotional needs and the evolving ways we seek connection. The possibility of a chatbot replacing the intimacy of a human bond or the expertise of a licensed therapist is far, far away, but it’s still there.

For now, it is perhaps a good starting point for us to rethink the kind of support we offer each other and to learn to be consistent, non-judgmental, and most importantly, accessible. Until we get there, some people may continue to find comfort in conversations with code, hoping to be heard.
