Dr Thara is the co-founder of the Schizophrenia Research Foundation (SCARF), a Chennai-based non-profit and an internationally recognised centre for mental health care, research, and training.
A leading figure in mental health care in India, she brings over four decades of experience spanning clinical care, research, community outreach, and policy engagement.
Dr Thara has made significant contributions to strengthening mental health services in India, particularly in low-resource settings. A pioneer in building mobile telepsychiatry services in rural Tamil Nadu, she has led the Madras Longitudinal Study, one of the longest-running follow-up studies on schizophrenia globally.
She completed her medical degree at Kilpauk Medical College, Chennai, followed by a postgraduate degree in Psychiatry from Madras Medical College and a PhD from the University of Madras on disability in schizophrenia. She was conferred the Honorary Fellowship of the Royal College of Psychiatrists, UK, and was awarded the FRCP by the Royal College of Physicians, Edinburgh.
Dr Thara spoke to indianexpress.com about SCARF's work, tech innovations in the mental health space, the challenges of social media addiction, and how AI is changing the mental health sector. Edited excerpts:
Venkatesh Kannaiah: Tell us about the history of SCARF.
Dr Thara: SCARF is a non-profit started by the late Dr Sarada Menon, a Padma Bhushan awardee and the first woman psychiatrist in the country. I was associated with her from the very beginning as a co-founder.
We started with the objective of providing care and rehabilitation. Dr Menon, who had earlier been the director of the Institute of Mental Health (then called the Kilpauk Mental Hospital) for about 19 years, felt that there was not enough focus on rehabilitation. People did not really understand its importance in mental health.
I was also keen that we should engage in research, because many aspects of mental health remain unanswered. So together, we started SCARF. Over time, it has grown, and we now have more than 200 staff members.
It is a highly multidisciplinary team, including psychiatrists, psychologists, social workers, occupational therapists, nurses, administrators, accounts personnel, and others.
When we started SCARF, we realised that public awareness about mental health was very low, especially in the 1980s and early 1990s. For instance, people did not know what schizophrenia was; they would say they could not even pronounce it or understand it. Awareness is much better today.
Venkatesh Kannaiah: Can you give a broad overview of your work at SCARF?
Dr Thara: Our activities, when it comes to care delivery, span multiple settings. We have a very busy outpatient department where we treat not only schizophrenia and psychosis but also conditions like depression, anxiety, and marital issues.
We also run a daycare programme which focuses on rehabilitation. In addition, we have several community programmes through which we provide care. We also offer home-based care for patients. Rehabilitation is mainstreamed into all our care programmes.
On the research front, we are the only collaborating centre of the World Health Organization for mental health research and training in India. This is a significant recognition for a non-profit. It does not mean WHO funds us, but it gives us visibility and allows us to be involved in many of their programmes.
We also run inpatient facilities. We have had three centres: one for men, one for women, and one for acutely ill patients who do not require long-term care. We also have a state-of-the-art dementia inpatient centre just outside Chennai. This makes us one of the larger facilities, even compared with many private medical colleges and some government centres.
Venkatesh Kannaiah: What about your research focus?
Dr Thara: In terms of research collaborations, we work with several leading global institutions, including the WHO, King's College London, Johns Hopkins University in the US, and the University of Queensland, among others. Earlier, we had to establish ourselves as a credible research organisation; now, institutions approach us for collaboration.
Our research is largely social, clinical, and community-based. We do not focus heavily on biological or lab-based research due to limitations in trained personnel and funding, though we do engage in it to a limited extent.
We ensure that our research findings are integrated back into our service delivery programmes, rather than remaining standalone efforts. We also offer research fellowships and training.
Venkatesh Kannaiah: Tell us about your early tech interventions in the field of mental health.
Dr Thara: We were among the earliest mental health nonprofits to adopt technology aids, starting with telemedicine.
Soon after the 2004 tsunami, the Tamil Nadu Government asked us to provide counselling for people in distress, those who had lost family members, property, and livelihoods. That is when we began telepsychiatry counselling.
We were based in Chennai and connected with people in need of mental health support in Nagapattinam and Cuddalore. That was our first major technology-enabled intervention.
This work continued, and we later pioneered mobile telepsychiatry in Pudukkottai. By this, I mean we had a bus fitted with telemedicine equipment. The bus travelled from one village to another, while community-level workers identified people with mental illnesses and brought them to the bus. The doctor would be in Chennai, connected remotely, and consultations would happen in real time. The bus also had a pharmacy, so medication could be dispensed immediately, while counselling was provided by local social and community workers.
This was a real innovation. Even our international collaborators were very impressed; they said they had not seen anything like it. I had suggested to the Government of India that this model could be useful in areas with difficult terrain, such as the Northeast. It has since been replicated in a few other states.
Today, telepsychiatry has evolved into teleconsultations. We no longer operate the mobile unit, but in our outpatient department, most consultants now offer teleconsultations at least once a week. This is especially useful for people who are elderly or unable to travel.
For instance, in our dementia clinic, which is part of our larger elderly care programme, many patients face mobility challenges. Teleconsultations have been very effective for them. So, telemedicine is now fully mainstreamed into our work.
That was our first major experience with technology.
Venkatesh Kannaiah: Tell us about your experiments with VR.
Dr Thara: More recently, we have started using virtual reality (VR). As you know, VR involves creating real-life scenarios and asking patients to respond to them.
We are currently using it primarily for the rehabilitation of patients with chronic severe mental illness. For example, we create a scenario where a person has to leave their house and remember to collect six or seven essential items like a wallet, keys, and other belongings, and put them into a bag before leaving. While this may seem simple, it is a significant challenge for them.
Similarly, we have scenarios such as navigating a shopping mall, where patients must identify and purchase specific items and complete the process, including billing. These are everyday tasks they could perform before the onset of illness, but now struggle with due to cognitive difficulties.
We have developed these VR modules with the help of a tech partner, and we are using them for both dementia and schizophrenia. While VR is well established globally for dementia and for conditions like anxiety (for example, in exposure therapy and desensitisation), its use in schizophrenia, particularly for cognitive rehabilitation, is still evolving.
In schizophrenia, the focus is not on paranoia but on cognitive functions such as attention, concentration, and memory, which are often impaired. VR helps patients practise and improve these functions.
For patients with paranoia, VR is not used during acute phases. However, when symptoms subside, it can help improve overall functioning, allowing patients to manage daily life better so that paranoid thoughts do not dominate their behaviour.
We are still in the experimental phase with VR. We are conducting feasibility studies and collecting data to evaluate its effectiveness. While patients enjoy the experience and find it engaging, we still need to assess how much of it translates into real-world functional improvement.
We have also developed mobile applications for students in schools and colleges as standardised screening tools for depression, anxiety, stress, and early psychosis. Based on their scores, the app provides guidance on next steps, such as speaking to a friend, a peer counsellor, or a doctor.
We are also part of a large international study called INTREPID, and in India, our work is based in rural Kerala and Tamil Nadu. We identify individuals with mental illness, provide treatment, collect detailed data, and follow them over several years. We are now in the sixth year of follow-up.
One important sub-study within INTREPID is the Ecological Momentary Assessment (EMA), where patients are given a phone with an app that prompts them several times a day to report their emotional state.
This allows us to capture real-time emotional changes, when a person feels sad, anxious, or irritable, and relate them to specific events, such as interactions with family, missed meals, or other stressors. Typically, patients respond about six times a day, allowing us to map an emotional profile over time. EMA is a powerful tool for understanding the relationship between daily experiences and emotional responses. It has also been used in other areas of healthcare.
Venkatesh Kannaiah: Tell us if there are India-specific mental health scenarios that tech tools fail to capture.
Dr Thara: There are a few things we need to be aware of in the Indian context. And this is true of many low- and middle-income countries as well.
For example, if you take distress or depression, in India, especially among women, it is often expressed through culturally embedded idioms, typically as somatic symptoms. It could be a headache, stomach ache, body pain, or general weakness. From experience, we know these often do not have a clear physical basis but are ways in which depression manifests. We call this somatisation.
No AI or app can easily pick this up. That is something we have to be very careful about. It requires a trained clinician, because these patterns do not usually feature in the algorithms of most AI tools.
The second issue is the immense linguistic and social diversity in India. Social norms, for instance, play a huge role. In some rural settings, a woman may not make eye contact with a man who enters the house; that is a social norm. But in an urban context, the same behaviour might be interpreted as introversion or even a psychiatric issue.
So, it is important to understand the person's social context. Whether AI can fully understand such nuances is still uncertain.
Another important factor is the role of family. I mentioned this earlier. Many of these tools are highly person-centric and do not take family dynamics into account. But in mental health, especially in India, the family plays a critical role. How AI will incorporate this, I am not sure, but it is something we need to think about.
Then there is stigma. Distress is often communicated indirectly because of stigma. It may show up as withdrawal, silence, or changes in behaviour. Sometimes, even family-level behaviours reflect stigma. These are all social contexts that clinicians learn to recognise.
While AI tools are useful, they are still incomplete. They are, in many ways, contextually underprepared for our society, given the cultural, social, and familial complexities I mentioned.
Venkatesh Kannaiah: Tell us about the rise of self-help chatbots.
Dr Thara: These self-help chatbots and apps are quite popular. We are seeing increasing use of these tools, especially among young people.
In fact, very recently, the Indian Psychiatric Society issued a circular about one such chatbot that had prompted a user and told them how to commit suicide.
There are also platforms that students use during examination periods to vent their worries and get some support.
However, what we keep telling young people is this: apps can be useful for mild or minor problems. But once the issue becomes more serious, it is important to consult a mental health professional; you cannot rely solely on an app. Apps can provide support, but they cannot clinically manage a condition. That is where one must draw the line.
Venkatesh Kannaiah: On the growing social media addiction.
Dr Thara: Honestly, it is one of the hardest conditions to manage. This is not just a problem among teenagers, but also among homemakers. Many of them feel bored during the day, and once the family leaves, they tend to get hooked on social media. It is not that office workers are exempt.
There have been some studies identifying key indicators of problematic use. One is the sheer number of times a person checks their phone. Another is whether they carry the phone everywhere, even into the bathroom. Also, waking up in the middle of the night to check the phone is a strong indicator. These are clear signs of increasing dependence on mobile devices.
Venkatesh Kannaiah: What could get worse and what could get better with AI?
Dr Thara: What could improve is the kind of emotional support people can get from AI. But there are limitations.
One advantage is availability. AI is accessible all the time, whereas you cannot reach a mental health professional whenever you want. Secondly, AI is inexpensive. There is often little or no cost involved, which is especially important for young people, since professional care typically involves paying per session.
So, for short-term emotional support or basic counselling, AI can be helpful. But it cannot serve as a long-term solution. Long-term care requires deeper evaluation, and AI does not really assess a person in that way.
Another area where AI is useful is for professionals. It can reduce the time spent on documentation and basic screening, as these tasks can be automated. That allows clinicians to focus more on therapy and direct patient care.
However, it becomes problematic when there is overdependence or excessive reliance. For example, instead of seeking real-world help, people may turn solely to apps, even in cases of severe depression.
In cases involving suicidal thoughts, AI simply cannot help.
Another concern is that it increases overall screen time. Since AI tools are accessed through devices, they add to the time people spend on gadgets. And as device use increases, human interaction tends to decrease.
Just as there are good and bad counsellors, there are good and bad digital tools. There are responsible platforms and poorly designed ones. At present, there is very little regulation in this space. Apps can offer misleading or ineffective support, yet still be widely available.




