AI for the Mentally Unstable

Approximately 16% of the world’s population is currently experiencing some form of mental health crisis.

This may not seem like a statistic worth jiggling your wig or ruffling your shirt over, but the numbers are rising, and part of the reason is that mental disorders remain exceptionally hard to diagnose.

Pinpointing the anomaly in a person’s brain that is responsible for their abnormal behaviour is not as simple as ABC. It is not like detecting malaria and prescribing drugs to clear it from the patient’s system.

It comes down to this: most of what we know and understand about these disorders is deduced from observation, the common behavioural patterns and symptoms generally associated with them, which medical professionals compile into diagnostic manuals.

And although this method of diagnosing mental disorders has proven effective in many cases, it remains inconclusive. There is ample room to question its accuracy, and ample room for improvement as well.

But the saving grace for psychiatry may be that medicine is promising terrain for technology, in this case Artificial Intelligence, which could prove a huge help in treating mentally unstable patients.

Artificial intelligence, in case you don’t know, is the branch of computer science concerned with creating intelligent machines that can carry out tasks which normally require human intelligence.

What this means for psychiatry and its wards is that:

1. There will be faster diagnosis of mental conditions, which will in turn lead to better treatment recommendations for mental health patients.

This could be achieved through AI-powered apps that monitor patients, especially remotely, and alert medical personnel whenever there is even the slightest change in their health patterns.

This would help clinicians catch clues they might otherwise have missed during the physical examination of those patients.

Such clues can then be combined with their own clinical judgment to draft the right treatment plans.

Mood-tracking apps like Wysa and MoodSpace, for instance, can aid cognitive therapy. Other apps can detect mental or emotional distress instantly, either by having you say random things into your phone’s microphone so they can analyse your voice and speech patterns, or through wearable devices that assess your everyday bodily activity.
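To make the monitoring idea concrete, here is a toy sketch (not the method used by any real app; the function name, scores, and threshold are all illustrative assumptions) of how an app might flag a sharp dip in a patient’s self-reported mood relative to their recent baseline, the kind of change it could surface to medical personnel:

```python
def flag_mood_anomalies(scores, window=3, drop=2.0):
    """Return indices of days whose mood score falls more than `drop`
    points below the average of the preceding `window` days."""
    flagged = []
    for i in range(window, len(scores)):
        baseline = sum(scores[i - window:i]) / window  # recent average
        if baseline - scores[i] > drop:                # sharp dip detected
            flagged.append(i)
    return flagged

# Daily self-reported mood on a 1-10 scale; day 5 shows a sudden drop.
daily_mood = [7, 8, 7, 7, 8, 3, 7, 7]
print(flag_mood_anomalies(daily_mood))  # → [5]
```

A real system would of course use far richer signals (speech, sleep, activity data) and a trained model rather than a fixed threshold, but the principle is the same: compare new observations against the patient’s own baseline and alert on deviations.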

2. Patients can have round-the-clock access to electronic mental health care.

Getting an appointment with a therapist or psychologist is like playing the lottery, especially when you don’t live close to any mental health professionals.

You might get lucky enough to score one, but chances are that you might end up regretting your session.

Artificial Intelligence can turn this situation into a win-win by ensuring round-the-clock access to electronic assistance and self-help services until you are able to make an appointment. Many AI-powered apps can help you document and assess your moods, and help you figure out what to do in a panic situation.

3. Elimination of the discomfort associated with talking to a therapist.

This matters most in the first few sessions, which can be among the most uncomfortable moments of a person’s life. Most people do not feel relieved to unburden themselves or lay bare their feelings to a total stranger, for fear of judgment or of being seen as “crazy”.

But when the listener is an AI bot, opening up emotionally can feel easier.

Even with these ways in which Artificial Intelligence could thrive, there are still some areas of concern. One of them is the fact that:

1. AI isn’t human.

Psychiatric conditions differ from typical illnesses because they are tied to a person’s emotions and feelings, which can in turn have long-term effects on that person’s physical health.

For example, a depressed person can be driven to hurt or harm themselves physically, due to mental instability caused by chemical imbalances in the brain. Such situations demand deep emotional intelligence.

So how, then, can AI show empathy toward an emotionally distressed patient? How can it calm a panic attack? Mental health spans a world of disorders that require empathy and support, which a “robot” cannot possibly give.

And so it is hard to see how a machine could sensibly tell normal human behaviour apart from abnormal behaviour.

2. Privacy might become extinct.

Much of the algorithmic machinery behind Artificial Intelligence for mental health would require up-close, personal monitoring of people’s everyday lives.

With little to no regulation, such uncontrolled monitoring might itself induce paranoia in mentally unstable people, resulting in a “what should solve the problem becomes the problem” situation.

3. AI for mental health can become money-hungry.

There is a strong chance that as mental health firms see the profit potential in AI-powered apps or machines, their focus will shift from providing the best treatment for patients to monetizing in-app features as much as possible, for example by pushing people into overpriced monthly health packages that offer little real value.

Perhaps the only safeguard for Artificial Intelligence in the medical field is AI-human collaboration, where AI exists solely to assist mental health personnel.

AI would not try to take over their jobs; it would operate under protocols and guidelines that ensure safety, working side by side with human beings toward making life better for mental health patients, without compromising their privacy or their health.
