

A black box without empathy

THE ASIAN AGE. | NAVEENA GHANATE
Published : Oct 2, 2018, 12:15 am IST
Updated : Oct 2, 2018, 6:02 am IST

It has become imperative to discuss how algorithms are dealing with sensitive issues like suicide.

Doctors feel that when a person is asking questions related to suicide, it helps if the responses are more empathetic.

While AI should ideally predict suicidal intent and help prevent vulnerable people from taking their own lives, the recent suggestion Quora sent to a PhD scholar shows that we need to rethink these algorithms.

Social media platforms are said to be making positive changes in the lives of people around the world. But when it comes to suicide prevention, these platforms aren’t smart enough. A case in point is a tweet by Shehla Rashid, a PhD student at Delhi’s Jawaharlal Nehru University, in which she spoke of Premenstrual Dysphoric Disorder (PMDD), an extreme form of Premenstrual Syndrome (PMS) that can leave one feeling suicidal. A dejected Shehla searched for ways to commit suicide. She looked on Quora, which is known for its detailed answers, and the next day she received an email. She wrote, “Quora sends me an email asking if I’m still contemplating suicide, and that they’re here to help! In a world where algorithms will help you end your life if you want to end your life, it’s really important to share information about PMDD. (sic).”

It has become imperative to discuss how algorithms are dealing with sensitive issues like suicide. As suicide or anxiety are emotional issues, how can these platforms be made more sensitive?

Security researcher Anivar Aravind said, “Responding to such searches should not be an engineering decision. It needs social or psychological consultation, which is absent in most tech companies. These algorithms are black boxes: except for the company, nobody knows how the product is programmed. The output of an algorithm reflects the sensibility of the product manager who wrote the program, supplemented by the human bias of the developer or the company.”

However, sending emails based on searches has been the norm for many platforms. People who have such thoughts often tend to search in incognito mode. Shehla, however, having given up on everything, searched while logged into her account. Because of this, she got an email asking, “Still curious about which would be least painful death? Jumping off a building or jumping off a bridge?” Doctors feel that when a person is asking questions related to suicide, it helps if the responses are more empathetic.

Dr Diana Moneteria of the Hyderabad Academy of Psychology said, “When we do suicide prevention training, we teach people that asking others about suicide doesn’t increase the risk of suicide. But there is a caveat: how you ask makes a difference. If a search engine is sending machine-generated emails with no person involved, the question is ill-advised. A person would have offered help instead of giving ideas on how to end a life.”

People are suicidal only because they have a problem that they cannot solve.

“On social media, posting things like ‘I want to kill myself’ or doing a Facebook Live are signs of distress and of wanting help. It would have helped if the machine had said ‘go get some help’ instead of giving options on committing suicide,” she said.

Some tech companies have, however, adopted measures to prevent suicides: Facebook, Twitter and Instagram employ artificial intelligence to detect signs of suicidal intent and depression.

Using algorithms, users searching for a banned hashtag or specific words related to self-harm are redirected to support resources. Yet there are no guidelines on how to deal with such issues, and every tech company handles it in its own way.

Shehla Rashid

“If a search for suicide or killing is detected, the system should identify it, and the response should be backed by a human decision,” said Aravind. While Shehla’s search was on Quora, bigger platforms like Google display helpline numbers for such searches.

Quora must refine their algorithms
Shehla Rashid

“My suicidal urge had subsided as soon as my period set in. But when I saw the email from Quora, I was surprised and I felt uncomfortable. Such an email could reinforce suicidal tendencies among those who are depressed or in a bad premenstrual phase. At this point, I decided that I must highlight the issue.”

“Quora has a huge reach, especially among youngsters. As a social platform, they have a huge responsibility to filter promotional emails that could result in harm. Quora should take note of this and refine their algorithms further. It is a powerful example to demonstrate how technology that is designed rather innocently may end up doing harm. If someone is looking for ways to rape or murder someone, or ways to take their own life, Quora must filter such content and also fine-tune their algorithms to avoid sending out emails that reinforce such tendencies.”

Tags: quora, social media, shehla rashid