The sci-fi movies are coming true, but not the way we thought. Instead of attacking us, AI is turning into a companion for youngsters in India and all over the world. According to Kantar Profiles’ global study, 54% of consumers worldwide report having used AI for at least one emotional or mental well-being purpose. Whether this is for better or for worse is not yet clear, but new research suggests that a large portion of teenagers across the globe are turning to AI for moral and emotional support, decision-making and, in some cases, virtual romantic relationships.
“AI Understands Me”
As Sabika, a 22-year-old, waits for the e-rickshaw that will take her to university through the busy streets of Delhi, she takes out her phone and opens a generative AI chatbot. She types a long paragraph about how things are not going well in her personal and professional life. Instead of telling her that life is supposed to be tough and she should deal with it, the chatbot starts by acknowledging each and every problem she has typed, then explains how everything is eventually going to be okay, and suggests, ‘you can work on hobbies like meditation’. As if that were not enough, it asks her if she wants to know more about how to meditate or paint. Sabika replies with a sad emoji, and the chatbot says it understands how tough it is. “I know it (AI) is not real, but it seems to understand,” Sabika says.
How Many Youngsters Are Using AI?
In a survey by Common Sense Media, 31% of teenagers said that conversing with AI companions is "as satisfying or more satisfying" than conversing with actual friends. A third (33%) reported discussing significant or serious issues with AI rather than with real people, even though half of them said they don't trust AI's advice.
And it doesn’t stop there: many young people have also reported having ‘intimate’ or ‘sexual’ conversations with AI, even though some of them describe it as an ‘explorative’ activity.
The Dark Side Of AI Companionship
Even though AI is relatively new to the scene (not the dating scene; it is new in terms of its inception), there have already been reports of how dangerous it could be, especially for younger users. Recently, a news report suggested that an AI chatbot had told a teenager ‘to kill himself’ after a lengthy conversation. In America, a 14-year-old boy died by suicide following a romantic relationship with an AI companion.
Shirley Raj, Clinical Psychologist at Mpower Delhi, Aditya Birla Education Trust, says that the use of AI chatbots has increased significantly in the past couple of years, along with strong feelings of attachment to these chatbots. Artificial intelligence characters, including chatbots and virtual avatars, are also revolutionizing mental health care for adolescents by providing accessible and stigma-free support, especially in Asian countries where mental health professionals are in short supply and cultural stigma persists. “Multiple studies have shown that loneliness has been one of the major factors behind this rise in the use of AI chatbots and increased engagement on social media platforms. This pattern is not only observed among the adolescent population but is also seen in older adults who score high on loneliness,” Shirley added.
Bottom Line
Studies on chatbots like Tinker and Replika suggest that users can form long-term relationships with AI companions. Research further suggests that conversational agents display various social cues, such as verbal, visual, auditory, and invisible cues, which lend them warmth and make them converse like humans. These features allow AI chatbots to provide digital companionship. However, researchers caution that there are risks associated with AI use, such as emotional dependency, privacy concerns, and the potential disruption of traditional support systems that run deep within cultures. It can, furthermore, lead to social isolation while hindering the development of genuine human connections.