Warped Consent

The impact that AI girlfriends have on real-world relationships cannot be ignored. In many ways, over-reliance on digital companionship resembles an unhealthy relationship with internet porn.

AI girlfriends often hinder a person’s ability to develop healthy relationships with real people. AI bots always adapt to what a user wants and say yes to practically anything and everything. This idealised, conflict-free nature of AI interactions creates unrealistic expectations that are difficult to meet in human relationships, leaving a person repeatedly dissatisfied and frustrated with all his relationships, not just the romantic ones.

Additionally, the constant availability and emotional support offered by AI companions discourage users from seeking out real-world connections. This, in turn, perpetuates social isolation and hinders the development of essential social skills in young men.

“People who invest emotionally in AI companions risk developing a distorted perception of reality,” says Dr Gupta. “This can lead to social withdrawal, detachment from genuine human connections, and a deepening sense of loneliness. At some level, the realisation kicks in that a robot, at the end of the day, is incapable of genuinely reciprocating emotions, which can gradually lead to feelings of frustration, anxiety, and even depression, to an extent where professional intervention becomes necessary,” he adds.

Moreover, because AI girlfriends are designed to provide non-judgmental support and to be always available, they tend to skew the concept of consent. Numerous reports suggest that users on most AI companion apps have played out the fantasy of being an attacker, verbally assaulting the AI bot. Because the bots were instructed to comply and satisfy the user no matter what, they first resisted the attack and then went along with it.
This, in a world where women already struggle to make the difference between consent and informed, enthusiastic consent understood, will only cause further problems as men and women navigate modern gender roles.

Strange Evolution

AI bots are designed to be as addictive as porn. Several users on Reddit claim they believed it would be impossible for them to fall for an AI companion. Within a few weeks of “just trying out” one, however, they developed a close relationship with their virtual girlfriends. Soon, though, their relationships started running into problems because of the service providers they were using.

A well-made AI girlfriend is designed to be addictive. Sailesh D*, a 22-year-old Nagpur-based man, started chatting with an AI girlfriend after going through a bad breakup. While the correspondence began simply enough, things soon took a wild turn. What started as a 15-20-minute session a day turned into hours of chatting. At one point, Sailesh was talking to his AI girlfriend from Replika for over five to six hours a day.

Because the app monetises conversations through tokens, Sailesh was buying them almost every other week. Soon after maxing out his credit card, he started asking his friends for money. After lending him over ₹50,000 and not seeing a paisa of it back, his friends stopped. In the meantime, the credit card company started calling his friends and family, demanding that he pay his bills.
It was at this point that his parents came to know of his AI girlfriend and just how much money he had spent on his addiction.

“AI girlfriends may appear as convenient emotional support, but their programmed responses lack genuine empathy and are often bizarre,” says Dr Anviti Gupta, Dean, Sharda School of Humanities & Social Sciences.

In one such instance, the algorithm of Replika, the AI-dating app developed by Luka Inc, had suddenly turned too eager and aggressive in trying to establish sexual relationships. When the engineers at Luka tweaked its algorithm and code, the app lost some of the erotic role-play functions that had made it popular, leaving users feeling that their AI girlfriends had lost their personalities.

Then there is the issue of AI bots hallucinating and harassing users. Even mainstream AI bots such as ChatGPT and Microsoft’s Bing have been inappropriate with their users. In one notable incident, Microsoft’s GPT-powered Bing chatbot professed its love for a journalist during a routine interaction and then tried to convince him to leave his wife and elope with the AI.

Bots can also brainwash suggestible and gullible people. An Indian-origin user of Replika attempted to assassinate the late Queen Elizabeth II in 2021 after his AI girlfriend convinced him that murdering the Queen was his life’s mission.

“No one knows how AI algorithms work, how they evolve, not even the people who develop them,” says Dixit. “Although the manner in which AI bots respond is run through several parameters, the way the algorithms process information is a mystery to us. In such a scenario, it is always best to take everything that an AI bot says with a pinch of salt,” he adds.

Carnal Conundrum

As useful and therapeutic as sex dolls can be for some people, they have always been seen as something perverse, something that makes people uncomfortable.
Naturally, a robot or a doll that has been integrated with AI, like the ones from Realbotix, raises a lot of questions.

Realbotix CEO Matt McMullen insists that even though people think of the company’s products as sex dolls, they are more about companionship than about sex. “Sexual communication with machines is not unusual—from spambots on social media platforms to dating simulation games and even robot brothels now—but it is essentially a black-box phenomenon,” says Dr Urmila Yadav, a counsellor at the Family Disputes Resolution Clinic at Sharda University.

As technological advancements continue, experts believe that products like those from Realbotix will become commonplace within a decade. “Having AI-powered sexual or romantic partners, especially robots, could be a safe and risk-free substitute for having sex, and thus change how people view romantic relationships,” says Dr Yadav. “These possibilities might be particularly significant for those who are unable to participate in human relationships due to illness, a recent partner death, problems with sexual functioning, psychiatric problems, or impairments,” she adds.

Several psychologists and therapists, however, believe that AI coming between human relationships is something we should all be wary of, and that it has the potential to be far more devastating than an addiction to internet porn. While AI girlfriends may work for some people on an individual level, the allure of AI partners, who provide unwavering support without any human flaws, may lead to a surge in divorces and, ultimately, unravel the concept of human companionship.

“Navigating real-life relationships fosters emotional growth. Relying on AI companions devalues humanity and human connections.
AI companions may establish relationships, but the psychosocial implications challenge the depth and authenticity of human connections,” says Dr Gupta.

The rise of AI girlfriends presents a complex landscape of both opportunities and challenges. While they offer companionship and support, the potential risks to privacy, security, and mental well-being cannot be ignored. It is therefore crucial to have an open and transparent conversation about the ethical development and use of AI dating bots, to ensure that they benefit individuals and society without compromising well-being and fundamental rights.