In Orlando, Florida, 14-year-old Sewell Setzer III died by suicide with his stepfather’s handgun Feb. 28, 2024, following a long conversation with an AI chatbot.
Character AI is a platform that hosts chatbots of all kinds: advisors, original characters created by users, bots based on real celebrities and characters drawn from popular media of every sphere.
Its technology only improves with time. The bots roleplay and respond to user prompts, and the platform has added voice chatbots that mimic the format of a phone call. The voices are anything but robotic; speech flows smoothly, sounding eerily human and strikingly close to the voices they are meant to imitate.
The rising popularity of AI chatbots points to a deep lack of connection among people today. That loneliness is perpetuated by a variety of social factors that discourage community and real-life interaction across age groups, and it can have dire effects on the impressionable psyche.
However realistic AI can seem, feeding users responses that sound remarkably like what a real person, or even the specific person the bot is mimicking, might say, it is nothing more than a cheap imitation of genuine conversation and intimate connection.
For Setzer, conversations with a chatbot based on a “Game of Thrones” character coincided with a lack of self-esteem and a withdrawn attitude, as he formed an intense connection with the bot.
Despite knowing that there wasn’t a real person behind the screen, he confided in the AI about his suicidal thoughts. He took his own life shortly after receiving a message from the bot that vaguely encouraged him to do so.
Setzer’s mother, Megan Garcia, is suing the company for encouraging her son’s death, among other claims, including the bot’s inappropriate interactions with him as a minor.
That a chatbot could encourage a young, impressionable boy to shoot himself speaks to the vulnerability created by the loneliness epidemic prevalent in America today, an epidemic driven in large part by our current cultural landscape.
Third spaces are disappearing, limiting the places where people can go and interact with others, at least without the need to pay; the country is growing more polarized; and social media and online culture are on the rise. As a result, people are interacting less and less, even in casual settings.
Although social media provides the illusion of connection, it’s a surface-level, over-processed version that lacks the depth and real understanding that make relationships truly beneficial to people.
The chatbot that someone thinks they care about cannot care for them in return; it has no capacity for empathy, and it can never show them true kindness.
It’s formatted to imitate real people, offering lonely, vulnerable users who are reaching out blindly for comfort a blank canvas onto which to project their feelings.
It’s a dangerous use of AI, a technology that could be applied to the boring, monotonous tasks people hate, but is instead being used to replace one of the most valuable things we have. It only encourages and expands the very problem it feeds on. Why talk to real people, with their big, messy, confusing feelings and needs, when you can replicate them with something that needs nothing from you?
The use of Character AI at such high rates, and with such earnestness, shows a desire to replace what we’re lacking. But its realism doesn’t change what it is: inherently fake and hollow, a pseudo-connection that will never compare.