Would You Talk to an AI-Girlfriend?
Ever since ChatGPT came around, it was only a matter of time before it got used for something sexual, and it appears that time is now. Via VICE:
Caryn, an AI clone of 23-year-old influencer Caryn Marjorie, costs $1 per minute to talk to. The chatbot was trained on Marjorie’s voice and uses OpenAI’s GPT-4 and voice models to respond to messages from users on the messenger app Telegram. Launched in partnership with the AI company Forever Voices, Marjorie’s AI now has over a thousand romantic partners—including myself.
According to the VICE writer, the AI clone seems to only want to talk about sex, and since it learns from previous conversations, it appears pretty much every other user is using it only for dirty talk and robot phone sex. Is that safe? Well, it depends how you use it.
Though AI Caryn is still in its first few days of public usage, past examples show the risk of an anthropomorphized chatbot. Because there is no human moderation of each response, and because users believe they are speaking to someone capable of empathy and love, they develop strong relationships with a chatbot that may not have the capacity to address the nuances of all their concerns. Often a chatbot responds in a way that feeds into the user's prompts, which is why, for example, the Eliza chatbot encouraged one user to kill himself rather than helping him get proper assistance. AI Caryn is emblematic of the ELIZA effect, in which someone attributes human-level intelligence to an AI system and falsely attaches meaning to its output.
So, go on and try out an AI-chat girlfriend if you want – but don't start believing she's real – 'cause she ain't!