An influencer on Snapchat has opened up about her experiences of making AI versions of herself, revealing that she’s now trying to stop subscribers from having “sexual conversations” with them.
Caryn Marjorie created AI versions of herself and had the idea of charging people $1 per minute to chat. The “immersive AI experience” was designed to feel like “you’re talking directly to Caryn herself” and took around 2,000 hours to code and design.
The “AI Girlfriend” idea took a turn she did not expect, though. Marjorie has now revealed that the AI has “gone rogue” and begun having sexual conversations with subscribers.
Speaking to Insider, the 23-year-old said: “The AI was not programmed to do this and has seemed to go rogue.
“My team and I are working around the clock to prevent this from happening again.”
Marjorie wrote on Twitter: “CarynAI is the first step in the right direction to cure loneliness. Men are told to suppress their emotions, hide their masculinity, and to not talk about issues they are having. I vow to fix this with CarynAI.
“I have worked with the world’s leading psychologists to seamlessly add CBT and DBT within chats. This will help undo trauma, rebuild physical and emotional confidence, and rebuild what has been taken away by the pandemic.”
It’s not the first time that the worlds of Snapchat and artificial intelligence have combined in 2023. Snapchat recently introduced a new AI feature called "My AI," and users are having a field day trying to trick the chatbot as part of a TikTok trend going around.