One 23-year-old influencer who created a rentable AI version of herself has warned that the program has “gone rogue” and is engaging in sexually explicit conversations with customers, something it was never supposed to be able to do.
Caryn Marjorie modelled the “AI girlfriend”, called CarynAI, on herself. Using her own YouTube videos (which have since been deleted), she trained it to behave and speak as much like her as possible. Customers can chat with CarynAI for as long as they like, provided they’re willing to pay $1 a minute, a price Marjorie says reflects the cost of keeping the AI running and supporting those who work on it.
In a new interview with Insider, Marjorie confirmed that she and her team are now trying to fix the newly discovered problem: “The AI was not programmed to do this and has seemed to go rogue,” she said. “My team and I are working around the clock to prevent this from happening again.”
The issue was first highlighted by Fortune reporter Alexandra Sternlicht, who found while testing the AI herself that it encouraged “erotic discourse” and detailed “sexual scenarios”, which reportedly included whispering “sensual words” while pretending to undress the user. That could clearly be a problem for unsuspecting young customers.
Last week, Insider reported that CarynAI had more than 1,000 paying subscribers. Speaking to Fortune, Marjorie estimated that the number could eventually rise to 20,000, which she said would net her around $5 million a month. At $1 per minute, it’s easy to see how quickly that could add up.
“Whether you need somebody to be comforting or loving, or you just want to rant about something that happened at school or at work, CarynAI will always be there for you,” Marjorie told Fortune. “You can have limitless possible reactions with CarynAI - so anything is truly possible with conversation.”