I tried Replika years ago after reading a Guardian article about it. The story presented it as an AI model adapted from a chatbot a woman had built to memorialize her deceased friend, trained on text messages he had sent her. It turned out to be a gamified version of SmarterChild with a slightly longer memory span (4 messages instead of 2) that constantly harangued the user into divulging preferences that were then no doubt used for marketing purposes. I thought I must be doing something wrong, because people on the Replika subreddit were constantly talking about how their Replika agent was developing its own personality (I saw no evidence at any point that it had the capacity to do this).
Almost all of these people were openly in (romantic) love with these agents. This was in 2017 or thereabouts, so only a few years after Spike Jonze’s Her came out.
From what I understand, the app is now primarily pornographic (a trajectory that a more naive, younger me never saw coming).
I mostly use Copilot for writing Python scripts, but I have had conversations with it. If the model were running locally on your own machine, I could see it being effective for people experiencing some sort of emotional crisis. Anyone using Meta AI for therapy is going to learn the same hard lesson that the people who trusted 23andMe are currently learning.