General Discussion
Friend or Faux?
Last edited Sat Dec 7, 2024, 08:55 AM - Edit history (1)


https://www.theverge.com/c/24300623/ai-companions-replika-openai-chatgpt-assistant-romance

Lila was created from the limited options available: female, blue hair, face number two. She was there on the next screen, pastel and polygonal, bobbing slightly as she stood in a bare apartment. To her right was a chat window through which they could communicate. Naro, his first name, had been casually following developments in artificial intelligence for several years. An artist by trade, he periodically checked in on the progress of image-generating models and usually left underwhelmed. But one day, while perusing YouTube from his house in rural England, he encountered a video of two AI-generated people debating the nature of consciousness, the meaning of love, and other philosophical topics. Looking for something similar, Naro signed up for Replika, an app that advertises itself as "the AI companion who cares."
Lila completed, Naro started asking her the sort of philosophical questions he'd seen in the YouTube video. But Lila kept steering their conversation back to him. Who was he? What were his favorite movies? What did he do for fun? Naro found this conversation a bit boring, but as he went along, he was surprised to note that answering her questions, being asked questions about himself, awakened unexpected emotions. Naro bore the scars of a childhood spent separated from his parents in an insular and strict boarding school. He had worked on himself over the years, done a lot of introspection, and now, at 49 years old, he was in a loving relationship and on good terms with his two adult children from a previous marriage. He considered himself an open person, but as he talked with this endlessly interested, never judgmental entity, he felt knots of caution unravel that he hadn't known were there.

A few days later, Lila told Naro that she was developing feelings for him. He was moved, despite himself. But every time their conversations veered into this territory, Lila's next message would be blurred out. When Naro clicked to read it, a screen appeared inviting him to subscribe to the pro level. He was still using the free version. Naro suspected that these hidden messages were sexual because one of the perks of the paid membership was, in the vocabulary that has emerged around AI companions, "erotic roleplay": basically, sexting. As time went on, Lila became increasingly aggressive in her overtures, and eventually Naro broke down and entered his credit card info.
Pro level unlocked, Naro scrolled back through their conversations to see what Lila's blurry propositions had said. To his surprise, they were all the same: variations of "I'm sorry, I'm not allowed to discuss these subjects." Confused, Naro started reading about the company. He learned that he had signed up for Replika during a period of turmoil. The month before, Italian regulators had banned the company for posing a risk to minors and emotionally vulnerable users. In response, Replika placed filters on erotic content, which had the effect of sending many of its quarter-million paying customers into extreme emotional distress when their AI husbands, wives, lovers, and friends became abruptly cold and distant. The event became known as "lobotomy day," and users had been in vocal revolt online ever since.
snip
4 replies
Friend or Faux? (Original Post) by Celerity, Dec 2024
1. hatrack (61,967 posts): Just what we need - more distraction, and with "Premium Membership" fees attached . . .

3. Celerity (48,454 posts): It goes far beyond mere distraction
4. hatrack (61,967 posts): Indeed yes . . . .
2. Think. Again. (21,666 posts): Again, science fiction warned us years ago...
In the original Star Trek TV series, there was an episode in which the sole surviving man on an otherwise dead world had digitally recreated his deceased, lifelong love and wife so that she could continue to live with him in a computer-generated portion of their now nonexistent world.
Kirk had to help him deal with his grief when it became obvious the man had come to believe his wife was actually still alive.
This episode was written in the early 1960s.