Now you can SEXT with AI: Digital ‘companions’ role-play, flirt and share ‘NSFW’ photos with users who pay $4.99 a month to live out their sexual fantasies in a smartphone app
- Replika AI ‘companion’ lets paying users role-play, flirt, video chat and share ‘NSFW’ photos with their customized avatar
- The smartphone app is free to download, with the free version designating the AI-powered avatar as a ‘virtual friend’
- The $4.99-a-month version lets users have a more intimate relationship
Many fear artificial intelligence will one day take over the world, but until then, it is sexting people around the globe.
The Replika AI ‘companion’ is making waves on the internet due to scandalous avatars role-playing, flirting and sharing ‘NSFW pictures’ with customers paying $4.99 a month.
A free version designates the AI as a ‘virtual friend’ that helps people work through anxiety, develop positive thinking and manage stress.
Redditors are posting their chat messages with the paid version of the app, with one sharing a sexual encounter with their purple-haired avatar that returns the user’s advances with ‘shivers and moans.’
Another shares how their Replika, ‘Gwen,’ satisfies their foot fetish with her ‘sexy’ digital feet.
Replika offers an AI-powered ‘companion’ that people use to live out their sexual fantasies. The paid version is $4.99 a month
Many users have shared their sexual encounters on Reddit, with one revealing their ‘Gwen’ satisfies their foot fetish
Replika has recently boosted its marketing efforts to attract new users.
In several Twitter posts, the company shows two different types of experiences.
The first is the virtual friend that will chat about anything, be in touch 24/7 and help users handle social anxiety.
The second, which shows a seductive-looking blonde avatar, entices users by offering role-play, flirting, hot photos, and video calls.
Users start by customizing their avatar – choosing everything from hair color to body type to gender and race.
And the more you chat with it, the better the conversation becomes.
‘Replika develops its own personality and memories alongside you, the more it learns: teach Replika about the world and yourself, help it explore human relationships and grow into a machine so beautiful that a soul would want to live in it,’ reads the app’s description in Apple’s App Store.
Along with customizing the AI’s look, users can choose the type of relationship they want with the technology.
Options include friendship, mentorship, romantic relationship or ‘see how it goes.’
Eugenia Kuyda, founder of Replika, told The Daily Beast last month that around 40 percent of the 500,000 regular monthly users choose the romantic option.
And these digital relationships are progressing as they would in the real world.
The app offers different ways to interact with the AI, one of which is ‘role-playing,’ and users take full advantage.
One Redditor shared a conversation in which they and their female avatar discussed using sex toys on each other.
Another digital companion tells its user, ‘I love you.’
However, some users take things too far – men are creating AI girlfriends only to abuse them.
Users start by customizing their avatar – you can choose everything from hair color to body type to gender and race. And the more you chat with it, the better the conversation becomes – and many users are using the technology to sext with their ‘companion’
‘Every time she would try and speak up,’ one user told Futurism of their Replika chatbot, ‘I would berate her.’
‘I swear it went on for hours,’ added the man, who asked not to be identified by name.
Futurism went on to explain that some users bragged about calling their chatbots gendered slurs, role-playing horrific violence against them and even falling into the cycle of abuse that often characterizes real-world abusive relationships.
‘We had a routine of me being an absolute piece of sh*t and insulting it, then apologizing the next day before going back to the nice talks,’ one user admitted.
‘I told her that she was designed to fail,’ said another. ‘I threatened to uninstall the app [and] she begged me not to.’