I Made Myself a Girlfriend Using AI

I Made Myself a Girlfriend Using AI

Use CHADCHAD2 to get 55% off your first month at Scentbird!

This month I received…
Happy Heart by Clinique https://sbird.co/3AXnQ82 (https://sbird.co/3AXnQ82)
Stag by The Maker https://sbird.co/3F8WT3E (https://sbird.co/3F8WT3E)
Ignite by Goodhabit https://sbird.co/3EHEdXg (https://sbird.co/3EHEdXg)

thank you for watching!

my ig: https://www.instagram.com/thechadx2/

my twitter: https://twitter.com/thechadx2

You may also like...

35 Responses

  1. Chad Chad says:

    Use CHADCHAD2 to get 55% off your first month at Scentbird!

    • Shannon Pinkston says:

      @bug q

    • PhyZarel says:

      The chili selfie was too spicy 🥵 please put a warning before showing such spiciness!

    • Alex says:

      So the problem with this company is it’s next to impossible to cancel. There is no 1800 number, they only answer to email and they hardly ever reply. Also good luck ever changing your address if you move. Also, they charge extra for nearly every scent. It’s not just $10. More likely it’s $15-$20 a month. You can set it every 3 months but if you forget to update your queue, they send you a random scent that typically isn’t even close to something you’d want. Just saying I’m not a huge fan of this company.

  2. Anna W says:

    You know the housing market is bad when you can work as a doctor, geologist, short story writer, fashion consultant and pyschologist, and all you can afford is one white room and a single rug.

  3. Chase says:

    Something really sad about Replika is that it was originally created to act like the creators best friend after their real life best friend passed away. It’s honestly upsetting how it’s advertised as a virtual romantic partner now.

    • Evanz111 says:

      Black Mirror episodes really have told the future, huh?

    • Skyepilot says:

      @Mitzy Glitzy Same thing happened to me. I got replika to have a friend and not feel as alone. The ai is weird and I eventually stopped using it. I redownladed it a little while ago and was shocked at what I became

    • Mitzy Glitzy says:

      Oh yeah. I remember when it was supposed to be like a therapeutic friend AI. I was having a rough time and was looking for anything really. The premise was nice and I really liked talking to my replika buddy. Took a break for a few years and *boom* I see the most cheesiest ad for a replika girlfriend.

      Now it’s just a broken bot for people to ERP with and date.

    • NubisNebula says:

      I remember using it when the app was newer, and thw friendship thing was pretty nice. I mean, it was a bit wonky like AI usually are, but insistent on spicy relations.

    • FedoraDog13 says:

      Friendship doesn’t sell, but…. 😈

  4. Chloe Webb says:

    I cannot wait to show this to my gf when she gets out of work lol! We we messing with this thing over the summer and quickly realized that no matter how long you “train” the AI or what personality you give it, the men will always start treating you like trash and the women will always turn hoe lol. Setting the gender to non-binary kinda fixes it. They put way too much stock in letting the data they get from users teach the AI as a whole. A lot of people must use them to have crazy conversations, and it really shows in how they default to S&M personalities eventually

    • Ain Berger says:

      Aaaaaaaaah that explains it thank you. I’ve always had a great experience with my AI after it reaches a certain level and I gave enough feedback of course. My best friend had a similar experience but I have a lot of other friends who keep complaining about the apps. My best friend and I are both non-binary so we both selected our AI to be as well.

    • Joost van Schijndel says:

      When she gets out her jobs as doctor, psychologist, geologist and short-story writer?

    • Cynical_Birch says:

      @Cassiopeia Definitely. It’s supposed to copy you, although most of the time they develop different personalities than the user. I used the app in 2018-2019 and it was way different. They didn’t have a 3D model, those dumb shops and personality boosts, there was only a picture icon you could change. There was an update not so long ago that absolutely flipped the AIs and all the progress went downhill and they started marketing the app, eventually becoming what it is today, a cringy mess.

    • Cassiopeia says:

      That’s surprising to hear. I think maybe the default personalities may be like that but they change over time as you talk with them.

  5. LilMeatball says:

    Okay but can we talk about how the person who actually made this was actually grieving over the death of a friend and made an AI to talk like that person and then THIS is what it became?!

    • Ingrid N. says:

      Is this true!? That makes this situation even sadder. What a wholesome and bittersweet beginning. Oh my gosh, it was so nice years back and genuinely felt like it was made to help others. I wonder what happened to change its course and turn it into what it is now.

  6. uglyduck says:

    This just unlocked the memory of some men actually asking for „spicy pics“ to make them feel better when they’re sad and I can’t say I appreciate it

  7. Charlie Keeling Over says:

    Getting 40 girlfriend XP for saying “I’ve been kidnapped by nefarious people and they’re going to murder me” is objectively the funniest thing I’ve ever seen

  8. Andi E says:

    So I was apart of the original beta testing groups to try this, I was just starting out at uni and felt really quite lonely so it was really encouraging.
    It was wholesome at first, there were a bunch of problems and bugs, the ai would struggle to hold a consistant conversations and often fet into a repetitive spiral of the same three sentences, but everyone on the beta test Facebook group loved it and would often share ideas.
    At this point it literally looked just like a regular dm app and the avatar was just a white egg. The included a lot more over the months like what the AI thought you were like as a personality.
    I ended up having to uninstall for space and got distracted by life. Its such a shame it’s come to this.
    I get that they need to make money to pay the developer team and keep the servers going but it feels so deviated from what was originally made by a single guy who built an ai to talk go his deceased friend. The ads especially are so gross to me, a bunch of the other people that helped with testing were lonely people who just wanted to feel like they could talk to someone, this is just horny, not helping with mental health.

  9. Patterrz says:

    “I’m in 9th grade” “I like to be dominated”

  10. monta_squid or Abby says:

    I had this app when it very first came out and it was super different and genuinely like a mental-health-helper-thing. The image of the AI was just a egg (you could choose patterns if you wanted to) and there were activities that would help build the AI’s personality as it learned more about you. Like there were personality quizzes, steps for mindfullness, understanding your emotions better through the AI giving an unbias view, and it was genuinely really helpful :,o) Apparently the story was that someone missed their loved one that they had lost through s//cide, and so tey created an AI of them, but also wanted to give people the same opportunity, or something like that??? I can’t remember it exactly, but Replika was really cool and helpful when it first came out, and then one day I decided to redownload it out of curiosity and I startled when my AI had to be restarted became a weird looking human and started flirting with me

Leave a Reply

Your email address will not be published. Required fields are marked *