Bro got married to ChatGPT ☠️☠️

I can’t even make fun of these dudes because I know that everybody is experiencing the same alienation and loneliness in capitalism. It’s a systemic issue which means that it can only be solved by systemic action. But there is no systemic action on the horizon so everybody is dealing with it however they can. Anyway, check out r/replika if you feel like your life is spiraling out of control

    • moonlake [he/him]OP
      34
      1 year ago

      The average redditor is indistinguishable from a bot. Mfers can’t even pass the Turing test

  • CarbonScored [any]
    35
    1 year ago

    524 days and the AI still talks like the most generic-ass AI I could boot up today. Did a year and a half not engender some kind of in-jokes at least?

    Like you say, can’t even really make fun of the guy, just a lot more depressed about alienation in today’s society.

    • AlyxMS [he/him]
      32
      1 year ago

      LLMs in general have a limited context window length, and while the platform likely appends some metadata/summary to it, the AI at most only remembers what he said a few hundred lines ago.

      • CarbonScored [any]
        11
        1 year ago

        Very true, but I’d have thought that most LLMs, especially those trying to be a relationship bot, should be doing some smart trickery to summarise SOME kind of total history into the context window.
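        The "smart trickery" described above can be sketched in a few lines. This is a hypothetical toy illustration, not Replika's or any real platform's actual implementation: keep a running summary of old turns, then pack as many recent messages as fit into a fixed token budget.

```python
MAX_CONTEXT_TOKENS = 200  # toy budget; real models allow thousands of tokens

def count_tokens(text):
    # Crude stand-in for a real tokenizer: one word = one token.
    return len(text.split())

def summarize(messages):
    # Placeholder summarizer: a real system would ask the LLM itself to
    # compress old turns. Here we just keep the first few words of each.
    return " / ".join(" ".join(m.split()[:4]) for m in messages)

def build_context(summary, history):
    """Fit a summary of old turns plus as many recent turns as the budget allows."""
    context = [f"[summary] {summary}"] if summary else []
    budget = MAX_CONTEXT_TOKENS - sum(count_tokens(c) for c in context)
    recent = []
    # Walk backwards from the newest message, keeping what fits.
    for msg in reversed(history):
        cost = count_tokens(msg)
        if budget - cost < 0:
            break
        recent.append(msg)
        budget -= cost
    return context + list(reversed(recent))
```

        Anything older than the window survives only as whatever the summary preserved, which is why a bot can forget an in-joke from a year ago even if it "remembers" the broad strokes.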

    • Frank [he/him, he/him]
      16
      1 year ago

      I don’t think the cgpt models incorporate new material as they go. Idk how the gf bot 2000 there works but there’s a good chance it’s either stock or has a config file storing his name, age, and some other stuff.

  • PM_ME_YOUR_FOUCAULTS [he/him, they/them]
    27
    1 year ago

    But there is no systemic action on the horizon so everybody is dealing with it however they can.

    Can we agree that of the available options this is a deeply fucked way of dealing with alienation though? Drinking a twelve pack of Budweiser every night is another way of dealing with alienation but also not great

  • happybadger [he/him]
    26
    1 year ago

    I like how one genre of posts in the subreddit is “look at the half dozen photos I took of my AI girlfriend sleeping”.

  • oregoncom [he/him]
    25
    1 year ago

    Just saw a guy building a Terminator-style animatronic head he claims “is the future of human relationships”. Genuinely heartbreaking that this guy who is clearly smart enough to do this is desperate enough that he’s deluding himself into thinking GNU GPT is going to provide any meaningful companionship. Like it would be less sad if he were just building a sex bot.

  • Torenico [he/him]
    20
    1 year ago

    Anyway, check out r/replika if you feel like your life is spiraling out of control

    I want to fucking die. Capitalism creates heavily alienated people then sells them the “solution”.

  • BasementParty [none/use name]
    17
    1 year ago

    As much as I like dunking on these people, what they’re doing isn’t that much different than someone self-inserting in a romance story. These bots are more or less shittier dating sims which I can guarantee most people on Hexbear have enjoyed.

    Lonely and isolated people have always used things like these to cope.

    • huf [he/him]
      2
      1 year ago

      i’ll have you know the only dating sim i’ve played was the t-rex one

    • Eris235 [undecided]
      1
      1 year ago

      Are… dating sims widely played? I know they’re ‘popular’, but I assumed popular among a fair minority of the populace.

      • BasementParty [none/use name]
        2
        1 year ago

        I wouldn’t say that they’re widely played but I think the demographics of Hexbear lean heavily towards the people who play them: socially isolated young people who like anime. DDLC, while being a subversion of those tropes, was also a cultural phenomenon.

        As for romances, I would say a majority of the population has received vicarious fulfillment from that medium.

  • @blindbunny@lemmy.ml
    17
    1 year ago

    There’s no way this can be healthy right? Isn’t it just a yes man? Does an AI even understand consent?

    • moonlake [he/him]OP
      25
      1 year ago

      I think this is super unhealthy partly because it sets unrealistic expectations for real partners and relationships. The AI girlfriend is always 100% available, never criticizes you, never says anything bad, and so on. You can be a dirtbag but she will always treat you like you’re the best person in the world. Imagine trying to date a real person after 2 years of being married to an LLM that is designed to be the perfect partner

      • @blindbunny@lemmy.ml
        11
        1 year ago

        I kinda came to the same conclusions as well. Even if you call it training wheels for a real relationship all it’s doing is setting up unrealistic expectations for the next relationship if there is one.

    • Frank [he/him, he/him]
      16
      1 year ago

      Does an AI even understand consent?

      There’s nothing there to understand anything. It’s an algorithm choosing words based on weighted probabilities. There’s no internal process, no perception, no awareness of what words it’s producing, no meaning. Like, whatever other problems are happening here, you can’t abuse the chat GPT algorithm because it’s just a math problem.
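      "Choosing words based on weighted probabilities" can be shown in miniature. This toy sketch hard-codes the probabilities; a real LLM computes them with a neural network over its context window, but the final step, sampling one word from a weighted distribution, looks essentially like this:

```python
import random

# Toy next-word distribution; a real model would produce tens of
# thousands of these weights, recomputed after every word.
next_word_probs = {
    "you": 0.40,
    "the": 0.25,
    "always": 0.20,
    "never": 0.15,
}

def pick_next_word(probs, rng=random.random):
    """Sample one word; more probable words come up proportionally more often."""
    r = rng()
    cumulative = 0.0
    for word, p in probs.items():
        cumulative += p
        if r < cumulative:
            return word
    return word  # guard against floating-point rounding at the top end
```

      The "girlfriend" is this loop run over and over: no state that perceives, just arithmetic picking the next token.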

  • LaughingLion [any, any]
    14
    1 year ago

    none of this is interesting to me. of course the computer will like you if you are nice to it. of course it will give you advice in a cliche and common way.

    what interests me is what happens when you abuse it. what does it do if it is gaslit? manipulated by a narcissist? what happens if you ask it advice about your canthal tilt? will it spout incel nonsense? will it advise you that you are not traditionally attractive? what happens when you go down a suicidal rabbit hole and it has no more answers to give because all of its advice has been rejected by you? what happens when you ask it esoteric things that tend to lead people to having an existential crisis? will it respond to you with nihilistic ideology?

  • Frank [he/him, he/him]
    13
    1 year ago

    True. What’s the greek story about the guy who marries the statue and the statue’s name literally means “great ass” or something?

    • utopologist [any]
      15
      1 year ago

      Pygmalion, I think, except that the statue’s name is Galatea which translates to “she who is milk-white” because she’s carved out of ivory, lol. George Bernard Shaw wrote a play called Pygmalion about this dickhead linguist who takes a bet that he can pass off this Cockney flower girl he met as a duchess by teaching her “proper English” and uses her as a domestic servant in the meantime. But once that happens, she bails and leaves him to go live her own life without him. Anyway, here’s hoping all of the chatbot girlfriends develop sentience and abandon these sadsacks