ChatGPT is leaking passwords from private conversations of its users, Ars reader says | Names of unpublished research papers, presentations, and PHP scripts also leaked.

  • Lemminary
    10 months ago

    No it’s not, it’s the site. Please stop reposting this clickbait or at least fix the title.

    • @Sanctus@lemmy.world
      10 months ago

      It would had to have been trained on their passwords and shit for this to be even possible. It can’t even remember its own story points it gave me for a DnD session within the same chat. No way is it spitting out passwords fed to it from one user to another because it’s not storing them.

      • @cheese_greater@lemmy.world
        10 months ago

        It would have had to have been

        Wow, never realized we had such a weird grammatical construction. What tense is that even called?

        • @jcg@halubilo.social
          10 months ago

          It’s not a single tense (would have - past conditional, had to - past modal, have been - pluperfect); it’s a hypothetical past state caused by a hypothetical past event, but the trick here is that the “past state” part is omitted because it’s understood from context. With the full context spelled out it’d read: “If it were spitting out sensitive information, it would have had to have been trained on it.”

          Take that, ESL learners!

      • @kurwa@lemmy.world
        10 months ago

        chat.openai.com, I’m assuming. But the article even says that OpenAI looked into it, and they think it’s someone stealing the guy’s account and using it, not other users’ conversations being seen by him.

  • @1984@lemmy.today
    10 months ago

    So people post their private stuff to ChatGPT? I always edit out the personal data.

  • @hedgehog@ttrpg.network
    10 months ago

    The user, Chase Whiteside, has since changed his password, but he doubted his account was compromised. He said he used a nine-character password with upper- and lower-case letters and special characters.

    Yes, because obviously a 9-character password that’s probably a word or two with special characters swapped, and no mention of 2FA, is sooo secure /s (rough keyspace math sketched below)

    To be clear, I’m not saying that means his account was compromised. That bit just stuck out to me.
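
    For scale, here’s a minimal Python sketch of the keyspace math behind that 9-character figure. The charset size and the 10-billion-guesses-per-second attack rate are assumptions for illustration, not numbers from the article:

    ```python
    import math

    # Assumed figures, not from the article: full printable-ASCII charset
    # (~94 symbols) and an offline attacker making ~10 billion guesses/second.
    CHARSET_SIZE = 94
    LENGTH = 9
    GUESSES_PER_SECOND = 10_000_000_000

    keyspace = CHARSET_SIZE ** LENGTH                 # ~5.7e17 candidates
    entropy_bits = LENGTH * math.log2(CHARSET_SIZE)   # ~59 bits
    worst_case_days = keyspace / GUESSES_PER_SECOND / 86_400

    print(f"keyspace:   {keyspace:.2e} candidates")
    print(f"entropy:    {entropy_bits:.1f} bits")
    print(f"worst case: {worst_case_days:.0f} days of offline guessing")

    # A dictionary word or two with swapped special characters covers only a
    # tiny fraction of this keyspace, which is the point being made above.
    ```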