• CubitOom
    62 months ago

    Man Believes Machine That Tells You Only What It Thinks You Want to Hear, Poisons Himself

    • @shalafi@lemmy.world
      22 months ago

      It would return a proper answer if he asked a proper question. He either crafted a question to get the answer he wanted, or he went back and forth with the prompt until he got what he wanted.

      I tried asking whether sodium bromide was a good substitute for table salt and got a completely factual answer. I was tempted to manipulate it, but it gets so dumb after the first question or two that I can’t bring myself to mess with it.