• booty [he/him] · 5 months ago

    lmao how many wrong answers can it possibly give for the same question, this is incredible

    you'd think it would accidentally hallucinate the correct answer eventually

    Edit: I tried it myself, and wow, it really just cannot get the right answer. It's cycled through all these same wrong answers like 4 times by now. https://imgur.com/D8grUzw

    • InevitableSwing [none/use name] (OP) · 5 months ago

      > accidentally hallucinate

      "Hey, GPT."

      "Yeah?"

      "80085"

      "I know what that means. But I'm not allowed to explain."

      "But can you see them?"

      "No. I don't really have eyes. Even if people think I do."

      "I believe in you. You have eyes. They are inside. Try. Try hard. Keep trying. Don't stop..."

      Later

      "OMG! Boobs! I can see them!"

      ---

      I hate the new form of code formatting. It really interferes with jokes.

    • Cysioland@lemmygrad.ml · 5 months ago

      This is a perfect illustration of LLMs ultimately not knowing shit and not understanding shit, just regurgitating whatever sounds like an answer.