• Parzivus [any]
    ·
    2 years ago

    ChatGPT is fun but also confidently wrong all the time. Using it as a knowledge source is one of the worst possible applications

    • Owl [he/him]
      ·
      2 years ago

      The GPT4 announcement included a benchmark on how often the different GPTs lie to you. ChatGPT lied 40% of the time.

      GPT4 is down to 20%, which is definitely an improvement, but still a huge amount if you think about it at all.

      • regul [any]
        ·
        2 years ago

        How often does Google lie?

      • MedicareForSome [none/use name]
        ·
        2 years ago

        However, they note that GPT-4 acts more confident in its answers, so people are less likely to fact-check results than with the previous version.

        The more accurate it is, the more likely its mistakes go unnoticed.

      • iridaniotter [she/her]
        ·
        edit-2
        2 years ago

        Yeah I asked it for sources on ekranoplans and I am pretty sure it made them up:

        • The Wing-In-Ground (WIG) Effect Craft: A Review

        • WIG Craft and Ekranoplans by Sergy Komarov

        edit: I asked it again and it gave me more fake sources, but one of them included a real researcher on the subject. The singularity is here!!!

    • happybadger [he/him]
      ·
      2 years ago

      I now have multiple professors who warn against using it for essays. Students are trying to write phytopathology papers using it.

    • HexbearsDad [he/him]
      ·
      2 years ago

      You have to explicitly ask it not to do that. AI researchers call it "hallucinating" a response. You have to tell it to say "I don't know" if it doesn't know something.
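
      If you're doing this programmatically, something like the following works (a minimal sketch using the OpenAI Python SDK; the system-prompt wording and the example question are illustrative assumptions, not a known-good recipe):

      ```python
      # Minimal sketch: instruct the model to admit uncertainty instead of
      # hallucinating an answer. The prompt wording here is an assumption.
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      response = client.chat.completions.create(
          model="gpt-4",
          messages=[
              {
                  "role": "system",
                  "content": "Answer only when you are confident. "
                             "If you do not know, reply exactly: I don't know.",
              },
              {"role": "user", "content": "List three published papers on ekranoplans."},
          ],
          temperature=0,  # low temperature discourages creative guessing
      )
      print(response.choices[0].message.content)
      ```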

    • CanYouFeelItMrKrabs [any, he/him]
      ·
      2 years ago

      The Bing one has different modes: creative, balanced, and precise. ChatGPT seems to be on the creative side, but an application using GPT does not need to be like that. On precise mode the Bing one regularly says it does not have enough info to answer.
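
      If you're building on the API directly, the closest knob is the sampling temperature (a hedged sketch only; the mode names are Bing's, their actual implementation is not public, and the temperature values below are assumptions):

      ```python
      # Hypothetical mapping of Bing-style modes onto sampling temperature.
      from openai import OpenAI

      client = OpenAI()

      # Assumed values; Bing's real settings are not documented.
      MODES = {"creative": 1.0, "balanced": 0.5, "precise": 0.0}

      def ask(prompt: str, mode: str = "precise") -> str:
          response = client.chat.completions.create(
              model="gpt-4",
              messages=[{"role": "user", "content": prompt}],
              temperature=MODES[mode],
          )
          return response.choices[0].message.content

      print(ask("How fast can an ekranoplan fly?", mode="precise"))
      ```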