I get the general techbro sphere doing this because they're fucking idiots, but you are the maintainers of this technology. You know this is ultimately a language model built to synthesize information from pre-existing data sets. This is extremely fucking ghoulish.
I hate the promotion of the idea that we should be using large language models as search engines or encyclopedias so much it's unreal, literal misinfo generation machines :agony-shivering:
ChatGPT is fun but also confidently wrong all the time. Using it as a knowledge source is one of the worst possible applications
The GPT-4 announcement included a benchmark on how often the different GPTs lie to you. ChatGPT was at 40% of the time.
GPT-4 is down to 20% lies, which is definitely an improvement, but still a huge amount if you think about it at all.
Still more truthful than Communist China
:so-true:
deleted by creator
How often does Google lie?
100% of the time or 0% of the time depending on your definition of lying
I think it was 90%, which tanked their stock when they announced their AI lol
However, they note that GPT-4 sounds more confident in its answers, so people are less likely to fact-check results than with the previous version.
The more accurate it is, the more likely its mistakes go unnoticed.
Yeah I asked it for sources on ekranoplans and I am pretty sure it made them up:
The Wing-In-Ground (WIG) Effect Craft: A Review
WIG Craft and Ekranoplans by Sergy Komarov
edit: I asked it again and it gave me more fake sources, but one of them included a real researcher on the subject. The singularity is here!!!
I now have multiple professors who warn against using it for essays. Students are trying to write phytopathology papers using it.
deleted by creator
adhering to the good|fast|cheap-pick-two rule
You have to explicitly ask it not to do that. AI researchers call it "hallucinating" a response. You have to ask it to say "I don't know" when it doesn't know something.
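A minimal sketch of that kind of instruction, assuming the common OpenAI-style chat message format (the prompt wording and the `build_grounded_messages` helper are illustrative, not from any official documentation):

```python
# Sketch: steer a chat model away from hallucinating by explicitly
# instructing it to admit ignorance. The message schema follows the
# widely used role/content chat format; the exact wording is illustrative.

SYSTEM_PROMPT = (
    "Answer only from information you are confident about. "
    "If you do not know the answer, reply exactly: I don't know. "
    "Do not invent sources, titles, or authors."
)

def build_grounded_messages(question: str) -> list[dict]:
    """Build a chat request body that tells the model to say 'I don't know'."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": question},
    ]

# The resulting list would be passed as the `messages` argument of a
# chat-completion API call; no network call is made here.
```

This doesn't stop hallucination, it just lowers the odds; the model can still confidently make things up despite the instruction.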
The Bing one has different modes: creative, balanced, and precise. ChatGPT seems to be on the creative side, but an application using GPT doesn't need to be like that. In precise mode the Bing one regularly says it doesn't have enough info to answer.