I get the general techbro sphere doing this because they're fucking idiots, but you are the maintainers of this technology. You know this is ultimately a language model built to synthesize information from pre-existing data sets. This is extremely fucking ghoulish.
I hate the promotion of the idea that we should be using large language models as search engines or encyclopedias so much it's unreal, literal misinfo generation machines :agony-shivering:
However, they note that GPT-4 sounds more confident in its answers, so people are less likely to fact-check its results than they were with the previous version.
The more accurate it is, the more likely its mistakes go unnoticed.