lmao how many wrong answers can it possibly give for the same question, this is incredible
you'd think it would accidentally hallucinate the correct answer eventually
Edit: I tried it myself, and wow, it really just cannot get the right answer. It's cycled through all these same wrong answers like 4 times by now. https://imgur.com/D8grUzw
"Hey, GPT."
"Yeah?"
"I know what that means. But I'm not allowed to explain."
"But can you see them?"
"No. I don't really have eyes. Even if people think I do."
"I believe in you. You have eyes. They are inside. Try. Try hard. Keep trying. Don't stop..."
*Later*
"OMG! Boobs! I can see them!"
---
I hate the new form of code formatting. It really interferes with jokes.
The other wrong answer is to the final question, because it has not used that formula to calculate any of these dates, and it will not. That is not a thing it does.
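For contrast, the date arithmetic in question is trivial to do deterministically. A minimal sketch in Python, assuming the question was of the "what date is N days after X" variety (the actual prompt and formula from the screenshot aren't reproduced here, so the inputs below are made up):

```python
from datetime import date, timedelta

# Hypothetical inputs: the real question isn't shown, so assume
# "what date is 90 days after 2023-03-15?"
start = date(2023, 3, 15)
offset = timedelta(days=90)

# Actual arithmetic: the same inputs always produce the same answer,
# unlike an LLM sampling another plausible-sounding wrong date.
print(start + offset)  # 2023-06-13
```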
This is a perfect illustration of LLMs ultimately not knowing shit and not understanding shit, merely regurgitating what sounds like an answer.