A chatbot used by Air Canada hallucinated an inaccurate bereavement discount policy, and a customer who booked a flight based on that information sued the company for misleading him. The small claims tribunal sided with the customer, ruling that the chatbot acted as an official agent of Air Canada, and that a customer had no reason to double-check information from one part of the Air Canada website against other parts of the same website.
I remember a state recently passing a law barring lawyers from using AI programs to generate legal documents, and this right here is why. It removes the possibility of lawyers appealing to "well, it's not our fault the document is wrong, the AI did it!"
I heard about someone doing exactly that from Leonard French. Some old boomer lawyer thought the AI could actually search for court cases, got tricked into citing a bunch of non-existent case law, and landed in a lot of trouble.
I'm wishing so hard it was a Sovereign Citizen...
The OG chatbot