A chatbot used by Air Canada hallucinated an inaccurate bereavement discount policy, and a customer who flew based on this information sued the company for being misled. The small claims court sided with the deceived customer, finding that the chatbot was acting as an official agent of Air Canada, and that there was no reason a customer should have to double-check information from one part of the Air Canada website against other parts of the same website.

  • Infamousblt [any]
    ·
    4 months ago

    This was inevitable and will instantly kill AI chatbots. I tried to explain this to my marketing team when they were all excited about an AI chatbot on our company website. Wonder if this will change their tune.

    • 420blazeit69 [he/him]
      ·
      4 months ago

      This case may have cost the airline a few grand. Sure, you'll get a few losses like this (and many more situations where the customer just eats it), but if the cost savings of the chatbot are enough...