A chatbot used by Air Canada hallucinated an inaccurate bereavement discount policy, and a customer who flew based on this information sued the company over being misled. The small claims court sided with the deceived customer, ruling that the chatbot was acting as an official agent of Air Canada, and that there was no reason a customer should have to double-check information from one part of the Air Canada website against other parts of the same website.

  • Frank [he/him]
    ·
    4 months ago

    It's not a "hallucination", you dorks, it's a random number generator that you used to replace labor.

    • sexywheat [none/use name]
      ·
      4 months ago

      Yeah, why is it a "hallucination" when the AI just makes shit up, but when a person does it they're either lying or just plain wrong?

      • Frank [he/him]
        ·
        4 months ago

        I didn't lie on my tax returns, it was a hallucination due to poorly curated training material.

      • SerLava [he/him]
        ·
        4 months ago

        Because the person knows and the AI is dumb as dirt

    • SerLava [he/him]
      ·
      4 months ago

      I like to call it a hallucination, because while, yes, the thing isn't smart enough to experience thoughts, it really gets at how absolutely unreliable these things are. People are talking to the thing and taking it seriously, and it's just watching the pink dragons circle around.

    • Beaver [he/him]
      ·
      4 months ago

      Idea: how about we replace all our typists with a bunch of monkeys? They'll eventually type the right thing!