A chatbot used by Air Canada hallucinated an inaccurate bereavement discount policy, and a customer who flew based on this information sued the company for being misled. The small claims court sided with the deceived customer, holding that the chatbot was acting as an official agent of Air Canada and that there was no reason a customer should have to double-check information from one part of the Air Canada website against other parts of the same website.

  • DamarcusArt@lemmygrad.ml
    ·
    4 months ago

Oh please, I really hope we get more stuff like this. Nothing will kill this fad faster than companies realising they've been swindled by techbros threatening them with FOMO, and that this algorithm bullshit won't actually do anything useful for them.

      • M500@lemmy.ml
        ·
        4 months ago

        Then I’m not going to talk to them. If the information they give me may be incorrect and not binding, then what’s the point?

        • ProletarianDictator [none/use name]
          ·
          4 months ago

          then what’s the point?

wooing investors who bust a nut at the idea of inserting LLMs into shit that doesn't need them.