A chatbot used by Air Canada hallucinated an inaccurate bereavement discount policy, and a customer who flew based on this information sued the company over being misled. The small claims court sided with the customer, finding that the chatbot was acting as an official agent of Air Canada and that there was no reason a customer should have to double-check information from one part of the Air Canada website against other parts of the same site.

  • Infamousblt [any]
    hexbear
    48
    4 months ago

    This was inevitable and will instantly kill AI chatbots. I tried to explain this to my marketing team when they were all excited about an AI chatbot on our company website. Wonder if this will change their tune.

    • 420blazeit69 [he/him]
      hexbear
      8
      4 months ago

      This case may have cost the airline a few grand. Sure, you'll get a few losses like this (and many more situations where the customer just eats it), but if the cost savings of the chatbot are big enough...

  • carpoftruth [any, any]M
    hexbear
    42
    4 months ago

    According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot's misleading information because Air Canada essentially argued that "the chatbot is a separate legal entity that is responsible for its own actions," a court order said.

    Prepare for more of that, applied to weaponized drones

  • Frank [he/him, he/him]
    hexbear
    37
    4 months ago

    It's not a "hallucination," you dorks, it's a random number generator that you used to replace labor.

    • sexywheat [none/use name]
      hexbear
      10
      4 months ago

      Yeah, why is it a "hallucination" when the AI just makes shit up, but when a person does it they're either lying or just plain wrong?

      • Frank [he/him, he/him]
        hexbear
        10
        4 months ago

        I didn't lie on my tax returns, it was a hallucination due to poorly curated training material.

      • SerLava [he/him]
        hexbear
        5
        4 months ago

        Because the person knows and the AI is dumb as dirt

    • SerLava [he/him]
      hexbear
      7
      4 months ago

      I like to call it a hallucination because, while yes, the thing isn't smart enough to experience thoughts, it really gets at how absolutely unreliable these things are. People are talking to the thing and taking it seriously, and it's just watching the pink dragons circle around.

    • Beaver [he/him]
      hexbear
      3
      4 months ago

      Idea: how about we replace all our typists with a bunch of monkeys? They'll eventually type the right thing!

  • PeeOnYou [he/him]@lemmygrad.ml
    hexbear
    35
    4 months ago

    Air Canada is a bunch of fucking thieves. My partner's parents had a flight cancelled and then rebooked, then cancelled and rebooked again 2 days later. Air Canada tried to send them $300 as compensation, when the laws clearly state that if it's delayed over 9 hours they owe $1000. When her parents disputed the $300 compensation, the airline said it was too late, they had already sent the $300. Now they have to go petition some air board something or other.

    On top of that, when they finally rebooked the flight, they didn't add the extra bag from the original ticket and wouldn't let her dad board the plane, so my partner called the airline and gave them her credit card to add the goddamned bag again. Then, when all was said and done, they had charged both of them for the same bag. When my partner called to dispute this, they said they only see one charge for the bag and tough luck.

    Fucking scam artists.

    • sexywheat [none/use name]
      hexbear
      10
      4 months ago

      IIRC they were rated the worst airline in all of North America a few years back, and it was totally deserved.

      What privatisation of state industries/services does to a mf.

  • PKMKII [none/use name]
    hexbear
    34
    4 months ago

    I remember a state recently passed a law barring lawyers from using AI programs to generate legal documents, and this right here is why. It removes the possibility of lawyers arguing “well, it’s not our fault the document is wrong, the AI did it!”

    • OutrageousHairdo [he/him]
      hexagon
      hexbear
      27
      4 months ago

      I heard about someone doing that from Leonard French. Some old boomer thought the AI could actually search for court cases, ended up getting tricked into citing a bunch of non-existent case law, and got into a lot of trouble.

  • @DamarcusArt@lemmygrad.ml
    hexbear
    21
    4 months ago

    Oh please, I really hope we get more stuff like this. Nothing will kill this fad faster than companies realising they've been swindled by techbros threatening them with FOMO, and that this algorithm bullshit won't actually do anything useful for them.

      • @M500@lemmy.ml
        hexbear
        4
        4 months ago

        Then I’m not going to talk to them. If the information they give me may be incorrect and not binding, then what’s the point?

        • ProletarianDictator [none/use name]
          hexbear
          1
          4 months ago

          then what’s the point?

          wooing investors who bust a nut at the idea of inserting LLMs into shit that doesn't need them.

  • NephewAlphaBravo [he/him]
    hexbear
    17
    4 months ago

    I extremely approve of describing everything AI does as hallucinations, dreaming, etc.