Air Canada is being held responsible for a discount its chatbot mistakenly promised a customer, the Washington Post reported.
The airline must refund a passenger, Jake Moffatt, who two years ago purchased tickets to attend his grandmother's funeral under the belief that, if he paid full price, he could later file a claim under the airline's bereavement policy and receive a discount, according to a ruling by British Columbia's Civil Resolution Tribunal (CRT).
He didn't invent the idea; rather, a support chatbot he communicated with on Air Canada's website gave him the false information, ultimately costing the airline several hundred dollars. The tribunal's judgment could set a precedent for holding businesses accountable for the information provided by interactive technology tools, including generative artificial intelligence, that they rely on to handle customer service.
In November 2022, Moffatt spent over $700 (CAD), including taxes and additional charges, on a next-day ticket from Vancouver to Toronto. He made the purchase after being told by a support chatbot on Air Canada's website that the airline would partially refund the ticket price under its bereavement policy, as long as he applied for the money back within 90 days, the tribunal document shows. Moffatt also spent more than $700 (CAD) on a return flight a few days later, money he claimed he wouldn't have spent had he not been promised a discount at a later date.
But the information he received from the Air Canada chatbot was erroneous. Under the airline's bereavement travel policy, customers must request discounted bereavement fares before they travel, the airline told the tribunal. "Bereavement policy does not allow refunds for travel that has already happened. Our policy is designed to offer maximum flexibility on your upcoming travel during this difficult time," the airline states on its site.
Moffatt subsequently applied for a partial refund for the total cost of his trip within the 90 days of purchase specified by the chatbot, providing the required documentation, including his grandmother's death certificate, according to his claim.
After ongoing correspondence between Moffatt and Air Canada by phone and email, the airline informed him that the chatbot had been mistaken and did not grant him a refund, the tribunal document shows. Moffatt then filed a claim with the CRT for $880 (CAD), the amount he understood to be the difference between the regular fare and the bereavement fare he believed he was owed.
Before the tribunal, the airline tried to disclaim responsibility, calling the chatbot "a separate legal entity that is responsible for its own actions."
The airline also argued that an accurate version of its policy was always represented on its website.
Tribunal member Christopher Rivers determined that it's incumbent upon the company "to take reasonable care to ensure their representations are accurate and not misleading" and that Air Canada failed to do so, the decision shows.
"While a chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website," he said in his decision. "It makes no difference whether the information comes from a static page or a chatbot."
While the airline claimed the customer could have referred to the bereavement travel policy page containing correct information, Rivers said it isn't the customer's responsibility to distinguish between accurate and inaccurate information included on a business's website.
The airline owes Moffatt $812 (CAD) in damages and tribunal fees, the CRT ruled.
Megan Cerullo is a New York-based reporter for CBS MoneyWatch covering small business, workplace, health care, consumer spending and personal finance topics. She regularly appears on CBS News Streaming to discuss her reporting.