Smart Error: Air Canada compensates passenger for chatbot's incorrect answers

Mondo Technology Updated on 2024-02-21

The focus of the incident was the experience of passenger Jake Moffatt, who flew from Vancouver to Toronto in 2022 following the death of his grandmother. While trying to book a ticket through Air Canada, Moffatt asked the company's chatbot about bereavement fares. According to court documents, the chatbot replied that Air Canada offered bereavement fare discounts and included a hyperlink to the airline's policy. Moffatt did not click the link, however; the policy it pointed to contradicted the chatbot, stating that customers could not apply for a bereavement fare after completing their travel.

Moffatt also called Air Canada on the same day he spoke with the chatbot to ask about the size of the possible discount. He said a human customer service representative told him he would receive a discount of about 440 Canadian dollars (about $326), without mentioning that the discount could not be applied retroactively. Relying on the information from the chatbot and the representative, Moffatt booked his flights.

A few days later, Moffatt submitted a request for a partial refund totaling 1,630 Canadian dollars (about $1,210). After weeks of back-and-forth with the airline, he sent Air Canada screenshots of the chatbot's response in February 2023. In reply, a human customer service representative acknowledged that the chatbot's advice had been "misleading" and said the issue would be documented so that Air Canada could update the chatbot.

Moffatt's dispute with Air Canada eventually went to the Civil Resolution Tribunal (CRT), a quasi-judicial body that handles civil disputes such as small claims. Moffatt represented himself, while Air Canada was represented by an employee.

In its defense, Air Canada denied all of Moffatt's allegations and argued that it could not be held responsible for information provided by its agents, servants, representatives, or chatbot, an argument that puzzled tribunal member Christopher C. Rivers. In a decision released this week, Rivers said Air Canada's suggestion that its chatbot is "an independent legal entity responsible for its actions" made no sense.

Rivers added that Air Canada had not taken reasonable care to ensure its chatbot was accurate. In the end, he ordered Air Canada to pay Moffatt the refund he had been pursuing for almost a year and a half.

The case sends large companies a clear message: the excuse that "it was my chatbot, not me" will not hold up in court.
