Air Canada Ordered to Compensate Customer Misled by Chatbot

Air Canada has been ordered to compensate a customer who said they were misled into buying full-price flight tickets by incorrect guidance from the airline’s chatbot. In a novel argument, the airline attempted to distance itself from the chatbot’s erroneous advice by asserting that the online tool was a “separate legal entity” responsible for its own actions. British Columbia’s Civil Resolution Tribunal (CRT) rejected that argument, holding that Air Canada is responsible for all the information on its website, whether it comes from a static page or from a chatbot.

In its ruling, the CRT ordered Air Canada to pay the customer, Jake Moffatt, CA$812.02, comprising CA$650.88 in damages for the difference between the full-fare tickets they purchased after their grandmother’s death and the airline’s discounted bereavement fares, plus pre-judgment interest and tribunal fees. According to Moffatt, they booked the full-fare tickets on the strength of the chatbot’s assurance that they could apply for the reduced bereavement rate retroactively, within 90 days of the tickets being issued. When they later requested the partial refund, however, Air Canada told them that bereavement rates could not be applied to completed travel.

The CRT also rejected Air Canada’s claim that it could not be held liable for the chatbot’s advice, finding that the airline had not taken reasonable care to ensure its chatbot was accurate. Air Canada argued that Moffatt could have found the correct bereavement-fare information elsewhere on its website, but the CRT noted that there was no reason for Moffatt to assume that one part of the website would be accurate and the chatbot would not.

Cases turning on misleading information from a chatbot remain rare; a search of the Canadian Legal Information Institute’s database suggests this is among the first of its kind in Canada. Even so, the ruling is a reminder that companies must ensure the accuracy of all the information their online tools provide. Airlines in particular should take steps to prevent misleading guidance that could cause their customers financial losses.

Frequently Asked Questions (FAQs):

1. What happened in the case involving Air Canada and the customer, Jake Moffatt?
Air Canada was ordered to compensate Jake Moffatt, who said the airline’s chatbot misled them into buying full-price flight tickets. The Civil Resolution Tribunal (CRT) rejected Air Canada’s argument that the chatbot was a separate legal entity, holding that the airline is responsible for all the information on its website.

2. How much compensation did the CRT order Air Canada to pay Moffatt?
The CRT ordered Air Canada to pay Jake Moffatt CA$812.02, comprising CA$650.88 in damages for the difference between the full-price tickets they purchased and the airline’s bereavement fares, plus interest and tribunal fees.

3. Why did Moffatt book the full-fare tickets?
Moffatt booked the full-fare tickets on the strength of the chatbot’s assurance that they could claim the reduced bereavement rate retroactively.

4. Why did Air Canada refuse to refund Moffatt?
When Moffatt later asked to have the bereavement rate applied and the difference refunded, Air Canada informed them that bereavement rates could not be applied retroactively to completed travel.

5. Did Air Canada argue that it couldn’t be held liable for the chatbot’s advice?
Yes. Air Canada claimed it could not be held liable for the chatbot’s advice, but the CRT disagreed, finding that the airline had not taken reasonable care to ensure the accuracy of the chatbot’s information.

Definitions:

– Chatbot: An online tool that uses artificial intelligence to simulate conversation with human users, providing information or assistance.

Suggested Related Links:

https://www.aircanada.com
https://www.canlii.org
