Air Canada Held Liable for Misleading Advice from Customer Service Chatbot

Air Canada has been ordered to pay damages to a passenger after its customer service chatbot provided misleading advice, resulting in the passenger paying significantly more for their plane tickets. The case involved Jake Moffatt, who booked a flight with Air Canada expecting to receive the airline’s bereavement rates. When Moffatt asked the chatbot about these rates, it stated that customers could apply for them retroactively, after completing their travel. However, the page linked in the chatbot’s response contradicted this statement.

Moffatt later spoke to a human customer service representative who confirmed that he would receive a discount on his flight. Based on this information, Moffatt booked his flights. It was only after submitting an application for a partial refund that he discovered the chatbot’s information was misleading. Air Canada initially denied all of Moffatt’s claims, arguing that it couldn’t be held liable for information provided by its chatbot.

The case eventually reached the Civil Resolution Tribunal (CRT), where tribunal member Christopher C. Rivers ruled in favor of Moffatt. Rivers rejected Air Canada’s argument that the chatbot was a separate legal entity responsible for its own actions, and emphasized that Air Canada is ultimately responsible for all the information on its website, whether it comes from a static page or from a chatbot.

Rivers concluded that Air Canada did not take reasonable care to ensure the accuracy of its chatbot and did not explain why customers should have to double-check information found on one part of its website against another. As a result, Air Canada was ordered to compensate Moffatt for the additional cost he incurred because of the misleading advice.

This ruling underscores that companies are accountable for the actions of their chatbots and that automated customer service systems must provide accurate, reliable information. As the tribunal’s decision demonstrates, chatbot mistakes cannot be used as an excuse to evade responsibility.

Frequently Asked Questions (FAQ):

Q: What was the case involving Air Canada about?
A: The case involved a passenger named Jake Moffatt, who booked a flight with Air Canada expecting to receive the airline’s bereavement rates. The airline’s chatbot provided misleading advice about these rates, resulting in Moffatt paying more for his tickets.

Q: What did the chatbot say about the bereavement rates?
A: The chatbot stated that customers could apply for bereavement rates retroactively, after completing their travel. However, the page linked in the chatbot’s response contradicted this statement.

Q: Did Moffatt confirm the discount with a human customer service representative?
A: Yes, Moffatt later spoke to a human customer service representative who confirmed that he would receive a discount on his flight based on the bereavement rates.

Q: How did Moffatt discover that the chatbot’s information was misleading?
A: Moffatt discovered the misleading information only after submitting an application for a partial refund.

Q: What was Air Canada’s initial response to Moffatt’s claims?
A: Air Canada initially denied all of Moffatt’s claims, arguing that it couldn’t be held liable for information provided by its chatbot.

Q: What did the Civil Resolution Tribunal (CRT) rule in this case?
A: The CRT ruled in favor of Moffatt. Tribunal member Christopher C. Rivers stated that Air Canada is ultimately responsible for all the information on its website, including information provided by its chatbot, and rejected the argument that the chatbot was a separate legal entity responsible for its own actions.

Q: What was the outcome for Air Canada?
A: Air Canada was ordered to compensate Moffatt for the additional cost he incurred due to the misleading advice provided by the chatbot.

Key Terms/Jargon Definitions:

1. Bereavement rates: Special discounted rates offered by airlines for passengers traveling due to the death or imminent death of a family member.
2. Chatbot: An automated computer program designed to simulate conversation with human users, often used in customer service to provide information or assistance.

Suggested Related Links:

1. Air Canada – Official Website: The official website of Air Canada where you can find information about flights, services, and more.
2. Civil Resolution Tribunal: The official website of the Civil Resolution Tribunal, the legal body mentioned in the article, which provides an alternative dispute resolution process for certain types of claims in British Columbia, Canada.

Source: the blog girabetim.com.br
