Air Canada Held Liable for Inaccurate Chatbot Advice, Ordered to Compensate Customer

Air Canada has been ordered by British Columbia's Civil Resolution Tribunal to compensate a B.C. man over inaccurate information provided by the airline's chatbot. The tribunal's decision, posted online on Wednesday, found in favor of the man, who relied on the bot's advice when booking a flight to attend his grandmother's funeral. Tribunal member Christopher C. Rivers found that Air Canada did not take reasonable care to ensure its chatbot was accurate and awarded $650.88 in damages for negligent misrepresentation.

The man, Jake Moffatt, had inquired about the airline’s bereavement rates, which are reduced fares provided to individuals who need to travel due to the death of an immediate family member. The chatbot had purportedly informed Moffatt that he could claim these fares retroactively by completing a refund application within 90 days of ticket issuance. However, when Moffatt submitted his request with his grandmother’s death certificate, the application was denied, and his subsequent attempts to receive a partial refund were met with further resistance.

Air Canada argued that it could not be held liable for the chatbot’s information, claiming that the bot was a separate legal entity responsible for its own actions. However, Rivers dismissed this argument, emphasizing that the airline is ultimately responsible for all the information on its website.

The compensation awarded by the tribunal covered the difference between the amount Moffatt paid for his flight and the discounted bereavement fare. In addition, Air Canada was ordered to pay pre-judgment interest and fees.

This ruling highlights the importance of maintaining accurate and reliable chatbot services. It reminds companies that they cannot absolve themselves of responsibility by attributing errors to automated systems. Customers should be able to trust information published on a company's website, whether it comes from a chatbot or a static page, without having to cross-check one part of the site against another. By holding Air Canada accountable for its chatbot's misleading advice, the decision underscores the need for accuracy and transparency in AI-powered customer interactions.

FAQ:
1. What happened to Air Canada regarding its chatbot?
Air Canada was ordered by the Civil Resolution Tribunal to compensate a man from British Columbia due to inaccurate information provided by its chatbot.

2. What was the reason for the compensation?
The man relied on the chatbot’s information when booking a flight to attend his grandmother’s funeral, but the information turned out to be incorrect. The compensation was awarded for negligent misrepresentation.

3. What are bereavement rates?
Bereavement rates are reduced fares provided to individuals who need to travel due to the death of an immediate family member.

4. How did the chatbot misinform the man?
The chatbot informed the man that he could claim bereavement fares retroactively by completing a refund application within 90 days of ticket issuance, but his application was denied when he submitted it with his grandmother’s death certificate.

5. Did Air Canada accept responsibility for the chatbot’s actions?
Air Canada argued that it could not be held liable for the chatbot’s information and claimed that the bot was a separate legal entity. However, this argument was dismissed, and the airline was held responsible for all the information on its website.

6. What was the compensation awarded?
The compensation covered the difference between the amount paid for the flight and the discounted bereavement fare. Air Canada was also ordered to pay pre-judgment interest and fees.

Key Terms:
– Chatbot: A computer program designed to simulate conversation with human users.
– Bereavement rates: Reduced fares provided to individuals who need to travel due to the death of an immediate family member.
– Negligent misrepresentation: A legal claim that arises when a party carelessly provides inaccurate information that another party reasonably relies on to their detriment.


Source: shakirabrasil.info
