Business

Air Canada Held Accountable for Misinformation Created by Its Chatbot

Published February 16, 2024

A recent tribunal ruling has set a significant example for corporate accountability by finding Air Canada responsible for misleading information provided by its artificial intelligence-powered chatbot. The decision is a pointed reminder for businesses that use advanced technology to interact with customers. The case arose when the chatbot told a customer, Jake Moffatt, that he could obtain a bereavement fare discount after purchasing a full-priced ticket. Air Canada later disputed the claim, stating that the fare had to be applied for before the trip.

Corporate Responsibility for AI Missteps

Tribunal member Christopher Rivers wrote that Air Canada is self-evidently responsible for the content on its website, including information provided by its chatbot. The airline failed to show due diligence in ensuring the accuracy of the bot's responses, which gave rise to the dispute. Air Canada has accepted the ruling and says it considers the matter resolved, but declined to comment further.

Implications of the Tribunal's Decision

Although the dispute involved a modest sum of about $650, it highlights the broader pitfalls companies face as they increasingly rely on AI for customer service. Legal professionals stress that companies must manage AI tools carefully. Lawyer Ira Parghi suggested that companies keep AI away from areas too complex for it to handle reliably, reducing the risk of liability. Existing laws can address many AI-related gaps, and upcoming federal legislation may fill additional voids. In the meantime, companies should clearly disclose when their services are AI-driven and thoroughly test their systems before public deployment to limit legal and regulatory risk. In the U.S., regulators have warned about chatbot-related problems such as the erosion of customer trust and the spread of misinformation.

Future of AI and Company Accountability

The Air Canada case represents only one of many scenarios that can arise from AI errors. Legal experts anticipate further judicial decisions before the full extent of company liability for AI miscommunication becomes clear. The B.C. Civil Resolution Tribunal's ruling also underscores the platform's effectiveness as a venue for consumers to seek redress. Notably, the judgment criticized Air Canada for generic denials that lacked substantive evidence to counter Moffatt's claim.

Air Canada, chatbot, liability