BC Tribunal Confirms Companies Remain Liable for Information Provided by AI Chatbot

By: Lisa R. Lifshitz, Roland Hung

On February 14, 2024, the British Columbia Civil Resolution Tribunal (the “Tribunal”) found Air Canada liable for misinformation given to a consumer through the use of an artificial intelligence chatbot (“AI chatbot”).[1]

The decision, Moffatt v. Air Canada, generated international headlines, with reports spanning from the Washington Post in the United States to the BBC in the United Kingdom.[2] While AI offers economic and functional benefits, the decision makes clear that companies remain liable for inaccurate information provided to consumers through an AI tool.

Background

AI chatbots are automated programs that use AI and related technologies, such as natural language processing, to simulate conversation and provide information in response to a person’s prompts and inputs. Familiar virtual assistants such as Alexa and Siri are examples of AI chatbots.[3]

Increasingly, AI chatbots are used in commerce. According to a 2024 report from AIMultiple Research,[4] AI chatbots save organizations around $0.70 USD per interaction. Chatbot industry revenue is projected to reach around $1.3 billion USD by 2025, and roughly half of all large companies are considering investing in these tools. Air Canada’s AI chatbot is one example of their use in a commercial setting. However, as the Tribunal’s decision shows, these tools do not come without risks.

The Tribunal’s Decision in Moffatt

The Tribunal’s decision came after a complaint was made by Jake Moffatt. Moffatt wanted to purchase an Air Canada plane ticket to fly to Ontario, where his grandmother had recently passed away. On the airline’s website, Moffatt engaged with an AI chatbot, which responded that Air Canada offered reduced bereavement fares for customers traveling because of a death in the family. According to the chatbot, anyone seeking the reduced fare could submit their ticket through an online form within ninety days of issuance and receive the lower bereavement rate.[5]

Unfortunately, the AI chatbot’s answer was incorrect. The reference to “bereavement fares” was hyperlinked to a separate Air Canada webpage titled “Bereavement travel” that contained additional information regarding Air Canada’s bereavement policy. The webpage indicated that the bereavement policy does not apply to requests for bereavement consideration after travel was completed. Accordingly, when Moffatt submitted his application to receive a partial refund of his fare, Air Canada refused. After a series of interactions, Air Canada admitted that the chatbot had provided “misleading words.” The representative pointed out the chatbot’s link to the bereavement travel webpage and said Air Canada had noted the issue so it could update the chatbot.

Moffatt then sued Air Canada for having relied on its chatbot’s answer, a claim the Tribunal characterized as one of negligent misrepresentation. Air Canada responded that the correct information could have been found elsewhere on its website and argued that it could not be liable for the AI chatbot’s responses.[6] Remarkably, Air Canada went so far as to argue that the chatbot was a separate legal entity responsible for its own actions.

The Tribunal ultimately found in favor of Moffatt. While a chatbot has an interactive component, the Tribunal found that the program was simply part of Air Canada’s website, and Air Canada remained responsible for all the information on its website, whether it came from a static page or a chatbot. As a service provider, Air Canada owed Moffatt a duty of care, which it breached through the misrepresentation. Air Canada could not distance itself from the AI chatbot, which was integrated into its own website. The Tribunal found negligence because Air Canada did not take reasonable care to ensure that its chatbot provided accurate information. It did not matter that the correct information existed elsewhere: a consumer cannot be expected to double-check information found on one part of a website against another.[7]

The Tribunal awarded Moffatt approximately $650 CAD in damages, plus pre-judgment interest and Tribunal filing fees.

Takeaways

While admittedly not a court decision, the Tribunal’s ruling in Moffatt serves as a helpful reminder that companies remain liable for the actions of their AI tools. Additionally, any company that intends to use AI tools should ensure that it puts in place adequate internal policies to protect consumer privacy, warn consumers of the tools’ limitations, and train the AI system to deliver accurate results.


  1. Moffatt v. Air Canada, 2024 BCCRT 149.

  2. Kyle Melnick, “Air Canada chatbot promised a discount. Now the airline has to pay it,” Washington Post, February 18, 2024; Maria Yagoda, “Airline held liable for its chatbot giving passenger bad advice – what this means for travellers,” BBC, February 23, 2024.

  3. “What is a chatbot?,” IBM, accessed February 27, 2024.

  4. Cem Dilmegani, “90+ Chatbot/Conversational AI Statistics in 2024,” AIMultiple, last modified February 5, 2024.

  5. Moffatt, supra note 1, at paras. 13–16.

  6. Id. at paras. 18–25.

  7. Id. at paras. 26–32.
