Decided · Tribunal Ruling · AI Liability · British Columbia
Jake Moffatt consulted Air Canada's website chatbot, which incorrectly stated he could apply retroactively for bereavement fares within 90 days. Air Canada's policy actually barred retroactive applications. The tribunal rejected Air Canada's argument that the chatbot was a "separate legal entity" and found negligent misrepresentation.
Court British Columbia Civil Resolution Tribunal
Citation 2024 BCCRT 149
Jurisdiction British Columbia
Decided February 14, 2024
Parties Plaintiff / Applicant: Jake Moffatt
Defendant / Respondent: Air Canada
Judge Tribunal Member Christopher C. Rivers

AI context

AI system: Air Canada website chatbot

Air Canada deployed a chatbot on its website to handle customer inquiries. The chatbot generated an inaccurate statement about the airline's bereavement fare policy, and the customer reasonably relied on it when booking.

Significance

First Canadian ruling holding a corporation liable for its AI chatbot's misrepresentations. Established that companies cannot disclaim responsibility by treating a chatbot as a separate entity.

Outcome

Other

The tribunal awarded Moffatt CA$812.02 for negligent misrepresentation, holding Air Canada liable for its chatbot's inaccurate advice about bereavement fares.

Sources

View on CanLII →