CALT is in pilot phase. Data is preliminary and subject to change.

Canadian AI Legal Tracker

Canadian court cases and legal proceedings involving AI systems.

10 cases tracked

Pending AI & Copyright Ontario
2025 ONSC 6217

Seven major Canadian news organizations sued OpenAI alleging copyright infringement, circumvention of technological protection measures, breach of website terms of service, and unjust enrichment, arising from OpenAI's use of their content to train ChatGPT. Damages sought include statutory damages of CA$20,000 per work and punitive damages.

Active Litigation
Decided Hallucinated Citations Quebec
2025 QCCS 3521

Specter Aviation sought homologation of a Paris arbitral award. Self-represented defendant Laprade filed pleadings containing at least eight fictitious case citations and doctrinal references generated by ChatGPT. First judicial sanction for AI misuse in Quebec litigation.

Judgment
Decided Hallucinated Citations Alberta
2025 ABCA 322

An appeal factum, prepared by a third-party contractor using a large language model, contained seven fabricated case citations. The Court of Appeal held that the lawyer of record bears ultimate responsibility for filed materials regardless of who drafted them. A follow-on decision (2026 ABCA 20) imposed CA$17,550 in personal costs — the first appellate personal costs award for AI hallucination in Canada.

Judgment
Decided AI & Privacy Alberta
2025 ABKB 287

Clearview AI challenged the Alberta Privacy Commissioner's order prohibiting collection of facial biometric data. The court partially sided with Clearview on freedom of expression grounds, finding some PIPA provisions unconstitutional. However, it upheld the finding that mass scraping of images for AI training does not qualify as "publicly available" data under the statute.

Judicial Review
Decided Hallucinated Citations Ontario
2025 ONSC 2766

In an estates matter, counsel's factum cited multiple non-existent cases — one hyperlink returned a 404 error, another redirected to an unrelated decision. The court ordered counsel to show cause why a finding of contempt should not be made. Subsequent hearings found that counsel's admission and corrective steps purged the contempt.

Judgment
Decided Hallucinated Citations Federal
2025 FC 1060

Egyptian refugee claimants' counsel filed a motion containing two non-existent cases generated by Visto.ai, an AI tool for Canadian immigration law. Counsel concealed the AI use over the course of four court directions before admitting it. The court held that undeclared and unverified GenAI output, compounded by concealment, warranted special costs.

Judgment
Appealed AI & Privacy British Columbia
2024 BCSC 2311

Clearview AI sought judicial review of the BC Privacy Commissioner's order requiring it to stop collecting and using facial biometric data of British Columbians scraped from social media. The court held that BC's PIPA applies extraterritorially and that the data was not "publicly available" within the statutory exception. The BCCA unanimously affirmed in February 2026.

Judicial Review
Pending AI & Copyright British Columbia

CanLII alleged that Caseway AI systematically scraped over 120 GB of data, comprising roughly 3.5 million records, from its legal database in breach of terms of use and copyright, using the data to train an AI legal assistant chatbot. Caseway disputed the terms-of-use claim, arguing the underlying court documents are public records.

Active Litigation
Decided AI Liability British Columbia
2024 BCCRT 149

Jake Moffatt consulted Air Canada's website chatbot, which incorrectly stated he could apply retroactively for bereavement fares within 90 days. Air Canada's policy actually barred retroactive applications. The tribunal rejected Air Canada's argument that the chatbot was a "separate legal entity" and found negligent misrepresentation.

Tribunal Ruling
Decided Hallucinated Citations British Columbia
2024 BCSC 285

In a family law dispute over children's travel to China, defendant's counsel cited two non-existent cases generated by ChatGPT. The fabrications were discovered by opposing counsel. Canada's first reported decision addressing AI-hallucinated citations in court filings.

Judgment