Canadian AI Legal Tracker
Canadian court cases and legal proceedings involving AI systems.
10 cases tracked
Seven major Canadian news organizations sued OpenAI alleging copyright infringement, circumvention of technological protection measures, breach of website terms of service, and unjust enrichment, arising from OpenAI's use of their content to train ChatGPT. Damages sought include statutory damages of CA$20,000 per work and punitive damages.
Specter Aviation sought homologation of a Paris arbitral award. Self-represented defendant Laprade filed pleadings containing at least eight fictitious case citations and doctrinal references generated by ChatGPT. This was the first judicial sanction for AI misuse in Quebec litigation.
An appeal factum, prepared by a third-party contractor using a large language model, contained seven fabricated case citations. The Court of Appeal held that the lawyer of record bears ultimate responsibility for filed materials regardless of who drafted them. A follow-on decision (2026 ABCA 20) imposed CA$17,550 in personal costs, the first appellate personal costs award for AI hallucination in Canada.
Clearview AI challenged the Alberta Privacy Commissioner's order prohibiting collection of facial biometric data. The court partially sided with Clearview on freedom of expression grounds, finding some PIPA provisions unconstitutional. However, it upheld the finding that mass scraping of images for AI training does not qualify as "publicly available" data under the statute.
In an estates matter, counsel's factum cited multiple non-existent cases — one hyperlink returned a 404 error, another redirected to an unrelated decision. The court ordered counsel to show cause for contempt. Subsequent hearings found that counsel's admission and corrective steps purged the contempt.
Egyptian refugee claimants' counsel filed a motion containing two non-existent cases generated by Visto.ai, an AI tool for Canadian immigration law. Counsel concealed the AI use across four court directions before admitting it. The court held that the undeclared and unverified GenAI output, compounded by the concealment, warranted special costs.
Clearview AI sought judicial review of the BC Privacy Commissioner's order requiring it to stop collecting and using facial biometric data of British Columbians scraped from social media. The court held that BC's PIPA applies extraterritorially and that the data was not "publicly available" within the statutory exception. The BCCA unanimously affirmed in February 2026.
CanLII alleged that Caseway AI systematically scraped over 120 GB and 3.5 million records from its legal database in breach of terms of use and copyright, using the data to train an AI legal assistant chatbot. Caseway disputed the terms-of-use claim, arguing the underlying court documents are public records.
Jake Moffatt consulted Air Canada's website chatbot, which incorrectly stated he could apply retroactively for bereavement fares within 90 days. Air Canada's policy actually barred retroactive applications. The tribunal rejected Air Canada's argument that the chatbot was a "separate legal entity" and found negligent misrepresentation.
In a family law dispute over children's travel to China, defendant's counsel cited two non-existent cases generated by ChatGPT. The fabrications were discovered by opposing counsel. This was Canada's first reported decision addressing AI-hallucinated citations in court filings.