
AI in Alberta Litigation: Courts Demand Human Oversight and Accountability

By HMC Lawyers LLP

Artificial intelligence (AI) is rapidly transforming legal practice and the broader business environment in Alberta. As AI tools become more prevalent, Alberta’s Courts are grappling with new challenges, including the use of generative AI in legal submissions that reference non-existent case law.


In the past year, Alberta’s Courts have issued clear guidance about the use of artificial intelligence (AI), especially generative AI, in litigation. Two appellate decisions stand out for their direct treatment of AI-generated submissions and the responsibilities of both lawyers and self-represented litigants.

The Lawyer on Record is Ultimately Responsible for AI Misuse

In Reddy v Saroya, 2025 ABCA 322, the Alberta Court of Appeal addressed the professional risks of using generative AI to prepare written submissions. The appellant filed their initial factum on December 16, 2024; the respondent's factum, filed on January 16, 2025, identified a potential AI issue. The appellant's factum contained several citations to cases that could not be found. When contacted, appellant's counsel maintained that the cases existed but that there were errors in the citations.

At the oral hearing, the appellant’s counsel explained that illness, a busy schedule, and the holiday season contributed to their failure to properly review the original factum and recognize the issue with the cited cases. The factum was drafted by a third-party contractor who claimed not to have used generative AI.

The Court referenced Rule 3.1-2 of the Law Society of Alberta's Code of Conduct, which requires lawyers to perform all legal services to the standard of a competent lawyer and to develop an understanding of relevant technology. The Law Society of Alberta has published "The Generative AI Playbook", a resource described as a starting point for Alberta lawyers seeking to harness the benefits of disruptive technologies, such as generative AI, while safeguarding their clients' interests and maintaining professional competence. The Court emphasized that when AI is used without safeguards, it can introduce confusion and delay into proceedings and may constitute an abuse of process that could bring the administration of justice into disrepute.

The Alberta Courts issued a Notice to the Public and Legal Profession dated October 6, 2023, titled "Ensuring the Integrity of Court Submissions When Using Large Language Models" (the "October 2023 Notice"). The Court of Appeal re-emphasized the principles of this notice: parties must rely on authoritative sources (such as official court websites, commonly referenced commercial publishers, or well-established public services like CanLII) when referring to case law, and must keep a "human in the loop" by having a person verify any work produced by generative AI. Parties are expected to budget the time needed to cross-reference the output of AI. If a third-party contractor is engaged to assist with drafting, the lawyer on record bears ultimate responsibility for the material's form and contents and for compliance with the October 2023 Notice.

The consequences of non-compliance with the October 2023 Notice are at the discretion of the panel or judge. Counsel and self-represented litigants should not expect leniency; possible remedies include striking the offending submissions or awarding costs against the non-compliant party.

Leniency is Not Afforded to Self-represented Litigants Who Misuse AI

In DJ v SN, 2025 ABCA 383, the Alberta Court of Appeal sanctioned a self-represented litigant who used AI-generated authorities in their factum. The Court awarded costs of $500.00 against the litigant and warned that more substantial penalties could be imposed in future cases for failing to comply with the October 2023 Notice on the use of generative AI.

Key Takeaways

  • Lawyers and self-represented litigants must verify the output of any generative AI used when preparing their written submissions.
  • The lawyer on record is ultimately responsible for the misuse of AI, even when using a third-party contractor for drafting.
  • Time constraints, illness, or a busy schedule does not excuse a failure to verify citations.
  • The Court is considering enhanced costs against counsel personally for failing to confirm the accuracy of cited authorities, and may impose further sanctions for future breaches.

Contact HMC Lawyers for Legal Guidance

At HMC Lawyers, we are well-versed in the Courts' technical requirements and offer strategic legal advice. Our breadth of practice experience allows us to promptly handle almost any litigation-related issue and to anticipate potential roadblocks that may delay its resolution. To make an appointment with a member of our team, contact us online or call 403-269-7220.

For articling and/or summer student inquiries please contact either Kristen Hagg or Praveen Thind by calling 403.269.7220 or emailing them directly at joinus@hmclawyers.com.

HMC Lawyers LLP
#1000 903 8th Ave SW
Calgary, AB
T2P 0P7
