In August 2025, the Office of the Information and Privacy Commissioner of Alberta (the “OIPC”) published a report (the “Report”) recommending a legal framework for the use of artificial intelligence (“AI”) in Alberta. To date, no Canadian province, nor the federal government, has enacted AI-specific legislation. The only exceptions to this regulatory gap are:

  1. the specific requirements regarding individual rights where personal information is subject to ‘automated decision-making’ under Quebec’s private sector act; and
  2. the transparency obligations where AI is used in publicly advertised job postings under Ontario’s Working for Workers Four Act, 2024 (Bill 149), which will come into force in 2026.

The Report suggests that a standalone provincial statute in Alberta, complementing modernized privacy laws, could position the province as a leader in responsible AI governance.

This article answers common business questions about what the OIPC recommends, why it matters, and how organizations should prepare.

Why does Alberta need an AI law?

The OIPC argues that a dedicated law is needed to manage risks such as biased automated decision-making, while also enabling benefits such as improving the quality of public services and enhancing the delivery of health care. To strike the right balance between protecting privacy and realizing these benefits, there must be a legal framework that facilitates the development and use of AI while protecting the public.

For businesses, a clear legal framework would reduce uncertainty, establish compliance expectations, and create opportunities to compete in global markets where AI rules are already emerging (e.g., the European Union’s Artificial Intelligence Act (the “EU AI Act”)).

What would a standalone AI law include?

The OIPC recommends that Alberta’s existing privacy laws – the Protection of Privacy Act (“POPA”), the Personal Information Protection Act (“PIPA”), and the Health Information Act (“HIA”) – be complemented by a standalone AI law aimed at preventing and reducing harm to Albertans under provincial jurisdiction. To be effective, the standalone AI law should be broad in scope, align with other privacy legislation, and support Alberta’s broader digital strategy while safeguarding the rights of Albertans.

In terms of content, the OIPC recommends that Alberta’s AI law aligns with global frameworks such as the EU AI Act. Key elements would include:

  • safety and security requirements for AI systems;
  • transparency and explainability for users and regulators;
  • traceability and accountability rules clarifying developer versus deployer obligations;
  • non-discrimination and anti-bias safeguards;
  • human oversight of high-risk AI decisions; and
  • privacy by design, prioritizing anonymized or synthetic data in training.

The Report clarifies that while conformance with these elements is important, a standalone AI law could still be customized to reflect the unique context of Alberta’s values and industries, and provide a measure of provincial control in areas beyond federal jurisdiction, including health services, the public sector, and provincial commerce.

However, the OIPC also states that AI legislation alone is not sufficient to regulate all the impacts of AI; it must work in conjunction with other laws, including privacy legislation.

How should Alberta’s privacy laws be modernized?

The OIPC stresses that updating POPA, PIPA, and HIA is essential, regardless of whether a standalone AI law is enacted.

The OIPC previously issued comments and recommendations during the review of the Freedom of Information and Protection of Privacy Act (“FOIP”) – which was repealed on June 11, 2025, and replaced with two acts, the Access to Information Act and POPA – and during the review of PIPA. These recommendations, which focused on codifying privacy rights to protect Albertans from harmful uses of their information, were reiterated in the Report.

First, on March 4, 2024, the OIPC submitted to the Department of Technology and Innovation a document outlining changes to FOIP required to support the adoption of AI in the public sector. These included legislating authorized purposes for collection, use, and disclosure of personal information in AI systems, and codifying rights to ensure fair and privacy-respecting operation of those systems.

Second, as part of its review of PIPA, the OIPC issued recommendations to the Standing Committee on Resource Stewardship that specifically addressed concerns about the use of automated decision-making in the private sector.

Proposed updates include:

  • granting individuals the right to be notified when AI systems are used in decisions affecting them, prior to the decision being made;
  • allowing individuals to contest automated decisions;
  • requiring organizations to publish plain-language explanations of how AI is used;
  • authorizing the Commissioner to audit and suspend harmful AI systems; and
  • defining categories of data (anonymized, synthetic, pseudonymized, personal) and regulating their use in AI.

What should organizations do now?

During this period of uncertainty regarding the use and development of AI, and in anticipation of potential provincial or federal legislation, organizations should:

  • follow existing privacy guidance from federal and provincial privacy commissioners;
  • monitor developments in the EU and other jurisdictions as laws such as the EU AI Act are implemented over the next 12 to 18 months;
  • adopt privacy-by-design principles in AI systems;
  • develop internal policies on accountability, transparency, oversight, and compliance with existing laws; and
  • monitor legislative updates to prepare for rapid compliance once new laws are introduced.

Conclusion

The Report marks a potentially pivotal moment in Alberta’s approach to AI regulation. As the OIPC notes, the absence of adequate privacy and AI laws creates risks such as unfair or biased automated decision-making. Addressing these gaps through the provincial regulation of AI can build trust, give Alberta-based entities confidence to adopt AI, and strengthen the province’s position in the digital economy.

In the interim, the OIPC recommends measures that can be undertaken by organizations in Alberta’s private, health, and public sectors to protect the public from potential harms of AI, including following guidance already published by federal, provincial, and territorial privacy commissioners.

If your organization is developing or deploying AI systems in Alberta, our Technology, Intellectual Property, and Privacy Group can help you assess risks, build compliance strategies, and anticipate legislative change. Contact our AI team today to discuss your organization’s needs.

