Government’s latest attempt at privacy legislation reform includes AI regulation

The Liberal government has introduced its second attempt at overhauling the country’s privacy laws covering the business sector.

Innovation Minister François-Philippe Champagne introduced Bill C-27 in the House of Commons this morning. Details of the legislation weren’t immediately available, so it wasn’t clear how it would differ from the proposed legislation the government tabled in 2020, but it comes in three parts:

  • the Consumer Privacy Protection Act 2022 (CPPA), which carries the same name as the previous legislation. In a briefing with reporters, a government official suggested it is largely the same as the first version, with three changes, one being added protection for children (see below);
  • the Personal Information and Data Protection Tribunal Act, which, as in the previous bill, would create a tribunal with final say over fines proposed by the federal Privacy Commissioner for violations of the CPPA;
  • and the new Artificial Intelligence and Data Act, obliging businesses deploying “high impact” AI technologies to use them responsibly. An AI and Data Commissioner would enforce regulations still to be set.

UPDATE: The full legislation is here.

UPDATE: In a technical briefing an official said there are five main changes in the new version of the CPPA compared to the earlier one:

  • inclusion of clearer rules around the exceptions to the meaningful consent businesses must obtain for collecting personal data;
  • inclusion of definitions of de-identified and anonymized personal data and clarification of how businesses have to treat that data;
  • inclusion of increased responsibilities for firms collecting data of children. Businesses would be limited in their ability to collect or use information on minors. The law would also hold organizations to a higher standard when handling minors’ information than handling the data of adults.
    Minors — or their parents or guardians — would have the right to understand how their personal data is being used, and the right to demand a business delete that data.
  • and expansion of the Privacy Commissioner’s discretion to launch investigations of possible CPPA violations.

Organizations will still have to use plain language in explaining to users what personal data is collected and how it will be used.

In response to one complaint, the changes also specify some members of the Personal Information and Data Protection Tribunal must have privacy law experience.

In a news release, Champagne said the new proposed CPPA will ensure that the privacy of Canadians will be protected and that innovative businesses can benefit from clear rules as technology continues to evolve. This includes:

  • increasing control and transparency when Canadians’ personal information is handled by organizations;
  • giving Canadians the freedom to move their information from one organization to another in a secure manner;
  • ensuring that Canadians can request that their information be disposed of when it is no longer needed;
  • providing the Privacy Commissioner of Canada with broad order-making powers, including the ability to order a company to stop collecting data or using personal information; and
  • establishing significant fines for non-compliant organizations—with fines of up to five per cent of global revenue or $25 million, whichever is greater, for the most serious offences.

The government says the proposed Artificial Intelligence and Data Act (AIDA) will introduce new rules to strengthen Canadians’ trust in the development and deployment of AI systems, including:

  • protecting Canadians by ensuring high-impact AI systems are developed and deployed in a way that identifies, assesses and mitigates the risks of harm and bias;
  • establishing an AI and Data Commissioner to support the Minister of Innovation, Science and Industry in fulfilling ministerial responsibilities under the Act, including by monitoring company compliance, ordering third-party audits, and sharing information with other regulators and enforcers as appropriate; and
  • outlining clear criminal prohibitions and penalties regarding the use of data obtained unlawfully for AI development, the reckless deployment of AI where it poses serious harm, and cases where there is fraudulent intent to cause substantial economic loss through its deployment.

In a briefing for reporters, a government official said AIDA “is a principles-based law that lays out key requirements that companies must follow to demonstrate they are developing and deploying artificial intelligence responsibly.

“The government will define in regulations classes of AI systems that have a high impact on people. The core obligations will require organizations to assess their AI systems against the criteria for high-impact systems. Organizations will have to document their rationale, and proactively identify and mitigate risks in the development and deployment of high-impact systems. This includes risks to human health, safety and biases in systems.”

They will also have to report general information to the public about their AI systems.

The legislation is an attempt to deal with worries that AI systems will be used to identify and discriminate against women, minorities, persons with disabilities and others.

The proposed laws would apply only to federally regulated industries, such as the financial, telecom, interprovincial transportation, and interprovincial energy sectors, and to businesses in provinces and territories that don’t have their own private-sector privacy laws. These include Ontario, Manitoba, New Brunswick, Nova Scotia, Newfoundland and Labrador, Prince Edward Island, Saskatchewan, Yukon, Nunavut and the Northwest Territories.

The goal of private-sector privacy law reform is to update the Personal Information Protection and Electronic Documents Act (PIPEDA). It needs to be overhauled largely to meet the European Union’s demand that countries whose businesses collect the personal information of EU residents have legislation, or agreements with the EU, substantially similar to its General Data Protection Regulation (GDPR).

Former privacy commissioner Daniel Therrien criticized the original CPPA (at the time called Bill C-11) for allowing businesses to collect or use an individual’s personal information without their knowledge or consent under certain circumstances. He also saw no need for the creation of a tribunal as an extra step to review any proposed penalties. And he complained that the bill didn’t clearly state that Canadian residents have a right to privacy.

C-11 died when the government called the 2021 election.

In the runup to its second attempt at overhauling PIPEDA, the private sector has been pressuring the government to keep the CPPA and not to make it too close to the GDPR.

Initial reaction to the proposed laws has been favourable. Former Ontario privacy commissioner Ann Cavoukian said she is “especially pleased” to see the AI legislation, given the growth of and reliance on artificial intelligence. “I’ve always said that we need to ‘look under the hood’ when it comes to AI and the means by which it is brought to bear. So this looks like a good start — very thorough and comprehensive, on the face of it.” Cavoukian is now executive director of the Toronto-based Global Privacy & Security by Design Centre.

The proposed bill is long overdue, said Imran Ahmad, co-head of the information governance, privacy and cybersecurity practice at the Norton Rose Fulbright Canada law firm. “The changes (compared to its predecessor, Bill C-11) take into consideration the feedback and critiques directed by stakeholders during the last consultation process. The bill seems to be going in the right direction. The order-making powers of the Commissioner are one of the key features of the Bill. Overall, it brings Canada in line with what’s happening around the world in the area of privacy and data protection laws.”

At the end of the day it is not dramatically different from C-11, which was tabled in 2020, said David Fraser of the Halifax-based McInnes Cooper law firm. “The main changes from PIPEDA are the ability for the OPC to issue orders, and the establishment of a tribunal to deal with penalties. I think it does appropriately take into account the need to separate the role of the police and prosecution, on one hand, from the role of the judge, on the other hand.”

It is interesting to note that information about minors is deemed to be sensitive, he said, but that individual autonomy of decision-making for kids is respected where they are mature enough to make their own decisions.

There’s something in here to make most people happy and to disappoint most people, he said. “On first blush, there are no big surprises: it seems to be a reasonable compromise.”

Howard Solomon
Currently a freelance writer, I'm the former editor of ITWorldCanada.com and Computing Canada. An IT journalist since 1997, I've written for several of ITWC's sister publications including ITBusiness.ca and Computer Dealer News. Before that I was a staff reporter at the Calgary Herald and the Brampton (Ont.) Daily Times. I can be reached at hsolomon [@] soloreporter.com
