
Proposed Canadian AI law is like a race car without an engine, expert tells Parliamentary committee

Skeleton car image by Julia Garan via GettyImages.ca

The Liberal government’s proposed law controlling the use of artificial intelligence applications is like a badly built racing car, a computing expert has told a Parliamentary committee.

But instead of putting the proposed Artificial Intelligence and Data Act (AIDA) in the garage for repairs, Andrew Clement, professor emeritus at the University of Toronto’s faculty of information, told the Commons industry committee Wednesday that the government should start over entirely, with proper public consultation involving a wide range of Canadians.

Andrew Clement testified by video conference. Screen shot via ParlVu

Innovation Minister François-Philippe Champagne, the bill’s sponsor, “wants to make Canada a world leader in AI governance. That’s a fine goal. But it’s as if we’re in an International Grand Prix,” Clement said. “Apparently to allay the fears of Canadians, he proudly entered a made-in-Canada contender. Beyond the proud Maple Leaf and him smiling at the wheel, his AIDA vehicle barely has a chassis and an engine.”

“He insisted he was merely being ‘agile,’ promising that if you just help propel him over the finish line, all will be fixed through regulations.”

But, Clement said, as a previous witness testified, there is no prize for first place in AI oversight. “Good governance isn’t a race, but an ongoing learning project.”

Among the problems: the Innovation Minister would have “sweeping powers,” Clement said, which would conflict with his mandate to advance the AI industry. And the proposed AI and data commissioner who would oversee the law should be an independent officer of Parliament, he said, not someone who, as the bill proposes, would report to the Innovation Minister.

“With so much uncertainty about the perils and promise of AI, public consultation with informed expertise is a vital precondition for establishing a sound legal foundation,” Clement added. But while Champagne said his department held more than 300 meetings, by Clement’s count, 220 were with businesses, including 36 with U.S. tech giants. Only nine were with civil society groups representing Canadians.

“Canada also needs to carefully study developments in the E.U., U.S., and elsewhere, before settling on its own approach,” Clement said.

Asked by one MP whether AIDA in its current form would protect Canadians, Clement replied, “I would say not.”

AIDA requires businesses deploying “high-impact” AI technologies to use them responsibly. In particular, they would have to develop and deploy applications in a way that mitigates the risks of harm and bias. There would also be criminal prohibitions and penalties for using unlawfully obtained data in AI development, for reckless deployment of AI that poses serious harm, and for fraudulently intending to cause substantial economic loss through its deployment.

Clement is the latest witness to call AIDA flawed and say it has to be sent back to the drawing board. The Information Technology Industry Council filed a brief telling MPs that more consultation is needed, but it also recommended wording changes. Lorraine Krugel, vice president for privacy and data at the Canadian Bankers Association, asked for “targeted amendments” to the Consumer Privacy Protection Act (CPPA), AIDA’s companion legislation in the same bill.

Meanwhile the European Parliament is on track to pass its own AI law this year.
