Global regulatory policies are ‘10 innovation cycles’ behind, says expert

The Big Data and AI Conference in Toronto this week featured dozens of businesses vying for attention and speakers discussing the potential implications of a connected economy. It also revealed that organizations still have a long way to go in changing how they think about privacy in their products and solutions.

Richard Wunderlich, director of smart grid initiatives for Siemens Canada, tried to put a number on it during a panel discussion about connected economies.

“Our company is seeing this exponential growth for innovation, and accompanying it, an exponential growth of security issues,” he said. “The regulatory environment that’s supposed to enable the innovation and security is probably 10 innovation cycles late.”

It’s why many projects leaning into the “connected economy” concept, which is broadly defined as the value created through technology-enabled links between people, machines, and organizations, are failing to take off.

“We’re not ready yet to be as connected as we can be to optimize the living environment,” indicated Wunderlich.

Helene Beauchemin, legal counsel for Quebec’s research lab Stradigi AI, says organizations need to get much better at defining what type of AI is in their products and its relationship with users and their data. Photo by Alex Coop.

And unfortunately, Canadian startups steeped in AI and other technological talent don’t have many good examples to look up to as they try to navigate these complicated waters, according to Helene Beauchemin, legal counsel for Quebec-based AI solutions company and research lab Stradigi AI.

“It’s a delicate balance – you want to be compliant but you also want to be user-friendly,” Beauchemin told IT World Canada. “We need to change our privacy practices, but we’re still not sure how to do it in a way that makes sense for the products of tomorrow.”

Corporations like Facebook and Google are finding out the hard way that people actually do care about how their data is being used. And while they continue to stumble on their way to integrating better privacy practices, the topic of privacy is no longer something they can brush aside without the public noticing.

Growing businesses and startups leaning on AI and other emerging technologies don’t have to repeat those mistakes, but in order to sidestep them, they need an actionable guide to privacy culture, the topic that was the focus of Beauchemin’s session this week at the event.

She outlined five steps organizations need to follow, but after the session she pointed to one more, which should come before the privacy discussion even begins: define the AI capabilities in your product or solution.

“AI is a broad term. So are we working with traditional machine learning? Deep learning? Image and video recognition? Biometrics? Or are we a mining company that’s trying to disseminate unstructured data from sensors that has nothing to do with personal information?” she said.

Organizations need to be able to identify the impacts that their products and solutions may or may not have on people. Beauchemin pointed to a tool called COMPAS — the Correctional Offender Management Profiling for Alternative Sanctions — which is used to predict a defendant’s risk of committing another crime. It works through a proprietary algorithm developed by a private company called Equivant that considers answers to a lengthy questionnaire. Studies have questioned its effectiveness and the algorithm behind it, and the debate over the tool continues today. “That [AI] has an enormous impact on human rights,” she said.

Once the AI and its relationship with people and their data are defined, Beauchemin’s five steps toward establishing a culture steeped in sound privacy practices are as follows:

  • Drive awareness across the organization with the help of internal or external legal counsel and contextualize what privacy really means to every department and role.
  • Perform ongoing education, and don’t just focus on data scientists – consider having a privacy ambassador in each department of the company.
  • Ensure legal counsel is in the room from the very beginning when new products and solutions are conceived. Beauchemin said too often they’re brought in at the very end when the product is ready to launch. Then a critical oversight involving privacy is found, forcing everyone to go back to the drawing board.
  • Draft a risk assessment plan and make sure there is a full buy-in from leadership.
  • Have a compliance program in place and be prepared to update it regularly to keep up with technological and regulatory advancements.

Alex Coop
Former Editorial Director for IT World Canada and its sister publications.
