With files from Pragya Sehgal
Bowing to months of pressure from Canadian privacy regulators, American facial recognition provider Clearview AI has stopped trying to sell its services to law enforcement agencies in Canada while it is under investigation for the way it collects images from the internet.
The Office of the Privacy Commissioner of Canada (OPC) announced July 6 that Clearview AI will cease offering its facial recognition services in Canada and that it has cancelled its contract with the Royal Canadian Mounted Police, which was the firm’s last client in the country.
A number of police agencies in Canada that had formal or test agreements with the company dropped their relationships after the investigation began February 21, leaving only the deal with the RCMP in place. Despite those cancellations and today’s announcement, the privacy commissioners’ investigation is continuing.
“The privacy authorities appreciate Clearview AI’s co-operation to date on the ongoing investigation, and look to the company’s continued co-operation as it is brought to conclusion,” the OPC release added.
Clearview issued a statement of its own saying that “In response to OPC’s request, Clearview AI has ceased its operations in Canada. We are proud of our record in assisting Canadian law enforcement to solve some of the most heinous crimes, including crimes against children. We will continue to co-operate with OPC on other related issues.”
While Clearview AI continues to market its service in other countries, the statement also said that “Canadians will be able to opt-out of Clearview’s search results.”
The privacy commissioners are looking into media reports that the company collects and uses photos of people on social media platforms to train its facial recognition algorithm. They want to know if the company’s practices comply with federal and provincial privacy legislation. The investigation of Clearview by privacy protection authorities for Canada, Alberta, British Columbia and Quebec remains open, according to an OPC press release.
“The authorities still plan to issue findings in this matter given the importance of the issue for the privacy rights of Canadians,” it read.
Clearview AI claims its solution has an advantage over competing facial recognition products because it has copied more than 3 billion images from the internet, including from social media platforms like Facebook, Instagram, Twitter and YouTube. Police forces use Clearview AI to compare images of unknown people — usually suspects — to this database for identification.
In addition to questions about the legality of photo scraping by a commercial entity, Clearview AI apparently keeps images in its database even if someone deletes their image from a website.
Moreover, in early February, the Toronto Police Service was criticized after members used the company’s facial recognition service without the knowledge of the police chief.
Clearview AI provides services in several countries to various government institutions and a broad range of organizations, including financial institutions and retailers.
The Canadian investigation into whether residents of the country have given, or must give, consent before their images are scraped from the internet is separate from allegations that no facial recognition technology being sold today is accurate. Studies have shown that facial recognition systems are less effective — some say biased — at identifying people of colour.
Those claims intensified after Detroit police admitted a Black man was arrested “on an erroneous facial recognition identification” for allegedly stealing several watches from a store. The American Civil Liberties Union has filed an administrative complaint against the police department for a wrongful arrest involving facial recognition technology. According to a news report, the facial recognition software used by police suggested the suspect was the robber in a store video. Police then showed a lineup of headshots, including the suspect’s, to a store security guard who hadn’t witnessed the theft but had watched the video. The guard then identified the suspect.
Early last month IBM said it had stopped offering what it calls “general purpose” facial recognition or analysis software. A few days later Amazon said it would temporarily stop police use of its Rekognition software (though it would continue to allow use by the International Center for Missing and Exploited Children), urging governments to regulate the use of the technology. Then Microsoft said it would not sell facial recognition software to police departments until Washington issues clear regulation around use of the technology.
Meanwhile, other firms, including Clearview AI, continue to do business. On its website, Clearview AI says its image search technology “has been independently tested for accuracy and evaluated for legal compliance by nationally recognized authorities.”