Facial recognition service Clearview AI fights Canadian ban


Biometric facial recognition service Clearview AI is reluctant to delete the facial photos it collected in Canada without consent. In December, the privacy authorities of the provinces of Alberta, British Columbia and Québec ordered Clearview to delete the data and permanently banned it from operating its facial recognition service there. Clearview AI is now taking legal action against these orders.

The company wants neither the Canadian authorities' ban on its facial recognition service nor the order to delete the photos it has already collected; it considers deleting the images "impossible". The Supreme Court of British Columbia reviews decisions of the provincial data protection authority in the first instance, and the petition for review filed there automatically suspends the authority's decision.

The New York company Clearview AI has collected more than three billion facial photos from the Internet and used them to train a facial recognition algorithm, which it rents out. The company never even attempted to obtain the consent of those affected. Clearview AI claims to be nothing more than a search engine for images, like Google Images: since the copied images are publicly available, it argues, everything is legal.

In February 2021, Canada's federal privacy authority officially ruled that Clearview's facial recognition is illegal in Canada. The authorities of Australia, France and the UK have reached similar conclusions. The company has paused its service for Canadian customers but wants to resume operations.

Until June 2020, Clearview AI had provided its facial recognition service to the Royal Canadian Mounted Police (RCMP) for a fee and to thousands of users in Canada via free trial accounts. These included, for example, pharmacy chains, dozens of local police forces and insurers.

The local police forces initially denied using the service, but after a data leak on a Clearview server they had to admit that officers had used the trial accounts – allegedly without the knowledge of their police chiefs. The RCMP's use is particularly sensitive because the federal police force had assured the federal privacy commissioner that it would introduce facial recognition only after completing a privacy-focused technology assessment – and then simply bought the service from Clearview AI.

In its petition to the Supreme Court of British Columbia, published on Monday, Clearview advances a long series of arguments: because the company has no facilities or employees in British Columbia, the authority lacks jurisdiction; its decision is insufficiently reasoned; and it is worded so vaguely that it is unclear how Clearview AI could even comply with it. Contrary to the authority's finding, Clearview AI says it pursues a legitimate purpose with its facial recognition service: tracking down criminals and victims.

Contrary to further findings by the authority, the photos Clearview scraped from social networks are public data, the company argues, which means the data protection law PIPA does not apply. (What Clearview doesn't mention: Google, LinkedIn, Meta, Twitter and YouTube have all sent the photo collector cease-and-desist letters, because grabbing user images violates their respective terms of service.)

In general, Clearview argues, the data protection authority's decision was "unreasonable" (roughly: unfair, inappropriate). And should the authority nevertheless have interpreted the data protection act correctly, the relevant provisions of the act would have to be struck down as unconstitutional. Specifically, Clearview AI invokes Section 2(b) of the Canadian Charter of Rights and Freedoms, which guarantees freedom of expression and freedom of the press.

The term "(un)reasonable" is important in Canadian administrative law but not precisely defined. Crucially for the present case, Canadian courts grant administrative agencies wide latitude in interpreting the laws that lie within their core competence. Unless the law in question provides otherwise, the court does not examine whether the agency interpreted its specialized provisions correctly, but only whether its interpretation was "reasonable".