Privacy laws don’t protect us from facial recognition tech

Opinion

Amid the recent, dizzying advances in generative AI, it’s been easy to miss the slow but steady progress in facial recognition over the last decade. In the past few months, it has broken containment.

In the United States, Immigration and Customs Enforcement (ICE) has deployed a technology known as Mobile Fortify, which uses facial recognition on officers’ cellphones to “quickly verify subjects of interest during operations.”

In the United Kingdom, the Metropolitan Police scanned 4.2 million people’s faces during 2025 using live facial recognition cameras in public areas across London. And the British government recently promised to further “ramp up facial recognition and biometrics.”

Demonstrators against Amazon's facial recognition software hold images of Amazon founder and former CEO Jeff Bezos near their faces at Amazon headquarters. (The Associated Press files)


Face scans may soon be everywhere, and Canada’s patchwork of privacy rules is not ready to protect us. The most striking gaps concern personal and household surveillance.

Let’s look at three examples.

First, there are Ring doorbell cameras. Ring, which is owned by Amazon, has sold its cameras to millions of people around the world, including many in Canada.

Last September, Ring announced it was adding facial recognition to its cameras in the form of its “Familiar Faces” feature, which scans the face of everyone who comes to your door and identifies anyone you have added to a database. It also announced “Search Party,” an AI feature that activates cameras throughout a neighbourhood to scan outdoor footage to help find a lost dog.

This provoked concerns that these two features will be combined, allowing Ring to use its network of cameras to track people as well.

These fears were seemingly confirmed by a leaked email from the company founder in which he said, though the feature was “first for finding dogs,” the company’s ultimate goal was to use it to “zero out crime in neighbourhoods.”

Equally concerning, the feature was initially supposed to operate through a partnership with Flock Safety, a surveillance technology company that works with law enforcement. After an outcry, Ring cancelled the partnership.

The second example involves Meta, the parent company of Facebook. According to a leaked internal memo obtained by The New York Times, Meta wants to add facial recognition to its smart glasses. The company sold more than seven million pairs of smart glasses last year.

If the “Name Tag” feature works the way the company apparently hopes, it will allow the glasses to identify anyone the wearer looks at and give them information about that person using Meta’s vast database of user profiles.

Appallingly, the memo stated that the company plans to take advantage of current events in the U.S., launching the feature while civil society groups “that we would expect to attack us” have “their resources focused on other concerns.”

A third example is Canadian law enforcement. In 2024, York and Peel regional police in Ontario started using facial recognition software to, in the words of York Police Const. Kevin Nebrija, “help speed up investigations and to identify suspects sooner.” Nebrija told the CBC that, in terms of privacy, “nothing has changed because security cameras are all around.”

In December 2025, Axon, the main supplier of body cameras in Canada, partnered with the Edmonton police department on a pilot project that allowed officers’ body cameras to identify people on a “high-risk” watch list of around 7,000 people.

Canadian privacy laws offer no explicit protections for our biometric data, with the exception of Quebec, where Law 25 contains provisions governing its collection and use.

Everywhere else, citizens face an uneven patchwork of general privacy rules. Their application to biometrics depends on who is using the technology and in what context.

When it comes to law enforcement, Section 8 of the Canadian Charter of Rights and Freedoms does protect citizens from unreasonable searches and seizures by the police. However, whether facial recognition qualifies as a search depends in part on whether the target had a reasonable expectation of privacy.

It’s an unresolved question whether biometric scans should be subjected to a higher standard than, say, cellphone cameras, when people are scanned while walking down the street or attending a protest.

Federal law enforcement is also governed by Canada’s Privacy Act, while provinces have their own privacy acts to govern provincial and municipal police. Federal and provincial privacy commissioners, who are tasked with interpreting these acts, have emphasized the sensitivity of biometric data and the importance of due process constraints in the use of facial recognition.

However, they have not required police to obtain a warrant before using facial recognition.

Federally, the Personal Information Protection and Electronic Documents Act (PIPEDA) governs the collection of data by private companies and its use in commercial activity.

However, the act was written before the emergence of widespread biometric surveillance. It remains unclear how it would apply when companies capture and analyze the faces of people who have no direct relationship with the organization collecting the data.

The most striking gap in our current privacy laws appears when it comes to the actions of private individuals. The laws exempt personal and household surveillance. This means that if someone uses smart glasses or a doorbell camera to identify you or trace your movements, this would normally be legal.

Civil remedies might apply in specific cases where the victim could show there was targeted or harmful misuse. But not all provinces allow such claims. And even in those that do, the legal process is reactive and fact-specific, and it depends on the victim being willing to go to court.

Canada’s privacy framework is not designed for our current moment — one in which mass surveillance is becoming a reality.

Lawmakers need to act. We should demand new, stronger privacy laws, ones that deal explicitly with facial recognition. Otherwise we may find ourselves living in a world that we no longer recognize — but one that recognizes us.

» Neil McArthur is the director of the Centre for Professional and Applied Ethics at the University of Manitoba. His column was originally published at The Conversation Canada: theconversation.com/ca.
