Thank you so much, Chair, and good afternoon, members of the committee as well. Thank you for inviting me to appear before you today on behalf of the Privacy and Access Council of Canada.
My remarks today reflect round tables held by the council with members from across the public and private sectors, and with members of law enforcement, who agree that facial recognition is one of many digital tools that have great potential.
Like any technology, facial recognition is neither good nor bad, but it's easy to justify, especially when considered on its own. What people do with technology makes all the difference to its reasonableness, its proportionality and its impact on lives.
Thirty-four years ago, our Supreme Court said that “privacy is at the heart of liberty in a modern state”, that “privacy is essential for the well-being of the individual” and that privacy “is worthy of constitutional protection”, and I dare say it still is, except that now we struggle to have any privacy, at home or away.
It's difficult now, if not impossible, to prevent our facial images from being captured and analyzed, or our movements and our associations from being calculated and evaluated in real time. We are in view every time we go outside, and often inside as well, and our images are posted to the Internet, often without our knowledge. We haven't consented to those images being used, or to our gait, our keystrokes or other biometrics being analyzed and correlated with databases that have been amassed with information about each of us.
We haven't asked the voice-activated devices or the messaging platforms that our children use at school and we use at work to analyze our conversations or our emotions, or our TVs to watch us watching them. Yet that is now commonplace, thanks to governments and companies creating an unregulated global biometrics industry that's predicted to reach $59 billion U.S. by 2025, while the tech companies embedded in the public sector urge us to use our faces to pay for groceries and to get government services.
In the 40 years that computers have been part of our daily lives, though, there hasn't been any substantive education in Canada about privacy or access laws, or rights or responsibilities, so it's no surprise that Canadians trust that the laws themselves are enough to protect privacy, and that just 14% rate their own knowledge of their privacy rights as “very good”. In the meantime, there's been an onslaught of automated, privacy-invasive technologies and multi-million-dollar investments in surveillance technologies to create safe communities across Canada, purchased by the other 86% of people as well.
Certainly, facial recognition-enabled cameras in cars, body cams, doorbells and cellphones might help police identify a suspect or solve a crime, but even police admit that cameras and facial recognition do not prevent crime, and there's little correlation between the number of public CCTV cameras and crime or safety. Yet their unregulated sale and use are a self-fulfilling prophecy, because familiarity breeds consent.
Facebook, Cambridge Analytica, Cadillac Fairview and Tim Hortons are just the tip of the iceberg. Companies and governments can and do create or use technologies that violate our privacy laws because they can, because the current consent model is a fantasy, and because Mark Zuckerberg and others know that the risk of penalty is far less than the reward of collecting, manipulating and monetizing information about us.
We are at a moment, though, where three important changes are needed to help safeguard our democratic freedoms without impeding innovation and so that Canadians can regain trust in government, police and the public sector.
First, enshrine privacy as a fundamental human right for all Canadians, in person, online and at our borders.
Second, enact laws that require everyone who creates, purchases or uses technology to demonstrate that they actually have a clear and correct grasp of our privacy laws, rights and responsibilities.
Third, in the same way that vehicles and food must meet stringent government regulations before being allowed for sale or use in Canada, craft laws that put the onus on creators, requiring that technologies undergo comprehensive independent examination of their privacy, access and algorithmic integrity, bias and impact before the product or platform may be acquired or used, directly or indirectly. Make sure the standards are set and the laws are written without the direct or indirect influence or input of industry.
Those are just a few highlights of a very complex issue that I am looking forward to discussing with you.