Thank you very much, Mr. Chair, for the invitation to participate in your meeting. I'm going to share some thoughts about technologies that are just around the corner and that I believe will have a profound impact on how we think about privacy. My goal is to help us understand them so that, as much as possible, our laws can be ready for what's coming next.
I am a professor in the Faculty of Environmental Design at the University of Calgary, as well as an adjunct professor of computer science. I'm a research fellow of our Centre for Military, Security and Strategic Studies and of the Canadian Global Affairs Institute here in Ottawa. I've spoken to all the major hacker conferences like DEF CON, Black Hat, and one with the intriguing name of Hackers on Planet Earth, so I try to keep track of what both the good and the bad hackers are up to.
I'm also pretty sure that I taught Canada's first course in information security in 1974. Back then it was simple: lock your computer room doors, choose good passwords, and don't put confidential stuff in the trash. Today, it's much more complicated.
Consider a 2015 project called “The Face of Litter”, sponsored by Hong Kong Cleanup. Workers collected discarded chewing gum and cigarette butts on that city's streets and sent them to Parabon NanoLabs, a privately held Delaware corporation. Parabon used DNA phenotyping to create an approximate digital portrait from each sample, and the campaign then put these portraits on public display. A week later, on passing the scene of the crime, the spitter might see an eerily familiar face on a video screen: a DNA-driven self-portrait.
Now, how could they do this? There was plenty of saliva left on those discarded items to do DNA analysis. In fact, it requires only one nanogram. Certain traits like eye colour, hair colour, and facial shape are easy to work out. Ancestry can be analyzed. Stir in machine intelligence and real-world knowledge—gum chewers are more likely to be 18 to 34, and cigarette smokers older—and you get a very creepy scenario whereby biodata is used not to identify someone specifically, but to infer things about the person. This challenges our long-held definitions of personally identifiable information and personal health information.
In my 2014 book, Technocreep: The Surrender of Privacy and the Capitalization of Intimacy, I suggest that a store might grab a few skin cells when you type in your PIN and send them off for analysis. The next time you visit that store, you might see a pop-up asking if you knew you were pre-diabetic and saying, “Here's a special coupon just for you.” While to my knowledge no store is doing this yet, we have seen retail outlets in the U.S. and the U.K. use facial recognition to identify shoplifters, VIP customers, and known litigious individuals. Banks such as HSBC are already using facial recognition for client identification, and several Canadian banks are doing biometric trials.
Your biometric data, be it your voice, face, or DNA, might well be covered by the Privacy Act and by PIPEDA's definition of personal health information, though those definitions will need to be updated as technologies emerge. But does this legal protection do the average person any good? In practice, many customers would not notice an obscure clause authorizing the use of their biometric data in a retail or banking environment. It could be buried in the terms and conditions document, which hardly anyone reads. Some people might even consent to the use of their biometrics in hopes of saving money, getting better service, or obtaining useful health information.
I believe that citizens may not fully understand all the implications of collecting, storing, and exchanging their biometric data, or of secondary uses and cross-correlation of biometric and other databases. We need laws that mandate full disclosure and a process to ensure real compliance, which would mean more than just guidelines on the OPC website. Even today, overt public surveillance cameras are supposed to carry proper signage. In my experience, most carry no signage, and nobody does anything about it.
Then there's the time problem. Fifty years ago, a criminal could leave blood at a crime scene with impunity since, aside from revealing blood type, it didn't hold much information. Today, law enforcement is solving long-dormant cold cases through DNA analysis of those old samples.
We cannot predict what future data analysts will extract from our biological and biometric data, except to say it will be more than they do today. Experts also suggest that quantum computers will be able to retroactively decrypt decades of data that we currently believe is secure. There's a wonderful phrase that describes all this: beware of time-travelling robots from the future.
I do detect a growing unease in the Canadian public. When I talk to people about biometric identifiers, from ear shape to heart rhythms to your unique body odour, any of which can identify you, their ears perk up. Recently I was approached by Costco's magazine for an article on the downsides of biometric identification. I explained how fingerprints can be stolen and put on a fake finger with a 3-D printer. A hacker named Starbug even captured the fingerprints of the German defence minister from a high-resolution photo of her hand.
Even more troubling is the belief that biometrics are infallible, which they are not. They have error rates that vary depending on parameters set by the designers. The first-generation NEXUS terminals used at Canadian borders would sometimes fail to uniquely match a person from the eye biometrics they obtained.
Illinois and Texas have passed specific commercial biometric privacy laws, and Article 9(1) of the European Union's forthcoming General Data Protection Regulation puts specific restrictions on the use of genetic data and of biometric data processed to uniquely identify a person. Canadians need a similar level of protection, and these laws provide a starting point for us.
Another area that needs serious thought is behavioural biometrics. In Technocreep I review Progressive Insurance's Snapshot device, which people install voluntarily to try for a discount on their car insurance. It records how much they drive, when they drive, and how hard they hit the brake. I suggested that it might be a sensible choice for some people, especially since it didn't track where they drove. Then Desjardins insurance brought out the Ajusto app, which uses your smartphone to create a driving-quality score. Unlike Snapshot, this system knows exactly where you are, and even how well you respect the speed limit.
Right now, systems of this nature are opt-in, and the companies take pains to tell consumers that even bad driving will not raise their rates. However, there is certainly the possibility of driving monitors and even wearable fitness monitors becoming de facto mandatory in order to obtain insurance at a reasonable rate. Insurance is, after all, about spreading risk and charging risk-based premiums.
In opposing Bill S-201, the long-overdue genetic privacy law for Canada, Jacques Y. Boudreau, chair of the committee on genetic testing for the Canadian Institute of Actuaries, argued that an essential element for insurance to work properly is equal access to information by both parties. There is clearly tension brewing between our right to keep information private and commercial interests.
We spend a lot of time worrying about how an authorized data collector uses our data. However, a flood of data breach examples, from the Sony hack to the DNC emails to the Ashley Madison fiasco, proves that our personal data can fall into the wrong hands with devastating consequences. People whose email addresses appeared on the Ashley Madison client list have received blackmail threats, suffered workplace repercussions, and, in three reported cases, committed suicide. A further complication is that, because of the system's lax design, people could appear on that list without ever having signed up.
While there are hacking-related Criminal Code provisions, such as mischief in relation to data and unauthorized use of a computer, these do not directly address the privacy implications of hacking. Of course, many perpetrators are never caught, but some are. There should also be consequences for the entity managing the data if it did not take reasonable precautions to secure it.
Therefore, I support effective data breach notification in both the public and private sectors, as well as enhanced mechanisms, including order-making powers, to enable the Privacy Commissioner to preserve public confidence. I also support regular review of our privacy laws at least every five years.
I will close by revealing that you've been listening to a cyborg, a human being with a new technological body modification. I had an RFID chip implanted in my hand at this year's DEF CON conference. Right now it gives me only one superpower: I can open my door at the university without fumbling for my ID card. In the near future, devices will be available to give people telephoto vision, super-acute hearing, and enhanced mental powers.
Canada's first privacy laws date from the era when information was kept on paper, and we dragged them into a world where our data lives in cloud networks somewhere on the planet. Our next challenge, one that will keep us busy for a long time, is dealing with the implications of the data being us, an intimate part of our humanity.
Thanks so much for your attention. I look forward to your questions.