Good morning, ladies and gentlemen. My thanks to the chair and members of the committee for inviting me to speak to you today.
I'm not going to speak to you about privacy regulatory matters and existing statutes. The reason for that is you've heard from Commissioner Stoddart on that subject. You've heard from legal scholars like Michael Geist. You've just heard from Commissioner Denham. There's very important work that needs to be done in the regulatory and legislative space.
The reason I'm not talking to you about that today is not that I lack strong regulation in my own jurisdiction. I have order-making power, and I cannot emphasize enough how important order-making power is to a regulator. I also have, under PHIPA, Ontario's Personal Health Information Protection Act, the wonderful tool of mandatory breach notification. We have these tools at our disposal, and they're excellent, but they're not my subject today.
I'm going to talk to you about the future of privacy. I'm going to take the next 10 minutes to talk to you about something called privacy by design. Before I start that, though, please allow me to introduce my colleagues. I'm joined by Michelle Chibba, my director of policy, and David Goodis, my director of legal services.
Privacy by design is all about ensuring that the user has control of their data. What we are experiencing all around the world is that with the enormous growth of mobile devices, ubiquitous WiFi, online social media, and information sharing generally, it is becoming extremely difficult to regulate this information strictly after the fact, meaning you allow the privacy harm to arise, someone complains, we investigate, and then we offer a system of redress. That approach is very valuable and must continue, but with those tools alone we catch, in my view, only the tip of the iceberg in terms of the potential pool of privacy infractions and privacy-invasive activities. Privacy by design is all about being proactive and trying to prevent the privacy harm from arising in the first place.
You'll see that privacy by design was adopted as an international standard two years ago in Jerusalem by the international community of privacy commissioners and data protection authorities. It was passed unanimously and has since been reflected in work coming out of both the United States and the EU. In January of this year, the FTC, the United States Federal Trade Commission, put out its report on how it sees privacy moving forward in terms of regulatory structures and private sector self-regulation. It recommended three practices, and the first of them is following privacy by design.
If you look at the data protection regulation the EU put out earlier this year, you'll see that the language of privacy by design, and of privacy as the default, permeates the entire regulation. You may be interested to know that privacy by design has now been translated into 25 languages; I assure you this is no small feat. It is reflected in all of the major languages of the world. I mention this to give you an idea of the import of privacy by design and how seriously it is being taken around the world.
Now let me walk you through, very quickly, the essence of the seven foundational principles of privacy by design. The core idea is to embed privacy into the design not only of information technologies but also of accountable business practices, policies, and procedures, proactively, in an effort to prevent the privacy harm from arising rather than reactively offering redress after the fact.
An essential feature of privacy by design is that privacy is embedded as what we call the default setting. By that I mean that when privacy is the default condition, you, as the user, the data subject, can be assured of privacy without having to ask for it. You don't have to look for the privacy setting. It's guaranteed. It's automatic. It's built into the system as the default. That is key, and it is an integral part of privacy by design.
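To make that concrete, here is a minimal sketch of what privacy as the default can look like in software. This is my own illustration, in Python, with hypothetical setting names; it is not drawn from any particular product:

    from dataclasses import dataclass

    @dataclass
    class PrivacySettings:
        """Privacy as the default: every data-sharing option starts in
        its most protective state, so a new user is private automatically."""
        profile_public: bool = False       # hypothetical setting names
        share_with_partners: bool = False  # no third-party sharing by default
        location_tracking: bool = False    # no tracking unless enabled
        facial_tagging: bool = False       # no biometric tagging by default

    # A brand-new account is fully private with no action by the user.
    settings = PrivacySettings()
    assert not settings.facial_tagging

    # Sharing happens only through a deliberate, explicit opt-in.
    settings.facial_tagging = True

The point is that the burden of action sits with the system operator, not the user: privacy holds unless the individual deliberately chooses to give some of it up.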
The other essential feature is that privacy by design operates in a positive-sum, not a zero-sum, environment. Zero-sum means that you can have only one of two interests: privacy versus security, privacy versus social media, privacy versus biometrics. Get rid of the versus.
Positive-sum means privacy and other functionalities. You have to have privacy functioning in an environment in which it can operate in unison with other interests, as it must. The future is all about creativity and innovation. Who knows what's around the corner in terms of the next technology and the next development? We welcome that. We insist upon privacy being part of the package.
You've all heard a great deal about big data. I'm not going to talk to you about that today, because there is no time. Just for your information, here's a little teaser. Tomorrow we're launching a paper we did jointly with IBM called “Privacy by Design in the Age of Big Data”. We're releasing this tomorrow morning at conferences in Washington, D.C., and Toronto. If you look at our website tomorrow, please take a look at our paper on how you can have privacy and big data.
I'm going to talk to you for the remaining four minutes I have about an example of how privacy by design actually works on the ground. I don't want you to think this is simply a theoretical formulation or some academic construct. It's real. It's operating right now on the ground.
Let me tie this to Facebook and other social media. As Commissioner Denham mentioned, Facebook has facial recognition capability, so photographs uploaded to Facebook can be tagged with an identity. You can imagine what a treasure trove this would be for law enforcement and other interests: the pictures, the faces, of potentially 900 million users, tagged through facial recognition and available to be matched against, for example, pictures of faces taken from a crime scene. The police would come knocking on Facebook's door with a warrant, and of course Facebook would have to give them the information.
I'm going to tell you about a technology we've introduced here in Ontario that would not allow that to happen, even while allowing facial recognition itself to operate. It is facial recognition technology built on a privacy by design technique called biometric encryption.
Let me just tell you very briefly what this is. In Ontario, the OLG, the Ontario Lottery and Gaming Corporation, runs our casinos; there are 27 of them across the province.
They came to me a few years ago and said that they had a problem. They have a program for addicted gamblers, a problem gamblers program, called the self-exclusion program. Quite simply, if you are an addicted gambler going through the equivalent of a 12-step program, such as Gamblers Anonymous, the last thing they'll ask you to do at the end is go to the casino of your choice and ask to be placed on the self-exclusion program. It means you want to give up gambling and have gone through the whole program, but you know you might fall off the wagon and try to go back into that casino, and you want them to stop you.
The self-exclusion program is completely opt-in. It's voluntary. You go to the casino of your choice and you say, “Sign me up. I want you to keep me out. If you see me trying to enter your premises, I'd like you to ask me to leave, please.” You fill out the form. They take your picture. You sign it, and this is completely your choice.
The problem was that this program wasn't working very well. In the past, the form you filled out, picture and all, would sit in a filing cabinet in some back office.
In the meantime, these addicted gamblers who fell off the wagon would try to sneak back into a casino. They would walk right in the front entrance of one of the casinos, and they were very good at getting back in. Unfortunately, many of them would lose their life savings. They would lose their families. They would lose their jobs. It was terrible. Then they would sue the Ontario government, the casino, for not honouring the program and keeping them out. It was a lose-lose.
So when the OLG came to us, they proposed a solution. They have cameras at the front of every casino; casinos all around the world have cameras at the front for security purposes. If they matched the faces captured by those cameras against the faces in their backroom files, they could identify the self-excluded gamblers and keep them out.
Here's the problem with that: facial recognition technology picks up the faces of everyone entering the casino, not just the problem gamblers. Worse, the resulting database could then be made available to others for secondary uses such as law enforcement. I wanted to ensure that wouldn't happen.
So we asked them to use a technique called biometric encryption. Very simply, this is a way of performing facial recognition such that the captured data cannot be used for any other purpose. When you use biometric encryption, no biometric template, as it is called, which is the digital representation of the face or the finger, is retained in the database.
Quite simply, that means that if law enforcement comes knocking on the door and wants to search your database of biometric templates for a match to a crime scene, you cannot give them the templates, because they don't exist. The only use that can be made of this information is the primary purpose for which it was intended.
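For those who want to see the mechanics, here is a highly simplified sketch, in Python, of the idea behind biometric encryption. It is my own illustration, not the OLG or priv-ID system: I assume a stable, fixed-length biometric feature vector, whereas real schemes add error-correcting codes (so-called fuzzy commitment) to tolerate the natural variation between two captures of the same face:

    import hashlib
    import secrets

    def enroll(features: bytes) -> tuple[bytes, bytes]:
        """Bind a random key to the biometric features. Only the XOR
        'helper data' and a hash of the key are stored; because the key
        is random, the helper data reveals nothing about the face, and
        no biometric template ever exists in the database."""
        key = secrets.token_bytes(len(features))
        helper = bytes(k ^ f for k, f in zip(key, features))
        return helper, hashlib.sha256(key).digest()

    def verify(helper: bytes, key_hash: bytes, live_features: bytes) -> bool:
        """Only a fresh, live presentation of the same biometric can
        regenerate the key. The stored record alone cannot be matched
        against a photograph from a crime scene or anywhere else."""
        candidate = bytes(h ^ f for h, f in zip(helper, live_features))
        return hashlib.sha256(candidate).digest() == key_hash

In this sketch the database holds only "helper" and "key_hash". Hand both to the police and they still have nothing to compare an outside photograph against; the face itself, presented live, is needed to unlock anything.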
I can explain later, if we have time during questions, exactly how this works. But it has been tested in other jurisdictions. In the Netherlands, a company called priv-ID has implemented it, and I can give you other examples.
This is a wonderful, privacy-protective biometric solution: it addresses the particular problem the biometric was collected for, but does not allow the information to be used for any other purpose.