Good afternoon.
Almost exactly one year ago I was sitting in a boardroom much like this one, only much, much fancier. The daylong meeting was at 1601 South California Avenue in Palo Alto, California. If the address isn't familiar to you, it's the Facebook campus. A guy called Mark Zuckerberg works there. It is spectacular—vibrant, pounding with energy, everybody jacked into headphones. I felt like a kid in a candy store.
Because I was required to sign a non-disclosure agreement upon my arrival, I cannot tell you many of the interesting things that I learned at Facebook that day. Apparently Zuck's Facebook tag line, which reads—and I quote—“I'm trying to make the world a more open place by helping people connect and share”, does not apply to Facebook's business operations.
However, there is one thing that I will disclose: I got sick to my stomach that day from eating way too many Sour Patch Kids. The roof of my mouth was practically torn to shreds. Imagine a very well-stocked candy store—Sugar Mountain or the Bulk Barn—with a seemingly endless supply at every single coffee station throughout the entire Facebook campus.
Now, in defence of my gluttony, let me say that I was not the only one. What I witnessed that day was 25 of the world's most important privacy scholars and advocates stuffing their faces, lining their pockets, and filling their knapsacks with candy—grown adults earning six-figure salaries. We weren't stealing. Excessive and free consumption was encouraged. We were simply reacting to the offer of ubiquitous, abundant, and highly addictive forms of fuel.
Why have I wasted three of my precious ten minutes talking to the ethics committee about eating Sour Patch Kids at Facebook's campus? Because information is the new sugar: big data, big sugar—get candy, get candy, get candy.
Just as health practitioners urge us to consume fewer refined sugars and to safeguard, through policy, against the increasingly unhealthy consumption habits of Canadians, I appear before you today as a privacy practitioner, urging you to safeguard Canadian citizens from the complex and increasingly unmanageable desire of global corporations to collect, use, and disclose more and more personal information.
Because big data is like big sugar: the more ubiquitous, abundant, pleasurable, efficient, and profitable it is, the more we want it. Sometimes, the more we want it, the more blinded we are by its consequences. We stand at the precipice of what one might call the late-onset diabetes of the information age, and we should be doing much more to prevent it.
You've already heard excellent submissions from two fantastic commissioners, Ann Cavoukian and Elizabeth Denham, as well as my hugely talented University of Ottawa colleagues, Professors Scassa, Geist, and Steeves. They have overlapped on a number of crucial recommendations that must be followed by this committee. I'll recap four points quickly.
First, you need to finish what you started. You're way behind on a number of necessary legislative reforms to PIPEDA, the Personal Information Protection and Electronic Documents Act. Studying social media may grab headlines, but the ethics committee should first focus on the PIPEDA review. I learned as a kid to leave the drum solos to later. It's not as sexy, but the rudiments must come first.
Point two: perhaps the most important rudimentary aspect of this is that the Privacy Commissioner needs much greater powers, including the power to make orders, award damages, and issue penalties. These enforcement powers must have serious teeth.
Point three of the overlap—also rudimentary—is mandatory notification requirements for certain kinds of security breaches.
The fourth and last of the basic points I'm reiterating from the previous discussions is the need to mandate far greater transparency, not only about the collection of personal information, but about how it is being used and to whom it's being disclosed. We need this both at the front and at the back end of social media transactions.
To be clear, this is not just a point about tweaking privacy policies or making more understandable notice provisions. It is about legislating what I would call mandatory minimums: mandatory minimum standards for privacy transparency, requiring that they be embedded into technologies and social techniques. We don't sell cars without speedometers, odometers, or fuel or pressure gauges. Likewise, our social media should be required to have feedback mechanisms that allow us to look under the hood and to warn us when conditions are no longer safe.
I have two further submissions of my own. The first concerns privacy default settings. In his appearance before this committee, Professor Geist generously referred to my work entitled “The Devil is in the Defaults”. In short, the architecture of every technology includes a number of design choices. Some of those design choices create default positions. For example, a car's default position is stop. When we enter a car and turn it on, the car is in park. For safety's sake, its design requires that we conscientiously put it into gear in order to go. Although it would be possible to design things the other way around, we recognize the danger of cars that default to “go” rather than “stop”, and we have regulated against them.
The same should be true for privacy, but it isn't. For example, following the lengthy investigation of Facebook in 2008 and 2009, the Privacy Commissioner found that Facebook needed more privacy safeguards. Responding with a complete overhaul of its so-called privacy architecture, Facebook offered new settings for its nearly 500 million users. Although this was deemed a privacy U-turn by the major media at the time, the net effect of these new settings was ironically a massive and unprecedented information grab by Facebook, which I would be happy to explain more in the question period.
In a rather subtle and ingenious move, Facebook very politely gave our Privacy Commissioner the new settings she wanted. But when Facebook gaveth, it also swiftly tooketh away. Choosing to create privacy default settings that collect more information than ever before, Facebook knew perfectly well that 80% to 92% of its users would never change those defaults. Behavioural economics made it very clear that, as with a bad sugar habit, Facebook could get away with nudging us further and further towards poor information consumption habits.
Currently, the Privacy Commissioner is powerless to do anything about this. Without changes to our law, Canadian legislators are allowing social media sites to build vehicles that default to “go” rather than “stop”. Zuckerberg knows how unsafe this is. This is why he has rejigged his own privacy settings. He knows that Facebook's defaults are dangerous. The question is why isn't what is good enough for the goose also good for the gangster?
The devil is in the defaults. We need to fix this through legislation that contemplates settings with privacy as the default. While I agree with Professor Geist that Twitter should be commended for “Do Not Track”, and that Google should be commended for its privacy dashboard, I would take this all one step further. We need legislation that would make some of these amazing features of our online experience non-optional. They should be factory-built and installed with privacy as their default.
I will make my second submission much more succinctly, since it's similar to the testimony I offered at the PIPEDA review a few years ago. The biggest threat to privacy is not social networks. It's not surveillance cameras. It's not wireless mobile, nor databases, nor GPS tracking devices, etc., etc. The biggest threat to privacy is the standard form contract. Under our current law, almost all privacy safeguards that are built into our privacy legislation can easily be circumvented by anyone who provides goods or services by way of a standard form agreement. By requiring users to click “I agree” to their terms on a “take it or leave it” basis, companies can use contract law to sidestep privacy obligations. In short, this is based on a mistaken approach to the issue of consent. In my written submission, which I will provide to this committee, I offer detailed legislative reforms that would help prevent companies from doing an end run around the protections set out in privacy legislation. It's crucially important.
Thank you for your consideration of these matters. I hope during the question period that committee members will give me the opportunity to expand on my three main recommendations: one, mandatory minimums for privacy transparency; two, mandatory privacy default settings; and three, mechanisms that prevent contracting out of privacy through standard form agreements.
Thank you.