Thank you very much for having me today.
I am here on behalf of Tech Reset Canada, an advocacy organization focused on the innovation economy, the public good, and the impacts of that economy on our society. I am really happy to get to talk to you today, because it means we're talking more about the issues related to technology.
The Facebook and Cambridge Analytica case has been one I have used often when speaking and doing public education and community events to highlight one core truth right now—there are a lot of unintended consequences coming out of the use of technology. Framing that as the reality we're dealing with, I'm just going to share some remarks regarding our work, what we have found in it and how it ties into this particular issue and, more broadly, data governance and technology and society.
Having said that, I spent some years running public consultations. I am currently living in Toronto, and one of the projects that is front and centre for me is Sidewalk Toronto. Is everyone in the room familiar with this project? It is led by Sidewalk Labs, a subsidiary of Alphabet and a sister company to Google, which is investing up to $50 million to create a plan for a smart city on Toronto's waterfront. It's just a plan. There's no real estate transfer. It's about a year old now. What it has given us in Toronto, and I think others, is a very focused view of the level of education we have as people in this country to engage in this discourse around technology and society.
What I would like to say about all of that is that a lot of us have no idea what is going on, what data is, where our data goes, who has our data or how our data can be used. We do not have a good handle on any of these issues, which are fundamental and central to the decisions we need to make about them.
I'm at almost the year mark of watching a company have consultations with the public while knowing that nobody understands what anybody is truly talking about. As someone who has done public consultation and who holds the profession and the practice dear to my heart, because I think it is central to democracy, I am extremely troubled by the state of that project and by the idea that we should be making any kind of quick decision or policy. If we do that right now, I can tell you for sure that it will not be inclusive of the people who live in this country and what they want to do about some of the issues related to Cambridge Analytica and to any sort of tech company and its relationship to people. I just want to set that up as one big thing, starting at a high level.
Another theme related to this that I think is really important to consider, whether it's Facebook, Google or any other large company, is that we're beginning to blur the line between the market and the state in this country. We're beginning to lose track of who's in charge of what, who's responsible for what, and the implications of data being used by non-government actors.
In this country, we work from a social contract. People give us data—us in terms of government—and people understand what government does with their data generally. We are introducing corporate actors into social situations, whether it's using Facebook to communicate and organize in a community and do many things, or maybe existing in a city. This sort of blurring of this line, I should hope, is becoming more visible to the people in this room. I think it is a thing of grave concern, and we need to delineate and understand who is in charge of this whole....
What's happening now is this enthusiasm for technology, and it's somehow making everybody forget what their roles are, that we have rules and laws, and that those are things that help us determine how our society looks. I don't think it was ever the intention to be enthusiastic about the innovation economy and have that then become governance of social impacts. I really don't think that was something that happened on purpose, and I think we need to be very aware of the fact that this is now happening regardless.
There is an article written in 1998 by a scholar named Lawrence Lessig that said "code is law". Software code is, in some cases, determining.... These are not laws in the legal sense, but they are determining social norms and the ways we interact with each other. I just think these are things we might not have understood as this began. I do not want to ever think, and I don't want anyone here to think, that even the people who are technologists have a handle on the implications of all of this.
Having said those things, I have just a couple more points.
One of them is that democracy moves slowly. This is good. This stuff is hard. I would really caution everyone in this room to consider how much education we need to be doing before we can even be making decisions that are informed by people who live in this country.
I know there's a lot of enthusiasm, and everybody says tech moves incredibly quickly. We have agency over technology. Technology is not something that just pops up on its own; it exists because of humans and their agency, so we need to remember some of those facts.
Another thing to be very clear about is that we are blurring the lines between procurement and influence, between purchasing or using products and how those choices trickle down to the people who live here.
In my opinion, what is happening in Toronto is problematic because you should not be making policy with the vendor. This is essentially what we're doing. We are allowing someone who is going to be a vendor to influence how the policy for said vendor's work will go. I do not understand how anyone could have thought this was a good idea to begin with. I don't think we should continue this for much longer. In these cases, we really need to be aware of the ways these two issues are linked to each other.
Another thing that relates to this is that we've been thinking about technology as an industry. I see that in this country, a lot of the narrative is about wanting to do well, wanting to be innovative, wanting to do the things that make us leaders in technology, and there being a lot of opportunity for prosperity and wealth development. This is true. However, there's also a much larger narrative about what it means to lead in the governance of technology and the governance of data, and Canada has an opportunity right now to lead.
You have probably heard a lot of good things about the General Data Protection Regulation in Europe. It's not perfect, but it is definitely moving toward some of the things we should be thinking about. I am confident that if we really take this seriously, if we look at impacts and engage people better, we can lead.
This is an opportunity. There's a lot of fear and anxiety about what to do. If we don't go fast and we are very considerate in what we're doing, I see a great opportunity here for the country to show global leadership in what to do with data governance and governance around technology. I don't want us to miss that in this need to react to fear, anxiety or issues that are quite complicated. I really don't want to miss that point.
I also want to talk about opportunity as a technologist. I think it is something we need to think more about. How do we develop social and public structures that use all the wonderful things that technology can produce, more for public good and more within government? We need to look at our academic institutions and ask ourselves why we're not developing technology that we are using.
If you go out into our communities where people are talking about digital rights and digital justice, they are wondering why we aren't building tools that we could be using for community organizing, or for social good—lots of the ways people use Facebook or other things.... Why aren't we doing better at building systems and building competency, so that we can build those products, figure out different models, and think about how we can use these things within government?
I really want to stress this. The idea that government can't keep up with tech, or that there's a problem here because people in government don't.... This is not my belief. I'm telling you what I hear a lot. We really need to shut that down and start to show that if there is an interest in really using technology well across the board in our society, we can be intentional and make investments to make sure that happens. These are all opportunities for the country.
Again, when you respond to fear, you respond quickly, and I don't think that will be a good response. I think this case is a very good one to watch, as is the Sidewalk Toronto example. There are big issues coming out of it, and there is nothing wrong with taking our time. I will say this as a technologist: everybody will think we are doing wonderful things for technology if we take it slow and figure out what to do.
This includes industry. It is not helpful to industry if you are not clear with them as to what the guardrails are, how their operations have to be law-abiding and how they can be encouraged to reflect some of the values that we as technologists think should be there in terms of sharing values, being open with things and considering things that aren't necessarily proprietary.
There are lots of ways to use technology. There are lots of ways to use math. We shouldn't think this is only a business thing. This is a social thing. There are a lot of really exciting things to do in there.
I'm trying to end on a hopeful note here because I truly believe there is great opportunity. I want to make sure we follow processes that ensure people are engaged in the development of what we're going to do next, and that we do not rush it. There is no need. The real urgency is in deciding not to go fast: we need to decide, quickly, that we are not going to rush, and that we will be thoughtful about the process we follow from here.
Thank you.