I just want to go back to my remark about making the same mistake for the third time. It's the same mistake we saw with privacy and data protection, which is to treat these topics as objects that are independent of the rest of the world as it exists. We've seen where that kind of thinking has gotten us. While we talk about privacy a lot, what we're dealing with is a deeply privatized space, one where control and power over the infrastructures, particularly with AI, never mind data and software, are privately held.
If we think about our failures in access to justice for things like privacy and data protection, and the failures of this sort of model more broadly, the question is never whether we should protect these things; it's always how. If we want to turn the corner into a different world, one in which we have control over technologies, we have to talk about them in context.
For me, I go back to this: Who is the minister in charge of X, Y or Z sector? Who is in charge of making sure that forestry operates in a certain way, that environmental protections operate in a certain way and that cars operate in a certain way? Go from there every time. If we keep scaffolding more and more complexity and more and more compliance out into the sky, it doesn't serve justice. We have a fundamental access to justice problem as it stands right now. How many people have the time and energy to file a complaint with the Privacy Commissioner? What is the profile, or the demographic, of someone who can bring that kind of complaint forward?
On the question we're discussing today, how you would even know if you were harmed by artificial intelligence, I recently heard it compared in some cases to asbestos: it's in things and you don't know it's there. Whom will you go to and ask to hold them accountable? If you get hit by a car, there is a clearly accessible path for dealing with that problem. I do not understand why we think it's a good idea to build an entirely new construct when we have a perfectly good physical and material world and a perfectly good set of governance standards. That's a place where we have public power. To me, the only people who benefit from scaffolding on all this additional complexity are those with private interests. In a democracy, and at this point in time we are 30 years in, public power has to be increased.
Do I want to see a commissioner for AI? No. I don't want to see a new regime for AI.