I think what's important is to have a conversation. One example that comes to mind is the aerospace industry, where there's a really thorough process for surfacing errors encountered in deployment. That's what we're missing in the AI industry: a clear protocol for how we surface bugs in algorithms as they occur, so that we can figure out solutions as engineers but also implement the right legislation to support those solutions.
That's something I would leave to you: how we could draw inspiration from regulated sectors like the aerospace industry.