I'd refer you to the statement that's been put together by the Electronic Privacy Information Center. They have an interesting statement of rights when it comes to algorithmic transparency. In a lot of ways, it's already in our legislation. They have to tell us what they're doing. They have to tell us why they're doing it. They have to tell us what the outcomes are. It's just that so often it's been buried in the algorithm in ways that make it even less transparent. Certainly a number of us within the civil society sector are quite concerned about this and think that it's worth pursuing as a provision in its own right.
A lot of it, too, requires that corporations be much more responsible for the outcomes. Yes, I do think there should be penalties attached when there are discriminatory outcomes in particular, and I think that would make people much more careful when they are running algorithms that significantly change people's life outcomes.