I hope this is better.
To address the question on private sector use, the harms are real. I'll highlight a few examples.
In Michigan, for example, a skating rink was using a facial recognition tool on customers who were coming in and out. A 14-year-old Black girl was ejected from the skating rink after the face recognition system incorrectly matched her to a photo of someone who was suspected of previously disrupting the rink's business.
We've seen private businesses use this type of technology, whether in concert venues, stadiums or other sports venues, to identify people on a blacklist: customers they don't want to allow back in for whatever reason. Again, the risks of error and the dignitary harm of being denied service are very real. There's also the fact that this tracking information is now potentially in the hands of private companies with widely varying security practices.
We've also seen examples of security breaches, where large facial recognition databases held by governments or private companies have been exposed publicly. Because these face prints are immutable, unlike a credit card number, which you can change, once your biometrics are out there, they can potentially be misused for identity purposes, and that risk doesn't go away.
Similarly, we've seen some companies, for example, Walgreens in the United States, deploying face recognition technology that estimates a customer's age and gender and shows them tailored ads or product suggestions. This is also an invasive practice that raises concerns about shoppers and consumers being steered to discounts or products based on gender stereotypes, which can further segregate our society.
Even more consequentially, it's used by employers—