Ms. Wang, we know that facial recognition technology is far less accurate at correctly identifying non-white people. We've heard of error rates as high as 34% for darker-skinned women. This FRT-induced digital racism is unacceptable and further reinforces why this technology should not be used for law enforcement.
You've written about mitigating bias in machine learning. How do we end this digital racism?