That goes directly to my point about expanding their trust and safety departments, not reducing them. Essentially, that's the branch of the major social media companies that actually oversees content moderation, so that harmful or problematic content will not get the audience it's seeking. Unfortunately, we hear in the news that these departments have been shrinking. Trust and safety teams are being laid off, and some of the initiatives that were started a while back are being discontinued.
It is a concern. It signals that maybe that area is not seen as important, because it can be easily cut when leadership decides it's no longer needed.