Thank you, Chair.
Ms. Jordan-Smith, thank you for your testimony today and for your courage in speaking out on this issue.
One thing that struck me about your testimony was how your daughter was victimized through a platform you weren't even aware she was using. It strikes me that for a duty of care to address the fact that technology changes all the time—there will always be some new platform that kids are on—we need a very clear but also broad definition of who, or what, that duty of care would apply to. It can't just be Meta or a couple of the known players, can it?
I've been giving some thought to what that could mean, and I tend toward a broader term. The term I would like to use is something like “online operator”, meaning the owner or operator of a platform—such as an online service or application that connects to the Internet—that is used, or could reasonably be expected to be used, by a minor, including a social media service or an online video gaming service. That way it's very clear that as new platforms emerge and technology changes, you as a parent aren't left guessing whether your child is being exposed to a platform that might not be covered by the law.
Would you support that type of recommendation?