
Results 1-15 of 32

Information & Ethics committee  From the context, if it violates our guidelines, we would remove it. If we don't, there are appeal mechanisms, and so on and so forth.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  I'm not familiar with the particular cases you're talking about, but we'd be happy to follow up.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  In general, if something were to violate someone's privacy or be defamatory or incite violence, and so on and so forth, against our guidelines, we would take it down. The case you're describing is something I'm not familiar with, but we'd be happy to receive more information and...

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  I, too, do not work for Sidewalk Labs. You're right. We want your trust, but we have to earn it through transparency, through developing best practices with you, and through accountability. I think different sites will then make different choices. That is in general the case...

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  Consistent with what we said at the outset, clear definitions by government of what's illegal, combined with clear notices, are critical to platforms acting expeditiously. We welcome that sort of collaboration in the context of illegal content.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  Whether or not it's that specific law, I think the basics of notice and takedown of illegal content, speaking broadly, are something there is increasing consensus around.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  If I understand correctly, you were asking earlier about content that violates our community guidelines on YouTube. We have flagging systems where a user can click and say, “This violates your guidelines in this particular way.” That notice is then sent and put into a queue for review...

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  We do our own threat assessments to prevent and anticipate new trends, and then we work collaboratively among the industry, and where appropriate with law enforcement and others, to make sure information and indicators are shared. We actively want to be a participant in that sort...

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  There are existing legal frameworks with respect to our responsibility for illegal content, which we abide by.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  We have a responsibility for what we recommend, yes.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  Yes, we have requirements in place and broadly agree with what was said.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  Similarly, we're constantly assessing and looking to improve.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  We constantly do that sort of assessment, yes.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  Speaking for ourselves and the YouTube recommendation algorithm, we continue to try to improve the transparency that we have.

May 28th, 2019 | Committee meeting

Derek Slater

Information & Ethics committee  We will continue to communicate on how we're doing.

May 28th, 2019 | Committee meeting

Derek Slater