
Results 76-90 of 333

Justice committee  “How to pimp”, okay. The reality is that we're constantly fighting against issues and content like this. It's a constant effort to identify the context in which comments appear in a video, and then to create what we call a “classifier”, which is an automated system to identify them on a broad scale.

June 4th, 2019 Committee meeting

Colin McKay
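
The “classifier” McKay describes in the exchange above is, in general terms, a model trained on labelled examples and then applied automatically at scale. The sketch below is a minimal illustration of that idea only; the library choice, the toy comments and the labels are assumptions for illustration and do not reflect Google's actual systems.

```python
# Minimal sketch of a text "classifier" in the generic sense described above:
# a model trained on labelled examples, then applied automatically at scale.
# The training data and labels here are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled comments: 1 = violates policy, 0 = acceptable.
comments = [
    "great tutorial, thanks for sharing",
    "loved the model train layout",
    "threatening message targeting a protected group",
    "another comment promoting violence against a group",
]
labels = [0, 0, 1, 1]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(comments, labels)

# Applied "on a broad scale": score new comments automatically.
new_comments = [
    "thanks, very helpful video",
    "more content inciting violence against a group",
]
print(model.predict_proba(new_comments)[:, 1])  # probability each comment violates policy
```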

Justice committee  I'm not sure in this case. I can follow up with you.

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  I think we have addressed that broadly, because we've long focused on providing our services in many languages. We have actually developed artificial intelligence translation systems that can translate upwards of 200 languages. Our systems aren't focused solely on English terms and English challenges.

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  Yes, we are. I recognize the example you described. I've seen that as well. That is one of the challenges, especially immediately after a crisis. We're seeing content being uploaded, and uploaders are playing with it a little bit to try to confuse our systems. What we do, particularly in the case of hate content and violent content online, is to tighten the standards within which we identify videos so that we're taking them down even more quickly.

June 4th, 2019 Committee meeting

Colin McKay
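
The “tightening of standards” described above can be read, in abstract terms, as lowering the score threshold at which automated systems act on a suspected violation. The sketch below illustrates only that idea; the threshold values, scores and function name are hypothetical, not Google's.

```python
# Sketch of "tightening the standards": act on lower classifier confidence
# during a crisis so that borderline re-uploads are removed more quickly.
# Threshold values and scores are hypothetical.

NORMAL_THRESHOLD = 0.9   # routine operation: act only on high-confidence matches
CRISIS_THRESHOLD = 0.6   # tightened standard: act on lower-confidence matches too

def should_remove(violation_score: float, crisis_mode: bool) -> bool:
    """Return True if a video's violation score meets the active threshold."""
    threshold = CRISIS_THRESHOLD if crisis_mode else NORMAL_THRESHOLD
    return violation_score >= threshold

scores = [0.95, 0.72, 0.41]
print([should_remove(s, crisis_mode=False) for s in scores])  # [True, False, False]
print([should_remove(s, crisis_mode=True) for s in scores])   # [True, True, False]
```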

Justice committee  To begin with the process itself, as I mentioned, especially in the context of hate content, we are dealing with such a quantity that we rely on our machine learning and image classifiers to recognize content. If the content has been recognized before and we have a digital hash of it, we automatically take it down.

June 4th, 2019 Committee meeting

Colin McKay
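
The digital-hash takedown McKay outlines can be illustrated with exact cryptographic hashing: content whose hash matches a previously removed item is pulled automatically. Production systems rely on perceptual fingerprints that tolerate small edits, which this sketch does not attempt; all names and data here are hypothetical.

```python
# Sketch of hash-based re-upload blocking: content whose hash matches a
# previously removed item is taken down automatically.
# Real systems use perceptual fingerprints rather than exact hashes, so that
# slightly altered re-uploads still match; this is a simplification.
import hashlib

# Hashes of content already judged to violate policy (hypothetical).
known_violation_hashes = set()

def content_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def register_violation(data: bytes) -> None:
    known_violation_hashes.add(content_hash(data))

def is_known_violation(data: bytes) -> bool:
    return content_hash(data) in known_violation_hashes

original = b"bytes of a removed video"
register_violation(original)

print(is_known_violation(original))                         # True: exact re-upload is blocked
print(is_known_violation(original + b" slightly edited"))   # False: exact hashing misses edits
```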

Justice committee  To further refine my point, it's important to realize that YouTube is a platform for content creators, so essentially the people who are on YouTube are trying to build an audience and a community around shared interests. The vast majority of them have given you the information to verify, and to have some level of certainty about, their authority if not their actual identity, whether you're talking about repairing small engines, doing model trains or political commentary, and whether you're talking about traditional news organizations or purely online news organizations.

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  I think from our point of view, I'd back away from bots to a wider perspective, which is that, across our system, we've long had experience with automated attacks on our infrastructure as well as our services. What we have focused on over time is providing the signals to our users that they are being subjected to an automated attack that is trying either to compromise their Google account or their Gmail account, or to present misinformation or disinformation to them.

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  I'm sorry. What was the phrase?

June 4th, 2019 Committee meeting

Colin McKay

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  We were faced with a difficult decision. The legislation was passed in December, and we had to have a system in place for the end of June. We went through the evaluation internally as to whether or not we could take political ads in Canada within that time frame, and it just wasn't workable.

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  They can right now; on Google Maps, you can use the feedback mechanism to identify a particular element of the map. You can identify whether that road is closed indefinitely and it just isn't marked on the map, or whether there has been development since we last mapped that area and there is now a municipal building or some other facility that needs further recognition.

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  I have a two-part response if you'll be patient with me. I think the first is that if we're speaking specifically about YouTube and a platform where you're able to upload information, there isn't a process of verification/authentication, but you do need to provide some reference points for yourself as an uploader.

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  If you'll permit me to look at my notes, I have a very specific definition. For us, hate speech refers to content that promotes violence against, or has the primary purpose of inciting hatred against, individuals or groups based on the attributes I mentioned in my opening remarks.

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  If those show up and they are flagged for review—a user flags them or they're spotted by our systems—we have a team of 10,000 who review videos that have been flagged to see if they violate our policies. If the context is that something is obviously a clip from a movie or a piece of fiction, or it's a presentation of an issue in a particular way, we have to carefully weigh whether or not this will be recognized by our users as a reflection of cultural or news content, as opposed to something that's explicitly designed to promote and incite hatred.

June 4th, 2019 Committee meeting

Colin McKay

Justice committee  Speaking generally and not to that specific instance, if that video were uploaded to YouTube, it would violate our policies and would be taken down. If they tried to upload it again, we would have created a digital fingerprint to allow us to automatically pull it down. The context of how a video like that is shown in news is a very difficult one.

June 4th, 2019 Committee meeting

Colin McKay