House of Commons Hansard #352 of the 44th Parliament, 1st Session.

Topics

Access to Information, Privacy and Ethics; Committees of the House; Routine Proceedings

4:35 p.m.

Liberal

Kevin Lamoureux Liberal Winnipeg North, MB

Madam Speaker, where there is opportunity to bring in legislation and get it through the House of Commons, the government is definitely interested in it. AI and facial recognition is a very serious issue. We have treated it as a serious issue and will continue to do so. The whole responsible advancement of technology on the issue is so critical to our country—

4:35 p.m.

Liberal

The Assistant Deputy Speaker (Mrs. Alexandra Mendès)

Order. It is my duty pursuant to Standing Order 38 to inform the House that the questions to be raised tonight at the time of adjournment are as follows: the hon. member for Calgary Rocky Ridge, National Defence; the hon. member for Renfrew—Nipissing—Pembroke, Carbon Pricing; the hon. member for Courtenay—Alberni, The Environment.

Resuming debate, the hon. member for Calgary Nose Hill.

4:35 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, I will be sharing my time with the member for Montmagny—L'Islet—Kamouraska—Rivière-du-Loup.

What we are doing here today is something called a concurrence debate. It relates to a report that was actually submitted to the House in October 2022, two years ago, on the topic of facial recognition software. This might seem like a very niche topic, but it is really not. Facial recognition software has become pervasive in use, especially here in Canada, and the report provided a set of recommendations on safeguards that could be used to protect Canadians' privacy and their data, as well as to prevent negative social impacts such as the use of facial recognition software to do things like racially profile people from marginalized groups.

The report had some pretty clear recommendations. It was issued in October 2022, and the government has abjectly failed. It has let two years go by without implementing a single one of the recommendations to protect the health, safety and privacy of Canadians. I want to talk about what the government is going to say it did in response to the report, and then debunk it.

The government tabled a bill, Bill C-27, which has two components. It has some content with regard to privacy and some content with regard to artificial intelligence. The problem with the bill is that virtually every type of civil society group, as well as academics and businesses, has panned both components of the bill for a variety of reasons. Many members of the House have asked for the bill to be split so that the two very disparate topics could be studied separately. The government has refused to do that. Most importantly, the bill contains absolutely nothing on facial recognition, absolutely nothing that materially addresses the recommendations in the report.

That is why when the Liberals stand up and talk about this, they have to dance around the issue. My colleague from the NDP rightly asked how many of the recommendations had been put in place. The answer is zero.

I am going to outline what the key failures of the bill are and then what the impacts of that are on Canadians. This is not necessarily a front-burner issue, but I think it was really important that the report was brought forward today, because it is something Canadians should be concerned about.

There are problems with unregulated use of facial recognition. I know this can sound really technical for some people, but I have to explain how pervasive it is. If someone were to walk into a shopping centre today, there is absolutely nothing stopping that shopping centre from using high-definition cameras to capture their every move, capture their biometric data, attach it to other profiles that the person might have with other companies and then use that information to make a profile on them about what they can afford and how they could be targeted for advertising. In really bad cases, they could be targeted for negative security experiences.

This is a very pervasive technology. Basically, anywhere there is a camera, facial recognition software can be and is likely being used. It is being used not just by the private sector; it is also being used by governments, and there are almost no limits on what the Liberal government can do with facial recognition software in Canada today. That is highly problematic for several reasons.

First of all, it is a massive invasion of Canadians' privacy; many times, they do not even know it is happening. That is because of the lack of regulation. The failure of the government to address the recommendations and put regulations into Bill C-27 means that Canadians' privacy is at risk. They do not have the ability to consent to when and how facial recognition software can apply to them. The second thing is that this opens them up to big-time data misuse.

As I said in the shopping centre example, there is really nothing preventing a shopping centre from selling biometric data and putting together a broader profile on somebody to be used for any purpose, without that person's ability to reject it on moral grounds. Under the fundamentals of privacy in Canada, we should have the right to reject it. I would almost argue that it is a human right.

The other problem is that it can lead to discrimination and bias. Many studies have shown that facial recognition software actually treats people of colour differently, for a wide variety of reasons. Of course that is going to lead to discrimination and bias in how it is being used. There should be restrictions on that to maintain Canada's pluralism, to ensure equality of opportunity and to ensure that people of colour are not discriminated against because of a lack of regulation. To reiterate, none of these things are in Bill C-27.

The unregulated use of facial recognition software, because the government failed to regulate it in Bill C-27, can also lead to suppression of speech. Let us say that a government wanted to use facial recognition software to monitor people on the street. There would then be, within different government departments, some sort of profiles on who people are, what they do or what their political beliefs are. If government officials see them and maybe a few of their friends coming from different areas and walking to a gathering spot, that could, in theory, be used to disrupt somebody's right to protest. There are absolutely no restrictions on that type of use by government in Bill C-27.

We can also see how facial recognition could be used by the government for extensive overreach. Many members of this place will talk about wrongful convictions with respect to facial recognition software. There have been cases where facial recognition software was used to lead toward an arrest or a warrant. Because there are not clearly defined limits or burdens of proof for the use of the technology, it can lead to wrongful arrests and convictions as well.

It leads to a loss of anonymity. I think we have the right to be anonymous, certainly in this country, but that right has been breached without even any sort of debate in this place, because the government has failed to put the regulations into Bill C-27.

Frankly, the lack of regulations, particularly on government use of facial recognition technology, also means that there is a lack of our ability as legislators to hold the government to account on whether or not it is overreaching. Because we do not have the requirement in law for governments to be transparent about how they are using facial recognition software, we cannot in this place say whether there has been an overreach or not. It is very difficult to get that information.

To be clear, Bill C-27 has been panned at committee by civil liberties groups and civil society groups because of three things: It fails to define “biometric function” as sensitive data, fails to provide clear restrictions on when and how businesses and government can use facial recognition technology, and fails to provide adequate safeguards for individuals, especially regarding consent and the potential for discriminatory outcomes. The bill is a failure. It should have long been split, as has been the request of multiple parties of this place.

Furthermore, the reality is that we have not had the debate in the House of Commons on what the guidelines should be for facial recognition technology. What the government has proposed to do in Bill C-27 is to take that out of this place, this vital debate, and put it in the hands of some Liberal-controlled regulator to be determined behind closed doors, with big tech companies, not us, setting the boundaries on that. That is wrong.

I want to talk about what the government has done. First of all, it has put unfettered use of facial recognition software out into the public. It has failed to define it in Bill C-27. Then it went one step further. Bill C-63, the government's massive draconian censorship bill, would go one step further in putting a chill on Canadian speech. It is another layer of Canada's loss of privacy, Canada's loss of speech and Canadians' loss of rights.

When the government stands up and talks about Bill C-63, the draconian censorship bill, as somehow being a response to facial recognition technology, this is not only laughable; it should strike fear into the heart of every Canadian. All of these factors combine to really put a chill on Canadians' privacy, their right to assembly, their right to freedom of speech and their right to live their life without government intrusion or the intrusion of merchants who might be using their biometric data to sell it to other companies.

It is just insane that Canada has not acted on this. We know that the Liberal government has not acted on it because it is in chaos right now. It has so many scandals, spending crises and ethical breakdowns. However, the one thing it has been focused on is censorship. That is because it does not want Canadians to hold it to account.

I am very glad that the report is being concurred in in the House. I find it an abject failure of the Liberal government that it has not acted on the recommendations, which, frankly, are non-partisan and should have been put into law a long time ago.

4:45 p.m.

Winnipeg North, Manitoba

Liberal

Kevin Lamoureux Liberal Parliamentary Secretary to the Leader of the Government in the House of Commons

Madam Speaker, we have just seen a demonstration of what Stephen Harper did, which was absolutely nothing for modernization whatsoever. In fact, it is a continuation of that because the Conservatives still do not want anything to do with it. That is the reason why they do not even advance the legislation.

The member was just critical of Bill C-63. In essence, Bill C-63 says that, if someone's partner or ex puts inappropriate pictures onto the Internet without their consent, that is wrong. They should not be able to do that.

The Conservative Party says, “Who cares?” It is not even going to let Bill C-63 be debated to get it to the end of second reading. It will never make it to committee if it is left up to the Conservative Party. They are stonewalling it. They are taking a Stephen Harper approach to the issue, and that is to do nothing but complain.

4:45 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, it was former prime minister Harper's government that introduced legislation to stop revenge porn. That was the first law that passed in the House of Commons in response to many terrible incidents. That was a Conservative bill that was passed. Bill C-63 does not do that.

The bill that would do what the member opposite was talking about is a bill that I wrote, Bill C-412. My bill, Bill C-412, would protect people from the non-consensual distribution of intimate images created by artificial intelligence. It includes a digital restraining order for women who are being stalked by people online and a regulated duty of care for how online operators must treat children. We would do all of that without a $200-million bureaucracy, which C-63 proposes, and without a massive impingement on Canadian speech through the reiteration of section 13 of the Canadian Human Rights Act.

We in opposition did what the government should have done a long time ago. I am very proud of that. I am proud of my caucus colleagues. It is more of what Canadians can have, and what they can hope for and look forward to, when the Conservatives form government after the next election.

4:50 p.m.

Conservative

Corey Tochor Conservative Saskatoon—University, SK

Madam Speaker, we need to get back to who is watching the watchers. What is going on with the instructional handbook of Nineteen Eighty-Four? It is bizarre what is happening with this regime, but we have seen this before. Failing regimes during their dying days always reach for the power of the state, the fist of the government to crush opposition. I think there are some similarities with what the government is doing right now with censorship in Bill C-63 and all the censorship bills the government is trying to use to control our society.

I would like to hear my colleague's comments on that. Is this a failure of the government to react to this report, which clearly spells out some recommendations?

4:50 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, the government has failed on the fronts that my colleague mentioned in two ways: action and omission. On action, the government has censored Canadians through Bill C-11, which has had a massive effect on YouTube creators, censoring who gets seen and who does not. Bill C-18 has resulted in a news ban for online media platforms, so Canadians cannot get the news. It has also put many newsrooms out of work, so now the government cannot be held to account. Now the government is proposing Bill C-63, which will lead to a kangaroo court, wherein any Canadian could be dragged through with vexatious complaints based on their political opinions.

As well, through omission, by not putting limits on facial recognition software, the government can overreach and use Canadians' biometric data without any limitation. All of that leads to a police state, a censorship state, and something that every Canadian, regardless of political stripe, should be absolutely opposing with every fibre of their being.

4:50 p.m.

Liberal

Kevin Lamoureux Liberal Winnipeg North, MB

Madam Speaker, there is no paranoia there. One has to wonder about the collection of little dots put here and there to try to spook or scare Canadians.

Government can actually be a valuable resource in supporting Canadians. Would the member not agree that government does a lot of good?

4:50 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, I wish I had five hours.

I will just say this: Bill C-18 is one small example of what the government has done. Bill C-18 has resulted in the complete decimation of Canada's media ecosystem. There is virtually no local reporting. There is a ban on sharing news on social media platforms.

The government wants an ill-informed, censored population so that it cannot be held to account.

4:50 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Madam Speaker, I am pleased to rise to speak today.

I thank my colleague for pronouncing my riding name so well. She did a very good job. Above all, she has a wealth of experience, having been a minister in a previous government, which did a great deal for technology, among other things.

We are talking about a report on facial recognition technology that was tabled two years ago. The reality is that the government has had two years to act on the report's recommendations. Unfortunately, it has done nothing.

Many of our colleagues here have talked about Bill C-27. I have the pleasure of serving on the Standing Committee on Industry and Technology, which is responsible for Bill C-27. It is important to understand that facial recognition is nowhere to be found in Bill C-27. It is a bill on artificial intelligence and privacy, but there is not a single line in that bill that talks about facial recognition.

I would like to review the chronology of events surrounding Bill C‑27. This is important, because it gives us one more opportunity to consider how the government operated. Earlier on, my colleague from Winnipeg North said it was transparent and proactive, that it was doing lots of things, that it had introduced bills, and that it was holding consultations. I have news for him: On June 16, 2022, two and a half years ago, Bill C‑27 was introduced for first reading here in the House. On November 4, 2022, six months later, we debated it at second reading. The bill reached the Standing Committee on Industry and Technology on April 24, 2023, another six months later. However, Bill C‑27 was delayed when other government legislation was given extended consideration, including Bill C‑34 and Bill C‑42. Therefore, to some degree, the government deliberately delayed consideration of the bill.

During the study of Bill C‑27, we heard from numerous witnesses. We learned that 300 groups had been consulted. The problem is that they were consulted after the bill was introduced, not before. Surely, if the minister had consulted the organizations beforehand, he might have been able to include something about facial recognition in his bill. It is good to hold consultations, and we have absolutely nothing against that. It is an important thing to do, but ideally, it should be done before the bill is introduced, to avoid situations like the one we are in now, namely that we are still debating Bill C‑27 at the Standing Committee on Industry and Technology. I think there are roughly 250 amendments, including 55 amendments that the government moved to its own bill. How can such a thing happen? How can the government introduce a bill and then move 55 amendments a year and a half or nearly two years later? Someone somewhere must have done a bad job drafting the bill if, after introducing it, the government ended up consulting 300 groups and moving 55 amendments. We call that working backwards.

On September 26, 2023, we began studying Bill C-27, and we heard from the industry minister, who, we know, is an excellent salesman. I will give him that. Since the member for Winnipeg North told us to try to say nice things about what the government is doing, I will do just that. The government has an excellent Minister of Industry. He is a good salesman. I have no doubt he could “sell fridges to the Eskimos”. It is incredible. That said, I think that as the bill progressed, the minister was put in a position where he should have backed down, in a sense.

Contrary to what my colleague from Beauport—Limoilou said earlier, Bill C-27 does not cover a whole slew of topics. It covers two: artificial intelligence and privacy. The part of the bill on privacy is what we are debating right now. The progress of Bill C‑27 has been hampered because the Liberals want to establish a tribunal, even though no other country in the world has done that. We do not want this bill to establish a tribunal. There are already other authorities that could do this work, such as the Privacy Commissioner. We do not want to create an additional authority because that would require additional funds.

We also want Bill C-27 to move forward. The minister keeps telling us that Mr. Bengio from the University of Montreal is the father of AI in Canada and basically in the world. When Mr. Bengio appeared before the committee, he said that we needed to act quickly. We want to, but the reality is that the bill is ill-conceived. The very first witnesses who appeared before the committee told us that this bill is poorly designed.

First, artificial intelligence should have been addressed in a separate bill rather than bundled together with privacy, even though we agree that these two topics have elements in common. That does not necessarily mean that the two topics needed to be addressed in the same bill.

We moved several amendments to this bill. I must say that the committee is working collaboratively. In some committees, there are attacks, it is very politicized, it is very political and it is very partisan. I must say that at the Standing Committee on Industry and Technology, we all work very collaboratively. We try to move bills through as quickly as possible, but in the case of Bill C‑27, that was unfortunately not possible.

Other events took place in 2023 and 2024. I think we have done an amazing job. At committee, many witnesses came to talk about artificial intelligence itself, and their testimony was very interesting. One witness in particular surprised us a bit. They practically said that we are facing a third world war, a technological war that will be fought not with weapons, but with AI. We were a bit shaken when the witness told us that. We thought they were being a bit alarmist, but the reality is that we heard very solid arguments from the experts from across Canada who also appeared at committee on this topic, at the invitation of the various political parties.

Europe has just passed legislation on artificial intelligence. Here in Canada, if the government had been willing, this bill could have been split up to separate the two subjects. We could still do that. Right now, we could limit ourselves to resolving the issue of AI, in line with what just passed in Europe and what is about to pass in the United States. Their bills have been studied extensively. Quebec already has a law in effect, Bill 25. It is not fully aligned with the legislation that will be created in Canada. A number of legal experts told us that all the provinces' laws absolutely must be consistent with the federal legislation. All of these things come into play.

Facial recognition is a fundamental point when it comes to Canadians' quality of life. We have to make sure that people will not be identified by technology that will allow racial profiling, for example. Obviously, we do not want that anywhere. Just two weeks ago, a former Montreal police chief said that there was racial profiling in Montreal. The City of Montreal will probably be charged for that. Things would be even worse if we had tools to facilitate racial profiling.

I see that my time is up. I am happy to answer questions.

5 p.m.

Winnipeg North, Manitoba

Liberal

Kevin Lamoureux Liberal Parliamentary Secretary to the Leader of the Government in the House of Commons

Madam Speaker, I appreciate the member opposite talking about the standing committee. It is an interesting contrast; the current government has always encouraged the passage of bills through second reading so that they can go to committee, which allows for input. I would like to think we are not being criticized for listening to what people say around the table and making amendments accordingly when we feel they give strength to the bill.

It is interesting that we are talking about AI; in one committee, AI was used by the Conservative Party to generate 20,000-plus amendments to one piece of legislation. That highlights the fact that there are those who abuse AI for what I would suggest are mischievous reasons. This is what the Conservatives did in trying to add to a filibuster using AI.

Would the member not agree that most Canadians would see it as somewhat silly to use AI as a mechanism to assist the Conservative research team?

5 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Madam Speaker, that is complete disinformation. The 20,000 amendments he is talking about were not written with artificial intelligence, but by my colleagues here who work at the office.

Again, we are trying to work collaboratively in committee. Now we are being accused of things that are not even true. It is a real shame to see because we have a duty to all Canadians to pass legislation to regulate new technologies of the future. They are not just at our door, they have entered the house. It is important to do this as soon as possible, but they prefer to attack us with nonsense like this.

5:05 p.m.

Bloc

Julie Vignola Bloc Beauport—Limoilou, QC

Madam Speaker, my colleague spoke about Bill C‑27. He pointed out that it is not a mammoth bill, but that it should be split in two. That way, we could actually take a comprehensive look at AI and make the necessary amendments, since our country currently has no legislation related to AI.

We are in the most democratic minority government, where everyone can sit around the table to negotiate and discuss. What does my colleague think of the Liberal government's refusal to negotiate and split Bill C‑27?

October 9th, 2024 / 5:05 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Madam Speaker, in fact, we are waiting for the minister. He asked us to tell him what needs to be done. We quickly sent him our response so that we could settle Bill C‑27. We are waiting for his reply. Unfortunately, we still have not received it. He travels all over the world. He is a good salesman. However, when it comes to fixing things, it just is not happening.

5:05 p.m.

Conservative

Larry Maguire Conservative Brandon—Souris, MB

Madam Speaker, I want to thank my colleague for his excellent presentation on Bill C-27. He mentioned that the government brought forward 55 amendments to its own bill. We just saw a response from the government containing some incorrect information regarding amendments Conservatives put forward and how they were put forward.

Could you comment on how ill-prepared the government was when it had to make 55 amendments to its own bill?

5:05 p.m.

Liberal

The Assistant Deputy Speaker (Mrs. Alexandra Mendès)

I cannot comment, but the hon. member for Montmagny—L'Islet—Kamouraska—Rivière‑du‑Loup can.

5:05 p.m.

Conservative

Bernard Généreux Conservative Montmagny—L'Islet—Kamouraska—Rivière-du-Loup, QC

Madam Speaker, as I explained earlier in my speech, the government introduced Bill C‑27 and then it consulted 300 groups. Ideally, it should have consulted those groups before introducing the bill. That would have been the right thing to do. This government is always introducing bills and then proposing a pile of amendments in committee. That is what we call doing things backward, or not doing them right. Unfortunately, that is what has been happening for the past nine years.

5:05 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Madam Speaker, I will be splitting my time with the always hon. member for Windsor West.

I will begin by setting the record straight right off the bat for the hon. member for Winnipeg North. I was actually proud to participate, as the NDP critic on the committee, in studying and drafting the access to information, privacy and ethics committee's facial recognition technology report, “Facial Recognition Technology and the Growing Power of Artificial Intelligence”.

Today's concurrence motion on our standing committee report, although the report is two years old, remains perhaps even more important now, as the technology continues to outpace any legislative regulation and, in my opinion, any ethical restrictions. This important and timely work addresses the critical issue of the use of facial recognition technology and its growing power, especially within law enforcement and other sectors of society.

As the ethics critic for the NDP, I believe that it is vitally important to scrutinize this technology through the lens of equity, accountability and human rights. The Standing Committee on Access to Information, Privacy and Ethics produced this extensive report. Throughout our study, we heard the concerns of 33 witnesses, many of whom were raising alarm bells about the disproportionate harms inflicted on racialized communities by the unchecked deployment of these technologies.

Let us start with the facts. Facial recognition technology systems are often powered by AI and are hailed for their ability to supposedly streamline processes, verify identities and assist in law enforcement operations. However, the evidence shows that this technology is far from neutral. As we heard from multiple witnesses, including privacy advocates and experts, facial recognition technology is riddled with algorithmic bias, and its misuse can have severe life-altering consequences for people who are already marginalized by society. Witnesses like Cynthia Khoo from the Center on Privacy and Technology at Georgetown Law School, Angelina Wang and Christelle Tessono from Princeton University made it clear that facial recognition technology is 100 times more likely to misidentify Black and Asian individuals. For darker-skinned women, the misidentification rate can exceed one in three.

Now, the system works nearly perfectly for white men, but for racialized individuals, especially Black and indigenous people, it is a flawed and dangerous tool that amplifies the biases already present in our institutions. We have heard time and time again about cases in the United States where Black men were wrongfully arrested due to the errors of facial recognition. Robert Williams, Nijeer Parks and Michael Oliver were all victims of a broken system that disproportionately criminalizes Black bodies. Although no such cases have yet surfaced in Canada, we cannot ignore the very real possibility of this happening here. We know there is systemic racism within our own police forces, a fact acknowledged by the House of Commons Standing Committee on Public Safety and National Security. The use of facial recognition technology, FRT, only serves to exacerbate the problem.

The committee also heard from civil liberties groups, like the International Civil Liberties Monitoring Group, ICLMG, that the use of this technology by law enforcement is not just flawed but fundamentally dangerous. We are seeing the potential for mass surveillance without public consent or adequate oversight. Tim McSorley of the ICLMG warned us that this is already happening. The RCMP admitted to using FRT tools like Clearview AI to track individuals without public knowledge or legal safeguards. This is surveillance of our most vulnerable communities under the guise of security, and it is unacceptable.

However, the harm does not stop at law enforcement. We must consider the broader societal implications. Facial recognition technology is not just about identifying criminals; it is also about tracking people in public spaces, at protests or even as they shop. This is a direct threat to fundamental rights: freedom of expression, freedom of assembly and the right to privacy. Let me be clear: those most affected by this are the very communities that are already subject to overpolicing: Black, indigenous and other racialized people.

Beyond this, we must acknowledge the wider context of how big tech companies, like Google, operate in these grey zones between public-facing ethics and the pursuit of profit through military contracts. Google's involvement in military projects, like Project Nimbus and Project Maven, facilitated through its venture capital firm, Google Ventures, is a stark example of this hidden agenda that is unfolding right now in the genocide in Gaza.

Project Nimbus, a cloud computing contract among Google, Amazon and the Israeli government, facilitates military operations. Critics argue that these operations contribute to surveillance and human rights violations, particularly in occupied Palestinian territories. They test it there, and then they export it around the world. Similarly, Project Maven was a highly controversial initiative in which Google partnered with the U.S. Department of Defense to develop AI technologies that improved drone targeting capabilities, technologies that have a devastating impact on civilian populations.

Although Google publicly distanced itself from Project Maven after internal protests, we know that the company's venture capital firm, Google Ventures, continues to invest in defence and AI companies with military applications. This allows Google to maintain financial stakes in military advancements even as it outwardly claims to step away from these projects. They include activities that are currently under ICJ investigation as war crimes by the Israeli government on the people of Palestine in Gaza and the West Bank. Former staffers who once worked on such military contracts as Maven continue to find themselves in start-ups funded by Google Ventures, ensuring that the ties between big tech and the military remain intact.

The use of drone technology and AI in warfare is expanding. We have seen military drone dogs that are armed and have the ability to track down people, including civilians. Therefore, Google's involvement in these venture capital activities demonstrates that these corporations are still very much engaged in these projects, although through more covert financial channels. While Alphabet may distance its brand from military contracts, it continues to benefit from and shape the future of warfare.

There is a revolving door between tech companies and the military-industrial complex, which is facilitated by investments from companies such as Google. This underscores the ethical concerns that we must address as a Parliament. The government's role in regulating these technologies is crucial. There are 18 recommendations that came out two years ago, and I challenge the hon. members on the Liberal side to stand up and actually talk about what meaningful action has happened over those two years.

This is crucial to protecting privacy and civil liberties. Not only that, but it is about preventing big tech from operating unchecked in areas that have profound implications for human rights. The report does not just outline the harms; it also provides a path forward, with several key recommendations that are necessary to mitigate these risks. I asked the hon. member for Winnipeg North to please refer to the recommendations and come ready to talk about them in a meaningful way.

To be clear, the committee has called for immediate action, including a federal moratorium on the use of facial recognition technologies by police and Canadian industries unless they consult with the Office of the Privacy Commissioner and obtain judicial authorization. I would extend this even further to say that these moratoriums ought to include any type of technology, be it deemed primarily lethal or part of a lethal technology that could be used in conjunction with the ongoing genocide in Gaza and the West Bank. Such technology needs to be subject to an immediate arms embargo.

Furthermore, we need stringent measures for transparency and accountability. An AI registry must be established in which all algorithmic tools used by any entity operating in Canada are listed. Civil society must be actively involved in these discussions, particularly those representing marginalized groups. Witnesses such as Dr. Brenda McPhail from the Canadian Civil Liberties Association warned that, even if the technology were flawless and 100% accurate for every person, it would still pose a danger. This is because it would be perfect for the discriminatory gaze of law enforcement, making it easier to target racialized communities that already face systemic discrimination.

I will save the rest of my content for any interested or curious questions that might come my way.

5:15 p.m.

Winnipeg North, Manitoba

Liberal

Kevin Lamoureux Liberal Parliamentary Secretary to the Leader of the Government in the House of Commons

Madam Speaker, as I indicated to all committee members who participated in coming up with the report, I can appreciate that there are many different reports formulated in our standing committees. I recognize the efforts of those who came before the committee to present and of the committee members who actually came up with the final report themselves. I am sure the member would be aware that there was a ministerial response to the report. Has he had the chance to read through it? If so, could he indicate which recommendation he feels the minister did not address? I think there are 19 in total.

5:15 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Madam Speaker, we must ensure that the Canadian Human Rights Act addresses discrimination caused by AI technology, and the Privacy Commissioner needs to have the power to impose meaningful penalties on those who violate the law.

5:15 p.m.

Conservative

John Brassard Conservative Barrie—Innisfil, ON

Madam Speaker, it is interesting to hear the member's perspective given that he was a member of the ethics committee when this report was drafted. He did draft the report and, obviously, read it.

One of the issues that has come up consistently over the last nine years with the government, and we have heard directly from the Privacy Commissioner on this on several occasions, is that oftentimes the Privacy Commissioner is considered an afterthought. They are not even consulted on any of the legislation. The office and the commissioner are not consulted in such a way that they could proactively provide advice to the government in order to avoid many of the pitfalls the hon. member spoke about.

I am wondering if the member could speak about the importance of the involvement of the Privacy Commissioner. As the member just said, the Privacy Commissioner has also asked for the ability to levy fines if the government is not following the privacy laws of this country.

5:15 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Madam Speaker, the truth is, many of our arm's-length commissioners do not have the resources, the teeth or the legislative ability to really dig in to hold government accountable. Yes, they might have mandates, but what we found at committee, if the hon. member will recall, is that the RCMP often refused to answer our questions fully and truthfully. It did not have what is called “a duty of candour” in terms of being able to answer questions and be held accountable. We often found at committee that the RCMP would send members of the law enforcement chain of command who did not have adequate answers.

If they were doing that to us, as parliamentarians, I can only imagine how frustrating the process is for privacy commissioners, as an afterthought. I cannot recall exactly what the number was, but we only scratched the surface on how this is being used ubiquitously across government. What we came to find out was that the vast majority of departments never once considered a privacy impact assessment, even though that was required by their departmental mandates.

5:20 p.m.

Bloc

Denis Trudel Bloc Longueuil—Saint-Hubert, QC

Madam Speaker, I found my colleague's speech very interesting.

When it comes to new technology, whether it be AI or facial recognition, it is very troubling to see what states like China, Iran and Russia are doing. They can use these technologies against people in their country or around the world who are protesting.

I recently read a book by Portuguese author José Rodrigues dos Santos, a former journalist. He wrote a novel about how China is using AI against Uyghur communities. It was rather terrifying.

I would like my colleague to talk a bit about how Canada is always lagging behind when it comes to new technologies. Right now, in Quebec, many people are looking into the impact that screens are having on young people. Screens are here. They are already part of our lives. It is a bit too late now. It is the same thing with AI. AI is already here. Some states are already using it. Here in Parliament, we are presenting reports and talking about the effects of these new technologies.

What does my colleague think that we could do to make sure that we are not always lagging behind when it comes to new technologies?

5:20 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Madam Speaker, the NDP stands for genuine accountability and the protection of civil liberties. We do not have to look abroad. Although there are many good and legitimate cases abroad of this technology being abused, it is being abused right here in Canada. It is being abused by corporations left unchecked and a government that simply refuses to act.

5:20 p.m.

NDP

Brian Masse NDP Windsor West, ON

Madam Speaker, I am pleased to follow my colleague and I thank him for his really good work with regard to this report, which was issued in October 2022.

It is sad that we have not seen the government use this report for what it should have been used for. It is a call for action to deal with many of the issues of artificial intelligence, and it does justice not only to areas of concern but also to some of the good that AI can do, as my colleague referenced, when it is applied under conditions of oversight and due diligence, knowledge and awareness. It also looks at the vulnerabilities of AI as it is being built out.

I have had the opportunity to attend several conferences across the United States and Canada on artificial intelligence, and I can say that we are missing the opportunity to act in a responsible fashion. My colleague mentioned some practical examples, and I will return to those in a few minutes. I want to start by identifying that at the industry committee, Bill C-27, to deal with artificial intelligence, has been languishing since the start of this Parliament. That bill was tabled by the government and not a single thing took place with respect to it for a full year. We had a series of hearings and discussions with testimony that lasted weeks upon weeks to get to the bill, and at that time, we identified several problems.

There are two key components the New Democrats have been pushing for with regard to this bill that are important right now. The issues over privacy, for which there seems to be a path forward to a resolution, were part of the bill. Then the government decided to put artificial intelligence in the bill as well, which complicated the bill. The government tried to sneak one past everybody by combining these pieces of legislation, which was not necessary. In fact, it was the member for New Westminster—Burnaby who got the bill separated for votes in this chamber, which we can still have, but the bill should never have been put together like this. The protection of Canadians' privacy should have come first, before we even went to testimony on artificial intelligence, instead of the government trying to sneak one by the Canadian public.

My colleague from Hamilton has outlined some of the deficiencies of artificial intelligence related to facial recognition, which this report speaks to. However, artificial intelligence, in some of the models that have been developed to date and that people use, also already shows biases with regard to race, religion and the inputs it is given. I have heard from the Amazons and the Googles at different conferences, and they admit to their failures in creating algorithms. They have biases around race and gender built and baked into their systems, because the people generating AI are not diverse and do not have to deal with the consequences for the people being identified and misidentified, mostly on the basis of not being white and male. That is a known fact in the entire universe of AI.

In fact, at the time the government tabled the bill, a number of AI scientists broke from the major conglomerates to warn humanity about that. However, we have seen what has taken place because of how badly the bill was put together, as we have over 200 amendments on this bill alone. As referenced here in the chamber by one of my colleagues on the committee, over 50 amendments were from the government, which tells us how badly it was crafted.

Those are very important factors to identify, because we are passing up on protecting Canadians' privacy and on updating the powers of the Privacy Commissioner. That is identified through several excellent recommendations in this report, which call for action. Despite that, not only have the Liberals done nothing, but on top of that, they filibustered their own bill. Even in the past week, when the minister was in Montreal, the Liberals blamed the committee and the opposition for holding up the bill. His own members filibustered their own bill before we broke at the end of the last session. That is what took place in committee, and they blamed us publicly.

I asked the minister at committee just last week whether he regretted his comments or at least wanted to clarify them, but he doubled down. We have been requesting amendments to deal with the Privacy Commissioner and to protect Canadians, which they know of, but the Liberals are hanging onto the idea that we want to be complicit in an AI strategy that is not fundamentally vetted and has the not-for-profit community, the public and the academic community all concerned.

The Googles, the Amazons and all the others that are going to benefit from this are not concerned, and that is why they are clinging to keeping the bill together. What I want to talk about, in terms of how we can move forward, is the NDP's proposal to deal with the one remaining point that is a problem. It is a point that has united the other members on the committee, the Conservatives, the Bloc and the NDP, who are concerned about a tribunal system being set up over the Privacy Commissioner.

We have concerns about that because the Competition Bureau has a tribunal over top of it. When New Democrats called for stopping the takeover of Shaw by Rogers, the government allowed the Competition Bureau to be sued by Rogers itself for $5 million for doing its job. The New Democrats defended the Canadian public. They defended the position that should have been taken, which was not to let this takeover take place. On top of that, the public was punished by not even having its representative able to carry the case without the repercussions that Rogers and Telus were allowed to impose.

To wrap up quickly, the real repercussions are as follows: We have seen the Lavender project used by the Israel Defense Forces, using artificial intelligence, as a practical situation that has cost human lives. Today, this has consequences for thousands of families in Gaza. It is a real situation that has arisen since this report was published, and it shows that artificial intelligence in the military needs oversight and control.

I agree with my colleague and the rest of the committee in their call for halting artificial intelligence facial recognition right now until we get some controls. It is about time the Liberals actually came to the table with solutions instead of piling up problems now and more problems in the future.