Evidence of meeting #2 for Science and Research in the 45th Parliament, 1st session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Before the committee

Ljubicic, Professor, McMaster University, As an Individual
Pinker, Johnstone Family Professor of Psychology, Harvard University, As an Individual
Shariff, Professor, The University of British Columbia, As an Individual
Cobey, Scientist, University of Ottawa Heart Institute, As an Individual
Karram, Assistant Professor of Higher Education and Coordinator, Higher Education Graduate Program, University of Toronto, As an Individual
Larivière, Professor, Université de Montréal, As an Individual

Kelly DeRidder Conservative Kitchener Centre, ON

Thank you.

Thank you to all the panellists for being here today.

My question is for you, Mr. Pinker. You mentioned that, with the system as it is today, when anyone goes against the current orthodoxy, it creates a loss of trust in science. I think this is the most detrimental effect of what's happening today in the DEI space.

I'm going to echo very quickly an article that came out: “a fellow academic scientist...said, 'I have made my peace with EDI. I will lie about my most deeply held beliefs or convictions on paper in order to get funding.'”

How would you assess the current merit-based criteria for federal funding in Canada? How will that trust be eroded in time, and how quickly, especially in innovation hubs like Kitchener Centre, where I'm from?

12:05 p.m.

Steven Pinker Johnstone Family Professor of Psychology, Harvard University, As an Individual

Yes, well, I want to echo Professor Shariff in saying that there's a ripe opportunity for Canada to poach American scientists. Even as a professor at Harvard, I would like to say, “Poach us.”

The situation in the United States is threatened. The only disadvantage that Canada has is that it has a reputation for being woker than the United States and for there being possibly onerous requirements on the range of opinions expressed—the racial and gender preferences for the Canada chairs, for example.

I guess I would urge Canada not to squander the opportunity by imposing distortions of science coming from the other direction.

The Chair Liberal Salma Zahid

Thank you.

Now we will end this panel with MP McKelvie for one minute.

Jennifer McKelvie Liberal Ajax, ON

Thank you, Chair.

Earlier, a member opposite mentioned world views and the importance of incorporating world views. I am a western scientist. That's how I've been trained. That is the world view I use, but I also know that if I look at the world only in that context or if we use only that context, it is incomplete. The example I want to use with regard to that is indigenous knowledge. In the environmental sciences, where I come from, we know that the first nations of this land already knew that there was an ice age in the past. They already knew that the Great Lakes were in a different location. There is a tremendous amount of knowledge that is there, especially around sustainability, so it's important that we work together.

My question is for our first speaker, Dr. Ljubicic.

Could you just speak to the importance that you see of indigenous knowledge and of collaboration and partnerships so that we can further the field of science?

12:10 p.m.

Gita Ljubicic Professor, McMaster University, As an Individual

Thank you so much for your important recognition of indigenous knowledge.

Yes, this has been what I've been working on for my career, learning primarily from Inuit knowledge holders, but also from first nations and Métis partners, over time. It's so important to learn from indigenous perspectives. They're the experts in their lands and in their ways of life. There's so much inspiration and innovation that can be learned from a very holistic way of thinking and the connections between people and their environments in all ways. We've worked so hard—and many others have, as well—to ensure that indigenous knowledge can be learned very respectfully within partnerships. This relates a lot to the actual methodologies for working together.

I would be happy to provide a more in-depth written response, since I know we're short on time.

The Chair Liberal Salma Zahid

Now we will have to end this panel.

I want to thank all the witnesses for their important testimony. If there is anything you would like to bring to the attention of the committee, you can always send written submissions to the clerk, and those will be circulated to the members.

With that, we will end this panel, and we will suspend the meeting for two minutes so that we can have the next panel.

Thank you. Thanks a lot for coming today.

The Chair Liberal Salma Zahid

Welcome, everybody.

I would like to make a few comments for the benefit of the new witnesses. Please wait until I recognize you by name before speaking. Those participating by video conference can click on the microphone icon to activate their mic. Please mute yourself when you are not speaking. Those on Zoom can select the appropriate channel for interpretation at the bottom of the screen: floor, English or French. Those in the room can use the earpiece and select the desired channel. I will remind you that all comments should be addressed through the chair.

For this panel, I would like to welcome Dr. Kelly Cobey, scientist at the University of Ottawa Heart Institute. We are also joined, via video conference, by Dr. Grace Karram, assistant professor of higher education and coordinator of the higher education graduate program at the University of Toronto. Our third witness for this panel is Mr. Vincent Larivière, professor, Université de Montréal.

Welcome to all the witnesses.

Each of you will have five minutes for your opening remarks, and then we will proceed to the round of questioning. We will start with Dr. Cobey.

Dr. Cobey, please go ahead. You have five minutes. Thank you.

Kelly Cobey Scientist, University of Ottawa Heart Institute, As an Individual

Thank you, Madam Chair and members of the committee, for the invitation to discuss the impact of federal funding criteria on research excellence in Canada.

I am a scientist at the University of Ottawa Heart Institute and an associate professor at the University of Ottawa. I also co-chair an international initiative called DORA, the Declaration on Research Assessment. DORA operates globally and across all disciplines. Our recommendations at DORA apply to funding agencies, academic institutions, journals, metrics providers and individual researchers. DORA advocates broader assessment criteria to acknowledge the diversity of researcher activities.

Our meeting today comes at a time when the criteria to assess researchers in this country are shifting. Historically, decisions were based on quantitative metrics, such as the number of articles we published, the journal impact factor of where those publications sat and the amount of funding that we brought in. Quantitative metrics are easy to calculate, which makes them convenient for assessing a lot of people very quickly. Unfortunately, they're not evidence-based, they're not responsive to changes in the research ecosystem and they can't be used for any mission-driven goals of the federal government.

The misuse of the journal impact factor, as well as the overemphasis on quantitative metrics, has created a culture in the research ecosystem of “publish or perish”. As researchers, we often feel that the surest or only pathway to success in our domain is through publishing more and doing more, with less emphasis on quality and more on quantity.

However, presently in Canada, we're seeing a principled shift away from these quantitative metrics and toward consideration of qualitative metrics that consider a broader impact of research. Canada's tri-agencies signed DORA in 2019 and have been working to implement its recommendations since then. This process is an evolution, not a revolution. In my view, Canada is becoming active on the global science policy stage with respect to the criteria to assess researchers. The tri-agencies are actively involved in DORA's community of practice for funders, they have a leadership role in the Global Research Council's research assessment committee and, through SSHRC, they have joined RORI, the Research on Research Institute.

Concretely, as researchers, we see recent changes that have had a widespread and immediate impact on us. For example, CIHR has an entirely new research excellence framework that now considers research excellence across eight domains, one of which is open science. The tri-agencies as a collective are implementing a new narrative CV, which sounds exactly like what it is: a descriptive report on what a researcher is doing, how they did it and why it had an impact. This is replacing the traditional CV, which was much more a list of outputs than a qualitative, nuanced assessment.

This new format requires researchers and reviewers alike to be trained in how to create these narrative CVs as well as how to appropriately adjudicate them. Otherwise, there's the concern that old habits and these leaderboard-style quantitative metrics are going to persist in the written narrative form. Narrative CVs are part of the solution to assessing research appropriately; however, I'm concerned about how these reforms are being implemented in our country and about the gap between the strong science policy that we're creating around this and the actual realities of what's happening at committees. We need to ensure effective monitoring and implementation as we roll out these changes.

I have three final short points.

First, how the federal government chooses to assess research excellence directly impacts what research is done, how it is done and who does it.

Second, the tri-agencies' new definitions of research excellence do not always carry through into how research is actually evaluated by committees. This again comes back to repeated implementation gaps between what we say we want to do and what actually happens.

Finally, even if we assume that the criteria used to assess excellence in this country, historically or presently, were appropriate, there are a series of issues with how funding is administered in this country that prevent us from achieving that excellence in an efficient way. One example is the across-the-board funding cut for funded research projects.

There's also, in my view, incredibly limited grant monitoring. Once we get funds based on the promises of what we wrote in our grant, there's very little monitoring to see that, as researchers and as a federal government, we're providing returns on that investment.

Thank you.

The Chair Liberal Salma Zahid

Thanks a lot, Dr. Cobey.

We will now proceed to Dr. Grace Karram for five minutes.

Grace Karram Assistant Professor of Higher Education and Coordinator, Higher Education Graduate Program, University of Toronto, As an Individual

Thank you very much, honourable members.

Thank you for the opportunity to talk about this important topic.

When we compare Canada's position in scientific research with the positions of other members of the OECD, several paradoxes come to light. I will present these paradoxes as a way of clarifying Canada's research and development sector and those who work within it.

Specifically, I'll examine the role of post-secondary institutions, the impact of international research collaborations, the role of the business sector and labour market inefficiencies that have led to an underutilization of our Ph.D.s. I'm going to conclude with several recommendations to help strengthen Canada's research production.

How does Canada compare globally? Well, Canada's gross expenditure on research and development, as a percentage of GDP, is notably below the OECD average, and it has declined steadily since 2001. The paradox, of course, is that higher education expenditure on research and development has increased by 30% over those same 20 years, so Canadian post-secondary institutions and the researchers they house play a significant role in the country's research and development.

The second paradox is that while our rate of publications per researcher places us seventh in the world, which is great, in our production of patents we're actually 18th from the bottom. This is likely because of fairly low levels of R and D in the business sector. Even though industry tends to fund some R and D in post-secondary institutions, the ties are relatively loose.

The third paradox relates to international collaboration and a significant gender divide. Studies have repeatedly confirmed that international collaboration is correlated with an increase in research production, often measured by publications, however limited that measure may be. Yet in Canada, a statistically significant gender divide exists between men and women researchers. Men have significantly more international collaborations, and thus more high-impact research outputs.

The final paradox relates to labour and personnel. Although Canada has increased the number of individuals graduating with doctoral degrees, the number of tenure-track positions has plateaued. This has led to highly skilled researchers being employed in part-time, precarious positions mainly focused on teaching, and some eventually leave academia. You just have to visit one of Canada's amazing colleges, universities, CEGEPs or polytechnics to see a huge labour force of underemployed Ph.D.s, many with international experience and many who are women. Because much of our R and D is housed in post-secondary institutions, our private sector does not absorb Ph.D.s in the same way as other countries.

What does this tell us about scientific research in Canada? Higher education is a significant actor. We have relatively loose business ties, limited participation in global collaboration and an inefficient labour market that's not making the most of its skilled labour.

What do I recommend? Well, first, post-secondary institutions are at the heart of our research success, so keep funding universities and colleges. Canada needs to increase research funding to build the infrastructure at smaller institutions, as others have said in these panels, and definitely at our colleges, with their ties to industry and applied research. This practice of funding both projects and institutions has been very successful in the European context. In contrast, Canada tends to focus more on the projects than the institutional infrastructure, and we need to bring institutions up as well.

Second, fund both theoretical and applied research, establish strong partnerships with industry and make a pipeline to patents. However, as gatekeepers of research funding, we need thoughtful regulatory frameworks that ensure that it's done ethically and equitably and that it considers the social impact of research.

Third, we have to expand who is considered a researcher. Our precarious faculty who teach on part-time, limited contracts are rarely eligible to apply for federal funding. Moreover, federal funding prevents salaries from going to principal investigators, meaning that part-time researchers, when they are eligible to receive a grant, cannot increase their income to a living wage with funds from the grant. Our selection criteria need to adapt to the reality that not all researchers have the same conditions of employment.

Fourth, we need to increase our global collaborations and provide funding for travel to work globally with other teams. When I have conducted research on international publications, teams in other countries have been shocked that international collaboration is not one of our requirements. We need to focus on the big issues that impact our planet.

Lastly, we need targeted programming to support populations of researchers who are left outside the high-impact world of scientific research: women, researchers of colour and indigenous communities. In short, we want to see research funding going to diverse institutions and diverse researchers who can make Canada a global leader in scientific research with a positive social impact.

The Chair Liberal Salma Zahid

Thank you, Dr. Karram.

We will now proceed to Mr. Vincent Larivière, a professor from the University of Montreal.

You will have five minutes for your opening remarks. Please go ahead.

Vincent Larivière Professor, Université de Montréal, As an Individual

Thank you very much for the invitation to testify on the important issue of research excellence.

My name is Vincent Larivière, and I'm a professor of information sciences at the Université de Montréal. I'm also the UNESCO Chair on Open Science and the Quebec research chair on the discoverability of scientific content in French. I'm not representing the Université de Montréal today. I'm appearing as an individual, as an expert who has spent about 20 years studying the scientific community, and specifically the issue of research excellence and evaluation.

The first thing that's important to mention is the lack of consensus on what research excellence is. This can be seen virtually everywhere in the scientific community. Funding evaluation committees don't always agree on which projects are the most important. Journal editors and reviewers don't always agree on the quality of a paper.

Excellence in research is, in a way, the holy grail of the scientific world, but it remains quite difficult to define. There's a lot of subjectivity in all of this. It can be explained in a number of ways, but one thing is clear: Scientific excellence is multi-faceted. It can vary depending on the context. It can be the ingenuity of a method, the originality of a research issue, the quality of an argument's construction or the potential applications of a research project.

Because of this lack of consensus, evaluation committees often rely on quantifiable indicators, things that can be measured: the number of papers written in prestigious journals, the number of times they are cited, whether the person graduated from a prestigious university or whether they have gotten funding before. One of the main criteria for getting funding is having already gotten it. Those quantifiable markers don't always reflect research excellence, but they make the evaluation much simpler. A dozen or so publications will always be more than five. A million dollars will always be more than $100,000. That way of evaluating scientists and their projects, often done implicitly, raises important questions for the Canadian scientific community.

Focusing on publication volume will promote certain works, but also certain themes that are more easily published. That contributes to an overproduction of papers, which shouldn't be confused with an overproduction of knowledge. Overproduction of papers contributes to noise and information overload, much of it of mediocre quality. Many Nobel Prize winners, including Peter Higgs, have said that they wouldn't have been able to make their discoveries in today's context of research evaluation.

I'd like to make three recommendations for improving research excellence in Canada.

The first one is quite complicated, but I think it's doable. The idea would be to enable funding agencies to experiment with peer review. Peer review is known to be imperfect, but many countries are experimenting with it, including Switzerland, Norway and the United Kingdom. We can't say that those countries are lagging behind in science. Those countries have taken the bull by the horns, recognized the biases currently associated with research evaluation and decided to try to find new ways to encourage excellence. As my colleague Julien Larrègue says, it's important for the results of those experiments to be available to the expert community.

The second recommendation is somewhat related to what my colleague Ms. Cobey said on the issue of CVs, which are evaluated by the various committees. Narrative CVs were recently put in place, which I think sounds like a good idea on the surface, but it isn't entirely clear how those CVs are going to be interpreted. They will, in fact, also be interpreted based on their volume. I recently received a seven-page narrative CV that was longer than the application itself. We have absolutely no idea how committees are going to evaluate that. That has to be considered. Some countries have implemented a requirement for short, two-page CVs that don't focus on the publication volume and that can then show the publications that are most relevant to the project.

The third recommendation goes back to indicators. In Canada, there usually isn't an explicit request to provide indicators for evaluations. However, during evaluations, committee members often pull indicators out from nowhere. Obviously, committees are often sovereign, so there isn't much that can be done. I think there needs to be a ban on using those indicators in the evaluation committees of granting agencies. It isn't just a matter of not encouraging them; it's also about telling the committees that all of that is outside the scope of the evaluation.

Thank you, and I look forward to taking your questions.

The Chair Liberal Salma Zahid

Thanks a lot.

We will proceed to our round of questioning. We will start with MP Baldinelli for six minutes.

Please go ahead, Mr. Baldinelli.

12:30 p.m.

Tony Baldinelli Conservative Niagara Falls—Niagara-on-the-Lake, ON

Thank you, Madam Chair.

I'd like to quickly indicate that I'll ask one question, and then I'll cede some of my time to my colleague Ms. DeRidder. She has a few questions she'd like to ask.

I'd like to follow up with Ms. Cobey and Mr. Larivière.

Mr. Larivière, you mentioned a notion that struck me—the lack of consensus on what constitutes research excellence.

Ms. Cobey, you talked about DORA and the move away from quantitative to qualitative metrics, for example. The DORA principle is being accepted by the three federal granting agencies, but I read in the briefing materials provided that only nine universities have accepted that principle. Why do you think there's only a limited uptake on that with regard to accepting the DORA principle? What's precluding others from accepting that idea?

12:30 p.m.

Kelly Cobey Scientist, University of Ottawa Heart Institute, As an Individual

Through you, Madam Chair, that's a great question.

I would say that the DORA movement in Canada is quite robust. Many institutions are thinking about responsible research assessment more through this broader narrative of impact perspective. Sometimes the administrative hurdles of pushing a signature onto a declaration aren't worth the battle. I see personally, as the co-chair of DORA, many institutions in this country actively implementing so-called responsible research assessment without having signed a declaration. Certainly the tri-agencies signing it has prompted more institutions to consider it more deeply.

12:35 p.m.

Tony Baldinelli Conservative Niagara Falls—Niagara-on-the-Lake, ON

Thank you.

Before I cede my time, in one of your comments you mentioned that there's very little monitoring of how federal funds are being spent and on checking the status of projects. Could you write down some of your thoughts on why that is and on how we could correct or work on that?

I'll cede my time to Ms. DeRidder.

12:35 p.m.

Kelly DeRidder Conservative Kitchener Centre, ON

Thank you.

Thank you, everybody, for coming today and for being part of our panel.

Dr. Cobey, being a part of DORA, you come with a wide range of insights from across the country. First, do current federal funding models support meaningful collaboration among academic researchers, especially related to the technology and innovation sector?

12:35 p.m.

Kelly Cobey Scientist, University of Ottawa Heart Institute, As an Individual

I think it's a good question. In terms of meaningful collaboration across the sector, I would say that right now we have science policies that I think would support doing that, such as the open science initiative. For instance, if there's a federal goal towards AI innovation, we need robust research data management. We have a policy being rolled out in that respect to get data management plans done at the front end of research so that at the back end of research we can have data to share and to be leveraged and innovated upon. We have the policy and we have the vision, but we don't have the incentives and rewards for researchers to actually do that.

For example, at my institution and others across this country, researchers don't have the skills and the practical knowledge to get consent, to de-identify their data and to prepare it for that mission-driven goal of AI innovation. Because we don't have those skills, as researchers we need to upskill. To do that, we need to know that we can focus on getting those skills and that training, and that it will be valued. It's not just about producing more; it's about pausing and taking the time to upskill ourselves so that we can put our data in a position to support collaborations beyond the single use we originally envisioned for it.

12:35 p.m.

Kelly DeRidder Conservative Kitchener Centre, ON

Thank you very much for your answer.

Second, what improvements do you think can be made to ensure that our federal research funding programs are flexible enough to capture community-focused research happening in local innovation hubs, such as mine in Kitchener? How do we stop relying on publishing as the basis for funding and start relying more on innovating?

12:35 p.m.

Kelly Cobey Scientist, University of Ottawa Heart Institute, As an Individual

Thank you.

I think one thing that needs to be done is that there needs to be more consultation on an ongoing basis between, say, the tri-agencies and the government and the researchers in the institutions. There's a bit of a siloing, I think, in terms of how messages and policies translate from the federal funders to the institutions. At the tri-agencies, they may be saying that they signed DORA and they value a broad range of impacts, including community-based research and these types of things, but if the institutions don't send that same message, there's a bit of a mismatch.

I feel that researchers are often caught between two systems as these changes roll out at the federal level. Our federal funders tell us to value EDI, open science and broader excellence, but many of our institutions are still focused on those quantitative indicators. It creates a duplication of effort for us as researchers.

12:35 p.m.

Kelly DeRidder Conservative Kitchener Centre, ON

You mentioned EDI among some of the qualitative things happening right now. My fear with that is whether trying to create inclusion is actually creating exclusion, especially in the funding world. In its form today—through defining sex, skin colour and things like that—how truly inclusive do you think the criteria are for awarding federal funding in the broader research spectrum?

12:35 p.m.

Kelly Cobey Scientist, University of Ottawa Heart Institute, As an Individual

DORA doesn't speak specifically to EDI. It speaks about broader incentives, about rewarding the range of different outputs that researchers contribute and about being transparent about how you assess researchers. I think that's really critical. We need to know what the criteria are.

The Chair Liberal Salma Zahid

Thank you.

Dr. Karram, I see that your hand is raised.

12:40 p.m.

Grace Karram Assistant Professor of Higher Education and Coordinator, Higher Education Graduate Program, University of Toronto, As an Individual

Thank you. I want to comment on the EDI question—

The Chair Liberal Salma Zahid

The time is up. We have to proceed to the next member. Maybe you will get an opportunity in there.

We will proceed to MP Jaczek for six minutes.

MP Jaczek, please go ahead.