Evidence of meeting #12 for Public Accounts in the 40th Parliament, 3rd Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

MPs speaking

Meili Faille  Bloc Vaudreuil—Soulanges, QC
David Christopherson  NDP Hamilton Centre, ON
Shawn Murphy  Liberal, Chair

Also speaking

Sheila Fraser  Auditor General of Canada, Office of the Auditor General of Canada
Michelle d'Auray  Secretary of the Treasury Board of Canada, Treasury Board Secretariat
Neil Yeates  Deputy Minister of Citizenship and Immigration
Ian Shugart  Deputy Minister of the Environment
Tom Wileman  Principal, Office of the Auditor General of Canada
Alister Smith  Assistant Secretary, Expenditure Management Sector, Treasury Board Secretariat
Elizabeth Ruddick  Director General, Research and Evaluation, Department of Citizenship and Immigration

May 4th, 2010 / 9:30 a.m.

Bloc

Meili Faille Bloc Vaudreuil—Soulanges, QC

Thank you, Mr. Chairman.

I'd like to pick up where my colleague left off.

If one looks at exhibits 1.3 and 1.4 in the Auditor General's report, one can see that evaluations were planned and appear to have been completed within the time periods. Eighty-eight per cent of Canadian Heritage's evaluations were completed; 80% of Fisheries and Oceans' were completed, etc.

However, exhibit 1.4 also notes that a low proportion of total program expenses was evaluated: the estimated average annual percentage of program expenses evaluated is relatively low.

Could you please help us understand what you focus on and why, and who sets the priorities?

9:30 a.m.

Secretary of the Treasury Board of Canada, Treasury Board Secretariat

Michelle d'Auray

Priorities are in fact set by the departments, since they are responsible for establishing their own evaluation priorities. That being said, under the Financial Administration Act, 100% of grants and contributions programs must be reviewed over a five-year period. Therefore, as I explained earlier, to date more than 65% of grants and contributions programs have been evaluated, if you include 2009-2010.

As my colleague Mr. Yeates indicated, if one includes the evaluations his department has completed or is about to complete, coverage would reach 88% of its expenditures. This process takes place over five years, and the same approach is used for all programs subject to evaluation under the evaluation policy.

We have really concentrated our efforts on the fundamentals: data collection, recruitment, and identifying the skills evaluators need. We now have a guide to evaluation skills.

9:35 a.m.

Bloc

Meili Faille Bloc Vaudreuil—Soulanges, QC

Excuse me for interrupting. You started with data collection, but that is what appears to be lacking in the evaluations. Did you list that first because that is your main priority?

9:35 a.m.

Secretary of the Treasury Board of Canada, Treasury Board Secretariat

Michelle d'Auray

Mr. Chairman, it is one of the fundamental factors in our ability to undertake evaluations.

My colleagues have examined these issues, especially in the area of the environment, and they might like to expand on their approach.

9:35 a.m.

Deputy Minister of Citizenship and Immigration

Neil Yeates

Yes. I would add that it is true that this is a significant challenge for us.

The data is often very expensive for us to collect. At CIC we have an income tax file that we created with Statistics Canada a number of years ago to track immigrant outcomes over time. So that has proven to be very useful for us in looking at economic outcomes for immigrants. It's excellent data.

In some of our other programs, for example our language classes and so on, we have not had a good system for collecting language outcome data.

9:35 a.m.

Bloc

Meili Faille Bloc Vaudreuil—Soulanges, QC

Have you just done that this year?

9:35 a.m.

Deputy Minister of Citizenship and Immigration

Neil Yeates

No. We've had the income tax data file for 20 years--a long time.

9:35 a.m.

Bloc

Meili Faille Bloc Vaudreuil—Soulanges, QC

That is what I was wondering, because I remember meetings with Ms. Ruddick on data for the immigration system.

Here's my point. It seems to me that every time you come here, we're talking about inadequate data. I cannot understand why, after 40 years, we are still at this point with program evaluation. Information such as the criteria on which programs are evaluated has not been determined or assessed, and it has not been a priority. Yet today, in 2010, all of a sudden, this is a priority because the Auditor General has provided the background. One can see that this has not been resolved over the years.

Furthermore, in paragraphs 1.96 to 1.100, the Auditor General points out that during her exchanges with the various departments, the departments expressed their concerns over their ability to evaluate programs. The report goes on to say that they were not able to undertake the required improvements on a regular basis.

I have a question. When you identify a weakness or a need to make an improvement in departments, why do you not act on those improvements? How does your decision-making process rank the importance of resolving problems as soon as they arise or as soon as they are identified?

9:35 a.m.

Secretary of the Treasury Board of Canada, Treasury Board Secretariat

Michelle d'Auray

Mr. Chairman, perhaps I'll begin by answering and then I'll ask my colleagues to expand on their own programs and activities.

First, with respect to data, we recommended that departments work with their managers when establishing or renewing programs, in order to identify performance measures specific to those programs. We also required, through our policy, that the head of evaluation in each department produce an annual report on the quality and collection of data.

Those are the measures that we identified. The Auditor General also identified them, of course, but we had identified them at about the same time as the audit did, and that is what triggered our changes to the evaluation policy, precisely to address those deficiencies. That is why we agree with the recommendations: our reflections and the consultations we undertook with the departments, which led to the new policy, reflected the same deficiencies and observations raised by Ms. Fraser in the audit.

That being said, we are increasingly using evaluation results for program reviews, especially for strategic reviews. The same applies to program renewal under the transfer payments policy: we formally require performance evaluations, because we must focus on the question of effectiveness. We are much more demanding with respect to evaluations than we were previously.

I'm not saying all will be resolved by tomorrow morning but I do think we are on the right path. The connections have been made between performance measures, program activity architecture, and expenditure reviews, and the full cycle is now integrated.

9:40 a.m.

Liberal

The Chair Liberal Shawn Murphy

Merci, Madame Faille.

Mr. Christopherson has seven minutes.

9:40 a.m.

NDP

David Christopherson NDP Hamilton Centre, ON

Thank you, Chair.

Thank you all very much for your attendance today. It's good to see you all.

Right up front, the last thing in the world I am is an academic, so once we get into the world of evaluations and data you have your best chance for smoking me over, because it's not my area.

Madame d'Auray, on page 4 you said:

Much of our monitoring and support work is carried out through the annual management accountability framework assessment process, which assesses evaluation quality, neutrality, coverage, and use.

So the first thing that struck me when I heard you say that was why didn't this management accountability framework assessment process pick up these deficiencies long before the AG arrived on the scene?

9:40 a.m.

Secretary of the Treasury Board of Canada, Treasury Board Secretariat

Michelle d'Auray

I would say that it did. It led us to undertake a fairly comprehensive review of the evaluation policy that we renewed and redid. It caused us to look at the performance measurement information framework. It led us to have a fairly extensive series of conversations with deputy ministers about how they used evaluation. And it led to the fairly extensive work plan and very pragmatic approach we are taking today.

As I said earlier, it's not perfect, but the gaps were very similar. The approaches we're taking are to address the biggest gaps and make the linkages between expenditures, program effectiveness, improvements, and decisions about whether or not the programs should be maintained, changed, or improved.

So I don't want to say this is all a perfect situation, but in parallel, as we were looking at and assessing departments in their capacities--and the quality, scope, and coverage--we came to the same conclusion the Auditor General did.

9:40 a.m.

NDP

David Christopherson NDP Hamilton Centre, ON

And this was prior to the Auditor General arriving?

9:40 a.m.

Secretary of the Treasury Board of Canada, Treasury Board Secretariat

Michelle d'Auray

We were doing this in the same timeframe, and we were encouraging departments to make improvements.

9:40 a.m.

NDP

David Christopherson NDP Hamilton Centre, ON

I'm sorry, it's a small matter, but I just want to be clear that you started this work, identified this, and began something prior to the Auditor General arriving.

9:40 a.m.

NDP

David Christopherson NDP Hamilton Centre, ON

Auditor General, is that correct? Did you see all this beginning when you arrived?

9:40 a.m.

Auditor General of Canada, Office of the Auditor General of Canada

Sheila Fraser

We do note in the report, in paragraphs 1.69 through 1.74, the activities on monitoring and oversight. We do note that over the years a number of issues were identified that were very similar to what we found.

9:40 a.m.

NDP

David Christopherson NDP Hamilton Centre, ON

Good.

Just to drill down one more step, did the analysts in the line ministries pick this up too? Did they report to their supervisors that they were unable, from a professional perspective, to deliver the kinds of quality evaluations they would like to?

I'm just checking to see if the system worked all the way through, or did it take the AG to come in and trigger everything--because we have both. We get into all kinds of situations, so I'm just trying to get a sense of how well our systems were working underneath. Right at the beginning, were the line ministry analysts able to determine they had a problem, it went up, and then it got bumped up further to the Treasury Board, where things started to happen? Then did the AG come in, see some of this, and then do her further work?

9:45 a.m.

Deputy Minister of Citizenship and Immigration

Neil Yeates

I can start off, Mr. Chair, speaking to the experience in CIC.

We've been working on ramping up our evaluation capacity and function for several years. I noted in my opening remarks that in the management accountability framework we received an unacceptable rating in 2006. So that was well before the AG's work started on this. We knew we had some pretty significant challenges where we needed to improve.

We also had discussions within the department about the challenges we faced. So the Treasury Board assessment wasn't news to us. We've been working on those issues ever since. That's why I say we now have a five-year evaluation plan in place that will provide the 100% coverage that's expected in the policy.

So I think there is a fair history to this.

9:45 a.m.

Auditor General of Canada, Office of the Auditor General of Canada

Sheila Fraser

I just want to add some context. I think the major impetus for all this was a change in policies: the requirement to do more evaluation of grant and contribution programs, and then the recent evaluation policy that covers all direct spending. Those requirements were not in place previously.

As I mentioned, this audit is about a year old now. We undertook it to see how the evaluation function was working across government. Was it prepared to assume these new responsibilities under the new policy that came out? We can see, from what the secretary said, that a lot of work has been done in the last year to try to increase the capacity.

We also note in the report that most of the departments we looked at did not have formal quality management systems or continuous improvement processes in place. Environment Canada did, and it can be a model for others in how they do that.

9:45 a.m.

Deputy Minister of the Environment

Ian Shugart

Mr. Chairman, if I could just add and maybe connect the committee's preoccupation on—

9:45 a.m.

NDP

David Christopherson NDP Hamilton Centre, ON

My next question was directed to you. You can answer my question and throw your comments in, how's that? It's a simple question.

Again, in your opening remarks you stated that

It is important to acknowledge that there are challenges inherent in striving for greater accountability. Increased evaluation coverage in order to have more information on program performance has to be balanced against the need to focus on program delivery and the realization of results.

I was just trying to understand how that was unique to this situation, unique to your ministry. It sounds like the balancing act that as deputy you do every day. So help me understand what you meant by that. I just didn't quite get it. I understand that you have the challenge, but why is this challenge unique enough that you needed to underscore it in your remarks to us?

9:45 a.m.

Liberal

The Chair Liberal Shawn Murphy

Go ahead, Mr. Shugart.

9:45 a.m.

Deputy Minister of the Environment

Ian Shugart

Primarily, Chair, it's because the Auditor General in her report refers to the challenge that audit staff and officials within departments have recognized in completing the policy set out by the board. So I anticipated that this is an issue that is germane to this whole debate, and simply wanted to refer to it in that way.

You're absolutely right, it is characteristic of the kind of thing that we have to do all the time, and I think I would say that this is an excellent example of the kind of continuous improvement that we try to achieve, and it's relevant to the issue of data. As a deputy minister, even before the audit, before the policy, before the requirement of 100% coverage in grants and contributions spending, I would receive an evaluation report, and typically that evaluation report would indicate that we were able to answer these questions because we had data, but not these questions because there is no performance data. So even within the context of a particular evaluation, it's not a complete lack of data that we have; sometimes there is data but it is not developed by program managers to support evaluation per se.

For example, we might know what the coverage of a particular program is, but we might not have data relating to service standards. An evaluation, for which we in the department decide which program will be evaluated and what the scope and nature of the evaluation will be, would identify the relevant questions; there might be data for some of those questions but not for others. And evaluation staff, in their interactions with the AG and her team, will have identified performance management as one of the areas where we need to improve.

In Environment Canada we recently redid our program activity architecture. We already have a performance management framework, but it is by no means complete. So within the department we continue to work on our performance management framework and all of the data development that will support that PAA, which will then result in more data being available for evaluations in the future.

The last comment I would make is that I as a deputy don't welcome and yet I do welcome the policy on 100% coverage. I don't welcome it because it's another thing I have to meet the requirement for; it's another pressure, another obligation. I do welcome it because it is the right thing to do, and because it will add a discipline to everything we do, both in program delivery and in evaluation. It will force us to develop the performance data, and so on. Will it be perfect in three or five years? No, it won't, because there will still be relevant questions that should be asked, and we may not have all of the data, but we will be improving, and I'm confident of that because we're already on a trajectory of improvement.