Evidence of meeting #12 for Public Accounts in the 40th Parliament, 3rd Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Sheila Fraser  Auditor General of Canada, Office of the Auditor General of Canada
Michelle d'Auray  Secretary of the Treasury Board of Canada, Treasury Board Secretariat
Neil Yeates  Deputy Minister of Citizenship and Immigration
Ian Shugart  Deputy Minister of the Environment
Tom Wileman  Principal, Office of the Auditor General of Canada
Alister Smith  Assistant Secretary, Expenditure Management Sector, Treasury Board Secretariat
Elizabeth Ruddick  Director General, Research and Evaluation, Department of Citizenship and Immigration

9 a.m.

Liberal

The Chair Liberal Shawn Murphy

I would like to call the meeting to order at this time.

I want to welcome everyone here. Bienvenue à tous.

Colleagues, this meeting is called pursuant to the Standing Orders. We're dealing today with chapter 1, “Evaluating the Effectiveness of Programs”, from the fall 2009 report of the Auditor General of Canada.

We have a number of witnesses before the committee this morning.

From the Office of the Auditor General of Canada, of course, we have the Auditor General herself, Sheila Fraser. She is accompanied this morning by Assistant Auditor General Neil Maxwell and Principal Tom Wileman.

From the Treasury Board Secretariat, we have the secretary, Michelle d'Auray. She is accompanied by the assistant secretary, Mr. Alister Smith.

From the Department of Citizenship and Immigration, we have the deputy minister and accounting officer, Neil Yeates. He is accompanied by Elizabeth Ruddick, director general of research and evaluation.

Finally, from the Department of the Environment, we have the deputy minister and accounting officer, Ian Shugart. He is accompanied by William Blois, the associate director of the audit and evaluation branch.

On behalf of all members of the committee, I want to extend a very warm welcome to everyone.

We will now have opening statements.

Ms. Fraser, you're first. You have up to five minutes.

9 a.m.

Sheila Fraser Auditor General of Canada, Office of the Auditor General of Canada

Thank you, Mr. Chair.

We thank you for this opportunity to present the results of an audit included in our November 2009 report on evaluating the effectiveness of programs in the federal government.

As you mentioned, I'm accompanied today by Neil Maxwell, Assistant Auditor General, and Tom Wileman, principal, who were responsible for this audit.

I would like to point out to the committee that the work for this audit was substantially completed on May 31 of 2009.

Effectiveness evaluation is an established tool that uses systematic research methods to assess the extent to which a program is achieving its objectives. Over the past 40 years, the federal government has made repeated efforts to establish the evaluation of effectiveness as an essential part of program management.

One of the most important benefits of effectiveness evaluation is to help departments and agencies improve their programs. Departments also need to be able to demonstrate to Parliament and taxpayers that they are delivering results for Canadians with the money entrusted to them. Sound information on program effectiveness is particularly important in light of the recent budgetary measures to contain administrative costs and review government operations.

In this audit, we examined how evaluation units in six departments identified and responded to increasing needs for effectiveness evaluation. We also looked at whether they had built the required capacity to respond to those needs. In addition, we looked at the oversight and support role of the Treasury Board of Canada Secretariat in monitoring and improving the program evaluation function in the government, particularly with respect to the evaluation of program effectiveness. The period covered by our audit was 2004 to 2009.

Overall, we found that the six departments were not sufficiently meeting the needs for effectiveness evaluation. Annual coverage of departmental expenses by evaluation was low, ranging from 5% to 13% across the six departments.

In addition, the effective rate of coverage was even less because many of the evaluations we reviewed did not adequately evaluate effectiveness. Of the 23 evaluation reports we reviewed, 17 noted that the analysis was hampered by inadequate data, which limited the evaluation of program effectiveness. This lack of performance data is a longstanding problem as noted in my office's earlier audits of the evaluation function.

With respect to the six departments' capacity to meet the needs for effectiveness evaluation, department officials told us that they have not been able to hire enough experienced evaluation staff and have used contractors extensively to meet requirements.

Of the audited departments, Environment Canada had internal processes to systematically identify areas for improvement in effectiveness evaluation. For example, Environment Canada solicits client feedback through post-evaluation surveys. Such processes ensure that departments are following the management cycle for continuous improvement.

The situation with respect to program evaluation in the federal government is not unlike that of internal audit before the policy on internal audit was implemented. Lessons learned from the government's recent strengthening of internal audit could be applied usefully to program evaluation.

We believe that implementation of the new requirement to evaluate all direct program spending faces serious challenges. Earlier requirements for full coverage were never met, and current legal requirements for effectiveness evaluation of all grant and contribution programs have been difficult to meet. Department officials told us that they have concerns about their capacity to implement the expanded coverage requirement in the new evaluation policy.

In our view, it will be important for the secretariat and departments to carry out effectiveness evaluation of programs that are susceptible to significant change because of shifting priorities and circumstances. These are programs where evaluations of relevance, impact, and the achievement of objectives can be put to best use. During the transition to full coverage, these programs may present the biggest opportunities for effectiveness evaluation. Taken together, the increasing needs for effectiveness evaluation and the departments' limited capacity to meet those needs pose a serious challenge to the function. Concerted efforts by both the secretariat and departments will be needed to meet this challenge.

The secretariat and the audited departments have agreed with all of our recommendations. In several cases, they have made commitments for remedial action in their responses. In line with the committee's request for action plans and timelines, the committee may wish to explore the progress made to date in addressing the issues raised in the chapter.

Mr. Chair, this concludes my opening remarks and we would be pleased to answer the committee's questions.

9:05 a.m.

Liberal

The Chair Liberal Shawn Murphy

Thank you very much, Ms. Fraser.

We're now going to hear from the Treasury Board.

I'll turn it over to you, Madam d'Auray.

9:05 a.m.

Michelle d'Auray Secretary of the Treasury Board of Canada, Treasury Board Secretariat

Thank you, Mr. Chair.

Good morning.

Thank you for this opportunity to speak about the evaluation function in the Government of Canada. As you mentioned, I'm here with my colleague Mr. Alister Smith, Assistant Secretary of the Expenditure Management Sector. Mr. Smith is responsible for my department's Centre of Excellence for Evaluation. This centre is responsible for evaluation policies and works very closely with the government evaluator community.

As Ms. Fraser mentioned, evaluation is a longstanding management tool that is vital to the sound management of public spending. It involves the systematic collection and analysis of evidence on the outcomes of programs. This invaluable information is used to make judgments about the relevance, performance and value for money of programs. It is also used to examine alternative ways to deliver programs or achieve the same results.

Finally, it supports policy and program improvement, expenditure management, cabinet decision-making, and public reporting to Parliament and Canadians.

Given the increasingly important role evaluation plays in support of program effectiveness and expenditure management, we are in full agreement with the recommendations contained in the Auditor General's report. They mirror in large part what the Treasury Board Secretariat has learned through extensive consultations and monitoring activities. And they are reflected in and addressed by the actions we have taken as part of the implementation of the new policy on evaluation that was issued in April 2009. Unfortunately, the scope and timing of the Auditor General's report did not allow for recognition of these improvements, as it focused on the period up to the introduction of the new policy.

Our action plan, in response to the Auditor General's report, which we provided to the committee, outlines what the secretariat has undertaken and delivered since the report's publication and what we will continue to do. Allow me to highlight some of our actions.

One of the Auditor General's concerns was that evaluations were not adequately assessing effectiveness, and that they lacked performance information. This concern has now been addressed under the new policy, which sets a clear standard for evaluation quality, as well as responsibilities for performance measurement. It also requires that all evaluations examine program effectiveness.

The new policy also requires each departmental head of evaluation to prepare an annual report to their deputy head on the state of performance measurement in their organization. This report will assist the deputy head in ensuring that the key data needs of program evaluations are met.

Finally, the policy has also expanded the evaluation coverage requirements to cover all direct program spending over a five-year cycle, after an initial transition period.

The Auditor General also recommended that the Treasury Board Secretariat should do more to monitor and support departments to help them identify priorities for improvement. This is addressed under the new policy that calls on the Treasury Board Secretariat to provide functional leadership for evaluation across the government. This includes regular monitoring and annual reporting to Treasury Board on the health of the evaluation function. Our first report will be issued before the end of 2010-11.

Much of our monitoring and support work is carried out through the annual management accountability framework assessment process, which assesses evaluation quality, neutrality, coverage, and use. It is also carried out through the advice and guidance we provide to departments on performance measurement frameworks, which are required under the management, resources, and results structure policy.

The secretariat has also allocated resources to our centre of excellence for evaluation to strengthen the evaluation expertise we provide to departments.

We appreciate that the new policy represents some important changes for departments, as the Auditor General recognized in calling on the secretariat to help departments prepare to implement the new coverage requirements. This is why there will be a four-year phase-in period before departments are required to meet the comprehensive coverage requirement in their five-year evaluation plans, beginning with the 2013-2014 to 2017-2018 planning period.

I will turn now to the support we have provided to departments during the transition period, largely through the secretariat's centre of excellence for evaluation. For example, in November 2009 we issued a draft guide to developing departmental evaluation plans, which will be finalized and issued this summer. This provides guidance to departments with regard to evaluation timing, coverage, prioritization, and instruments.

We also issued, in November 2009, a draft guide to developing performance measurement strategies to support heads of evaluation in assessing the department's performance measures. This too will be finalized this fall, after integrating feedback and recommendations from departments.

We also set up, in June 2009, the evaluation community of practice with a website for exchanges of best practices. And we have held regular meetings to guide the capacity development of the evaluation community.

In addition, we provided preliminary guidance to departments on the possible merits of including external experts on departmental evaluation committees. The final document will be integrated this fall in a guide on the evaluation function, which will set out the expectations of the secretariat in relation to the evaluation policy and directive.

We recently led a post-secondary recruitment initiative for graduates with evaluation-related backgrounds. This led to the establishment of two pools of pre-qualified evaluators at the entry and intermediate levels. We also continue to work with universities and the Canada School of Public Service to promote and develop the types of evaluation skills and knowledge we need.

All these improvements have addressed the Auditor General's concerns over the quality, capacity and program coverage of the evaluation function in the government.

In sum, much remains to be done, and the federal government has made many previous attempts to improve the evaluation function. Even so, I am of the opinion that with the new policy on evaluation, the guidelines and guides, and especially through our interactions with deputy ministers and evaluators, we are laying the foundation for a stronger, more competent and productive evaluation function in the Government of Canada, in order to ensure better expenditure management.

Thank you.

9:10 a.m.

Liberal

The Chair Liberal Shawn Murphy

Thank you very much, Madame d'Auray.

Now we're going to hear from Neil Yeates, the Deputy Minister of Citizenship and Immigration.

9:10 a.m.

Neil Yeates Deputy Minister of Citizenship and Immigration

Good morning, Mr. Chairman, ladies and gentlemen.

I'm Neil Yeates, Deputy Minister of Citizenship and Immigration, as the chair has noted. I'm joined by Elizabeth Ruddick, who is the director general of research and evaluation at CIC.

I would like to thank the committee for inviting me back to speak today. Today I will focus my brief remarks on chapter 1 of the Auditor General's report, and afterwards, we will be happy to answer any questions you have.

With its focus on results and accountability, the government has emphasized the importance of the evaluation function in assessing the effectiveness of federal policies, programs and services.

Our Evaluation Division leads the evaluation function, and has developed an action plan, tabled here today, to respond to the Auditor General's recommendations.

I'd like to highlight progress we've made over the five-year period between 2004 and 2009. CIC initiated significant changes to the evaluation function with the creation of the research and evaluation branch and the establishment of an evaluation committee, as well as the implementation of a formal evaluation policy. Funding for the function increased from about $650,000 in 2004-05 to about $2 million in 2008-09. That's along with an increase in the number of professional staff, from three full-time equivalents in that initial year to 13 in 2008-09.

In the past the focus was on evaluating grants and contributions to meet the requirements of the Treasury Board and the Federal Accountability Act. This growth in resources has allowed us to increase the coverage of departmental programs and the rigour and quality of the evaluations themselves. With more and better studies, the results and conclusions of CIC evaluations are increasingly used by senior managers to inform program and policy decisions. As a result, under the management accountability framework, the evaluation function, initially assessed as unacceptable in 2006, reached a rating of acceptable in 2008, showing steady improvement over a relatively short period of time.

The department agrees with the OAG's findings, Mr. Chairman. We have developed an action plan that includes the renewal of the comprehensive departmental performance measurement framework and program activity architecture, and will further the integration of the framework into our business planning process. This will improve the availability of performance information for evaluations.

CIC is also adding an external evaluation expert to the departmental evaluation committee. We are currently identifying a list of potential candidates and developing terms of reference for such an expert.

A process is now also under way for soliciting client feedback at the end of evaluations. We are finalizing an internal client survey and we will carry out the survey this year. The survey will be administered to senior managers of programs that have recently been evaluated, as well as to members of the evaluation committee.

Mr. Chairman, the Auditor General's report observed that my department's coverage of spending was low, particularly for grants and contributions. This is due largely to the renewal cycle of CIC's grants and contributions and the fact that about 88% are concentrated in only two programs. I'm happy to report that between fiscal year 2009-10 and this fiscal year we will have evaluated these two large programs, accounting for this large proportion of our grants and contributions budget. The other programs are much smaller in comparison, but all will be evaluated over the five-year cycle.

Recognizing the need for more comprehensive evaluations and a broader coverage, CIC has increased the non-salary evaluation budget by $500,000 this fiscal year, 2010-11. We will add an additional $500,000 in 2011-12, for a total non-salary budget of $1.5 million in 2010-11 and $2 million in 2011-12 and ongoing. As well, by the end of this fiscal year the FTE complement will reach 20 persons devoted to this function.

Mr. Chairman, over the past several years some evaluations have had to be postponed or rescheduled for various reasons, including a lack of available performance data, which has created challenges for completing those and other evaluations in a timely manner.

To avoid similar problems in the future, the Evaluation Division is working closely with CIC staff to develop more robust data collection strategies and tools, to ensure that data collected through our administrative systems will be available in the format required at the time any program is being evaluated.

These are just some of the ways we are working to address the Auditor General's concerns in a timely fashion. We are ready for your questions now.

Thank you very much.

9:15 a.m.

Liberal

The Chair Liberal Shawn Murphy

Thank you very much, Mr. Yeates.

Finally, the committee will hear from Ian Shugart, the Deputy Minister of the Environment.

9:15 a.m.

Ian Shugart Deputy Minister of the Environment

Thank you, Mr. Chairman.

I want to say right at the outset that we at Environment Canada concur with the Auditor General regarding the valuable contribution that effectiveness evaluations can make to decision-making, and we support the recommendations she has made.

At Environment Canada the evaluation function is an important contributor to our decision-making. It provides an essential source of information on the relevance and performance of the department's programs and has an important role to play in managing for results.

We use evaluations to make decisions ranging from program improvement, such as how an existing program should modify its activities or processes to better meet its objectives, to determining whether there is an ongoing need for intervention or an appropriate role for the federal government before renewing programs. The evaluation function provides me with the information I need to demonstrate accountability for the use of public funds.

Environment Canada's evaluation function also plays an active role in the review of memoranda to cabinet, Treasury Board submissions, the department's performance measurement framework as well as individual programs' performance measurement strategies and plans. This ensures that program managers are considering, planning for and collecting performance information that can be used in future evaluations.

Our evaluation function has had continuous improvement processes in place for several years, and we continue to look for ways to improve the value of our evaluations in supporting the department. We were pleased to see some of this noted in the Auditor General's report. We have accepted the two recommendations that the report addressed to Environment Canada. In particular, we agree with the recommendation to develop and implement an action plan to ensure that ongoing program performance information is collected to support effectiveness evaluation.

Past evaluations have included recommendations for improvement in the area of performance measurement. We think we're now starting to see improvements in the department with respect to an increasing number of programs that are developing and implementing performance measurement strategies, and we will continue to work hard at this, because it is critical.

Environment Canada also accepts the Auditor General's recommendation to consider the merits of including external members on departmental evaluation committees. We have recently received preliminary guidance from the centre of excellence for evaluation on this issue, and we are pursuing that examination.

Finally, I'd like to speak briefly to the concerns identified in the report regarding the capacity of departments to evaluate all direct program spending every five years. This policy was put in place to increase coverage, and this is an important goal, one we agree with. It recognizes the benefit of maintaining a broader base of effectiveness information for departmental activity. It is important to acknowledge that there are challenges inherent in striving for greater accountability. Increased evaluation coverage, in order to have more information on program performance, has to be balanced against the need to focus on program delivery and the realization of results.

The four-year implementation period for the policy allows us, we believe, the time to adapt our approach and realistically expand the scope of evaluation activity within the context of departmental priorities, resources, and program requirements. To increase coverage within current funding levels, Environment Canada will adopt a flexible, risk-based approach to planning the scope, approach, and level of effort for each evaluation. In so doing, we will ensure that our evaluation resources are focused on areas within the department where evaluation information is most needed.

Finally, where appropriate, we will conduct evaluations that take a broader perspective on some areas of the program activity architecture, as opposed to conducting individual evaluations of each program within the PAA element.

These are some of the changes taking place at Environment Canada, motivated both by the Auditor General's report and by the new policy on evaluation. We look forward to, and believe we will see, positive impacts on our evaluation program from both.

Thank you, Mr. Chairman.

9:20 a.m.

Liberal

The Chair Liberal Shawn Murphy

Thank you, Mr. Shugart.

We'll now go to the first round of questions, starting with the Liberals, for seven minutes.

Monsieur Dion.

9:25 a.m.

Liberal

Stéphane Dion Liberal Saint-Laurent—Cartierville, QC

Thank you, Mr. Chairman.

Good morning to you all.

Ms. Fraser, I expect that for the Office of the Auditor General, it's a dream come true to be able to evaluate evaluations. You must feel like a fish in water. I suppose, however, that what you found is less of a dream come true. I'd like to begin by making sure I understood you. You stated that you examined six departments and that you reviewed 23 evaluation reports, is that correct? I'm assuming that's a sample. What percentage of the evaluations carried out by those six departments over those years would that represent?

9:25 a.m.

Tom Wileman Principal, Office of the Auditor General of Canada

Mr. Chairman, it wasn't that kind of a sample. We asked the departments to provide us with a list of all the evaluations they had undertaken to examine program effectiveness. We selected some of the effectiveness evaluation reports on those lists, according to a procedure we had adopted. We then reviewed those reports.

9:25 a.m.

Liberal

Stéphane Dion Liberal Saint-Laurent—Cartierville, QC

How many evaluations did you receive? Out of how many did you review 23?

9:25 a.m.

Principal, Office of the Auditor General of Canada

Tom Wileman

I do not have that information with me. Perhaps I could provide it to the committee later.

9:25 a.m.

Liberal

Stéphane Dion Liberal Saint-Laurent—Cartierville, QC

Can you give me an idea? How many evaluations do you think those six departments would have undertaken?

9:25 a.m.

Auditor General of Canada, Office of the Auditor General of Canada

Sheila Fraser

Mr. Chairman, on page 9 in exhibit 1.3, you can see how many evaluations were completed over that period. The number varies from 14 to 42 by department.

9:25 a.m.

Liberal

Stéphane Dion Liberal Saint-Laurent—Cartierville, QC

Out of the 23 evaluations, 17 did not have adequate data. Therefore one can conclude that they weren't particularly enlightening.

9:25 a.m.

Auditor General of Canada, Office of the Auditor General of Canada

Sheila Fraser

Yes, the evaluation was not complete because of a lack of information.

9:25 a.m.

Liberal

Stéphane Dion Liberal Saint-Laurent—Cartierville, QC

That leaves six. Can one assume that the six other evaluations were of acceptable quality?

9:25 a.m.

Auditor General of Canada, Office of the Auditor General of Canada

Sheila Fraser

Yes, I believe so.

9:25 a.m.

Principal, Office of the Auditor General of Canada

Tom Wileman

There were problems in 17 of the 23 evaluations. In those reports, the departments pointed out that there were shortcomings in performance data; the other reports did not mention that. Our review showed that 17 of the 23 evaluations had in fact indicated a shortage of performance data for evaluating effectiveness.

9:25 a.m.

Liberal

Stéphane Dion Liberal Saint-Laurent—Cartierville, QC

I know, but were you satisfied with the other six evaluations? Did this lead to program improvement, or to the elimination of bad programs? What was the outcome?

9:25 a.m.

Principal, Office of the Auditor General of Canada

Tom Wileman

The purpose was to determine to what extent the 23 reports were complete, based on certain criteria. The other six did not mention any problems or issues with respect to data collection or methods. In those reports, it appeared to us that the approach was more satisfactory with respect to program evaluation standards.

9:25 a.m.

Liberal

Stéphane Dion Liberal Saint-Laurent—Cartierville, QC

There were problems in more than two-thirds of the evaluations you reviewed. The other six perhaps gave results. What is even more worrisome is that this only represents 5% to 13% of the programs per year. The others aren't even evaluated. Therefore very few are evaluated. For those that are evaluated, in more than two-thirds of cases, the evaluations are not satisfactory. One would hope that at least the others would lead to results. That's what your report states.

However, six departments—we heard this again this morning—increased their resources in this area. I did the math. If one looks at exhibit 1.6 on page 22 in the English version and page 27 in the French version, from 2004-2005 to 2008-2009, there was an increase of 38% in the funding for program evaluations. With respect to staff increases, table 1.7 on page 23 in the English version and page 28 in the French version shows that there was a 54% increase in evaluation staff. We end up with this kind of result. Obviously one has to ask why we have such unsatisfactory results with an increase in resources.

How much more should resources be increased, given that only 5% to 13% of programs were evaluated, and badly evaluated, and given that your goal is to evaluate them all over five years? Twenty per cent of programs would have to be evaluated each year, and evaluated satisfactorily, when you're having difficulty recruiting competent staff, which is another aggravating factor. You have used contractors, but we don't know whether the contractors will stay long enough to provide any memory or experience to the various departments, and the Treasury Board itself appears to be completely overwhelmed. According to your report, there is not enough staff and you have not provided sufficient leadership. How will we get there? The government is increasing its requirements and you are not able to meet the ones you had already. The gap is quite glaring: 5% to 13% of programs evaluated when the goal is adequate evaluation of at least 20%.

9:30 a.m.

Secretary of the Treasury Board of Canada, Treasury Board Secretariat

Michelle d'Auray

Thank you, Mr. Chairman.

You have asked several questions at once. I will try to respond to them as fully as possible.

The points that were raised by the audit had in fact been identified by the Treasury Board Secretariat. That is why, over the period covered by the audit, we reviewed and redrafted the evaluation policy. The policy now deals with evaluation in terms of program performance. I should also point out the importance of data collection. That is why we put considerable effort into establishing data for measuring performance, because that was one of the deficiencies noted in Ms. Fraser's report.

We also provided for greater flexibility in the policy in terms of the type of evaluation, scope of coverage, as well as risk management, in other words their nature and importance. As Ms. Fraser pointed out, if these are programs that are going to change, then departments must be encouraged to focus on those questions.

We also started recruiting staff, as I pointed out in my opening remarks. We have created two recruitment pools, at the entry and intermediate levels. There are approximately 1,500 individuals in these pools. In terms of the evaluation function, we have more than 500 individuals throughout government. Last year, evaluation coverage averaged 15% of programs. In terms of grants and contributions programs, we have achieved more than 65% coverage.

I would say therefore that although not everything is perfect, we have made progress. We are now working in a very concrete and practical fashion with the evaluation community and with departments.

9:30 a.m.

Liberal

The Chair Liberal Shawn Murphy

Thank you very much.

Madame Faille is next, for seven minutes.