Summative Evaluation Report: ICP Final Report 2010-10-26 - Findings - Effectiveness and Efficiency

6.0 Findings - Effectiveness and Efficiency

6.1 Extent to which the design of the ICP allowed for appropriate planning and extent to which timelines were met

Elements of program design were addressed in detail through the formative evaluation report dated September 2005. It addressed seven evaluation questions related to the appropriateness of the ICP design elements, the alignment between the ICP design and implementation, and specific issues related to timelines, roles and responsibilities, support, monitoring and reporting. The formative evaluation included several recommendations in this area.

Based on interviews (INFC, FDPs, MC members and JS staff) and document reviews, it was difficult to determine how well the design of the ICP allowed for appropriate planning. However, the program faced difficulties in a few respects that affected its management and implementation. These include changes in overall national program coordination shortly after implementation, an overestimated capacity to deliver contributions in the early years, and unanticipated delays that required two extensions of the program end date.

The ICP was designed by the Infrastructure - National Office initially located with Treasury Board Secretariat. The ICP was subsequently rolled into INFC, a new department created in August 2002 to provide a focal point for infrastructure issues and programs. These changes affected the management of the ICP with changes in key players responsible for the coordination of the various aspects of the program. This became apparent throughout the interview process where it was noted that very few interviewees were familiar with the early stages of ICP implementation and planning.

Figure 1 - Comparison of Anticipated and Actual Annual Contributions, in dollars

Although the ICP was designed as a five-year program to be wrapped up by March 31, 2006, evidence emerged during implementation that the end date would need to be extended to allow additional time for ICP project completion.

The fact that the ICP had to be extended twice, from March 31, 2006 to March 31, 2011, five years past the original end date, suggests that the time required for the project completion and close-out phases was not anticipated. The four main reasons identified in the mid-term evaluation were: lengthy approval, announcement and amendment processes; environmental assessments (EA); rising material costs; and short construction seasons and weather. The mid-term evaluation also reported that much work was still required on the part of INFC and FDPs to agree on requirements for monitoring and reporting outputs and outcomes, both to ensure consistency across jurisdictions and to ease analysis of results at the program level.

Key mid-term evaluation recommendations included improved horizontal management and a review of SIMSI project close-out procedures. Despite these measures, the program required a second extension in 2007, as a few provinces reported a broad range of unanticipated problems with closing out projects by the extended deadline. The uncompleted projects accounted for 1.4% of all projects approved to date and 8% of total program contributions.

The evidence from SIMSI data and case studies suggests that many projects experienced delays at one point or another during the life of the program. Nine out of ten provinces required amendments to more than 25% of their projects, and in four jurisdictions more than 50% of all projects required amendments. Moreover, for many projects examined in the case studies, the delays occurred during the planning phase, before actual construction began. Multiple factors caused delays. For some projects, discussions between the proponent and the funding organization took considerable time after the initial proposal was received; for others, these discussions resulted in amended proposals. The requirement for a Canadian Environmental Assessment Act (CEAA) review also caused major delays for some projects, as some reviews identified significant issues that needed to be addressed during project construction.

In contrast to the findings from SIMSI data and case studies, the recipient survey results indicate that 90% of projects were completed as planned, 83% were completed on time and 80% were completed within budget.

The comparison of planned versus actual program contributions on an annual basis (see Figure 1) shows that the capacity to transfer large contributions in the early stages was overestimated and that the amount of time required during the implementation phase for approval, construction and project close-out was underestimated.

From the perspective of municipalities, some felt that the ICP helped planning while others found it a hindrance. For some, the rigour required in applications was beneficial in supporting project planning. Federal and provincial partners indicated that municipalities were strongly encouraged to plan better (e.g., they checked whether municipalities had taken life-cycle costs into consideration), which led to greater rigour in planning processes.

However, the uncertainty surrounding the approval of funds meant that some projects were put on hold for extended periods, or that projects that were a better fit with the eligibility criteria may have been undertaken ahead of higher-priority projects. Municipalities also expressed concern that lengthy gaps between receipt of an application and the funding decision led to significant cost increases for projects, thereby negatively affecting municipal financial plans.

6.2 Extent to which the ICP has been implemented as intended

INFC, FDPs, MC members and JS staff indicated that the ICP was implemented as intended and that efforts were made to ensure that the program was in compliance with the Agreements and other federal policies and regulations (e.g., Financial Administration Act, Canadian Environmental Assessment Act). Several indicated that compliance audits were undertaken and recommendations were addressed. With the exception of Ontario and the territories, all jurisdictions provided sample audit reports, management letters and references to compliance audits in MC minutes. Interviewees indicated compliance with the terms and conditions of federal/provincial agreements.

Joint Management Committees have been implemented in all jurisdictions. All jurisdictions except Quebec have established joint secretariats to communicate and coordinate program delivery at the local level. F/P/T agreements placed priority on green municipal infrastructure projects, and as a second priority, other types of local infrastructure. Specific targets for each priority were set in these agreements.

Table 6 compares the percentage of actual expenditures devoted to green municipal infrastructure with the initial delivery targets set in the Funding Agreements. Green Municipal Infrastructure targets were achieved or exceeded in most provinces, with one exception (Prince Edward Island). The targets set for the percentage of program expenditures devoted to Rural Municipal Infrastructure were met or exceeded in some provinces (e.g., Ontario, Newfoundland and Labrador, and Nova Scotia).

Table 6: Proportion of Green Municipal and Rural Infrastructure Expenditures
(Source: SIMSI data as of January 31, 2010)

Federal Delivery Partner        Green Municipal Infrastructure        Rural Municipal Infrastructure
or Department / Province        % of Expenditures   Target Levels     % of Expenditures   Target Levels

ACOA
  New Brunswick                 86.8%               70.0%             45.3%               40.0%
  Newfoundland and Labrador     98.3%               60.0%             79.9%               56.0%
  Nova Scotia                   77.2%               60.0%             79.9%               39.0%
  Prince Edward Island          49.4%               60.0%             48.0%               46.0%
CED-Q
  Quebec                        39.8%               40.0%             24.5%               24.0%
IC
  Ontario                       46.1%               40.0%             48.2%               15.0%
WD
  British Columbia              74.6%               75.0%             26.7%               16.0%
  Alberta                       63.8%               40.0%             36.2%               26.0%
  Saskatchewan                  62.4%               50.0%             50.9%               50.0%
  Manitoba                      37.5%               20.0%             42.9%               33.0%

6.3 Extent to which the ICP formed effective partnerships with implementing agencies and departments

Joint Secretariat functions were established within each jurisdiction to manage the administration of the program and to support the MCs. In some cases (Manitoba and Alberta), a Joint Secretariat was set up with federal and provincial staff co-located; in other areas, a 'virtual' secretariat was used. In both cases, the division of roles and responsibilities for assessing project applications, project management and administration was established in collaboration with the FDPs and provincial partners. The majority of interviewees (FDP, MC, JS and other stakeholders) found that the ICP governance model was a very effective method for F/P/T/M collaboration in addressing infrastructure needs.

Interviewees (MC members, JS staff and FDPs) indicated that roles and responsibilities within the ICP were clearly laid out in foundation documents such as the F/P/T agreements, the Governance and Accountability Framework, and the Guidelines for Federal Co-chairs in the Administration of the ICP. Some challenges were noted in Ontario because three distinct provincial ministries were involved, and federal staff felt that Ontario had greater influence over project priorities and program delivery than other provinces. FDPs reported effective partnership arrangements with provincial departments in implementing the ICP. Several MC members and JS staff commented that the FDPs and their provincial counterparts had ongoing relationships prior to the ICP, which made it easier to form collaborative partnerships; federal and provincial staff often had relationships based on mutual trust and understanding that had evolved over time.

Interviewees (FDP, MC and JS representatives) described the relationship with INFC as very hands-off, in part because the ICP was launched prior to the establishment of INFC as a department. Although most interviewees described the relationship with INFC as positive, some difficulties arose because newly hired staff at INFC were less familiar with the operations of the ICP. It was sometimes felt that requests for information from INFC were not well coordinated, leading to increased workload and some frustration for the Secretariats and MCs. INFC held regular conference calls with FDPs as well as annual meetings, and at the mid-management level relationships were very good.

Two special case studies were developed to document the level of local government involvement in the ICP and the various joint secretariat approaches used in each jurisdiction. The case studies are annexed to this report.

The interviews with FDPs, MC members, municipalities and other partners revealed that local government involvement was seen as a very important factor that contributed to the success of the ICP, particularly in those jurisdictions that had a more clearly defined role and higher degree of involvement for municipal representatives. Municipal associations in the western provinces had the highest degree of involvement. In Manitoba, Saskatchewan and Alberta, the municipal associations were involved in the project selection process. In British Columbia, Nova Scotia and Prince Edward Island, the municipal associations were non-voting members of the MC.

6.4 Extent of support for INFC and FDP managers and employees in carrying out the ICP and achieving desired results

Several other infrastructure programs were introduced during the lifecycle of the ICP and continue to be managed through the Joint/Virtual Secretariat functions established under the ICP. FDPs and JS representatives expressed some concern that their staff resources had not kept pace with the demand for administrative support. Application-based programs require significant time and resources to assess project applications, and project monitoring, claims processing and other financial requirements are also significant. Although some funding for Operations and Maintenance (O&M) expenditures was provided for within the F/P/T agreements, most interviewees indicated that additional resources were provided by FDPs and provincial departments.

It is difficult to reach a conclusion on this issue, since additional programs have been taken on by the Secretariats. When the national evaluation was conducted, it was also difficult for interviewees (FDPs, INFC and JS representatives) to comment on this question because, for the most part, the ICP had already been wrapped up and staff were focused on activities associated with other programs.

Interviewees agreed that there were inefficiencies during the start-up phase but that issues were generally resolved appropriately through the MCs. Beyond that, the perspectives of INFC and FDP representatives differ substantially:

  • FDPs prefer the ICP arrangements under which funds are transferred to and managed by the FDPs. They also spoke favourably of the SIMSI support and of INFC's increased capacity over time.
  • INFC representatives feel that the management of funds by the FDPs (and the heavy influence of the provinces and municipalities) has limited the capacity of the federal government to manage the funds according to its priorities, including national results reporting.

6.5 Extent to which the ICP was adequately monitored and controlled by federal partners

The evidence gathered showed that key mechanisms for the monitoring and control of the ICP were effectively designed and operated satisfactorily, supporting successful delivery of the ICP across Canada.

Management committees were required to ensure that both annual program audits and recipient audits were conducted based on an assessment of project risks. The mid-term evaluation indicated that audit activity had not been undertaken consistently across all jurisdictions over the course of the ICP and that corrective action was to be taken. A number of audits of management procedures were reviewed. Although there were initially some instances of non-compliance with established procedures, the audits confirm that, overall, the FDPs exercised due diligence in the delivery of the ICP and that corrective actions were taken to comply with the policies and guidelines established for the program.

Project monitoring was essentially done by the Joint Secretariats (except in Quebec, where it was done by the province). The funding agreements did not include very clear requirements for project reporting, and SIMSI did not initially provide sufficient capacity to monitor project implementation. As a consequence, FDPs used their own databases or tracking systems for project monitoring. The monitoring procedures included:

  • Checks on claim payments (follow-up if no claims were submitted within a given period);
  • Site visits to projects (a representative sample and all higher-risk projects);
  • Audits (based on the audit plan); and
  • Specific site visits for environmental mitigation issues.

However, the information in SIMSI was insufficient for the management and oversight of the ICP, and it is difficult to determine from the available data whether the ICP was adequately monitored and controlled by federal partners. The fact that key data are missing for a number of projects indicates that the data are incomplete for monitoring purposes, at least within INFC (i.e., the data may be available in FDPs' systems but are not directly available to INFC)1. It should be noted, however, that interviewees indicated that substantial improvements have been made to SIMSI in the past few years to address information management needs more fully.

6.6 Effectiveness of INFC and FDP communications in supporting the ICP

As per the program's logic model, ICP communication activities and products developed over the program lifecycle contribute directly to the program's ultimate communication outcome: increased awareness by Canadians of the federal role in infrastructure. A communication strategy and regional communication strategies relating to federal involvement in infrastructure were developed, implemented and updated yearly. Communication products and activities, such as websites, news releases, media monitoring and analysis of news coverage of the ICP, official ceremonies or special events, and signage, were produced over the lifecycle of the program.

The formative evaluation noted the lack of a consistent system for tracking and measuring communication activities, and recommended that INFC develop and implement a strategy to collect the evidence required to measure the outcomes of communication activities. The ICP communications program, however, was essentially complete in 2006, and the recommendation could not be applied to ICP communications.

6.7 Approach to program delivery deemed to be most cost-effective in terms of delivery agency and/or province

As was the case with the mid-term evaluation, the National Evaluation could not generate strong evidence that the delivery approach used for the ICP was the most cost-effective. This is due in part to the lack of pertinent data for an analysis of administrative cost-effectiveness and to variations in the delivery modes used for other infrastructure programs. The mid-term evaluation made a series of recommendations in this respect, as follows:

  • INFC should work with FDPs to identify salary costs incurred in each year of the ICP in preparation for the summative evaluation of the ICP.
  • For all future programs, a project code should be set up in financial systems to allow for the allocation of salary dollars to program activities in a more efficient manner.
  • INFC, in cooperation with the FDPs, should develop documents describing in detail how the ICP is implemented in each jurisdiction. This would facilitate the development of a more rigorous methodology for the analysis of administrative cost-effectiveness during the ICP's summative evaluation.

This National Evaluation could not find evidence that these recommendations had been implemented, as the relevant financial information was unavailable. In the absence of data for an analysis of administrative cost-effectiveness, the evaluators attempted, as in the mid-term evaluation, to address the evaluation question by comparing the operating budget allocated to each implementing organization at the outset of the program with the administrative costs it incurred to deliver the program in its jurisdiction.

Table 7: Amounts for Operations, including communications.

FDP        Contributions        Operations (planned)   Operations (actual)   Variation (planned minus actual)   Operations as % of Contributions
ACOA       $186,000,000         $4,400,000             $5,550,000            ($1,150,000)                       3.0%
Quebec     $521,900,000         $9,750,000             $10,080,000           ($330,000)                         1.9%
Ontario    $676,400,000         $13,050,000            $10,820,000           $2,230,000                         1.6%
WD         $554,200,000         $11,100,000            $12,300,000           ($1,200,000)                       2.2%
Total      $1,938,500,000       $38,300,000            $38,750,000           ($450,000)                         2.18%

Source: SIMSI. These amounts do not include INFC administrative and overhead costs and Infraguide expenditures.
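
To make the derived columns in Table 7 explicit, the short Python sketch below recomputes them from the contribution and operations figures shown above. It is a minimal illustration, assuming that the variation column is the planned operations budget minus actual operations spending (negative values, shown in parentheses in the table, indicate that actual spending exceeded the planned budget) and that the percentage column is actual operations spending divided by contributions; the dictionary and variable names are illustrative only.

    # Sketch: recompute the derived columns of Table 7 from the figures in the table above.
    # Assumed definitions: variation = planned - actual; share = actual / contributions.
    table7 = {
        # FDP: (contributions, operations planned, operations actual), in dollars
        "ACOA":    (186_000_000,  4_400_000,  5_550_000),
        "Quebec":  (521_900_000,  9_750_000, 10_080_000),
        "Ontario": (676_400_000, 13_050_000, 10_820_000),
        "WD":      (554_200_000, 11_100_000, 12_300_000),
    }

    for fdp, (contributions, planned, actual) in table7.items():
        variation = planned - actual          # negative => actual spending exceeded the planned budget
        share = actual / contributions * 100  # operations as a percentage of contributions
        print(f"{fdp}: variation = {variation:+,} $; operations = {share:.1f}% of contributions")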

The variation between planned and actual operations budgets in Table 7 shows that the budget allocated at the outset of the program was underestimated. Resources allocated to federal delivery partners were not sufficient to cover administrative costs, with the exception of Industry Canada in Ontario. The formative evaluation had already reached the same conclusion, with interviewees reporting that most FDPs used their operating and maintenance budgets to subsidize administrative costs incurred in delivering the ICP. The National Evaluation did not generate evidence on the measures used by FDPs to bridge this financial gap.

The lowest administrative costs (expressed in percentage terms) are observed in Ontario, where Industry Canada delivered the largest program contribution. The highest administrative costs are observed in the Atlantic Provinces, where ACOA delivered the smallest overall program contribution. These administrative cost variations suggest that FDPs need a minimum resource level to deliver programs, and that economies of scale can be achieved in the larger jurisdictions.

6.8 Recipient satisfaction with program delivery, and suggestions for improvement

The survey of organizations reveals comparable rates of satisfaction across the different types of organizations. While recipients are generally more satisfied with the program and rejected applicants less so, all groups are fairly satisfied overall. However, as shown in Figure 2, satisfaction is lowest with respect to the application process. Respondents' main suggestions for improvement relate to a simpler, streamlined application process and a faster approval process.

Figure 2 - Satisfaction with Infrastructure Canada Program

More specific suggestions were provided through in-depth interviews with municipalities. Smaller municipalities often lack the resources and capacity to prepare funding applications without the assistance of external consultants (for some, the cost was prohibitive). A few municipalities mentioned that they had to hire a full-time person just to monitor available programs and complete grant applications. In the Atlantic Provinces, several interviewees appreciated the direction and advice provided by the Joint Secretariat with respect to the preparation of project applications. This was a very important service, especially for smaller municipalities with limited staff resources.

Some improvements suggested by municipal interviewees include:

  • Give MCs delegated authority to approve projects and project amendments to expedite decision-making;
  • Provide the option of submitting a hard-copy application, as a few municipalities were uncomfortable with the on-line application;
  • Provide a response to project applications (assuming they are submitted in a timely manner) by March or April to give enough time for design work and tenders and to get work scheduled and completed within the construction season;
  • Communicate ranking criteria to municipalities; and
  • Provide a clear explanation of why an application is unsuccessful (this is supported by the lower satisfaction, noted above, of rejected applicants with their level of understanding in this regard).

[1] As noted in the SIMSI audit report: "There was also little motivation for SIMSI users to update SIMSI beyond the project approval stage, once funds were authorized. Users were never required to keep SIMSI up to date, resulting in timeliness and/or completeness issues in the data. It is usually data associated with later stages of project development such as the project monitoring and financial data that were not entered as the project progresses. There was the risk INFC may not be able to report complete, accurate program results in a timely manner."
