Summative Evaluation Report: ICP Final Report 2010-10-26 - Methodology

3.0 Methodology

3.1 Overview of Methodology

A detailed methodology, including the evaluation matrix, is found in Annex C. In summary, the evaluation included the following lines of evidence:

  • Document review – An in-depth review was completed of documents from INFC, the FDPs and interviewees, including program background documents, Treasury Board submissions, contribution agreements, and Terms and Conditions. The complete list of documents reviewed is included as Annex D.
  • Data review – The data review covered Shared Information Management System for Infrastructure (SIMSI) data, other data received from the FDPs, and data on other programs (for the cost-effectiveness review).
  • Cost-effectiveness review – The cost-effectiveness review examined whether the resources allocated at the outset of the program to federal delivery partners were adequate for the administration of the program. The analysis also compared the ICP to eight domestic and foreign programs. In addition, as per the requirements outlined in the 2001 evaluation framework for the ICP, the administrative efficacy of the implementing agencies/departments (ACOA, CED-Q, WD, IC, and INAC) was examined. The detailed analysis is found in Annex E.
  • In-depth interviews – A total of 99 in-depth telephone interviews were completed with a range of stakeholders (INFC representatives, Joint Secretariat representatives, Management Committee co-chairs and members, municipal representatives, partners, provincial representatives, FDP representatives and other stakeholders).
  • Telephone surveys – Four telephone surveys were conducted among Canadian municipalities and recipient organizations. In total, 1,089 interviews were completed – 500 with funding recipients, 255 with rejected applicants, 126 with withdrawn applicants and 208 with non-applicants.
  • Case studies – A total of 13 case studies were completed: 11 were project case studies with recipient organizations (municipalities or other eligible organizations), one per province and one in the Yukon. In addition, a case study on local government involvement and another on the Joint Secretariats were prepared. More details on the project case studies are provided in Annex F. A summary on local government involvement is provided in Annex G, and the Joint Secretariats case study is included as Annex H.

3.2 Evaluation Issues

This evaluation examines issues related to the program's relevance, success, effectiveness and efficiency. The evaluation also examines lessons learned that can be applied to other INFC programs. A matrix outlining how each evaluation question is addressed through the various study approaches is provided in Annex C. Multiple lines of evidence have been used to address all issues and questions.

3.3 Evaluation Study Limitations

While the overall study methodology is strong and provides the basis for addressing all evaluation issues through multiple lines of evidence, the evaluation study has some limitations. These include:

  • Given that multiple delivery partners were involved at the federal/provincial/territorial (F/P/T) levels, some information and data were not available directly through Infrastructure Canada but were provided by delivery organizations, including the FDPs, the Management Committees and the Joint Secretariats (JSs). Consequently, some data were available in some provinces or territories but not in others, and some of the requested data were not received. Documents were not necessarily still available from INFC, the FDPs, the provinces, etc., since many provinces and territories have completed and closed their projects.
  • The province of Quebec's own evaluation of the program was not available at the final report drafting stage.
  • The number of federal and provincial infrastructure programs in place over the past few years made it difficult to differentiate the ICP from other programs, especially during the interview process. Since the program has ended in many provinces and territories, new federal infrastructure programs are more prominent in the minds of stakeholders, and some of the individuals consulted were confused about which program was being evaluated.
  • Given the nature of the program, the number of case studies completed is fairly small, particularly in light of the variables to consider (e.g., several program objectives and sub-objectives, 13 provinces and territories, project case studies versus special topics, etc.). The case studies therefore do not provide sufficient coverage of the different program elements to support comparing cases against each other. However, the case studies provide qualitative and quantitative information that complements the other lines of evidence.
  • As the program began in 2000, many of the individuals initially involved were no longer available, and some of those interviewed or surveyed had limited knowledge of the program or projects.

To the extent possible, the above limitations were mitigated by using multiple lines of evidence to address evaluation questions.
