Evaluation

Showcase

Below are some examples of projects, listed by type of activity or by the need expressed by clients.

SCIENCE AND TECHNOLOGY PRODUCERS

Evaluation of Forest Ecosystems Science and Application Program Sub-Activity (2013)

Client: Natural Resources Canada (NRCan)

Description: The objective of the Forest Ecosystems Science and Application (FESA) Program Sub-Activity is to increase scientific knowledge on forest ecosystems and to support stakeholders in their sustainable forest management policies and practices. As part of this Program Sub-Activity, NRCan conducts research, national assessments and monitoring to develop, synthesize and integrate scientific knowledge. This knowledge is used by the appropriate government jurisdictions, industry and other stakeholders to develop forest management practices and policies, and by NRCan and other federal departments to meet international reporting obligations, form Canada’s negotiating positions on international environmental issues related to forests, and counter misconceptions about Canada’s forest practices. The evaluation encompassed the five-year period from fiscal year 2007/08 to 2011/12 and examined FESA’s objectives and activities, focusing on relevance and performance issues.

Multiple indicators and lines of evidence were used to address the main evaluation questions:

Phase I – Design: Prior to the evaluation, an evaluation assessment was conducted by Science-Metrix to develop a program profile for FESA, including the adjustment/confirmation of the logic model. The evaluation assessment and subsequent planning phase also served to collect, analyze and assess available data for use in the subsequent evaluation (i.e., data quality assessment), including the identification of potential interviewees, financial data, and project-level data. The evaluation assessment not only ensured that the information necessary to complete the work was collected and reviewed, but also improved the buy-in and understanding of various participants who would be involved in the evaluation. During the design phase, the evaluation framework, questions, and program logic model were further developed and refined. Most of the evaluation tools were developed during this phase.

Phase II – Fieldwork and Technical Reporting: Based on the use of multiple indicators and lines of evidence to address the main evaluation questions, the second phase consisted of the implementation of the evaluation tools. Four data-collection methodologies were developed and implemented in order to provide evidence from multiple types (quantitative and qualitative) and sources (primary and secondary) of data. Findings identified from each line of evidence were then summarized and presented by evaluation question in four separate technical reports. In addition, a technical report summarizing findings from the database analysis conducted by NRCan was provided to Science-Metrix.

Phase III – Integration and Reporting: All data collected as part of the fieldwork, including that already collected during the evaluation assessment phase and by NRCan (database analysis), were integrated into a final evaluation report.

The following lines of evidence were used in this evaluation:

Database Analysis: NRCan’s Strategic and Evaluation Division conducted a review of quantitative data (including financial information) contained within FESA’s internal planning tool to provide insight into the efficiency and economy of the sub-activity.

Document Review: A thorough review of approximately 200 documents, secondary literature, files and sub-activity data was conducted to provide an overall understanding of the FESA Program Sub-Activity, as well as details on program needs, alignment with government priorities and other information to inform relevance issues. These documents primarily consisted of departmental documents (strategic plans, annual reports, audits, etc.) and program documents (financial and administrative databases, performance monitoring, communication outputs, etc.).

Survey of Component Leads: A web survey was conducted with 56 component leads out of a total survey population of 93. The main objective of the survey was to collect information regarding achievement of component objectives and subsequent outcomes, participation of partners/collaborators, level of stakeholder satisfaction, etc. The survey primarily addressed performance issues to fill data gaps relating to outcomes, but relevance issues were also briefly examined. Views on cost-effectiveness and program delivery were also collected.

Interviews and Focus Groups: A total of 97 interviews were conducted with stakeholders both internal (i.e., individuals who played a significant role in the design and delivery of FESA) and external (including federal and provincial representatives; industry; international experts; academia; NGOs). Internal stakeholders provided valuable insight on relevance, efficiency and economy, as well as on factors that impacted program performance and lessons learned. External stakeholders helped evaluators gauge the levels of awareness and stakeholder support for the program and offered valuable observations of achievements, impacts, and alternatives. Additionally, five focus groups were conducted to gain an in-depth understanding of the needs of external stakeholders and how FESA activities relate to these needs.

Bibliometrics and Webmetrics: Bibliometric and webmetric methods were used to analyze the scientific performance and use of peer-reviewed publications, grey literature and other types of outputs (e.g., tools, frameworks, databases) of the FESA Program Sub-Activity.

The evidence suggested that while there is an ongoing need for the FESA sub-activity, a few adjustments should be made to best fulfill that need. As such, six recommendations were made, notably to clarify its mandate and objectives in certain areas and improve communication with both internal and external stakeholders.

Read the report

 

SCIENCE-BASED POLICY AND REGULATION

Evaluation of the Office of Energy Efficiency (2014)

Client: Natural Resources Canada

Description: This report presents the findings of the evaluation of Natural Resources Canada’s (NRCan’s) Energy Efficiency Programs. The programs are administered by NRCan’s Office of Energy Efficiency (OEE), part of the Energy Sector. These programs were last evaluated in 2009/10. The evaluation covers program activities from 2009/10 to 2013/14, comprising $1.1 billion in NRCan funding. The objective of the Sub-Program is to increase energy efficiency, resulting in energy savings and reduced greenhouse gas emissions across targeted sectors of the Canadian economy (housing, buildings, vehicles, equipment, and industry). To do this, the Sub-Program has activities that target the energy-using behaviour of consumers and businesses.

The evaluation employed the following multiple lines of evidence:

Document and Performance Measurement Data Review
Evaluators reviewed OEE’s Performance Measurement data, key program documents, and completed impact studies and surveys covering the evaluation period. The findings from this exercise assisted NRCan’s Strategic and Evaluation Division (SED) in calibrating the data collection instruments so as to focus primary data collection on specific evaluation questions, program activities, and issues where there were information gaps or where findings needed further explanation. This information was used to inform the final data collection tools and the interview and case study selections, and to corroborate evaluation results. As additional documents were collected through interviews, they were incorporated into the document review to ensure they were considered in the final evaluation analysis.

Interviews
Key informant interviews were conducted to respond to the majority of the evaluation questions and to complement the other methods. A total of 100 semi-structured interviews, each lasting approximately one hour, were conducted across the five program sectors to provide in-depth qualitative information on activities, context, relevance, progress towards outcomes, and economy and efficiency, including delivery alternatives. These comprised 20 interviews with internal program staff, 72 interviews with external partners, users and non-users of OEE programs, and 8 interviews with other organizations that are not directly involved in the programs but could speak to the energy efficiency needs and views of their membership vis-à-vis OEE Energy Efficiency programs.

Case Studies 
Fifteen case studies were conducted across the five OEE sectors. The case studies provided an in-depth review of individual initiatives to more fully understand what occurs from program delivery to achieving results in particular instances. Case studies were selected to assess key causal mechanisms in the program theory and to confirm theories of implementation and change (e.g., the effectiveness of regulation in a particular instance, or the effectiveness of training).

  • Ongoing activities: Ten standard-effort case studies (i.e., 5 interviews, mostly external, plus a review of documents and data provided by OEE and interviewees) focused on ongoing activities that update key activities in the program’s theory of change: providing training, information development, and regulations, codes and standards; and
  • New activities: Given the need to provide information on activities that had not been evaluated before (i.e., those added under the 2011 mandate), five case studies were assigned a higher level of effort (i.e., 10 interviews, mostly external, plus a review of documents and data provided by OEE and interviewees) to assess the roll-out and implementation of the new program activities identified in the 2011 mandate for the ecoENERGY Efficiency programs.

Surveys
Stakeholder and client surveys provided quantitative data on the views and experiences of target audiences that have used OEE products and services, including impacts of programming on their behaviour. SED worked closely with OEE to augment their already planned studies to capture data for the evaluation. To preserve neutrality, SED had final approval on specific evaluation questions added for the evaluation and advised on data collection. Target groups included building and industrial facilities managers, home builders, homeowners, ENERGY STAR program participants, and trucking companies (carriers).

Read the report

Evaluation of the Enhanced Feed Ban (2013)

Client: Canadian Food Inspection Agency (CFIA)

Description: In accordance with the Treasury Board Secretariat (TBS) Policy on Evaluation, the primary objective of this evaluation was to assess the relevance (continued need, alignment with government priorities, and alignment with federal government roles and responsibilities) and performance (achievement of expected outcomes, as well as demonstration of efficiency and economy) of the Enhanced Feed Ban (EFB) and to provide recommendations to improve program effectiveness and efficiency, as necessary.

The Enhanced Feed Ban is part of the Government of Canada’s Bovine Spongiform Encephalopathy (BSE) Program, a horizontal initiative led by the CFIA. The main objective of the EFB is to accelerate Canada’s progress in BSE management by preventing more than 99% of potential infectivity from entering the feed system and by enhancing the risk management of BSE transmission in the cattle herd. To this end, the EFB aims to:

  • Strengthen animal feed restrictions through amendments to the relevant regulations;
  • Ensure compliance with control measures around prohibited materials and specified risk material (SRM) removal; and
  • Increase the level of verification and confidence that SRM is segregated from feed, fertilizer and pet food and that prohibited materials are not fed to ruminants.

The removal of SRM from animal feed is an important animal health protection measure and an indirect public health protection measure.

Evaluation methodology

This evaluation used a combination of quantitative and qualitative methods, as follows:

  • Program data review
  • Document and literature review
  • Program documentation and file review
  • Internal and external stakeholder interviews
  • Analysis and integration of data

Read the report

Evaluation of Canadian Blood Services Grant and Contribution Programs (2013)

Client: Health Canada and the Public Health Agency of Canada

Description: Within the Canadian health system, a critical role of Health Canada is to act as funder and information provider, supporting organizations through grants and contributions to help meet overall health system objectives. The non-profit Canadian Blood Services (CBS) is one of these organizations supported through federal funds (as well as provincial and territorial funding), operating at arm’s length from government. Related to these broad lines of work, Health Canada supports two specific CBS programs through the Grant and Contribution Program under evaluation: the Organ and Tissue Donation and Transplantation (OTDT) Program (five-year contribution, $17.9 million total) and the Blood Research and Development (R&D) Program ($5 million annual grant). The evaluation period for both programs was from 2008/09 to 2012/13. The evaluation covered activities carried out by CBS under the federal components of the OTDT and R&D Programs (i.e., those that were funded from Health Canada’s Grant and Contribution Program).

In accordance with the 2009 Treasury Board Policy on Evaluation and related Directives, the evaluation focused on core issues of Relevance and Performance (effectiveness, efficiency and economy) of the two CBS programs.

Evaluation methodology

Multiple lines of evidence were used to address 10 evaluation questions.

Literature, document and file review (including the review and analysis of administrative data): This review of program documents, files, databases, and governmental and departmental documents (e.g., budgets, reports on plans and priorities, performance reports, RMAF, etc.) was a key line of evidence to address several core issues. For this particular evaluation, more than 700 documents and files were reviewed.

Web surveys (2): Two surveys were conducted with CBS and OTDT external stakeholders. These surveys were particularly important to gather information on the achievement of expected outcomes.

Key informant interviews (18): Key informant interviews were conducted with Health Canada management and staff (i.e., internal stakeholders who have played a significant role in the oversight of this grant and contribution program), as well as with representatives from CBS (i.e., the recipients). Key external stakeholder and/or expert groups outside of these two organizations, such as partners (e.g., CIHR, CIHI, associations, representatives from provinces/territories), beneficiaries, and subject experts (particularly those who have played a direct role in program activities), were also consulted.

Overall, the evaluation found that the CBS Grant and Contribution Programs continue to address a demonstrable need and remain responsive to the needs of Canadians. Three recommendations were made to Health Canada to strengthen the program’s objectives, develop contingency plans and facilitate the shift toward R&D.

Read the report

Evaluation of the CFIA’s Enhanced Bovine Spongiform Encephalopathy (BSE) Initiative (2009)

Client: Canadian Food Inspection Agency (CFIA)

Description of project 

In response to the first BSE cases in May 2003, the Government of Canada, through Health Canada and the CFIA, enhanced its existing BSE-related activities and rapidly developed a comprehensive suite of science-based measures to effectively minimize the likelihood of exposure, amplification and spread of BSE within the cattle population. Since 2003, the CFIA has obtained additional financial resources for the Enhanced BSE Initiative, including the following components:

  • Enhanced Surveillance and Testing
  • Enhanced Tracking, Tracing and Enforcement
  • Expansion of Domestic Slaughter Capacity
  • Removal of Specified Risk Material
  • Expanding Export Markets

This evaluation examined the following issues: success and achievements, governance, design and delivery, relevance and continued need, performance measurement and reporting, and cost-effectiveness and alternatives.

The evaluation was based on evidence from: (1) interviews with key informants internal and external to the CFIA and (2) a review of documents and files from the CFIA and relevant external organizations.

 

RESEARCH FUNDING

Genome Canada Five-Year Evaluation (2015)

Client: Genome Canada

Description: Genome Canada, established in 2000, is a not-for-profit organization that invests in large-scale genomics initiatives in sectors of strategic and economic importance to Canada, aiming to strengthen genomics research and technical capacity in Canada and to foster multi-sectoral partnerships nationally and globally. With a view to generating economic and social benefits for Canadians, its target sectors include health, agriculture, environment, forestry, fisheries, and energy and mining. Genome Canada has also worked to ensure that genomics research efforts consider underlying ethical, environmental, economic, legal and social aspects (GE3LS).

The purpose of the second five-year evaluation of Genome Canada was to assess the organization’s relevance and retrospective performance in the context of the Canadian research and innovation system from 2009/10 to 2013/14. The report also includes a prospective dimension, as it seeks to inform management and other stakeholders on how best to implement the organization’s strategic direction (Strategic Plan 2012–2017).

Evaluation methodology

Five data collection methods were used as part of this evaluation.

Management and delivery review: As part of the review, 24 interviews were conducted, and a document and file review of over 200 documents related to Genome Canada’s activities was performed. The purpose of the management and delivery review was to examine the history, processes and performance of Genome Canada across the evaluation period (2009–2014). Interviews were particularly useful to inform issues of continued need, efficiency and economy, whereas documents were one of the main sources of evidence – including quantitative evidence (i.e., financial and output data) – on the achievement of outcomes, as well as efficiency and economy issues.

Survey: Web surveys were conducted with five stakeholder groups, namely the Principal Investigators (PIs; n=53), co-applicants and other investigators (co-PIs; n=153), GE3LS PIs and leaders (n=52), highly qualified personnel (HQP; n=153) and other stakeholders (e.g., partners, collaborators, current and potential end-users; n=137) involved in Genome Canada-supported projects. These surveys mainly sought to collect information on performance issues to address data gaps relating to outcomes. Views regarding cost-effectiveness and delivery were also collected.

Bibliometrics: Using various indicators (e.g., scientific output, specialization, citation impact), Science-Metrix assessed the scientific performance of Genome Canada’s funded researchers in genomics and in each of the strategic sectors, namely agriculture, health, environment, fisheries/aquaculture, forestry and energy/mining. This performance was also examined within the broader Canadian context (e.g., comparison to non-funded researchers, benchmarking with other countries) in an attempt to determine Genome Canada’s contribution to the national standing in genomics over the years. The level of national and international scientific collaboration of Genome Canada and of individual researchers was also measured, as an indicator of the organization’s effectiveness in coordinating genomics research efforts.
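
As a rough illustration of the kind of field-normalized indicator that underpins such comparisons, the sketch below computes a simplified average of relative citations (ARC) for a hypothetical funded group and a comparison group. All baselines and paper records are invented for demonstration; an actual bibliometric analysis would draw them from a citation database and compute them over thousands of papers.

    # Illustrative sketch only: a simplified average of relative citations (ARC).
    # Each paper's citation count is divided by an assumed world average for the
    # same field and publication year; a group's ARC is the mean of these ratios
    # (ARC > 1 suggests citation impact above the world level).

    # Assumed world average citations per paper, by (field, publication year)
    world_baselines = {
        ("genomics", 2011): 9.4,
        ("genomics", 2012): 7.8,
    }

    # Each paper is recorded as (field, publication year, citations received)
    funded_papers = [("genomics", 2011, 15), ("genomics", 2012, 6), ("genomics", 2012, 12)]
    comparison_papers = [("genomics", 2011, 8), ("genomics", 2012, 7)]

    def average_relative_citations(papers, baselines):
        """Mean of citations normalized by the world average for the same field and year."""
        ratios = [cites / baselines[(field, year)] for field, year, cites in papers]
        return sum(ratios) / len(ratios)

    print("Funded group ARC:    ", round(average_relative_citations(funded_papers, world_baselines), 2))
    print("Comparison group ARC:", round(average_relative_citations(comparison_papers, world_baselines), 2))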

Case studies: Eight projects funded by Genome Canada were examined in depth as part of the case study method. The selection of case studies focused on emerging and natural resource sectors, namely agriculture, environment/energy/mining, fisheries, and forestry, in order to address particular needs relating to the implementation of the 2012–2017 Strategic Plan. Each case involved conducting two or three interviews with relevant stakeholders (e.g., project contributors, partners and/or end users). A project-level document review was also conducted, examining applications; quarterly, interim and annual reports; and information such as collaboration agreements, scientific publications and any other related outputs provided by Genome Canada and interviewees. The purpose of the case studies was to provide insight on key themes that span projects and that relate to the success and impact of large-scale genomics projects. The resulting cross-case analysis is presented in Appendix 2.

International comparative review: The data collection and analysis for the international comparative review were based on two methods: a literature review of five organizations comparable with Genome Canada and targeted interviews with representatives from three of these organizations. The literature review component focused on identifying and extracting relevant information and data from the organizations’ websites and other relevant sources such as grey literature, funding announcements, etc. A total of four interviews were also conducted with individuals knowledgeable about their organization and the country’s overall support mechanisms for genomics research. This line of evidence was used to assess the continued need for national support of genomics research and contribution of Genome Canada to Canada’s global leadership in this field, as well as to position Genome Canada in the global context according to elements such as operating environment, strategy development, design and delivery of programs or projects, and best practices.

Read the report

Evaluation of the Canada Excellence Research Chairs (CERC) Program (2014)

Client: Social Sciences and Humanities Research Council (SSHRC)

Description: Launched in 2008, the Canada Excellence Research Chairs (CERC) Program supports Canadian universities in building a critical mass of expertise targeted within the four priority research areas of the Government of Canada’s Science & Technology (S&T) Strategy, in support of Canada’s growing reputation as a global leader in research and innovation. As a tri-agency initiative of the Canadian Institutes of Health Research (CIHR), the Natural Sciences and Engineering Research Council (NSERC) and the Social Sciences and Humanities Research Council (SSHRC), the Program is administered by the Chairs Secretariat, housed within SSHRC.

The evaluation of the CERC Program was conducted by SSHRC in collaboration with Science-Metrix and covered the initial five years of the Program, from inception in 2008/09 to 2013/14 (the end of phase 1 of the second competition). The evaluation addressed key questions of relevance, performance and efficiency, in accordance with section 42.1 of the Financial Administration Act and the Treasury Board Policy on Evaluation (2009).

Evaluation methodology

Ten evaluation questions were addressed via eight data collection methods: document review, review of administrative and performance data, cost-efficiency analysis, interviews, international comparison study, web survey, bibliometric analysis, and case studies of CERC units. Methodological challenges were mitigated in a proactive manner, resulting in the collection of robust evaluation data that were triangulated across multiple lines of evidence for each evaluation question. The impact of any limitations (e.g., small population size, current stage in the program lifecycle) is noted in the report where it relates to specific findings or recommendations.

Read the report

Evaluation of the Caltech Commitment for the Gordon and Betty Moore Foundation (2013)

Client: Gordon and Betty Moore Foundation

Description: In 2002, the Gordon and Betty Moore Foundation made a long-term commitment of $300 million in potential grants to the California Institute of Technology (Caltech). The stated goal of this Commitment was to support the institution in “advancing its position at the forefront of higher education, technological development, and scientific research.” In 2012, Science-Metrix conducted an external evaluation of this Commitment, examining 24 of the 29 grants in the portfolio to date. The Foundation especially sought to gain an understanding of the significance of scientific achievements as well as the catalytic value of the Commitment funding approach. Overall, this evaluation aimed to provide conclusions and recommendations regarding the Commitment’s impact for science, for Caltech and for the Foundation.

Evaluation methodology: Multiple lines of evidence were used to conduct the evaluation:

  • Document and file review: About 250 documents were reviewed, including administrative and financial documents, annual reports, grant applications and related documentation, as well as scientific and grey literature.
  • Interviews (53): To validate documentary information and to further explore grant outcomes and other issues, interviews were conducted with the following groups:
    • 21 interviews with grant Principal Investigators
    • 9 interviews with management (at Caltech and at the Foundation)
    • 23 interviews with case-study-specific grant stakeholders
  • Case studies (8): Eight selected grants were explored in detail to generate a deeper understanding of the complexity of the funded initiatives, and the significance of their scientific achievements. The case studies involved:  
    • Targeted bibliometric analyses (4 of the 6 case studies)
    • Additional grant-specific document and literature reviews
    • Additional interviews with grant stakeholders

In addition to the above, Science-Metrix was responsible for preparing and facilitating an Expert Panel meeting. The Expert Panel provided expert judgement on the scientific achievement of the Caltech Commitment, reviewed and validated the preliminary findings of the evaluation, and elaborated on preliminary recommendations. Science-Metrix prepared the material for this panel to review, and analyzed the results of the panel meeting for inclusion into the final evaluation report.

Overall, the evaluation found that nearly all of the grants funded under the Commitment realized, or will realize, their originally stated outcomes. Of all of the grants examined, four have heightened potential to be transformational for science. The results stemming from the majority of grants were found to be highly attributable to the Commitment. The Commitment also had a remarkable impact on Caltech, by facilitating Caltech’s ability to concentrate its resources on specific research areas and by directly allowing Caltech to maintain or advance its position in education, research, and technology.

Evaluation of the Collaborative Research and Development (CRD) Grants (2010)

Client: Natural Sciences and Engineering Research Council of Canada (NSERC)

Description: NSERC mandated Science-Metrix to carry out the evaluation of its CRD program. The CRD program is intended to give companies that operate from a Canadian base access to the unique knowledge, expertise, and educational resources available at Canadian postsecondary institutions and to train students in essential technical skills required by industry. The mutually beneficial collaborations are expected to result in industrial and/or economic benefits to Canada. CRD Grants support well-defined projects undertaken by university researchers and their private-sector partners. Direct project costs are shared by the industrial partner(s) and NSERC. Projects may range from one year to five years in duration, but most awards are for two or three years.

This evaluation, which covered the 1997–2008 period, included several data collection methods: (1) a documents/literature, files and program data review; (2) a grants review; (3) key informant interviews; (4) five web surveys (conducted with academic researchers, industrial partners, unfunded academic researchers, unfunded industrial partners, and highly qualified personnel (HQP)); (5) an economic impact analysis; and (6) six case studies.

The findings indicate that the CRD program is relevant, well designed, appropriately delivered, and generally provides considerable long-term benefits to industrial partners, academic researchers and HQP.

Read the report

 

NETWORKING, PARTNERSHIP AND KNOWLEDGE TRANSFER

The UK’s Performance in Physics Research—National and international perspectives (2014)

Client: The Institute of Physics (IOP)

Description: The Institute of Physics (IOP), in collaboration with the Engineering and Physical Science Research Council (EPSRC) and the Science and Technology Facilities Council (STFC), sought to commission a comprehensive assessment of the performance and impact of UK physics research in order to demonstrate its value at the highest level.

The objectives of the study were to:

  • Compare the UK’s physics research output with that of leading countries and other scientific disciplines
  • Compare the UK’s physics research impact with that of leading countries and other scientific disciplines
  • Explore patterns of international collaboration among UK physics researchers
  • Identify the determinants of UK physics research success and its impacts

This report, prepared by Science-Metrix, made use of a portfolio of established data collection methods, encompassing both primary and secondary sources, in order to answer nine study questions in full, supported by concrete evidence and rigorous analysis:

  • Bibliometric methods were used to compile robust data on research output, impact, and collaboration in physics research, its sub-areas as well as in related research areas in the UK and other leading countries. The bibliometric study used indicators based on counts of scientific papers and citations to papers (e.g., number of papers, specialisation index (SI), average of relative citations (ARC), average of relative impact factors (ARIF), international and national collaboration rates).
  • Interviews (12) were carried out with UK physics researchers, active in four UK physics research sub-areas, to discuss the impacts of their research.
  • Case studies (4) focused on these sub-areas in order to compile and compare information on the variety of impacts from UK-based research.

The project took place over a span of 16 months (December 2012 to March 2014). The bibliometric indicators are based on counts of scientific papers and citations to papers indexed in the Web of Science database produced by Thomson Reuters.
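
For readers less familiar with these indicators, the sketch below shows how a specialisation index (SI) can be computed from paper counts: the country’s share of output in the field divided by the world’s share of output in that field. The counts used here are invented purely for illustration; the study itself derived its counts from the Web of Science.

    # Illustrative sketch only: specialisation index (SI) from paper counts.
    # SI > 1 indicates that the country publishes proportionally more in the
    # field than the world does on average; SI < 1 indicates the opposite.

    # Hypothetical counts, for demonstration only
    uk_physics_papers = 12_000      # UK papers in physics
    uk_total_papers = 95_000        # UK papers across all fields
    world_physics_papers = 210_000  # world papers in physics
    world_total_papers = 2_400_000  # world papers across all fields

    def specialisation_index(country_field, country_total, world_field, world_total):
        """Ratio of the country's field share to the world's field share."""
        return (country_field / country_total) / (world_field / world_total)

    si = specialisation_index(uk_physics_papers, uk_total_papers,
                              world_physics_papers, world_total_papers)
    print(f"Specialisation index: {si:.2f}")  # a value above 1 would suggest relative specialisation in physics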

Read the report

Evaluation of the Networks of Centres of Excellence New Initiative (NCE-NI) (2009)

Client: Social Sciences and Humanities Research Council of Canada (SSHRC)

Description of project 

The NCE–NI pilot was established in 2003 by the NCE Secretariat to: (1) facilitate the creation of networks; (2) support networking activities among well-established researchers or research teams to encourage them to develop new partnerships with receptor communities; and (3) respond to the needs for interaction, partnership, and networking.

This evaluation examined the following issues: (1) the results of the pilot, in terms of outputs and outcomes; (2) the delivery of the pilot, as delivered by the NCE Secretariat and by the NCE–NI networks themselves; and (3) the relevance of the pilot.

Multiple methods and lines of evidence were used to address the evaluation issues: (1) a review of administrative documents, files, data and web-based sources; (2) case studies of the five funded NCE–NIs (including interviews and a documentation review); (3) a comparative analysis with comparable programs; and (4) a counterfactual analysis.

Read the report