CHAPTER THREE: MEASURING SOCIAL IMPACT

As social finance is broadly aimed at improving social outcomes, evidence and measurement of such outcomes will play an important part in any social finance project moving forward. The Committee heard that proper measurement of social outcomes and a strong evidence base are essential to the implementation of social finance tools, and could ultimately lead to more effective social programming. However, as will be outlined below, witnesses also described significant challenges associated with developing appropriate metrics and evaluating the outcomes generated by social finance interventions.

A. Improving Social Outcomes

Siobhan Harty indicated that rigorous use of metrics to determine whether the agreed-upon outcomes are achieved is essential to ensure effective use of resources and accountability when using public funds.[110] Furthermore, when accompanied by the appropriate metrics and evaluation methods, focusing on social outcomes could lead to a better idea of which interventions and programs work.[111]

With regard to the measurement currently taking place in ESDC, Ms. Harty indicated that ESDC has the data and the skill set to measure outcomes, and that such a measurement model could be applied in other sectors without requiring a significant increase in resources.[112] In addition, with regard to the Department’s potential for future action on outcomes measurement, Ms. Harty emphasized the following:

[F]or instance, in my directorate we do poverty measurement. We measure labour market outcomes … We have a research function that allows us to determine what the risk factors are for somebody who might have a poor labour market outcome, what the risk factors are for a young adult who's going to drop out of high school or post-secondary education. There's a large body of research in this country and internationally that would allow us to measure those things.[113]

Witnesses also indicated that there would be value in knowing when programs are not successful, or whether any change has occurred, in order to determine the extent of additional resources needed.[114]

Notwithstanding the value of measuring outcomes, the Committee heard that there are particular challenges associated with doing so. Some witnesses noted that social outcomes take more time to measure than program outputs, and require shifting perspective toward the longer term.[115] Mr. Jeffrey Cyr remarked:

There are a ton of indicators you can measure all across the board, everything from increased economic participation and better schooling to how [clients] adjust in society. There are ways. It’s not rocket science to do it, but it takes a lot of effort and you have to build systems very thoughtfully at the outset.[116]

To illustrate the challenges associated with measuring outcomes, Mr. Cyr spoke of a leadership program and the complexity of determining whether the program has in fact created a leader. He observed that the measurement of these types of outcomes is difficult in the relatively short term of a typical government cycle. He expressed the need for a longitudinal measurement system, one that would establish the short-, medium- and long-term changes that are targeted.[117] Expressing a similar view, Tim Richter indicated that measuring outcomes would require tracking individuals over time to determine whether or not they fell back into homelessness.[118]

Although speaking specifically about SIBs, Professor John Shields outlined the importance of measuring outcomes over outputs, as well as the significant investment associated with such measurement:

Data is, I think, absolutely critical. To know if they’re [SIBs] going to be effective or not, we’re going to need substantial data. That means, obviously, far more than counting bums in seats. It means actually using statistics from organizations like Stats Canada, being able to attach those to projects, trying to evaluate the outcomes of things like recidivism within the larger context of other factors happening within society. This requires, I think, some significant type of investment, in terms of the analysis and the importance of evidence-based data. That is a challenge with SIBs, but I think it’s a challenge more generally in terms of evaluating the outcomes of programs.[119]

Echoing these concerns, James Mulvale of the University of Manitoba suggested that governments instead draw on existing research to develop evidence-based approaches and improve the current public finance model.[120]

Witnesses also noted that focusing on social outcomes requires an appreciation of qualitative impacts, in addition to strictly quantitative results, which may be more difficult to demonstrate in terms of returns to investors.[121]

In an attempt to address these measurement challenges, some witnesses suggested that social finance initiatives should target projects that generate outcomes more susceptible to measurement, such as “[f]inding work for otherwise unemployable people, preventing recidivism, housing people who would otherwise be unhoused.”[122] However, as Andrew McNeill of the National Union of Public and General Employees argued, most social problems are influenced by many factors, making it hard to determine whether a specific program has had the desired social impact.[123]

Indeed, even with outcomes that are susceptible to measurement, causal relationships are often difficult to establish. As Sharon Mayne Devine of the Honourable William G. Davis Centre for Families explained, while one can measure the number of murders in a given region where a safe centre exists for victims of violence, it is difficult to assess whether it is the presence of the safe centre that directly contributed to preventing the crimes. Obtaining such data would require significant resources.[124]

The Committee heard that all parties to a social finance project should be involved in deciding which outcomes to measure.[125] Once outcomes are agreed upon, some witnesses indicated that evaluating whether outcomes are achieved would best be done by an independent third party.[126]

The Social Research and Demonstration Corporation (SRDC), a non-profit independent social policy research organization, is an example of such a third party. As noted previously, ESDC contracted SRDC to be an independent evaluator on two social finance essential skills training projects, where “private investors pay for the training up front and are repaid by the government if the training is successful in achieving pre-established outcomes.”[127] As the evaluator, SRDC designed the evaluation, but involved the proponents and intermediaries of the projects from the outset. Benchmarks were determined based on evidence from previous essential skills training programs.[128]

The projects evaluated by SRDC illustrate the challenges in measuring social outcomes. In these skills training projects, repayment is triggered by gains in literacy skills, measured before and after training. The gains in literacy skills are used as a proxy for labour market success. These gains are intermediate outcomes, and are not directly associated with measurable cash savings to government.[129]
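
To make the repayment mechanics concrete, the following sketch shows how a pre/post proxy measurement of this kind might trigger repayment in a pay-for-success arrangement. It is a minimal illustration only: the scoring scale, gain threshold and repayment terms are hypothetical assumptions, not details of the SRDC-evaluated projects.

```python
# Hypothetical sketch of an outcomes-based repayment trigger.
# The literacy scale, gain threshold and repayment terms below are
# illustrative assumptions, not the actual terms of the SRDC projects.

def literacy_gain(pre_score: float, post_score: float) -> float:
    """Gain in literacy score, measured before and after training."""
    return post_score - pre_score

def repayment_due(pre_score: float, post_score: float, principal: float,
                  gain_threshold: float = 25.0,  # hypothetical threshold
                  return_rate: float = 0.05) -> float:
    """Government repays principal plus a return only if the
    pre-established outcome (a minimum literacy gain) is achieved."""
    if literacy_gain(pre_score, post_score) >= gain_threshold:
        return principal * (1 + return_rate)
    return 0.0  # outcome not met: investors bear the loss

# Example: a participant moves from 210 to 240 on a hypothetical scale.
print(repayment_due(pre_score=210, post_score=240, principal=100_000))
```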

B. Developing Appropriate Metrics

Metrics are tools to define and measure the outcomes sought. The Committee heard that they are “critical to success and … they need to be identified from the start and must show value or savings to government.”[130]

According to witnesses from the Mowat Centre, the evidence base currently available to actors in the social finance marketplace is a patchwork at best. They indicated that a valuable role for government would be to “invest in better evidence and measurement to support promising opportunities for program innovation and support the long-term development of evidence-based policies.”[131]

Tools to measure social impact, and specifically the ability to ensure that such tools remain consistent, were identified as a challenge by an official of the Saskatchewan government.[132] Adam Spence explained the assessment of impact as having three components: a standard of impact, a plan for improvement, and appropriate metrics:

I think, secondly, beyond the standard there’s also having metrics, reportable metrics, or data points that are going to be able to demonstrate the change that exists among the enterprises and organizations that you’re working with. There are taxonomy or translation devices, including the Impact Reporting and Investment Standards of the Global Impact Investing Network, which can be used in this regard. There are many local examples that have been generated by Canadian enterprises and non-profit organizations.[133]

Some witnesses described the tools they have developed to measure the impacts generated by their work. For example, Vickie Cammack of Planned Lifetime Advocacy Network and Tyze Personal Networks told the Committee that her organization looks at “measuring the individual’s experience, their outcomes, and the economic efficiency of the application. Those three pieces are really key.”[134] Jeffrey Cyr told the Committee that his organization has created a system in which staff conduct a 20-minute intake session with a client to measure where the client stands on a given outcome – in that particular case, public speaking and engagement – using various indicators. This short intake can be repeated at different points in time to measure change (a simplified sketch of this approach follows Mr. Aitken’s remarks below).[135] In addition, Preston Aitken of Enactus Canada explained:

As an organization, we have implemented our own standardized metrics using research on such existing frameworks as IRIS and the sustainable livelihoods model. That has been invaluable, as we now have a common framework and language for our Enactus teams to show our impact. We can aggregate and better understand our data nationally. However, these standards do not necessarily align with other organizations, as there are no common standards.[136]
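
By way of illustration, the sketch below records the kind of repeated intake measurement Mr. Cyr described: scores on a set of indicators are captured at successive sessions and compared over time. The indicator names, the scoring scale and the code structure are assumptions made for this example, not details of his organization’s actual system.

```python
from datetime import date

# Hypothetical sketch of repeated intake measurement over time.
# Indicator names and the scoring scale are illustrative assumptions.

class ClientOutcomeTracker:
    """Stores scores from short intake sessions repeated over time."""

    def __init__(self, indicators: list[str]):
        self.indicators = indicators
        self.sessions: list[tuple[date, dict[str, int]]] = []

    def record_session(self, when: date, scores: dict[str, int]) -> None:
        self.sessions.append((when, scores))

    def change(self) -> dict[str, int]:
        """Change on each indicator between the first and latest sessions."""
        (_, first), (_, latest) = self.sessions[0], self.sessions[-1]
        return {i: latest[i] - first[i] for i in self.indicators}

tracker = ClientOutcomeTracker(["public_speaking", "engagement"])
tracker.record_session(date(2015, 1, 15), {"public_speaking": 3, "engagement": 4})
tracker.record_session(date(2015, 6, 15), {"public_speaking": 6, "engagement": 7})
print(tracker.change())  # {'public_speaking': 3, 'engagement': 3}
```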

Beyond metrics related to the immediate users of a given social finance project, organizations can find it challenging to measure broader community impact. Ms. Devine spoke of the difficulty of measuring large-scale impact for an entire community:

When we look at larger-scale impacts, for a very large community, it's a challenge to measure some of those impacts. Doing that kind of impact study also requires dollars and investments of money. Sometimes we're asked to do that measuring, on the one hand, but we're not given the resources we would need to actually do the kind of study or the kind of work we need to do in order to demonstrate that impact. On the micro-level, we can demonstrate it. At a larger community-based level, we're just now beginning to be able to do that.[137]

Some witnesses stated that common standards or a universal mechanism for measuring and reporting social impact in Canada would be necessary. Such a national measurement standard would, the Committee heard, allow impacts to be compared against a common baseline throughout the country.[138]

Witnesses also made reference to the United Kingdom’s Unit Cost Database, which provides the “cost” to taxpayers of over 600 social outcomes. As Tim Jackson explained:

The United Kingdom has posted on its cabinet office website the cost of 600 outcomes, everything from how much it costs to keep a single mother together with her child, to how much it costs to incarcerate a 16-year-old, to how much it costs to incarcerate a 45-year-old for the third time. They've essentially said to the private sector and to foundations, “Here is what we think it costs the taxpayers. If you can do it more cheaply, make us an offer on a bond.”[139]

Some witnesses suggested that the federal government could play a similar role in Canada by providing uniform information about the monetary value of such outcomes. Knowing the costing structure of social outcomes would allow stakeholders interested in the social finance marketplace to assess the monetary value of a given intervention in terms of cash savings to the government.[140]
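
To illustrate how such uniform costing information might be used, the sketch below compares a published unit cost with a provider’s proposed delivery cost to estimate cash savings to government. The outcome labels and dollar figures are invented for this example and are not taken from the U.K. database.

```python
# Hypothetical sketch of using a unit-cost database to estimate savings.
# Outcome labels and dollar figures are invented for illustration.

UNIT_COSTS = {
    "youth_reoffending_avoided": 50_000,  # hypothetical cost to taxpayers
    "person_housed_one_year": 30_000,
}

def estimated_savings(outcome: str, provider_cost: float, cases: int) -> float:
    """Savings to government if a provider delivers an outcome for less
    than its published unit cost, across a given number of cases."""
    return (UNIT_COSTS[outcome] - provider_cost) * cases

# A provider offers to deliver the outcome at $35,000 per case for 20 cases.
print(estimated_savings("youth_reoffending_avoided", 35_000, 20))  # 300000
```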

The Committee heard that such outcomes-based values are particularly useful in the context of creating SIB agreements. For example, Kieron Boyle described how the U.K. Government had applied outcomes-based metrics to assess the costs and progress of a recently established £30 million social impact bond aimed at preventing youth unemployment. With respect to establishing the cost of the intervention, Mr. Boyle explained that:

What it was doing was intervening in youths aged 14 to 17 to improve things like their school attendance and their grades, because we know there's a very high correlation between those sorts of outcomes among those aged 14 to 17 and the risk of somebody becoming unemployed at age 18 to 21.
Essentially we have done a lot of data matching to say, if you achieve those sorts of things, what is the likelihood that somebody becomes employed or unemployed at age 18? That's around our knowing how much we save when somebody's employed at 18 versus unemployed at age 18. We've been able to put a price on those outcomes occurring for ages 14 to 17. We then put that out to the market, and predominately social enterprises and social sector organizations have said they can achieve that. The way in which they're achieving it is even in the sorts of ways that you're saying.[141]

Mr. Boyle further noted that the overall process of defining and evaluating progress toward the achievement of positive outcomes on this SIB is “a strictly and tightly defined process where the public managers will look at the outcomes … they're trying to achieve proof of those outcomes, and also the amount that they're willing to pay for those outcomes.” More specifically, he noted that, “these first social impact bonds that have been set up … [are] heavily evaluated so they will be spotting the longer term outcomes for these youths.”[142]
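
In effect, Mr. Boyle described pricing an intermediate outcome by its effect on the probability of a later, costed outcome. The sketch below shows that expected-savings logic in simplified form; all probabilities and dollar figures are hypothetical.

```python
# Hypothetical sketch of pricing an intermediate outcome (ages 14-17)
# by its effect on employment at age 18. All figures are illustrative.

def outcome_price(p_employed_with: float, p_employed_without: float,
                  saving_if_employed: float) -> float:
    """Expected saving from achieving the intermediate outcome: the
    uplift in the probability of employment at 18, multiplied by the
    saving to government when a person is employed rather than not."""
    uplift = p_employed_with - p_employed_without
    return uplift * saving_if_employed

# Better attendance and grades raise P(employed at 18) from 0.60 to 0.75;
# an employed 18-year-old saves government $20,000 (hypothetical figure).
print(round(outcome_price(0.75, 0.60, 20_000)))  # 3000 per young person
```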

Finally, David Juppe cautioned against using a “fixed cost per case” when evaluating savings. Based on his research on SIBs, Mr. Juppe advised that such an approach risks overstating savings. He gave the example of the fixed annual cost of housing an inmate, which blends the fixed costs of operating the facility with the variable costs of food and supplies for that inmate.[143] Preventing an individual from being incarcerated would not save the government the entire fixed cost associated with that individual, since the facility would continue to operate.
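
A simple worked example, using invented figures, makes Mr. Juppe’s point concrete: the “fixed cost per case” suggests a far larger saving than the variable cost that is actually avoided when one person is diverted.

```python
# Hypothetical illustration of the 'fixed cost per case' caution.
# All dollar figures are invented for this example.

FACILITY_FIXED_COST = 10_000_000   # annual cost of operating the facility
CAPACITY = 200                     # number of inmates
VARIABLE_COST_PER_INMATE = 8_000   # annual food and supplies per inmate

# Naive 'fixed cost per case': allocates facility costs to each inmate.
fixed_cost_per_case = FACILITY_FIXED_COST / CAPACITY + VARIABLE_COST_PER_INMATE
print(fixed_cost_per_case)       # 58000.0 -- overstates the saving

# Actual short-run saving from diverting one person: the facility keeps
# operating, so only the variable cost is avoided.
print(VARIABLE_COST_PER_INMATE)  # 8000
```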


[110]     HUMA, Evidence, 2nd Session, 41st Parliament, 17 February 2015, 1530 (Siobhan Harty).

[111]     HUMA, Evidence, 2nd Session, 41st Parliament, 19 February 2015, 1610 (Sarah Doyle).

[112]     HUMA, Evidence, 2nd Session, 41st Parliament, 14 May 2015, 1605 (Siobhan Harty).

[113]     Ibid.

[114]     See for example HUMA, Evidence, 2nd Session, 41st Parliament, 31 March 2015, 1620 (Jeffrey Cyr).

[115]     HUMA, Evidence, 2nd Session, 41st Parliament, 24 March 2015, 1620 (Bill Crawford, Executive Director, Eden Community Food Bank); and HUMA, Evidence, 2nd Session, 41st Parliament, 31 March 2015, 1615 (Jeffrey Cyr).

[116]     HUMA, Evidence, 2nd Session, 41st Parliament, 31 March 2015, 1620 (Jeffrey Cyr).

[117]     Ibid., 1615.

[118]     HUMA, Evidence, 2nd Session, 41st Parliament, 26 March 2015, 1715 (Tim Richter). A similar example was given by Jeffrey Cyr regarding the need to track people’s lives to show certain types of outcomes: HUMA, Evidence, 2nd Session, 41st Parliament, 31 March 2015, 1615 (Jeffrey Cyr).

[119]     HUMA, Evidence, 2nd Session, 41st Parliament, 23 April 2015, 1715 (John Shields).

[120]     HUMA, Evidence, 2nd Session, 41st Parliament, 26 May 2015, 1545 (James Mulvale, Dean and Associate Professor, Faculty of Social Work, University of Manitoba).

[121]     HUMA, Evidence, 2nd Session, 41st Parliament, 26 March 2015, 1615 (Sunil Johal).

[122]     HUMA, Evidence, 2nd Session, 41st Parliament, 19 February 2015, 1620 (Stanley Hartt).

[123]     HUMA, Evidence, 2nd Session, 41st Parliament, 12 March 2015, 1530 (Andrew McNeill).

[124]     HUMA, Evidence, 2nd Session, 41st Parliament, 23 April 2015, 1625 (Sharon Mayne Devine, Chief Executive Officer, The Honourable William G. Davis Centre for Families).

[125]     HUMA, Evidence, 2nd Session, 41st Parliament, 17 February 2015, 1550 (Siobhan Harty).

[126]     Ibid., 1555.

[127]     HUMA, Evidence, 2nd Session, 41st Parliament, 12 May 2015, 1630 (Jean-Pierre Voyer).

[128]     HUMA, Evidence, 2nd Session, 41st Parliament, 12 May 2015, 1715 (Sheila Currie, Principal Research Associate, Social Research and Demonstration Corporation).

[129]     HUMA, Evidence, 2nd Session, 41st Parliament, 12 May 2015, 1635 (Jean-Pierre Voyer).

[130]     HUMA, Evidence, 2nd Session, 41st Parliament, 12 May 2015, 1530 (Dale McFee).

[131]     HUMA, Evidence, 2nd Session, 41st Parliament, 26 March 2015, 1530 (Sunil Johal).

[132]     HUMA, Evidence, 2nd Session, 41st Parliament, 12 May 2015, 1530 (Dale McFee).

[133]     HUMA, Evidence, 2nd Session, 41st Parliament, 28 April 2015, 1655 (Adam Spence).

[134]     HUMA, Evidence, 2nd Session, 41st Parliament, 26 February 2015, 1700 (Vickie Cammack, Founding Chief Executive Officer of Tyze Personal Networks and Co-Founder, Planned Lifetime Advocacy Network).

[135]     HUMA, Evidence, 2nd Session, 41st Parliament, 31 March 2015, 1620 (Jeffrey Cyr).

[136]     HUMA, Evidence, 2nd Session, 41st Parliament, 26 February 2015, 1645 (Preston Aitken, Director, Programs, Enactus Canada).

[137]     HUMA, Evidence, 2nd Session, 41st Parliament, 23 April 2015, 1625 (Sharon Mayne Devine).

[138]     HUMA, Evidence, 2nd Session, 41st Parliament, 26 February 2015, 1645 (Preston Aitken); HUMA, Evidence, 2nd Session, 41st Parliament, 24 March 2015, 1545 and 1620 (Bill Crawford); HUMA, Evidence, 2nd Session, 41st Parliament, 24 February 2015, 1625 (Wayne Chiu, Chief Executive Officer, The Trico Group).

[139]     HUMA, Evidence, 2nd Session, 41st Parliament, 19 February 2015, 1615 (Tim Jackson).

[140]     HUMA, Evidence, 2nd Session, 41st Parliament, 10 March 2015, 1605 (Sandra Odendahl); and HUMA, Evidence, 2nd Session, 41st Parliament, 19 February 2015, 1615 (Tim Jackson).

[141]     HUMA, Evidence, 2nd Session, 41st Parliament, 28 April 2015, 1610 (Kieron Boyle).

[142]     Ibid.

[143]     HUMA, Evidence, 2nd Session, 41st Parliament, 12 May 2015, 1545 (David Juppe).