How to Align Incentives and Outcomes: Supporting the Creation of the Impact-Linked Finance Fund


Type of RFE Engagement: Methodological support

IPA’s Right-Fit Evidence (RFE) Unit provided methodological support to help the Impact-Linked Finance Fund (ILFF) set up an approach to defining outcomes-based payment metrics. The goal of this approach is to select payment metrics that accurately measure outcomes, incentivizing implementers to achieve as much impact as they can. This guidance enabled fund managers to structure results-based contracts with a feasible and cost-effective verification approach, ensuring meaningful impact for students.

The Impact-Linked Finance Fund (ILFF), initiated by the Jacobs Foundation and the Swiss Agency for Development and Cooperation, aims to align incentives for social enterprises, enabling them to prioritize social impact alongside profit. ILFF required accurate measurement of the final outcomes linked to payments, a task complicated by the high cost and long timelines of traditional randomized controlled trials (RCTs).

ILFF sought IPA's expertise in rigorous impact measurement and decision-driven data collection to identify feasible outcome-measurement strategies that still create strong incentives for impact. To do this, the RFE Unit explored various methods to measure impact and analyzed the trade-offs between cost and rigor. The engagement focused on three key areas, tailored to the unique context of each implementer: 

  1. Payment Metrics: Ideal impact-linked finance metrics focus on final outcomes like literacy or numeracy. When logistical or cost constraints prevent this, early outcomes such as changes in student or teacher knowledge or practices can be used if they are known to directly lead to final learning outcomes.
  2. Data Sources: Pre-existing data, such as national exam scores, should be used whenever possible to save time and resources when assessing learning outcomes. Where new data collection is needed, remote methods show promise as a lower-cost option.
  3. Evaluation Strategy: RCTs provide the highest certainty around impact estimates. The next best option is a well-designed quasi-experiment, which can minimize bias when supported by a rich set of covariates, though quasi-experiments can be more costly than RCTs for the same statistical power. Simple before-and-after comparisons are generally unreliable for estimating impact.
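
To make the cost-for-power trade-off in point 3 concrete, here is a rough back-of-envelope sketch using the standard normal-approximation sample-size formula. The 1.5 "design effect" applied to the quasi-experimental design is an illustrative assumption for exposition, not a figure from the engagement:

```python
from math import ceil
from statistics import NormalDist  # Python standard library


def n_per_arm(effect_size, alpha=0.05, power=0.8, design_effect=1.0):
    """Approximate sample size per arm to detect a standardized effect
    in a two-arm comparison of means (normal approximation), inflated
    by a design effect >= 1 for less efficient designs."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    n = 2 * (z_alpha + z_power) ** 2 / effect_size ** 2
    return ceil(design_effect * n)


# Detecting a 0.2 SD learning gain at 80% power, 5% significance:
n_rct = n_per_arm(0.2)                       # randomized assignment
n_quasi = n_per_arm(0.2, design_effect=1.5)  # hypothetical inflation for a matched design
print(n_rct, n_quasi)  # → 393 589
```

The point of the sketch is simply that any inefficiency in the identification strategy translates directly into more data collection, and therefore higher verification cost, for the same statistical certainty.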

The RFE Unit provided an actionable guide that leads ILFF contract designers through a series of questions to identify the combination of data source, payment-linked outcome, and causal identification strategy that offers the best balance of credibility and cost for a given implementer. RFE also worked directly with ILFF team members, providing technical assistance in applying this guidance to an initial set of implementers. Going forward, the guide can help both ILFF and other decision-makers in the results-based funding space structure results-based contracts intentionally, in a way that maximizes impact and incentives.
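
The kind of question-driven decision logic such a guide encodes can be sketched as a small function. The questions, option labels, and branching below are hypothetical illustrations of the pattern, not the content of ILFF's actual guide:

```python
def recommend_design(has_admin_data: bool,
                     final_outcome_measurable: bool,
                     randomization_feasible: bool) -> dict:
    """Hypothetical walk through simplified contract-design questions,
    returning a (data source, payment metric, strategy) combination."""
    # Q1: can an existing data source (e.g., national exams) be reused?
    data_source = ("pre-existing data, e.g. national exam scores"
                   if has_admin_data else "new (possibly remote) data collection")
    # Q2: can payments be tied to a final outcome, or only an early one?
    metric = ("final outcome, e.g. literacy or numeracy"
              if final_outcome_measurable
              else "early outcome with a known link to learning")
    # Q3: is random assignment feasible, or is a quasi-experiment needed?
    strategy = ("RCT" if randomization_feasible
                else "quasi-experiment with a rich set of covariates")
    return {"data_source": data_source,
            "payment_metric": metric,
            "strategy": strategy}


print(recommend_design(has_admin_data=True,
                       final_outcome_measurable=True,
                       randomization_feasible=False))
```

Encoding the guidance this way makes the trade-offs explicit and repeatable across implementers, which is the core value of a question-driven guide over ad hoc contract design.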

Implementing Partner

Swiss Agency for Development and Cooperation

Funding Partner

Jacobs Foundation