Impact evaluation: What works and what doesn’t

FOR DEVELOPMENT practitioners, it is important to find out which programs work and which don’t. This is where impact evaluation comes in.

Doing impact evaluation is crucial to assessing the effectiveness of antipoverty or development interventions and determining whether such efforts should be scaled up or expanded.

Effective ways of evaluating impact were discussed in a seminar hosted by PIDS and the international nonprofit organization Innovations for Poverty Action (IPA)–Philippines last Sept. 10 at the Romulo Hall of the NEDA sa Makati Building, as part of the observance of the 11th Development Policy Research Month.

Nassreena Sampaco-Baddiri, IPA country director, said impact evaluation has gained new importance amid questions over the government’s use of development funds.

Jessica Kiessel, IPA director for country programs, said impact evaluations “tell us if we are on the right track and how we can improve.” Impact evaluation is also necessary for accountability and for taking stock of lessons learned.

“Instead of asking ‘Do development programs work?’ we should be asking ‘Which works best?’ and ‘How can we scale up what works?’” said Kiessel.

Kiessel stressed two points. First, an impact evaluation should compare a treatment group with a “counterfactual” or control group to determine whether the program had an impact. Second, randomized experiments are the most credible way to make that comparison.

“We need to make comparisons. If the counterfactual group can’t be observed, we need to ‘mimic’ it…randomization is needed to make sure the counterfactual group matches what the treatment group would look like without the program,” she explained.
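Her point about mimicking the counterfactual can be sketched with a small simulation (not presented at the seminar; the data, variable names, and effect size below are hypothetical): when units are randomly assigned, the control group is comparable on average to the treatment group, so the difference in mean outcomes estimates the program’s impact.

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Hypothetical population: each household has a baseline outcome
# (e.g., a test score). The true program effect is unknown in practice;
# here we set it so we can check the estimate against it.
TRUE_EFFECT = 5.0
households = [{"baseline": random.gauss(50, 10)} for _ in range(10_000)]

# Random assignment: each household has an equal chance of treatment.
for hh in households:
    hh["treated"] = random.random() < 0.5
    # Observed outcome: only treated households receive the program's effect.
    hh["outcome"] = hh["baseline"] + (TRUE_EFFECT if hh["treated"] else 0.0) + random.gauss(0, 5)

treated = [hh["outcome"] for hh in households if hh["treated"]]
control = [hh["outcome"] for hh in households if not hh["treated"]]

# Because assignment was random, the control group mimics the counterfactual,
# so the simple difference in means is an unbiased estimate of the impact.
estimated_impact = statistics.mean(treated) - statistics.mean(control)
print(f"Estimated impact: {estimated_impact:.2f} (true effect: {TRUE_EFFECT})")
```

In this sketch the estimate lands close to the true effect; without randomization, households that chose or were chosen for the program could differ systematically from those that did not, and the same comparison would be biased.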

Kiessel noted that in Ghana, three randomized evaluations found low-cost, remedial education programs to have quick, positive impacts on literacy and numeracy.

Moreover, randomization helps ease tension between treatment and control groups.

“If properly designed and conducted, randomized experiments are the most credible method to estimate the impact of a program,” Kiessel said. “The bottom line here is that the math that we use does matter.”

PIDS President Gilberto Llanto said impact evaluation is becoming more important as research demands greater rigor. It is also needed now that the government has decided to incorporate performance measures in the budget process, he said in his closing remarks.

PIDS and IPA have agreed to a partnership to disseminate the results of impact evaluations to policymakers and the public, Llanto said.

IPA, established in 2002, ties up with researchers and organizations around the world to study which social and economic programs work and which don’t. IPA has 14 country programs, including IPA Philippines, and more than 350 research projects completed or ongoing in over 40 countries across Africa, Latin America, and Asia. IPA has been conducting research in the Philippines since 2003.
