Evaluation and organizational learning
NGOs aspire to become better learning organizations, and their leaders, equally, aim to become better learners.
NGOs sometimes need to measure success throughout their organizations
A group of InterAction member INGOs felt the need to learn lessons from the significant efforts and resources they had invested toward developing their capacity to measure results at the agency level. They needed an independent perspective on whether these efforts had been worthwhile and well-directed.
Tosca co-authored a white paper to capture and analyze NGO experiences with agency-level measurement. The authors shared this with the NGO community, including an executive brief for CEOs of InterAction members.
The participating NGOs were able to compare their approaches and experiences with those of their peers. NGOs that had not yet attempted to measure results at the agency level learned from their peers' experiences and thus avoided some costly mistakes.
NGOs need to evaluate at the strategy level
A major child-sponsorship NGO had chosen a human rights-based approach as a strategy but did not know what effect, if any, this new approach had on sector-level outcomes in its programming.
Tosca led an evaluation team at the Maxwell School that undertook a meta-level evaluation of the effects of a human rights-based approach on the NGO's outcomes in three sectors.
The NGO learned that implementation of the new approach was uneven and that capacity needed to be increased substantially to produce the intended outcomes.
A faith-based NGO needed to strengthen its staff's evaluation skills
The Director of Evaluation and Learning had a solid draft staff training plan but needed a mentor with whom to test ideas about content and adult learning.
Tosca was hired to mentor the Director and ensure that training activities were designed not only for evaluation-technical staff but also took into account the perspectives, worldviews, and constraints of the organization's ultimate users and decision makers.
The training roll-out led not only to stronger technical skills in evaluation but also to greater and more appropriate uptake of the evaluation data generated.
A funders’ collaborative focused on transparency and accountability needed to commission an external evaluation of a technically complex set of interventions
The team lead assigned to this external evaluation had deep evaluation expertise but insufficient prior exposure to the transparency and accountability context in civil society.
Tosca was engaged to provide the needed grounding in the transparency and accountability practitioner discourse.
The evaluation was able to offer more strategic recommendations that were also contextualized for the sub-sector.