
ABM Archive Website






Why does ABM spend money on Program Evaluations?

Gender Action Group Members welcome ABM staff in Lui River, Western Province. © Julianne Stewart/ABM 2015

In 2016, ABM funded program evaluations in Zambia, South Sudan, Papua New Guinea, and Vanuatu. Why?

The clearest reason is that evaluations provide ABM and our partner organisations with an outside person’s perspective on what has and hasn’t succeeded in a program. Evaluations also give ABM a more objective way to be accountable to our donors, whether these are organisations like the Australian government or ABM’s generous supporters.

Our partner in Zambia, the Zambia Anglican Council (ZAC), has been running a program to improve governance and gender equity in rural communities. The evaluator, who examined the program phases from 2011 to 2015, found the program had contributed to a number of significant changes. The Anglican Church had included gender-based violence and other gender-related topics in teaching aids for training the clergy and the wider church; the practice of removing girls from school for child marriage had declined; and some gender action groups had established sustainable businesses, enabling the groups to run gender awareness activities and enabling many community members to walk shorter distances to access some business services.

The evaluator did not deem the program perfect, and made several recommendations. A number of stakeholders, such as particular government offices and sectors of the community, could be engaged more closely in future to maximise their contribution. Data collection could be improved, and clearer ‘exit’ strategies need to be discussed with the gender action groups so they can remain active after the program ends. In October-November 2016, as ZAC plans the next phase of the program in consultation with ABM, we are conscious of these and other recommendations from the evaluator.

Our partner in South Sudan, the Episcopal Church of South Sudan and Sudan (ECSSS), ran a health program from January 2012 to December 2015. When the evaluator visited the program in May 2016, he found that the two health centres that had been constructed were running smoothly, financed largely by a guesthouse and restaurant that had also been built through the project. Locals were happy with the facilities and felt these had made a positive impact on their health. They were even using the health centres’ solar panels to charge their mobile phones!

Our partner in Papua New Guinea, Anglicare PNG, is running programs to improve adult literacy, increase HIV/AIDS awareness and services, and improve gender equity. Two evaluations have been conducted, and ABM has discussed the findings with Anglicare PNG. While community members felt that awareness of gender equity and HIV/AIDS had increased, data collection was again a challenge. Another recommendation concerned improving the flow of information between program staff in Port Moresby and program staff in outer regions.

Our partner in Vanuatu, the Anglican Church of Melanesia (ACOM), was one of the organisations participating in a response to Cyclone Pam. The response, run by ACT Alliance (a worldwide coalition of churches and faith-based organisations), was found to have been extremely successful overall. The evaluator found that ACOM filled a valuable gap by providing food relief in the northern islands, while most other agencies focused on the more heavily affected southern islands. One recommendation was that church organisations in general should be more thoroughly integrated into nationally planned and coordinated humanitarian responses.

ABM discusses evaluators’ recommendations with our partner organisations during planning of upcoming programs and many recommendations, like those related to monitoring and evaluation, are addressed in an ongoing way.

But it is not just the findings from program evaluations that have value. The evaluation process itself is an important tool for motivating program implementers, from community members and our partner organisations to ABM staff ourselves. People who put time and effort into running a program want to know that others care enough to examine it, and want to know what others felt was successful and unsuccessful. Visits by an evaluator are only a small part of what motivates program implementers to work hard, but communities and staff are usually proud to show off their work, and indeed there is often a flurry of activity immediately before and after an evaluator’s visit!

The evaluation process can also be an important tool for building the capacity of program implementers. A visit by an evaluator can build individuals’ public speaking skills, and the evaluator’s questions can give field staff and community members an insight into the priorities and ways of thinking of ‘outsiders’. ABM-commissioned evaluators are also required to consciously play an educating role, explaining the evaluation process to local staff who travel with them and sharing their understanding of effective approaches to community development. When ABM and evaluators visit partner organisations and communities, we constantly seek information and insights from them, so it is only fair that they have opportunities to learn from evaluation processes in return.

2016 has been a big year for evaluations. The idea is that all program implementers gain something from the evaluation process and findings.


Dr Terry Russell
ABM Program Effectiveness Officer 

