Learning in Development. Olivier Serrat. Ingram. ISBN 9789290922087.
policy dialogue and reforms, and the quality of the design and monitoring framework.

       Box 13: Developing Evaluation Capacity in Developing Member Countries

      • Stability of trained staff, high-level support, and the existence of a mandate for evaluation by decree are factors that contribute to success.

      • More thorough preparation of future TA operations should ensure high-level ownership and commitment, and participation of key stakeholders in formulation and design.

      • If the conditions for public sector capacity building are not met, an assessment must determine whether the systemic or underlying problems should be addressed first.

      • Building DMC capacity requires a holistic approach, considering the needs at all levels.

      • The location of responsibility for evaluation within organizational hierarchies is also important.

      • During design and implementation of TA operations, care must be taken that performance evaluation systems do not become supply driven, complex, or too resource intensive to sustain.

      • Establishing performance evaluation systems is a means to an end—benefits are obtained when the results are used in decision making. The design of TA should include specific features to encourage, facilitate, and formalize the incorporation of evaluation results in decision making.

      • A case study approach is needed to develop staff competency and confidence to carry out evaluation.

      • For larger TA operations, a firm or institution should be recruited, rather than individuals.

      • The pace of TA should be driven by a sense of ownership and commitment in DMCs.

      • The introduction of computerized information systems is not a solution to poorly performing manual systems. Various institutional, management, and social factors need to be taken into account.

       Box 14: Perceptions of OED: Feedback from Members of the Board of Directors

Interviews with Board members revealed a general perception that OED's mission and functions are to provide independent assessment with a direct link to operations. OED is seen as collegial, dedicated, and professional. While OED has generally changed with the changing focus of ADB, there is an inevitable lag as evaluation activities adjust to new organizational thrusts.

      OED has been able to influence ADB’s operations at all levels by providing concrete recommendations based on solid and credible analysis. At the project/program level, the time lag between completion and evaluation is an issue, as evaluation findings can easily be dismissed as discussing the old way of doing things, while current practices may have changed. At the strategy and policy levels, the improved timing of country and sector assistance program evaluations has increased impact on the design of new country partnership strategies.

In its knowledge management, OED faces several interface problems. Within ADB, OED should open up channels of communication, become even more specific about actionable recommendations, and delineate accountabilities clearly. The most difficult interface is with DMCs: OED should emphasize development of evaluation capacity. In the eyes of wider clienteles, such as NGOs, civil society organizations, and the general public, OED should not only be independent, but be perceived as such. It should produce concise and insightful summaries of its work that people can access and understand easily.

      To help ADB improve its development effectiveness, Board members invited OED to

      • develop a comprehensive annual development effectiveness report—building on the Annual Evaluation Review and Annual Report on Loan and Technical Assistance Portfolio Performance—that presents a truly serious discussion of results and holds ADB’s Management accountable for what it promised to do;

      • work in ways that enhance the link between development effectiveness and resource allocation;

      • generally emphasize simplicity in project/program designs;

      • keep the focus of ADB on poverty reduction, both income and non-income;

      • further strengthen the design and monitoring framework of projects, in particular by identifying killer assumptions and risks; and

      • promote more interaction and sharing among ADB departments and offices.

      Disseminating Findings and Recommendations. Although there have been improvements, ADB is not yet a learning organization in terms of actively using the lessons documented in OED reports to improve future operations. OED is developing a better system to categorize and disseminate its findings and recommendations using information technology. However, technology by itself will not solve the problem. OED is investing resources in knowledge management to distill lessons and do a better job of disseminating them within and outside ADB. New knowledge products and services are being designed, tailored to specific audiences, in forms that present results in accessible and digestible ways. Objective indicators are being developed to assess whether ADB is becoming a learning organization by using OED findings and recommendations.

       Influential Evaluations

      Evaluations that focus on key issues and provide usable findings and recommendations in a timely manner are a cost-effective means to improve the performance and impact of policies, strategies, programs, and projects. By challenging accepted thinking, such evaluations also contribute to improving overall development effectiveness.

       Box 15: Building a Results-Based Management Framework

      Results-based management involves identifying the impact of an intervention, formulating its outcome, specifying outputs and inputs, identifying performance indicators, setting targets, monitoring and reporting results, evaluating results, and using the information to improve performance. A good quality design and monitoring framework is an integral quality-at-entry results-based management tool that (i) clearly identifies key project objectives with measurable performance indicators, (ii) establishes quantified and time-bound milestones and targets for the indicators at each level of the project, and (iii) specifies the sources of data for tracking implementation progress. Lacking one or more of these elements at entry weakens a project’s design quality.
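The three quality-at-entry criteria above lend themselves to a simple automated check. The following Python sketch is illustrative only — the class and function names are hypothetical, not part of any ADB tooling — and shows how a design and monitoring framework could be screened for missing indicators, unquantified or open-ended targets, and absent data sources:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Indicator:
    # A performance indicator for one results level of the framework.
    name: str
    target: Optional[str] = None       # quantified target, e.g. "15% increase"
    deadline: Optional[str] = None     # time-bound milestone, e.g. "by 2006"
    data_source: Optional[str] = None  # source of data for tracking progress

@dataclass
class FrameworkLevel:
    # One results level (impact, outcome, or output) and its objective.
    objective: str
    indicators: list = field(default_factory=list)

def quality_at_entry_gaps(levels):
    """Flag missing quality-at-entry elements, per the three criteria:
    (i) objectives with measurable indicators, (ii) quantified and
    time-bound targets, and (iii) identified data sources."""
    gaps = []
    for level in levels:
        if not level.indicators:
            gaps.append(f"'{level.objective}': no measurable indicators")
        for ind in level.indicators:
            if ind.target is None or ind.deadline is None:
                gaps.append(f"'{ind.name}': target not quantified and time-bound")
            if ind.data_source is None:
                gaps.append(f"'{ind.name}': no data source for tracking progress")
    return gaps
```

A framework whose outcome indicator has a target but no milestone or data source would be flagged twice, mirroring the report's point that lacking any one element at entry weakens design quality.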

      In 2003, an evaluation study on project performance management found that the quality of ADB’s design and monitoring frameworks was poor—particularly in terms of clearly documenting the impacts and outcomes that ADB is trying to achieve. In response to the findings from this evaluation, ADB’s Management developed an action plan to rectify the situation. Multiple actions were initiated to create quality assurance, mentoring, and training capability within originating departments, and these departments were given clear responsibility and accountability for quality and quality assurance. The vice-presidents of ADB’s operations departments gave instructions that frameworks needed to be improved for loans and TA operations, and directors general and directors were also required to sign off on frameworks. Recognizing that staff skills needed to be enhanced, the action plan directed that focal points be appointed in all regional departments to promote awareness, consistency, and knowledge sharing. Greater executing agency involvement in the preparation of design frameworks was also anticipated to help develop executing agency ownership further, sharpen design quality, and build understanding that the frameworks would be used as a monitoring tool.

      The Central Operations Services Office and OED both played important roles. The former engaged a framework specialist, formulated the project performance monitoring system, and administered the initial inputs of the specialist to draft guidelines and conduct training programs. In 2004, more than 300 staff members attended briefing sessions that OED delivered on framework quality. A video version of this briefing was released for use by resident missions and interested parties. In 2004, OED also responded daily to requests for help in strengthening frameworks. Nevertheless, internal quality assurance alone is unlikely to be sufficient to ensure quality. Independent checking is also needed to validate that quality assurance systems are working effectively and whether quality improvements