Farmanesh, Amir and Ortiz Bobea, Ariel and Sarwar, Jisha and Hasegawa, Tamiko (2006): Speaking the Same Language within the International Organizations; A Proposal for an Enhanced Evaluation Approach to Measure and Compare Success of International Organizations.
Abstract
It is currently difficult for Member States to assess and compare the success or performance of UN organizations, despite recent movements toward results-based approaches. Efforts to implement logical frameworks have been independent and uncoordinated, left largely to the discretion of individual agencies. This has led to divergent and deficient implementations of the same theoretical approach, making it almost impossible to draw any conclusions. The lack of a common approach is perceptible across agencies in the diversity of evaluation standards and terminology used to describe the same concepts, in the unevenness and diversity of staff training, and in the way intentions and results are presented. The myriad of organizations, each with some different sort of evaluation role, may be seen as a further symptom of the lack of coordination within the UN system.
The establishment of a useful and reliable evaluation process in the UN system requires three main elements: (1) a common and enhanced evaluation framework, (2) the human and organizational capacity to ensure the accurate implementation of the framework, and (3) the commitment of Member States and agencies to implement the approach.
This report mainly discusses the common evaluation framework and methodological issues, although it also provides significant insight regarding how to build the human and organizational capacity of the UN to carry out this approach.
Assessing the success of an organization entails determining three elements: mandate or mission relevance, effectiveness, and efficiency. The report provides insight into all three components of success, but its primary focus is effectiveness. Measuring effectiveness entails establishing precise targets to be reached by agencies and collecting actual results in order to assess whether those targets are being met. Indeed, assessing effectiveness amounts to comparing intentions (expressed as targets) to actual achievements (collected through monitoring). The UN Secretariat itself does not provide targets to be met by the organization. Additionally, it over-emphasizes outputs (output implementation rates) and disregards the “big picture” provided by outcomes.
Under the proposed approach, subprograms meeting most of their targets are the most effective. Programs (agencies) with a large share of effective subprograms (programs) may themselves be considered effective. To simplify and give an intuitive sense of effectiveness, each subprogram could be assigned a category or color following a “traffic light” methodology (green for satisfactory, amber for average, red for below expectations) according to the share of targets satisfactorily met. The same could be done for programs according to their share of satisfactory subprograms. Program and subprogram performance data for every agency could be centralized (by a coordinating body) in a comprehensive webpage that would facilitate comparison between similar functions or themes across the UN system [please refer to p. 27 for an elaborate illustration].
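The traffic-light scoring described above can be sketched in a few lines of code. The green/amber thresholds below (80% and 50%) are illustrative assumptions, not the report's actual cut-offs, as is the choice to rate a program by its share of green subprograms:

```python
# A minimal sketch of the proposed "traffic light" methodology.
# The thresholds (green >= 0.8, amber >= 0.5) are assumed for illustration.

def traffic_light(share_met, green=0.8, amber=0.5):
    """Map a share of targets satisfactorily met to a color category."""
    if share_met >= green:
        return "green"   # satisfactory
    if share_met >= amber:
        return "amber"   # average
    return "red"         # below expectations

def subprogram_score(targets_met, targets_total):
    """Rate a subprogram by the share of its targets that were met."""
    return traffic_light(targets_met / targets_total)

def program_score(subprogram_colors):
    """Rate a program by its share of satisfactory ("green") subprograms."""
    share_green = subprogram_colors.count("green") / len(subprogram_colors)
    return traffic_light(share_green)

colors = [subprogram_score(9, 10), subprogram_score(6, 10), subprogram_score(2, 10)]
print(colors)                 # ['green', 'amber', 'red']
print(program_score(colors))  # 'red' (only 1 of 3 subprograms is green)
```

The same aggregation could be repeated one level up, rating agencies by their share of satisfactory programs, which is what would make system-wide comparison on a central webpage feasible.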
The report also suggests the possibility of complementing this objective approach with a perception survey. Despite the significant limitations of such a subjective approach, it is still widely used and gives an idea of which organizations are best regarded by their peers. Contrasting actual performance data with perception indicators could be revealing, and could shed light on areas where the objective methodology may fall short.
One of the most important recommendations concerns the organizational capacity to ensure the accurate implementation of the evaluation approach. This capacity should be embodied by a centralizing coordinating body (perhaps under the CEB) that would (1) ensure common evaluation training and support for UN staff and uniformity of standards (terminology, methods, etc.), (2) centralize performance data gathered from agencies in a common database and present results in a user-friendly manner where programs and agencies could be compared, and (3) verify the validity of the data submitted by agencies (performance auditing).
Item Type: | MPRA Paper |
---|---|
Original Title: | Speaking the Same Language within the International Organizations; A Proposal for an Enhanced Evaluation Approach to Measure and Compare Success of International Organizations |
Language: | English |
Keywords: | Evaluation Approaches in International Organizations; Perception Survey; Applied Log-Frame; Score Card; Efficiency and Effectiveness of United Nations Agencies |
Subjects: | P - Economic Systems > P4 - Other Economic Systems > P47 - Performance and Prospects H - Public Economics > H4 - Publicly Provided Goods > H43 - Project Evaluation ; Social Discount Rate L - Industrial Organization > L3 - Nonprofit Organizations and Public Enterprise > L31 - Nonprofit Institutions ; NGOs ; Social Entrepreneurship O - Economic Development, Innovation, Technological Change, and Growth > O1 - Economic Development > O19 - International Linkages to Development ; Role of International Organizations |
Item ID: | 15146 |
Depositing User: | Amir Farmanesh |
Date Deposited: | 09 May 2009 17:52 |
Last Modified: | 06 Oct 2019 04:22 |
URI: | https://mpra.ub.uni-muenchen.de/id/eprint/15146 |