Staff Response to the External Evaluation of the Independent Evaluation Office

Executive Board Meeting

1. Some five years after the Independent Evaluation Office (IEO) was established, it is appropriate, as was envisaged then, to consider how the IEO is functioning, what it has achieved, and how it can be improved. The perspective of a disinterested, external panel can be valuable in this regard, and the present report will surely contribute to a lively Board discussion of this important topic. Staff appreciate the Panel’s attention to maintaining a high quality IEO staff and improving the quality and effectiveness of IEO reports.


2. Staff comments are centered on three main aspects:1 (i) the IEO’s mandate and relationship to the staff’s work program; (ii) follow-up to IEO findings and implementation of its recommendations; and (iii) modalities of IEO operation. The Panel also makes several suggestions related to the internal management of the IEO; as the IEO is independent and accountable to the Board, we refrain from commenting on these, though naturally we welcome suggestions that would further enhance the IEO’s, and therefore the Fund’s, effectiveness.

IEO mandate and relationship to staff work program

3. The IEO was established to complement the review and evaluation work within the Fund. As the Panel rightly states, “The Board did not intend the IEO to displace the review function of PDR [Policy Development and Review Department] or external ad hoc panels. Management needs the capacity to conduct reviews of issues it deems important to the institution.” We were concerned by the (mis)perception that staff have a tactic of “front-running” the IEO to “marginalize its impact.” Not only is this perception factually incorrect (¶1), it is completely at odds with our perspective that the IEO makes a vital contribution to the Fund’s learning culture, external credibility, and effectiveness by complementing, rather than substituting for, the staff’s own efforts.

4. Staff reviews typically originate in requests made by the IMFC, guidance from the Board during the semi-annual discussions of the work program, and Board decisions or Fund policy. Management cannot abdicate its responsibility for conducting mandated, requested, or needed reviews and—since the IEO sets its own work program and timetable, conducting its evaluations independently of Fund management and the Board—cannot delegate such reviews to the IEO.

5. Even if the IEO chooses to examine a topic that has been, is being, or will be reviewed by the staff, this does not necessarily imply wasteful duplication of effort. Inasmuch as the IEO reaches conclusions similar to the staff’s, this enhances the external credibility of staff reviews; and if it does not, then the fresh perspective of the IEO is in itself valuable. But the IEO can, does, and should do much more than tread the same ground as staff. As the Panel rightly observes, the IEO can examine issues that the staff cannot—internal Fund decision-making, governance issues, and shareholder interventions. Moreover, the IEO may be better positioned to reflect the views of national authorities, who may be reluctant to be as candid with staff. Thus, there may be important synergies and complementarities between the work of the staff and the IEO. In our view, it would have been more useful if the Panel had discussed how these complementarities can best be exploited—without, of course, prejudicing the IEO’s independence.

6. Nevertheless, while IEO evaluations bring benefits, these must be weighed against their costs. We therefore concur with the Panel’s recommendation that every IEO report should state clearly why the scarce resources of the IEO should be deployed for that evaluation. But we were disappointed that the Panel itself did not undertake a more careful cost-benefit analysis of the IEO, rather than simply asserting that it is not a costly operation.2 In particular, the Panel takes no account of the substantial staff costs involved in responding to IEO requests for data, interviews, and documents; reading reports, checking their factual accuracy, and commenting upon them; and preparing staff and management responses. Indeed, the report does not even compare the purely budgetary costs of IEO evaluations relative to staff reviews (¶3).

7. One area where the cost of IEO reports may far exceed the benefits concerns evaluations of members with an ongoing Fund-supported program. The IEO’s TOR proscribe its interfering with operational activities, including programs.3 This is for good reasons. First, an evaluation of an ongoing program is likely to be premature, as it necessarily cannot take account of outcomes and thereby assess whether program targets were achieved; such evaluations could also enormously complicate program negotiations. Second, such an evaluation might undermine the authorities’ ability to implement agreed policies. These concerns do not mean, however, that as a general rule the IEO should not evaluate any country with an ongoing Fund-supported program. Rather, after a decent interval, the IEO could review a member’s previous Fund-supported program (in effect, its own ex post assessment) where such a review would yield beneficial lessons for other members.

8. As the report acknowledges, the Indonesian authorities felt that the IEO report created serious embarrassment for the government, and the political opposition argued that the Indonesian government should not continue its Fund-supported program. This is confirmed by Fund staff, who had to play an active role in containing the damage related to the release of the IEO report (as well as to the subsequent outreach by the IEO in Indonesia). The IEO report on Argentina likewise complicated an already difficult and tense situation, requiring considerable effort by staff to manage. If an IEO evaluation of an ongoing program weakens the member’s relationship with the Fund or the authorities’ ability to implement policies—which both logic and experience suggest it could—then this would defeat the very purpose of the IEO, which ultimately is to enhance the Fund’s effectiveness in providing service to its membership.

Follow-up to IEO reports and implementation of recommendations

9. The Panel observes that IEO evaluations often contain “a confusing combination of many ‘conclusions’, ‘findings’, and ‘recommendations’.” We agree that this is a problem which has, at times, weakened the IEO’s messages.

10. We take a somewhat different view from the Panel, however, when it notes that “there is no formal mechanism for the Board to follow up specific recommendations made by the IEO” (page 26). First, not all IEO recommendations are endorsed by the Board. Second, if the Board does adopt a recommendation, it becomes part of Fund policy—no different from any other. Management therefore has the responsibility to ensure that relevant staff implement that policy properly, and is accountable to the Board, and ultimately to the membership, for so doing. Periodic staff reviews of Fund policies, such as those on PRSPs, FSAPs, and TA, cover implementation experience, evaluations, and proposals for improvements.4 It is the responsibility of the Board to ensure that relevant IEO recommendations are implemented, rather than for the IEO itself to monitor their implementation and report back to the Board. Of course, the IEO may choose to re-examine a past evaluation topic, including to assess whether its Board-endorsed recommendations have been effective. But presumably it should do so in the context of its proposed work program, so that the appropriate priority between a proposed re-evaluation and other possible evaluation topics may be established.

11. The Panel does not think that the IEO needs to cost its recommendations. Staff, on the other hand, are required by the Board to cost their proposals and to identify savings to finance them. We do not think that the Board can make an informed decision concerning an IEO recommendation without knowing its implementation costs. One option would be for the IEO to prepare cost estimates with the help of the Office of Budget and Planning (OBP), as is the case for staff papers. A more efficient alternative would be for the IEO to refrain from making specific recommendations and instead present findings to the Board, leaving it to Fund management and the Board to identify appropriate solutions.

Modalities of IEO Operation

12. The Panel makes a number of recommendations for improving the operation of the IEO. Here we comment on only a few key recommendations that have implications for the IEO’s relations with the rest of the Fund.

13. The Panel recommends that the Board reconsider management’s memorandum of April 16, 2002 on IEO access to confidential communications with the Office of the Managing Director. However, the guidance in the memorandum was welcomed, and found fully consistent with the IEO’s TOR, by the Evaluation Committee and the Executive Board in 2002 (EBD/02/66; 4/18/02). Furthermore, the Board adopted the very same guidance for the offices of Executive Directors. We are also not aware of any instance of staff knowingly withholding a pertinent document in their possession requested by the IEO, and the envisaged mediation procedures have never been called upon. This guidance therefore does not appear to have restricted IEO access to information.

14. The Panel recommends that the Board and its Evaluation Committee decide the timing of the Board discussion of an IEO report. Board dates for IEO reports are fixed by the Secretary’s Department following the same guidelines that apply to staff reports and subject to the same constraints (e.g., the three-week circulation period for policy papers). However, the preparation of IEO reports differs importantly from that of staff reports: management has read and approved a final staff report when it is circulated to the Board, which is not the case for IEO reports. Consequently, when an IEO paper is circulated to the Evaluation Committee, staff and management must read and absorb its contents and then prepare staff and management replies. While these responses should be prompt, recommendations often have serious implications for the functioning of the Fund and need to be considered carefully. Further, the Board also needs adequate time to reflect on these replies before the Board meeting. In our view, a more desirable procedure would have the IEO send its reports to SEC for circulation to the Board, as is the practice for staff reports, with SEC circulating the report once management’s reply is ready. The Board meeting would take place thereafter, with reasonable time for Directors and capitals to reflect upon both the IEO’s and management’s views.

15. Finally, although not included in the Panel’s report, in our view there are some important procedural issues regarding IEO operations that merit consideration:

  • Based on issues that arose in the cases of the evaluations of capital account crises, Argentina, Jordan, and multilateral surveillance, a publication policy for IEO reports needs to be established. In our view, the Board or the IEO itself should adopt a publication policy for IEO reports which, like the Fund’s transparency policy, sets rules under which factual corrections and deletions may be made.

  • Further clarification is needed between staff and the IEO on the procedure for commenting on IEO reports. In the past, the IEO has requested comments on its reports directly from all departments, which has led to considerable confusion and inefficient work practices. Instead, the IEO should circulate its reports to management (or its designated delegate), which would then coordinate comments from all departments and serve as the central contact between staff and the IEO for a particular project.

  • It would be useful for staff to be represented along with the IEO during Board discussions, so as to be able to respond to Directors’ questions on either Fund operations or the staff’s response to the evaluation report. As regards the summing up, it is the responsibility of the Secretary (SEC) to ensure that the summing up accurately reflects the views of Executive Directors, whether expressed in “grays” or during the discussion. Directors have the opportunity to review the summing up prior to finalization to check that it is accurate.

ATTACHMENT

Factual Comments on the Report on the External Evaluation of the IEO

This note focuses on aspects of the report in which we believe important facts have been omitted or full context has not been supplied, which could potentially mislead the audience.

1. The report asserts (page 8) that “Executive Directors commented on what they saw as a deliberate strategy on the part of PDR to front-run the IEO”. The final report also says, “PDR’s tactic of front-running the IEO seems designed to marginalize the impact of the IEO.” A review of past topics considered by the IEO shows that three evaluations—Prolonged Use of IMF Resources, Fiscal Adjustment in Fund-Supported Programs, and The IMF’s Approach to Capital Account Liberalization—have no obvious immediately preceding Board papers, although all three topics have been studied by staff at some point. Two Board papers on capital account crises were completed quickly after the Asian crisis (EBS/98/202; 11/25/98 and SM/01/43; 8/3/01), while the IEO began operations only in mid-2001. As for Argentina, staff began drawing lessons in early 2002, in part based on management-initiated task forces. Early reflections were contained in the selected issues paper for Argentina’s 2002 Article IV consultation (SM/02/385; 12/17/02), long before the IEO announced its intentions. The final product was issued to the Board in October 2003, while the IEO completed its Argentina evaluation in July 2004. In addition, the Board has long mandated regular policy reviews on various topics, such as PRSPs, FSAPs, TA, and surveillance—topics that the IEO has also taken up. The staff’s Ex Post Assessment of Jordan was required by the policy on Ex Post Assessments, itself a follow-up to the IEO’s Prolonged Use evaluation. As regards the claim of similarity between the IEO’s and PDR’s work programs for 2006, the Director of PDR informed the evaluation team in January 2006 that PDR’s work program beyond April 2006 had not yet been defined, pending completion of the Medium-Term Strategy and further guidance from the Spring 2006 IMFC meeting.

2. Cost comparisons to the total budgets for PDR and RES departments are not relevant (page 7) because these departments have mandates and operational activities that extend well beyond evaluations—including policy development and country review for PDR and multilateral surveillance (WEO) and research for RES. The appropriateness of the budget comparison with OIA is also doubtful given the acknowledged (page 8) absence of overlap with the IEO and clear division of labor.

3. With regard to the cost effectiveness of the IEO, what is not said in the report is as important as what is said. For instance, the cumulative IEO budgets for FY2002-05 totaled about US$13.8 million, while the total number of evaluations issued to the Board was 10, implying an average cost per evaluation of almost US$1.4 million. It would be informative to compare these average costs to the average cost per report of other evaluation offices. Both staff and the IEO have undertaken ex post assessments (EPAs) of Jordan, which allows a direct value-for-money comparison by the external evaluators. In the Review of EPA (SM/06/115; 3/21/06), staff estimated the total cost of the 32 EPAs prepared by staff at US$3.8 million, or around US$0.1 million per report. This calculation suggests a potentially large cost difference in the preparation of such reports and could have implications for IEO operations if the EPA function and budget were transferred to the IEO, as suggested on page 9.
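The per-report averages cited in this paragraph can be reproduced with a simple back-of-the-envelope calculation. The sketch below is purely illustrative; the budget totals and report counts are those stated in the text, and the rounding conventions are our own:

```python
# Illustrative check of the per-report cost figures cited above.
# Inputs are the figures stated in the text (US$ millions).

ieo_budget_total = 13.8   # cumulative IEO budgets, FY2002-05
ieo_reports = 10          # evaluations issued to the Board

epa_cost_total = 3.8      # total cost of 32 staff-prepared EPAs (SM/06/115)
epa_reports = 32

ieo_cost_per_report = ieo_budget_total / ieo_reports   # 1.38, i.e. "almost US$1.4 million"
epa_cost_per_report = epa_cost_total / epa_reports     # about 0.12, i.e. "around US$0.1 million"

print(f"IEO: US${ieo_cost_per_report:.2f} million per evaluation")
print(f"EPA: US${epa_cost_per_report:.2f} million per report")
print(f"Ratio: {ieo_cost_per_report / epa_cost_per_report:.1f}x")
```

On these figures, an IEO evaluation costs on the order of ten times as much as a staff-prepared EPA, which is the "potentially large cost difference" referred to above.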

4. The report’s claim of a continuing Fund-supported program with Argentina after the release of the IEO report in July 2004 is not wholly accurate. While the stand-by arrangement with Argentina was only cancelled at the authorities’ request in early 2006, no program reviews were completed after March 2004 and the Fund-supported program was off-track. The report also says that Argentina had “a new government coming when the [IEO] report was released.” The IEO report on Argentina was circulated to the Board in July 2004; President Kirchner was elected in May 2003.

5. The Panel’s recommendation that the Board and its Evaluation Committee should decide the timing of the Board discussion of an IEO report is based solely on the scheduling experience with two cases—Argentina and Jordan—out of ten IEO reports. As regards the IEO report on Argentina, it is our understanding that all parties sought to discuss this important report on the earliest possible date. Any Executive Director could have requested a postponement in the Board date for this discussion had that Director thought it necessary, but none did. Turning to the Board discussion of the IEO report on Jordan, this was not scheduled near a Board recess. It was placed on the Board agenda for the same day as the staff’s EPA report on Jordan, but as a separate agenda item, following the usual practice of scheduling similar topics on the same day—to tap obvious synergies, typically yielding better discussions and promoting efficiency.

6. The report notes (page 7) that the IEO employs only twelve staff (ten professional staff and two support staff). However, the report does not mention the large number of consultants and contractual support staff hired by the IEO.

7. The report states that technical assistance “was regarded by staff and governments as wasteful and misdirected” (page 15). However, no evidence is offered to support this statement, and it contrasts markedly with the positive assessments of the usefulness and effectiveness of Fund TA, including the findings of the IEO report on TA.

1/ The attachment to the staff response summarizes some key factual points, made on the Panel’s draft report, which have not been taken into account in the final version. Cross-references to this attachment are indicated by its associated paragraph (¶) numbers.

2/ Cost comparisons provided in the report to the total budgets of PDR and RES departments are not relevant (¶2).

3/ Although the Panel believes that the IEO should conduct evaluations regardless of whether there is an ongoing program, its suggested changes to the wording of the IEO’s TOR do not include changing the provision that prohibits the IEO from “interfering with operational activities, including programs, or attempting to micro-manage the institution.”

4/ For example, the Board will shortly be discussing a staff review of Ex Post Assessment (EPA) policy, which was itself instituted following the IEO’s first evaluation report.