Public Investment Management Assessment - Review and Update

"Public Investment Management Assessments (PIMAs) are the IMF‘s key tool for assessing infrastructure governance over the full investment cycle and supporting economic institution building in this area. The PIMA framework was first introduced in the 2015 Board Paper on “Making Public Investment More Efficient,” as part of the IMF’s Infrastructure Policy Support Initiative (IPSI). A key motivation for its development has been that strong infrastructure governance is critical for public investment to spur economic growth. PIMAs offer rigorous assessment of infrastructure governance, that is, the key public investment management (PIM) institutions and processes of a country. On the basis of the PIMAs conducted to date, this paper summarizes the lessons learned and updates the assessment framework itself. PIMAs summarize the strengths and weaknesses of country public investment processes, and set out a prioritized and sequenced reform action plan. The PIMA framework has been well-received by member countries, with over 30 PIMAs conducted to date (mainly in emerging markets (EMs) and low income developing countries (LIDCs), and a pipeline of new requests in place; eight PIMAs have been or are about to be published. The PIMAs conducted show that there is much room for strengthening PIM, with weaknesses spread across the investment cycle. The results and recommendations of several PIMAs have been used in IMF lending, surveillance, and capacity development (CD) work, and have improved support and coordination among CD providers. While leaving the structure of the 2015 framework unchanged, the revised PIMA framework highlights some critical governance aspects more prominently. In particular, it brings out more fully some key aspects of maintenance, procurement, independent review of projects, and the enabling environment (e.g., adequacy of the legal framework, information systems, and staff capacity). Yet, the revised PIMA retains the key features of the 2015 framework, including the three-phase structure (planning, allocation, and implementation) with five institutions assigned to each phase, three dimensions under each institution, and three possible scores under each dimension (i.e., not/partially/fully met). The revision has benefitted from extensive stakeholder feedback, including from IMF teams, World Bank staff, and country authorities."

Abstract

"Public Investment Management Assessments (PIMAs) are the IMF‘s key tool for assessing infrastructure governance over the full investment cycle and supporting economic institution building in this area. The PIMA framework was first introduced in the 2015 Board Paper on “Making Public Investment More Efficient,” as part of the IMF’s Infrastructure Policy Support Initiative (IPSI). A key motivation for its development has been that strong infrastructure governance is critical for public investment to spur economic growth. PIMAs offer rigorous assessment of infrastructure governance, that is, the key public investment management (PIM) institutions and processes of a country. On the basis of the PIMAs conducted to date, this paper summarizes the lessons learned and updates the assessment framework itself. PIMAs summarize the strengths and weaknesses of country public investment processes, and set out a prioritized and sequenced reform action plan. The PIMA framework has been well-received by member countries, with over 30 PIMAs conducted to date (mainly in emerging markets (EMs) and low income developing countries (LIDCs), and a pipeline of new requests in place; eight PIMAs have been or are about to be published. The PIMAs conducted show that there is much room for strengthening PIM, with weaknesses spread across the investment cycle. The results and recommendations of several PIMAs have been used in IMF lending, surveillance, and capacity development (CD) work, and have improved support and coordination among CD providers. While leaving the structure of the 2015 framework unchanged, the revised PIMA framework highlights some critical governance aspects more prominently. In particular, it brings out more fully some key aspects of maintenance, procurement, independent review of projects, and the enabling environment (e.g., adequacy of the legal framework, information systems, and staff capacity). Yet, the revised PIMA retains the key features of the 2015 framework, including the three-phase structure (planning, allocation, and implementation) with five institutions assigned to each phase, three dimensions under each institution, and three possible scores under each dimension (i.e., not/partially/fully met). The revision has benefitted from extensive stakeholder feedback, including from IMF teams, World Bank staff, and country authorities."

Background

1. The PIMA forms part of the IMF’s suite of fiscal governance assessment tools. In addition to the PIMA, these include the Fiscal Transparency Evaluation (FTE), the Public-Private Partnership Fiscal Risks Assessment Model (PFRAM), Fiscal Stress Testing, and the Balance Sheet Approach (Figure 1). These tools have been developed or updated since 2012 to help countries address a variety of governance issues, particularly costs and risks. The tools complement each other, and, when taken together, provide systematic support for evidence-based decision making to enhance fiscal governance.

Figure 1. IMF Fiscal Governance Assessment Tools

2. The Public Investment Management Assessment (PIMA) framework was introduced in April 2015 in a Board paper on “Making Public Investment More Efficient.”1 Since then, the PIMA has become a key tool for helping Fund member countries strengthen the efficiency and effectiveness of their public investment, with over 30 assessments conducted to date. PIMAs are an integral part of the IMF’s Infrastructure Policy Support Initiative (IPSI)2 that supports the implementation of the 2015 Addis Ababa Action Agenda and the infrastructure-related Sustainable Development Goals (SDG).3 PIMAs also support the initiative under the Compact with Africa to strengthen investment in the region, which was endorsed by the G-20 in March 2017.

3. The 2015 Board paper finds that public investment can be an important catalyst for economic growth, but the benefit of ramped-up investment depends crucially on its efficiency. Public investment remains a top priority for governments to support growth. The increase in public investment in EMs and LIDCs has partially closed the gap between richer and poorer countries in terms of the quality of, and access to, social infrastructure and, to a lesser extent, economic infrastructure. However, the average country loses about 30 percent of the value of its investment to inefficiencies in its public investment processes. Improvements in infrastructure governance can help countries close up to two-thirds of that “efficiency gap.”4 The growth dividend from doing so can be large: the most efficient investors get twice the growth “bang” for their investment “buck” compared with the least efficient investors.
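As a quick, hedged illustration of the arithmetic behind these figures: the 30 percent average loss and the two-thirds closable share are the paper’s estimates; everything else in the sketch below is purely illustrative.

```python
# Illustrative arithmetic for the efficiency-gap figures cited above.
# Assumption: the "efficiency gap" is expressed here as the share of investment
# value lost to weak PIM processes; the figures are the paper's averages.

avg_efficiency_gap = 0.30   # ~30 percent of investment value lost, on average
closable_share = 2 / 3      # up to two-thirds of the gap closable via better PIM

closable_gap = avg_efficiency_gap * closable_share
remaining_gap = avg_efficiency_gap - closable_gap

print(f"Closable portion of the gap: {closable_gap:.0%} of investment value")   # ~20%
print(f"Residual gap after reform:   {remaining_gap:.0%} of investment value")  # ~10%
```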

4. The PIMA framework helps to improve infrastructure governance by identifying strengths and weaknesses of country practices and providing targeted recommendations. The PIMA evaluates infrastructure governance using 15 key institutional features across the three stages of the public investment cycle: (i) planning public investment; (ii) allocating public resources to sectors and projects; and (iii) implementing productive public assets. PIMAs assess each PIM institution5 from three perspectives: design (“de jure” perspective), effectiveness (“de facto” perspective), and reform priority (relative importance in the country’s context) (Box 1). Recommendations are presented as a sequenced reform action plan with clear priorities, specific timelines and key actors. PIMAs are conducted as part of the IMF’s CD mandate; all member countries are eligible, and PIMAs are conducted based on member country requests.
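To make this structure concrete, the sketch below represents one institution and its scoring as a small data structure. It is a minimal illustration, not the official PIMA scoring methodology: the numeric mapping of not/partially/fully met to 1/2/3, the simple averaging of dimension scores, and the dimension names used for Project Appraisal are all assumptions made for illustration.

```python
from dataclasses import dataclass
from statistics import mean

# Illustrative mapping of the three possible ratings to numbers
# (an assumption, not the official PIMA scale).
SCORE = {"not met": 1.0, "partially met": 2.0, "fully met": 3.0}

@dataclass
class Dimension:
    name: str
    design: str         # de jure rating: "not met" / "partially met" / "fully met"
    effectiveness: str  # de facto rating, same scale

@dataclass
class Institution:
    name: str
    phase: str  # "planning", "allocation", or "implementation"
    dimensions: list

    def design_score(self) -> float:
        # Institution score as the average of its dimension ratings (illustrative rule).
        return mean(SCORE[d.design] for d in self.dimensions)

    def effectiveness_score(self) -> float:
        return mean(SCORE[d.effectiveness] for d in self.dimensions)

# Hypothetical ratings for Institution 9 (Project Appraisal); dimension names are paraphrased.
appraisal = Institution(
    name="Project Appraisal",
    phase="allocation",
    dimensions=[
        Dimension("Rigorous technical, economic, and financial analysis", "fully met", "partially met"),
        Dimension("Standard appraisal methodology and guidance", "fully met", "partially met"),
        Dimension("Publication of appraisal results", "partially met", "not met"),
    ],
)
print(f"Design: {appraisal.design_score():.2f}, Effectiveness: {appraisal.effectiveness_score():.2f}")
```

Separating the de jure and de facto ratings in this way mirrors the framework’s distinction between how institutions are designed and how they actually perform.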

5. This paper lays out key findings and lessons of the PIMAs conducted and sets out some revisions to the original framework. The paper is structured as follows. Section II summarizes experiences with the initial PIMAs. Section III presents revisions to the PIMA framework and discusses the rationale for the updates. Section IV outlines the plan for the future implementation of PIMAs and resource implications.

Box 1. The 2015 PIMA Framework

The PIMA evaluates infrastructure governance using 15 institutions that cover the three stages of the public investment cycle: planning, allocation, and implementation (Figure 2). Most of the PIMAs conducted assess institutions from three perspectives: institutional design, effectiveness, and reform priority:

Figure 2. The 2015 PIMA Framework

1. Design (de jure): Are formal institutional requirements in place? Assessments check how well investment institutions (such as public investment rules, instruments, legal and regulatory procedures, and standardized roles) are designed compared to “good international practice.” Each institution is scored according to a three-level classification (low, medium, high). For example, if a country has rules and procedures requiring that major projects be systematically subject to rigorous technical, economic, and financial analysis, and that selected results of this analysis be published, it would receive a high score in the design for this institution (Institution 9: Project Appraisal).

2. Effectiveness (de facto): Are institutions performing adequately? Here, PIMAs assess how public investment laws, instruments, and legal and regulatory procedures are implemented in practice. Following up on the previous example, a country may have formal requirements for major projects to be selected and evaluated according to set criteria, but, in practice (de facto), these analyses may not be performed systematically or to the intended standard. If the quality of these analyses is poor, or if they are performed only sporadically, the country would receive a medium or low effectiveness score for this institution.

3. Reform priority: What should be a country’s reform priorities across the various public investment institutions? Each PIMA provides prioritized recommendations in the form of a reform plan that is country-specific, well targeted, and sequenced over time. For example, addressing a low score on project selection and appraisal would be a high priority reform, given their importance for efficiency.

Source: PIMA reports.

PIMA Findings

6. This paper takes stock of and analyzes the PIMAs conducted since 2015 in 30 countries.6 The PIMAs conducted have covered a wide range of countries from Africa (13), Asia and the Pacific and Europe (5 each), Western Hemisphere (4), and Middle East and Central Asia (3)—mainly EMs and LIDCs, and one Advanced Economy (AE). Most PIMAs have been conducted in cooperation with other partners: the World Bank (24), IADB (2), and ADB (1). The analysis here is based on PIMA scores, which provide a basis for comparing countries and country groups. PIMA scores are not released for countries that have not agreed to publication. PIMAs for six countries have been published (Botswana, Ireland, Jordan, Kosovo, Liberia, and Mali); two more are in the process of being published (Benin and Brazil). Publication is voluntary, and while several countries have agreed to publish their PIMAs, many have preferred to use the PIMA for internal deliberations and to guide their reforms. Similar to CD reports, PIMAs are shared with relevant external partners.

A. Key Findings

Design

7. The PIMAs conducted suggest that there is significant room for improving the design of PIM institutions, both across and within country groups. Using a scoring system ranging from 0 to 10, with 10 indicating full alignment with good PIM practices, EMs, on average, score higher in design compared to LIDCs (Figure 3). Also, PIM institutions score higher in design in Europe than in other regions, with Africa being the weakest.7 The dispersion of results (i.e., the difference between maximum and minimum scores) within country groups is also relevant. For example, Africa shows a large dispersion, with scores ranging from 1.3 to 6.4. Overall, compared to the maximum score of 10, even the best-performing countries in EMs and LIDCs have much room for improving the design of their PIM institutions.
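The group statistics in Figure 3 (the group average, and dispersion measured as maximum minus minimum) are straightforward to reproduce from country-level scores. In the sketch below, only the quoted minimum (1.3) and maximum (6.4) for Africa come from the text; the other scores are invented, and only the aggregation method is meant to reflect the paper.

```python
# Hypothetical country-level design scores on the 0-10 scale; only the minimum (1.3)
# and maximum (6.4) come from the text, the rest are invented for illustration.
africa_scores = [1.3, 2.8, 3.5, 4.1, 4.9, 5.6, 6.4]

average = sum(africa_scores) / len(africa_scores)
dispersion = max(africa_scores) - min(africa_scores)  # dispersion as max minus min

print(f"Average: {average:.1f}, dispersion: {dispersion:.1f}")  # dispersion = 5.1
```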

Figure 3. Average and Dispersion of PIMA Scores in Assessed Countries 1/ (July 2015 to October 2017)

Source: Staff calculation based on PIMA reports. 1/ AE average scores are not included, as Ireland is the only country in this group.

8. Weaknesses of PIM institutions are widespread across the public investment cycle, but are more prominent in the implementation stage. Across all PIMAs, institutions are better designed in the planning stage, compared to institutions in the allocation and implementation stages (Figure 4). Looking at the design of the 15 PIM institutions, countries scored the highest in terms of budget unity, budget comprehensiveness, and national planning—all of which belong to the planning and allocation stages. The weakest design of PIM institutions was found in the allocation and implementation stages (notably: project appraisal, selection, and management, as well as asset monitoring). In contrast, only one institution in the planning stage, the management of PPPs, had a relatively low score. The latter reflects the lack of a well-designed PPP framework to efficiently use PPPs in many countries.

Figure 4. Ranking of PIM Institutions by Scores in Design

Source: Staff calculations based on PIMA reports.

9. In the sample, EMs show overall better design scores than LIDCs, in line with the findings of the 2015 Board Paper (Figure 5).8 Concerning individual institutions, EMs score better or similar to LIDCs, with the exception of national and sectoral planning. Development planning is a historical legacy in many LIDCs, often encouraged and supported by external partners, for example through Poverty Reduction Strategy Papers (PRSPs). Compared to LIDCs, EMs have relatively better-designed PIM institutions for availability of funding and company regulation, while both EMs and LIDCs show similar weaknesses in project appraisal and selection.

Figure 5. Ranking of PIM Institutions by Scores in Design and Country Group

Source: Staff calculations based on PIMA reports.

Effectiveness

10. The PIMAs show that countries are better at designing PIM institutions than at implementing them effectively. Figure 6 shows that for almost all PIM institutions, the scores on design are superior to those for effectiveness. The largest differences, measured as the average design score minus the average effectiveness score (see the sketch after the list below), exist in the following institutions:

  • National and sectoral planning. While many countries have formal national sectoral plans, they are often fragmented, not properly costed, not aligned with the medium-term framework and annual budget, and they do not inform public investment decisions.

  • Multiyear budgeting. Half of the countries assessed publish medium-term estimates for capital spending by ministry or sector, and the other half publish at least aggregate capital ceilings. Still, in many countries the forecast of public investment for two or more years ahead is unreliable, limiting the effectiveness of medium-term budgeting.

  • Transparency of execution. While most countries have procedures to ensure transparency of procurement, the PIMAs revealed significant non-compliance with those procedures by procuring agencies. Similarly, while most of the countries have central monitoring mechanisms of project implementation, they rarely result in corrective actions.

  • Project selection. While some countries have standardized technical criteria for project selection following good international practices, frequently governments do not implement these criteria in practice.
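The sketch referenced above illustrates the gap calculation behind this ranking: for each institution, subtract the cross-country average effectiveness score from the average design score and sort by the difference. All scores below are placeholders, not actual PIMA results; only the method follows the text.

```python
# Placeholder cross-country average scores (0-10 scale), invented for illustration.
# Only the method (design minus effectiveness, ranked in descending order) follows the text.
avg_scores = {
    "National and sectoral planning": {"design": 7.5, "effectiveness": 4.0},
    "Multiyear budgeting":            {"design": 6.0, "effectiveness": 3.5},
    "Transparency of execution":      {"design": 6.5, "effectiveness": 4.5},
    "Project selection":              {"design": 5.0, "effectiveness": 3.0},
}

gaps = sorted(
    ((name, s["design"] - s["effectiveness"]) for name, s in avg_scores.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, gap in gaps:
    print(f"{name}: design-effectiveness gap = {gap:.1f}")
```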

Figure 6. Scores for Design vs. Effectiveness of PIM Institutions

Sources: Staff calculations based on PIMA reports. Number of observations: 27. 1/ Effectiveness scores for three of the 30 countries are not available.

11. However, countries sometimes are better at protecting investment spending than the institutional design alone would suggest. While some countries do not allow for formal multiyear budget commitments, line ministries still manage to protect their investment spending plans by signing multiyear contracts or renewing one-year contracts without having to go through another procurement process.

Reform priorities

12. PIMAs provide prioritized recommendations to address main weaknesses in PIM institutions. Overall, the recommendations concentrate on the PIM institutions that have low effectiveness scores (Figure 7, left panel). The most frequent measures recommended by the staff are detailed below:

  • Project selection. Almost all PIMAs (90 percent) contained recommendations to strengthen project selection (Figure 7, right panel). To overcome fragmented and unsystematic project selection processes, recommendations included establishing a pipeline of projects, developing a comprehensive project database, and defining and/or improving project guidelines and selection criteria.

  • Project appraisal. To improve the assessment of public investment projects, 86 percent of PIMAs proposed developing and implementing guidelines for project appraisal (including cost-benefit and risk analysis).

  • Multiyear budgeting. To improve the credibility of multiyear budgeting, 76 percent of PIMAs proposed country-specific measures, including improving transparency over multi-annual commitments, better assessment of the fiscal space for new projects, establishing capital budget ceilings, or publishing existing ceilings.

  • Management of PPPs. To ensure that PPPs are well-managed and do not expose the government to excessive risks, 72 percent of PIMAs recommended integrating PPPs into the overall PIM framework and improving the budgeting, accounting, and reporting of PPP operations, including long-term commitments and contingent liabilities.

  • Project management. To ensure that projects are well-implemented, staff recommended adopting project management guidelines, training on project management, introducing centralized monitoring, and piloting and/or implementing ex-post reviews.

  • National and sectoral planning. To ensure that national and sectoral plans can effectively guide public investment decisions, PIMAs recommended improving the costing of plans, consolidating sector strategies, and introducing overall national strategies.

Figure 7. PIMAs: Prioritization of Recommendations

Source: Staff calculations based on PIMA reports.

13. While PIMAs focus on PIM institutions, the assessment of reform priorities was supported by country-specific analysis of the efficiency of public investment. The 2015 Board paper found that about 30 percent of the potential benefit of public investment is lost due to weaknesses in the investment management process. The most efficient countries double the impact of public investment on growth, relative to the least efficient. Priority public investment reforms are intended to address this efficiency gap.9 Strengthening PIM through better-designed and more effective institutions can close up to two-thirds of the public investment efficiency gap.10 Staff estimates of the efficiency gap for each country, based on the methodology developed in the 2015 Board Paper, were discussed with the authorities and included in all PIMA reports.
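As a stylized illustration of the efficiency-gap concept (see footnote 9), a country’s infrastructure output can be compared with the best performer among peers with a similar income level. The sketch below is not the estimation methodology of the 2015 Board Paper, and all numbers in it are hypothetical.

```python
def efficiency_gap(own_output: float, frontier_output: float) -> float:
    """Share of frontier infrastructure output (quantity/access) that the country does not deliver."""
    return 1.0 - own_output / frontier_output

# Hypothetical infrastructure indices: the country versus the most efficient peer
# with a similar income level and public investment spending.
country_index = 56.0
frontier_index = 80.0

print(f"Efficiency gap: {efficiency_gap(country_index, frontier_index):.0%}")  # 30% in this example
```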

B. Application of PIMAs

14. PIMAs provide a standard framework to assess strengths and weaknesses of infrastructure governance, allowing cross-country comparisons, and country-tailored recommendations. PIMAs provide a structured and systematic approach, allowing countries to quantify and benchmark their performance against peers. The in-depth analysis, complemented with cross-country comparisons, raises the profile of PIM issues and builds a shared understanding among key stakeholders of the actions required over the short to medium term. This helps countries to develop an overarching strategy for PIM that is accessible to policy makers and development partners alike.

15. PIMAs have seen strong demand from countries in all regions, and have also received much support within the Fund and from third parties. Since the PIMA was launched in mid-2015, an average of ten countries per year have requested one. Staff expects this level of demand to continue. Countries that have gone through a PIMA found the exercise helpful, including for strengthening infrastructure governance and focusing their dialogue with different development partners.11 Similarly, PIMAs have helped to strengthen the Fund’s surveillance dialogue, including on keeping the aggregate size of public investment sustainable in the medium term. Other development partners have reacted positively, also because PIMAs have helped to unify different support efforts into a prioritized reform program that is owned by the government.

16. For country authorities, PIMAs provide the basis for developing prioritized reform plans for strengthening infrastructure governance that are tailored to their needs. One of the main points of feedback from countries that responded to a post-PIMA survey is that the PIMA report is concise, accessible, and easy to understand. It brings together in-depth data analysis based on standard charts, and useful qualitative descriptions of the key issues. Also, due to the consultative approach that is followed, encompassing government ministries and agencies, development partners, and other actors, the reform plan that emerges from a PIMA assessment typically has broad support. The reform actions are generally tailored to each country’s needs and prioritized in line with the country’s resources and institutional capabilities. According to the survey, country authorities generally appreciate the comprehensiveness, usefulness and accuracy of the PIMAs (see Box 2).

17. PIMAs have supported IMF policy dialogues with countries, including in the context of surveillance, lending, and CD. PIMA teams regularly consult with the relevant area departments prior to PIMA missions, share detailed recommendations after each mission, and include area department comments before finalizing a report. This extensive engagement has raised the level of understanding of PIM issues by Fund staff, resulting in better reflection of these issues in Fund-supported program design (e.g., in the form of structural benchmarks), and surveillance (e.g., Selected Issues Papers (SIPs)) as shown in Table 1. For example, for the Kyrgyz Republic, the PIMA conducted in 2016 has supported program design through several structural benchmarks on strengthening PIM institutions (e.g., on project appraisal and project monitoring).

Table 1.

Reflection of PIMA Findings in IMF Staff Reports (Examples)

Source: Selected IMF staff reports.

18. Many countries have begun to implement PIMA recommendations. For example, in Ireland, the government has incorporated several PIMA recommendations into its updated Capital Plan Review for 2018–21 and the National Development Plan 2018–27, “to achieve significant improvements in the efficiency of public capital investment.” In Togo, the government has committed to end the problematic practice of prefinancing pointed out by the PIMA. In the Kyrgyz Republic, the government has issued a decree to formalize the gatekeeping roles of the Ministry of Economy on evaluation, including economic assessment and project efficiency, and of the Ministry of Finance on financing. In Côte d’Ivoire, the government is developing the PPP database to include the main PPP projects to better manage fiscal risks.

19. PIMA assessments also helped to prioritize follow-up CD activities. Overall, PIMAs have allowed for a stronger integration between the Fund’s country surveillance work and CD support in the area of public financial management. For example, in Mauritius, the PIMA led to renewed interest in CD on a broad set of PFM issues, with several follow-up activities led by AFRITAC South (AFS). Similarly, Burkina Faso, Côte d’Ivoire, Ghana, Liberia, Mali, Mongolia, and Ukraine, among other countries, requested follow-up CD to address specific weaknesses identified in their respective PIMAs.

Box 2. Results of the Survey of Country Authorities

Staff surveyed country authorities that have gone through a PIMA, focusing on content and quality of the framework and assessment. The survey questionnaire was sent to 30 countries, of which 19 responded.

All responding countries indicated that the PIMAs had comprehensively captured the key elements of PIM institutions, and felt it was useful (either to a large extent or to some extent) that the PIMAs distinguished between institutional design (de jure) and institutional effectiveness (de facto). Nearly all responding country authorities found that PIMAs accurately reflected the state of PIM and provided relevant recommendations for reform. Approximately three quarters of the responding authorities found PIMAs useful (either to a large or some extent) in informing CD and external financing needs. This was particularly the case for countries that generally rely on development assistance; naturally, countries that are not aid recipients found the PIMA less useful for obtaining external (donor) support.

More importantly, all countries that responded to the survey indicated that they have introduced reform measures to improve public investment efficiency based on the PIMA recommendations. In implementing the recommendations, most countries would have appreciated more detailed analyses in the PIMAs of public investment projects managed by national governments, public corporations (PCs), subnational governments (SNGs), and the private sector. While some countries have published their PIMA reports,1 others have circulated the report within the government to facilitate internal learning and decision making. Most countries would consider an updated PIMA within 2–4 years to assess progress.

Figure 8. PIMA Survey Results

Source: FAD survey on PIMA, 2018.
1/ The countries that have published their PIMA reports include Botswana, Ireland, Jordan, Kosovo, Liberia, and Mali.

20. In many countries, PIMAs helped to mobilize additional external funding and improve coordination among CD providers. PIMA missions routinely consult with development partners in the field to elicit views on a broad range of PIM issues, including sector-specific topics. This engagement has promoted a shared understanding of the key PIM challenges, and allowed the development partners to re-assess their respective country strategies in light of the PIMA findings. Mozambique, Madagascar, Togo and Zambia all provide examples in this regard. In Mozambique, the PIMA led to a three-year program involving a collaboration between the Government of Mozambique, the World Bank and DFID, with approximately US$2 million in funding towards strengthening PIM. In Madagascar, the World Bank followed up the PIMA with a series of interventions to revive the PIM manual, strengthen project appraisal procedures, and provide training. In Togo, the World Bank has put forward a five-year project to strengthen PIM with approximately US$10 million in funding. In Zambia, the World Bank, EU and African Development Bank have taken the PIMA results as an input for their respective country programs. In the Maldives, the US$12 million additional financing of the capital budget under the World Bank’s PFM Systems Strengthening Project12 is informed by the 2016 PIMA report.

21. The PIMA framework has also been recognized by third parties as a useful tool to support fiscal governance. For example, in 2017, the Overseas Development Institute (ODI)13 carried out a detailed review of different PIM diagnostic tools, including the PIMA. It noted favorably the PIMA’s “emphasis on budgetary and fiduciary concerns,” and the consideration “given to the institutional features that might be needed to ensure the fiscal sustainability of overall investment spending.” Given rising levels of public debt, the ODI review highlighted the importance of limiting borrowing to investment projects that yield suitable social and economic returns. Yet, the ODI review also suggested some limitations of the initial PIMA framework, including that its approach “runs the risk of overlooking critical downstream aspects of the implementation, maintenance and operation of investment assets.” Many of these issues are addressed in the revised PIMA framework discussed below.

C. Lessons Learned

22. The basic three-phase PIMA structure—planning, allocation, and implementation—serves the assessment well. It reinforces the notion that all three phases have similar weight in achieving public investment efficiency. Experience has shown that the three-phase structure of the PIMA framework is both easy to communicate and sufficiently flexible to reflect the unique circumstances and challenges facing each country and to support relevant reform recommendations.

23. Assessments are carried out in a cost-effective way to support PFM reform programs. Each PIMA is carried out by a mission team of (usually) four to five staff and experts that interacts closely with the authorities, including through workshops and seminars. In several PIMAs, country authorities carried out a preliminary self-assessment before the mission, which improved significantly the quality of the PIMA evaluation and fostered ownership of the mission findings and suggested reform measures. In some PIMAs, a short pre-mission visit was conducted to explain the framework and gather information.

24. However, the PIMAs conducted also pointed to some PIM practices that required a more detailed analysis. Specifically, the initial PIMA framework was considered to be somewhat deficient in some areas and redundant in others. In particular, it was felt that a stronger focus on four areas—procurement, maintenance, independent review of projects, and portfolio management—would allow for more focused recommendations to country authorities, and help to incorporate these in follow-up support programs. The revised PIMA framework presented in the next section seeks to address these issues.

25. One area that needed more detailed analysis relates to the different channels that exist for providing public infrastructure. The PIMA takes central government ministries and agencies as the primary point of reference for infrastructure delivery. Yet, in several countries, particularly in EMs, the assessments showed that subnational governments and public corporations were accounting for a large share of public infrastructure services. Public corporations, in particular, dominate areas such as water and energy provision, and long-distance rail services. PPPs, while still being a relatively small channel, are growing in importance, generate significant fiscal risks, and in many cases, circumvent the required public investment rules and procedures. Therefore, the adjusted PIMA framework distinguishes more clearly between alternative implementation channels for public investment.

26. Finally, it was felt that addressing cross-cutting issues in a more systematic way could enhance the analytical power of the PIMA framework. Apart from the 15 PIM institutions, several cross-cutting issues act as system enablers, notably a country’s legal and regulatory frameworks, IT-support services, and general staff capacity. These issues may have an impact on the effectiveness of the 15 core institutions. Given the complex ways in which the cross-cutting issues may manifest themselves in practice, it was felt that a qualitative assessment of these issues, rather than their full integration into the PIMA scoring system, would be the preferred option.

Adjustments to the PIMA Framework and Processes

27. Updates to the PIMA framework were made in consultation with assessment teams, country authorities, and external stakeholders. More specifically, the updates reflect the recommendations of an IMF-World Bank working group that received inputs from mission teams, country authorities, and other stakeholders.14

28. The PIMA framework was enhanced with a more in-depth focus on four major aspects of a country’s PIM institutions. Table 2 describes the relationship between the original framework and the revised one. The revisions to the framework have been accommodated within the existing structure, without increasing the total number of institutions, but adjusting their composition.

  • Maintenance: public capital is an asset that provides benefits for many years beyond the initial investment. Appraisal and selection of individual projects are largely based on the present value of returns to investment spending over the asset’s life. Lack of routine maintenance threatens those expected returns, and thus undermines the efficient allocation of resources. Capital maintenance has the potential to extend the life of an asset, and budgeting for both routine and capital maintenance requires a methodology to estimate the required expenditure. The revised framework includes a specific institution on maintenance.

  • Procurement: technical efficiency of public investment spending is at least as important as proper project appraisal and selection. Deficiencies in the procurement process may lead to higher than expected costs, lower than expected quality of construction, and longer than expected construction time, all of which result in less efficient public investment. Strong and effective procurement laws and procedures are also necessary, but not sufficient, to combat corruption. While procurement was addressed only partially in the original PIMA framework, the revised framework recognizes the importance of procurement by establishing a dedicated PIMA institution devoted to it.

  • Independent review of projects: the input of independent experts and organizations in project appraisal, selection and ex-post review is important to ensure high-quality and unbiased assessments. It also can be a counter-weight to political influence, and reduce corruption. Therefore, independent review was expanded as an evaluation criterion in the updated framework.

  • Enabling environment: for PIM institutions to perform well, countries need at least three “enablers”: (1) a supportive legal framework; (2) good systems for managing information; and (3) adequate staff capacity, with clear roles and responsibilities. As these three issues are cross-cutting, they are assessed qualitatively and separately, with a focus on how they support the framework.

Table 2.

Comparison of Original 2015 Framework and the 2018 Update


29. The revised PIMA framework streamlines some overlaps and provides more precise language on selected principles or criteria previously included. Examples of such changes are:

  • Fiscal principles and rules: significant revisions were made to emphasize the importance of fiscal policy to provide a stable and predictable context to support public investment planning, budgeting, financing and execution.

  • Public-private partnerships (PPPs) are addressed in the context of several relevant PIM institutions, rather than in a single one.

  • Budget comprehensiveness and budget unity: these two previously separate institutions were merged into a single, new institution.

  • Portfolio management and oversight: several previously existing evaluation criteria were modified to clarify the challenges of managing a portfolio of projects. For example, the ability to shift money between approved projects, some of which may be delayed while others are advancing without impediment, pertains to the portfolio of projects. Such practices involving execution of multiple projects supplement the previously existing focus on individual project management during execution.

  • Public corporations and financing sources: coverage of the PIMA was expanded, especially to include public corporations and all potential financing sources.

30. Two practices already used in PIMAs are incorporated in the revised framework. First, as noted above, evaluation of the effectiveness of PIM institutions has been added to the assessment of institutional design. Hence, the overall PIMA scoring is two-fold: institutional design and effectiveness. Separate scores highlight the fact that having good policies and procedures on paper does not necessarily mean that they are effectively implemented. Second, the PIMAs will contain a ranking of reform priorities, with the aim of designing a sequenced, prioritized reform plan for improving the efficiency and effectiveness of public investment. Typically, low scores on both design and effectiveness would lead to a recommendation of high reform priority. The same would apply to any reforms that could have a significant impact on public investment efficiency and whose implementation can be readily realized by the authorities.
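A hedged sketch of how the two scores and the reform-priority ranking described above could interact: institutions with low design and effectiveness scores, or reforms with high expected impact that the authorities can readily implement, are flagged as high priority. The threshold and the impact/feasibility flags below are illustrative assumptions, not part of the PIMA methodology.

```python
# Illustrative prioritization rule; the threshold and the impact/feasibility flags
# are assumptions made for this sketch, not official PIMA criteria.
def reform_priority(design: float, effectiveness: float,
                    high_impact: bool = False, feasible: bool = False,
                    low_threshold: float = 4.0) -> str:
    """Assign a reform priority from 0-10 design and effectiveness scores."""
    if (design < low_threshold and effectiveness < low_threshold) or (high_impact and feasible):
        return "high"
    if min(design, effectiveness) < low_threshold:
        return "medium"
    return "low"

print(reform_priority(design=3.0, effectiveness=2.5))                      # high
print(reform_priority(design=6.0, effectiveness=3.0))                      # medium
print(reform_priority(design=7.0, effectiveness=6.5,
                      high_impact=True, feasible=True))                    # high
```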

Implementation and Resources

31. PIMAs have proven helpful for countries in reforming their PIM framework with the aim of improving the efficiency of public investment. Efforts to promote economic growth through increasing public investment spending can pose significant risks for governments. In the absence of a robust PIM framework, higher investment spending might not result in improved economic growth and could lead to increased fiscal risk and slippages. With public investment at the heart of the development agenda, PIMAs will be key to assist countries in identifying growth-supportive reform priorities and priorities for CD support.

32. By supporting PIM reforms, PIMAs can help boost the growth returns of public investment. To achieve the SDGs, better public infrastructure (e.g., transport, irrigation, energy, health, education) is required. To maximize the growth returns of future investment spending, countries will need to have in place well-designed and effective infrastructure governance. PIMAs are an effective tool to assist countries in strengthening their infrastructure governance.

33. The revised PIMA framework maintains the main original features while addressing identified gaps and other shortcomings. As discussed, it extends the analysis in some areas (e.g., procurement and maintenance funding) and consolidates other aspects (e.g., alternative delivery modes like PPPs and PCs) under a new institution. The revised framework, summarized in Figure 9, will be applied in future assessments. A Guidance Note on how to apply the PIMA questionnaire will be prepared.

Figure 9. Updated PIMA Framework

34. Implementation of PIMA recommendations will be closely monitored by staff and supported by CD, with an emphasis on follow-up actions at the country level. Addressing the findings and recommendations of each PIMA usually requires governments to own and commit to a medium-term reform plan for strengthening PIM institutions. Progress in infrastructure governance can be measured by monitoring implementation of the reform plans and by conducting repeat PIMAs at reasonable intervals. To ensure systematic follow-up support, countries could use the reform plan to encourage development partners to focus on specific areas and institutions.

35. CD to support reforms of infrastructure governance can be provided upon request, as consistent with the Fund’s CD mandate and given resource envelope. Progress in implementing reform plans will be monitored by staff through regular contacts with country authorities (and, as indicated, repeat PIMAs). Follow-up CD support, to individual countries or in a regional context, will be integrated into the usual resource allocation planning (RAP) processes, which set CD delivery priorities on the basis of close consultations across departments and with country authorities. The provision of CD by Fund staff will focus on reforms that are macro-critical and correspond to the Fund’s mandate, mainly in the area of fiscal governance. Multilateral development banks (MDBs) and several of the bilateral development agencies are called upon (and have already started) to use the PIMAs to support other key PIM reforms that fall within their own mandates, e.g., strengthening procurement, project appraisal systems, and regulation of infrastructure.

36. Staff will carefully balance the allocation of resources between conducting new PIMAs and meeting demand for CD follow-up. The demand for CD arising from the more than 30 PIMAs that have already been conducted is potentially substantial. Still, the magnitude of follow-up CD demand will only become evident over time, as the PIMA process matures further. Decisions on the right balance between follow-up CD and new (or repeat) PIMAs will be taken, based on given resource constraints, in the context of the usual CD RAP processes.

37. Staff will continue to encourage country authorities to publish their PIMAs. Publishing the reports could help draw attention to key PIM issues, strengthen public support for reform, and, in doing so, also invite external support for addressing them. PIMA mission teams will discuss with country authorities the added benefits of publishing the final report.


Annex I. The Updated PIMA Questionnaire


Annex II. Sample PIMA Summary Outputs

A. PIMA Scoring Heatmap for Jordan


B. PIMA Peer Comparison Chart for Jordan

Source: Jordan: Technical Assistance Report: Public Investment Management Assessment, December 2017, IMF Country Report No. 17/366.
2/ The Fund’s IPSI aims to consolidate ongoing efforts to increase the efficiency of public investment, including by pulling together tools used in the assessment of options for scaling up such spending. Several countries, where infrastructure issues are particularly salient and constitute a key area of IMF engagement with the authorities, were identified as IPSI pilot countries. In these countries, IMF surveillance and CD provision are particularly closely integrated, and several IPSI tools have been used. The IPSI pilot countries are: Cambodia, Colombia, Honduras, Kyrgyz Republic, Serbia, Solomon Islands, Thailand, Timor Leste, and Vanuatu.

3/ UN, 2017, “Sustainable Development Goals: 17 Goals to Change the World.” Goal 9 refers to building resilient infrastructure, promoting sustainable industrialization, and fostering innovation. http://www.un.org/sustainabledevelopment/infrastructure-industrialization/

4/ IMF, 2015b. Analysis shows that, on average, the G20 countries face an efficiency gap of 22 percent compared to the efficiency frontier. They could reduce two-thirds of this gap by adopting the PIM practices of the best performer. Improving PIM institutions would have the largest payoff in emerging markets where institutions are relatively weaker. The results are consistent with other studies.

5/ Following North (1991), “institutions” consist of both informal constraints (sanctions, customs, traditions, and codes of conduct) and the formal rules (constitution, laws, regulations) that determine the behavior of public servants and other actors. The PIMA framework is mainly focused on the formal institutions, but, by looking at the effectiveness of PIM, also measures, to some extent, the impact of informal institutions.

6/ These consist of one AE (Ireland); 15 EMs (Albania, Botswana, Brazil, Guyana, Jordan, Kosovo, Malaysia, Maldives, Mauritius, Morocco, Peru, Serbia, Thailand, Timor Leste, Ukraine); and 14 LIDCs (Benin, Burkina Faso, Cameroon, Côte d’Ivoire, Ghana, Honduras, Kyrgyz Republic, Liberia, Madagascar, Mali, Mongolia, Mozambique, Togo, Zambia).

7/ The calculations in Figure 3 describe the results of the first 30 PIMAs completed. Given the small sample size, differences between the average PIMA scores by income category or region are not statistically significant.

8/ As only one advanced economy (Ireland) has undertaken a PIMA, comparisons are limited to EMs and LIDCs.

9/ For a given level of public investment spending, the efficiency gap is defined as the difference in a country’s quantity and/or access to infrastructure relative to the most efficient country with a similar level of income.

10/ The estimation of the public investment efficiency gap was revised in January 2017, but the views and main findings of the 2015 Board paper still prevail. http://www.imf.org/external/np/fad/publicinvestment/

11/ A survey was sent to country authorities; responses were received from 19 countries (Box 2).

12/ The project will be presented to the World Bank Board on May 30, 2018.

13/ Strengthening Public Investment Management: Reviewing the Role of External Actors, ODI report, September 2017.

14/ The revised PIMA framework also seeks to accommodate the World Bank’s own assessment structure (the eight PIM “Must-Haves”), while maintaining the comprehensive analytical character of the PIMA framework. Rajaram, A., and others, 2014, World Bank.

Author: International Monetary Fund. Fiscal Affairs Dept.