Taking Stock of IMF Capacity Development on Monetary Policy Forecasting and Policy Analysis Systems
Authors: Nils Mæhle, Tibor Hlédik, Mikhail Pranovich, and Carina Selander (International Monetary Fund)

Abstract

This paper takes stock of forecasting and policy analysis system capacity development (FPAS CD), drawing extensively on the experience and lessons learned from developing FPAS capacity in the central banks. By sharing the insights gained during FPAS CD delivery and outlining the typical tools developed in the process, the paper aims to facilitate the understanding of FPAS CD within the IMF and to inform future CD on building macroeconomic frameworks. As such, the paper offers a qualitative assessment of the experience with FPAS CD delivery and the use of FPAS in the decision-making process in central banks.

1. Background and Purpose

A forecasting and policy analysis system (FPAS) is a system of tools and related processes designed to support forward-looking monetary policy formulation based on economic data and analysis. Any central bank or economic policymaking organization that to some degree relies on forecasts for its decisions de facto has one, although the details of its components and structure may differ. What now has become known as the “FPAS” originated in the pioneering work of the early inflation targeting (IT) central banks in the second half of the 1990s to develop procedures and tools that could provide a more rigorous analytical foundation for their interest rate settings. The result was a system for forecasting and policy analysis along the lines set out in Figure 1.1 While the system involves a formal economic model used in analysis and forecasting, the FPAS itself is not a “model.” Rather, it is a consistent, well-organized framework for collecting, processing, and analyzing economic information, with a special emphasis on providing analysis and policy recommendations to policymakers to support their monetary policy decisions. It is designed to facilitate a systematic evaluation of new information and provide regular policy analyses and updates for policymaking.

Figure 1. The Forecasting Process in a Nutshell

Source: Hlédik (2011).

The main elements of a well-structured FPAS include:

  • ■ Databases and procedures for collecting, processing, managing, and sharing data.

  • ■ A core medium-term (forecasting and) monetary policy analysis model that has an explicit role for monetary policy.2

  • ■ Nowcasting and near-term forecasting (NTF) tools and procedures.

  • ■ Satellite models and analytical tools suitable for analysis and/or forecasting of some specific aspects/sectors of the economy or analysis of specific phenomena.

  • ■ Tools and procedures for ex post evaluation of the forecasts and policy recommendations.

  • ■ A forecasting and policy analysis team (FT) with well-defined managerial and operational responsibilities supported by a sound reporting structure. The full team usually consists of the head of the team, a core modeling group, and sectoral experts. This broad team is in charge of medium-term forecasting and policy analysis; sectoral analysis, including NTF; developing and maintaining models; preparing presentations for the monetary policy decision-making body; and drafting monetary policy statements and monetary policy reports (MPRs).

  • ■ A structured forecasting and policy analysis process with well-defined deadlines and responsibilities. The process consists of staff-level technical meetings, technical meetings between the FT and policymakers, and monetary policy decision-making meetings.

  • ■ Processes for preparing internal and external MPRs, and structured monetary policy advice and presentations to the policymakers.

Developing FPASs has been a central focus of the IMF’s monetary policy technical assistance and customized training over the past several years. In addition to classroom training, IMF capacity development (CD) activities in the last three to five years alone have supported building or enhancing FPASs in 25 countries.3 This CD has covered a diverse group of countries in every region of the world, with very different monetary policy frameworks and at different stages of reforming those frameworks, ranging from low- and lower-middle-income countries with weak monetary policy transmission, underdeveloped markets, and limited macroeconomic policy expertise to advanced, high-capacity countries with well-developed markets and instruments, freely floating exchange rates, and full-fledged IT frameworks. Most of this CD has been delivered through multiyear, externally funded projects with frequent (quarterly) missions using headquarters (HQ)-based or regional technical assistance center (RTAC)-based staff, supplemented by a pool of short-term experts. While many of these projects were aimed from the outset at supporting a transition toward an IT-like policy framework, others were oriented toward enriching policy decision-making with forward-looking economic analysis as part of a broader program of progressive policy modernization.

The FPAS can be thought of as the bridge between the broad objectives of the regime and the monetary policy operations. It helps the authorities map their medium-term goals into concrete high-frequency actions such as the conduct of open market operations and the setting of short-term policy interest rates. Along the same lines, FPAS CD has aimed at connecting, and thus supporting, complementary Monetary and Capital Markets Department CD on operations and on reform of the regime itself.4

This paper takes stock of FPAS CD, drawing extensively on the experience and lessons learned from developing FPAS capacity in the central banks. By sharing the insights gained during FPAS CD delivery and outlining the typical tools developed in the process, the paper aims to facilitate the understanding of FPAS CD within the IMF and to inform future CD on building macroeconomic frameworks. As such, the paper offers a qualitative assessment of the experience with FPAS CD delivery and the use of FPAS in the decision-making process in central banks.

At the same time, the paper does not attempt to discuss or evaluate any of the actual country cases and abstracts from any cross-country comparison of the FPAS CD projects. Similarly, an assessment of the appropriateness of the resulting FPAS tools and procedures at individual institutions is also beyond the scope of this paper. Such assessments are, however, often available in the joint working papers produced by the CD teams and the authorities that document the main features of the recipient countries’ FPASs.5

The remainder of the paper is organized around four main sections. We first provide an overview of FPAS as a system, discuss the role of models in the decision-making process in a central bank, and outline the main principles behind actual FPAS CD delivery (section titled “The FPAS: Designing It and Developing It”). We then zero in on the tools typically used for the NTF and the medium-term forecasting and policy analysis, and discuss in some detail the workhorse medium-term model—the Quarterly Projection Model (QPM)—developed with most of the FPAS CD recipients (section titled “The Forecasting and Policy Analysis Toolkit”). We also review typical enhancements to the QPM that allow it to reflect the specific economic structure of:

  • (1) Low-income countries that are subject to large supply shocks,

  • (2) Countries with a heavily managed exchange rate and/or a continued role for targets on money aggregates, and

  • (3) The channels through which changes in the fiscal stance affect inflation and the required monetary policy response.

We then discuss optimal ways of incorporating the FPAS into the monetary policy decision-making process (section titled “The Forecasting Process: Incorporating the FPAS into the Decision-Making Process”), and finally provide a general assessment of the FPAS CD delivered and lessons learned in the process (section titled “Assessment and Lessons”).

2. The FPAS: Designing It and Developing It

A. Designing the FPAS to Support Policy Decision-Making

The main reason for building an FPAS is to support making monetary policy decisions in a systematic, forward-looking fashion, informed by economic data and analysis. In IT central banks, the FPAS and the “forecasting and policy analysis” process aim at producing a comprehensive macroeconomic forecast for central variables like inflation, output, and the policy rate. While the inflation forecast for a given policy stance is always an integral part of a central bank’s forecasting process, FPAS stresses that the central bank’s projections should be solidly anchored in a reaction function for the policy rate that is designed to drive inflation toward its medium-term objective at a pace that reflects policymakers’ preferences. For an IT central bank policymaker, a key question that requires the use of policy models is what the policy response should be to return inflation close to the central bank’s target when temporary shocks or underlying economic fundamentals have pushed current or expected inflation away from the target.

More specifically, the key questions the policymaker needs an answer to are:

  • (1) Is a change in the policy stance needed and if so, why?

  • (2) How strong should the policy response be initially?

  • (3) What is the optimal policy path over the policy horizon, and how fast should one aim to bring inflation back to target?

  • (4) What are the implications of the policy response for other variables such as output and the exchange rate?

  • (5) What is the “story”? That is, what are the drivers of the projected near- and medium-term development, and how to explain to the public the reason for a change in the policy stance, or the lack thereof?

In this sense, the FPAS helps align the decision-making process with (1) the overall strategic objectives of monetary policy (sometimes illustrating shortcomings of those objectives in the process), (2) an analytic understanding of the economy and monetary policy transmission, and (3) real-time analysis of economic data. Focusing on the role of policy instruments in the transmission mechanism featuring forward-looking expectations, therefore, became a guiding principle for IT central banks and the design of their FPAS.6

These fundamental principles should also guide the design and development of the FPAS for non-IT central banks. Monetary policy actions affect the economy, including inflation, with long and variable lags. Thus, to achieve the central bank’s medium-term monetary policy objective (whatever that might be), setting the policy stance—say, level of short-term interest rates—must be based on sound forward-looking economic analysis. That requires a solid understanding of:

  • (1) the current economic conditions;

  • (2) various factors on the policy horizon affecting the attainment of the objective (such as changes in the fiscal stance, developments in main trading partners or international financial markets, and predictable weather-related effects); and

  • (3) how the monetary policy instrument affects the variables of interest on the relevant time horizon, that is, the monetary policy transmission.

Consequently, all central banks with some degree of active monetary policy, irrespective of their monetary policy and exchange rate framework, need an information system, analytical tools, and a formalized decision-making process that support forward-looking policymaking to be able to achieve their medium-term policy objective. That is, they need an FPAS.7, 8

A common misperception about the FPAS is that it is implemented to increase forecast accuracy and that its usefulness depends on how accurate those forecasts are. Of course, the future is uncertain, and forecasts are wrong most of the time, not least because of unexpected and unpredictable shocks hitting the economy. Accordingly, forecast accuracy (in isolation) is a poor metric of usefulness, as forecasts can be wrong for the right reason as well as right for the wrong reason. Indeed, the latter may be more troubling. Economies are frequently hit by unanticipated shocks. A forecast that comes true when the economy is hit by an unexpected shock (Figure 2, case 1) indicates a problem with the forecasting tools and/or the staff’s judgment that could lead to policy mistakes in other circumstances (unless there are multiple shocks that offset each other). It is only in the case where the economy was not hit by an unexpected shock and the forecast still did not come true (Figure 2, case 4) that the forecast was clearly bad. Understanding the sources of forecast errors in the presence of unexpected shocks (Figure 2, case 2) or in their absence (Figure 2, case 4), as well as the lack of a forecast error in the presence of unexpected shocks (Figure 2, case 1), is therefore crucial for avoiding future policy mistakes stemming from model misspecification or faulty application of expert judgment.

Figure 2. Ex Post Forecast Evaluation

Source: Holub (2017).
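To make the four cases in Figure 2 concrete, the following is a minimal sketch of how an ex post evaluation might label each forecast episode once staff have judged whether a material unexpected shock occurred. The function name, tolerance threshold, and labels are illustrative assumptions, not part of any actual FPAS toolkit.

```python
# Minimal sketch of the four-case ex post forecast evaluation in Figure 2.
# The tolerance value and helper name are illustrative assumptions.

def classify_forecast(forecast_error: float, unexpected_shock: bool,
                      error_tolerance: float = 0.5) -> str:
    """Map a forecast outcome to one of the four cases in Figure 2."""
    came_true = abs(forecast_error) <= error_tolerance
    if unexpected_shock and came_true:
        # Case 1: suspicious -- the tools or judgment may be mis-calibrated,
        # unless offsetting shocks explain the apparent accuracy.
        return "case 1: shock occurred, forecast came true -> investigate tools/judgment"
    if unexpected_shock and not came_true:
        # Case 2: expected outcome -- attribute the error to the shock vs. model/judgment.
        return "case 2: shock occurred, forecast missed -> decompose the error"
    if not unexpected_shock and came_true:
        # Case 3: the benign case.
        return "case 3: no shock, forecast came true"
    # Case 4: clearly a bad forecast -- look for model misspecification or faulty judgment.
    return "case 4: no shock, forecast missed -> review model and judgment"


print(classify_forecast(forecast_error=1.2, unexpected_shock=False))
```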

The fact that forecasts are often not in line with the data observed ex post does not mean they are useless for policymaking purposes, however. It implies that the knowledge obtained during the forecasting process and the ex post forecast evaluation may be as important as the numerical forecasts themselves. Regular forecasting rounds will help the formulation and implementation of monetary policy if:

  • (1) The forecasts correctly signal the need for a policy adjustment and its direction most of the time and do that more reliably than any available alternative;9

  • (2) They help communicate the reasons for changing the policy stance both to the decision makers and to the public; and

  • (3) The policymakers trust the insight provided by the FT and base their policy actions on it.

The fact that the future is rife with uncertainty due to unpredictable shocks, and thus that forecasts often do not come true, is a reality that needs to be factored into the policy setting and should not be interpreted as a sign of failure. It requires that the policy stance be set with a sufficiently long horizon and not be changed frequently in response to developments in the most recent data that may turn out to be mainly transient or noise. Instead, policy should be based on a solid understanding of the fundamentals that drive the business cycle over a horizon long enough for monetary policy to be able to influence it. It is also an argument for gradual policy adjustment. Preparing what-if scenarios, as well as simulating alternative policy responses to complement the baseline scenario, helps address the inherent uncertainty involved in forward-looking policymaking.

Structural macroeconomic models with an explicit role for endogenous monetary policy and forward-looking (model-consistent) expectations have become indispensable for forward-looking policy decision-making and for analyzing risks to the policy scenario. Macroeconomic models with an explicit economic-theory-based structure and a role for (endogenously set) monetary policy allow for systematic and economically coherent analysis of current and future economic conditions. That is crucial for rigorous monetary policy deliberations as well as for effective monetary policy communication. Such structural models also help quantify uncertainty and study the implications of major risks to the baseline forecast. Creating consistent alternative scenarios without a structural model is both difficult and time consuming. Forecasters’ ability to quantify implications of main risks is greatly enhanced by exploiting the power of models to simulate the implications of changes in key assumptions in the baseline forecast. Such models allow forecasters to gauge the uncertainty around their baseline forecast, including the sensitivity to key assumptions or model parameters, and thereby permit policymakers to take into account the risks to the outlook in their decision-making more systematically.10

Econometric time series models and expert information about near-term developments may be more accurate for forecasting over the near-term horizon, however (Figure 3). Near-term developments are often driven by past policy actions, shocks, and idiosyncratic events that may be known but cannot be easily modeled. Near-term forecasts produced by sectoral specialists using analytic tools, complemented with their superior knowledge of sectoral developments and data, will typically outperform any structural model one or two quarters ahead.11 At the same time, the long-term properties of these NTF time series models may not be well defined and are subject to the Lucas critique. The implicit steady state of these models is typically inferred from data and often depends on the estimation period. Moreover, these econometric time series models’ long-term properties are not necessarily consistent with ongoing structural and planned policy changes. Notably, an excessive focus on near-term forecasts, particularly when not anchored in a structured medium-term context, can promote short-term and reactive policymaking.

Figure 3. Precision, Tools, and Forecast Horizon

Source: IMF monetary policy analysis and forecasting course lecture material.

A good understanding of current economic developments and an assessment of initial conditions are preconditions for producing a high-quality forecast and for effective policymaking. Analysis and discussions of initial conditions reflecting the knowledge of sectoral specialists is essential for separating the signal from noise in the data. It generates expert opinions on ad hoc developments in specific sectors whose transmission may not be explicitly captured by the core medium-term forecasting policy analysis model and/or by the NTF time series models and that need to be added to the projections and to the model-based policy analysis via incorporating expert tunes.12 Possibly more importantly, the analysis and discussions of the initial conditions are also essential for determining both the trend—or longer-term equilibrium—developments in the key variables and the cyclical position of the economy. The assessment of the cyclical position of the economy and the estimates of the deviations (the “gaps”) of output, interest rates, and the exchange rate from those longer-term equilibriums may go a considerable way toward determining early in the process whether there is a need for adjusting the policy stance.

A solid understanding of external developments, including the policy stance and (likely) policy changes in key trading partners, is also essential both for preparing the forecasts and for determining the need for adjusting the policy stance. The more open the economy is, the more important are the external factors for both. Policy changes in the large economies and major trading partners may, everything else equal, require commensurate policy changes in small open economies—that is, in small open economies the policy stance consistent with meeting the policy objective depends not only on domestic factors but also on the cyclical position and policy stance of their main trading partners. The NTF and the model-based medium-term forecasts are typically conditioned on trade-weighted economic indicators (such as GDP, prices, exchange rates) in the most important trading partners and on various commodity (such as oil and food) prices. Sectoral expertise is needed to analyze these markets, assess their impact on the domestic economy, and determine how to incorporate them into the forecast. Changes in the forecasts for the external variables are often an important source of domestic forecast revisions (and errors) and thus need to be understood and explained.

All of the previous points explain why judgment is a key component in both forecasting and policymaking. Expert judgment must be based on a deep knowledge of the economy, including stylized facts and institutional details. To acquire that knowledge, sectoral experts need to analyze the data, utilize extraneous projections for exogenous variables (for example, commodity prices, output, and inflation in large advanced economies as developed by international financial institutions and foreign agencies). They also need to develop aggregated as well as disaggregated satellite models for analyzing the sector for which they are responsible. Indeed, complementing analytic tools with good economic intuition, developed over time, is crucial. Iterations, relying on both formalized tools and intuition, represent the best practice to quantify a sound, analytically based expert judgment to be incorporated into the baseline forecast or alternative scenarios. Sectoral experts therefore need to constantly monitor their sector-specific developments, be up to date on relevant research, network with other experts at other policymaking institutions and the private sector, and continuously deepen and broaden their knowledge. Hence, judgment is not something loosely applied, like a gut feeling; it needs to be disciplined, based on thorough analysis and in-depth sectoral knowledge of the economy.

The insight provided here informs the design of the FPAS in terms of both analytical tools and processes. In particular, it informs:

  • The design and role of the core medium-term forecasting and policy analysis model as further elaborated on in the section titled “The Core Medium Term Forecasting and Policy Analysis Model.”

  • The design of the NTF framework and its role in the FPAS. Its main role in the process is twofold: to inform the assessment of the current conditions and to improve near-term forecast quality by analyzing various segments of the economy (one to two quarters ahead). The NTF-based forecasts of selected variables are typically imposed on the medium-term forecasts produced with the core projection model, allowing the model-based medium-term forecasts for those variables to “kick in” only after the first one or two quarters of the forecast horizon. Consequently, the forecasts of the other variables are conditioned on the NTF-based forecasts for those quarters (see the section titled “Sectoral Analysis, Nowcasting, and Near-Term Forecasting Tools” for a further elaboration on this). The final forecast and associated policy recommendation are essentially a combination of two elements: (1) expert judgment supported by analytical tools, including models for near-term forecasting; and (2) forward-looking, structural model-based forecasting and policy analysis complemented with sectoral analysis and judgment for the medium term.

  • The organization and structuring of the forecasting, policy analysis, and decision-making processes. A well-structured process and close interaction not only between specialized teams of the central bank, but also between the staff and policymakers are needed to ensure quality and consistency of the forecast and policy analysis and the policy decisions. The internal interaction, collaboration, and communication should ensure that policymakers understand, trust, and rely on the presented analysis and that it serves their needs. Preparing the policy documents (forecasts and policy analysis, materials supporting the policy decision, and communication materials) requires close interaction among the contributing analytic teams (the sector experts responsible for the near-term forecasts, the core medium-term modelers responsible for the medium-term forecasts, and the communications experts). It also requires close interaction with other parts of the central bank, including market operations, banking supervision, and/or financial stability. Ideally, discussions should lead to a consensus between staff and the policymakers regarding key underpinnings of the forecast and policy analyses, namely:

    • ■ The assessment of the current economic situation;

    • ■ The external and other exogenous assumptions underpinning the forecast;

    • ■ The fundamental drivers of the likely developments over the medium term that policy may have to respond to;

    • ■ The pace and magnitude of the required policy adjustment given the current economic circumstances and reflecting the policymakers’ preferences as well as concerns not captured by the model;13 and

    • ■ The policy stance underpinning the published forecasts (accurately reflected in implemented policy).

Figure 4. The Forecasting and Decision-Making Process

Source: Holub (2011).

These points are further elaborated in the section titled “The Forecasting Process: Incorporating the FPAS into the Decision-Making Process.”

  • The need for and design of forecast evaluation toolkits. A toolkit for analyzing forecast revisions between forecasting rounds and ex post forecast errors is an essential part of a well-designed FPAS. It facilitates learning, accountability, and building trust in the process and underlying policy advice. When evaluating forecast errors, it is important to assess separately each of the potential sources for the errors. Forecast evaluation involves the assessment of the initial conditions, the impact of other macroeconomic policies, the external outlook, model structure and parameters, and the contribution of expert judgment tunes. This requires thorough documentation of the quantification of the sectoral experts’ tunes to be able to run ex post model simulations with and without expert judgment incorporated.
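As an illustration of the bookkeeping this requires, the sketch below records, for each forecasting round, the external assumptions, judgment tunes, and baseline paths so that errors can later be computed and the forecast re-run with and without judgment. The class and field names are illustrative assumptions, not a prescribed design.

```python
# Minimal sketch of the record-keeping needed for ex post forecast evaluation:
# each forecasting round stores its external assumptions and judgment tunes so
# that the forecast can later be re-run with and without judgment.
# All field names and sample values are illustrative assumptions.

from dataclasses import dataclass, field


@dataclass
class ForecastRound:
    round_id: str                                              # e.g. "2021Q3"
    external_assumptions: dict = field(default_factory=dict)   # foreign GDP, oil price, ...
    judgment_tunes: dict = field(default_factory=dict)         # expert add-factors by variable/quarter
    baseline_forecast: dict = field(default_factory=dict)      # variable -> list of quarterly values


def forecast_errors(round_: ForecastRound, outturns: dict) -> dict:
    """Compare the stored baseline against realized data, horizon by horizon."""
    errors = {}
    for var, path in round_.baseline_forecast.items():
        actual = outturns.get(var, [])
        errors[var] = [f - a for f, a in zip(path, actual)]
    return errors


rnd = ForecastRound("2021Q3",
                    external_assumptions={"oil_price_usd": 70.0},
                    judgment_tunes={"cpi_inflation": {"2021Q4": 0.3}},
                    baseline_forecast={"cpi_inflation": [5.0, 4.6, 4.1]})
print(forecast_errors(rnd, {"cpi_inflation": [5.4, 4.9]}))
```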

B. Developing FPAS Capacity: Scope of CD Activities

Because a successful FPAS is more than simply a model, FPAS CD focuses also on more than just model development. A successful framework requires at least three basic components: (1) a set of reliable and well-designed analytical tools for economic and policy analysis, (2) staff trained in how to use such tools (including the application of expert judgment), and (3) a policy framework and decision-making process in which the output of this process is used in a systematic way to provide policymakers with the alternative policy choices. FPAS CD revolves around these three broad functions.

Accordingly, assisting CD recipients with building the FPAS tools is just one part of CD activity. Sustainability requires that the staff of the recipient central bank has a solid understanding both of how to use these tools, and of how to maintain and develop them further. Ensuring that staffing is sufficient, and that the staff are allowed to spend adequate time using and maintaining the analytic framework, also requires that the management of recipient central banks find the tools useful for their decision-making and regularly rely on the analytic inputs from the staff. For that reason, many of the CD projects have spent as much time on training the staff responsible for forecasting and modeling in using the analytic framework, and on helping the management to guide the forecasting, policy analysis, communication, and decision-making-related processes, as they have spent on building the tools.

To ensure that the forecasting and policy analysis toolkit is well maintained and has institutional traction, CD projects have advised on the following:

  • ■ Staffing and organization

  • ■ Data storage and data management14

  • ■ Collecting data on the external sector

  • ■ Forecasting calendars and structuring of the forecast process, including staff forecasting meetings and policy meetings between the staff and the policymakers (such as the monetary policy committee (MPC) or executive board)

  • ■ Coordination with other economic agencies (for example, ministry of finance, ministry of economy) in sharing projections and analyses

  • ■ Internal communication, including briefings organized for other parts of the central bank, interactions between technical staff and the decision makers and preparation of policy materials and presentations

  • ■ External communication, including the drafting of press releases and MPRs and conducting press conferences

Relatedly, CD projects have helped central bank staff in developing both individual and institutional capacity. This has included deepening their understanding of macroeconomics and monetary policymaking; time-series and structural macroeconomic models; the use of various software, such as Excel, EViews, Matlab, and related analytic toolboxes; and policy tradeoffs. CD projects have also helped central bank staff (1) run mock forecasting and policy rounds and conduct their first actual forecasting rounds, (2) implement tools and procedures for systematic evaluation of the model-based forecast performance, and (3) undertake outreach activities and make presentations to other parts of the central bank, to central bank management, and to the monetary policy decision-making body (the board or MPC). They have also helped design templates for the central bank’s first MPRs, its press releases (monetary policy statements), and its presentations for the monetary policy press conference.

Establishing a sustainable and functional FPAS typically requires sustained and intensive CD over multiple years. The first phase of a typical project for developing an FPAS in a low- or lower-middle-income country with limited analytic background is typically planned for two years, with three to four missions per year. The missions must be frequent to keep the momentum. Regular visits ensure that the project develops tools and capacity that could be put to use within a reasonably short timeframe. At the same time, sufficient time in between missions is required for central bank staff to implement the recommendations. Follow-up missions may be contingent on the authorities’ progress in achieving organizational milestones. Many projects have included specific tasks and training exercises, to be undertaken by the central bank staff in between the missions, as well as provided for online interactions between them and the CD experts. IMF classroom-style and online courses also complemented CD activities, including the recently released IMF online course on Monetary Policy Analysis and Forecasting.15

3. The Forecasting and Policy Analysis Toolkit

A. Sectoral Analysis, Nowcasting, and Near-Term Forecasting Tools

A well-designed framework for sectoral analysis and NTF includes models and other tools, which in addition to having forecasting ability are suitable for quantifying expert judgment. Sectoral analysis and the NTF framework serve as a judgment support system. Their purpose is not only to provide an input for the medium-term forecast; foremost, it is to ensure that all relevant information is extracted from available data and to provide an assessment of the current state of the economy. For the latter purpose, it is important that tools for filtering out the signal from high-frequency data are available. It is equally important that staff have expert knowledge of the sectors they cover and closely follow and analyze the most recent developments (that is, the high-frequency movements in the data), including with the help of qualitative information. Use of standard seasonal adjustment tools for decomposing time series into their seasonal, irregular, and trend-cycle components and simple univariate filtering tools for separating the longer-term trend from the cyclical component is an integral part of this. A good understanding of current economic developments and an assessment of initial conditions is, as noted previously, a precondition of a successful forecast and policy analysis.

While assistance on sectoral analysis and NTF was an integral part of most projects, compared to medium-term model development fewer resources were often devoted to these CD efforts, at least initially. The reasons for this were twofold. First, this was an area where the recipient central banks often had stronger expertise, with reasonably well-established tools and procedures and staff with good sector-specific knowledge at the outset of the projects. Second, the more pressing need was often to enhance the recipients’ overall macroeconomic analysis capacity, notably shifting their focus from analysis of individual time series to the economy as a whole and to identifying the policy stance that would achieve their policy objective(s). That said, most CD projects assisted with building or enhancing existing univariate and multivariate time series models. CD projects also helped with construction of coincident and leading indicators, as well as conducting, processing, and analyzing business (sentiment) surveys.

NTF frameworks tend to be primarily focused on the key macroeconomic variables, such as GDP and inflation.16 In addition to models developed for forecasting these variables, sectoral experts use various tools to analyze data on an aggregated as well as disaggregated level to acquire a good understanding of the variable/sector of their responsibility. Typically, every new data observation is analyzed in detail, compared against the most recent forecast, and forecast errors are calculated and analyzed. This is both to identify the source of those forecast errors and over time improve the forecasts and forecasting methods, and to analyze the most recent developments in the data. Special emphasis is put on differentiating between the signal of underlying developments in the data and noise. Often, internal memos containing this analysis are circulated to the policymakers, to keep them constantly abreast of staff’s interpretation of the new data and of how the earlier forecast and analysis fared.

There is no unique, benchmark modeling setup for sectoral analysis and NTF tools.17 CD projects have helped recipient central banks develop high-frequency time series models, including single-equation autoregressive integrated moving average, principal component, vector autoregressive, vector error correction, Bayesian vector autoregressive, Bayesian vector error correction, factor vector autoregressive, mixed-data sampling, and bridge equation models. Using a suite of models and tools to forecast each of the main variables, or supporting judgment-based expert projections, is best practice. However, many of the FPAS TA recipients have thus far not developed such a suite of NTF models. As of now, most have only a few models, and some countries are in the process of developing more elaborate suites of models for supporting their NTF frameworks. Where officials possessed limited expertise in this area, CD focusing on basic time series analysis was provided. In countries with limited availability of high-frequency data to support economic analysis, nowcasting, and NTF, projects assisted with developing coincident and leading indicators and the processing and use of qualitative data (consumer and business sentiment data).
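The following is a minimal sketch of such a suite, assuming Python with statsmodels: an ARIMA model and a small VAR are fitted to simulated quarterly GDP growth and a sentiment indicator, and their short-horizon forecasts are averaged. The data, lag orders, and equal weights are illustrative assumptions rather than a recommended specification.

```python
# Minimal sketch of a small NTF "suite": an ARIMA model and a VAR are fitted to
# quarterly GDP growth and their one- and two-quarter-ahead forecasts averaged.
# The data are simulated placeholders; series names and lag orders are
# illustrative assumptions.

import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
idx = pd.period_range("2010Q1", periods=48, freq="Q")
data = pd.DataFrame({
    "gdp_growth": 4.0 + rng.normal(0, 1.0, 48),
    "sentiment": 50.0 + rng.normal(0, 5.0, 48),
}, index=idx)

# Model 1: univariate ARIMA(1,0,1) for GDP growth.
arima_fc = np.asarray(ARIMA(data["gdp_growth"], order=(1, 0, 1)).fit().forecast(steps=2))

# Model 2: VAR(1) using the business sentiment survey as an extra indicator.
var_fc = (VAR(data).fit(maxlags=1)
          .forecast(data.values[-1:], steps=2)[:, 0])   # first column = gdp_growth

# Simple average of the suite; in practice the weights would reflect past forecast
# accuracy, and the result would be confronted with sectoral experts' judgment.
print("NTF for GDP growth, next two quarters:", (arima_fc + var_fc) / 2)
```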

Although nowcasting primarily relies on high-frequency data, such models are often complemented by analytic tools based on lower-frequency data or mixed-frequency data. The aim is to adequately cover all variables and utilize all relevant information. Lower-frequency data are typically less noisy and have a broader coverage. The analytical work of both sectoral experts and structural modelers includes the assessment of the impact of unobserved variables, such as potential output or the natural rate of interest, on the economy. The unobserved variables are often estimated by means of univariate or multivariate statistical filters. The most frequently used univariate filters include the Hodrick-Prescott and band-pass filters. The multivariate Kalman filter is widely used for estimating unobserved trends and gaps that are part of structural models. As these variables are unobserved and their estimates are fraught with uncertainty, the use of different methodologies for estimating them is not only advisable but also in line with best practice. In addition, satellite models relying on lower-frequency data may be used to estimate trends or form expert judgment regarding their future path. One example would be labor market trends, which are often based on demographic trends and labor market policies.
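As a minimal sketch of the univariate approach, the snippet below applies the Hodrick-Prescott filter from statsmodels to a simulated (log) GDP series to split it into a trend and a gap. The series and the standard quarterly smoothing parameter are illustrative assumptions.

```python
# Minimal sketch of trend-gap decomposition with a univariate filter, here the
# Hodrick-Prescott filter from statsmodels (lambda = 1600 for quarterly data).
# The simulated series is a placeholder for, say, log real GDP.

import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(1)
idx = pd.period_range("2008Q1", periods=56, freq="Q")
log_gdp = pd.Series(np.cumsum(0.01 + rng.normal(0, 0.008, 56)), index=idx)

gap, trend = hpfilter(log_gdp, lamb=1600)   # returns the cyclical component, then the trend

print("Latest output gap estimate (percent):", round(100 * gap.iloc[-1], 2))
# In practice several methods (band-pass filters, multivariate Kalman filters) are
# compared, and the end-of-sample estimates are cross-checked with judgment,
# because trend estimates are most uncertain at the end of the sample.
```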

There is a risk, when developing an NTF framework, of focusing too much on technically sophisticated tools at the cost of building the sectoral expertise needed to support the use of judgment. Even highly sophisticated tools will deliver inferior forecasting performance when they are used mechanically (hiding behind the model), without combining them with expert judgment. Models cannot substitute for the evaluation of a wide range of sectoral information by sectoral specialists to form their informed judgment. Judgment is based on experience built over time and good knowledge of the economy, combined with an evaluation of the idiosyncratic factors that often drive near-term developments.18 Decomposing time series into their seasonal, irregular, and trend-cycle components using standard seasonal adjustment programs helps identify the idiosyncratic factors (the irregular component) and the underlying factors (the trend-cycle component) that may be driving near-term developments, thereby supporting both the use of judgment and the modeling of the systematic aspects of the series.

Experience shows that recipient central banks had in some cases a relatively solid understanding of the advanced tools (because they are taught at most universities) but a poorer understanding of the basics of “reading” time series. For example, skilled analysts can distill a narrative from charts, simple averages over time, seasonal patterns (and seasonally adjusted data), and standard time series decompositions. There were cases when this basic craftsmanship was neglected in favor of more complex mathematical models. As a result, the standard tools for seasonally adjusting and decomposing time series, such as the Census Bureau’s X-13ARIMA-SEATS, were often underutilized, and expert knowledge reflecting a good understanding of the data was replaced with highly aggregated, econometrically poorly specified NTF tools. Examples of the latter include the direct modeling of the headline 12-month rate of change in a variable, which by construction introduces autocorrelated errors with long memory and can easily result in spurious relationships. It also does not support a proper understanding of the current economic situation and the underlying driving forces in the near term, nor does it support the incorporation of judgment into the forecast.19
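The sketch below illustrates this basic craftsmanship, using STL from statsmodels as a stand-in for X-13ARIMA-SEATS: a simulated monthly price index is decomposed into trend-cycle, seasonal, and irregular components, and the annualized month-on-month change of the trend-cycle is contrasted with the slower-moving 12-month rate of change. All series and numbers are simulated placeholders.

```python
# Minimal sketch of "reading" a monthly price index: decompose it into trend-cycle,
# seasonal, and irregular components (STL here, as a stand-in for X-13ARIMA-SEATS)
# and compare the month-on-month trend signal with the backward-looking 12-month
# rate of change. The simulated index is a placeholder.

import numpy as np
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(2)
n = 84
seasonal = np.tile([0.4, -0.2, 0.1, 0.0, -0.3, 0.2, 0.5, -0.1, 0.0, -0.2, -0.3, -0.1],
                   n // 12) / 100
log_cpi = np.cumsum(0.004 + rng.normal(0, 0.002, n)) + seasonal   # simulated monthly log CPI

res = STL(log_cpi, period=12, robust=True).fit()

# Annualized month-on-month change of the trend-cycle component: the "signal".
mom_trend_annualized = 100 * 12 * (res.trend[-1] - res.trend[-2])
# Headline 12-month rate of change: easy to communicate, but it averages over the
# past year and therefore reacts slowly to turning points.
yoy = 100 * (log_cpi[-1] - log_cpi[-13])

print(f"m/m trend, annualized: {mom_trend_annualized:.1f} pct; 12-month change: {yoy:.1f} pct")
```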

The nowcasts and one- to two-quarter-ahead forecasts prepared using the NTF tools and judgment are imposed on the medium-term core projection model forecast. There are several ways to accomplish this. The nowcasts of the most recent developments can be treated as if they were observed data, to be used for the QPM-based forecast and policy analysis exercise. This allows the decomposition of the “observed” data into trend and cycle, the assessment of the initial conditions, and the medium-term projection to be based on a complete data set. The QPM-based forecasts can also be conditioned on the NTFs by treating the current and the next quarter(s) NTF-based forecast of any selected variable as fixed, using one of the following three approaches:

  • One-to-one conditioning: One endogenous model variable (say, GDP) is made exogenous, with exactly one type of shock assigned to accommodate it (in the case of GDP, for example, the output gap shock). The value of the shock is calculated so that the resulting output gap takes the value required to reproduce the desired GDP growth for a given value of potential GDP. This is a determined system (one-to-one), where a single unique solution exists (a minimal numerical sketch of this case follows the list).

  • Weighted conditioning: One endogenous model variable (for example, GDP) is made exogenous using more than one type of shock (for example, a potential output shock and an output gap shock). Because this results in an underdetermined system, as an infinite number of combinations of the two shocks can give rise to the desired value of the now exogenous model variable (GDP), we need to specify a fixed proportion in which the two shocks will move relative to each other. For instance, we may constrain the potential output shock to be one-fifth of the output gap shock (because the gap is likely to move much more than potential output). Given this extra condition, the system becomes determined, and a single unique solution exists for the values of the two shocks that reproduce the desired value of GDP. Alternatively, it is possible to estimate the weights for the structural shocks based on Bayesian estimation as shown in Waggoner and Zha (1999).

  • Two-stage conditioning: First, within the filtering procedure, the NTF is treated as extra observed data points (an outcome), the series are extended with these extra observations, and the filter runs all the way until the end of the NTF. The trend and cycle components of the variable (GDP) calculated using the filter (potential output and the output gap) are based on the extra information from the NTF. Then, one-to-one conditioning is performed. The two model variables (potential output and the output gap) are then made exogenous using exactly two types of shocks (a potential output shock and an output gap shock). This results in a determined system with a single unique solution for the shocks to reproduce the desired values of potential output, the output gap, and GDP growth.
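The following is a minimal numerical sketch of the one-to-one case under a deliberately simplified, backward-looking gap equation; in the actual QPM the shock is backed out within the full model solution, and all parameter values here are illustrative assumptions.

```python
# Minimal sketch of one-to-one conditioning: the NTF value for GDP growth is imposed
# on the model by backing out the single output gap shock that reproduces it, holding
# potential growth fixed. The simplified backward-looking gap equation and all numbers
# are illustrative assumptions (the actual QPM gap equation is forward-looking and is
# solved jointly with the rest of the model).

def one_to_one_conditioning(ntf_gdp_growth: float,
                            potential_growth: float,
                            lagged_gap: float,
                            a1: float = 0.7,
                            mci_contribution: float = -0.2) -> tuple[float, float]:
    """Return (output gap, output gap shock) consistent with the NTF growth rate."""
    # Quarterly growth identity: g_t = potential growth + (gap_t - gap_{t-1}).
    required_gap = lagged_gap + (ntf_gdp_growth - potential_growth)
    # Simplified gap equation: gap_t = a1 * gap_{t-1} + mci_contribution + shock.
    shock = required_gap - a1 * lagged_gap - mci_contribution
    return required_gap, shock


gap, eps = one_to_one_conditioning(ntf_gdp_growth=1.2, potential_growth=0.9, lagged_gap=-0.5)
print(f"implied output gap: {gap:.2f} pct, required shock: {eps:.2f}")
```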

B. The Core Medium-Term Forecasting and Policy Analysis Model

Macroeconomic models for monetary policy analysis are designed to describe the interactions of key macroeconomic variables over the medium term. The main purpose is not to produce a forecast understood as the best guess of the values of the main variables. “In practice, models do not produce the forecast, economists do. What models can do is to provide a coherency check on the judgment that produces the main forecast. They allow the systematic analysis of risks to the forecast, including sensitivity to various assumptions, shocks, and policy responses. Most importantly, they provide a framework that can help to ask the right questions.” (Berg, Karam, and Laxton 2006)

Although there is no unique core forecasting or “FPAS” model, most central banks have started out with one relatively simple New Keynesian semistructural quarterly gap model. Some have subsequently supplemented it or replaced it with a full-fledged dynamic stochastic general equilibrium (DSGE) model.20 While some central banks with substantial resources and extensive modeling expertise have a suite of models to support their medium-term forecasting and policy analysis activities,21 most central banks, and especially those at early stages of developing their analytical capacity, have found it better to focus on only one model and to keep that model relatively simple. That allowed them to use more of their scarce resources on building the infrastructure and processes needed to ensure effective and transparent use of it. As a result, they were able to relatively quickly put the FPAS and the core forecasting model into regular use supporting their policymaking.

Consistent with this, CD projects largely focused on helping the recipient central banks develop similar semistructural gap models, referred to as the QPM after the earlier but more complex Bank of Canada model.22 The QPM is a semistructural model and a loose shortcut for a fully structural model derived from optimization of all economic agents in a DSGE setup. However, in contrast to DSGE models, coefficients of the QPM class of models are not necessarily functions of structural parameters, and the model structure does not satisfy all relevant market clearing conditions and stock-flow consistency. The QPM is a gap model, meaning it explicitly distinguishes equilibria, or “trends,” driven by long-term fundamentals from the deviations from these equilibria, known as the “gaps,” which are interpreted as business cycle fluctuations. The semistructural nature (that is, exclusion of some restrictions) makes these models more flexible in terms of replicating the data and adding country-specific features compared to fully micro-founded models. However, they do not strictly adhere to micro foundations, and certain features are hardwired, while in reality the parameters, or the full impact of a shock, would depend on changes in the structure of the economy or the nature of the shock.23

There are several reasons for choosing a semistructural gap model as the core policy analysis model. First, the model has to be sufficiently simple so that it can be built and put into use relatively quickly (typically within the first or second year of the FPAS project,24 in countries that often have relatively limited expertise and experience in building and using models for policymaking). Second, it has to be forward-looking with a clear role for (endogenous) monetary policy, based on a solid theoretical foundation, and facilitate incorporation of expert judgment and “tunes” from other sources.25 Third, the model must be sufficiently malleable to fit reasonably well the particular features and dynamic properties of the economy, and facilitate the “story-telling” embodied in the forecasts and policy recommendations to both policymakers and the public at large.

The semistructural QPM meets these requirements. The basic version is linear, which keeps its operation and maintenance relatively simple. It is malleable in structure, which makes it easier to operate than full-fledged DSGE models and helps to match observed historical data without losing theoretical consistency. It is fairly easy to understand, yet it still embodies the main principles of monetary policymaking and general equilibrium, such as long-term monetary neutrality. As a New Keynesian model, it features both nominal and real rigidities. It also incorporates an easy-to-understand mechanism with a solid theoretical foundation that drives inflation over the business cycle—the fluctuations of real variables (such as output) around their long-term trends. All shocks are orthogonal, and each equation has an economic interpretation. The model’s parameterization is flexible in order to be able to bring the model to the data and take into account country-specific characteristics of the economy. The key parameters are almost always calibrated rather than estimated, as discussed in the “Calibration of the QPM” section. Experience shows that the limitations of models of the QPM class can be mitigated in most cases by applying expert judgment.

In addition, the QPM has several properties specifically suitable for supporting central banks’ regular monetary policy decisions. First, the implementation of expert judgment and tunes from other sources is easy. Second, the model is forward-looking, allowing the central bank to react to future events in a consistent and predictable manner. Third, the long-term trends reflect real processes beyond the reach of monetary policy, while the business cycle developments (“gaps”) are of particular relevance for the monetary policy decisions.

Stepping up the modeling complexity by building a fully micro-founded DSGE model and training the staff to operate, interpret, and explain its results can take substantial time. Furthermore, despite the complexity and effort needed, a fully fledged DSGE model does not necessarily provide better forecast performance or better support for policy analysis and policy decisions.26 There is a substantial risk that the complexity of the fully micro-founded DSGE model could hamper internal discussions both about the monetary policy issues and about the other macroeconomic policy issues of interest, as well as potentially complicating external communication. The use of more complicated models also makes it harder to cope with staff turnover because more specialized expertise is needed and more specific human capital is lost when a team member leaves. An attendant risk is that the model would end up being used as a “black-box,” obviating its role in providing economic insights to supplement judgment and statistical analysis.

Figure 5. Basic Structure of the Textbook QPM

Source: IMF monetary policy analysis and forecasting course lecture material.
Note: The IS curve depicts the set of all levels of interest rates and output at which total investment equals total savings; UIP = uncovered interest parity.

While the individual country models follow a similar structure, referred to as “the QPM,” there is variation among their specifications. The textbook version in the following discussion has typically been amended to (1) incorporate the demand impact of changes in fiscal policy (see “Incorporating the Impact of Changes in Fiscal Policy”), (2) incorporate terms of trade (TOT) effects (see “Incorporating Terms of Trade Effects”), (3) account for supply shocks and administered prices by having multiple Phillips curves in the model (see “Addressing Domestic Supply Shocks and Adding Multiple Phillips Curves”), (4) add a money block to assist money targeting (MT) central banks or central banks that are transitioning from MT (see “Incorporating Money and Money Targeting in the QPM”), (5) account for exchange rate persistence and managed or pegged exchange rate regimes (see “Incorporating Money and Money Targeting in the QPM”), (6) address persistent deviations of market rates from the policy rate, and (7) incorporate the impact of central bank credibility on inflation expectations (see “Central Bank Credibility and Expectations”). Some projects have also added a labor market block, a more detailed balance of payments structure, and the impact on the economy of dollarization and wage policy.

C. The Basic “Textbook” Version of the QPM

The core behavioral part of the standard QPM comprises the following:27

  • An aggregate demand curve/open economy IS curve, with the output gap (\hat{y}_t)^{28} depending on its past and expected future values; the real monetary conditions index (mci_t), in the standard version measured as a weighted average of the deviation of the real interest rate (\hat{r}_t) from its neutral (noninflationary) equilibrium level (r^n_t) and the deviation of the real effective exchange rate (\hat{z}_t)^{29} from its equilibrium level; the foreign output gap (\hat{y}^*_t); and an aggregate demand shock (\varepsilon^{\hat{y}}_t):

    \hat{y}_t = a_1 \hat{y}_{t-1} + a_2 E_t \hat{y}_{t+1} - a_3\, mci_t + a_4 \hat{y}^*_t + \varepsilon^{\hat{y}}_t    (1)

    mci_t = a_{a1} \hat{r}_t + (1 - a_{a1})(-\hat{z}_t)    (2)

  • An aggregate supply curve/open economy forward-looking Phillips curve, with inflation (\pi_t)^{31} depending on its past and expected future values; real marginal costs (rmc_t), where the real marginal cost of domestic producers is approximated by the output gap and the real marginal cost of imports by the exchange rate gap; and cost-push shocks (\varepsilon^{\pi}_t):^{32,33}

    \pi_t = b_1 \pi_{t-1} + (1 - b_1) E_t \pi_{t+1} + b_3\, rmc_t + \varepsilon^{\pi}_t    (3)

    rmc_t = b_{b1} \hat{y}_t + (1 - b_{b1}) \hat{z}_t    (4)

  • The uncovered interest parity (UIP) condition, which relates the nominal exchange rate (s_t) to its expected future value (E_t s_{t+1}), domestic and foreign nominal interest rates (i_t and i^*_t), and a risk premium (prem_t). The exchange rate is defined as units of domestic currency per one unit of foreign currency.

    s_t = E_t s_{t+1} + (i^*_t - i_t + prem_t)/4 + \varepsilon^{s}_t    (5)

  • The monetary policy rule is specified as a forward-looking monetary policy reaction function aimed at stabilizing inflation, where the policy stance, represented by a short-term interest rate, depends on the equilibrium nominal interest rate (i^n_t),^{34} the expected deviation of inflation from the target (E_t \pi_{t+4} - \pi^T_{t+4}),^{35} and the output gap:^{36}

    i_t = c_1 i_{t-1} + (1 - c_1)\,(i^n_t + c_2 (E_t \pi_{t+4} - \pi^T_{t+4}) + c_3 \hat{y}_t) + \varepsilon^{i}_t    (6)
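To illustrate how such a linear, forward-looking gap model can be simulated, the sketch below solves a deliberately simplified closed-economy variant of equations (1), (3), and (6), with no exchange rate block and a one-quarter-ahead inflation term in the policy rule, by stacking all periods into a single linear system and imposing steady-state terminal conditions. The calibration and the solver are illustrative assumptions, not the code or parameter values used in actual FPAS projects.

```python
# Minimal sketch: deterministic simulation of a simplified closed-economy gap model
# with model-consistent expectations, solved by stacking all periods into one linear
# system. All parameter values and initial conditions are illustrative assumptions.

import numpy as np

T = 20                      # simulation horizon (quarters)
a1, a2, a3 = 0.5, 0.1, 0.3  # IS curve
b1, b3 = 0.6, 0.3           # Phillips curve
c1, c2, c3 = 0.7, 1.5, 0.5  # policy rule
r_n, pi_T = 1.0, 5.0        # neutral real rate and inflation target (percent, annualized)
y0, pi0, i0 = 0.0, 7.0, 6.0 # initial conditions for the gap, inflation, and the policy rate
eps_pi = np.zeros(T); eps_pi[0] = 2.0   # one-off cost-push shock in the first quarter

def iy(t):  return t            # position of the output gap in period t
def ipi(t): return T + t        # position of inflation
def ii(t):  return 2 * T + t    # position of the policy rate

A = np.zeros((3 * T, 3 * T)); b = np.zeros(3 * T)

for t in range(T):
    # IS curve: y_t = a1*y_{t-1} + a2*y_{t+1} - a3*(i_t - pi_{t+1} - r_n)
    r = iy(t)
    A[r, iy(t)] = 1.0; A[r, ii(t)] = a3; b[r] = a3 * r_n
    if t > 0: A[r, iy(t - 1)] = -a1
    else:     b[r] += a1 * y0
    if t < T - 1: A[r, iy(t + 1)] = -a2; A[r, ipi(t + 1)] = -a3
    else:         b[r] += a3 * pi_T        # terminal conditions: gap = 0, inflation = target

    # Phillips curve: pi_t = b1*pi_{t-1} + (1-b1)*pi_{t+1} + b3*y_t + eps_pi
    r = ipi(t)
    A[r, ipi(t)] = 1.0; A[r, iy(t)] = -b3; b[r] = eps_pi[t]
    if t > 0: A[r, ipi(t - 1)] = -b1
    else:     b[r] += b1 * pi0
    if t < T - 1: A[r, ipi(t + 1)] = -(1 - b1)
    else:         b[r] += (1 - b1) * pi_T

    # Policy rule: i_t = c1*i_{t-1} + (1-c1)*(r_n + pi_{t+1} + c2*(pi_{t+1} - pi_T) + c3*y_t)
    r = ii(t)
    A[r, ii(t)] = 1.0; A[r, iy(t)] = -(1 - c1) * c3; b[r] = (1 - c1) * (r_n - c2 * pi_T)
    if t > 0: A[r, ii(t - 1)] = -c1
    else:     b[r] += c1 * i0
    if t < T - 1: A[r, ipi(t + 1)] = -(1 - c1) * (1 + c2)
    else:         b[r] += (1 - c1) * (1 + c2) * pi_T

x = np.linalg.solve(A, b)
print("quarter  gap   inflation  policy rate")
for t in range(8):
    print(f"{t + 1:>7} {x[iy(t)]:>5.2f} {x[ipi(t)]:>10.2f} {x[ii(t)]:>12.2f}")
```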

Little structure is imposed on the long-term evolution of the equilibrium variables, or real trends. Besides the restrictions implied by definitional relationships between them, the long-term trend/equilibrium rate changes (\Delta\bar{x}_t) are generally modeled as autoregressive processes that converge to their steady-state rates (\Delta\bar{x}_{ss}):

\Delta\bar{x}_t = h_x \Delta\bar{x}_{t-1} + (1 - h_x) \Delta\bar{x}_{ss} + \varepsilon^{\Delta\bar{x}}_t    (7)

In practice, univariate or multivariate filtering techniques are used to estimate long-term trends, captured by equation (7), for the historic part of the data sample. They might be altered by expert tunes, should additional analysis and/or sectoral specialist information justify this alteration. Over the forecast horizon, these trends either are assumed to converge to the steady state or they can be projected outside of the model based on expert judgment and/or alternative forecasting methods.
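A minimal sketch of how equation (7) is used over the forecast horizon: an equilibrium growth rate is iterated forward from its last filtered value toward an assumed steady state. The numbers are illustrative assumptions.

```python
# Minimal sketch of equation (7): a trend growth rate converges from its last
# filtered value toward its steady state at speed (1 - h). Numbers are illustrative.

def project_trend_growth(last_value: float, steady_state: float,
                         h: float, horizon: int) -> list[float]:
    """Iterate delta_x_bar(t) = h*delta_x_bar(t-1) + (1-h)*delta_x_bar_ss."""
    path, current = [], last_value
    for _ in range(horizon):
        current = h * current + (1 - h) * steady_state
        path.append(current)
    return path


# Potential output growth converging from 3.2 percent toward a 4.0 percent steady state.
print([round(g, 2) for g in project_trend_growth(3.2, 4.0, h=0.8, horizon=8)])
```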

D. Incorporating the Impact of Changes in Fiscal Policy

The main approach for capturing the impact of the fiscal policy stance on monetary policy and inflation has been to add a fiscal impulse variable to the aggregate demand curve/open economy IS curve in equation (1), as in equation (1b). It is possible to include a more detailed fiscal block to also capture the effects of changes in government debt on the country risk premium, as well as an explicit modeling of fiscal revenues and expenditures. Neither of these expansions has been included in any of the country projects to date. Adding a full stock-flow framework would be more difficult, though, as the QPM is at its core a flow model.37

\hat{y}_t = a_1 \hat{y}_{t-1} + a_2 E_t \hat{y}_{t+1} - a_3\, mci_t + a_4 \hat{y}^*_t + a_5\, fimp_t + \varepsilon^{\hat{y}}_t    (1b)

The specification of the fiscal impulse variable has varied. The most commonly used approach decomposes the fiscal deficit ratio38 into (1) a cyclical part (def^{gap}_t) that captures the effect of automatic stabilizers on the deficit and mirrors the position of the output gap (def^{gap}_t = f_2 \hat{y}_t); (2) a long-term structural component (def^{str}_t); and (3) an idiosyncratic, or discrete, component (def^{sh}_t), with the fiscal impulse variable defined as either the discrete component or as the change in the structural and discrete components:39

fimp_t = def^{sh}_t - def^{sh}_{t-1} + f_3 (def^{str}_t - def^{str}_{t-1})    (8)
def^{str}_t = f_1 def^{str}_{t-1} + (1 - f_1) \overline{def^{str}} + \varepsilon^{def^{str}}_t    (9)
def^{sh}_t = f_4 def^{sh}_{t-1} + \varepsilon^{def^{sh}}_t    (10)
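A minimal sketch of equation (8), assuming illustrative values for f3 and for the deficit components; in practice the decomposition itself is produced by the fiscal sector expert.

```python
# Minimal sketch of equation (8): the fiscal impulse is computed from the change in
# the discrete and structural components of the deficit ratio. All values (f3 and
# the sample deficit paths) are illustrative assumptions.

def fiscal_impulse(def_sh, def_sh_lag, def_str, def_str_lag, f3=0.5):
    """fimp_t = (def_sh_t - def_sh_{t-1}) + f3 * (def_str_t - def_str_{t-1})."""
    return (def_sh - def_sh_lag) + f3 * (def_str - def_str_lag)


# Example: a one-off discretionary stimulus of 1 percent of GDP in the current year
# and a 0.4 percentage point widening of the structural deficit.
print("fiscal impulse (pct of GDP):", fiscal_impulse(1.0, 0.0, 3.4, 3.0))
```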

The fiscal impulse variable has also in some cases been specified as the deviation of real fiscal expenditures (rfx_t) from their long-term (sustainable) trend (\overline{rfx}_t), that is, the fiscal expenditure gap (\hat{rfx}_t), with:

rfx_t = \overline{rfx}_t + \hat{rfx}_t    (11)
\hat{rfx}_t = \alpha \hat{rfx}_{t-1} + \varepsilon_{\hat{rfx},t}    (12)
\overline{rfxy}_t = \beta \overline{rfxy}_{t-1} + (1 - \beta) rfxy_{ss} + \varepsilon_{\overline{rfxy},t}    (13)

E. Incorporating Terms of Trade Effects

The impact of changes in terms of trade (TOT) on real economic activity and policy responses can be large in small open economies. Permanent shocks to TOT usually require a rebalancing of the current account and alter the long-term equilibrium real exchange rate. Given the forward-looking behavior of economic agents in most economies, some of the long-term effects may be frontloaded in current consumption and investment demand. Cyclical TOT fluctuations around its long-term trend may, therefore, impact aggregate demand and the output gap dynamics. Those can be realized through current and permanent income effects, shaping consumption and investment decisions made by households and businesses. Additionally, long-term changes in the TOT might have an effect on the equilibrium real exchange rate as well. Fully modeling the real exchange rate would require also imposing a stock-flow consistency, which is not possible in models with QPM architecture as they are inherently flow models. Still, some country models have included a simplified reduced form to improve the way the real exchange rate equilibrium is determined as well as to capture the impact of TOT fluctuations on the output gap. To keep the models’ structure simple while allowing for incorporation of analytic results obtained by sectoral balance of payments specialists, the TOT block in these models was structured as follows:

Decomposition of the overall TOT into a long-term trend and a cyclical, or gap, component:

$tot_t = \overline{tot}_t + \widehat{tot}_t$ (14)

Little structure is imposed on the evolution of either component; both are modeled as autoregressive processes. Long-term trend in the TOT:

$\Delta\overline{tot}_t = \rho_{\overline{tot}}\Delta\overline{tot}_{t-1} + (1 - \rho_{\overline{tot}})\Delta\overline{tot}^{ss} + \varepsilon_t^{\overline{tot}}$ (15)

The TOT gap:

$\widehat{tot}_t = \rho_{\widehat{tot}}\widehat{tot}_{t-1} + \varepsilon_t^{\widehat{tot}}$ (16)

These equations are only technical simplifications that incorporate the effect of the relative prices of exports and imports into the model, based on analyses prepared outside of the model by the balance of payments sectoral experts. The long-term trend in the real exchange rate is then linked to the TOT trend:

$\Delta\bar{z}_t = \rho_{\Delta\bar{z}}\Delta\bar{z}_{t-1} + (1 - \rho_{\Delta\bar{z}})\Delta z^{ss} + \tau\Delta\overline{tot}_t + \varepsilon_{\Delta\bar{z},t}$ (17)

The standard aggregate demand (output gap) equation is augmented with the TOT gap to capture the positive impact on the output gap of an improvement in the TOT (relative to its long-term evolution):

$\hat{y}_t = a_1\hat{y}_{t-1} + a_2 E_t\hat{y}_{t+1} - a_3 rmci_t + a_4\hat{y}_t^* + a_5\widehat{tot}_t + \varepsilon_{\hat{y},t}$ (1c)

Note that the higher aggregate demand following a TOT improvement can be generated by either the public or the private sector, depending on the share in which these sectors benefit from the TOT change, and that equation (1c) is a reduced-form equation and as such not immune to the Lucas critique. Should, for instance, the terms of trade improvement increase aggregate demand via the fiscal policy channel, any change in the fiscal rule would result in a change in the parameter $a_5$.
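The backward-looking Python sketch below traces how a temporary TOT improvement would feed into the output gap under equations (16) and (1c), ignoring the forward-looking expectation term, the monetary policy response, and all other gap terms; the persistence and impact parameters are illustrative assumptions.

```python
# Stylized, backward-looking sketch of a TOT gap shock propagating via eq. (16) and a
# simplified eq. (1c); rho_tot, a1, a5, and the shock size are illustrative assumptions.
import numpy as np

rho_tot, a1, a5 = 0.7, 0.5, 0.2
T = 12
tot_gap, y_gap = np.zeros(T), np.zeros(T)
tot_gap[0] = 1.0  # TOT 1 percent above trend in the shock period

for t in range(1, T):
    tot_gap[t] = rho_tot * tot_gap[t - 1]            # eq. (16)
    y_gap[t] = a1 * y_gap[t - 1] + a5 * tot_gap[t]   # simplified eq. (1c)

print(np.round(y_gap, 3))
```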

F. Addressing Domestic Supply Shocks and Adding Multiple Phillips Curves

The output gap measure has been based on GDP excluding agriculture, oil, or mining output in many of the country models. Changes in the output gap serve as a proxy for changes in marginal production cost that are passed on to consumers as demand pushes output above or below full capacity—potential output—and should thus be positively correlated with consumer price index (CPI) inflation. Domestic supply shocks, such as harvest shocks, which conceptually are shocks to both actual and contemporaneous potential output (and not to the output gap), may in practice lead to a biased estimate of the output gap and even to estimates of the output gap that are negatively correlated with CPI inflation.40 This risk is especially high when potential output is estimated by simple univariate trend filters. The distortion can be particularly significant in developing countries with large agriculture sectors that provide a sizable share of the domestic food supply, and even more so when restrictions are imposed on exports and imports of staple food items. The existence of a large natural-resource-intensive, export-oriented production sector, whose output is predominantly determined by production capacity or external factors rather than domestic demand, may also render an output gap measure based on total GDP less appropriate as a measure of demand-determined inflationary pressures. Basing the output gap on GDP excluding such sectors is therefore a simple solution to this problem.

Some country models have also featured more than one output gap measure to more fully capture the effect of the domestic supply shocks discussed previously. A bumper harvest that creates excess supply of domestically produced food products (for example, maize in many sub-Saharan African countries) should create downward pressure on food price inflation. It may also temporarily cause exports to rise, the exchange rate to appreciate, and the country risk to decline in countries where agricultural production contributes a large share of GDP.41 The bumper harvest may also stimulate increased demand for nonagriculture output, through both direct channels (for example, growth in industries servicing the agriculture sector) and indirect ones (for example, the resulting boom in disposable incomes). This effect can be large. In some developing countries, where a significant share of the food supply is domestically produced, it is often the main short-term driver of inflation. It can also be a major short-term driver of swings in demand for nonagriculture output when a significant share of the population is employed in the agriculture sector, and thus the (local) income multiplier may be large.

A measure of the temporary deviation of supply-driven output, such as agriculture, from its longer-term trend should be able to capture these supply-side effects. In one case where these effects were particularly dominant, the model explicitly included the agriculture output gap and its impact on the economy. The agriculture output gap ($\hat{y}_t^{agr}$) equation was specified as:

$\hat{y}_t^{agr} = \alpha_1\hat{y}_{t-1}^{agr} + \alpha_2\hat{y}_t^* + \alpha_3\widehat{tot}_t - \alpha_4 rmci_t + \alpha_5 fimp_t + \varepsilon_t^{agr}$ (1d)

This specification is similar to the standard IS curve, but with some additional terms. First, the fiscal impulse is added to approximate the impact of fertilizer subsidies on agriculture output supply ($\alpha_5 fimp_t$). Second, the TOT gap ($\widehat{tot}_t$) is included to capture the effect of changes in world prices of exported crop commodities and, consequently, their impact on total domestic agriculture supply.

To take into account the fact that a bumper harvest typically also stimulates demand for nonagriculture output with a lag, the lagged agriculture output gap ($\beta_4\hat{y}_{t-1}^{agr}$) was also included as one of the main determinants of demand for nonagriculture output:

$\hat{y}_t^{nagr} = \beta_1\hat{y}_{t-1}^{nagr} + \beta_2 E_t\hat{y}_{t+1}^{nagr} - \beta_3 rmci_t + \beta_4\hat{y}_{t-1}^{agr} + \beta_5 fimp_{t-1} + \varepsilon_t^{nagr}$ (1e)

In some countries with a large agriculture sector, the direct impact of domestic food supply shocks on food price inflation was captured by including the agriculture output gap ($\hat{y}_t^{agr}$) in the Phillips curve for food inflation, as in equation (3c).

Augmenting the supply-side block by splitting the overall CPI into components and modeling them with separate Phillips curves has also been common practice. This is usually done to better capture the impact of regulated prices, domestic and international food price shocks, oil price shocks, and other factors that may drive individual components of the total CPI inflation. While there have been significant country differences in the detailed specification, reflecting country-specific factors, many of the country models have broadly followed the setup in Andrle and others (2013b) outlined in the following:

  • Headline CPI is specified as a weighted average of components, for example:
    $p_t^{CPI} = w_f p_t^f + (1 - w_f)p_t^{nf}$ (18a)
    for a food/nonfood price split, or
    $p_t^{CPI} = w_f p_t^f + w_t p_t^t + w_a p_t^a + (1 - w_f - w_t - w_a)p_t^{nfta}$ (18b)
    for a more detailed food/transportation/administered prices split.

  • And the corresponding q/q inflation rates:
    $\pi_t^{CPI} = w_f\pi_t^f + (1 - w_f)\pi_t^{nf}$, or (19a)
    $\pi_t^{CPI} = w_f\pi_t^f + w_t\pi_t^t + w_a\pi_t^a + (1 - w_f - w_t - w_a)\pi_t^{nfta}$ (19b)
  • Here $w_f$, $w_{nf}$, $w_t$, and $w_a$ are the weights for food (or volatile food), nonfood, transportation, and administered prices, respectively, and $p_t^i$ denotes the respective price indices (in logs).

    Headline CPI excluding the specified components ($p_t^{nf}$, $p_t^{nfta}$) is commonly referred to as “core” inflation.

    The different CPI components typically exhibit different long-term trends, reflecting persistent changes in relative prices. This not only renders the common exclusion-based core inflation measure a poor measure of underlying inflation (or of current headline inflation corrected for “noise”) but may also require an explicit modeling of the long-term trend and the temporary deviation from that trend (the “gap”) in relative prices and their impact on the inflation process.

  • Relative prices are specified either as:
    $rp_t^f = p_t^f - p_t^{nf}$ (20a)
    or as:42
    $rp_t^f = p_t^f - p_t^{CPI}, \quad rp_t^a = p_t^a - p_t^{CPI}$, etc. (20b)
  • And decomposed into a gap and a trend component:
    $rp_t^j = \widehat{rp}_t^j + \overline{rp}_t^j, \quad j = f, t, a, \ldots$ (21)
  • With the growth in the trend component assumed to follow an autoregressive process converging to its steady state growth as in equation (7):
    $\Delta\overline{rp}_t^j = ra^j\Delta\overline{rp}_{t-1}^j + (1 - ra^j)\Delta\overline{rp}^{j,ss} + \varepsilon_t^{\overline{rp}^j}$ (22)
  • The Phillips curves for the CPI components are specified as follows:

The Phillips curve for core inflation (CPI excluding food) may be similar to equation (3):

$\pi_t^{nf} = b_1^{nf}\pi_{t-1}^{nf} + (1 - b_1^{nf})E_t\pi_{t+1}^{nf} + b_2^{nf}rmc_t^{nf} + \varepsilon_t^{\pi}$ (3b)

Note that the real effective exchange rate gap in the real marginal cost variable, $rmc_t^{nf}$, in this case is based on the nonfood real exchange rate ($z_t^{nf} = s_t + p_t^* - p_t^{nf}$), measuring the relative prices of foreign versus domestic goods (adjusted for a trend) in the nonfood segment of the CPI. The nonfood real marginal cost measure may also depend on temporary deviations from trend (gaps) in some other relative prices ($\widehat{rp}_t^j$) to capture spillover effects (see Amarasekara and others 2018):

$rmc_t^{nf} = bb_1\hat{y}_t + bb_2\widehat{rp}_t^j + (1 - bb_1 - bb_2)\hat{z}_t^{nf}$ (4b)

Additional terms may be added as well, including administratively set wages and oil prices.

The Phillips curve for food inflation may in the real marginal cost term also include a term for the temporary deviation of domestic food prices from international food prices as in Andrle and others (2013b). It may also include a term for excess supply of domestic food (agriculture) products to capture the direct effect of a particularly bad or good harvest as in the case of the models for some countries with sizable agricultural sectors43:

$\pi_t^f = b_1^f\pi_{t-1}^f + (1 - b_1^f)E_t\pi_{t+1}^f + b_2^f rmc_t^f - b_3^f\left(\hat{y}_t^{agr} - \alpha_3\widehat{tot}_t\right) + \varepsilon_t^{\pi}$ (3c)

The administered price component may be treated as exogenous, assumed to follow a simple autoregressive process, modeled as an error correction process around an endogenous trend based on overall inflation and the exchange rate, or specified as:

$\pi_t^a = z_1\pi_{t-1}^a + (1 - z_1)\bar{\pi}_t + \varepsilon_{a,t}^{short} - z_2\varepsilon_{a,t-1}^{short} + \varepsilon_{a,t}$ (3d)

which distinguishes between short-term ($\varepsilon_{a,t}^{short}$) and more persistent ($\varepsilon_{a,t}$) shocks to administered price inflation.
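As a numerical illustration of the aggregation in equations (19a)-(19b), the snippet below builds headline inflation from component inflation rates using assumed CPI weights; all numbers are illustrative assumptions.

```python
# Sketch of equation (19b): headline inflation as a weighted sum of component inflation rates.
# Weights and component inflation rates are illustrative assumptions.
w_f, w_t, w_a = 0.35, 0.10, 0.05                 # food, transportation, administered price weights
pi_f, pi_t, pi_a, pi_core = 6.0, 3.0, 0.0, 4.0   # component q/q inflation (annualized, percent)

pi_cpi = w_f * pi_f + w_t * pi_t + w_a * pi_a + (1 - w_f - w_t - w_a) * pi_core
print(pi_cpi)  # 4.4
```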

G. Incorporating Money and Money Targeting in the QPM

Many country projects have included a money block in the QPM. This inclusion can make the QPM useful for MT central banks and for central banks that are transitioning from MT toward a new monetary policy arrangement or that pay significant attention to money aggregates when determining their interest rate target. Adding a money block also allows for analyzing the implications of following an MT strategy along the lines of Berg, Portillo, and Unsal (2010) and Andrle and others (2013a). The detailed specifications of the money blocks in these two papers differ, though, and while largely following the approach set out in either Berg, Portillo, and Unsal or Andrle and others, the specifications in the various country models have differed as well. Regardless of the chosen specification and policy strategy modeled, policy actions are reflected in the dynamics of short-term (money market) interest rates in all versions, which is the first step in the policy transmission.

The starting point for including a money block is to add a money-interest rate relationship. The following dynamic specification relates real money growth ($\Delta rm_t$)44 to real GDP growth ($\Delta y_t$); the rate of change in the deviation of short-term interest rates from equilibrium ($(i_t - i_t^n) - (i_{t-1} - i_{t-1}^n)$); the real money gap ($\widehat{rm}$); the rate of change in trend, or equilibrium, money velocity ($\Delta\bar{v}$); and a shock term (see equation (23)). The real money gap is, in this specification (equation (24)), defined as the deviation of real money from what is explained by actual real GDP, nominal interest rates, and trend velocity:45, 46

$\Delta rm_t = \Delta y_t + \gamma_1\left\{(i_t - i_t^n) - (i_{t-1} - i_{t-1}^n)\right\} + \gamma_2\widehat{rm}_{t-1} - \Delta\bar{v}_t + \varepsilon_t^{\Delta rm}$ (23)
$\widehat{rm}_t = rm_t - \left(y_t - \gamma_1(i_t - i_t^n) - \bar{v}_t\right)$ (24)

This setup collapses to the quantity theory identity $\Delta\bar{m}_t - \pi 4_t^{Tar} = \Delta\bar{y}_t - \Delta\bar{v}_t$ when the economy is in long-term equilibrium, with the interest rate at its neutral level and the real money demand gap at zero.
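A minimal sketch of the real money gap in equation (24), and of the long-run identity noted above, is given below; the functional form follows the equations as reconstructed here, and all parameter values and inputs are illustrative assumptions.

```python
# Sketch of equation (24) and the long-run quantity-theory identity; all values are assumptions.
def real_money_gap(rm, y, i, i_neutral, v_bar, gamma1=2.0):
    """rm_hat_t = rm_t - (y_t - gamma1*(i_t - i_t^n) - vbar_t), variables in (100x) logs."""
    return rm - (y - gamma1 * (i - i_neutral) - v_bar)

print(real_money_gap(rm=305.0, y=460.0, i=9.0, i_neutral=8.0, v_bar=153.0))  # 0.0

# In long-term equilibrium, trend money growth minus target inflation equals trend output
# growth minus trend velocity growth (all in percent).
pi_target, dy_bar, dv_bar = 5.0, 4.0, 1.0
dm_bar = pi_target + dy_bar - dv_bar
print(dm_bar)  # 8.0
```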

Depending on the intended use of the model for policy formulations and forecasting, the policy block/policy reaction function may have to be adjusted as well. Both of the following two alternative uses and specifications are feasible:

  • A pure forecasting tool with exogenous policy. Omitting the policy reaction function (6) and setting real money growth ($\Delta rm_t$) equal to an exogenously set target path ($\Delta rm_t^T$) (say, through a separate financial programming exercise) would result in a model that may be used to derive the implications of that money target path for interest rates, the output gap, and inflation.47

  • A tool to set consistent (intermediate or indicative) targets on both a monetary aggregate and on short-term interest rates. The previous version with both equations (6) and (7) would allow the policymaker to form a view about what both short-term interest rates and money growth should/would have to be in order to keep inflation on “target” over the policy horizon. The results can be used to set the (intermediate) money target under an MT framework, with the implied path for short-term interest rates used as a cross-check. Alternatively, it can be used to set the interest rate operating target under a two-pillar approach, where deviations of actual money growth from indicative money targets are used as a cross-check, or tripwire, signaling the need for reassessing and possibly changing interest rates. For this approach to be useful, the high-frequency volatility in money velocity should be relatively low.

The standard approach has been to adjust the policy block to include a specification for how the money targets are set and allow for a potentially flexible adherence to those targets along the lines of the following setup:

  • The money growth target. The specification of this should depend on how the MT central bank sets its targets. For example, in one of the country models it is set one period ahead based on government GDP growth projections ($E_{t-1}\{\Delta y_t^{Gov}\}$)48 that the central bank treats as exogenous, an inflation target ($E_{t-1}\{\pi 4_t^T\}$),49 the expected change in velocity ($E_{t-1}\{\Delta\bar{v}_t\}$), and expected shocks to the money-interest rate equation (23) ($E_{t-1}\{\varepsilon_t^{\Delta rm}\}$).50
    $\Delta m_t^T = E_{t-1}\{\Delta y_t^{Gov}\} + E_{t-1}\{\pi 4_t^T\} - E_{t-1}\{\Delta\bar{v}_t\} + E_{t-1}\{\varepsilon_t^{\Delta rm}\}$ (25)
  • Adjusting the policy rule. Strict adherence to the money target in equation (25) would, according to equation (6), imply the following for the short-term interest rate:
    $i_t^{MT} = i_{t-1} + (i_t^n - i_{t-1}^n) + \frac{1}{\gamma_1}\left\{\Delta m_t^T - \pi_t - \Delta y_t + \gamma_2\widehat{rm}_{t-1} + \Delta\bar{v}_t - \varepsilon_t^{\Delta rm}\right\}$ (6b)
  • This policy rule, depending on how actual growth, inflation, and shocks to the money-interest rate relationship differ from their ex ante projections, would result in volatile money market rates. The central bank may instead allow money growth to deviate somewhat from target in order to keep market interest rates more stable in the short term. To allow for no or only partial adherence to the money targets, market interest rates, $i_t$, are set as a weighted average of what they would have been under IT, $i_t^{IT}$, according to equation (6), and under MT, $i_t^{MT}$, according to equation (6b):
    $i_t = mpr\,i_t^{IT} + (1 - mpr)\,i_t^{MT}$ (6c)
  • The mpr parameter covers policy regimes ranging from “pure” MT with full adherence to the money target (mpr = 0), through partial adherence (0 < mpr < 1), to full-fledged IT (mpr = 1), as illustrated in the sketch after this list.
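The one-line blend in equation (6c) is sketched below; the rates and the mpr value are illustrative assumptions.

```python
# Sketch of equation (6c): weighted average of the IT-rule rate and the MT-implied rate.
def operating_rate(i_it, i_mt, mpr):
    """i_t = mpr * i_t^IT + (1 - mpr) * i_t^MT, with mpr in [0, 1]."""
    return mpr * i_it + (1 - mpr) * i_mt

print(operating_rate(i_it=7.0, i_mt=10.0, mpr=0.75))  # 7.75, i.e., mostly IT behavior
```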

Although implemented in some countries, the experience with adding a money block to the QPMs along the lines previously discussed has been mixed. There were several reasons for this. Models with an MT block are quite complex, which posed a challenge for central bank staff to use them effectively. The money-interest rate relationship in equations (23) and (24), and the velocity of money, were in practice often not sufficiently stable, which made the analysis difficult even for experienced staff. In addition, it proved difficult to incorporate into the QPM the way decisions about money targets and monetary policy were de facto made in practice. The policy decision-making process in many of these countries was fairly opaque, with multiple and internally inconsistent policy objectives. While the policy framework was formally MT, the de facto policy target was often the exchange rate, managed mainly with frequent foreign exchange market interventions. Interest rates were also often kept low, and much lower than required for achieving either the formal money targets or the exchange rate target, for political reasons. Thus, in practice the money targets were often not followed closely and were frequently missed, sometimes deliberately.51 These factors made it challenging to design the model so that it would be potentially useful for policy discussions and to incorporate it effectively in the decision-making process.

H. Incorporating Exchange Rate Persistence, Limited Foreign Exchange Market Arbitrage, Foreign Exchange Interventions, and Pegged or Managed Exchange Rates

Most implemented country models have been based on a more elaborate modeling of the exchange rate than the simple UIP condition in equation (5). This is both to account for observed exchange rate persistence and to incorporate TOT effects (as discussed previously), capital account frictions and limited foreign exchange market arbitrage, variations in risk premiums, the impact of aid shocks, the impact of foreign exchange market interventions to dampen volatility and/or to steer the exchange rate, and partial or full exchange rate targeting.

Exchange Rate Persistence

The observed persistence in the exchange rate has typically been achieved in country models by assuming that the expected spot rate ($E_t s_{t+1}$) in the UIP condition is a weighted average of the model-consistent future rate ($s_{t+1}$) and a naïve forecast ($s_{t+1}^{NF}$), including in the versions of the QPM in various countries in (for example) Europe and Africa.52

$E_t s_{t+1} = \rho_s s_{t+1} + (1 - \rho_s)s_{t+1}^{NF}$ (26)

The weight on the model-consistent next-period exchange rate is $\rho_s$. In long-term equilibrium, the expected future rate (as well as the naïve forecast) converges to the model-consistent forecast. The naïve forecast is given by

$s_{t+1}^{NF} = s_{t-1} + 2\left[\Delta\bar{z}_t + (\pi^T - \bar{\pi}^*)\right]/4$ (27)

That is, it assumes that market participants have a view on the long-term trend in the real exchange rate and use that to make a myopic exchange rate projection by extrapolating the last observed nominal exchange rate, adjusted for inflation differentials. Alternatively, it implies that market participants base their projections on relative purchasing power parity adjusted for changes in the long-term trend of the real exchange rate. This equation reflects the long-term appreciation of the exchange rate (a negative $\Delta\bar{z}_t$) as well as the difference between the domestic inflation target and the foreign inflation target/steady state inflation.53

Combining equations (26) and (27) gives the following exchange rate equation:

$s_t = \rho_s s_{t+1} + (1 - \rho_s)\left(s_{t-1} + \frac{2[\Delta\bar{z}_t + (\pi^T - \bar{\pi}^*)]}{4}\right) + \frac{i_t^* - i_t}{4} + prem_t + \varepsilon_t^s$ (5b)
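The snippet below evaluates equation (5b) for one period, combining the model-consistent expectation with the naïve forecast of equation (27); exchange rates are in 100×logs, interest rates and inflation in annualized percent, and every input value is an illustrative assumption.

```python
# Sketch of the augmented UIP in equation (5b) with hybrid expectations (eqs. 26-27).
# Exchange rates in 100x logs; rates/inflation in annualized percent; all inputs are assumptions.
def s_hybrid_uip(s_lag, s_model_next, dz_bar, pi_target, pi_star,
                 i_star, i_dom, prem, rho_s=0.6):
    s_naive_next = s_lag + 2 * (dz_bar + (pi_target - pi_star)) / 4   # eq. (27)
    e_s_next = rho_s * s_model_next + (1 - rho_s) * s_naive_next      # eq. (26)
    return e_s_next + (i_star - i_dom) / 4 + prem                     # eq. (5b), shock omitted

print(round(s_hybrid_uip(s_lag=460.0, s_model_next=461.0, dz_bar=0.0,
                         pi_target=5.0, pi_star=2.0, i_star=2.0,
                         i_dom=7.0, prem=1.0), 2))  # 460.95
```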

Variations in Risk Premiums

Besides these changes to the UIP condition, adding an endogenous risk premium to the exchange rate equation can help better capture systematic changes in economic fundamentals:

$s_t = \rho_s s_{t+1} + (1 - \rho_s)\left(s_{t-1} + \frac{2[\Delta\bar{z}_t + (\pi^T - \bar{\pi}^*)]}{4}\right) + \frac{i_t^* - i_t}{4} + prem_t + prem_t^E + \varepsilon_t^s$ (5c)

where $prem_t$ is exogenous and the endogenous risk premium $prem_t^E$, for example, can be a function of deviations from the inflation target and the foreign output gap as follows:

$prem_t^E = \alpha\left\{\beta(\pi 4_t - \pi^T) - \gamma\hat{y}_t^*\right\}$ (28)

In this specification the endogenous risk premium is assumed to rise whenever inflation picks up (due to an observed tendency for that to induce a switch to foreign currency) and to fall when foreign demand improves (via the trade balance and/or remittances channels). The endogenous risk premium, which should be country specific, can also be made dependent on other factors, including deviations in real aid inflows from trend, booms in the dominant export industry, the stock of foreign exchange reserves as in equation (35), or government debt.

Limited Foreign Exchange Market Arbitrage

To capture the impact of capital account frictions and limited foreign exchange market arbitrage, in certain models the exchange rate has been determined either by balance of payments (BOP) flows, the UIP condition, or a combination of the two.54 This version of the model was developed for a country that was exiting from a fixed exchange rate regime and was assumed to become gradually more integrated into international financial markets. When no arbitrage activity took place, the exchange rate was modeled as a function of the deviation of the BOP current account balance from its long-term trend (the BOP gap, $\widehat{bop}_t$), which in turn was modeled as a function of the TOT, export, and import gaps (ignoring transfers and income flows):

$\Delta s_t^{bop} = \beta_6\widehat{bop}_t$ (29)
$\widehat{bop}_t = \widehat{tot}_t + \hat{x}_t + \hat{m}_t$ (30)
$\hat{x}_t = \varepsilon_1\hat{x}_{t-1} + \varepsilon_2\hat{z}_t^{nf} + \varepsilon_3\hat{y}_{t-1}^* + \varepsilon_3\widehat{tot}_t + \varepsilon_t^{\hat{x}}$ (31)
$\hat{m}_t = \theta_1\widehat{dd}_{t-1} + \theta_2\left((1 - \omega)\hat{z}_t^{nf} + \omega\hat{z}_t^f\right) + \varepsilon_t^{\hat{m}}$ (32)

where $\widehat{bop}_t$, $\hat{x}_t$, $\hat{m}_t$, and $\widehat{dd}_{t-1}$ are the balance of payments, export, import, and domestic nongovernment demand gaps, respectively.

With full arbitrage activity, the exchange rate was assumed to be based on the UIP condition as in equation (5), and with partial arbitrage activity as a weighted average of the two.

Pegged and Managed Exchange Rates

The exchange rate and interest rate specifications for managed exchange rate regimes have in most cases followed the approach in Beneš, Hurník, and Vávra (2008). The setup is flexible and can capture alternative degrees of exchange rate management, including hard pegs as a special case, and different degrees of capital account frictions55 that make it possible for sterilized interventions to influence the exchange rate (path) for some time. In order to trace out policy options and facilitate consistency between the assumed exchange rate interventions and the interest rate decisions underpinning the forecasts, model-based simulations of policy scenarios with alternative parameter settings are needed.

The basic setup for modeling a managed float has been as follows:

  • The central bank may have an implicit or explicit nominal exchange rate depreciation target $\Delta s_t^T$ that it may fully or partially adhere to. Because relative purchasing power parity should still hold in the long term, this targeted, or “desired,” exchange rate depreciation must be set consistent with the trend changes in the real equilibrium exchange rate ($\Delta\bar{z}_t$) and the difference between long-term rates of domestic and foreign inflation ($\pi^T - \bar{\pi}^*$), for the medium-term inflation objective to be achievable. That is:
    $\Delta s_t^T = \Delta\bar{z}_t + \pi^T - \bar{\pi}^* \quad \text{or} \quad \Delta s_t^T = \Delta\bar{z}_t + \pi_t^T - \bar{\pi}_t^*$ (33a)
  • or alternatively
    $\Delta s_t^T = \sigma\Delta s_{t-1}^T + (1 - \sigma)\left(\Delta\bar{z}_t + \pi_t^T - \bar{\pi}_t^*\right) + \varepsilon_t^{\Delta s^T}$ (33b)
  • The market exchange rate may be partially determined by this target and by the UIP condition, depending on the degree of adherence to the exchange rate target:
    $s_t = h_1\left(s_{t-1} + \Delta s_t^T/4\right) + (1 - h_1)E_t s_{t+1} + (i_t^* - i_t)/4 + prem_t + \varepsilon_t^s$ (5d)
    or, using the augmented UIP condition in equation (5b), as
    $s_t = h_1\left(s_{t-1} + \Delta s_t^T/4\right) + (1 - h_1)\left[\rho_s s_{t+1} + (1 - \rho_s)\left(s_{t-1} + \frac{2[\Delta\bar{z}_t + (\pi^T - \bar{\pi}^*)]}{4}\right) + \frac{i_t^* - i_t}{4} + prem_t + \varepsilon_t^s\right]$ (5e)

    The $h_1$ parameter ($0 < h_1 \leq 1$) controls the tightness of the exchange rate management, which is a policy choice; its setting should be part of the internal policy discussion. $h_1 = 1$ is consistent with a hard peg exchange rate arrangement.

  • Market interest rates may then be only partially, or not at all, under the control of the central bank over the policy horizon. The degree of central bank control over market interest rates would depend on (1) the degree of adherence to the exchange rate target and (2) the degree of capital account frictions/imperfect substitution between domestic currency and foreign currency assets and the “thinness” of the foreign exchange market.56 With full adherence to the exchange rate target (hard peg) and full capital mobility/perfect substitutability of domestic and foreign currency assets, market interest rates would be determined by the UIP condition. For intermediate cases, market interest rates would be a weighted average of those implied by the UIP condition ($i_t^{UIP}$) and those implied by the standard (Taylor-type) monetary policy rule ($i_t^{IT}$) in equation (6), that is:
    $i_t = h_2 i_t^{UIP} + (1 - h_2)i_t^{IT}$ (6d)

Whether sterilized interventions are assumed to be able to influence/control the exchange rate independently of interest rate policy, and to what degree, is determined by the relative values of the $h_1$ and $h_2$ parameters. In some models they were set to be equal, implying that only unsterilized interventions would impact the exchange rate. The $h_1$ parameter was set equal to 1 for an official crawling peg exchange rate framework, while $h_2$ was set equal to 0.5, implying a significant degree of capital flow frictions. In general, setting $h_2 > h_1$ implies that sterilized interventions are assumed to be able to influence the exchange rate to a certain degree.

This discussion does not imply, though, that the central bank cannot fully steer short-term interest rates and properly align them with its policy rate on a day-to-day basis.57 Consequently, it also does not imply that short-term interest rates cannot be the operating target for the central bank's daily liquidity management operations when it partly or fully targets the exchange rate.58 These relationships do not necessarily hold on a daily basis, because shocks may dominate in the very short run. They are, however, relationships that hold at lower frequencies, say monthly, quarterly, and in the longer term, and they (and the results of the model simulations) do imply the required path for market interest rates going forward for the central bank to achieve its policy objective(s). That is, while both the policy rate and market rates are endogenous in the model, the policy rate can be set explicitly by the central bank, and the central bank can, and should, use its daily operations to align market rates with it. A failure to set the policy rate in line with the one implied by the forecast would, however, result in the policy objective eventually being missed. Similarly, a failure to manage interbank liquidity so that market interest rates are fully aligned with the policy rate would also result in the policy objective(s) eventually being missed.59 This issue is further discussed in the section titled “Persistent Deviations of Market Interest Rates from the Policy Rate.”

The setup outlined previously allows for specifying alternative rules for setting the exchange rate target, including targeting the rate of depreciation and the level (path) of the exchange rate. This makes it possible, for example, to analyze the economic implications of using exchange rate interventions as an instrument (supplementary or the only instrument) for achieving the inflation target and of using interventions to smooth volatility around the longer-term path for the exchange rate determined by the fundamentals. The latter would imply setting a target path for the level of the exchange rate, not just its rate of depreciation, as follows:

$s_t^T = s_{t-1}^T + \Delta s_t^T/4$ (34)

where the targeted rate of depreciation ($\Delta s_t^T$) is set as in equation (33a), that is, as $\Delta s_t^T = \Delta\bar{z}_t + \pi_t^T - \bar{\pi}_t^*$.

Using the exchange rate more actively as an (or the) instrument for achieving the inflation target may require targeting the exchange rate to temporarily deviate from this path in response to deviations of expected inflation from target and to the output gap, as follows:

$\Delta s_t^T = f_1\Delta s_{t-1}^T + (1 - f_1)\left(\Delta s_t^{TN} - f_2(E_t\pi 4_{t+4} - \pi^T)\right) - f_3\hat{y}_t + \varepsilon_t^{\Delta s^T}$ (33c)

where the “neutral” change in the exchange rate target is as in equation (33a), that is: $\Delta s_t^{TN} = \Delta\bar{z}_t + \pi_t^T - \bar{\pi}_t^*$
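The sketch below evaluates the rule in equation (33c), as reconstructed here, for one period; the parameters and the deviations fed into it are illustrative assumptions.

```python
# Sketch of equation (33c): the targeted depreciation reacts to the expected deviation of
# inflation from target and to the output gap; parameters and inputs are assumptions.
def ds_target(ds_target_lag, ds_neutral, e_pi4_dev, y_gap, f1=0.5, f2=1.0, f3=0.25):
    return f1 * ds_target_lag + (1 - f1) * (ds_neutral - f2 * e_pi4_dev) - f3 * y_gap

# Expected inflation 1 pp above target and a positive output gap call for slower targeted
# depreciation, that is, a tighter stance delivered through the exchange rate.
print(ds_target(ds_target_lag=3.0, ds_neutral=3.0, e_pi4_dev=1.0, y_gap=0.5))  # 2.375
```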

This can be further expanded on to explicitly model the associated foreign exchange reserve interventions. Beneš, Hurník, and Vávra (2008) and Beneš and others (2015) make the endogenous risk premium $prem_t^E$ in the UIP condition a function of the deviation of the stock of international reserves ($res_t$) from their long-term/trend or optimal level ($\overline{res}_t$):

$prem_t^E = \theta(res_t - \overline{res}_t), \quad \frac{d\theta}{d(res)} > 0$ (35)

And with the central bank adjusting the deviation of the stock of foreign exchange reserves from trend in response to deviations of the exchange rate from the targeted path or, alternatively, deviations of the exchange rate depreciation from the target:

$res_t = \overline{res}_t + \gamma(s_t^T - s_t) + \varepsilon_t^{fx} \quad \text{or} \quad res_t = \overline{res}_t + \gamma(\Delta s_t^T - \Delta s_t) + \varepsilon_t^{fx}$ (36)
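A compact sketch of the reserve reaction and its feedback to the risk premium in equations (35) and (36) follows; the sign conventions match the equations as reconstructed here, and theta, gamma, and all inputs are illustrative assumptions.

```python
# Sketch of equations (35)-(36): reserves respond to the gap between the targeted and actual
# exchange rate, and the deviation of reserves from trend moves the endogenous premium.
# theta, gamma, and all input values are illustrative assumptions.
def reserves(res_bar, s_target, s_actual, gamma=0.5):
    return res_bar + gamma * (s_target - s_actual)      # eq. (36), level-target variant

def endogenous_premium(res, res_bar, theta=0.2):
    return theta * (res - res_bar)                      # eq. (35) as reconstructed

res_t = reserves(res_bar=5.0, s_target=460.0, s_actual=462.0)   # currency weaker than targeted
print(res_t, endogenous_premium(res_t, res_bar=5.0))            # 4.0 -0.2
```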

The setup for foreign exchange interventions can be made more elaborate. For example, one country model distinguished between interventions to smooth the exchange rate and foreign exchange purchases (referred to as “structural interventions”) aimed at fulfilling the central bank's long-term reserve coverage target. The latter were assumed to have no effect on the exchange rate because keeping a constant level of import cover was assumed to be neutral with respect to the risk premium required by international investors. The setup was as follows:

  • The reserve target in months of imports ($\overline{res}_t$) was assumed to follow an autoregressive process that converges to its steady-state value $\overline{res}^{ss}$:
    $\overline{res}_t = \rho_1\overline{res}_{t-1} + (1 - \rho_1)\overline{res}^{ss} + \varepsilon_t^{\overline{res}}$ (37)
  • The intervention reaction function, with interventions ($iv_t$, measured in months of imports) reacting to deviations of the exchange rate depreciation ($\Delta s_t$) from target ($\Delta\bar{s}_t$) and to deviations of the level of reserves from the target, given by the “reserve gap,” $\widetilde{res}_t$:
    $iv_t = \rho_2 iv_{t-1} + (1 - \rho_2)\overline{iv} - \alpha_1(\Delta s_t - \Delta\bar{s}_t) - \alpha_3\widetilde{res}_t + \varepsilon_t^{iv}$ (38)
  • The reserve gap term was specified to capture not only the current, but also all future expected deviations.
    $\widetilde{res}_t = \rho_3(res_t - \overline{res}_t) + (1 - \rho_3)\widetilde{res}_{t+1}$ (39)
  • Structural interventions to fulfill the central bank’s long-term reserve coverage target.
    $\overline{iv} = \overline{res}\left(1 - \frac{(1 + \Delta\bar{z})(1 + \bar{r}^*)}{1 + \bar{g}}\right)$ (40)
  • The UIP condition with intervention effects:
    $s_t = E_t s_{t+1} + (i_t^* - i_t + prem_t) + \beta_1(iv_t - \overline{iv}) + \varepsilon_t^s$ (5f)

    where the steady-state value of interventions ($\overline{iv}$) was derived from the basic reserve accumulation accounting identity.

I. Persistent Deviations of Market Interest Rates from the Policy Rate

High day-to-day volatility in market interest rates with large and semipersistent differences from the policy rate are common in many of the FPAS CD recipient countries. This may be explained by:

  • Partial adherence to money and/or exchange rate (depreciation) targets, with the policy rate neither set according to the combined policy rules in equations (6c) or (6d) nor de facto serving as the operating target for the central bank's daily liquidity management/interest rate steering operations.

  • Cost concerns that prevent the central bank from consistently undertaking the liquidity management operations needed to align market rates with the policy rate, and thus implement the stated policy stance.

  • A misconception about how monetary policy is implemented: namely, that just announcing a policy rate should be sufficient and that money market conditions would then somehow adjust by themselves so that short-term market rates become aligned with the policy rate.60 The commonly expressed argument that the persistent deviations between the policy rate and the market rate are caused by segmented markets (and/or market failure), and not by the lack of central bank market operations to align market rates with the policy rate, suggests that this might be the case.

High day-to-day volatility in interbank rates with large and persistent deviations from the policy rate makes interbank rates unpredictable. In turn, this makes it difficult for the market to relate the pricing of longer-term securities to the interbank rate. The lack of signaling power of the policy rate, coupled with the fact that most or all transactions might take place at rates that are very different from the policy rate, makes it difficult to base the pricing of securities on the policy rate. Therefore, the market in such situations tends to use simple rules of thumb for pricing securities, using the market rate for 91-day Treasury bills as the benchmark for pricing other instruments, including retail loans and time deposits for banks' larger clients. The policy rate may play some role in the pricing of credit to smaller borrowers through moral suasion effects. In addition, the policy rate may play a role indirectly through the lending facility rate when, as is the standard case, the central bank's lending facility rate is linked to the policy rate. The policy rate may then in particular have an impact on the behavior of smaller banks that face difficulties accessing the interbank market, including because of counterparty risks, and thus have a higher risk of having to access the central bank's standing lending facility.

This raises several difficult questions. There is a technical question about how to specify the interest rate that drives behavior in the model. More importantly, there are difficult political questions about how to address the deviation of market rates from the policy rate in the policy analysis, recommendations, and in discussions with policymakers to ensure that policy discussions are properly focused and that the intended policy is implemented. Technically, the simplest approach is to omit the policy rate from the model and let interest rates in the model be represented by a market rate instead: the overnight interbank rate or the 91-day government security rate. This may help draw the policymakers’ attention to the fact that what matters is the impact of their policy implementation operations on market interest rates. The policy discussion, therefore, should focus on what market interest rates have to be over the policy horizon to achieve their policy objective.61,62 Policymakers may, however, insist on focusing the debate on the policy rate, not market rates, with or without any considerations for whether to align them.

For the reasons outlined previously, it may be necessary to include both the policy rate and one or several market interest rates in the model, and explicitly model the relationship between them. The market interest rate, and not the policy rate, would then be the one used for deriving the real interest rate gap that enters the real monetary condition index in equation (2). The market rate would also be the one used in the UIP exchange rate equations (5a) to (5f) and in the money growth equations (23) and (24).

A common approach has been to specify the market interest rate ($i_t$) as equal to the policy rate ($i_t^p$) plus a spread term.

The spread term may be specified as following an autoregressive process. That is:

$i_t = i_t^p + spread_t$ (41)
$spread_t = \rho_1 spread_{t-1} + (1 - \rho_1)\overline{spread}^{ss} + \varepsilon_t^{i}$ (42)

where the steady state spread $\overline{spread}^{ss}$ may be set equal to zero.
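The spread specification in equations (41)-(42) can be simulated in a few lines, as below; the persistence, initial spread, and policy rate are illustrative assumptions.

```python
# Sketch of equations (41)-(42): market rate = policy rate + AR(1) spread converging to zero.
# rho, the initial spread, and the policy rate are illustrative assumptions.
import numpy as np

rho, spread_ss, T = 0.8, 0.0, 8
spread = np.empty(T)
spread[0] = 2.0                      # initial 200 bps wedge over the policy rate
for t in range(1, T):
    spread[t] = rho * spread[t - 1] + (1 - rho) * spread_ss   # eq. (42)

policy_rate = 9.0
market_rate = policy_rate + spread   # eq. (41)
print(np.round(market_rate, 2))
```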

The market interest rate may alternatively be represented by a weighted average of several short-term interest rates when short-term market rates are neither well aligned with each other nor with the policy rate. For example, one country model included the following specification:63

$i_t^{eff} = \omega_{tb91}i_t^{tb91} + \omega_{ibr}i_t^{ibr} + (1 - \omega_{tb91} - \omega_{ibr})i_t^p$ (43)
$i_t^{ibr} = i_t^p + spread_t^{ibr}, \quad i_t^{tb91} = i_t^{ibr} + spread_t^{tb91}$ (44)
$spread_t^{ibr} = \rho_1 spread_{t-1}^{ibr} + \varepsilon_t^{ibr}, \quad spread_t^{tb91} = \rho_2 spread_{t-1}^{tb91} + \varepsilon_t^{tb91}$ (45)

If used properly, this setup can help capture the effects of failing to implement the de jure policy stance as represented by the announced level of the policy rate. It can help capture the past de facto policy stance and its deviation from the de jure, or announced, stance, and be used for simulating policy scenarios reflecting different degrees of consistency between the de facto and de jure policy stance. The exogenous error terms ($\varepsilon_t^{i}$, $\varepsilon_t^{ibr}$, $\varepsilon_t^{tb91}$) in equations (42) and (45) provide the model user with a tool to model the failure to undertake the liquidity management operations needed to align the rates, and to evaluate the impact of that failure on the economy. The setup can also be used to simulate a case where operations are gradually enhanced to align market rates with the policy rate over time.

However, using this setup properly for policy simulations and providing policy advice requires a good understanding of the source of the spreads and an ability to openly discuss it. There is a risk that this model specification contributes to further cementing the previously discussed misconception that the high spreads are de facto semi-exogenous, caused by market segmentation or market risk premiums, rather than by a failure of the central bank to undertake the market operations needed to align interbank rates with the policy rate. Moreover, it can give the false impression that changing the policy rate without undertaking the supporting liquidity operations64 would be sufficient for money market rates to move toward the policy rate, when in fact only the central bank's open market operations can align the interbank rates with the policy rate, and without such operations the spread is what would be adjusting.65 Modeling the spreads as driven by a semipersistent process may thus risk hiding the fact that the spreads are due to policy implementation failure. Such modeling can therefore be unhelpful.

The pass-through from the policy rate to the (longer-term) rates that determine private sector behavior may be incomplete and/or endogenous. This can be handled by explicitly modeling the link between the policy rate and these market rates. In turn, these longer-term market rates can replace the short-term real rates in the models' IS curve. This extension makes the transmission of real interest rates into the IS curve more inertial and adds country-specific features (and realism) for policy analysis. A general setup is the following:

$i_t^{tb182} = \frac{1}{2}\left(i_t^{tb91} + E_t i_{t+1}^{tb91}\right) + tprem_t^{tb182}$ (46)
$i_t^{tb365} = \frac{1}{4}\left(i_t^{tb91} + E_t i_{t+1}^{tb91} + E_t i_{t+2}^{tb91} + E_t i_{t+3}^{tb91}\right) + tprem_t^{tb365}$ (47)

For simplicity, the term premiums may be assumed to follow an autoregressive process, gradually converging toward the steady state:

$tprem_t = \rho_1 tprem_{t-1} + (1 - \rho_1)\overline{tprem}^{ss} + \varepsilon_t^{i}$ (48)

Alternatively, it can be modeled as a function of some measure of credibility, such as high-frequency interest rate volatility and/or the deviation of market interest rates from the policy rate. It is also possible to specify the short-term interest rate expectations ($E_t i_{t+1}^{tb91}$, $E_t i_{t+2}^{tb91}$, $E_t i_{t+3}^{tb91}$) as not being fully model-consistent but based partly on backward-looking expectations and credibility proxies.66
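Equations (46)-(48) amount to averaging the expected 91-day rate path and adding a term premium; the short sketch below does this for assumed values of the expected path and the premiums.

```python
# Sketch of equations (46)-(47): longer-maturity rates as averages of current and expected
# 91-day rates plus term premiums. The expected path and the premiums are assumptions.
i_tb91_path = [9.0, 8.5, 8.0, 7.5]      # i_t^tb91 and expectations for the next three quarters
tprem_tb182, tprem_tb365 = 0.3, 0.6

i_tb182 = sum(i_tb91_path[:2]) / 2 + tprem_tb182   # eq. (46)
i_tb365 = sum(i_tb91_path) / 4 + tprem_tb365       # eq. (47)
print(i_tb182, i_tb365)  # 9.05 8.85
```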

J. Central Bank Credibility and Expectations

Central bank credibility is a critical factor for anchoring expectations. The standard setup, as in equation (3), is to assume that expectations are partly backward-looking (for example, as agents deduce the central bank's intentions from its past [lack of] control over inflation) and partly forward-looking. A high weight on past inflation might reflect backward indexation in price formation but also low credibility and less anchored expectations. One of the country models included an explicit modeling of the central bank's lack of credibility. Specifically, forward-looking expectations were set as a function of model-consistent rational expectations altered by the central bank's lack of credibility, with the stock of the central bank's lack of credibility ($incred_t$) depending on past deviations of headline inflation from target. The specific setup was as follows:67

$incred_t = \varphi_1 incred_{t-1} + \varphi_2\left(\pi 4_{t-1} - \pi_{t-1}^T\right) + \varepsilon_t^{\pi}$ (49)
$E_t\pi_{t+1} = \pi 4_{t+1} + \varphi_3 incred_t$ (50)
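A minimal sketch of the credibility block in equations (49)-(50) is given below; all parameter values and inputs are illustrative assumptions.

```python
# Sketch of equations (49)-(50): the stock of (lack of) credibility rises with past deviations
# of inflation from target and tilts expectations above the model-consistent forecast.
# phi1, phi2, phi3, and all input values are illustrative assumptions.
def update_incred(incred_lag, pi4_lag, pi4_target_lag, phi1=0.8, phi2=0.3):
    return phi1 * incred_lag + phi2 * (pi4_lag - pi4_target_lag)   # eq. (49), shock omitted

def expected_inflation(pi4_next_model, incred, phi3=0.5):
    return pi4_next_model + phi3 * incred                          # eq. (50)

incred = update_incred(incred_lag=0.5, pi4_lag=8.0, pi4_target_lag=5.0)
print(expected_inflation(pi4_next_model=6.0, incred=incred))       # 6.65
```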

K. Calibration of the QPM

The parameters of the QPM-type models built with the help of CD projects are almost always calibrated rather than estimated. First, the process of calibration promotes a deeper understanding of the properties of the model. Second, because no model can capture all the key features of the data, estimation may force the model to fit aspects of the data it is not meant to fit fully, for example trying to “explain” a large jump in prices due to a weather shock. In addition, few emerging market economies have long enough series for a robust estimation of the parameters. Even when long series exist, economic and structural changes reduce the relevance of older data. Monetary policy regime changes, in particular, are frequent and may render older data irrelevant.68,69

More fundamentally, parameters in policy and forecasting models should represent the expected future monetary policy transmission mechanism, rather than the historical behavioral representation of the economy. These models have, in most cases, been built for countries that have gone through several fundamental structural changes, including with respect to their policy framework, and that are embarking on further changes to their monetary and exchange rate policies. Those changes, in turn, would further alter the transmission mechanism and potentially also modify the way economic agents form expectations. When a country experiences frequent and significant structural breaks, the parameter estimates of the model reflect economic policies and agents’ behavior over the past decades and therefore will not capture well the transmission mechanism in the future. Calibration increases the chance that the model parametrization will represent the future policy setting. Therefore, it is also likely to partly address the Lucas critique.

The calibration of applied policy models requires more effort than just finding plausible model parameters that ensure stable model dynamics. Modelers implementing semistructural/structural models in a policy environment apply various methods to bring the model to data. They include the following:

  • (1) Empirical fit

  • (2) In-sample forecasting performance

  • (3) Economic consistency demonstrated by impulse response functions70

  • (4) Ability to explain historical developments using Kalman filter technique

  • (5) Double-checking the calibration by Bayesian estimation

The parameters in the monetary policy rule cannot be estimated as they should represent the policy framework and the policymakers' preferences. The properties of the policy rule determine both the speed of bringing inflation back to the target when a shock occurs and implied output gap volatility, and the forecast-implied future path of the policy rate. It is therefore essential for the successful functioning of the FPAS that the properties of the policy rule are well understood by all and that the rule fully reflects the policymakers' preferences regarding (1) variables to include in the rule, (2) the tradeoffs between pace and magnitude of policy adjustment to bring inflation back on target and the impact on output and other variables, and (3) the degree of interest rate smoothing.

To ensure that the policy rule specification reflects policymakers' preferences and that its properties are well understood by all, staff should quantify and present to the policymakers the implications of alternative specifications and parameterizations of the policy rule. Analyzing and presenting the inflation-output volatility tradeoff implied by the model, and the inertia of the policy rate obtained via historical model simulations under alternative specifications of the policy rule, is crucial for finding a reaction function that reflects the preferences of the policymakers and is well understood by all involved in policy analysis and decision-making. Due regard should be given to the potential impact on central bank credibility and the risk of unanchoring inflation expectations. This is particularly important when putting in place an FPAS and a forecast-supported policy decision-making process for the first time. Changes to the rule thereafter should preferably be rare; otherwise, financial markets and the public may have difficulty understanding the central bank's policy goals and actions. The parameterization of the policy rule may typically be reconsidered only when there are major changes to the composition of the MPC or a change in the monetary policy regime; in the latter case, the full model will need to be reviewed.

Confronting the model with the data and economic intuition is, however, also an important debugging tool. The model structure separates long-term developments (loosely referred to as “trends” or “equilibriums”) from cyclical fluctuations (“gaps”). Model-consistent estimation of these unobserved variables (trends and gaps) is in most cases done using the Kalman filter. The procedure also estimates the historical shocks hitting the economy and allows for interpreting the economy via the model's transmission mechanism. In-sample simulations, analyzing shock decomposition results, and comparing the implications of alternative parameter specifications for alternative forecast outcomes are crucial for bringing the model close to the data. Such sustained effort to improve the model's forecasting properties deepens the staff's understanding of the model's overall properties and enhances the quality of the applied forecasting and policy work. It also deepens the institutional knowledge about the economy's transmission mechanism and understanding of the economy and the data.
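As a simple stand-in for the model-consistent Kalman filter exercise described above (the actual FPAS toolkits embed the full QPM in the filter), the sketch below uses an off-the-shelf unobserved-components model to split a synthetic output series into a smoothed trend and a cyclical gap; the specification choices and the data are illustrative assumptions.

```python
# A minimal sketch, not the projects' toolkit: trend/gap decomposition of synthetic log output
# with an unobserved-components model estimated by the Kalman filter/smoother.
# The local-linear-trend plus AR(2)-cycle specification is an illustrative assumption.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
log_y = np.cumsum(0.008 + 0.01 * rng.standard_normal(120))   # synthetic quarterly log output

mod = sm.tsa.UnobservedComponents(log_y, level="local linear trend", autoregressive=2)
res = mod.fit(disp=False)

trend = res.level.smoothed            # smoothed trend estimate
gap = res.autoregressive.smoothed     # smoothed cyclical (gap) component
print(np.round(gap[-4:], 4))
```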

4. The Forecasting Process, Incorporating the FPAS into the Decision-Making Process

To ensure that the technical work has traction in the policymaking, TA projects have generally provided advice on how to structure the forecasting, policy analysis, and decision-making process. Most have also spent substantial time helping the core forecasting staff in briefing the other parts of the central bank, including the policymakers (the MPC, executive board, and/or governor), about the tools, processes, and the role of FPAS in monetary policy decision-making. Many projects have also helped with staffing and organization, and with structuring and drafting the policy materials for the decision-makers and the public in a real-time forecasting and policy analysis environment.

A. The Forecasting, Policy Analysis, and Decision-Making Process

The advice on how to structure the process has been based on the experience of IT central banks. While there are variations in how these central banks have organized themselves, they all have in place a structured process for a monetary policy decision round that allows for:

  • (1) A focused discussion of topical issues and an assessment of the latest forecasts and policy recommendations, the forecast errors, and the resulting forecast revisions early on in the process;

  • (2) Sufficient time to analyze and discuss foreign and financial variables (often in the FPAS framework referred to as external assumptions) relevant for the domestic forecast;

  • (3) Sufficient time to analyze the most recent data, conduct nowcasting to fill any data gaps, prepare the near-term forecasts, and assess the current economic fundamentals, including with regard to the cyclical position of the economy, trends, and external developments (“the initial conditions”);

  • (4) Organization of several rounds of model simulations; and

  • (5) A series of structured meetings held both at the technical level and with the policymakers at various stages during the process.

The whole forecasting round typically takes five to seven weeks from the start of the technical work to the final MPC meeting.71 FPAS CD usually recommends a process structured along the lines of Figure 6 (see also Annex 1).

Figure 6. The Forecasting Calendar

Source: Hledik (2011). Note: MPC = monetary policy committee.

The standard recommendation has been to undertake such full forecasting rounds only quarterly and to closely align the start of the process with the release of quarterly national accounts data. The process is resource intensive, and conducting more than four full-fledged forecast rounds a year increases the risk of leaving the staff with insufficient time to properly analyze the data, prepare the forecasts, and improve the analytical toolkit in between forecasting rounds. Also, although new high-frequency data arrive constantly, quarterly national accounts data are the most important source for identifying and quantifying changes in the economic cycle. With the output gap being one of the key determinants of demand-led inflationary pressures, the national accounts data are also among the most important sources for determining whether a change in the policy stance is warranted. Limiting the number of full-fledged forecast rounds to four may also help to focus policy formulation more on the underlying fundamental medium-term inflation drivers rather than on the high-frequency changes in the data that, though important for near-term developments, may often merely be noise when it comes to the medium-term developments that monetary policy can influence. Focusing policy too much on near-term developments might result in an overreaction of monetary policy to noise. This can result in frequent reversals of the policy stance and thus noisy policy signals and high interest-rate volatility, which undermine monetary policy credibility and weaken policy transmission.

The standard recommendation has been to have interim policy meetings in between the full-fledged quarterly forecasting rounds. Many IT central banks started their FPAS implementation with 12 policy meetings a year and analytically weak and repetitive monthly policy reports. The recommendations have been to reduce the number of policy meetings to a maximum of eight meetings a year, with the interim policy meetings being based on a more limited update of the information set in between the full forecasting rounds.

CD projects have helped to develop policy meeting schedules. Many, and at times conflicting, factors need to be considered when it comes to the exact timing of the MPC meetings. The start of the process should be closely aligned with the timing of the release of new quarterly national accounts data, with the first technical level meeting on the initial conditions and the near-term forecasts typically taking place 7 to 10 working days afterward. Also, while staff need time to analyze the latest CPI data before finalizing the forecasts, having new major data releases—for example, the national accounts and the CPI—shortly after the MPC meeting should be avoided if possible. The MPC meetings should ideally take place at the beginning of a new reserve maintenance period to avoid expectations of forthcoming policy rate changes impacting bidding when longer-maturity open market operations instruments are used; achieving this may require changing the start of the reserve maintenance period. For the communication of monetary policy to be efficient, other considerations are also important, such as not scheduling the policy meetings too close to other events that will take media precedence (for example, the release of a new budget).

Having a preannounced and published monetary policy meeting calendar along with the calendar for the publication of the MPR is typically recommended. The publication of the MPC meeting calendar for the year ahead will help guide the financial market expectations about the timing of changes in monetary policy. This will also assist media in planning their coverage of monetary policy. Internally, it will also help to discipline and structure the forecasting and policymaking rounds. Typically, the forecast coordinator will schedule the detailed tasks based on this meeting calendar and department management will be able to plan projects and maintenance and development work accordingly.

The technical (“pre-MPC”) meetings between the staff and the MPC/policymakers ahead of the final policy meeting are essential for the smooth working of the process. First and foremost, they serve to inform the policymakers of the economic situation, the assumptions underlying the forecast, and the policy action proposed to reach the target at the end of the forecast horizon. Second, these meetings fulfill the need for MPC members to discuss and present their views to staff for consideration when staff continue the work on the forecast. Third, they also fulfill the purpose of informing the MPC members of each other's views. These interactions thus serve to ensure that the MPC will enter the actual policy meeting fully informed about the forecast and the economic situation, and with a good sense of where everyone stands with regard to their preferred policy choice. The policy meeting can thus be focused appropriately: on a policy discussion instead of details in the forecast or the policy report. These interactions are needed to ensure that the FPAS adequately supports policymaking, and for ensuring a sufficient degree of consistency between the forecasts and actual policy. The latter will mostly be the responsibility of staff who, with their in-depth knowledge of the tools and models used to derive the forecast, can ensure consistency between policy and the forecast, as well as consistency between the forecasts for various other variables.

It is common for the views of the policymakers, department management, and the technical staff conducting the forecasting and policy analysis at the onset of the process to differ. There are legitimate reasons for such differences as staff and policymakers may have different backgrounds, experiences, and information sets. These differences may be about the assessment of the current economic conditions and near-term developments, assumptions about the future path of some exogenous variables, assessment of risks to macroeconomic projections, and the strength of some of the monetary policy transmission channels or additional factors not explicitly modeled, as well as different views on how these factors might impact the economy, among other factors. Such differences in inputs to the forecast would imply various expectations about likely future development of the economy, including inflation, and the possible need for a policy adjustment.

Active participation of policymakers and other senior management in the forecasting process and close interaction between them and the technical staff can therefore substantially improve the quality of the forecasts and policy analysis. It is essential for ensuring that all available information is properly utilized.72 Providing sufficient time for discussions between staff and policymakers, allowing them to fully articulate their assumptions and thinking and to work out the reasons for their different views, typically results in additional insight that should improve the forecasts and policy analysis, including by triggering further research. It can also help identify cases of model misspecification, and hence biased forecasts, and inform further model development, which would help build consensus about the workings of the transmission mechanism.

Close interaction between staff and policymakers is also crucial for ensuring that the policy decisions are solid and the policymakers fully benefit from the FPAS. In order for the policymakers to trust staff’s policy analysis, they need to understand the staff’s analysis and the recommendations, which is why the interactions between them are so crucial. Both have a great responsibility in ensuring this: staff need to be very clear in how and what they present to the policymakers, and the policymakers must be clear on what they want from staff and how they want staff to present their findings. This is essential to ensure that policymakers agree with the forecast and thus base their policy decisions on staff’s recommendations. A forecasting process that does not allow for the views of the policymakers to be properly incorporated with regard to underlying assumptions, risks, and policy choice may result in a consistent, well-based forecast and policy path that is disregarded by policymakers. Alternatively, a process where staff are forced to incorporate policymakers’ assumptions, forecast, and policy choice without solid economic basis may result in an inconsistent and biased forecast. Both scenarios will in the best case result in suboptimal policy and in the worst case result in detrimental policy. Policy decisions that are not supported by empirical evidence and rigorous policy analysis tend to be less systematic and more driven by short-term considerations, which can hamper achievement of the policy objective. Both scenarios can demoralize staff and thereby over time reduce the quality of their work and weaken the public’s trust in the central bank’s commitment to its stated policy objective.

Practices differ on how and whether the views of staff and policymakers are fully reconciled. When incorporating the newly developed FPAS into their decision-making, some central banks did not initially provide time for a sufficiently elaborate and rigorous process for ironing out the disagreements between staff and policymakers in order to reach a consensus. Instead, they tried to circumvent the issue either by (1) having a fully staff-dictated process where staff were solely responsible for the forecasts and policy analysis, with no or little room for incorporating policymakers' views and assumptions; or by (2) having a policymaker-dictated process where the policymakers' views were imposed on the forecasts through “hard model tunes.” Neither option is appropriate. The first, as noted, may not only result in lower-quality forecasts and analysis, but also in policymakers' distrust and policy decisions that are inconsistent with the staff forecast and recommendations. The second option not only implies that the policymakers are not fully benefiting from the analysis but may also result in internally inconsistent forecasts, which, among other reasons, makes it harder to communicate the forecasts and the reasons for the policy decisions. This may also weaken the trust in the central bank and the credibility of monetary policy, importantly including when commercial banks and other private entities have the capacity to undertake their own analysis. Most advanced central banks with well-established IT frameworks have, however, in place an elaborate process along the lines outlined in this section that allows for a full or reasonable degree of consensus to be developed.

The following three technical meetings with the MPC/policymakers, each followed by staff/departmental-level meetings, have typically been recommended:

  • A meeting on the initial conditions (first pre-MPC) to:

    • (1) Provide the policymakers with an economic interpretation of the newly available data, external and financial developments, forthcoming domestic or foreign policy changes entering into the forecast, and staff's preliminary near-term forecasts and assessment of the economy's cyclical position, which often also indicates the likely direction of the upcoming recommended change in the policy stance; and

    • (2) Seek the policymakers' feedback and views, including on whether additional analysis is needed or the existing analysis needs to be reassessed.

    • This meeting helps ensure that both policymakers and staff share the same information set regarding historical data and their interpretation, thereby reducing the risk of fundamental disagreements later in the process. It also gives policymakers an opportunity to communicate to staff the main issues and risks in the identification of initial conditions, as well as other issues and risks that should be addressed in the forthcoming forecast and policy analysis.

  • A meeting on the first version of the baseline forecasts and alternative scenarios (second pre-MPC) to seek feedback from the MPC members on the forecasts and the implied monetary policy decision, and to discuss the specification of any alternative scenarios that should complement the baseline forecasts for the final MPC meeting. The meeting also provides policymakers with an opportunity to state their views on the policy tradeoff between the pace and magnitude of the policy adjustment needed to keep inflation on target, or bring it back to target, and the impact on other variables, as reflected in the specification of the policy reaction function in the QPM (a stylized example of such a reaction function is sketched after this list). This may be the most important preparatory, pre-decision-making meeting of the process. Note that sufficient time is needed between this meeting and the final MPC meeting so that staff can incorporate the feedback received into the final baseline and the MPR.

  • A meeting to present the final forecasts and policy analysis, and policy recommendations to the MPC, either on the same day as the MPC/policy decision meeting or on the day before (this may just be a segment of the MPC decision meeting where staff present the analysis and then leave the room).
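
To illustrate what the specification of the policy reaction function in the QPM refers to in the second pre-MPC meeting above, the stylized inflation-forecast-based rule below is of the kind commonly embedded in gap-type quarterly projection models; the horizon, coefficients, and notation are illustrative assumptions rather than a description of any particular central bank's model:

$$ i_t = \rho\, i_{t-1} + (1-\rho)\Big[\bar r_t + E_t\pi^{4}_{t+3} + \varphi_\pi\big(E_t\pi^{4}_{t+3} - \pi^{*}\big) + \varphi_y\,\hat y_t\Big] + \varepsilon^{i}_t $$

Here $i_t$ is the policy rate, $\rho$ captures interest rate smoothing (the pace of adjustment), $\bar r_t$ is the equilibrium real interest rate, $E_t\pi^{4}_{t+3}$ is expected year-on-year inflation three quarters ahead, $\pi^{*}$ is the inflation target, $\hat y_t$ is the output gap, and $\varepsilon^{i}_t$ is a discretionary deviation from the rule. Policymakers' views on the tradeoff between the pace and magnitude of adjustment and the impact on other variables map into $\rho$, $\varphi_\pi$, and $\varphi_y$: a lower $\rho$ or a higher $\varphi_\pi$ brings inflation back to target faster, at the cost of larger swings in interest rates and output.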

Additional meetings between staff and MPC members can be useful, help improve the quality of the policy discussions, and help smooth the process. Joint seminars and workshops between forecasting rounds give the core FT staff an opportunity to discuss with other central bank staff, central bank management, and MPC members (1) the forecasting and policy analysis tools used, and in particular the QPM, in more detail; (2) ongoing research activities; (3) ex post evaluations of the most recent forecasting round; (4) counterfactual policy simulations; and (5) preliminary results of analysis that may become a box or chapter in the next MPR. This can be particularly useful when an FPAS is being established for the first time. Several of the CD projects helped the recipient central bank staff with such seminars. It can also be helpful to hold regular joint startup meetings before the start of each forecasting and policy analysis round. In these meetings, the previous round is reviewed and any outstanding questions are picked up, the earlier forecast is assessed for rough validity, current risks are identified, and any new issues or questions that have emerged and will need extra focus during the round are flagged. This provides an opportunity for early input and signals from the MPC as to where there may be differences of opinion between MPC members and/or between staff and the MPC.
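
As a concrete illustration of the ex post forecast evaluations mentioned above, the minimal sketch below computes the bias and root mean squared error of past inflation forecasts by forecast horizon. It is written in Python purely for illustration; the file name and column layout are assumptions, not part of any standard FPAS toolkit.

# Minimal ex post forecast evaluation sketch (illustrative assumptions: a CSV with
# one row per past forecast and columns "horizon" in quarters ahead, "forecast",
# and "outturn" for year-on-year inflation).
import numpy as np
import pandas as pd

def evaluate_forecasts(path: str = "inflation_forecasts.csv") -> pd.DataFrame:
    df = pd.read_csv(path)
    df["error"] = df["forecast"] - df["outturn"]
    return df.groupby("horizon")["error"].agg(
        bias="mean",  # positive values indicate over-prediction on average
        rmse=lambda e: float(np.sqrt(np.mean(e ** 2))),  # typical size of the miss
        count="size",
    )

if __name__ == "__main__":
    print(evaluate_forecasts())

Presenting simple statistics of this kind in joint seminars helps staff and MPC members judge whether past forecast misses reflect unforecastable shocks or a systematic bias that warrants further model development.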

B. Staffing and Organization

FPAS CD projects have also provided advice on staffing and organization. While sufficiently sized teams are essential for the smooth functioning of an FPAS, many CD recipient institutions initially tended to devote fewer resources to FPAS-related work than were needed to establish and operate the system and to ensure that the capacity developed could be sustained and enhanced further. They also often paid insufficient attention at the onset of the projects to how the FPAS, once established, would be incorporated into the routine monetary policy decision-making process. Developing the FPAS was sometimes seen as merely general training of staff and a research project to build a macroeconomic policy model. As a result, the project was delegated to small research-oriented modeling teams located in the research department, instead of being assigned to the monetary policy department, where it eventually belongs given its policy support function. Relatedly, many did not realize that successfully operating the FPAS requires the participation of a broad range of central bank staff, including sectoral experts. As a result, little attention was given to the interaction between the core modeling staff, sectoral experts, and the staff responsible for external communication and for preparing the policy documents for the MPC meetings. In some central banks these teams are located in different departments, which can raise internal communication challenges.

The detailed organization of the full FT varies among central banks. This reflects differences in the circumstances, background, and internal organization of the central banks. Accordingly, the advice provided has also varied. The full FT usually consists of a core modeling group and a group of sectoral experts. In turn, the sectoral experts are often organized in smaller teams. Central banks with an extensive financial programming background often had well-established teams of sectoral experts. A typical well-articulated FT structure could consist of:

  • A core modeling/forecasting and monetary policy analysis group. This group is typically the main coordinator of the forecasting and policy analysis process and is involved in all stages of the modeling, forecasting, and policy analysis activity. Its members should therefore be well trained and have diversified skills, including solid mathematical, statistical, and programming expertise; a deep understanding of macroeconomics and macroeconomic policymaking; good economic intuition; and a solid understanding of the domestic economy. In addition, they should be able to synthesize a complex economic picture and present it in simple, nontechnical language to the central bank management/MPC and the public at large (storytelling). A properly staffed modeling group would consist of at least six or seven people. They should rotate their roles (for example, on an annual basis), with three or four of the group members participating in the regular quarterly forecasting and policy analysis rounds and the rest focusing on research and model development work.73 74

  • A real sector team (six or seven people) for analyzing and providing the NTFs for primarily the real sector variables (prices, GDP and its components, and the labor market).

  • An external sector team for preparing the external sector NTFs and inputs for the QPM-based medium-term forecasts and policy analysis. This team may also be tasked with preparing medium-term forecasts of the more detailed BOP components based on the outputs from the QPM (“forecast disaggregation”). Two or three people may be sufficient if the external sector analysis mainly involves processing forecasts prepared by other institutions such as the IMF, World Bank, Organisation for Economic Co-operation and Development, European Commission, and larger central banks. However, up to six or seven people may be required if the central bank decides to produce its own model-based forecast of the external environment, instead of relying on the forecast from other institutions.

  • A fiscal sector team responsible for following fiscal policy developments, preparing the near-term forecasts of the fiscal variables in the QPM (when there is a fiscal block in the model), and providing guidance on how changes in fiscal policy may influence developments in other parts of the economy. Three people may be sufficient when relying on analysis/forecast obtained from the ministry of finance. A larger team (six or seven people) may be required for central banks that produce their own fiscal analysis and projections.

  • A financial sector team (one or two people) that closely follows developments in the money, credit, and foreign exchange markets and provides inputs to the other teams on how developments in these markets may affect the rest of the economy.

  • A drafting and external communication team that works closely with the other experts to prepare both the documents for the MPC meetings (reports and/or presentations) and those for the public (press releases, speeches, inflation reports/MPRs).

Managing and coordinating forecasting and policy analysis rounds can be challenging, particularly when the broad FT is interdepartmental. The role of the forecast coordinator must therefore be clearly assigned, preferably to a relatively senior and seasoned staff member with a vision for change, technical skills, an ability to communicate vertically and horizontally, direct hands-on involvement in the day-to-day work of the FT, and a clear mandate to command the full FT for the duration of the forecasting and policy analysis rounds. The appropriate level of seniority depends on the circumstances, because responsibilities, skills, and the authority of senior staff members to command at an interdepartmental level vary.

5. Assessment and Lessons

A. Assessment

The observable impact on the ground of FPAS CD efforts has, as expected, varied. Key determinants have included the maturity of the authorities' monetary policy reform plans, and in particular whether the CD project was formulated as part of a clear reform strategy with strong commitment from the top; the technical skills of their staff at the onset of the project; staff and management turnover during the project; and, importantly, the duration to date of the CD project and the authorities' reform efforts. For most projects, the capacity development and technical work (that is, the development of the QPM and NTF tools) progressed broadly as planned. The experience with incorporating model-based analysis and projections into policymaking, and the impact of the projects on actual policy decisions, was more mixed.

The FPAS CD projects have contributed to a fundamental change in the monetary policy framework and monetary policy conduct in several countries. Most of these countries adopted, or were firmly on the way toward adopting, a full-fledged IT, IT-Lite,75 or IT-like monetary policy framework during the project. The authorities in one of these countries described the FPAS project as one of the most transformative CD projects they had ever received. In these countries, besides strengthening their modeling and forecasting capacity, the project had been instrumental in helping to:

  • Enhance their internal organization, work, and the policy formulation processes, and external monetary policy communication.

  • Change how the central banks thought about monetary policy and monetary policy analysis. With a fully functioning FPAS in place, more emphasis was put on understanding initial conditions, the main underlying economic forces (the real exchange rate, the neutral interest rate, potential output, etc.), and the nature of shocks (for example, their expected persistence). These are all factors that are essential for successful forward-looking monetary policy formulation, including in the context of large structural changes. The result was richer and better-structured macroeconomic and policy analysis, and more focused policy discussions.

  • Put in place a well-structured policy decision-making process. This group of countries now typically holds regular MPC or board policy meetings that adhere to a preannounced schedule, with interest rate decisions that are forward looking and based on the projections and policy analysis prepared with the tools and processes put in place under the CD project. They also regularly publish forward-looking quarterly MPRs that highlight the analysis and projections and are intended to communicate policy decisions externally.

These countries typically had staff with relatively strong technical and modeling background, strong management commitment, and a clear desire and political backing for reforms at the onset of the project. For some, the transition was relatively quick and was completed in a few years; for others, the transition was more gradual, continuing over many years and with prolonged CD support.

The impact on the policy framework and policy conduct has been more muted in other countries. Compared with the first group, these countries typically had, at the onset of the FPAS project:

  • Weaker analytical capacity and somewhat erratic policy implementation. They often had large and persistent deviations of market interest rates from the policy rate and published little or no forward-looking information.

  • Less institutional support for establishing a functioning FPAS and reforming monetary policy. While the commitment from parts of management was strong, other parts of the organization were often less supportive, and the political backing for reforming the policy framework was often weak or unclear.

However, many of these countries still made important progress that may provide the foundation for further reforms. The internal policy discussion and decision-making process, and even external communication, were in many cases substantially enhanced and became more transparent, analytical, forward-looking, and focused on the policy objective and on the policies needed to achieve it. As a result, the policy dialog with the IMF country team typically improved. In some of these countries, policy implementation and liquidity management also became more consistent, with interbank rates better aligned with the policy rate for prolonged periods.

Nevertheless, substantial weaknesses often remained, including:

  • A continued heavy management of the exchange rate and slippages in policy implementation because of cost concerns.

  • An incoherent internal policy analysis and decision-making process. In some countries the process suffered from competing and at times conflicting advice prepared by different parts of the organization with no orderly process in place for arriving at a consensus view.

  • Frequent staff rotation that in some cases caused the capacity built to be lost.

  • Weak human resource management practices and incomplete integration of the FPAS into the policymaking process. In several cases, top management did not appreciate the full benefits of a well-functioning FPAS for their policymaking. Frequent changes in top management at times contributed to this.

  • Political considerations that dominated policymaking. This can substantially reduce the impact that staff's FPAS-based analysis and recommendations have on policy decisions.

It remains to be seen whether in these cases the capacity that was built will be sustained, and whether the central bank management eventually will be in a position to make the reforms for which the CD project had provided a foundation.

Established FPAS capacity was in a few instances subsequently lost and had to be rebuilt. Reorganizations and other staff relocations, sometimes following changes in the governor or top management, resistance from influential middle managers, and gaps in CD engagement appear to have been major contributing factors. Insufficient incorporation of model-based analysis and projections into the decision-making process, together with a lack of progress in modernizing the monetary policy framework, moving toward forward-looking policy formulation, and enhancing policy communication, was often the underlying cause.

B. Lessons

The experience with FPAS CD to date suggests several important lessons. These concern all aspects, including planning of the project, securing staff and management commitment, developing the modeling apparatus, establishing FPAS processes and organization, training staff, and incentivizing human resource management. In particular:

  • Detailed assessment of capacity, organization, and processes in the central bank prior to formulating the FPAS CD project, clarity of project objectives, and a realistic assessment of what can be achieved are essential for success. FPAS CD projects typically involve substantial resource commitments over many years. Building the technical capacity requires intensive hands-on training, over an extended period, of a recipient central bank team that is large enough to ensure backup and continuity when there are staff changes and that is kept together for the duration of the project and beyond. Putting the system to use in support of policymaking often implies significant changes to how staff are organized and how management interacts with them. In addition, FPAS projects require frequent interactions with policymakers to ensure that they are kept abreast of the technical developments and support the organizational changes that will be required. Over time, as the central bank develops, the FPAS project will expand and affect more departments, which the central bank must be prepared for. Scoping missions are important for assessing the initial capacity, internal organization, and processes, which in turn should inform the project objectives, timeline, and the milestones used to track progress toward the objectives.

  • Proper planning of the project is critical. Project Logical Frameworks (logframes) should lay out the objectives of the project, specify the expected outcomes with measurable indicators, and include a detailed timeline with clear milestones. They help plan the project and keep implementation on track, enabling both deliverers and recipients to monitor progress.76 A close dialogue between the CD deliverer and the CD recipient at all stages of the project is essential. It helps ensure agreement on project design and promotes ambitious yet realistic timelines and milestones. Periodic project assessments provide an opportunity to reassess the project and take corrective measures if necessary.

  • FPAS projects should have a multiyear perspective. Medium-term planning needs to embed a clear view of the frequency and scope of missions. This helps maintain the pace of project implementation and manage the expectations of CD recipients, providers, and donors. Interruptions of and delays to planned activities can jeopardize the whole project. Similarly, rotation of recipient central bank staff causes disruptions, loss of capacity, and delays that can derail the project. Capacity that is being built can easily be lost, especially when it has yet to be put to practical use. Three to four missions a year over several years are often needed during the initial phase. Depending on the initial capacity and project progress, multiple phases may also be needed.

  • Continuity of the IMF experts implementing the projects is important for success. While different expertise may be needed for some FPAS components, most of the work involves hands-on follow-up on earlier missions. In addition, FPAS missions work on the core function of a central bank, and experts are entrusted with sensitive and often confidential information, which calls for trust and well-established relations.

  • FPAS projects have a greater chance of success when there are synergies and progress with other aspects of monetary policy modernization. Being in a position to make economically meaningful, analytically informed policy decisions is a precondition for successfully operating and effectively using an FPAS in a policymaking institution. Establishing forecasting and forward-looking policy analysis capacity is an important, but not the only, element of modernizing monetary policy. The best results are typically achieved when the FPAS project is undertaken together with other CD-supported activities to (1) reform the monetary policy framework; (2) enhance monetary operations and liquidity management so that short-term market interest rates are aligned with the policy rate; (3) establish a foreign exchange intervention policy that clarifies the role of interventions in the policy framework and in addressing disorderly market conditions, while preserving natural and beneficial short-term exchange rate volatility; and (4) enhance financial stability and develop the financial markets. FPAS projects can be a catalyst that eventually leads to such reforms.

  • FPAS CD is not mainly about modeling. Although it always includes a modeling component, introducing modeling tools requires much more than just developing models. Building the QPM and the NTF tools may not be the most important ingredient for success: models can be built remotely and relatively quickly if the data are available. Successful FPAS implementation requires, however, that staff of the recipient central bank acquire the skills to maintain and further develop the QPM and NTF tools and, importantly, the skills to use them properly for preparing the forecasts, policy analysis, and policy recommendations. Training staff in applying the tools and communicating model-based analysis is therefore necessary for success, but in most cases it is not sufficient.

  • Establishing and sustaining an FPAS that is routinely used to support policymaking requires rooting the system in the organization. The tools need to be put to practical use relatively quickly and be properly integrated into the decision-making process. For the project to succeed, the tools and new processes should also be viewed by policymakers as useful for making sound, analytically backed decisions. This requires rethinking the organization of human resources, establishing regular forecasting rounds based on forecasting calendars, and putting in place well-designed internal communication among staff and between staff and the decision makers, including regular meetings between staff and policymakers during the forecasting rounds. Work on FPAS organization and related processes should proceed in parallel with the development of technical capacity, as the two streams are mutually reinforcing.

  • Keeping the tools simple and building a strong base for forecasting and policy analysis is essential. There is a risk that both the recipient and the CD provider focus too much on building advanced, complex analytical tools at the expense of developing the foundation for effective economic analysis and decision-making: a data warehouse, the construction and interpretation of economic measures and indicators, a macro diagnostic framework, and the drafting of MPC briefings and presentations.

  • Proper judgment and a solid understanding of current developments and initial conditions are essential for successful forecasting and policy analysis, and ultimately for good decision-making. Any forward-looking policy model suitable for serving as the core forecasting and policy analysis tool within an FPAS depends on a number of hard-to-measure unobserved variables. Forecasting and policy analysis with the basic QPM requires estimating potential output and the output gap, the trend or equilibrium real exchange rate and the exchange rate gap, and the equilibrium interest rates. These variables are difficult to estimate, and errors in the estimates may result not only in larger forecast errors but also in bad policy advice. There may, however, be no better alternatives. The lessons to date suggest, as mentioned previously, that the best approach is to keep the core model relatively simple and to combine its use with expert judgment, supported by sectoral analysis, satellite models, and filters (a minimal filtering sketch is provided after this list).

  • Training the decision makers and ensuring that they quickly see the benefits for policymaking of the capacity being built is essential. Staff need sufficient time to build, practice with, and use the tools developed. At the same time, management needs to be convinced that devoting the resources required to build staff capacity and undertaking the organizational and internal procedural changes is a good investment that will enhance policymaking. Experience shows that at the onset of the CD, authorities often have an overly simplistic idea of what an FPAS entails and provides, so their expectations (including about resource implications) need to be carefully managed. Central bank management, moreover, often did not fully subscribe to the FPAS at first; their support typically increases over time as they start reaping the benefits. Regional peer-to-peer interactions and staff secondments to similar countries that are somewhat more advanced can be very helpful.

  • Building the broader FT is necessary for FPAS sustainability and depth of analysis. There is a risk that the forecasting and policy analysis work becomes limited to an almost mechanical "running of the model" by a small modeling team that lacks nowcasting and sectoral expertise. This puts excessive pressure on the QPM team and can cause important sectoral information to be overlooked, which limits the quality and credibility of the analysis and can lead to policy mistakes. High-quality forecasting and policy analysis requires inputs from a wide range of central bank staff. This calls for close cooperation, and an ability to reach consensus, within the FT, which consists of the core modeling team and often a large number of sectoral experts.

  • Potential interdepartmental conflicts should be recognized quickly and tackled while the FPAS is being introduced. The FPAS is a system to support monetary policy decision-making, so the responsibility for managing it should naturally rest with the monetary policy department. The research department may, however, have more extensive modeling expertise at the onset of the process and view the project mainly as an academic exercise that should belong to them, or at least view development of the core model in that way. This creates a risk of unconstructive competition between the two departments, both in developing the tools and in producing forecasts and policy recommendations. These potentially conflicting interests should be managed in advance; otherwise, they may become an obstacle to establishing the FPAS and securing broad-based staff support. Insufficient involvement of departments such as financial stability and markets can also give rise to conflicts and suboptimal policy analysis. Most importantly, it can result in deviations between the formal policy stance, as indicated by the policy rate, and the de facto policy stance, as reflected in the interbank money market rate. While these departments may not be seen as central to preparing the forecast, they have an important role to play both in the policy formulation process and, importantly for markets, in day-to-day policy implementation.

  • The role of the forecast coordinator must be clearly assigned, preferably to a relatively senior and seasoned staff member with good technical skills and economic intuition, an ability to communicate both vertically and horizontally, and direct hands-on involvement in the day-to-day work of the FT. The importance of the head of the FT is hard to overstate. Formalizing this role, for example by reflecting the duties in the person's terms of reference, is important both for accountability and for giving the person the formal authority to manage the team. Clear delegation of leadership and sufficient seniority of the forecast coordinator are important for avoiding conflicts when the full FT is interdepartmental (as opposed to intradepartmental).

  • Continuity of CD engagement, even after basic capacity has been established, can be important for sustainability. Capacity can easily be lost as staff move to other assignments, and new staff need to be trained continuously.

  • Human resources issues should not be overlooked. Staff turnover is normal but can be detrimental. Sustainability requires a deliberate human resources policy to ensure that the key FPAS functions are properly staffed and that new staff are recruited and properly trained before existing key staff move on. Ideally, in the modeling team, which consists of individuals with highly specialized technical skills, there should be two staff members able to undertake each task: a lead expert and a backup. Best practice is to have two full teams that rotate between running the forecasting and policy analysis round and maintaining and improving the tools and procedures and doing related research.

  • It is also important to establish clear incentives for the members of the FT to give due priority to the FPAS-related work. This can be achieved, for example, by fully reflecting FPAS-related duties in staff members' terms of reference and/or key performance indicators.

  • Proper documentation of analytical tools, organization, and processes is critical for sustainability. Documentation and manuals should ideally be prepared by the CD recipient, not the CD provider, in order to ensure ownership and mastery.
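
As referenced in the lesson on judgment and initial conditions above, the sketch below shows how a simple univariate filter can provide a first-pass estimate of potential output and the output gap. It uses the Hodrick-Prescott filter from the statsmodels Python library purely for illustration; in practice FPAS teams typically rely on richer multivariate filters embedded in the QPM and combine the estimates with expert judgment and sectoral analysis, and the input series is an assumption.

# First-pass output gap estimate with a Hodrick-Prescott filter (illustrative only;
# multivariate filters and judgment are used in practice).
import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

def output_gap(real_gdp: pd.Series, lamb: float = 1600.0) -> pd.DataFrame:
    # real_gdp: quarterly, seasonally adjusted real GDP in levels (assumed).
    log_gdp = np.log(real_gdp)
    cycle, trend = hpfilter(log_gdp, lamb=lamb)  # lamb=1600 is the usual quarterly setting
    return pd.DataFrame({
        "potential_log": trend,           # log of estimated potential output
        "output_gap_pct": 100.0 * cycle,  # deviation from potential, in percent
    })

Estimates of this kind are notoriously unreliable at the end of the sample, which is precisely where policy needs them most; this is one more reason the lessons above stress keeping the core model simple and combining filter output with judgment.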
