This paper presents a basic plan for developing a Forecast and Policy Analysis System designed to support an inflation-forecast targeting regime at a central bank. It includes discussion of the development of data management and reporting processes; the creation of a forecast team and the development of human capital; the implementation of a simple model, plus possible extensions; and the management of regular economic projections. We emphasize that it is better to implement simple models earlier and use them well, rather than wait in an attempt to develop an all-encompassing model.
Countries with experience implementing independent monetary policies in periods of flexible exchange rates have had considerable advantages making the transition to explicit inflation-forecast targeting (IFT) regimes. This is because they already had most of the systems and expertise in place to support IFT. However, for countries facing a rapid transition from fixed exchange rates to IFT, developing the internal monitoring and forecasting processes has been more difficult.
This document provides a basic plan for developing a forecasting and policy analysis system (FPAS) to support policy decision-making in countries that are considering (or have recently adopted) an IFT regime. The suggestions are based on our view of ‘best practice’ for IFT, and an assessment of how that practice could be transplanted to institutions that face challenges of limited resources and fast-changing economies. We deliberately avoid reference to specific institutions, because we think that no institution has a monopoly on what constitutes best practice. While institutions that are new to IFT undoubtedly face challenges, they are also free to consider the whole picture of what is required for IFT in creating their own practices.
The proposed development strategy is based on a belief that many of the difficulties associated with developing the systems and expertise required for IFT are usually the result of two problems:
First, there may be an attempt to develop the perfect system too quickly, at the expense of achieving a minimum acceptable degree of functionality in the short run.
Second, there may be an attempt to put too much emphasis on the model itself. An effective IFT system requires considerable integration of its parts—the databases, the reporting system, the forecasting process, the information technology, the communication and decision-making framework, and so on.
Hence, while the system proposed here borrows greatly from the forecasting and policy analysis systems that have been developed over the years in various central banks, it has deliberately been kept simple. The system must focus on the needs of policymakers, but always with a view to the resources available. It can and should be refined and extended as time and resources permit. However, it is most important to start operating as quickly as possible.
The goal of the paper is to outline systems and processes that help the staff of the economics departments in central banks achieve a coherent view of what is happening in the economy, what the policy implications of that are, and what that implies for research and further investigation.
More concretely, the proposed system involves the following parts:
Developing a reporting, database, and near-term forecasting system based on a limited but key set of macroeconomic variables, so that everyone involved in the forecast and policy process can share the same information.
Updating the database, monitoring, and reports on a timely (weekly) basis so that everyone involved in the process will be informed about how new information affects the near-term inflation forecast (and the implications, if any, for the longer-term outlook).
Developing a simple quarterly projection model of the economy that embodies policymakers’ views about the monetary policy transmission mechanism and the standard set of shocks that affect the economy. The model may be very simple at first; the important thing is to get a complete system operating quickly. The model can be extended over time as dictated by experience, but the core projection model should not be allowed to turn into a ‘black box’ by becoming too complicated.
Developing a consistent, model-based macroeconomic forecast every quarter. This includes assessing the risks to the previous official baseline forecast and using that to propose changes to the official baseline forecast.
Developing measures of uncertainty in the forecast, such as model-based confidence intervals. These measures should be used to communicate the extent of this uncertainty, both internally and to the public.
Studying specific risks in the baseline forecast and developing contingency plans for reacting to new information that is released between official forecasts.
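The simple quarterly projection model in the list above can be sketched in a few lines. The following is a minimal, deliberately stylized backward-looking gap model (an IS curve, a Phillips curve, and a Taylor-type policy rule); all coefficients, the 2 percent target, and the 2 percent equilibrium real rate are illustrative assumptions, not calibrated values:

```python
# Minimal backward-looking gap model: IS curve, Phillips curve, Taylor rule.
# All coefficients below are illustrative placeholders, not calibrated estimates.

PI_TARGET = 2.0   # inflation target, percent (assumed)
R_STAR = 2.0      # equilibrium real interest rate, percent (assumed)

def simulate(demand_shock: float, quarters: int = 40):
    """Trace the economy's response to a one-off demand shock."""
    y, pi = 0.0, PI_TARGET      # output gap and inflation start at equilibrium
    r = R_STAR                  # lagged real policy rate
    path = []
    for t in range(quarters):
        shock = demand_shock if t == 0 else 0.0
        # IS curve: gap persistence plus the effect of the lagged real rate
        y = 0.7 * y - 0.3 * (r - R_STAR) + shock
        # Phillips curve: anchored expectations, with the gap pushing inflation
        pi = 0.8 * pi + 0.2 * PI_TARGET + 0.3 * y
        # Taylor-type rule: lean against inflation deviations and the gap
        i = R_STAR + pi + 1.5 * (pi - PI_TARGET) + 0.5 * y
        r = i - pi
        path.append((y, pi, i))
    return path

path = simulate(demand_shock=1.0)
```

Even a toy model of this kind enforces the discipline the text describes: a demand shock opens a positive gap, inflation rises, policy tightens, and both variables return to equilibrium over the medium term.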
Inflation-forecast targeting involves using a wide range of information in order to obtain the best forecasts for inflation and the economy. The staff must extract the underlying pressures on inflation and the economy from data that may be conflicting and noisy. To foster the necessary cooperative effort, all participants must be able to share information and to see how new information—from themselves and their colleagues—affects the forecast.
This is important because successful inflation forecasts are not mechanical; the principal role of models is to see what implications these judgments about the state of the economy have for policy. In fact, in the very early stages of an IFT regime, forecasting and policy analysis may be based entirely on judgment, without the use of any formal model. The development of the FPAS can be greatly advanced simply by developing a process for regular meetings and reports, where the staff review recent economic developments and provide an assessment of the risks to the previous baseline forecast. However, as the IFT regime evolves, it would be useful to impose more discipline and structure on the policy debates by developing a core macro model explicitly designed to support IFT. The model may therefore be a natural extension of an existing process, but introducing a macro model in a situation where there is no process for interpreting and reporting data outturns is not likely to be very helpful.
Similarly, it will be difficult for staff to successfully introduce the changes proposed here unless their ‘clients’—the senior managers who make key policy decisions—are receptive to the need for such a structured process. In what follows, we assume that this receptiveness exists, but there may need to be a preliminary process wherein upper management is introduced to the processes described here and persuaded of their benefits. This may take some time, but it is nonetheless necessary that staff and upper management approach building a structured FPAS with a shared view of what is to be put in place and the ultimate goals.
The remainder of the paper is organized as follows. Section II discusses an internal process for policy decisions with fixed, announced action dates and reports to the public. It also reviews what information and analysis policymakers need to make informed and timely decisions about setting the policy instrument. Section III explains the basic benefits associated with a successful IFT framework and the specific benefits associated with developing a structured FPAS. Section IV discusses the steps that are necessary in developing such a system. Section V describes how the quarterly projection exercise could be organized. Section VI provides an outline of a simple model that could be used for small open economies, and explains some of the technical aspects about the forecast. Section VII explains what would have to be done to develop the internal expertise to build more sophisticated macroeconomic models for use in the FPAS.
II. Developing a Decision-Making Process
In a regime with a fixed exchange rate, there is no independent domestic monetary policy and so no decision-making process is required, other than the day-to-day reaction to developments in the exchange market and closely related financial markets. However, once there is an independent domestic monetary policy, such as arises from an IFT regime, it becomes imperative that there be a structure and process for monetary policy decisions. In later sections of this paper, we will look at the details of the underlying FPAS necessary to support monetary policy decisions. Here, we focus on the end result, the policy decisions, covering the information required, the nature of discussions and deliberations that will facilitate the process and the nature of communication of the decisions to markets and the public generally.
Different institutions have different internal decision-making structures. In some, the final decisions on policy actions may rest in a single individual, a Governor of the central bank or the equivalent. In others, there may be a group of senior managers charged with that task. In our discussion, we will simplify the exposition by assuming that the decision-making body is a Monetary Policy Committee (MPC). In this discussion, it does not matter whether the MPC is a group or an individual.
Central to the success of an IFT regime is developing credibility that the monetary authority will act to respect the announced inflation targets over time.2 A major benefit from earning such credibility is that routine shocks will not have profound lasting effects; long-term inflation expectations will remain anchored to the policy target and routine shocks will have only minor transitory effects on markets. This ideal situation will not come immediately from an announcement of a new policy regime. It will have to be earned and defended over time. The only way this can happen is through clear and consistent communication of the policy decisions and, most importantly, the reasons for them.
Information comes to policy decision-makers all the time. In financial markets, there is something new every day. However, IFT policy decisions require a longer horizon and a clear macroeconomic framework. In most countries, complete economic data in the form of national and/or domestic accounts comes quarterly, with a lag of course. Yet, many partial indicators are available monthly. In the sections that follow, we describe a system of monitoring, forecasting and policy analysis that is structured around the flow of information. This system is designed to provide for ongoing discussion of economic developments and the implications for monetary policy that will take place continuously, and with some formal internal discussion and summary at least once a week. Policy decisions based on such ongoing discussions will be better informed and better communicated. However, that does not mean that decisions need to be taken and communicated every week.
In recent years, many monetary institutions have moved to a fixed calendar of formal decision dates and consequent communication with markets, with these dates announced in advance. Since we are discussing regimes designed to deal with inflation, there is a case for using the release of data on inflation, typically monthly, to trigger the process of deciding and communicating the results. Indeed, in some institutions, a full monetary or inflation report is generated monthly. If that is desired and feasible, there is no problem. However, we want to stress that neither the decision-making nor the communication need be tied to such a demanding schedule. For countries entering into IFT decision-making for the first time, full reporting on a quarterly basis, linked to the release of the domestic accounts, and a full-blown update of the economic outlook and medium-term forecast, would be reasonable.
This does not mean that decisions should be taken and communication undertaken only at these quarterly ‘full-report’ times. There could be a fixed schedule of monthly decisions triggered by the latest inflation data (if available at monthly frequency), with a brief update on economic and monetary conditions and the reasons for any actions taken, presented in the context of an update on the previous quarterly report, in the months between releases of the domestic accounts.
It is not essential that the fixed schedule be monthly. It is widely accepted that quarterly is not often enough, but some countries have chosen to have fewer than twelve dates in their decision-making schedule. In the end, the frequency of decisions and reports must strike a balance. Having too many puts unnecessary strain on internal resources and may frustrate markets if there is regularly too little new information to warrant reviewing economic conditions and policy choices. Having too few decision dates risks forgoing the discipline of confronting the evolving evidence and providing a clear interpretation, and risks delaying policy action past the point where it is clearly called for.
It is important when choosing a fixed timetable for decisions and communications with markets that it be made clear that there is nothing to stop the MPC from acting at other times and communicating the nature and reasons for these actions, if extraordinary circumstances arise. This cannot be a regular occurrence, or the benefits of the fixed timetable will be eroded, but markets will understand that the MPC must be free to act when necessary under extraordinary circumstances. It is in everyone’s interest that monetary policy be conducted well.
Although a schedule should not be allowed to compromise good monetary decisions, there are important advantages to having fixed decision dates and reports. Markets like predictability of process. Having a fixed timetable for policy decisions and public discussion focuses the attention of all players on the underlying issues at the same time and with the same data. Commentators, both from markets and from the academic community, will debate what the policy authority should do, listen to what is said about what was decided, debate whether it was the right action, and then move on. Both sides can gain from this. It is important to remember that markets are there to provide goods and services and, while the monetary policy environment can be an important consideration for at least the timing of resource allocation decisions, it would be unfortunate if markets felt it necessary to focus on these issues all the time, rather than their core businesses.
The policy authority also benefits from the fixed timetable, since there is no need to respond formally to every piece of new information. It is also helpful that there is the discipline of responding and providing commentary at each decision date. Even when no change of policy stance is necessary, and perhaps especially when that is so early on in the learning process, it is important that the MPC communicates to the markets why that is so. Without the discipline of fixed dates, it might be tempting to wait until there is something more to do and say. Or, possibly worse, it might be tempting to react to every new development, without the benefit of careful thought. The fixed timetable also provides useful discipline for the internal communication process; nothing focuses an organization better than the need to put arguments on paper for release to the public.
It is also helpful for policy makers to get responses from the markets at these specific times. At the least, contrary views can help define the research agenda. Issues of disagreements on the facts can be dealt with relatively easily. Issues of interpretation are more difficult, but the policy maker must always be ready to engage in the debate, if by doing no more than repeating consistently the story that blends the economic conjuncture (economic conditions as revealed in the data), the vision of how the economy operates and how policy decisions impact on that economy, and how the decisions taken therefore fit with the objectives of the IFT regime.
Let us now consider what information the MPC needs to make policy decisions. For virtually every country entering into an IFT regime, the starting point must be a world economic outlook, with special attention to economic conditions in the economies of important trading partners and sources of capital. It is by no means necessary to develop such an outlook internally, but members of the Staff must be assigned to monitor and report on external conditions and to provide a forecast of the implications for trade and prices. Special attention must be given to oil prices or significant commodity prices, depending on what is important for the domestic economy. Significant capital account developments, linked to developments in private and public debt or to major foreign direct investments, may also be important to the outlook generally and to the monitoring of domestic investment demand. To understand developments with respect to the currency, there will have to be some consensus on the underlying real exchange rate path. This is easy to say, but not so easy to obtain.
Nevertheless, the interpretation of data and the consequences for policy must always be presented in the context of the underlying trend equilibrium paths.3
Turning to the domestic economy, the MPC will need to know about domestic demand and the nature of inflationary pressures. This will require some consensus on the level and growth rate of potential output, as well as a detailed monitoring of the components of aggregate demand. The notion of an output gap, the difference between aggregate demand and potential, provides a key part of the macroeconomic model of the links between economic conditions and inflation to be described below. Judgments on the level and likely future course of excess demand will be a key part of the background necessary for policy choices.
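A consensus on potential output is ultimately a matter of judgment, but it helps to start from a mechanical benchmark. The sketch below proxies potential by a log-linear trend fitted by least squares, with the gap measured as the percent deviation from that trend; this is deliberately crude (actual FPAS practice uses richer tools such as multivariate filters or production functions), and the GDP series is fabricated for illustration:

```python
# A deliberately crude output-gap estimate: potential output proxied by a
# log-linear trend fitted by least squares. The quarterly GDP series below
# is fabricated for illustration only.
import math

def log_linear_gap(gdp: list) -> list:
    """Output gap in percent: 100 * (log GDP - fitted log-linear trend)."""
    logs = [math.log(v) for v in gdp]
    n = len(logs)
    t_mean = (n - 1) / 2
    y_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(logs))
             / sum((t - t_mean) ** 2 for t in range(n)))
    intercept = y_mean - slope * t_mean
    return [100 * (y - (intercept + slope * t)) for t, y in enumerate(logs)]

# Fabricated data: 1% trend growth per quarter, with a boom in quarters 8-11
gdp = [100 * 1.01 ** t * (1.02 if 8 <= t < 12 else 1.0) for t in range(20)]
gaps = log_linear_gap(gdp)
```

The boom quarters show up as a positive gap relative to trend, which is exactly the kind of excess-demand signal the MPC needs judgments about.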
In most countries with an IFT regime, the policy instrument is some interest rate that can be set or influenced directly by the monetary authority. Often, the rate used in announcing policy decisions will be some rate that determines lending conditions in an overnight market including the central bank and large financial institutions. Broader short-term market rates that influence domestic demand are usually closely linked to the policy rate. Understanding the links between these and other market rates is necessary background, but whatever particular market rates are chosen for focus in the policy discussions, the policy maker will have to have some view of the underlying real equilibrium, in order to interpret the implications of particular choices for nominal rates. A particular level for nominal rates may be stimulatory or restrictive, depending on the underlying real equilibrium and the expected rate of inflation.4
Finally, since inflation, however defined, is the underlying focus of an IFT regime, the policy maker will need a complete monitoring of recent developments and the likely course of various measures of inflation over the policy horizon. In all cases, short-term monitoring of this sort will include attention to component detail and all useful indicators and other information, far more detail than would be incorporated into any macroeconomic model used to provide a vision of how the economy works and is influenced by policy choices.
We have stressed that monetary policy decisions are always made in an uncertain environment. It is important in the decision-making process that this be addressed explicitly. The MPC will not want just the best guess of the staff in terms of a baseline forecast. They will need, as well, a thorough discussion of the main risks to the forecast and the particular judgments made by the staff in arriving at the baseline forecast. Particularly important in this are the implications for the policy choices that must be made.
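One concrete way to make this uncertainty explicit is a model-based fan chart: re-simulate a simple inflation process many times under random shocks and report percentile bands around the baseline. The sketch below uses an AR(1) process around a 2 percent target; the persistence and shock size are illustrative assumptions, not estimates:

```python
# Sketch of model-based confidence bands for inflation: draw many simulated
# paths under random shocks and report percentile bands at each horizon.
# The AR(1) process, persistence, and shock size are illustrative assumptions.
import random

random.seed(12345)       # fixed seed so the illustration is reproducible

PI_TARGET = 2.0          # inflation target, percent (assumed)
PERSISTENCE = 0.7        # assumed quarter-to-quarter inflation persistence
SHOCK_SD = 0.5           # assumed std. deviation of quarterly shocks

def one_path(start: float, horizon: int) -> list:
    pi, path = start, []
    for _ in range(horizon):
        pi = PI_TARGET + PERSISTENCE * (pi - PI_TARGET) + random.gauss(0, SHOCK_SD)
        path.append(pi)
    return path

def fan_chart(start: float, horizon: int = 8, draws: int = 2000):
    """Return (10th percentile, median, 90th percentile) at each horizon."""
    sims = [one_path(start, horizon) for _ in range(draws)]
    bands = []
    for q in range(horizon):
        outcomes = sorted(p[q] for p in sims)
        bands.append((outcomes[draws // 10],        # 10th percentile
                      outcomes[draws // 2],         # median
                      outcomes[9 * draws // 10]))   # 90th percentile
    return bands

bands = fan_chart(start=3.0)  # inflation currently one point above target
```

Presenting the bands alongside the baseline helps the MPC see both the expected reversion toward target and how quickly uncertainty accumulates over the forecast horizon.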
The process of assembling and presenting this macroeconomic view will depend on what information is available. Let us begin with a period when the full domestic accounts have been released. In this case, the Staff will have reviewed all the data and arrived, in consultation with the MPC, at a new baseline forecast and a new set of alternative scenarios based on various risks to the baseline projection. This exercise will include a review of all the underlying judgments about real equilibria mentioned above. In most cases, the baseline projection will include a path for the policy rate consistent with respecting the IFT goals. The alternative scenarios will also contain information on the possible effects of shocks or risk scenarios on the policy choices. The MPC must consider all this information, along with all other available economic intelligence, and their personal views, in coming to a choice as to what to do. In the context of a quarterly report to markets, a full discussion would be expected, including the underlying views about external conditions and commodity prices, views about the state and future course of the domestic economy, and the implications for inflation. Policy changes must be presented as a response to changing views on these matters in order to build a broader consensus as to how monetary policy works. Routine discussion will gradually build confidence and credibility, which will pay off when serious shocks arise.
Consider now how this process would change if the decision point were not one that included a complete release of domestic accounts. In such circumstances, the Staff will normally not be expected to prepare a complete new baseline projection. Rather, economic developments will be assessed relative to the existing baseline. All the components described above will be there, starting with changes to the external outlook and ending with the monitoring of developments and the short-term outlook for inflation. The MPC may want to see a review of the policy implications of these changes of view, perhaps in the form of analysis of the effects relative to the previous baseline of the new information or views, or a revision of the risk assessment. Communication of decisions would then focus on what has changed since the last decision point and how it has influenced the choices made. In cases where no change in a policy setting has been announced, this must be explained in terms of why the new information is within the compass of the previous scenario. Explaining if there is a change of view on the nature of the main risks or the balance of risks would also be useful in such updates.
III. Benefits of a Successful IFT Regime
There are several real benefits that can be derived from a successful IFT regime. These include:
Less uncertainty about the future price level and inflation.
More efficient allocation of resources because of better price signals.
Lower levels of unemployment and higher levels of productive economic activity.
Lower real interest rates and higher levels of investment (and permanent income).
Improved transparency and policy credibility.
The ultimate success of an IFT regime depends on how well the monetary authorities do at avoiding large boom and bust cycles and providing an anchor for inflation expectations. In a fixed exchange rate regime, the performance of the central bank can be monitored, internally and externally, by reference to well-defined targets, which are continually observable in real time. However, in a floating exchange rate regime with IFT, a central bank needs to rely on internal assessments to determine the appropriate policy stance. Moreover, a clear logical policy framework is needed to support the dissemination of policy to the outside world. This emphasizes the need for an internal structure that can provide the basis for assessments of the state of the economy and the nature of the underlying pressures challenging the achievement of the target. The decision-making process inside central banks can be improved considerably by the development of a structured FPAS. Indeed, we can identify at least six types of benefits that can be derived from a structured FPAS. These benefits include:
Improved communication between all people involved in the forecast and policy debates, leading to important synergies and the continual improvement of the process of coming to a collective view.
A more structured debate about the risks in the forecast, allowing policymakers to more easily identify and communicate the key policy issues.
The development of a database to assess the historical forecasting record, informing staff where new research effort is required.
The development of highly specialized human capital.
The development of institutional knowledge about the monetary transmission mechanism and the effects of disturbances or shocks that arise.
The possibility of greater transparency about the systematic component of monetary policy and how the forecast is developed.
The challenge for the institution is how to allocate its resources and structure the decision-making process to achieve these benefits. In what follows, we assume that the institution has a team of economists on staff, serving a central decision maker such as a Monetary Policy Committee. Our plan will involve organizing these staff into forecast teams, reporting to the MPC, that can produce and maintain various inputs into a projection process: databases; regular reports; assessment of current economic conditions and the near-term outlook; and, of course, the forecast itself. Our proposed structure is derived from the argument that a carefully structured process will be required for the staff to be able to provide the requisite internal advice to the MPC on how the economy is evolving and the appropriate policy response. This will involve being able to reassure the MPC that the staff have all relevant information at their disposal, and that they are able to extract the key messages from this information. The next section describes the building blocks for this process in more detail.
IV. Critical Path for Developing a Structured FPAS
This section provides a critical path for developing a structured forecasting and policy analysis system and the necessary expertise to use it. An important assumption of the proposed path is that management has a strategic vision of what the system looks like: who is involved; what resources they have; and how the information flows within the institution. If this vision is widely understood and accepted, then the system is able to exploit the synergies associated with a shared common paradigm and philosophy, enhanced data integrity, improved communication between staff and management, and the productive use of information technology. Furthermore, there must be recognition by all parties involved in the process that approaches that attempt to develop a perfect system too quickly, or parts of the system independently of other parts of the system, may be doomed to failure.
The initial systems proposed here are, by design, simple and should be extended as dictated by experience.
Step #1: Creation of a Small Forecasting Team with Well-Defined Responsibilities
The two main responsibilities of the forecast team are to ensure that the forecast process is internally consistent and that the near-term and medium-term forecasts are based on all relevant information. To achieve these two objectives, it is essential that the forecasting team has the support of management and can mobilize resources within the institution to help.
The responsibilities and focus of members of the Forecasting Team can be separated into two types: those that focus on developing near-term forecasts for the economy (current and next quarter), and those that focus on developing medium-term projections (as well as conducting alternative scenarios that are generated to highlight risks).
We make a distinction between near-term and medium-term forecasts for several reasons. First, in order to build and maintain credibility, the institution will need to be seen, both externally and by the MPC, to be able to forecast the near term reasonably well. Much of what determines near-term outcomes is often quite idiosyncratic, and therefore difficult to forecast with a macroeconomic model designed to capture stylized business cycle dynamics. Expert staff using detailed (often anecdotal) information will have a comparative advantage in forecasting the short-run outturns. Second, macro variables are usually quite persistent, which implies that an assessment of pressures underlying the economy will need to include an assessment of those pressures over the near term as well. It therefore makes better sense for a model-based, medium-term analysis to start when macro variables are freer to move in response to fundamental pressures. Although it is difficult to be precise about just when this might be, treating the first two quarters as the near term will work well in practice.
This implies that the near-term and medium-term forecasters have quite different roles. The role of the near-term forecasters is to assess the current state of the economy and to develop forecasts for the current and subsequent quarters based on all available information. By contrast, the role of the Macroeconomic Projections Team (MPT) is to assess how the estimated disequilibria (or imbalances) in the economy will play out (and hence what pressures there will be on policy).
Within the first group, each forecaster should have a well-defined role, such as monitoring a particular sector. This might be usefully organized along the lines of national accounts expenditure groups, balance of payments, prices and inflation, money and credit, financial markets, and external conditions, hence achieving economy-wide coverage.
While this demarcation of responsibilities is necessary for the team members to work effectively, each member of the team should be aware of their responsibilities to the others. An essential part of this is providing feedback, as required, to ensure that the forecast process and scenarios are internally consistent.
While there should be well-defined roles, individuals should not be asked to play the same role indefinitely. There should be ongoing training to ensure that when someone leaves another is ready to take over.5 This is necessary for the integrity of the system over time and to ensure that motivation is kept high. Indeed, each group that supports the forecast should have positions devoted to longer-term projects, training, or backup for the forecast role. For human-capital development, it would be useful if the staff in these groups interacted with the research staff in the division assigned to execute longer-term projects. Furthermore, to exploit the benefits of information technology, all staff involved in the IFT system should have direct access to support resources (programmers, research assistants and database managers) assigned to assist them with their tasks.
Appendix 1 provides an example of how the responsibilities of the Forecast Team could be defined and organized and how they could report to the Monetary Policy Committee. As mentioned above, while the Forecasting Team should have well-defined responsibilities, the exact composition and the responsibilities of individual team members should depend on the expertise of the available personnel and the other trained resources that the FPAS has at its disposal.
Step #2: Development of a Database System Designed to Support Monitoring and Near-Term Forecasting
The first task of the Forecast Team should be to develop a Historical and Near-Term Forecasting (HNTF) database. This database forms a common point of reference for discussions between staff and management. It also forms the basis for trying to assess trends and make judgments about what is signal and what is noise in the latest data outturns. Until it is completed, it will be very difficult to develop a reporting system, or to impose consistency and transparency in the forecast process.
This database should include the key macroeconomic aggregates that are used to monitor and forecast the economy. For an FPAS designed explicitly for IFT, it should also include extra detail to provide the basis for assessing and forecasting trends in the real economy and core measures of inflation. Appendix 2 discusses database management issues in more detail and the relationship between the HNTF database and other databases that need to be created to support IFT. The HNTF database should include the sector specialists’ near-term forecasts for the first two quarters of the forecast as well as historical time series.
The database should be updated weekly and be accessible to everyone who is involved in the forecast and decision-making process. The process of weekly updates forms the basis for continual ‘filtering’ of new information, providing staff with the ability to systematically account for how and why a new outturn affects their view of the state of the economy. This involves discussions of data problems, special factors, and many of the technical aspects that are central to extracting trends from data. The output is information that can be used in formal reports to management and the public (such as Situation Reports and Inflation Reports). These discussions also lead to the important ‘stylized facts’—the staff’s view of how the economy works—which a formal macro model should be able to mimic.
It is important that clear lines of responsibility be established for updating the database, for explaining and documenting how the data are constructed, why the data have changed, and any special factors that might have affected them. Each weekly database should be stored permanently so that these vintages can be analyzed periodically. This will provide measures of the uncertainty in the near-term forecast and may indicate where the near-term forecasting system can be improved.
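As a minimal sketch of what such vintage storage enables (all dates, series, and figures here are hypothetical), each weekly snapshot of the database can be archived and later compared against near-final data to measure the errors in the near-term forecast:

```python
import statistics

# Hypothetical archive of weekly HNTF vintages: each snapshot maps a quarter
# to the staff's near-term forecast (or outturn) for, say, CPI inflation.
vintages = {
    "2024-W10": {"2024Q1": 3.1, "2024Q2": 2.9},
    "2024-W14": {"2024Q1": 3.3, "2024Q2": 3.0},
    "2024-W27": {"2024Q1": 3.4, "2024Q2": 3.2},  # later vintage: near-final data
}

def near_term_errors(vintages, final_vintage, quarter):
    """Error of each earlier vintage's figure for `quarter`, measured
    against the value recorded in `final_vintage`."""
    actual = vintages[final_vintage][quarter]
    return {
        week: snapshot[quarter] - actual
        for week, snapshot in vintages.items()
        if week != final_vintage and quarter in snapshot
    }

errors = near_term_errors(vintages, "2024-W27", "2024Q1")
mae = statistics.mean(abs(e) for e in errors.values())
print(errors)  # signed error of each stored vintage
print(mae)     # mean absolute error of the near-term forecast
```

Statistics of this kind, accumulated over many quarters, are exactly the raw material for the uncertainty measures mentioned above.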
Step #3: Development of a Weekly Reporting and Intelligence System
The HNTF database serves as the foundation for a weekly reporting system. A weekly report to management ensures that new information is assessed systematically, and should reassure the MPC that they have a command of ongoing developments in the economy. This weekly reporting system will include a standardized set of tables and charts, and will provide the basis for discussing risks associated with the last official forecast and other more recent monitoring.6 Indeed, within hours of the database update, the chief economist (or a designated backup) should chair a short meeting with the staff and forecast team to discuss possible revisions to the near-term outlook. Shortly after this meeting, the chief economist should report the information to the MPC and explain how the recent data and other anecdotal evidence have suggested changes to the near-term outlook. If the process is working well, it will help ensure that no action is taken when an outturn is judged to be insignificant noise, while still allowing policymakers to react to an apparently extreme development, even if it does not fall on one of the fixed decision dates.
These meetings are intended to provide a regular forum for upper management to ask specific questions and to probe the staff’s interpretation of the data (and any related implications for the near-term forecast and monetary policy). All members of the Forecast Team should therefore be present at key meetings with the MPC. The chief economist should also invite other staff members if questions are likely to be posed that require their specific expertise.7
In addition to improving communications between staff and the MPC, the reporting system should provide benefits for the staff themselves. First, it should provide a forum in which staff can understand how new information leads to changes in the very near-term forecast. Second, it will help focus attention on the links required between the actual consumers and producers of data—that is, part of the reporting process is to provide incentives for ongoing contact between staff at the central bank and at the statistical agencies.
We have described weekly meetings as a default option, which raises two issues. First, depending on staff and MPC structure, this schedule may prove quite demanding. There is some flexibility to work with less frequent meetings, and it should go without saying that if resource demands cause the quality of the information reaching the MPC to suffer, the schedule should be changed. Second, there is the risk that staff and the MPC may become preoccupied with immediate concerns or very recent data, causing them to focus on the noise rather than the signal. This is an ongoing risk with economic forecasting. We are inclined to think that it will be easier for staff to stay on top of recent outturns with more short meetings rather than fewer longer ones, since at the higher frequency there will be fewer outturns to consider at each meeting. Whether the meetings are weekly or less frequent, the most important question is whether there is any new information that would lead staff to change their view.
We also see these meetings as somewhat independent of the policy decision cycle. The frequency at which the MPC changes the monetary instrument will generally not be a staff decision—in some countries, the MPC may not be bound by a fixed decision calendar at all. In advanced economies, the underlying shocks will tend to be ‘fundamental’ in nature, whereas in emerging market economies the policymakers will have to deal with confidence shocks and potentially large swings in asset prices as the result of portfolio shifts driven by preference shocks. For this reason, we favor the weekly frequency for reports to the MPC, as much of the relevant information will be informal in nature and the MPC will need to feel well-informed about potentially fast-changing circumstances. Indeed, part of the transition process will involve building up the credibility of the institution and the IFT framework through clear and well-informed communications with private sector commentators.
Step #4: Creation of a Repository for Written Documents
It is important to learn from experience and to incorporate available information as efficiently as possible. In order to develop this sort of institutional knowledge, it is important that staff document data problems, special factors, changes in procedures, revisions to data, models etc. In addition, these written documents provide an invaluable resource for new staff as they attempt to learn new tasks or improve on existing ones. The repository system should be simple and the notes should be accessible to everyone. A very simple system is proposed where internal research notes (RNs) are distinguished from memos (MEs), but both are stored in a specific location that staff and upper management can access. This process can be facilitated by computer technology, but it is important to keep the system that manages the repository simple initially when the quantity of written documents is fairly small. As with other aspects of the system, it is more important to get something working as quickly as possible rather than spending too much time and resources trying to develop the perfect system.8
Step #5: Development of a Simple Model of the Economy
A natural extension of the system when these elements are in place is to express the institution’s view of the way the economy works in a formal model. It is essential that the model reflect the views of policymakers for it to be considered seriously in policy discussions. Of course, the staff must also have confidence that the model’s properties are defensible. In the early stages of working under an IFT regime, it will probably be much easier to achieve this consensus with a small model, with a view to elaborating it over time in response to questions that need to be addressed in policy debates. It is much better to introduce a simple model as soon as possible than to aim at introducing a complex model in one long step. Even in the longer term, carefully-considered analysis with a simple model will prove to be far superior to badly-coordinated and under-resourced analysis with a complex model.
The core model’s purpose is to aid staff in interpreting the implications of the state of the economy provided by the near-term forecasters. The model’s insight and usefulness come from ensuring some degree of macro consistency in the projection process. It should not, however, be treated as a ‘black box’—in particular, staff should always be aware that there are important uncertainties in the model’s assumptions and should attempt to understand the implications of these uncertainties for policy. The model should be improved over time, but always with the requirement that it remain clear and be widely understood.
Step #6: Exploring the Implications of Risk and Uncertainty in the Forecast, and Highlighting this Uncertainty to the MPC and Externally
One of the main advantages of using a core model in the projection process is that the model can provide a logical foundation for studying the policy implications of uncertainty. Indeed, the potential strength of models designed to support IFT is not that they can forecast the economy accurately, but that they can be used as organizational devices for developing strategies for dealing with uncertainty.
One useful step would be to compute confidence levels around the forecast.9 This serves to remind both participants within the institution and market observers that forecasts are not exact, and hence that policy decisions should not be swayed by small errors. In essence, the monetary policymaker lives to fight another day, by re-assessing new information and resetting policy accordingly. On the other hand, this process can also be used to illustrate specific risks around the official baseline forecast in order to warn market participants in advance about how monetary conditions might need to change in response to new information. By highlighting the risks and uncertainties surrounding the forecast, the institution will find it easier to communicate these points to observers and to deal with the effects of larger shocks as they arise.
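One hedged illustration of how such confidence levels might be computed (the point forecasts and error statistics below are invented for the example): take the root mean squared error of past forecasts at each horizon and, assuming roughly normal errors, widen the band as the horizon lengthens:

```python
# Hypothetical RMSEs of past inflation-forecast errors, by horizon in quarters.
rmse_by_horizon = {1: 0.3, 2: 0.5, 3: 0.7, 4: 0.9}

# Illustrative baseline inflation forecast (percent), horizons 1..4.
baseline = {1: 3.0, 2: 2.7, 3: 2.5, 4: 2.3}

def confidence_band(point, rmse, z=1.645):
    """Approximate band assuming normally distributed forecast errors;
    z = 1.645 corresponds to roughly 90 percent coverage."""
    half_width = z * rmse
    return (point - half_width, point + half_width)

for h in sorted(baseline):
    lo, hi = confidence_band(baseline[h], rmse_by_horizon[h])
    print(f"h={h}: {baseline[h]:.1f} in [{lo:.2f}, {hi:.2f}]")
```

The widening bands are the familiar ‘fan chart’ in tabular form; they remind readers that the point forecast is the center of a distribution, not a promise.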
V. The Quarterly Projection Exercise
The quarterly projection exercise has a number of objectives. These include: (1) producing an internally consistent, medium-term baseline forecast; (2) assessing risks and uncertainties in the forecast; and (3) providing a regular forum for the deliberations that are necessary for the MPC to develop strategies for implementing monetary policy. The quarterly projection exercise should be completely open and transparent to all involved; this can be facilitated by a series of meetings and written documents that are circulated before and after the meetings.
The role of the Leader of the Forecast Team is to manage the projection process and to organize the meetings that will inform all participants about how the forecast is being constructed. The quarterly exercise could be organized as follows:
Step #1: Production of the Forecast Calendar
The first step in the quarterly projection process is to produce and circulate the forecast calendar. The Leader of the Forecast Team should circulate a preliminary version of the forecast calendar to the MPC and the staff at least 8 weeks prior to the release of the national accounts, to be finalized within a week.10 This means that all of the key meetings can be scheduled well in advance, so that both staff and management can plan accordingly.
The first date in the forecast schedule—the Issues Meeting—should typically be about two or three weeks before the release of the national accounts. The quarterly projection exercise finishes when the MPC signs off on the baseline forecast and risk assessments. This final, ‘big picture’ forms the basis for the forecast book.
Step #2: Production of the Issues Paper (2 Days before the Issues Meeting)
Three days before the Issues Meeting, staff and members of the MPC should submit issues to the Leader of the Forecast Team in the form of written memoranda. The Leader of the Forecast Team should then assemble these smaller issues notes into a complete Issues Paper. We can usefully think of the material in the Issues Paper under three headings:
changes in the staff’s view of how the economy is working and consequent proposed changes to the model’s structure and properties;
long-run or equilibrium issues, such as structural imbalances, the trend real exchange rate, potential GDP etc.; and
implications of recent data for the near-term forecast.
The Issues Paper should be circulated at least 2 days before the Issues Meeting. After reading it, staff and members of the MPC should be aware of the concerns of all people involved in the Quarterly Projection Exercise.
Step #3: The Issues Meeting (2 or 3 weeks before the release of National Accounts)
(Attended by the MPC, Forecast Team and Sector Specialists)
The exact date of the Issues Meeting will depend on the time required to consider the topics in the Issues Paper, but would usually be two or three weeks before the release of the national accounts. The Issues Meeting should be chaired by the chief manager of the FPAS, or the person on the MPC who is responsible for the process. This person must ensure that the points from the Issues Paper are worked through. If an issue is deemed to require further thought, additional analysis may be requested, either for immediate action or for future Issues discussions. After the Issues Meeting, everyone involved in the projection exercise should have a good idea about how the quarterly projection exercise will proceed and what the key assumptions will be. However, they will also be aware of differences in views, and this information will be essential for preparing the risk assessments.
Step #4: National Accounts Meeting (The day the National Accounts are released). (Attended by the Forecast Team and Sector Specialists)
Shortly after the release of the national accounts there should be a meeting with the people responsible for producing the national accounts. This meeting is to assess data quality and to collect any background information that people in the statistical agency might have on special factors that have influenced the data. A short document should be written that summarizes these discussions.
Step #5: The Near-Term Forecast Meeting (1-2 Days after the Release of the National Accounts).
(Attended by the MPC, Forecast Team and Sector Specialists)
Shortly after the national accounts meeting, staff should update the near-term forecast and present the implications of new data and anecdotal information to the MPC. At this meeting, there should also be discussion of the risks to the near-term forecast and any related risks they might pose for the medium-term forecast. This meeting and the update of the HNTF database provide two of the essential building blocks for the first round of the quarterly projection. It is important to emphasize that the purpose of this meeting is to lay the foundation for the rest of the quarterly projection—that is, to assess the current state of the economy. Members of the MPC should be able to question the staff’s judgment and to suggest changes to the basic assumptions that will be used to generate the near-term part of the official forecasts. The intent here is to make near-term forecasts only for those variables that can be assumed to be unresponsive to changes in monetary conditions. Variables such as interest rates and exchange rates should be allowed to adjust in response to changes in the official short-term policy rate.
Before the official forecast is finalized, the near-term forecast should be updated based on any new information that has arrived. In normal circumstances, this update should be relatively minor. However, in extenuating circumstances, the update may require a slight extension of the forecast schedule. Therefore, after the near-term forecasting meeting, everyone involved should have a clear picture of the basic assumptions of the near-term outlook and be aware of what sorts of new developments would cause a change in view.
Step #6: The First Medium-Term Projection Round (Round 1)
In what follows, it is assumed that a good forecast is not produced in one step but requires several iterations. Each iteration, or round, should serve to highlight inconsistencies in the forecast and build a consensus view about the pressures on the economy and the response of policy. Unless there are extenuating circumstances, it should take about five working days of projection rounds for the forecast to converge. This should not be taken as a fixed rule, but if the quarterly projection rounds take much longer than five working days to complete, there may be problems that need to be rectified to make the process more efficient.
The first round serves the purpose of building on the near-term forecast and assembling all the information required to derive implications from that forecast and the other exogenous assumptions for the medium term. This cannot be expected to be fully consistent—that will be ensured by subsequent rounds. The objective is to gain a first impression of the medium term. Hence, the emphasis is upon making sure that all information is brought to the attention of everyone involved in the process.
Formally, directly after the HNTF database has been updated, the Macroeconomic Projections Team (MPT) updates the estimates of the equilibrium measures that are used in the Quarterly Projection Model and then employs the accepted forecasting rules (agreed upon in the Issues Meeting) to derive the first round of the medium-term projection. The results of this round and all future rounds should be available when the staff and members of the MPC arrive each morning.
The reports circulated with each quarterly projection round should be transparent about the assumptions used to generate the results. Indeed, each person involved in the process should be able to access a system that will allow them to incorporate changes to the assumptions and to produce scenarios based on their own judgment.11 After discussing with staff and members of the MPC any concerns based on the results of the previous projection round, the Forecast Team should meet at about 11:00 a.m. for approximately 60 minutes to discuss any suggested changes to the baseline forecast. Because one of their key roles is to ensure that the projection exercise is internally consistent, the baseline scenario should not be changed unless the change represents a clear consensus on the part of the MPC. If suggested changes make the forecast internally inconsistent with other maintained assumptions, then these new changes can be considered as potential starting points for (completely separate) alternative scenarios. These scenarios will highlight and quantify risks in the official baseline scenario by showing the implications of an alternative assumption for policy. By 1:00 p.m. each day, the MPT should have clear directions from the Forecast Team about the types of judgment that they wish to impose in the next projection round. The job of the MPT is then to construct and circulate the results for the next projection round so that they are available the following morning.12
Step #7: Imposing Judgment in Subsequent Projection Rounds
In subsequent projection rounds, it is the role of the Model Operator (MO) from the MPT to incorporate judgment through the residuals (or the add factors) of the model. This judgment usually arises from the need to refine the aggregate picture produced in earlier rounds; the model should show up inconsistencies between assumptions, leading sector experts to adjust their own assumptions in the light of their effects on the broader picture.
Judgment should be added in small increments. After the daily meeting with the Forecast Team, the Model Operator should prepare a series of mini-projection rounds (Round 1a, 1b, etc.) that add judgment incrementally so that it can be quantified and its impact understood. When these rounds have been signed off by the Leader of the Forecast Team, they should then be circulated so that they are available when the staff arrives the following morning.
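The mechanics of incremental judgment can be sketched in a few lines (the Phillips curve, coefficients, and numbers below are purely illustrative inventions, not the paper’s model): phasing a proposed add factor in across mini-rounds makes its marginal effect on the whole projected path visible:

```python
# Illustrative backward-looking Phillips curve with an additive judgment term:
#   pi_t = 0.7 * pi_{t-1} + 0.3 * gap_t + add_t
def project_inflation(pi_initial, gaps, add_factors):
    path, pi = [], pi_initial
    for gap, add in zip(gaps, add_factors):
        pi = 0.7 * pi + 0.3 * gap + add
        path.append(round(pi, 3))
    return path

gaps = [1.0, 0.5, 0.0, -0.2]   # assumed output-gap path
proposed = 0.4                 # total add factor proposed for quarter 1

# Mini-rounds 1a, 1b, ...: impose the judgment in steps of one half,
# so each increment's impact on the projection can be quantified.
for share in (0.0, 0.5, 1.0):
    adds = [share * proposed, 0.0, 0.0, 0.0]
    print(f"share={share}:", project_inflation(2.0, gaps, adds))
```

Comparing the three printed paths shows exactly how much of the revised inflation profile is attributable to the judgment rather than to the data.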
After each projection round (Round 1, 2, 3…) a set of standard reports should be circulated along with a summary of the changes made to the forecast round and the reasons for these changes. Each daily report on the development of the medium-term projection should show how the current round has changed from the previous round, as well as provide information that shows how it differs from the last official forecast.
While generally the case for judgment comes from looking at the aggregate, ‘top-down’ perspective coming out of the forecast, in some cases it will be useful to take a ‘bottom up’ perspective. Satellite models designed to provide details about specific sub-components should be used to check the consistency of the aggregate picture. For example, if the core projection model contains an aggregate Phillips Curve that is based on the CPI, it would be important to check the consistency of this forecast with other models that include explicit interactions between wages, producer prices and the CPI. If the core projection model contains an aggregate equation for the output gap, it would be necessary to check the consistency of the projection with models that attempt to disaggregate the GDP forecast into smaller expenditure components. The choice of how disaggregated these components should become depends on how useful they are judged for monitoring the state of the business cycle, accounting for special factors, or linking the forecast to specific assumptions about external demand conditions. Appendix 3 provides a flow chart that shows the interactions between the core projection model and the satellite models.
Step #8: Forecast Presentation.
(Attended by the MPC, Forecast Team and Sector Specialists)
It should take about 5 working days of projection rounds for the forecast to converge. At the end of this period, there should be a final meeting with the staff and the MPC.
The presentation by the staff should be at a high level and focus on the main story line. What is desired is a coherent, systematic analysis, starting from external conditions and moving to the outlook for the domestic economy and the implications for policy. The presentation should use simple charts that illustrate the main points. It is especially important to highlight what has changed in the facts or in the judgments of the staff. The presentation should start with the near-term outlook and then move to the medium-term outlook. It is recommended that the presentation be made by one person, with specialists called on to amplify important points or reply to questions. The presentation should conclude with a discussion of some of the main risks that have been considered formally.13
By the end of this meeting, the MPC should provide guidance to the staff on the following questions: What changes are necessary for the forecast to become an official forecast? What risks do they wish to highlight in the Risk Section of the Forecast Book?
After the meeting with the MPC, it should take no more than 2-3 days to lock in the official baseline forecast and to prepare the alternative scenarios.
Step #9: Documenting the Forecast Process and the Scenarios
The next step is to write a short forecast book that explains the assumptions used to generate the forecast. This book should provide a summary and presentation of the forecast. If the model has been revised, the model documentation should be updated. Information technology can streamline both of these processes and allow the staff to complete them in a timely manner. In addition, the document that explains the objectives of the forecast process should be revised if any changes were implemented during the projection process or if any new changes have been approved by the MPC.
Step #10: Post Mortem Meeting.
(Attended by the MPC, Forecast Team and Sector Specialists)
The last step is a meeting to discuss what went well and what needs to be done to improve the process. Concrete action plans aimed at improving the system over time should be created at these meetings. The issues discussed could range from changes in procedures, model structure, or the database and management information system, to much larger issues such as the development of the next generation of models designed explicitly to support IFT.
VI. The Core Model
Up to this point, we have been vague about the kind of core model we recommend for the FPAS. The process described in the previous section and the steps outlined in Section IV will remain essentially unchanged no matter what sort of model is used. In this section, however, we describe in more detail the sort of model that could be used.
There is no such thing as the perfect model. All models, even complex ones, are heroic oversimplifications of the real world. Moreover, regardless of what type of model is being created, the staff will not create a final product at the first attempt. Models always need maintenance as more is learned about their properties and as conditions change. It is important to get something operating as soon as possible so that the whole institution can learn from its use. A particular problem is that the staff may have in mind a more complex model they would ultimately like to use in the FPAS. The staff should not let a preoccupation with the perceived ideal model stop them from establishing a simple working model or the other elements that make up the FPAS (the database, reporting systems, and so on). It is far more important in the early stages of an IFT regime to establish a relationship with the MPC and ensure that the MPC is comfortable with the forecast process. The first goal should be to establish a forecast process that provides the MPC with the reassurance that they will not be caught out by information that was not well incorporated into the analysis. Typically, policymakers are comfortable and realistic about the prospect of making a series of small mistakes, but much less comfortable with making large, serially correlated errors. The core model is only a part of this.
Two further points bear emphasizing. First, it is important that the core model should be focused on the task of inflation-forecast targeting. In this respect, the key value of the model is its ability to help the MPC proceed from the assessment of the state of the economy to the implications for the desired stance of monetary policy. Second, the core model should reflect the view of the MPC about the workings of the economy, always guided, of course, by the advice of the staff. Clearly, both sides need to have some confidence in the model’s properties for the process to work. Hence, we emphasize that the system can and should be kept very simple, at least initially. In the early stages of an IFT regime, it will be easier to come to a consensus view on the nature of the economy when the model is small, probably with only a handful of behavioral equations linked to some key accounting identities. It cannot be stressed enough that the MPC must understand and accept that the model provides a reasonable characterization of how the economy works and how policy works. Otherwise, there will always be a cloud over things the model does best, like risk analysis.
Given a near-term forecast for the next two quarters and a consistent measure of the output gap, the FPAS could be supported by something as simple as a small ‘gaps’ model that attempts to provide consistent projections for real GDP, inflation, market-determined interest rates and the exchange rate.14 We provide the skeleton of such a model in Appendix IV. Such a model completely abstracts from issues about the supply side, fiscal solvency, private wealth, and the equilibrium levels of the real exchange rate and interest rates. These are no doubt important, and views about them would have to be brought to the table as part of the forecast and policy discussions. Nonetheless, the use of a simple model would immediately help to start a dialogue between the staff and the MPC about such issues as the appropriate policy response to the inflation forecast, and the pros and cons of various assumptions and alternative scenarios. The effort in the early stages should be on building a relationship with the MPC that exploits the value added from being able to talk through counterfactual experiments using the model. This would also allow the staff to discuss uncertainty in the inflation forecast, and the risks to any proposed policy decision.
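To make the idea concrete, the following is a deliberately minimal, backward-looking ‘gaps’ model sketch. The equations and coefficients are illustrative placeholders chosen by us, not the skeleton in Appendix IV: an IS curve for the output gap, a Phillips curve for inflation, and a Taylor-type rule for the policy rate.

```python
def simulate(periods=12, pi_target=2.0, r_neutral=2.0, pi0=5.0, gap0=1.0):
    """Illustrative gaps model (all coefficients are placeholders):
       IS curve:       gap_t = 0.8*gap_{t-1} - 0.2*(i_{t-1} - pi_{t-1} - r_neutral)
       Phillips curve: pi_t  = 0.7*pi_{t-1} + 0.3*pi_target + 0.3*gap_t
       Policy rule:    i_t   = r_neutral + pi_t + 1.5*(pi_t - pi_target) + 0.5*gap_t
    """
    pi, gap = pi0, gap0
    i = r_neutral + pi + 1.5 * (pi - pi_target) + 0.5 * gap
    path = []
    for _ in range(periods):
        gap = 0.8 * gap - 0.2 * (i - pi - r_neutral)   # real-rate gap cools demand
        pi = 0.7 * pi + 0.3 * pi_target + 0.3 * gap    # gap feeds into inflation
        i = r_neutral + pi + 1.5 * (pi - pi_target) + 0.5 * gap
        path.append((round(gap, 2), round(pi, 2), round(i, 2)))
    return path

for t, (gap, pi, i) in enumerate(simulate(), start=1):
    print(f"t={t:2d}  gap={gap:6.2f}  inflation={pi:5.2f}  policy rate={i:5.2f}")
```

Starting from inflation above target, the rule raises the real rate, opens a negative output gap, and pulls inflation back toward target over the simulation horizon. Even a toy model of this kind gives the staff and the MPC a common language for counterfactuals: a different rule coefficient, a demand shock, a delayed response.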
Because a small gaps model abstracts from so many issues and deals only in broad aggregates of inflation and output, demand will probably arise soon from the staff for an expanded or more complete model. This can present a number of dangers. There is a tendency for the projection model to evolve in an ad hoc manner, which usually results in a model that is expensive to operate and maintain. More damagingly, the addition of detail usually comes at the cost of clarity and insight, so that the model becomes something of a ‘black box’. Not merely is this a problem for staff—all but a few staff will effectively become excluded from the forecast process—it is potentially very damaging to relations with the members of the MPC, who, when unable to find a clear story in the model-based forecast, will be inclined to resort to their own judgments and anecdotal evidence, which would risk compromising consistency in the published forecast.15
If there is to be a larger model, it should not aim to provide just more disaggregation. This can be handled separately in ‘satellite’ models, while preserving the simplicity and clarity of the core model (see Section VII). Instead, any development should be to provide more structure by which to provide insight to the policy decisions. One example might be adding an explicit treatment of the role of oil or commodity prices in the inflation process. This can be done within the gaps model framework.16 Other elaborations, such as adding a labor market, would also be possible within this framework.17
The shift from a gap model to dynamic general equilibrium models with such features as stock-flow dynamics and forward-looking expectations is far from trivial.18 It requires highly specialized human capital, powerful computers and solution software, and places very large demands on data sources. It will also take some time to build such a model, even under ideal circumstances. As an intermediate solution, it may be useful to develop add factors to the aggregate demand and supply schedules that have a structural reasoning behind them. One example, which could be very relevant in transition economies undergoing large fiscal adjustment, is to use the gap model in conjunction with an added composite indicator of fiscal stance. This would measure not just the expenditure component of government spending, but would provide an estimate of the likely demand effects once households’ intertemporal assessment of government policy is taken into account. That is, any change to aggregate demand because of a change in government spending will depend not just on the initial expenditure, but also on how households assess the impact on the present value of their labor income (i.e., what taxes are likely in the future) and the extent to which government debt is net wealth. This will depend on the extent to which households have finite horizons, how impatient they are, and the anticipated pattern of future taxes and deficits. In general, there will be important questions involving future potential growth that can only be answered in a more complicated general equilibrium setting, but nonetheless a simpler composite indicator would enable a first pass at the appropriate counterfactual experiments.
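A first pass at such a composite indicator can be sketched numerically (the function, parameter names, and figures are our hypothetical illustration, not an established formula): discount the taxes expected to finance a spending change at a rate that embeds a finite planning horizon, and treat the un-offset remainder as the demand effect.

```python
def fiscal_impulse(spending_change, horizon_prob=0.9, r=0.04, tax_path=()):
    """Illustrative composite fiscal-stance indicator.

    Demand effect = spending change minus the household-discounted present
    value of the future taxes expected to finance it. `horizon_prob` is a
    per-period survival probability (finite horizons), so households discount
    future taxes by horizon_prob / (1 + r) each period."""
    discount = horizon_prob / (1.0 + r)
    pv_taxes = sum(tax * discount ** (t + 1) for t, tax in enumerate(tax_path))
    return spending_change - pv_taxes

# With infinite horizons (horizon_prob = 1), a fully tax-financed spending
# increase is offset one-for-one (the Ricardian benchmark)...
print(round(fiscal_impulse(1.0, horizon_prob=1.0, tax_path=[1.04]), 6))  # prints 0.0
# ...while finite horizons leave a positive net demand effect.
print(round(fiscal_impulse(1.0, horizon_prob=0.9, tax_path=[1.04]), 6))  # prints 0.1
```

The single `horizon_prob` parameter stands in for everything the paragraph above lists (horizons, impatience, anticipated taxes); a proper treatment of those channels is exactly what the larger general equilibrium model would supply.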
The use of such indicator variables also provides some practice at working with more structural issues. This can make the transition to larger general equilibrium systems easier for staff. The advantage of the larger models in the context of an IFT regime is that they can provide a more careful articulation of the relationship between private agents and monetary policy. This implies moving to models with careful specifications for the motivations of private agents, their intra- and intertemporal constraints, endogenous policy reaction rules, forward-looking expectation structures, and appropriate arbitrage conditions. In a small open economy, there may be important issues relating to tradable- and nontradable-goods sectors. Staff will therefore need to be careful that they have the necessary data to work with a model with tightly specified fiscal accounts and stock-flow relationships.
VII. The Role of Models in the FPAS
As was suggested earlier, there is a potential role for more than one model within the FPAS. This section contains a proposed structure for using several models to assist in the preparation of inflation forecasts. The approach emphasizes the benefits of using many models, each carefully tailored to specific purposes. Indeed, once a small gaps model is established, there are probably greater potential gains to be made by adding the sorts of models described here than by moving to a more complex core model.
Behind this argument is the view that no one model will be able to do everything. Any model designed with that intention in mind would be hopelessly complicated and intractable, and therefore would not add any value to the forecast process. Models do not add value by mimicking the data, but by providing insight into the economic phenomena that might lie behind the data. Models should be treated as tools that allow economists to systematically address issues that they might otherwise miss. In order to provide this sort of insight, models should be as simple as possible; the most obvious way of bringing simplicity to models is for them to be carefully designed with specific criteria in mind.19
This approach, when taken to heart, means that the model builder may happily ignore some features of the data for a given model. This is in contrast to traditional approaches to econometric model building, which emphasize finding the ‘true’ Data Generating Process above all else.20 Indeed, when designing models to support an inflation-targeting framework, one of the easiest traps to fall into is to model features of the data that are in fact purely spurious. This usually guarantees many unproductive hours spent re-estimating the model in a futile quest to uncover the ‘true DGP’.21
With that in mind, we can think of using a collection of models that allow us to build up the forecast profile. That is, we want to paint a broad picture that starts with the question “Where is the economy now?” and moves to “Where does it need to go?” In doing that, the Forecast Team will need to answer a variety of questions, such as how much inertia there is in current trends and how close to equilibrium various sectors and markets are. This process helps the MPC answer the most important question of all: “What does monetary policy need to do to accomplish its inflation and stabilization objectives?”
We can usefully arrange this in the form of a timeline, running from the short term to the medium term (see Figure 1). Monitoring and near-term forecasting models are designed to capture short-run trends. The basic premise behind this idea is that there are important real and nominal rigidities in the economy that imply that the data have a certain amount of momentum from their current short-run trends. These rigidities help us to forecast the near term. The degree of inertia will vary from sector to sector; asset markets are famously volatile and able to reverse trends quickly, while labor and product markets tend to adjust more slowly. The models used for capturing these aspects of the data will typically be small, time-series models—often univariate—that are cheap to estimate and maintain. They provide an important insight into the likely path of the economy in the near term, but, lacking any structure, provide no insight at all about the reaction of the economy to economic imbalances or to how the economy might respond to economic policies.
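A minimal sketch of such a cheap univariate monitoring tool (the data here are hypothetical, and a real FPAS would maintain many such equations across sectors) is an AR(1) fitted by ordinary least squares and iterated forward:

```python
# Illustrative near-term monitoring model: a first-order autoregression
# fitted by least squares and iterated ahead. Data are hypothetical.
import numpy as np

def ar1_forecast(series, horizon):
    """Fit y_t = c + phi * y_{t-1} + e_t and forecast `horizon` steps ahead."""
    y = np.asarray(series, dtype=float)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])   # constant and lag
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]    # OLS estimates
    forecasts, last = [], y[-1]
    for _ in range(horizon):
        last = c + phi * last
        forecasts.append(last)
    return forecasts

# Hypothetical quarterly inflation readings (percent, annualized)
history = [2.0, 2.3, 2.1, 2.4, 2.2, 2.5, 2.3]
path = ar1_forecast(history, 4)
```

As the text notes, such a model exploits short-run momentum but has no structure: it can say nothing about how the path would change under a different policy stance.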
Given a picture of likely short-run trends, we need to form a view on what this implies for the state of the economy. Here we use signal extraction models to provide insight as to how the current cycle compares with a typical business cycle and to ask what part of the short-run trend represents a permanent trend (which we might associate with supply) and what represents a mean-reverting component (which we could think of as a cyclical demand effect). Because an important objective should be to impose consistency in the projection scenarios, the methodology for measuring the output gap should involve developing efficient and robust measures of these unobservable variables that are consistent with the underlying structure of the core Quarterly Projection Model.22
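As a simple stand-in for the signal-extraction step (a production system would use a multivariate filter consistent with the core model, as the text recommends; the Hodrick-Prescott filter below is only an illustrative univariate device with hypothetical data):

```python
# Illustrative trend-cycle decomposition via the Hodrick-Prescott filter:
# the cycle component is a crude proxy for an output gap.
import numpy as np

def hp_filter(y, lam=1600.0):
    """Return (trend, cycle) for series y; lam=1600 is the quarterly default."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Second-difference matrix D, so that D @ trend gives the trend's curvature
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    # Solve (I + lam * D'D) trend = y, the HP filter's first-order condition
    trend = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
    return trend, y - trend

# Hypothetical log-GDP series: a smooth trend plus a small cycle
log_gdp = np.log(np.linspace(100, 110, 24)) + 0.01 * np.sin(np.arange(24))
trend, gap = hp_filter(log_gdp)
```

The cycle component would then be interpreted as the mean-reverting demand effect, and the trend as the supply component, in the language of the paragraph above.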
Now that the staff have a picture of the likely short-run scenario and the key pressures in the economy (such as the state of excess demand in the goods market, as measured by an output gap), they are in a position to use the core Quarterly Projection Model (QPM) to create a medium-term scenario. This model must be designed to help the staff draw a link between the current pressures in the economy and the implications for monetary policy. In particular, in the scenario the model will generate a policy response consistent with the IFT regime, one that respects both the judgmental starting conditions and the model’s logic concerning medium-term equilibration forces.
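The core of such a QPM can be sketched in a few lines. The backward-looking system below (an IS curve, a Phillips curve, and a Taylor-type policy rule, with purely illustrative parameter values not calibrated to any economy) shows how the model generates an endogenous policy response that closes the gap and returns inflation to target:

```python
# Minimal backward-looking gaps-model sketch (illustrative parameters):
# IS curve, Phillips curve, and a Taylor-type rule, iterated forward.

def simulate(gap0, pi0, periods=12, pi_target=2.0, r_neutral=2.0):
    beta, alpha = 0.7, 0.3       # output-gap persistence, Phillips-curve slope
    sigma = 0.2                  # interest-rate sensitivity of demand
    g_pi, g_y = 1.5, 0.5         # policy-rule weights on inflation and the gap
    gap, pi, path = gap0, pi0, []
    for _ in range(periods):
        i = r_neutral + pi + g_pi * (pi - pi_target) + g_y * gap  # policy rule
        r = i - pi                                                # real rate
        gap = beta * gap - sigma * (r - r_neutral)                # IS curve
        pi = pi + alpha * gap                                     # Phillips curve
        path.append((round(i, 2), round(gap, 2), round(pi, 2)))
    return path

# Excess demand of 1% of potential and inflation 1 point above target
path = simulate(gap0=1.0, pi0=3.0)
```

Even this toy version exhibits the key property the text asks of a QPM: the scenario embeds a policy reaction consistent with the IFT regime, so the medium-term path equilibrates rather than drifting.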
Of course, if there are issues in the forecast that relate to permanent real shocks (such as those implied by changes in fiscal policy), or if there are important sectoral issues arising from relative prices, such as the terms of trade, the staff might want to have a more elaborate model. As noted, added complication brings with it the risk that issues become more difficult rather than easier to address clearly. One way of ensuring that the core Quarterly Projection Model does not lose consistency and that any new developments add economic insight (rather than simply more detail) is to build and maintain an optimizing dynamic multi-sector model. This sort of model can provide a theoretically tight framework that can be used as a check on the consistency of the core QPM. If the core model were the sort mentioned above, then the optimizing model would provide a check on intertemporal constraints (wealth issues), the effects of real-side shocks (such as changes in fiscal policy), and any other issues relating to structural adjustment and long-run equilibria. Insights from such models can be incorporated into a medium-term projection from the simpler model using judgment.
This sort of model can also act as a working prototype for developments of the core model. However, a theoretically tighter optimizing model will be best at dealing with the permanent effects of changes; it may produce quite unrealistic dynamics, because modeling rigidities in adjustment is the weakest part of the theory, and attempting to do so may make those models unnecessarily complicated. Dynamics from optimizing models can often have unfortunate implications. For example, a model with purely model-consistent expectations, where all variables are ‘jump’ variables, may imply that there is a negligible short-run trade-off between output and inflation, or even that there are no costs associated with disinflation or volatile policy. The challenge when expanding the QPM is to retain the appropriate short-run nominal and real rigidities while incorporating the features of the optimizing models. This can be an ambitious undertaking; hence it may be more practical to work with a simpler model as the core QPM in the interim and use the optimizing model as a check on the theoretical consistency of the results, particularly in the medium term. Nonetheless, the optimizing model provides a way forward for expansion of the QPM: adding a distinction between tradable and nontradable goods, incorporating intermediate goods into the production process, modeling a proportion of households as income-constrained, dealing with stock and asset equilibrium issues, and paying careful attention to the nature and scope of government in the economy.
None of this implies that the core QPM should be disaggregated for its own sake. Instead, disaggregation into national-accounting components can be accomplished more easily and cheaply by satellite models. These are small subsystems that would use simple rules of thumb, based on observed historical patterns, to decompose the output from the core model into various subcomponents. Hence, a household consumption satellite might take the path for model consumption and break that down into paths for consumption of services, nondurables, durables, and ‘super-durables’ such as expenditure on housing stock. This breakdown can be accomplished based on knowledge of long-run ratios and estimated dynamic patterns for each of the subcomponents.23 The satellite models will therefore provide output that may be more accessible and easily understood, and can be checked by sector analysts for consistency with previous experience. In some cases, there may be idiosyncratic events for subcomponents known in advance, and this could mean that the staff may wish to revise the aggregate path.24
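A consumption satellite of this kind can be very simple indeed. The sketch below (the shares, adjustment speeds, and data are all hypothetical) allocates the core model's aggregate consumption path across subcomponents using long-run shares and a partial-adjustment rule:

```python
# Illustrative satellite model: decompose an aggregate consumption path
# into subcomponents. Shares and adjustment speeds are hypothetical.

LONG_RUN_SHARES = {"services": 0.50, "nondurables": 0.30,
                   "durables": 0.15, "housing": 0.05}
ADJUST_SPEED = {"services": 0.9, "nondurables": 0.7,
                "durables": 0.4, "housing": 0.2}

def decompose(aggregate_path, initial_levels):
    """Each component drifts toward its long-run share of the aggregate."""
    paths = {k: [] for k in LONG_RUN_SHARES}
    levels = dict(initial_levels)
    for total in aggregate_path:
        for k, share in LONG_RUN_SHARES.items():
            target = share * total                       # long-run ratio
            levels[k] += ADJUST_SPEED[k] * (target - levels[k])  # partial adjustment
            paths[k].append(levels[k])
    return paths

# Aggregate consumption path from the core model, plus starting levels
agg = [100.0, 102.0, 104.0]
start = {"services": 48.0, "nondurables": 32.0, "durables": 16.0, "housing": 4.0}
paths = decompose(agg, start)
```

Note that the components need not sum exactly to the aggregate during the adjustment; it is precisely discrepancies like this, or implausible subcomponent paths, that would prompt sector analysts to question and perhaps revise the aggregate picture.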
These models therefore form part of a process for the forecast. The benefit of this proposed structure is that the aggregate QPM forms a general overview of the economy, and can therefore be kept very simple and transparent. This is important for the forecast process. It also has the benefit that the core QPM can be easily used for policy analysis. As for the forecast process, it makes no sense to expect that one model will always be suitable for all policy questions. However, it can be very useful for internal communications with the MPC to be able to ask general hypothetical questions of the same model that they see used for the forecast round. By keeping the core model as simple and transparent as possible, we can pose questions directly to it and get fast, easily interpreted answers.
The forecast process, however, is made possible by supplementing this core model with the other models we have discussed. The process looks something like the picture in Figure 3: the sector specialists start by interpreting the data, using the short-run indicator models, and then use that—in conjunction with a set of starting assumptions about exogenous paths for fiscal and external variables—to build up a near-term profile. From this near-term profile, we will typically need to extract key unobserved variables such as the output gap; these, together with the rest of the near-term profile, are then fed into the core model. As noted, the core model helps the staff arrive at an overview. Part of this process involves decomposing the aggregate picture via the satellite models, and this can lead the staff to revise the aggregate picture. The staff has two sorts of control over the aggregate picture. The first is to revise the starting assumptions. The second is to use ‘add-factors’ directly on the core model (such as when a large investment expenditure is known in advance) in order to alter a particular variable’s path. This procedure will typically require iteration. As a further check, the use of a theoretically tight optimizing model may also imply alterations to the model’s projection paths (for example, if accumulating debt implied a more subdued path for consumption than would have been projected in a model without such constraints).
The final output, of course, is a projection profile, which provides a starting point for advice to the MPC. By this stage, the staff should be in a good position to explain what they think the pressures facing the economy are, and what that implies for monetary policy. Using this process and the models involved, the staff should also be in a position to explain their choices of starting assumptions, their interpretations of recent data outcomes, and their handling of idiosyncratic events. The MPC may wish to have some options explored further, so that more iterations are added to the process. However, the structure here is capable of dealing with policy issues in a controlled and systematic manner.
The core model comes into its own in adding to the above analysis information on risks to the scenario, both through alternatives that assess the importance of particular risks in the assumptions and through more general risk analysis involving confidence intervals and so on.
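One common way to produce such risk analysis is stochastic simulation around the baseline. The sketch below (using a hypothetical AR(1)-style inflation process rather than a full QPM, with illustrative parameters) generates fan-chart-style percentile bands:

```python
# Illustrative risk analysis: Monte Carlo simulations around a baseline
# inflation path, summarized as fan-chart percentile bands.
import random

def fan_chart(pi0, target=2.0, persistence=0.7, shock_sd=0.3,
              horizon=8, n_sims=2000, seed=1):
    random.seed(seed)
    sims = []
    for _ in range(n_sims):
        pi, path = pi0, []
        for _ in range(horizon):
            # Mean-reverting inflation process hit by Gaussian shocks
            pi = target + persistence * (pi - target) + random.gauss(0, shock_sd)
            path.append(pi)
        sims.append(path)
    bands = []
    for t in range(horizon):
        xs = sorted(s[t] for s in sims)
        lo, md, hi = xs[int(0.1 * n_sims)], xs[n_sims // 2], xs[int(0.9 * n_sims)]
        bands.append((round(lo, 2), round(md, 2), round(hi, 2)))  # 10/50/90 pct
    return bands

bands = fan_chart(pi0=4.0)
```

In a full FPAS the shocks would be drawn for each of the core model's behavioral equations, so that the bands reflect the model's endogenous policy response as well as the shock distribution.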
VIII. On the Evolution and Development of the FPAS
This section concerns itself with strategies for ensuring that the FPAS can be taken from its first beginnings to a successful and established system. These strategies can be related to four specific areas:
Model development;
information technology and systems (IT) development;
staff development; and
the creation of a forecast and model culture.
When the essential elements of the system are in place, management can start thinking about their development. It is probably easiest to plot a course for model development, with the staff aiming to move over time to more and more sophisticated structural models. We emphasize again, however, that the job of the core model is to provide the link between developments in the economy and the implications for monetary policymakers.
Operationally, the value of the model will come from the ability to perform ‘counterfactuals’ and experiments that provide insight into recent data outturns. This implies that management should be very careful that any proposed developments do not make the model more difficult to understand, and hence to use and to communicate to the MPC. The relationship with the MPC, which is so important for the functioning of the system, can break down when the staff are unable to explain their projections. It is helpful, therefore, to aim for a model that is well within the ‘comfort zone’ of both the staff and the MPC.
Information technology is easily overlooked when thinking about developing the system. It is, however, very important. Automating tasks has the obvious benefit that there is less room for human error, and it is a prerequisite for the effective use of the forecasting model. Effective IT also has the benefit that it allows the staff more time to think about economics, rather than the details of data-processing tasks. Several specific steps can be taken:
There are obvious economies of scale and scope in deploying resources for Information and Computer Technology, but it is important to avoid the pure ‘service bureau’ approach.
Make sure that the staff has direct access to programmers and IT experts who can work to improve productivity and support the needs of the staff.
Avoid the attitude that ‘one size can fit all’! Within limits, staff should be able to specify what they need to help them with their work, rather than only having generic solutions imposed upon them.
If a member of the MPC has some expertise in this area, they should be given the responsibility for ensuring that the Bank’s available resources are being used effectively to support the FPAS, as well as the other needs of the central bank. It should also be recognized that this is not a part-time job and that major productivity gains for the institution can be realized if available resources can be deployed effectively.
The most challenging development issue for management will probably concern human capital. The maintenance and improvement of the FPAS over time will depend on how well the central bank can protect and develop its internal human capital. It is important to recognize that many of the skills necessary to operate an FPAS are highly specialized. This raises a number of issues:
The importance of training for these specialized tasks. In the early days of the system, it may be possible and useful to use external consultants from other institutions who have experience in the issues involved. This can accelerate the development of staff expertise. However, management should be very careful that expertise is in fact developed as part of this process. If possible, contracts with external experts who are engaged to help set up the FPAS should have a duration of at least one year. After a certain point, this training will often have to be done in-house, as so much human capital will depend on the accumulation and transmission of experience with the FPAS.
The importance of safeguarding against the departure of staff. Once some expertise is gained, it will be important for more experienced staff to mentor new staff and ensure that someone is ready to take over when a staff member leaves. It is important for management to recognize that this can be time-consuming and to create incentives so that staff do not guard or restrict their own human capital. Recording procedures in the system will be very important. Much, however, comes down to the attitude of management—the entire system will function more efficiently if everyone involved in the process realizes that there can be important synergies from openness and transparency.
The importance of external contacts. As mentioned, there are obvious benefits if the institution can benefit from procedures and systems developed elsewhere. Management should also work to develop contacts with educational institutions in order to ensure that people in these institutions are aware of the types of human resources and skills that are required for the FPAS.
Staff contact with the MPC. It is beneficial for staff motivation if they feel that they are able to make an impact and to receive credit for initiative and good work. This can be achieved by making sure that staff members are invited to some MPC meetings. This approach has the added benefit that the staff members hear directly the concerns and thinking of the MPC, which will help them anticipate what kind of material the MPC finds useful.
Staff motivation and incentives. As discussed, rotation should be facilitated and planned. In addition to spreading a general knowledge of how the system works, this is one way by which management can ensure that staff members do not become stale or burn out in a particularly stressful position. In general, staff should be closely supervised, but also given the freedom to be autonomous and undertake new initiatives.
Facilitation of a cooperative team atmosphere. The synergies that arise from a widespread understanding of the system should be evident. Small initiatives can be very helpful here. For example, if model builders and forecasters sit near each other, that regular informal contact can greatly aid communication and understanding. Similarly, informal in-house seminars, such as ‘brown paper bag’ lunchtime seminars, that give an opportunity for staff to discuss problems and issues, can be very useful.
Many of these points touch on the general issue of creating a forecasting and model culture. This is hard to define precisely, but the important realization for management is that it is not possible to create a properly functioning FPAS by simply importing models, IT, or temporary external consultants. Staff will need to feel connected to the process, involved in the final outcome, and credited for successful innovation and work. Management should aim to build up a group culture of using models and understanding their limitations. This can be important when questions from the MPC need to be qualified or reinterpreted in order to produce sensible results in a formal model. In particular, the notion of a culture speaks to the ability to undertake model development work in-house; the best racing car is not much good without skilful drivers. This can be aided by investing in an active model development and research program—a separate division of economists engaged in longer-term but well-focused research projects aimed at improving the human capital of the staff and the quality of the FPAS over time. The culture can also be facilitated by a regular review process immediately after each quarterly projection exercise, where the staff can ask themselves what changes could be introduced to make the process more efficient or what model developments would make sense.