Revisions Policy for Official Statistics: A Matter of Governance

Abstract

This paper proposes a set of good practices for the revision of macroeconomic data. The authors argue that revisions are a routine part of disseminating quality data. Revisions are made not just to correct errors but also to incorporate better source data, update base periods, and make other improvements. It is argued, using country examples and views from policymakers and other users, that national statistical agencies should have explicit revisions policies.

I. Introduction

“Revisions.” The word elicits a wide range of images in the world of official statistics, not all of them pleasant. In statistical offices, the image is often of extra work, to develop new series while continuing to prepare and disseminate the to-be-revised series and to carry time series back. To data users, it also means extra work, to update databases and reanalyze time series to see if history has been rewritten. More traumatic than the images of extra work is the image of a mistake. Especially in the past but still to some extent today, revisions are associated mainly with mistakes having been uncovered.

In this paper we will argue that the time has come to bring revisions more fully out into the open and to draw on statistical experience from around the world to work toward identifying a set of good practices. These good practices make up what we will call “revisions policy.” Revisions policy should be recognized as an important aspect of good governance in statistics. Good governance in statistics, in turn, is part of public sector transparency and accountability more broadly.

At least four developments have put the spotlight on revisions in the last several years. First, the need for improvements in official statistics has received substantial attention. For example, during the financial crises in the 1990s, the lack of relevant data figured prominently in delaying diagnosis. Subsequently, the international financial community called for the International Monetary Fund (IMF) to establish standards for the dissemination of data to the public. The IMF responded by developing the Special Data Dissemination Standard (SDDS) and the General Data Dissemination System (GDDS).2 More recently, the Partnership in Statistics for Development in the 21st Century (PARIS21) consortium called for a shared international strategy to seek adequate support for national statistical systems to underpin evidence-based policymaking. With determination, cooperation, and goodwill, the need for more and better statistics is being translated into a number of improvements. Some improvements will be additions to sets of statistics, but many will be changes in existing time series. As numerous improvements come onstream, there will be revisions.

Second, the international statistical community in the last decade has put major efforts into preparing and promoting methodological manuals for macroeconomic statistics.3 When countries adopt new standards, such as the 1993 System of National Accounts or the Classification of the Functions of Government, it means major revisions.

Third, a growing share of the world’s population lives within regional organizations. These include, for example, the European Union, regional central banks such as the Eastern Caribbean Central Bank, the West African Economic and Monetary Union, and the Andean Community. Many of these organizations prepare statistical aggregates from their members’ reports; however, the members often have varying revision cycles. The result is that the aggregates, once compiled, are either subject to continuing change or, at the other extreme, are not consistent with the members’ data as disseminated. Studies conducted in the euro area concluded that a more harmonized revisions policy is highly desirable to facilitate the analysis of regional economic trends and causes of revisions. Eurostat and the European Central Bank have recently published a note proposing common revision practices for reporting balance of payments and international investment position (IIP) statistics by member states.4

Fourth, member countries of regional or international organizations typically have obligations to report data to those organizations. These data provide the basis for decisions on, for example, common policy action, or lending, debt relief, and other assistance. In this setting, it is important to be able to distinguish between bona fide revisions and suspect—perhaps politically motivated—revisions in the data provided. In the IMF lexicon, reporting inaccurate information (as well as failure to report information) is referred to as “misreporting.” Although the cases of misreporting have been few, they give rise to difficult situations.5

It is not surprising, then, that pressure is building at the international level for work on revision practices. For example, the IMF Executive Board, in discussing countries’ obligation to provide data to the IMF, encouraged national authorities to articulate their policies on data revisions.6 And from the perspective of national authorities, participants in the Consultative Seminar on Governance of National Statistical Systems recommended that statistical agencies promptly report revisions and provide information on revisions policy. The participants also urged international organizations to promote the use of good revision practices.7 The IMF Committee on Balance of Payments Statistics, in making plans for an update of the fifth edition of the Balance of Payments Manual, has put revisions policy on its agenda.8

II. Typology and Terminology

Revisions are defined broadly as any change in a value of a statistic released to the public by an official national statistical agency.9 The statistic may be a level, such as the value of a flow (for example, GDP) or of a stock (for example, of financial assets), or a change in level, such as the rate of price increase. As foreshadowed by these examples, this paper will focus on revisions in macroeconomic statistics.10

Revisions can be classified in at least two ways—by reason for the revision, or by timing of the revision. It is especially useful to catalogue these in order to establish a common language.

A. Revisions Classified by Reason

Revisions take place for a number of distinct reasons. In reality, some of the distinctions are blurred because two or more kinds of revisions may be made at the same time. Aside from corrections of mistakes (the last item in the following list), the reasons tend to break into three groups. The first group is the incorporation of more complete or otherwise better source data, encompassing three reasons. The second is routine recalculation, encompassing two reasons, and the third is improvements in methodology, encompassing two reasons.11

There are several reasons for revisions:

  1. to incorporate better source data

    • incorporation of source data with more complete or otherwise better reporting

    • incorporation of source data that more closely match the concepts

    • replacement with source data of judgment or of values derived largely by statistical techniques

  2. to capture routine recalculation

    • incorporation of updated seasonal factors

    • updating of the base period

  3. to reflect improved methodology

    • changes in statistical methods

    • changes in concepts, definitions, and classifications

  4. to correct errors

    • correction of errors in source data and computations

The first reason, incorporation of source data with more complete reporting, causes revisions across a wide spectrum of macroeconomic statistics. At one end of the spectrum, a first report on credit aggregates may be based on the largest financial institutions and then the aggregate is revised when reports from all institutions, including the slower ones that have less sophisticated reporting or are from outside the major cities, become available. At the other end of the spectrum, data from monthly samples may be replaced in national accounts components with data from more comprehensive annual samples. For example, in the quarterly national accounts of several countries, monthly data from a retail sales survey are used until they can be replaced with data from more comprehensive sources. Two other reasons for revisions are related. Updating weights, as for price indices, brings in information from more recent surveys. Incorporating audited results, as for budgetary figures and data from financial reports, in effect brings in “better” data to replace early results.

The second reason, the incorporation of source data that more closely match the concept, is most likely to occur in datasets that piece together many data sources in a mosaic, representing a comprehensive picture of some aspect of the economy. The national accounts and balance of payments are prime examples of such datasets. For example, if production is to be measured, source data that represent sales (plus some adjustments) may provide a first estimate and then the estimate is subject to revision as data more closely matching production become available.

The third reason for revision arises when no current data are available and a first estimate is based on judgment or statistical techniques; a revision then results when data become available. Such situations may arise for quarterly national accounts. The United States uses judgmental extrapolation for the first quarterly estimate of several components, including domestic services and improvements on owner-occupied housing. Subsequently, data become available that can be incorporated.

These first three reasons often appear together, for example, in national accounts and balance of payments. In monetary and government finance statistics, the reasons often boil down to completing institutional coverage and incorporating the outcomes of audited reports.

In the second group of reasons, the incorporation of updated seasonal factors relates closely to the first group’s incorporation of additional source data. In fact, some lists of reasons for revisions do not identify the two separately. Seasonal factors, such as those derived from a moving average of experience or from the most recent year (concurrent seasonal factors), can change as the new experience comes into the calculation and the older experience drops out. Some countries only rarely revise the consumer price index to bring in new or additional price observations, but they do revise the index once a year to incorporate updated seasonal factors. For example, the U.S. Bureau of Labor Statistics, with the release of the January index, each year recalculates the seasonal adjustment factors to reflect price movements in the just-completed year. This routine annual recalculation may result in revisions to seasonally adjusted indices for the previous five years.12
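As a minimal, hypothetical sketch of why this routine recalculation revises earlier data, the following computes crude ratio-to-moving-average seasonal factors for a quarterly series and shows that re-estimating them with one more year of observations changes the factors, and hence the seasonally adjusted values, for prior periods. The series and numbers are invented, and the method is only illustrative; statistical agencies use procedures such as X-13ARIMA-SEATS rather than this simplified calculation.

```python
import numpy as np

def seasonal_factors(series):
    """Crude ratio-to-moving-average seasonal factors for quarterly data.

    series: 1-D array of quarterly observations starting in Q1.
    Returns one multiplicative factor per quarter (Q1..Q4), normalized to
    average 1.0. Illustrative only; not an official adjustment procedure.
    """
    x = np.asarray(series, dtype=float)
    # Centered 2x4 moving average approximates the trend-cycle.
    trend = np.convolve(x, np.array([0.5, 1, 1, 1, 0.5]) / 4.0, mode="valid")
    ratios = x[2:-2] / trend                  # seasonal-irregular ratios
    quarters = np.arange(2, len(x) - 2) % 4   # quarter index of each ratio
    factors = np.array([ratios[quarters == q].mean() for q in range(4)])
    return factors / factors.mean()

# Invented quarterly series: trend times a seasonal pattern plus noise.
rng = np.random.default_rng(0)
n_years = 6
t = np.arange(4 * n_years)
seasonal = np.tile([0.95, 1.02, 0.98, 1.05], n_years)
series = (100 + 0.8 * t) * seasonal * (1 + 0.01 * rng.standard_normal(t.size))

old = seasonal_factors(series[:-4])  # factors estimated without the latest year
new = seasonal_factors(series)       # factors re-estimated with the latest year
print("change in each quarterly factor:", np.round(new - old, 4))
# Because the factors change, the seasonally adjusted values for earlier
# periods (series / factor) change as well -- the routine update itself
# produces revisions, even with no new unadjusted observations for those periods.
```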

In the second group as well, the updating of the base year of an index—that is, the year set equal to 100—is often a routine reason for revision. This step may be carried out separately but usually is done when new data underlying the weights for the index are introduced.

In the third group, the incorporation of changes in statistical methods is sometimes not listed separately because such changes often go hand in hand with changes in source data. However, they can also occur independently. For example, revision studies may reveal that a particular method can be improved or replaced by another to achieve greater accuracy or timeliness. In the last few years, this source of revision has become more prominent as countries have moved from fixed-weighted volume and price measures to chain-weighted measures.

Yet another reason for revision is changes in concepts, definitions, and classifications, often stimulated by adoption of new international guidelines. For example, when countries began following the fifth edition of the Balance of Payments Manual, in place of the fourth, the definition of the current account changed to exclude capital transfers and acquisitions/disposals of nonproduced assets. In another example, the 1993 SNA embodied a broader concept of investment; thus, as countries moved toward that standard and added software, for example, as investment, they introduced a new concept. And another example involves the international statistical community’s major effort in recent years to reach agreement on classifications. The Classification of the Functions of Government (COFOG) and the Classification of Individual Consumption by Purpose (COICOP) are cases in point. The introduction of new classifications is often done on the occasion of the introduction of new concepts and definitions but sometimes is done on its own.

In addition, changes in presentation of statistics should be mentioned. They do not, strictly speaking, fit the definition of revision as a change in a value of a statistic. However, they often take place at the same time as revisions, especially revisions caused by changes in concepts, definitions, and classifications. Changes in presentation are also often implemented to respond to the analytical needs of users. For example, Appendix II describes how Australia began reporting financial derivative asset and liability positions on a gross basis rather than net.

Finally, revisions occur as errors are corrected. Errors may occur in source data or in processing. For example, reporting units may discover after submitting the data that some components are missing or compiling institutions may discover that outdated seasonal adjustments have been inadvertently applied.

B. Revisions Classified by Timing

As to timing, some revisions are made in the weeks or months shortly after a first release of data. These “current revisions” affect the current weekly, monthly, or quarterly data. “Annual revisions” are made after data for all the months or quarters of a year become available. Audits are usually done for a calendar or fiscal year’s data, although the results may not be available for some time after the close of the year.

Both current and annual revisions usually stem from the first four reasons: incorporating source data with more complete reporting, incorporating source data that more closely match concepts, replacing judgment and statistical techniques, and incorporating updated seasonal factors. Moreover, annual revisions often affect several years of data—perhaps three or four years—so an annual estimate may be subject to revision more than once. For example, in the U.S. national accounts, there are three such revisions, because important additional annual source data arrive in each of three years.

Less frequent revisions, often four or more years apart, may be called “comprehensive,” “major,” “historical,” or “benchmark” revisions. Typically they are occasions for major changes in statistical methods and changes in concepts, definitions, and classifications. Often these revisions are carried back, or backcast, for a number of years. Revisions that correct errors, of course, have no predictable timing.

III. Context of Revisions

This section will describe the context in which revisions occur and the parameters that policy makers must consider when designing policies to manage the revisions process. The context of revisions can be analyzed from three main points of view: user needs, resource issues, and maintenance of credibility.

A. User Needs

Growing evidence suggests that users are more aware of the role of revisions and want more information about them. For example, Governor Dodge of the Bank of Canada, in an address at an international meeting of statisticians, concluded, “This is my advice: If you have imperfect data, don’t sit on them. Put them out, together with your professional assessment of their quality and vulnerability. Remember, as policymakers, we are used to taking decisions under uncertainty, in less than perfect conditions.”13 Surveys and meetings with users from a wide range of countries confirm their concern about revisions and revisions practices. These concerns are documented, for example, in the data modules of the IMF Reports on the Observance of Standards and Codes (ROSCs).14

User needs with respect to revisions fall into the following four categories:

  • The timeliness of first release of data and timing of subsequent revisions

  • The accuracy of first release of data and subsequent revisions

  • The consistency of data over time

  • The documentation for the revisions that is provided to users

Timeliness

Some users—such as policymakers, investors, international organizations, and the media—strongly emphasize the timeliness of statistics. A key aspect of timeliness is the early release of economic data, which affects the subsequent timing of revisions. For example, for a central bank to conduct monetary policy effectively, it will need to analyze data on inflation and growth of monetary aggregates that are as up-to-date as possible.

In another aspect of timeliness, statistical agencies need to assure users that the timing of the first release of data and subsequent revisions is predictable and relatively stable from year to year. In addition, the agencies may need to coordinate the timing of the release with preparing important official policy documents, such as government budgets.

There are limits, of course, to how far statistical agencies can go in providing frequent and timely data because of resource constraints, the demands of good statistical practice, and the trade-off with accuracy.

Accuracy

While policymakers place a high premium on timely data, they also need a degree of accuracy. Inaccurate data may cause policymakers and investors to make wrong decisions. Although they want timely data on which to base their decisions, they do not want to take a decision based on data that are likely to change substantially in the next month or next quarter. Among users, researchers and the academic community place perhaps the highest priority on accuracy. To them timely data are less important than an accurate and comprehensive time series of data.

The importance placed by users on accuracy clearly requires that they be able to judge the accuracy of preliminary data and subsequently revised data. To make informed judgments, they need revised data to be clearly identified and documentation provided (see Section IV below).

Consistency

Many users, particularly those engaged in research and forecasting, require consistency of data over time. While they realize that revisions will yield more accurate data, they are concerned that frequent or large revisions may disrupt their databases and cause inconsistencies unless the revisions are backcast over a sufficient number of years. Furthermore, users working with several datasets will be concerned that agencies should coordinate revisions to avoid lengthy periods when one dataset is revised and others are still on the old basis.

Documentation

To lessen the trauma caused by revisions, users want clear documentation. Basic documentation in statistical publications should identify the data that are preliminary (or provisional or estimated) and revised, explain the sources of revisions, and explain breaks in series when consistent series cannot be constructed.

Documentation is particularly important when changes in concepts and definitions are involved because such changes can seriously affect the interpretation of various statistical applications (for example, forecasts) and empirical tests of the validity of economic theory.15 For example, in Canada, the “rollout” of the Fisher chain volume index formula as the official measure of real GDP took place over several years. Statistics Canada placed documentation on the website months in advance, held seminars in several cities for key government and business users, and sent e-mail messages to a large number of known users in the final weeks before the release of the new estimates. (Canada ROSC, Detailed Assessment under 1.2.4, page 19; also on the Internet at http://www.imf.org/, under IMF at Work and then ROSCs.)

B. Resource Issues

Resources affect countries’ revisions policies in two main ways. First, specific issues arise about cost effectiveness, such as whether the increased accuracy gained from a revision is worth the cost. Second, questions arise about the basic design of the statistical compilation system itself, which has fundamental implications for the costs of revisions.

As to the first, statistical agencies, operating within limited budgets, must make efforts to ensure the cost effectiveness of their programs, including revisions. Again it is a matter of balancing—balancing not only timeliness against the accuracy needs of users but also the timeliness and accuracy needs against the marginal costs of achieving improvements in both areas. Not only do statistical agencies incur costs, but so do the respondents, who must take the time and effort to complete the questionnaires and data submissions necessary to comply with data release and revisions policies. Agencies must thus perform a kind of “cost-benefit analysis” to make realistic and sustainable decisions with respect to the timeliness and frequency of data releases and revisions.

As to the second, revisions are driven primarily by the arrival of source data, as described in Section II. Typically a core set of source data is available for the first estimates that are released to satisfy the timeliness needs of users. Then, as more detailed and comprehensive source data arrive, the first estimates are revised to improve the accuracy of the statistics. In designing the statistical compilation system and defining the surveys and administrative data to be used as source data, managers should bear in mind the cost implications of alternative designs and definitions.

C. Maintenance of Credibility

U.K. Prime Minister Tony Blair, in his introduction to “Building Trust in Statistics—The White Paper on Statistics,” stated “I believe that having access to official statistics which we can all trust is essential in any healthy society. Statistics encourage debate, inform decision making both inside and outside government, and allow people to judge whether the Government is delivering on its promises. For official statistics to play that key role effectively in democracy, we need to have confidence in the figures themselves.”16

Confidence in the figures must ultimately be built on confidence in the statistical agency disseminating them. Fundamental to achieving trust in, or credibility of, statistical agencies is integrity. Integrity is a central element in the IMF’s Data Quality Assessment Framework (DQAF). For more about the DQAF, see the IMF’s Data Quality Reference Site (DQRS) at http://dsbb.imf.org/. Integrity is also prominent in the U.N. Fundamental Principles of Official Statistics. At least six of the ten Principles relate to various aspects of integrity of official statistics.

Providing assurances of integrity involves, at the broadest level, enacting effective statistical legislation and ensuring the professional autonomy of statistical agencies. It also involves an element key to gaining the trust of users—establishing a sound revisions policy. It is not unusual for a user’s distrust of government (or the political party in power) to be translated into distrust of official statistics, or at least a healthy degree of skepticism. Revisions can be particularly sensitive if statistical agencies handle them in an unprofessional manner.

What are the needs of users with respect to revisions and the credibility of official statistics? With respect to the release of first estimates, users need to be able to make informed judgments about the quality. How accurate are the estimates? What is the likelihood of further revision, and by how much and in what direction? When will the data be “final”? For the revisions, users need to be informed about the causes, as well as have access to complete documentation on methodology and procedures.

Users will also be reassured if they see that revisions take place within the framework of an overall policy and according to a predetermined schedule. If the policy, procedures, and schedule are published, it will be evident that revisions are not ad hoc and for political interests, and that adequate safeguards exist to prevent abuses in this area. Finally, when mistakes are discovered, it is critical that the statistical agency report them to the public as soon as possible and provide satisfactory explanations to reassure users and enable them to distinguish honest mistakes from cases of “misreporting.”

Effective communication with the press and with the public is an important aspect of credibility. Dealing with the press and communicating effectively with various user groups poses special challenges that require experience and public relations skills. In some exceptional cases, proactive steps may be taken to build credibility. For example, a large national statistical agency lost one week of production owing to a power outage. The agency saw that data would be of lower quality and warned users in advance of the scheduled release—a measure that was well received by the public.

Another aspect of communication is presenting the information about revisions in a standard format that is easy to recognize. The U.K. Office for National Statistics publishes detailed information about revisions, including the format, numbering of tables, electronic links to revisions of specific data, notes explaining the revisions, and identification of data that are not subject to regular revision.17 In addition, these standards apply to all U.K. agencies producing national statistics.

IV. Good Practices for Revisions Policies

As argued above, a sound revisions policy contributes, among other things, to good governance of official statistics. Although revisions policy has not yet been well articulated in many countries, it has received more emphasis in recent years. For example, the Quarterly National Accounts Manual, Chapter XI,18 discusses revisions policy. The Ecofin Council of the European Union, in February 2003, included a section on revisions in its “Code of Best Practices on the Compilation and Reporting of Data in the Context of the Excessive Deficit Procedure.” And the IMF’s Data Quality Assessment Framework includes numerous good revision practices.19

To build on these recent efforts to define good revisions policy, we want to work toward outlining a more comprehensive and internationally accepted set of good practices that would together constitute a sound, generally applicable revisions policy. We arrived at the good practices outlined below by combining the general considerations identified in the discussion of user needs, resource issues, and maintenance of credibility in Section III with specific practices drawn from various countries.20

The eight main revision practices outlined below are consistent with the general principles of good governance in statistics, as they appear in the Fundamental Principles of Official Statistics and in the Handbook on the Operation and Organization of a Statistical Agency. In fact, the revision practices identified explicitly apply these principles (for example, integrity, responsiveness to users’ needs, and professionalism) to the context of revisions.

1. Periodic consultations with users elicit views about revisions practices. Ongoing dialogue with users, including the media, communicates the revisions policy.

Before elaborating a country’s revisions policy, it is important to consult the main users of official statistics to identify needs and priorities specific to that country. Agencies could seek users’ views, for example, about their particular needs for timeliness of data, problems they experience because of revisions, and their priorities for balancing timeliness with accuracy and consistency.

2. A clear, short summary statement of when to expect revisions and why is readily accessible to users.

Most revisions fall under a “revisions cycle.” Cycles typically incorporate current (for example, quarterly) and annual revisions as defined in Section II and less frequent comprehensive or benchmark revisions that usually relate more to the two “improvements” reasons listed in Section II. A noteworthy example of a clear, short summary of revisions policy is the description for national accounts in the United States in Box 1 of Appendix I.

3. The current revision cycle is relatively stable from year to year.

Current and annual revisions are done broadly to incorporate more complete or otherwise better source data. The following practices relate to the timing of current and annual revisions:

3.1 The revisions are timed to incorporate new source data.

3.2 The revision schedule takes into account the timing for preparing important official economic policy documents.

3.3 The revision schedule takes into account the timing of revisions in other datasets.

Stability of the revision cycle from year to year is at the heart of good revisions policy. It is one of the few practices generally followed by countries. Fortunately, for countries that decide to establish a revisions policy, it is not difficult to ensure that its timing is stable over time. Indeed, it is a logical outcome and one that promotes efficient implementation. The most common basis for stability is the timing of arrival of source data, which then triggers their incorporation into revised data.21 Occasionally, a balance must be struck between maintaining the stability of the cycle and making unpredictable but important revisions outside the cycle. A stable revision schedule can also take into account the coordination of timing with important official economic policy events. For example, Italy times the release of national accounts to coincide with the annual presentation to its parliament on the economic situation. It is also important to coordinate with other macroeconomic sectors to ensure consistency (see the example in Appendix II of Australia coordinating revisions of balance of payments statistics with national accounts).

4. Major conceptual and methodological revisions are usually introduced every four to six years, balancing the need for change and users’ concerns.

Major conceptual and methodological revisions relate mainly to the two “improvements” reasons for revisions outlined in Section II—to incorporate new statistical methods and new concepts, definitions, and classifications—all superimposed on changes in the structure of the economy. These revisions are typically more far-reaching and complex than current and annual revisions and can be disruptive and problematic for users if they occur too often or take place in a confusing or unpredictable manner. A reasonable guideline for regular timing would be every four to six years. Timing such as this balances the need to avoid unnecessary disruptions to time series with the need to maintain the quality of statistics in line with international best practices and the changing institutions and structure of the economy. For example, see the description in Appendix I of the U.S. five-year cycle for major conceptual and methodological revisions for GDP; four-or-more-year cycles are used, for example, in Italy, Norway, and Turkey for national accounts revisions.

Although individual countries do not control the timing of major changes in international statistical methodologies (for example, the appearance of the 1993 SNA and BPM5), a four-to-six-year cycle can generally accommodate these changes without undue delays and disruptions. Incidentally, it is also possible and can be helpful to users to coordinate the timing of methodological improvements with the current cycle of revisions timed for the arrival of better source data (see the U.S. example in Appendix I).

Countries do have control, however, over the timing of methodological and classification changes that they undertake to reflect institutional and structural changes in their own economies. These kinds of changes can be accumulated, studied, and prepared for during the four-to-six-year intervals before they are finally published. The example of the United States in Appendix I is illustrative; the comprehensive revision of GDP in 1999 introduced improvements in definitions and classifications. The improvements included the recognition of business and government expenditures for software as fixed investment, the treatment of government employee retirement plans in the same way as private pension plans, and others reflecting institutional and structural changes in the economy.

Mongolia recently provided an example of a comprehensive revision to reflect a change in methodology to come into line with international standards and to make corrections for previous years. The Chairman of the National Statistical Office and the Minister of Finance and Economy, in a Joint Resolution in November 2002, explained to the public clearly and transparently a revision in GDP methodology. The previous methodology had not accounted for exceptional animal losses and resulted in significant misstatements of GDP, particularly in years of severe weather. An accompanying technical paper explained the reasons for changing the methodology and how the revision affected estimates of GDP in previous years.

5. Revisions are carried back several years to give consistent time series.

To maintain their serviceability following major revisions, data should be revised back as far as is reasonable, based on a balancing of user needs, costs, and availability of source data. The revised time series should be released simultaneously with the revised current data or soon thereafter, preferably in easily accessible electronic format. The revised series should be sufficiently detailed, and not so aggregated, that users are able to detect the sources of the changes.

Clearly, some series are more difficult than others to revise backward. Among these are data from surveys that have changed, data affected by legal constraints, and data constrained by accounting principles (for example, government finance statistics). Lack of resources also constrains the extent of backward revisions, especially for poor countries. Nonetheless, various second-best approaches are possible, such as the U.S. practice described in Appendix I, where GDP series are revised back to the last benchmark (usually five years) and further back for selected series that are particularly important.

6. Documentation on revisions is readily available to users.

6.1 Preliminary (or provisional or estimated) data and revised data are identified as such.

While this practice may seem obvious, it is not uncommon to find that preliminary and revised data are not clearly identified. This is especially likely in countries where revisions are not made according to a consistent or clearly stated revisions policy. It also occurs more often for government finance statistics and monetary statistics, where statistical principles may not be as much at the forefront as in national statistical offices. Serious confusion and misunderstanding by users can easily arise when changes in data are not identified.

6.2 Advance notice is given of major changes in concepts, definitions, and classifications and in statistical methods.

Users should be alerted in advance of major conceptual and methodological revisions to help them prepare for and understand better the reasons for and nature of the changes. For example, Appendix II provides an account of Australia’s efforts to prepare users for revised balance of payments statistics according to BPM5. The statistical agency provided a description of the new standard and its benefits in advance, including illustrations of sample draft data tables to begin to acquaint users with the changes. Consultations with key users dealt with the implementation of the new standard, and a number of changes were made in the implementation strategy and schedule as a result. Various reports and discussion papers published in advance of the revision analyzed and described the effects on Australia’s statistics. Another example is the preparations by the United States described in Appendix I to alert users to the next benchmark GDP revision.

6.3 The sources of revision are explained when the revised series are released.

6.4 Breaks in series are documented when consistent series cannot be constructed.

Complete and transparent documentation of revisions allows users to understand the sources of revisions and, if needed, adjust their analysis of the data. Perhaps even more importantly, complete documentation serves to promote trust in the credibility and integrity of the data and the institutions responsible for compilation and dissemination. Key parts of the documentation are about the sources of the revisions, including the main flows of source data from the preliminary estimates to the revised data. It is also important that statistical agencies clearly identify breaks in the series when consistent time series cannot be constructed. Box 2 in Appendix I provides an example of documentation for sources of revisions for the U.S. GDP, and Box 5 in Appendix II an example of explanation of revisions for Australia’s balance of payments statistics.

7. Users are reminded of the size of the likely revisions based on past history.

It is particularly important for users such as policymakers and investors, who make decisions on the basis of preliminary estimates, to be able to make an informed judgment about the reliability and accuracy of the preliminary, provisional, or estimated data. How much confidence should they have in the first estimates?22 Accordingly, it is good practice for statistical agencies to conduct periodic analyses of revisions (or “revision studies”) and to make them available to users. Today’s information technology environment makes such studies less demanding than in the past. The following two good practices for revision studies have been identified:

7.1 Periodic analyses of revisions investigate the sources of revision from earlier estimates and statistical measures of the revisions.

7.2 The analyses are published for major aggregates to facilitate assessment of the reliability of the preliminary estimates.

Among several measures of revisions, the simplest are those that capture the extent to which preliminary and second releases of data indicate the direction of change—acceleration or deceleration—or the extent to which they are near trend.23 Most revision studies feature measures of the bias and dispersion of revisions.
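As an illustration of what such measures might look like, the sketch below computes, for two hypothetical vintages of a quarterly growth series, the mean revision (a measure of bias), the mean absolute revision and standard deviation of revisions (measures of dispersion), and the share of quarters in which the two vintages agree on the direction of change (acceleration or deceleration). The data and function name are invented; actual revision studies are typically far more detailed.

```python
import numpy as np

def revision_statistics(preliminary, later):
    """Simple revision measures for two vintages of the same growth series.

    preliminary, later: arrays of period-over-period percent changes.
    """
    p = np.asarray(preliminary, dtype=float)
    r = np.asarray(later, dtype=float)
    rev = r - p  # revision = later vintage minus preliminary vintage
    same_direction = np.mean(np.sign(np.diff(p)) == np.sign(np.diff(r)))
    return {
        "mean revision (bias)": rev.mean(),
        "mean absolute revision": np.abs(rev).mean(),
        "standard deviation of revisions": rev.std(ddof=1),
        "share agreeing on direction of change": same_direction,
    }

# Invented example: advance estimates and later-vintage estimates of quarterly growth.
advance = np.array([0.7, 1.1, 0.4, 1.6, 0.9, 0.2, 1.3, 0.8])
revised = np.array([0.9, 1.0, 0.6, 1.8, 1.1, 0.1, 1.5, 1.0])
for name, value in revision_statistics(advance, revised).items():
    print(f"{name}: {value:.2f}")
```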

With respect to measures of the bias of revisions, if a study shows a systematic bias in the revisions, users can adjust their interpretation of the preliminary estimates accordingly. Alternatively, the discovery of bias by a study may lead to changes in procedures, and these can be announced with the study results. See the description in Appendix II of Australia’s discovery of negative bias in the balance of payments current account first estimates and its changes in procedures for collecting source data to correct this bias. Revision studies can also be used to fine-tune the timing of revisions within the cycle.

Measures of dispersion of the revisions provide users with an indication of the accuracy of the preliminary estimates and enable them to assess the likely size of future revisions. Box 3 in Appendix I provides an informative explanation and table for users on the historic size of revisions of GDP in the United States. This statistical analysis provided a range within which future revisions of GDP could be expected (that is, “the fourth-quarter change in real GDP, now estimated at 0.7 percent at an annual rate, is not likely to be revised below 0.1 percent or above 1.6 percent in the next two releases”).
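The quoted range is obtained by simple arithmetic: the current estimate is combined with the historical central two-thirds band of revisions reported in Box 3. A minimal worked sketch, using only the figures quoted above:

```python
# Worked example of the likely-revision range implied by historical dispersion.
current_estimate = 0.7            # fourth-quarter real GDP growth, annual rate (percent)
band_low, band_high = -0.6, 0.9   # central two-thirds range of past revisions (percentage points)

likely_low = current_estimate + band_low    # 0.7 - 0.6 = 0.1
likely_high = current_estimate + band_high  # 0.7 + 0.9 = 1.6
print(f"likely range after the next two releases: {likely_low:.1f} to {likely_high:.1f} percent")
```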

It is important to report to users not only the statistical analysis carried out in the revision studies but also the basic data flows from the first estimates through all the revisions. Providing the basic data to users allows them to conduct their own studies of revisions if they wish. For example, Runkle (1998), in a study conducted four years earlier than the study mentioned above, found that some bias did exist in revisions of GDP in the United States.

8. When a mistake in reporting or processing is made, the revision is made in a transparent and timely manner.

As the saying goes, “to err is human,” and contrary to some jokes, statisticians are human. Many different types of mistakes occur in official statistics, from simple mathematical and recording errors to misclassifications and mistakes in coverage. The mistakes may be by the statistical agency or by the reporters of source data. It is critical for the integrity of a country’s statistical system that statistical agencies not only report any errors to users as soon as possible, but also explain errors in a way that gives assurance that the mistakes were not politically motivated. Explanations for mistakes are much easier when users are already well informed by complete metadata and related documentation on the compilation procedures and sources and flows of data used by the statistical agency. In such a transparent environment, it is just as likely that users will detect errors as the statistical agency or will at least quickly understand the source of the error.

An example of reporting errors is provided in Box 5 in Appendix II. The Australian statisticians explain several errors in balance of payments statistics that they identified through improved data collection (expanded individual security reporting leading to detection of mistakes in classification) and analysis of data.

An example that received wide publicity was the announcement by the Philippine government that its balance of payments current account surplus had been significantly overstated for the past several years owing to an understatement of imports. An interagency task force identified these errors. The National Statistics Office and the Bangko Sentral ng Pilipinas issued public statements during 2003 explaining the errors as well as the implications for the economic outlook.24 The clear and transparent explanations avoided an erosion of confidence and trust in the government that might have occurred if the errors had come to light in a less orderly and effective manner.

Proposed Best Practice Elements of a Revisions Policy for Macroeconomic Statistics

1. Periodic consultations with users elicit views about revisions practices. Ongoing dialogue with users, including the media, communicates the revisions policy.

2. A summary statement of when to expect revisions and why is readily accessible to users.

3. The current revision cycle is relatively stable from year to year.

3.1 The revisions are timed to incorporate new source data.

3.2 The revision schedule takes into account the timing for preparing important official economic policy documents.

3.3 The revision schedule takes into account the timing of revisions in other datasets.

4. Major conceptual and methodological revisions are usually introduced every four to six years.

5. Revisions are carried back several years to give consistent time series.

6. Documentation on revisions is readily available to users.

6.1 Preliminary (or provisional or estimated) data and revised data are identified as such.

6.2 Advance notice is given of major changes in concepts, definitions, and classifications and in statistical methods.

6.3 The sources of revision are explained when the revised series are released.

6.4 Breaks in series are documented when consistent series cannot be constructed.

7. Users are reminded of the size of the likely revisions based on past history.

7.1 Periodic analyses of revisions investigate the sources of revision from earlier estimates and statistical measures of the revisions (for example, bias and dispersion).

7.2 The analyses are published for major aggregates to facilitate assessment of the reliability of the preliminary estimates.

8. When a mistake in reporting or processing is made, the revision is made in a transparent and timely manner.

V. Next Steps

The pressures are building from several directions, as noted in the introduction, to elaborate an internationally accepted set of good practices for revisions to official statistics and to recognize the importance of a revisions policy. We are trying to push the process further by suggesting a clear typology and terminology to facilitate discussion, by laying out the landscape of needs and constraints to be addressed, and by proposing a set of good practices.

The international statistical community is being invited to discuss the proposed practices with a view to agreeing on a set of good practices for revisions of official statistics. Such a set could serve as a useful guide for countries designing revisions policies to fit their own particular circumstances. These practices could be adapted for presentation in international methodological manuals and in quality frameworks, such as the IMF’s Data Quality Assessment Framework.

Statistical agencies may begin to take some actions anticipating an internationally agreed set of good practices for revisions policy. Conducting consultations and meetings with users and surveying their needs and priorities must be the basis for any well-considered revisions policy. Communication with users will provide key information concerning the difficult task of balancing timeliness, on the one hand, and accuracy and consistency on the other hand—information needed for a satisfactory schedule to be set for the release of preliminary and revised data. Statistical agencies may also begin to implement some of the less debatable and complicated proposed best practices, such as the relatively straightforward practice of identifying preliminary and revised data in statistical publications.

APPENDIX I

The Policy and Practice of Revising GDP Estimates in the United States

The policy and practice of revising GDP estimates of the U.S. Bureau of Economic Analysis (BEA)25 has three noteworthy features:

  • A clear, short summary statement of when and why to expect revisions is provided to users.

  • Major conceptual and methodological revisions are introduced only every five years or so.

    – The data are revised back several years to give consistent time series.

    – The revisions are explained in advance.

  • The users are reminded of the size of the likely revisions based on past history.

(1) A clear, short summary statement of when and why to expect revisions is provided to users, normally in the BEA news releases and on the BEA’s website (see Box 1, which is a typical summary statement of the revision cycle of the GDP estimates provided by BEA). The revisions typically involve a cycle of five years that includes three estimates for each quarter, annual revisions of the estimates for the three most recent years, and quinquennial benchmark revisions. The cycle reflects the “time-dependent nature of the quantity and quality of the source data.”26 The information about the sources of the revision is widely disseminated, for example, through news releases, publication in one of the current quarterly releases, publication in the Survey of Current Business, and posting on the BEA’s website. The revisions in the estimates incorporate the following main types of improvements (see Box 2, which reproduces extracts from a BEA news release concerning the sources of revision in the GDP estimate for the first quarter 2002):27 (i) in source data—as new data become available, including new benchmark input-output accounts, judgmental estimates/source data of earlier vintage are replaced; (ii) in methodologies, such as changes in the measures of real growth and inflation,28 and changes in definitions and classifications that better reflect the current features of the economy, such as the recognition of computer software as investment; and (iii) in presentation of GDP and other tables to make them more informative. The estimates may also be revised to update seasonal adjustment factors and to correct errors in source data or computations.29

The BEA Summary Statement of the Revision Cycle

Quarterly estimates of GDP are released on the following schedule: “Advance” estimates, based on source data that are incomplete or subject to further revision by the source agency, are released near the end of the first month after the end of the quarter; as more detailed and more comprehensive data become available, “preliminary” and “final” estimates are released near the end of the second and third months, respectively.

Annual revisions are usually carried out each summer and cover the quarters of the most recent calendar year and of the 2 preceding years. Comprehensive (or benchmark) revisions are carried out at about 5-year intervals and incorporate definitional and classificational changes that update the accounts to portray more accurately the evolving U.S. economy and statistical changes that update the accounts to reflect the introduction of new and improved methodologies and the incorporation of newly available and revised source data.

Source: BEA News Release, January 30, 2003 (http://www.bea.doc.gov/bea/newsrel/gdpnewsrelease.htm).

Quarterly revisions: The quarterly “advance” estimates of GDP are extrapolations derived from a combination of preliminary results from surveys, such as the surveys of retail sales and manufacturers’ shipments, and extrapolations for such components as international trade, private inventories, and a large share of consumer spending on domestic services. The advance GDP estimates are released near the end of the first month following the reference quarter, and subsequently revised. At the time of preparation of “preliminary” estimates, among other improvements, the extrapolations used in the advance estimates are replaced by survey data on private inventories and customs data on international trade in goods—two volatile GDP components. The preliminary estimates are released at the end of the second month following the reference quarter. The “final” estimates are released at the end of the third month. In addition, the quarterly GDP estimates are revised three times in the course of as many years (except in the years when benchmark revisions are done) as the first, second, and third annual revision estimates.

Sources of Revision

Quarterly revision

The GDP estimates (GDP: First Quarter 2002 FINAL) released today are based on more complete source data than were available for the preliminary estimates issued last month. In the preliminary estimates, the increase in real GDP was 5.6 percent. The final estimate of the first-quarter increase in real GDP is 0.5 percentage point, or $12.3 billion, higher than the preliminary estimate issued last month. The upward revision to the percentage change in real GDP reflected a downward revision to imports of goods and services and an upward revision to equipment and software that were partly offset by a downward revision to exports of goods and services.

[Table not reproduced.]

Annual revision

The annual revision of the national income and product accounts, covering the first quarter of 1999 through the first quarter of 2002, will be released along with the advance estimate of GDP for the second quarter of 2002 on July 31. Features of this revision include the incorporation of a new price index for brokerage services and the adoption of a new revision schedule for wages and salaries that permits the incorporation of more comprehensive quarterly source data on a more timely basis. An article describing the revision will appear in the August 2002 issue of the Survey of Current Business.

Source: BEA News Release, June 27, 2002 (http://www.bea.doc.gov/bea/newsrel/gdp102f.htm).

Annual revisions: The first annual GDP estimates are derived as the sum of the quarterly estimates of the reference year, and revised each summer (in July). The estimates of the most recent calendar year and the two preceding years are subsequently revised. The revisions are timed to include major annual source data that become available at this time, and new quarterly data. For example, the preliminary Internal Revenue Service (IRS) tabulations of data from corporate tax returns that are used to compile estimates of corporate profits become available about two years after the reference year, and the final tabulations are available with a three-year lag.30

BEA also makes improvements when it does the routine work of bringing in better source data. For example, the latest annual revision of the estimates (1999–2001) included such improvements as (i) a new methodology and revision schedule for the quarterly estimates of wages and salaries and related income-side components; (ii) new price indices to improve the real estimates of personal consumption expenditures, foreign transactions, and Federal Government spending; and (iii) the compilation of personal consumption expenditures on a commodity basis.31 The previous annual revision of the estimates (1998–2000) incorporated, among other things, the North American Industry Classification System (NAICS), which affected the detailed estimates of private inventories by industry.32

(2) Major conceptual and methodological revisions are introduced only every five years or so. Typically, such revisions are introduced in the comprehensive revisions of GDP estimates. For example, the 1999 comprehensive revision—the eleventh such revision—included improvements in definitions and classifications,33 such as recognition of business and government expenditures for software as fixed investment; treatment of government employee retirement plans in the same way as private pension plans; and redefinition of dividend payments by regulated investment companies to exclude distributions that reflect capital gains income.34

All previous period estimates are subject to revision in comprehensive revisions starting from the last benchmark input-output table. Selected series are revised for earlier periods to give consistent long-term time series. Currently, the revised annual National Income and Product Account (NIPA) estimates are available from 1929. The quarterly current dollar series start from 1946, and the series on quantity and price measures from 1947. The monthly series start from 1959.

Improvements in the comprehensive revisions are publicized in advance.35 For example, the BEA has stated that in 2003 it intends to start the twelfth benchmark revision of the GDP estimates and other NIPAs. Among other things, the revision will incorporate the 1997 benchmark I-O accounts,36 the reclassification of industry estimates using NAICS, new producer price indices for services for deflation purposes, and several presentational improvements, including updating the reference year for price and quantity measures to 2000.37

(3) The BEA reminds users of the size of the likely revisions based on past history while conducting periodic analyses of the reliability38 of the revised estimates. (See Box 3, which is an extract from the BEA news release concerning the expected changes to the “preliminary” and “final” GDP estimates of fourth quarter 2002. The box also shows historic comparisons of the quarterly revisions.)

Historic Size of Revisions

The table below shows comparisons of the revisions between quarterly percent changes of GDP for the different vintages of the estimates. These comparisons can be used to assess the likely size of future revisions. For example, two-thirds of the revisions between the quarterly change in the advance estimate of real GDP and that in the final estimate were within a range of -0.6 to +0.9 percentage point. Thus, based on past history, the fourth-quarter change in real GDP, now estimated at 0.7 percent at an annual rate, is not likely to be revised below 0.1 percent or above 1.6 percent in the next two releases.

Revisions Between Quarterly Percent Changes of GDP: Vintage Comparisons

(Annual rates)

[Table not reproduced.]

NOTE. These comparisons are based on the period from 1978 through 2001 for the first three comparisons in each group, and on the period from 1978 through 1999 for the last three comparisons in each group.

Source: BEA News Release, January 30, 2003 (http://www.bea.doc.gov/bea/newsrel/gdpnewsrelease.htm)

A recent analysis of the reliability of the revised estimates—the fourteenth of its kind—concludes that revisions39 have no “momentum,” that is, they do not have a bias that could predict future revisions, and that they are explained largely by the use of new information and definitions.40 For example, the classification of computer software as investment and similar improvements in the 1999 comprehensive revision raised the GDP growth rates, on average, by 0.4 percent in the latter half of the 1990s. The mean of the revisions has a positive sign, reflecting improvements in coverage.

Further, while the aggregate effect of the revisions has diminished over time (the average absolute revision to quarterly GDP growth, a little over 1 percent for the period since the early 1980s, has fallen to about 0.7 percent in recent years), the revisions to GDP components may still be significant. For example, in the 1996 comprehensive revision, the reclassification of the purchases of the Commodity Credit Corporation from the government sector to the business sector was significant, but it did not affect GDP growth because the impacts were offsetting. By contrast, the definitional improvements in the 1999 comprehensive revision (such as the reclassification of computer software as investment mentioned earlier) reinforced one another and raised the GDP growth rate by 0.4 percent in the latter half of the 1990s.
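
To make these reliability measures concrete, the following minimal sketch (in Python, with invented vintage data rather than BEA’s actual figures) computes the mean revision and mean absolute revision used in the Fixler and Grimm (2002) study cited above, and uses the central two-thirds of past revisions to bracket a latest estimate in the way illustrated in Box 3.

    import statistics

    # Hypothetical quarterly real GDP growth rates (annual rates, percent):
    # "advance" is the first-published vintage, "latest" a later vintage.
    advance = [2.1, 3.4, 0.5, 1.8, 2.7, 4.0, 1.2, 2.9, 3.6, 0.8, 2.2, 3.0]
    latest  = [2.4, 3.1, 0.9, 1.6, 3.3, 3.8, 1.0, 3.4, 3.2, 1.1, 2.0, 3.5]

    revisions = [l - a for a, l in zip(advance, latest)]

    # Mean revision: its sign indicates bias in the early vintage.
    mean_revision = statistics.mean(revisions)
    # Mean absolute revision: the typical size of revisions, sign ignored.
    mean_abs_revision = statistics.mean([abs(r) for r in revisions])

    # Central two-thirds of past revisions (1/6 and 5/6 quantiles), used as in
    # Box 3 to bracket the current estimate of quarterly growth.
    cuts = statistics.quantiles(revisions, n=6)
    low, high = cuts[0], cuts[4]
    current_estimate = 0.7  # percent, the Box 3 example
    print(f"Mean revision: {mean_revision:+.2f} percentage point")
    print(f"Mean absolute revision: {mean_abs_revision:.2f} percentage point")
    print(f"Likely range after revision: {current_estimate + low:.1f} to "
          f"{current_estimate + high:.1f} percent")

With the -0.6 to +0.9 percentage point interval reported in Box 3, the same arithmetic gives the 0.1 to 1.6 percent range quoted there.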

APPENDIX II

The Policy and Practice of Revising the Balance of Payments Statistics in Australia

The policy and practice of revising Balance of Payments (BOP) statistics at the Australian Bureau of Statistics (ABS) have three noteworthy features:

  • A regular and transparent schedule is followed.

    – Revisions are timed to make the BOP estimates consistent with national accounts estimates.

    – Revisions are documented and explained.

  • Major methodological revisions are explained in advance.

    – Users are consulted.

  • The results of analyses of revisions are taken into account in revising the data for subsequent periods.

(1) Revisions follow a regular and transparent schedule. The BOP estimates for the current financial year (which ends in June) are revised at quarterly intervals. The revised data are published in the quarterly publication Balance of Payments and International Investment Position and in the July, October, January and April issues of the monthly publication International Trade in Goods and Services (see Box 4, which is the schedule of quarterly revisions). The quarterly revision of the BOP estimates reflects the periodicity of several BOP data sources. Estimates of trade in goods are based on timely and reliable customs data. These data are updated daily, so revisions feed through very quickly. In contrast, services estimates are based predominantly on data collected by the quarterly Survey of International Trade in Services, whose results are available three months after the end of the reference period. Services estimates for the latest periods are therefore extrapolated and are replaced with the survey-based estimates when they become available.

Quarterly Revisions

[Schedule table not reproduced in this extract.]

Source: ABS (2002)

In general, more accurate information is incorporated into the estimates as soon as possible. However, the monthly data are normally not revised in the first month of a quarter, so as to minimize disruption to the historical series and to keep the monthly and quarterly series consistent. Revisions to the BOP estimates of investment income and the capital account for periods prior to the current financial year are made only twice a year. Other BOP data prior to the previous financial year are also revised twice a year. The revised data are published in the July issue of the monthly publication International Trade in Goods and Services and in the June issue of the quarterly publication Balance of Payments and International Investment Position. Exceptions to this rule may be made for significant revisions that are important enough to require immediate publication.41
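
The “extrapolate, then replace” treatment of the services estimates described above can be illustrated with a small sketch; the growth-rate extrapolation, the function name, and the figures below are illustrative assumptions, not the ABS’s actual method.

    from typing import Optional

    def services_estimate(survey_history: list[float],
                          latest_survey: Optional[float]) -> float:
        """Estimate trade in services for the latest quarter.

        survey_history: survey-based estimates for earlier quarters ($A million).
        latest_survey: the quarterly survey result for the latest quarter, or
        None if it is not yet available (results arrive about three months
        after the reference period).
        """
        if latest_survey is not None:
            # Once the survey result is in, it replaces the extrapolation.
            return latest_survey
        # Until then, extrapolate -- here with the average recent growth rate,
        # a simplifying assumption made purely for illustration.
        growth = [b / a for a, b in zip(survey_history[:-1], survey_history[1:])]
        return survey_history[-1] * (sum(growth) / len(growth))

    history = [7400.0, 7520.0, 7610.0]
    print(services_estimate(history, latest_survey=None))    # first release: extrapolated
    print(services_estimate(history, latest_survey=7655.0))  # later revision: survey-based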

To ensure consistency with the national accounts, the timing of the BOP revisions closely follows the national accounts revisions and benchmarking policy. Specifically, if revisions to the BOP and IIP data are being considered outside the regular revision schedule and within three years of the reference year, the national accounts staff are consulted on the implications of such revisions for consistency between the BOP and the national accounts, including when major revisions are incorporated.

Revisions are documented and explained. The quarterly Balance of Payments and International Investment Position publication includes a table summarizing the revisions made since the previous issue, together with notes explaining why the revisions were made. Lengthier listings of revisions are given in the annual publication (see Box 5, which reproduces text on revisions from the quarterly BOP and IIP statement and from the annual publication).

Explanation of Revisions

Quarterly statement:

Seasonally adjusted and trend estimates of the current account have been revised as a result of the annual seasonal reanalysis which takes account of information that has become available since the previous analysis. Revised historical and new forward seasonal factors to September 2003 were released on 20 November 2002.

Incorporation of the latest available survey and administrative data has resulted in revisions to the current account back to March quarter 2000, reducing the 2001-02 current account deficit by $A405m. The financial account and international investment position have been revised back to September quarter 2001, decreasing Australia’s net IIP liability as at 30 June 2002 by $A5b.

Annual publication:

Revisions have decreased the deficit on current account by $A205 million in 1998–99. Chain volume measures and associated price measures incorporate a new base year (1998-99), which has resulted in revisions to levels for all periods. There have been substantial revisions to the financial account and international investment position (IIP) back to the September quarter 1988. The revisions are the result of methodological changes, improved reporting and the identification and correction of errors. These are detailed below:

Currency and residual maturity of foreign debt

(a) A method has been applied for allocating a residual maturity to Commonwealth Government and State and Territory Central Borrowing Authority securities issued in Australia and held by nominees on behalf of non-residents. These $A debt securities were previously classified as unallocated. The new method, applied from 1999–2000, uses the identifying information for each line of stock reported by nominees as held on behalf of non-residents to apply the appropriate residual maturity.

(b) Financial derivative assets and liabilities have been allocated to currency and residual maturity categories from 1999-2000.

Valuation of unlisted equity assets

The reported valuations for foreign investment in unlisted equities issued in Australia and Australia’s direct investment abroad in unlisted equities have been reviewed. While a range of valuation bases are used by investors to report their equity holdings, these are not always a good practical approximation to the market price valuation required in international investment position statistics. Where the reporting basis used is historic acquisition cost, this can diverge significantly from market valuation.

(a) Foreign investment in Australia (FIA)

Analysis of company reports and other sources, and contact with the more significant direct investment enterprises in Australia, have resulted in market price valuations now being applied. Coverage problems have also been identified and rectified.

(b) Australian investment abroad (AIA)

For a number of unlisted investments abroad the ABS has estimated market valuations based on a variety of indicators obtained from published company accounts and other public sources as well as on information from reporting businesses. The level of direct equity investment abroad and the changes in investment position due to market price changes have been revised from 1993–94.

The ABS will closely monitor reported values to avoid any future wide divergence from market prices.

These valuation changes do not affect BOP transactions or any foreign debt measures.

Improved use of expanded individual security reporting has led to the identification and correction of errors in the sector classification of the Australian issuers of both debt and listed equity securities. Significant errors in the market price valuation of these securities and coverage deficiencies have been rectified. The analysis also identified some non-resident issues in Australia being reported as Australian liabilities, which overstated Australia’s external debt.

Ongoing analysis of reported IIP information in the context of the financial accounts of the Australian national accounts has identified reporting errors which have been rectified.

Financial derivative asset and liability positions previously reported on a net basis are now reported on a gross basis.

Sources: ABS (2002i) and ABS (2001)

(2) Major methodological revisions are explained in advance. The compilation and presentation of the BOP data in accordance with the fifth edition of the IMF’s Balance of Payments Manual (BPM5), which started in December 1997, was announced by the ABS in September of the same year. The ABS committed itself to implementing the revised BPM5 in full. It provided users with a description of the new standard, including illustrations of sample draft data tables.42 The strategy for implementing the BPM5 recommendations was outlined, and the benefits to Australia of adopting the standards were explained, including the compilation of consistent rest-of-the-world accounts, balance of payments, and other national accounts components. Earlier, in December 1994, the ABS had published a paper discussing the effects of implementing BPM5 on Australia’s statistics.43

Consultations with key users to assess the timetable and priorities for implementing BPM5 started in November 1994. Following these consultations, an article and a discussion paper on the issue were published. The implementation proposal was modified and, in 1995, a wider range of users was approached. Users were invited to provide feedback on the revised implementation proposal and to participate in user forums to discuss it. Data providers were also consulted to determine the feasibility of collecting adequate source data, as well as to brief them on the implications of collecting data to support the new standard.

(3) The results of analyses of revisions are taken into account in revising the data for subsequent periods. The analyses typically focus on the direction of the revisions (bias in the initial BOP estimates) and on their magnitude (dispersion of the latest estimates from the initial estimates). One such analysis of the current account balance estimates covered the period 1986 to 1994.44 As a consequence of this study, major revisions to the debit items are now postponed until updated data for the credit items become available, so as to remove the bias in subsequent estimates of the current account balance.
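
A stylized sketch of this kind of revisions analysis is given below: it tracks the gap between each successive vintage of the current account balance and the final figure, the pattern examined in the ABS (1996) study. All figures are invented for illustration.

    import statistics

    # Each row is a reference quarter; columns are the initial estimate,
    # the first to fourth quarterly revisions, and the final estimate
    # (current account balance, $A million; hypothetical data).
    vintages = [
        [-4800, -4950, -5050, -5100, -5000, -4900],
        [-5200, -5350, -5400, -5450, -5300, -5150],
        [-4600, -4700, -4800, -4850, -4750, -4650],
        [-5000, -5100, -5200, -5250, -5150, -5050],
    ]

    finals = [row[-1] for row in vintages]
    labels = ["initial", "1st rev.", "2nd rev.", "3rd rev.", "4th rev."]
    for point, label in enumerate(labels):
        gaps = [row[point] - final for row, final in zip(vintages, finals)]
        # A persistently negative median gap means the deficit is overstated
        # at that revision point relative to the final estimate.
        print(f"{label}: median gap vs final = {statistics.median(gaps):+.0f}")

In this invented example the initial estimates sit close to the final figures, the intermediate revision points overstate the deficit, and the gap then closes—the profile that motivated the change in the timing of debit-item revisions described above.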

References

General

  • Australian Bureau of Statistics, 2002, “Revisions in Australia’s Balance of Payments (BOP) Statistics,” paper presented at the Fifteenth Meeting of the IMF Committee on Balance of Payments Statistics, Canberra, October.

  • Bø, Olav, Jon Ivar Røstadsand, and Espen Tørum, 2003, “The Reliability of Today’s Financial Macroindicators,” Norges Bank Economic Bulletin, Vol. 74:3. Also available at http://www.norges-bank.no.

  • Bloem, Adriaan M., Robert J. Dippelsman, and Nils O. Maehle, 2001, Quarterly National Accounts Manual (Washington: International Monetary Fund).

  • Brown, Robert, Bruce T. Grimm, and Marian B. Sacks, 2003, “The Reliability of the State Personal Income Estimates,” Survey of Current Business, Vol. 83 (December). Also available at www.bea.doc.gov/.

  • Carson, Carol S., and Lucie Laliberté, 2002, “Assessing Accuracy and Reliability: A Note Based on Approaches Used in National Accounts and Balance of Payments Statistics,” IMF Working Paper 02/24 (Washington: International Monetary Fund). Also available at http://www.imf.org/ (Publications).

  • Carson, Carol S., and Lucie Laliberté, 2001, “Manuals on Macroeconomic Statistics: A Stocktaking to Guide Future Work,” IMF Working Paper 01/183 (Washington: International Monetary Fund). Also available at http://www.imf.org/ (Publications).

  • Consultative Seminar on Governance of National Statistical Systems, 2002, Summary of Proceedings, hosted by the International Monetary Fund, the United Nations Statistics Division, and the Singapore Department of Statistics, May 28-30. Available at http://www.singstat.gov.sg.

  • Eisner, Robert, 1989, “Divergences of Measurement and Theory and Some Implications for Economic Policy,” American Economic Review, Vol. 79 (1), pp. 1-13.

  • Eurostat and European Central Bank, 2003, “Harmonization of Revisions Practices for B.O.P./I.I.P. Statistics,” May 20.

  • Fixler, Dennis J., Bruce T. Grimm, and Anne E. Lee, 2003, “The Effects of Revisions to Seasonal Factors on Revisions to Seasonally Adjusted Estimates,” Survey of Current Business, Vol. 83 (December). Also available at www.bea.doc.gov/.

  • International Monetary Fund, Statistics Department, 2002, “Revision Policy and Practice: A First Overview of Country Practices,” paper presented at the Fifteenth Meeting of the IMF Committee on Balance of Payments Statistics, Canberra, October.

  • Lal, Kishori, 1998, “National Accounts Revision Practice: Canada,” paper presented at the Annual OECD Meeting of National Accounts Experts, Paris, September.

  • Nesbit, Shirley, 2002, “Revisions in the New Zealand Balance of Payments,” paper presented at the Fifteenth Meeting of the IMF Committee on Balance of Payments Statistics, Canberra, October.

  • Penneck, Stephen, 1998, “National Accounts Revision Policy,” paper presented at the Annual OECD Meeting of National Accounts Experts, Paris, September.

  • Penneck, Stephen, 1998, “The UK Approach to Educating Users,” paper presented at the Annual OECD Meeting of National Accounts Experts, Paris, September.

  • Runkle, David E., 1998, “Revisionist History: How Data Revisions Distort Economic Policy Research,” Federal Reserve Bank of Minneapolis Quarterly Review, Vol. 22 (4), pp. 3-12.

  • Statistics Norway, 1998, “National Accounts Revision Policy in Norway,” paper presented at the Annual OECD Meeting of National Accounts Experts, Paris, September.

  • United Nations Statistics Division, 2003, Handbook of Statistical Organization: The Operation and Organization of a Statistical Agency, 3rd ed. (New York).

Appendix I

  • Bureau of Economic Analysis, U.S. Department of Commerce, 1998, “U.S. National Income and Product Accounts: Release Schedule and Revision Practice,” paper presented at the Annual OECD Meeting of National Accounts Experts, Paris, September.

  • Bureau of Economic Analysis, U.S. Department of Commerce, 2002, “Note on the Upcoming Comprehensive Revision of the National Income and Product Accounts,” Survey of Current Business (November). Also available at www.bea.doc.gov/.

  • Fixler, Dennis J., and Bruce T. Grimm, 2002, “Reliability of GDP and Related NIPA Estimates,” Survey of Current Business (January). Also available at www.bea.doc.gov/.

  • Jenkinson, Graham, and Nigel Stuttard, 2004, “Revisions Information in ONS First Releases,” Office of National Statistics Economic Trends, No. 604 (March). Also available at http://www.ons.gov.uk/.

  • Lawson, Ann M., Kurt S. Bersani, Mahnaz Fahim-Nader, and Jiemin Guo, 2002, “Benchmark Input-Output Accounts of the United States,” Survey of Current Business (December). Also available at www.bea.doc.gov/.

  • McCulla, Stephanie H., and Carol E. Moylan, 2003, “Preview of Revised NIPA Estimates for 1997: Effects of Incorporating the 1997 Benchmark I-O Accounts, Proposed Definitional and Statistical Changes,” Survey of Current Business, Vol. 83 (January). Also available at www.bea.doc.gov/.

  • Moulton, Brent R., 2000, “Improved Estimates of the NIPAs for 1929-99,” Survey of Current Business (April). Also available at www.bea.doc.gov/.

  • Moulton, Brent R., Eugene P. Seskin, and David F. Sullivan, 2001, “Annual Revision of the National Income and Product Accounts,” Survey of Current Business (August). Also available at www.bea.doc.gov/.

  • Moulton, Brent R., Robert P. Parker, and Eugene P. Seskin, 1999, “A Preview of the 1999 Comprehensive Revision of the National Income and Product Accounts: Definitional and Classificational Changes,” Survey of Current Business (August). Also available at www.bea.doc.gov/.

  • Moulton, Brent R., and David F. Sullivan, 1999, “A Preview of the 1999 Comprehensive Revision of the National Income and Product Accounts: New and Redesigned Tables,” Survey of Current Business (September). Also available at www.bea.doc.gov/.

  • Seskin, Eugene P., and Stephanie H. McCulla, 2002, “Annual Revision of the National Income and Product Accounts,” Survey of Current Business (August). Also available at www.bea.doc.gov/.

Appendix II

  • Australian Bureau of Statistics, 2002, “Revisions in Australia’s Balance of Payments Statistics,” paper presented at the Fifteenth Meeting of the IMF Committee on Balance of Payments Statistics, Canberra, October.

  • Australian Bureau of Statistics, 2002i, Balance of Payments and International Investment Position, Australia (Cat. No. 5302), September Quarter. Also available at www.abs.gov.au.

  • Australian Bureau of Statistics, 2001, Balance of Payments and International Investment Position, 1999-2000 (Cat. No. 5363.0). Also available at www.abs.gov.au.

  • Australian Bureau of Statistics, 1998, “Balance of Payments and International Investment Position, Australia, Concepts, Sources and Methods” (Cat. No. 5331). Also available at www.abs.gov.au.

  • Australian Bureau of Statistics, 1997, “Information Paper: Implementing New International Statistical Standards in ABS International Accounts Statistics” (Cat. No. 5364). Also available at www.abs.gov.au.

  • Australian Bureau of Statistics, 1996, “Information Paper: Quality of Australian Balance of Payments Statistics” (Cat. No. 5342). Also available at www.abs.gov.au.

  • Australian Bureau of Statistics, 1994, “Introduction of Revised International Statistical Standards in ABS Macroeconomic Statistics” (Cat. No. 5245). Also available at www.abs.gov.au.
1

This paper further develops research presented at the 54th Session of the International Statistical Institute, Berlin, Germany, http://dsbb.imf.org/ (go to DQRS and Work in Progress). The authors thank Claudia Dziobek for helpful suggestions and gratefully acknowledge comments received during the IMF Balance of Payments Statistics Committee meeting in October 2003 held in Washington.

2

The SDDS and GDDS were established in 1996 and 1997, respectively, to guide countries in the provision of data to the public. For more information, see the IMF’s website at http://dsbb.imf.org/.

3

For a summary, see Carson and Laliberté (2001). Available at http://www.imf.org/ (Publications).

5

For example, responding to one case of revisions and misreporting of fiscal data, the IMF Executive Directors “expressed serious concern that the erroneous data had misled IMF staff and the Executive Board about economic performance; prevented the formulation and implementation of timely corrective measures; and resulted in the design of an adjustment program that was partly based on inaccurate information.” The country authorities committed to remedial actions. See IMF News Brief No. 00/23 at http://www.imf.org/external/np/sec/nb/2000/nb0023.htm

6

“IMF Executive Board Reviews Data Provision for Surveillance,” Public Information Notice No. 02/133 (November 18, 2002) at http://www.imf.org/external/np/sec/pn/2002/pn02133.htm

7

See proceedings of the Consultative Seminar on Governance of National Statistical Systems, Singapore, May 28-30, 2002, hosted jointly by the United Nations Statistics Division, the Statistics Department of the IMF, and the Singapore Department of Statistics. Available at http://www.singstat.gov.sg.

8

See the papers for the Fifteenth Meeting of the IMF Committee on Balance of Payments Statistics (http://www.imf.org/external/bopage/stindex.htm) under “Data Quality,” see especially the paper “Revision Policy and Practice: A First Overview of Country Practices.”

9

The term “national statistical agency” will be used to cover national statistical offices, central banks, and ministries in the capacity of making statistical information available to the public.

10

Much of this paper is equally relevant for social and poverty indicators, for instance those of the Millennium Development Goals and Indicators. However, more work is needed to establish this point.

11

It is assumed in this paper that statistical agencies publish only revisions of significant size and that the agencies may have internal definitions of “significance.” For instance, one major national statistical agency corrects its price index only if the correction amounts to 0.2 percent or more.

12

For a discussion of revisions to seasonal factors for seasonally adjusted estimates (e.g., revising the size of the adjustment for year-end Christmas sales in seasonally adjusted GDP figures), see Fixler et al (2003).

13

Remarks by David Dodge, Governor of the Bank of Canada at the Conference of European Statisticians in Geneva, Switzerland (via videoconferencing), June 11, 2003. Available at http://www.unece.org.

15

See Eisner (1989, pp. 1-13).

16

The White Paper “Building Trust in Statistics” is available on the website of the U.K. Office of National Statistics at www.statistics.gov.uk.

17

Jenkinson and Stuttard (2004). The document is based on the National Statistics Code of Practice Protocol on Revisions (Version 1.0).

19

The Data Quality Assessment Framework (DQAF July 2003) identifies good practices with respect to Prerequisites of Quality and five dimensions of quality: Assurances of Integrity, Methodological Soundness, Accuracy and Reliability, Serviceability, and Accessibility. Revision practices are identified in three of these:

  • Dimension 1. Assurances of integrity: 1.2.4 Advance notice is given of major changes in methodology, source data, and statistical techniques.

  • Dimension 3. Accuracy and reliability: 3.5.1 Studies and analyses of revisions are carried out routinely and used internally to inform statistical processes.

  • Dimension 4. Serviceability: 4.3.1 Revisions follow a regular and transparent schedule; 4.3.2 Preliminary data and/or revised data are clearly identified; 4.3.3 Studies and analyses of revisions are made public.

For more about the DQAF, see the IMF’s Data Quality Reference Site (DQRS) at http://dsbb.imf.org/.

20

Revisions practices from published Reports on the Observance of Standards and Codes (ROSCs) for eight countries from different regions of the world are available in Appendix III of the original research paper at http://dsbb.imf.org/ (go to DQRS and to Work in Progress).

21

For government finance statistics and monetary statistics, a common basis for revisions is the official audit of the data, which is based more on accounting principles than statistical methodology. In fact, it is not uncommon to find that the only revisions of government finance and monetary statistics are from official audits. Data are usually considered “final” after the audits, which tends to make further revisions unlikely.

22

For example, in a recent article about financial macroeconomic indicators, Bø, Røstadsand, and Tørum (2003) of the Norges Bank note that monetary policy, in terms of the interest rate, is set on the basis of information available at the time of the decision. A necessary condition for setting the “right” interest rate is that the quality of initially published data is good. Therefore, an analysis of the properties of the initial estimates is of particular interest for monetary policy.

23

An example of this is shown in a study by Brown et al. (2003) of revisions to quarterly estimates of State Personal Income in the U.S. covering a ten-year period.

24

See press releases of April 2003 by the National Statistics Office of the Republic of the Philippines, at http://www.census.gov.ph/data/pressrelease/pr0338tx.html and the Bangko Sentral ng Pilipinas Press Release, April 2003, at http://www.bsp.gov.ph.

25

The BEA, an agency of the Department of Commerce, collects data from other (mainly Federal) statistical agencies and firms, conducts research and analysis, develops and implements estimation methodologies, and disseminates the statistics.

28

Until late in 1991, real growth was measured using GNP adjusted for inflation, and inflation was measured using the implicit GNP price deflator. From the fourth quarter of 1991 to late 1995, growth was measured using GDP adjusted for inflation, and inflation was measured using the implicit GDP price deflator, which assumes that the market basket of goods and services is constant over time. From the fourth quarter of 1995, chain-weighted implicit price deflators have been used.
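
For illustration, the sketch below contrasts a fixed-base implicit deflator (output valued at base-period prices) with a chain-type Fisher price index that updates the weights every period; the two-good data are invented and the code is only a schematic of the general technique, not BEA’s procedures.

    from math import sqrt

    # Prices and quantities for two goods over three periods (invented data).
    prices     = [(1.00, 1.00), (1.05, 0.80), (1.10, 0.60)]
    quantities = [(100, 20), (102, 35), (104, 60)]

    def value(p, q):
        return sum(pi * qi for pi, qi in zip(p, q))

    # Fixed-base implicit deflator: nominal output divided by output valued
    # at period-0 prices.
    fixed_base = [value(p, q) / value(prices[0], q)
                  for p, q in zip(prices, quantities)]

    # Chain-type Fisher price index: geometric mean of Laspeyres and Paasche
    # period-to-period links, cumulated ("chained") over time.
    chained = [1.0]
    for t in range(1, len(prices)):
        p0, p1 = prices[t - 1], prices[t]
        q0, q1 = quantities[t - 1], quantities[t]
        laspeyres = value(p1, q0) / value(p0, q0)
        paasche = value(p1, q1) / value(p0, q1)
        chained.append(chained[-1] * sqrt(laspeyres * paasche))

    print([round(x, 3) for x in fixed_base])
    print([round(x, 3) for x in chained])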

29

Since seasonal adjustment factors typically depend on future data, the seasonal adjustments are revised when these data become available.

30

The tax-based data cover all incorporated businesses and all industries, while financial-accounting measures are less comprehensive. However, since the latter are available on a more timely, quarterly basis, they are used to extrapolate the tax-return-based estimates to current periods (Seskin and McCulla, 2002).

33

Definitional changes accounted for an upward revision in the GDP of about $74.5 billion (McCulla and Moylan, 2003).

34

Moulton, Parker, and Seskin (1999). The revision also incorporated the 1992 benchmark I-O accounts and improvements in presentation, such as redesigned National Income and Product Accounts (NIPA) tables reflecting definitional and classification changes, new data series on computers and their contribution to GDP growth, and chain-type quantity and price indices with reference year updated from 1992 to 1996 (Moulton and Sullivan, 1999).

35

Adequate advance information is provided, as well, for other revisions. (For example, for annual revisions, see Box 3.)

36

Major changes in the latest 1997 benchmark I-O accounts include the incorporation of NAICS, which provides a new treatment of the service activities of central administrative offices and other types of auxiliaries and a more detailed presentation of the service industries. For details, see Lawson et al. (2002).

37

BEA (2002).

38

The ability of the “successive vintages of GDP estimates to present a consistent, general picture of the economy” (Fixler and Grimm, 2002).

39

Fixler and Grimm (2002) used two measures of reliability: the mean revision, defined as the average of the difference between the percentage changes in the earlier and later quarterly estimates, and mean absolute revision, defined as the average of the absolute differences in the two estimates.

40

There is also evidence to the contrary. For example, Runkle (1998) finds that initial estimates are not accurate and are biased in terms of predicting the final estimates.

41

ABS (1998).

42

ABS (1997).

43

ABS (1994).

44

Among other things, the study (ABS, 1996) found that the initial estimates of several items were understated (negatively biased), and that “the median initial estimate of the current account balance is close to the median final estimate but moves away with the first quarterly revisions and continues to worsen the overstatement of the deficit through to the fifth revision point before improving (in subsequent quarters).” The reason for the bias was that the debit items were revised earlier than the credit items, because the data on the credit items became available later.
