XX. The Balance of Payments Statistical Process
Author: International Monetary Fund

Overview

1033. This chapter is devoted to the design and management of the BOP statistical process. The statistical process consists of extraction of data from data sources, estimation of certain data, preparation of a BOP worksheet, verification of data in the worksheet, and publication of BOP statistics. The BOP database comprises groups of time series required to compile and publish BOP statistics and any data (metadata) that describe these series and the relationships between them.

1034. The BOP statistical process may be most effectively analyzed as a series of modules. Through the modular approach, each component may be examined in isolation before all components in the process are combined. The modular approach can be used at various levels—that is, a higher level module may consist of sub-modules. The modular approach also facilitates application of computer software because the first step in computerization of a processing system is to identify the processes involved.

1035. Illustration 20.1 on page 216 shows the primary modules in a typical BOP statistical process. A central feature of this process is the worksheet, which usually takes the form of a computer database. (See paragraphs 406-408 of chapter 10 for an introduction to the worksheet.) Data in the worksheet are obtained directly from data sources or from estimation modules. The analysis and verification process provides the compiler with a method of obtaining feedback on data; therefore, information for collection, estimation, and worksheet modules can be gathered during this stage.

Illustration 20.1

Balance of Payments Statistical Process

1036. When the analysis and verification process is complete and any consequent changes have been made in other modules, data can be produced for publication. Most data to be prepared for publication will be obtained from the worksheet, but some data, such as information for inclusion in an analytical commentary, may be obtained during the analysis and verification process. Preparation of data for publication may, in turn, reveal other issues that should be considered. The final part of the statistical process is user analysis, which is extremely important. User analysis may provide feedback to the analysis and verification process which may, in turn, affect all other processes in future cycles. Occasionally, users may play a role in the current production cycle, especially when the compiler is uncertain of possible user reactions or requirements in response to new developments. In these circumstances, consultation with users may provide input to the analysis and verification process in the current cycle.

1037. The final product of the BOP process should be statistics that are of good quality, produced on a regular and timely basis, and subject to a minimum level of revision.

1038. The remainder of this chapter describes in detail most of the topics associated with the BOP statistical process. Primary topics not covered in this chapter are collection design, which is described in chapter 18; form design, which is described in chapter 19; and publication of BOP statistics, which is described in chapter 21.

BOP Databases

1039. A BOP database contains BOP series, each of which is uniquely identified and described. Stored for each series is descriptive information, such as title, unit of measurement (for example, money value, quantity, or ratio), currency of denomination (in the case of money value), magnitude (for example, thousands or millions of units), and time period. Rules or formulae governing the ways that series may be manipulated or linked should also form part of the database. Limitations should be established for access to, or modification of, series in the database. Tabulation or similar facilities should be established to permit extraction and publication of single series or groups of series. Appropriate maintenance functions should ensure that data in the database are properly stored, secure, and accurate. The database may be maintained manually or on a personal, mini, or mainframe computer.

1040. Series should be identified in meaningful ways. For example, the compiler may wish to identify separately source series (S), which are series extracted directly from data sources such as ITS, an ITRS, ES, etc.; estimation series (E), which are series resulting from the process of estimation; worksheet series (W), which are series that comprise the BOP statement; and publication series (P), which are series made available for publication in print, computer media, or some other format. There may be other series, which the compiler may wish to maintain separately, such as projection series, worksheet back-up series, and previously published series. There may be groups of series to allow for the compilation and publication of BOP series classified by partner country. The labels associated with each group could be assigned as prefixes to series codes. For example, W2100 could refer to exports of goods f.o.b. stored in the worksheet, P2100 to the published version of that series, and C2100 to the previously published version of the series.
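
The sketch below is not part of the Guide's procedures; it merely illustrates, in Python, how uniquely coded series and their descriptive information might be stored so that different versions of the same BOP item can be retrieved together. Apart from the S, W, and P prefixes and item code 2100 (exports of goods f.o.b.) mentioned above, all names, fields, and values are hypothetical.

```python
# A minimal sketch, under assumed names, of how uniquely coded BOP series and
# their descriptive metadata might be stored.
from dataclasses import dataclass, field

@dataclass
class Series:
    code: str                  # e.g., "W2100" = worksheet version of item 2100
    title: str
    unit: str = "money value"  # money value, quantity, or ratio
    currency: str = "USD"      # currency of denomination, if a money value
    magnitude: str = "millions"
    observations: dict = field(default_factory=dict)  # period -> value

database = {
    "S2100": Series("S2100", "Exports of goods f.o.b. (source: ITS)"),
    "W2100": Series("W2100", "Exports of goods f.o.b. (worksheet)"),
    "P2100": Series("P2100", "Exports of goods f.o.b. (published)"),
}

# The prefix identifies the group (S = source, W = worksheet, P = published),
# so all versions of the same BOP item can be listed together.
versions_of_2100 = {code: s.title for code, s in database.items() if code.endswith("2100")}
print(versions_of_2100)
```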

1041. It is important that this type of functionality be maintained in a database (or across related databases) because the same title may apply to a number of series or versions of a series. For example, the recorded trade series used to compile the goods item may have several different values in the database. The observation from the source for a particular period may be revised. This observation is entered in the S series; however, until the observation is checked and verified or the compiler wishes to update the worksheet series, the W version of that series may be different and, in turn, the W version may not agree with the observation that was last published (P series).

Designing the Data Extraction Process

1042. The design of the interface between data sources and the BOP database is particularly important. Several procedures and safeguards are required; these are shown in illustration 20.2.

Illustration 20.2

Extraction of Data from Source

1043. The interface between a data source and the BOP compilation system should be designed in consultation with the data provider. Even if the source consists of published data, the compiler of the published data should be informed about BOP requirements to ensure that he or she and the BOP compiler both understand the relationship of the data to BOP requirements.

1044. Source data should be approved by a legitimate authority; if the source is published data, this condition is considered to be fulfilled. Tangible evidence of data provided (in the form of a written return or perhaps a computer disk) is required and should be retained by the compiler as a reference for a reasonable period. If data are provided over the telephone, the person receiving the data should maintain a record (or log) of the data and other relevant details such as the date and the name of the person providing the data.

1045. The BOP compiler should extract the data from the source and load them into the BOP system. Data should be entered as received. That is, there should be a one-to-one relationship between the data source series and the series recorded in the BOP system. This practice limits errors to mistakes in transcription and facilitates the checking of data entries.

1046. Data should be analyzed by the BOP compiler to ascertain whether the data supplied seem consistent with past data and other information available to the BOP compiler. The compiler may discover errors in source data or find that more information is required. If the compiler wishes to query a data provider about data supplied, the specific reason for the question should be explained to the data provider. The compiler should not simply ask that a figure be confirmed, as an error in the source data could simply be repeated. It is helpful if the BOP compiler is able to develop a rapport with the data provider and make the provider aware of the compiler’s requirements. Revised data supplied by the source compiler should be released by someone who is authorized to approve such data, and suitable documentation should be maintained by the compiler.

1047. When a data provider obtains information from other sources, he or she may be limited by confidentiality restrictions in disclosing information to the BOP compiler. Should confidentiality provisions hamper the checking of BOP data, the data provider and the BOP compiler should jointly develop suitable procedures that are consistent with the letter and spirit of the legislation. Eventually, it may be necessary to seek legislative amendments that facilitate access to confidential information while preventing disclosure of such confidential information to unauthorized persons.

1048. Once the compiler is satisfied with data from a particular source, he or she should sign off and clear the data for release to the BOP worksheet. The “sign-off” procedure may seem bureaucratic; however, BOP compilation is a complex process, and errors may occur at any stage of processing. Sign-off procedures minimize the risk of errors, permit identification of sources of errors, and facilitate the modification of procedures to avoid future errors.

1049. Error detection should be handled in a positive way. When an error occurs, the compiler should not attempt to condemn the guilty party. Rather, the facts of the case should be openly and honestly established, and procedures should be improved. Errors, especially if highlighted in the press, may be highly embarrassing. Therefore, compilers should attempt to maintain effective procedures for detecting and avoiding errors.

The Estimation Process

1050. Four types of estimation procedures are described in chapter 10. These are: simple estimation, sample expansion, estimation via a data model, and extrapolations or interpolations.

1051. Illustration 20.3 on page 218 shows that different estimation procedures may interact with one another. As the diagram indicates, informed analysis and judgment play important roles in the estimation process and, in turn, may be influenced by the projection process, which is also described in chapter 10. The estimation process provides important input for the projection process as many similar methods may be used in both processes.

Illustration 20.3

Balance of Payments Estimation Process

1052. The compiler should clearly distinguish among the three elements (shown in illustration 20.4 on page 218) that make up the estimation process.

Illustration 20.4

Phases of Estimation

1053. The source series, the estimation component (the factors or estimation series), and the resultant series may be recorded separately. For example, if a compiler estimates freight on imports as a percentage of the value of imports, it is possible to store this relationship in the BOP system in one of two ways. The first approach is to record in one series the value of imports and, in a second series, to record the value of freight on imports calculated as a percentage of imports. In other words, two series are stored. The second approach is to record three series: the source series, the factor series, and the resultant series (source series multiplied by factor series). The factor series would show explicitly the factor or assumption used to estimate freight, and the compiler would be able to vary the factor over time. Use of the second approach can greatly enhance the compiler’s ability to analyze and explain results.
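
The following minimal sketch shows the second approach described above: the source series, the factor series, and the resultant series are stored separately so that the estimation assumption is explicit and can be varied over time. The figures and the freight factors are purely illustrative.

```python
# Sketch of the three-series approach: source series, factor series, and
# resultant series stored separately. All figures are hypothetical.
imports_fob = {"Q1": 1000.0, "Q2": 1100.0, "Q3": 1050.0, "Q4": 1200.0}  # source series
freight_factor = {"Q1": 0.05, "Q2": 0.05, "Q3": 0.06, "Q4": 0.06}       # estimation (factor) series

# Resultant series: freight on imports = source series x factor series
freight_on_imports = {p: imports_fob[p] * freight_factor[p] for p in imports_fob}

for period, value in freight_on_imports.items():
    print(period, round(value, 1))
# Because the factor is stored explicitly, the compiler can see that the
# assumed freight ratio rises from 5 to 6 percent in Q3 and explain the result.
```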

Designing the Worksheet

1054. The design of a worksheet is a matter of judgment. Suggestions made in this Guide represent some alternatives and illustrate general considerations for designing a worksheet.

1055. The core of the worksheet contains the series required to compile the BOP statement. As paragraphs 1040 and 1041 of this chapter point out, the compiler should identify these worksheet (W) series separately from other series, such as source (S) series and estimation (E) series, in the database.

1056. The W series must be organized in a logical manner so that important data structures and classifications are readily identifiable. The reference number system explained in paragraphs 422-429 of chapter 10 demonstrates one way to identify series.

1057. A decision must be made as to the number of periods to be included in the worksheet. This decision may depend on a number of factors, such as paper size (if a manual system is being used) or screen size and computer memory (if a personal computer is being used). A key goal should be to keep the system manageable. If the compiler is using an off-the-shelf spreadsheet or database, considerations such as the time required to load the application and the amount of computer memory required may be important factors. These considerations might influence the compiler to maintain only a few periods in a worksheet file and to maintain data for earlier periods in different worksheet files. As a general rule, it may be desirable to show only four quarters and one year (or eight quarters and two years) in the worksheet.

1058. The compiler must also decide whether to use a bottom-up or a top-down approach to compilation. In a bottom-up approach, series are entered at the lowest level, and all higher level aggregates are calculated from the lowest level series. In a top-down approach, lower and higher level series are entered, and lower and higher level aggregates are reconciled to ensure that there are no errors. In a top-down approach, various reconciliation series should be included in the worksheet. The top-down approach is very useful when the data provider supplies aggregate data before supplying the component series.

1059. The worksheet should also employ a modular approach. There should be a summary section that brings together the main aggregates. The design of the worksheet could follow the pattern shown in illustration 20.5. Separate sheets might be designed for the current account, the capital and financial account, the IIP, and the reconciliation between stocks and transactions. The level of detail in the worksheet is a matter of judgment, although it is usually considered desirable to maintain a one-to-one relationship between series in the worksheet and those in source modules.

Illustration 20.5

1060. As an example, part of a worksheet showing compilation of the goods item is presented in illustration 20.6 on pages 220-221.

Illustration 20.6

1061. In accordance with the goal of keeping the worksheet to a manageable size, only four quarters and a year are recorded in the example.

1062. Labels or codes identifying the worksheet series follow the numbering system developed in the Guide. The first letter, W, indicates that these are worksheet series. The next four digits, which are taken from standard codes outlined in chapter 10, indicate the BOP item number. The tag code—information shown after the BOP item number—indicates the components of the BOP item. (The reader may recall that the structure of the tag code is optional, and the examples shown in illustration 20.6 are only illustrative.)

1063. References to source series are included in the worksheet. Documentation of this type, which shows links to sources, can be a valuable tool for the compiler. Documentation should be retained in working files rather than being left in archives or the compiler’s memory. Coding of source series should follow a logical pattern. In this example, source series were obtained from ITS (the SI series), from ES (the SS series), and from estimates (the SE series). The coding pattern for source series shown in the example is similar to that adopted for the W series. Alternatively, coding of source series could follow reference systems (if any) of sources.

1064. In this example, source series are generally ITS. In the case of exports, the ITS series on wheat exports is replaced by an enterprise survey series on wheat exports. Also, information (apparently not collected in ITS) on repairs to ships is added. For imports, the only adjustment made is in respect of converting imports valued at c.i.f. to an f.o.b. basis. These adjustments are estimated.

1065. Worksheet series for which no source series are shown—for example, W2100 and W2100.A—are derived from other worksheet series. Formulae for deriving these series form part of the metadata of the worksheet.

1066. The sample worksheet follows a top-down approach, as is indicated by the fact that data for total recorded trade are recorded as well as the components. Any discrepancy is recorded as a residual (series W2100.K and W3100.K). This top-down approach facilitates the use of preliminary ITS if, for example, aggregate data but not commodity data are initially available. The residual item can also point to any discrepancies that exist between the total and the components. Discrepancies indicate that some errors exist or that aggregate data may be more recent than components. (For example, aggregate data but not component data may have been revised.)
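
As a rough illustration of the residual series used in this top-down approach, the sketch below derives a residual as the difference between the reported aggregate and the sum of its components. Only the W3100 item code and the .K residual convention are taken from the example above; the component codes and all figures are hypothetical.

```python
# Sketch of a top-down residual series. A nonzero residual signals either an
# error or components that are less up to date than the aggregate.
total_recorded_imports = 2500.0            # aggregate reported in ITS
components = {"W3100.A": 1200.0,           # hypothetical commodity groups
              "W3100.B": 800.0,
              "W3100.C": 450.0}

# Residual = aggregate minus the sum of the components.
residual_W3100_K = total_recorded_imports - sum(components.values())
print("W3100.K =", residual_W3100_K)       # 50.0 in this illustration
```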

Analysis and Verification of Results

1067. The purpose of the analysis and verification process is threefold. First, via the process, compilers may detect and correct any errors in data. Second, through the process, compilers obtain information essential for understanding BOP results and for explaining results to users. Third, by means of the process, compilers may identify weaknesses in existing data sources, methods, and procedures and subsequently modify them.

1068. For analysis and verification to be performed effectively by compilers, steps in the process should be carefully designed. Analysis and verification take place at a number of stages in the preparation of BOP statistics, and these activities may be performed by different people. Analysis and verification should be performed: (a) when data are received from data providers and entered in the database; (b) after worksheets are prepared; and (c) when statistics are prepared for publication.

1069. The first and most basic step in the analysis and verification process is to ensure that data have been transcribed and/or transferred correctly from the source document to the database and from series to series within the system.

1070. The second step is to establish that data provided by survey respondents are accurate. This step is described in paragraphs 926-927 of chapter 18.

1071. The third step is to examine logical and arithmetical relationships between aggregate series to ensure that series are logically consistent. Such an examination would include, for example, checking that components add to totals and that series on stocks of external assets and changes in those stocks are arithmetically consistent.
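
A minimal sketch of such arithmetical checks is shown below. The function names, tolerance, and figures are assumptions introduced only for illustration.

```python
# Sketch of two arithmetical consistency checks: components add to totals, and
# stocks are consistent with recorded changes. Names and figures are hypothetical.
TOLERANCE = 0.5  # allowance for rounding in the published unit of magnitude

def check_components_add_to_total(total, components):
    """Flag a discrepancy if the components do not add to the total."""
    discrepancy = total - sum(components)
    return abs(discrepancy) <= TOLERANCE, discrepancy

def check_stock_flow_consistency(opening_stock, transactions, valuation_changes,
                                 other_adjustments, closing_stock):
    """The closing stock should equal the opening stock plus all recorded changes."""
    derived_closing = opening_stock + transactions + valuation_changes + other_adjustments
    discrepancy = closing_stock - derived_closing
    return abs(discrepancy) <= TOLERANCE, discrepancy

print(check_components_add_to_total(2500.0, [1200.0, 800.0, 450.0]))        # (False, 50.0)
print(check_stock_flow_consistency(10000.0, 400.0, -150.0, 0.0, 10250.0))   # (True, 0.0)
```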

1072. The fourth step is to compare BOP data with related data from other sources, such as data published in ITS and data published on external positions of banks.

1073. The fifth step is to examine the behavior of series—in source, worksheet, or publication modules—over time. This examination may consist of a formal analysis of movements or a visual review of the series to detect significant changes between adjacent periods. The appearance of an unusual development in a series during a period may indicate an error in data, some unusual feature or development that should be drawn to the attention of users as an aid to understanding BOP developments, or some form of “statistical noise.” The latter indicates that the result for a period may be correct or incorrect but insufficient information exists to make a definitive diagnosis. Obviously, collection and compilation procedures that produce statistical noise should be isolated and modified so that changes are fully explained and the incidence of statistical noise is minimized.
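
The sketch below illustrates one simple form of this examination: flagging observations whose change from the preceding period exceeds a chosen threshold. The threshold, series name, and figures are hypothetical; in practice, the compiler would tune such rules to the behavior of each series.

```python
# Sketch of a simple period-to-period movement check on a time series.
def flag_unusual_movements(series, threshold=0.25):
    """Return periods whose relative change from the previous period exceeds the threshold."""
    flagged = []
    periods = sorted(series)
    for prev, curr in zip(periods, periods[1:]):
        if series[prev] != 0:
            change = (series[curr] - series[prev]) / abs(series[prev])
            if abs(change) > threshold:
                flagged.append((curr, round(change, 2)))
    return flagged

travel_credits = {"1995Q1": 210.0, "1995Q2": 220.0, "1995Q3": 340.0, "1995Q4": 225.0}
print(flag_unusual_movements(travel_credits))  # [('1995Q3', 0.55), ('1995Q4', -0.34)]
```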

1074. The sixth step is to examine empirical relationships between BOP series over time. Many of these relationships are discussed in chapters 11 through 16. Some of the better-known relationships are freight on imports as a percentage of imports, investment income as a percentage of stocks of external financial assets and liabilities, and financial fees related to the level of financing activity. Sometimes, what appear to be unusual movements in series may be adequately explained when the series are compared to related series. BOP ratios shown in table P10 in appendix 3 and explained in paragraphs 1184-1186 of chapter 21 should also be examined during this step of the process.
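
As an illustration of one such relationship, the following sketch computes an implied yield, that is, investment income as a percentage of the average stock of the corresponding external liabilities. The figures are hypothetical; an implied yield far from prevailing market interest rates would prompt further checking.

```python
# Sketch of one empirical relationship: the implied yield on external debt
# liabilities (interest payable as a percentage of the average stock).
interest_debits = {"1995": 84.0}
debt_stock = {"1994": 1350.0, "1995": 1450.0}   # end-year positions

average_stock = (debt_stock["1994"] + debt_stock["1995"]) / 2
implied_yield = interest_debits["1995"] / average_stock
print(f"Implied yield: {implied_yield:.1%}")    # 6.0% in this illustration
```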

1075. The seventh step is to examine residuals, such as the net errors and omissions item, and any residual changes in stocks that are attributable to measurement problems (for example, the closing stock for one period does not equal the opening stock for the next). An examination of residuals may assist in isolating causes of error. If residuals are small, the compiler may be satisfied that data are reasonably accurate, although this may not necessarily be so, as there could be offsetting errors. In the case of large residuals, the compiler should look at the causes of possible error and resolve them. If large residuals are linked to reporting by a few units, a more intensive querying of those units should be undertaken to pinpoint the particular cause of discrepancies.

1076. The eighth step is to identify units with transactions making large contributions to certain BOP items and to ascertain the nature of these transactions. Identification is relatively straightforward when the compiler has direct access to unit record data. If this is not the case, the BOP compiler could approach the data provider, who could, in turn, query the nature of the transactions involved. Monitoring BOP transactions at the unit record level is helpful for understanding BOP developments and is an excellent method for ensuring quality by facilitating linkage of data on micro and macro activity. A related approach would be to match enterprise transactions from one data source with offsetting or associated transactions in other BOP sources.

1077. The ninth step is to assess BOP, IIP, and reconciliation series in the context of observed economic events. For example, imports could be expected to rise in periods when national income is rising. Depreciation in a country’s exchange rate should lead to increases in quantities of exports and reduced quantities of imports. Changes in foreign interest rates may be expected to have an impact on interest payable on floating-rate external financial assets and liabilities denominated in foreign currencies. Increases in the profitability of domestic companies might affect dividends payable to nonresident shareholders. Appreciation in a country’s exchange rate should reduce the value (in national currency) of a country’s external financial assets and liabilities to the extent that these are denominated in foreign currencies. Improvement in a country’s stock market should increase the value of portfolio investment equity liabilities. A change in a country’s foreign investment policy may lead to changes in direct investment transactions over time.

1078. Obviously, the possibilities for such comparisons are extensive. By looking at these relationships, the compiler may be better able to understand the meaning of changes in BOP and IIP statistics and to explain these developments to users. Also, by relating BOP and IIP items to general economic events, the compiler may identify errors or shortcomings in the compilation process and rectify them.

1079. The tenth step is to anticipate reactions of users and the media to published data. If, for example, the compiler discovers some unusual phenomenon (for example, significantly different statistics produced by alternative treatments) likely to create interest or concern among users, it may be advisable to discuss compilation issues and results with certain users in advance. Also, the compiler may be able to gain additional insight and avoid considerable discomfort by discussing results with users who have knowledge of developments in the external sector. Obviously, the compiler should approach users carefully as there may be restrictions on the pre-release of BOP results.

1080. Without compromising objectivity and integrity, it is important that the compiler anticipate potential user and media reaction to release of BOP statistics. Advance consideration of user and media perspectives should be helpful for determining the best way to present data—particularly in terms of explaining unusual results or changes to concepts or methods. Careful handling of published information can enhance the image of the compiler, who needs public support and respect if he or she is to perform effectively. Inadequate preparation for, and explanation of, data that convey unexpected results may suggest that the compiler does not understand the numbers being released and may undermine user confidence in the official statistics.

1081. For the analysis and verification process to be effective, data must be well organized. Basic data (source, worksheet, and publication data) should be assembled in a logical, comprehensive manner and be properly documented. The compiler will then be able to concentrate on the content of data to be analyzed. (Time spent developing an understanding of the idiosyncrasies of the processing system can greatly detract from effective analysis.) The analytical process will be facilitated if certain predetermined tables, which bring together key aggregates, ratios, and comparisons of BOP data with data from non-BOP sources, are produced as part of basic procedures. The compiler should also assemble any supplementary information that may provide useful explanations of events. This supplementary information may include analyses of major contributors to certain items, which may help satisfy the compiler as to the correctness of particular series.

1082. The process of analysis and verification is an iterative one. That is, the compiler analyzes data, identifies any sources of error, makes amendments, and repeats the process until he or she judges data to be ready for publication.

1083. The final part of the analysis and verification process is production of a report. The report should cover: (1) unusual features, trends, new developments, and anomalies in data; (2) conclusions from an analysis of the reconciliation with related data; (3) the impact of alternative treatments—if this is an issue; (4) an analysis of the effect of major contributors; (5) an assessment of likely user and press reaction; and (6) comments on matters pertaining to the quality of data sources, methods, and procedures. The objectives of the report are: (1) to inform higher level officers in the institution responsible for compilation of BOP statistics; (2) to provide information for preparation of a statement (to be included in a publication or distributed to users) on analysis of results; and (3) to identify factors that may require future modifications in data sources and procedures.

Preparing Data for Publication

1084. Steps in preparing data from the worksheet for publication are set out in illustration 20.7. As the diagram shows, each step may create feedback relevant to a previous step. For example, in preparing tables or commentary for inclusion in a publication, the compiler may detect errors or recognize anomalies that require re-initiation of an earlier step. Actual publication of BOP data is discussed in chapter 21.

Illustration 20.7

Preparing Data for Publication

Developing Computer Processing Systems

1085. The processes described in this chapter and in chapter 18 will be facilitated by the use of computer processing. These processes have been presented in a modular form that underlies the design of an effective computer processing system. Some of the processes can be developed by using a personal computer, although the assistance of a programmer may be required. For example, the collection register, population frame, and source data may be maintained, by using an off-the-shelf database software package, on a personal computer. A spreadsheet system could be used for worksheet tables, publication tables, and other tables prepared for analysis. Even collection forms could be processed with such technology. However, for larger or more complex systems (including those based on an ITRS), a mini or mainframe computer system (which would require the assistance of experienced programmers) may be necessary.

1086. Computer processing systems are designed in successive phases. After the completion of each phase, a report should be made to ensure that the design project meets the objectives of management in the relevant institution.

1087. The first phase consists of an overall assessment of requirements. Options, related development and maintenance costs, and tentative timetables are outlined. An attempt is made to quantify probable processing volumes and staffing options. The first phase concludes with selection of the most appropriate option.

1088. During the second phase, the importance of which should not be underestimated, a logical design is established for the development of a new system or the modification of an existing system. Existing systems should be thoroughly analyzed; many are modified to meet implicit objectives that are not clearly articulated. In the absence of existing systems, it may be appropriate to observe and evaluate processing systems (even those located abroad) used for similar purposes. Discovery of current problems requires careful examination of systems in actual operation rather than theoretical perceptions or projections of system operations.

1089. In the third phase, a physical design for the new system is articulated. The physical design encompasses logical requirements and available hardware, software, and human programming resources.

1090. The fourth phase is devoted to actual coding of the system, which should be undertaken by using a modular approach and by testing at all points.

1091. In phase five, the system is tested by users, and any necessary modifications are made. Rigorous testing is of paramount importance; inadequate testing can lead to disastrous results.

1092. Phase six consists of system implementation. Shortcomings in earlier phases usually become apparent at this time. Even if activities were properly performed in earlier phases, unanticipated problems may require considerable effort to correct.

1093. The seventh phase consists of system evaluation. The fundamental question is: does the system live up to its original specifications and, if not, why not? Valuable lessons may be learned for the future if this phase is completed effectively. It may be desirable to undertake two evaluation phases—one immediately after completion of the implementation phase and another at a future time (such as one year later) when the system has been modified to overcome initial operating problems.

1094. In designing and implementing a computer processing system, the compiler and programmer should work closely together; multi-disciplinary teams may be appropriate. It is useful for the compiler to obtain some degree of computer literacy, although he or she need not become a programming expert.

1095. Development of a basic, effective system is the primary goal. Unfortunately, much effort is often expended to design sophisticated processes that do not work well. Limiting initial system development to a manageable level should lead to fewer maintenance requirements. More sophisticated enhancements may be added in the future as the necessity for them becomes apparent.

1096. System maintenance requirements are often greatly underestimated, and it is important that sufficient resources are available for maintenance. The assumption that a new system will perform all desired functions is usually not realistic. It is only after the compiler has used the system that he or she develops a full understanding of his or her requirements. Therefore, the life of a computer processing system should be regarded as relatively short. The desirability of enhancements, changing requirements and priorities, and the emergence of newer technologies all contribute to the necessity for continually redeveloping computer processing systems.

1097. For the beginner, it is desirable to keep the functions of computer systems simple. It is better to have a computer system that performs basic tasks well than a system that fails in the performance of a wide range of tasks. As compilers gain experience with computers, they may take advantage of computer technology to develop computerized collection forms and to undertake more complex tasks.

Timetables

1098. The object of the BOP process is to collect, compile, and publish—on a timely basis—detailed and accurate BOP statistics that satisfy the requirements of a broad range of users. There may be trade-offs between timeliness, on the one hand, and quality and detail, on the other; user requests may be met by frequent, timely release of preliminary and less detailed data and subsequent publication of more detailed and more accurate data at greater intervals. The most appropriate publication strategy for each country will be determined by national circumstances.

1099. Timetables are an important part of the BOP compilation process and should be developed in consultation with users and with any support areas that are part of the compilation process. The capacity of data providers to supply timely information should also be considered so that timetables may be established on a realistic basis. Once timetables are set, every effort should be made to meet deadlines. Every person involved in the processing of BOP statistics, including those in support areas such as computer processing, should be aware of the timetable and their responsibilities for meeting it.

1100. There should be regular reviews and continuous monitoring of timetables. Unmet timetables are very frustrating to BOP users, and failure to adhere to the schedule causes the compiler to appear unprofessional. Potential areas for slippage should be identified, and rectifying action should be undertaken before problems arise.

1101. In order to improve the timeliness of BOP statistics, the compiler may assess the possibility of encouraging reporters to provide data more quickly. BOP and ITS compilers could explore ways of improving timeliness. The ITS compiler could, for example, be encouraged to produce broad aggregate results on a preliminary basis, that is, without stringent checks, such as price/quantity or unit value checks that more final figures would receive. These trade-offs should be discussed with users.

1102. In the case of an ITRS, the compiler could attempt to speed up the receipt of information by arranging for data to be transmitted electronically. In the case of ES, the compiler could ask more important respondents to transmit data by facsimile rather than by mail. Alternatively, respondents could be asked to provide preliminary estimates of the most significant items and to follow up with more complete data at a later stage.

1103. The compiler could consider making greater use of estimates. (Methods of estimation are discussed in chapters 11 through 16.) If the compiler is unfamiliar with the estimation process, users may provide some assistance if they have developed extrapolation and projection methodologies. The compiler could also consider alternative collection strategies. Replacing a full-enumeration survey with a sample survey could, with only a minimal impact on quality, improve the timeliness of results. When a certain BOP data item can be compiled from more than one source, the more timely source could be used to compile preliminary estimates.

1104. The compiler should also examine processing tasks in some detail. Poor procedures can cause delays. A proper analysis of procedures may identify ways to improve them.

1105. When establishing new procedures, the compiler should allow for several periods of implementation and testing. For example, if new procedures (such as the use of preliminary source data and a greater amount of estimation) are being developed, the compiler may, without publishing more timely data, simulate the new procedure during several test periods. Revised publication timetables could be introduced when any initial problems are overcome.

Resources

1106. Achieving a satisfactory balance between user demands and available resources is often difficult. An imbalance frequently produces major problems in delivering statistics that are timely and of good quality. Resolution of an imbalance requires careful cost assessment of each step in the statistical process. The compiler could prepare a comparative analysis of resource costs and user benefits. If more resources are required, the compiler should prepare a good case for obtaining additional resources and submit it, along with user statements of their requirements, to relevant authorities. Alternatives should be clearly shown in the presentation. For example, the compiler could draw up a matrix showing various combinations of frequency, detail, quality, and timeliness, and the resource costs associated with each. Resource cost should include professional and support staff, computing resources, office furnishings, and rents.

1107. In staffing a BOP project, it is important to have appropriately qualified personnel. Ideally, overall staff skills should include expertise in international economics and finance, accounting, collection design and management, statistical theory, computing skills, systems design, clerical and office skills, and different languages. Few individuals possess all these skills. Therefore, the compiler should recruit a balanced team. To work effectively, staff should be encouraged to interact and to share or exchange knowledge and skills.

1108. Another important qualification for BOP staff is the ability to learn quickly. Because of the complexity of the BOP conceptual framework, collection methodologies, and compilation processes, training must be assigned a very high priority. Training can take a number of forms, such as on-the-job training, internal training courses in the compiling organization, or external training courses at academic institutions or international organizations.

Assessing the Accuracy of Estimates

1109. Accuracy refers to the closeness of a measure to the true value of what it is attempting to measure. The true value of an activity is a notional value as the conceptual framework of the BOP is itself an abstraction. Individuals who carry out and/or record activities that the BOP framework is attempting to measure may not think of these activities in the same way as the BOP compiler.

1110. There are two requirements for accurate data. The first is that the conceptual framework and its rules, conventions, and definitions must be logically consistent and meaningful. The second requirement is that measurement of the activity must conform to the conceptual framework in terms of coverage of the activity, valuation, timing, classification, and treatment conventions and rules set out in the framework.

1111. The BOP compiler directs much effort towards ensuring accuracy by fully investigating the activity to be measured, further elaborating the conceptual framework if it does not adequately cover the activity being measured, designing and testing collection forms and processing systems, estimating or adjusting data to bridge the gap between the conceptual requirement and the measure used, regularly conducting quality control procedures, periodically evaluating the total process in terms of meeting its objectives, and—if need be—redirecting the statistical process.

1112. Most BOP processes are designed so that inaccuracies in data are detected and corrected. However, the compiler may identify errors, or the potential for errors, but not be able to correct them. In such cases, the compiler could develop subjective or objective assessments of the accuracy of BOP data; these assessments could then be made available to users or used as a guide for further enhancement of the compilation process.

1113. The compiler may know from discussions with data suppliers, or from errors discovered through data checks, that certain errors are likely to exist. The size of the errors may not be of sufficient concern, or available resources may not permit investigation and correction. In these circumstances, the compiler may form a subjective view of the level of inaccuracy.

1114. Properly designed sample investigations may provide a measure of certain errors. Such investigations attempt to compare what reporters should report with what they actually report. For example, the ITS compiler may investigate the incidence of exporters reporting ex-factory values rather than true f.o.b. values. These types of investigations may identify certain biases in reporting and, as a result, some adjustments may be made to data. Alternatively, such investigations may reveal that certain errors are hard to quantify or are small. As a result, the compiler may make no attempt to adjust the data. Therefore, sample investigations may provide objective data on accuracy or, alternatively, enable the compiler to form a subjective view about accuracy.

1115. When a sample survey is used to measure certain data, an analysis of sample error can provide a mathematical measure of accuracy.

1116. BOP estimates may be compared with data from other sources. For example, data reported by banks (in money and banking statistics) on the stock of external financial assets could be compared with data obtained in an ITRS or ES. Of course, the data should, if possible, be fully reconciled. However, there may be insufficient evidence to determine which source is more accurate. In these cases, the compiler can merely note that there are differences requiring further investigation. A discrepancy between two sources may provide an indication of the size of a possible error.

1117. Existing data sources may be used to derive alternative estimates. Certain assumptions may be made in the compilation process, and these may contain margins of error. It may be useful to investigate the impact of alternative assumptions. For example, the compiler could determine the impact of adopting different, but not necessarily invalid, assumptions for conversion of data from one currency to another. Different assumptions may also be made about undercoverage or non-response. Alternative estimates based on different assumptions may be compiled and compared with original estimates. The size of any differences would provide a measure of inaccuracy.
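
The sketch below illustrates this idea for currency conversion: the same flows are converted using monthly average exchange rates and, alternatively, a single quarterly average rate, and the gap between the two results is taken as one indication of the margin of error. All rates and values are hypothetical.

```python
# Sketch of compiling the same flow under two currency conversion assumptions
# and treating the difference as a measure of possible inaccuracy.
monthly_flows_fc = {"Jan": 100.0, "Feb": 120.0, "Mar": 110.0}   # in foreign currency
monthly_avg_rate = {"Jan": 2.00, "Feb": 2.10, "Mar": 2.25}      # national currency per unit
quarterly_avg_rate = sum(monthly_avg_rate.values()) / 3

converted_monthly = sum(monthly_flows_fc[m] * monthly_avg_rate[m] for m in monthly_flows_fc)
converted_quarterly = sum(monthly_flows_fc.values()) * quarterly_avg_rate
print(round(converted_monthly, 1), round(converted_quarterly, 1),
      round(converted_monthly - converted_quarterly, 1))        # 699.5 698.5 1.0
```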

1118. BOP items compiled from certain data sources could be compared with data compiled from alternative sources and methods. For example, if travel services are measured via an ITRS or ES, resulting estimates could be compared with estimates derived by using other methods, such as numbers of arrivals and departures multiplied by estimates of per capita expenditure. From such a comparison, some judgments may be formed as to the accuracy of existing sources.
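
A minimal sketch of such a comparison for travel credits follows. The ITRS/ES figure, the number of arrivals, and the per capita expenditure estimate are all hypothetical.

```python
# Sketch of cross-checking travel credits against an alternative estimate
# derived from arrivals multiplied by estimated per capita expenditure.
travel_credits_itrs = 540.0          # millions, from the ITRS/ES measure
arrivals = 1_200_000                 # visitor arrivals in the period
per_capita_spending = 400.0          # estimated average spending per visitor

alternative_estimate = arrivals * per_capita_spending / 1_000_000  # in millions
difference = travel_credits_itrs - alternative_estimate
print(f"Alternative estimate: {alternative_estimate:.0f}m, difference: {difference:+.0f}m")
# A persistent gap of this size (roughly 11 percent of the ITRS figure here)
# would prompt a review of the coverage of the existing source.
```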

1119. Comparison of BOP estimates with those for partner countries often reveals differences. The differences may be due to many factors, including the use of different conceptual frameworks. However, these comparisons may provide some insights on accuracy.

1120. An examination of the net errors and omissions item may also be useful. Different patterns in this item may provide insights into possible causes of errors in BOP statistics. A persistently large but stable positive (credit) or negative (debit) net errors and omissions item may suggest that coverage of certain credit or debit items is inadequate. A fluctuating but offsetting (from period-to-period) item may be evidence of timing differences on volatile items—such as financial account items or large, “lumpy” current account transactions. Large net errors and omissions that arise in periods of exchange rate fluctuation may suggest problems with methods of currency conversion used to compile accounts. Net errors and omissions that appear to change when the behavior of some items changes may be evidence of relationships that indicate inadequate coverage of certain types of transactions. For example, a positive net errors and omissions item coinciding with an increase in imports may suggest undercoverage of trade credit liabilities.

1121. Similarly, changes in economic circumstances or policies accompanied by changes in the net errors and omissions item may suggest some relationship. For example, there may be a large negative net errors and omissions item that could be attributable to unmeasured capital flight occurring after the introduction of a law requiring surrender of foreign currency receipts.

1122. As factors underlying fluctuations in the net errors and omissions item may be very complex, the compiler should obviously exercise care in interpreting this item. Also, the presence of a small net errors and omissions item does not necessarily mean that there are no major problems with the accuracy of BOP statistics. There may be offsetting errors, or certain transactions may not be measured at all.

1123. After performing the type of analysis described, the compiler should be able to publish some information on the quality of BOP estimates. In such a statement, the compiler could outline the issues involved, examine the strengths and weaknesses of data sources used, and evaluate the impact, upon quality, of broad data items in the accounts. It may be useful to present a table that shows the size of each broad item and a rating on the related degree of perceived accuracy. For example, the degree of accuracy could be (a) within 1 percent, (b) within 5 percent, (c) within 10 percent, (d) within 20 percent, and (e) larger. Another way of presenting information on quality would be to show, for each broad item, the perceived range of values applicable to the item. Subjective analyses of this type would provide users with an indication of the relative quality of data and serve as a useful tool in BOP analysis.

Issues Associated with Revisions

1124. Revisions to published BOP estimates are a common feature of many compilation systems and may indicate the quality of initial estimates. Initial estimates may be preliminary and subject to revision—that is, preliminary estimates are less accurate than subsequent estimates for the same reference period, which are considered nearer to the true value. Initial revisions to some estimates are often substantial; later revisions are generally less significant. Data from some systems tend to be less subject to revision than data from others. For example, ITRS and ITS data tend to stabilize fairly quickly, whereas enterprise survey results tend to stabilize somewhat more slowly.

1125. Revisions may be required for a number of reasons. Data from any source may be preliminary. To provide an early indication of the BOP result, reporters may supply data based on less than complete response or less than complete checking.

1126. Revisions may be necessary if reporters are tardy in providing data. Actual data may differ from non-response estimates, if any, made by the compiler.

1127. Data may be revised if later generations of estimates for a particular period are based on more complete coverage. For example, quarterly ES may be restricted to larger enterprises; only annual surveys or periodic benchmark surveys may cover all enterprises.

1128. Revisions may result from detailed examination of data. A good example is the reconciliation of stock position and flows data that many countries undertake.

1129. Survey reporters may discover errors in previously reported data and supply corrected information.

1130. Periodic evaluations of data quality may result in adjustments or revisions to data published or recorded previously.

1131. Estimation methods may be revised because of changes in methodology or the availability of benchmark survey results showing that previous methods of extrapolation or interpolation require modification.

1132. Changes in the conceptual framework, such as the treatment of an activity or its classifications, may require adjustments for previous periods.

1133. It is desirable for the compiler to publish information on the impact of revisions to the accounts. In the publication, the compiler could compare initial estimates with later generations of estimates for the same reference periods. The average absolute and actual size of revisions could be published.
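
The following sketch computes two of the summary measures suggested above: the mean (actual) revision, which indicates any systematic bias in initial estimates, and the mean absolute revision, which indicates their typical size. The initial and latest estimates shown are hypothetical.

```python
# Sketch of revision statistics comparing initial estimates with the latest
# estimates for the same reference periods. Figures are hypothetical.
initial = {"1994Q1": 980.0, "1994Q2": 1010.0, "1994Q3": 995.0, "1994Q4": 1030.0}
latest  = {"1994Q1": 1000.0, "1994Q2": 1005.0, "1994Q3": 1020.0, "1994Q4": 1055.0}

revisions = [latest[p] - initial[p] for p in initial]
mean_revision = sum(revisions) / len(revisions)               # positive = initial estimates too low
mean_absolute_revision = sum(abs(r) for r in revisions) / len(revisions)
print(f"Mean revision: {mean_revision:.1f}, mean absolute revision: {mean_absolute_revision:.1f}")
```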

1134. From database management and dissemination perspectives, there should be a policy on revisions. This policy could cover which issues of publications should include revisions, whether revisions to less than recent reference periods should be incorporated as they occur or less frequently, and whether small revisions to past data should be made at all.

1135. Frequent and large revisions are irritating to, and create work for, users. Methodologies should be developed to reduce the frequency of revisions. In other words, more attention should be given to getting the right answer on the first or second attempt. To achieve this, the compiler should examine causes of revisions and whether they could be overcome by, for example, increasing the frequency of collections, collecting the most important classifications more frequently, placing less reliance on infrequent benchmark surveys, speeding up quality control procedures, and improving estimation procedures for non-response and partial coverage. To introduce such improvements, greater resources may be required. A cost-benefit assessment may be in order, and user support could be obtained for improvements that would reduce revisions.

1136. However, the fact that revisions are bothersome to users is no excuse for failing to revise estimates. The BOP compiler’s objective is to publish the most accurate estimates possible, and revised estimates, to the extent that they are more accurate, should be published. A BOP compiler who does not revise estimates when he or she learns that published estimates are inaccurate may contribute to the development of economic policies based on misleading information.
