Developing a Flexible Data System for U.S. International Economic Activities
The major function of federal statistics is to inform public policy making. Thus, data collection agencies must understand the basic purposes of the data and build appropriate concepts and definitions into the design and development of the statistical framework. To ensure that the data collected are suitable for the purposes intended, data collection agencies must also analyze and interpret them.
In addition, since users—both public and private—understand the need for the data and are knowledgeable about the policies and programs for which the data are collected, statistical agencies need to work closely with users in developing the statistical framework and designing data collection and analysis. Such interaction would also make users aware of the availability and limitations of the data and enable them to make maximum use of the data for their purposes. Until now, the absence of a strong nexus between data collection and data analysis has resulted in the production of data of diminishing analytic usefulness or data that are not fully utilized.
Because the international trade environment will continue to evolve, continual interaction between data collection agencies and public and private users is essential to ensure that relevant data are collected and irrelevant data are not. Such interaction will promote flexibility in the data collection system. Moving toward a responsive system of data collection is critical to enhancing cost-effectiveness in compiling needed international trade and financial data.
In the early 1980s, when the merchandise trade statistics showed a surge in U.S. imports, there were alarming reports on the impending deindustrialization of the U.S. economy and the displacement of tens of thousands of U.S. workers. Later, when the U.S. merchandise trade deficit reached unprecedented heights, analysts predicted a “hard landing” for the U.S. dollar. At the same time, data on merchandise trade of semiconductors led to great concerns about foreigners' dumping computer chips in the United States to the detriment of the U.S. semiconductor industry and urgent calls for a U.S.-Japan semiconductor agreement to safeguard the U.S. industry. And, on a monthly basis, the U.S. stock market rose and fell with the release of data on the U.S. merchandise trade balance. The huge drop in the stock market in October 1987 was attributed partly to the increasing deficit reported during that period.
Also, in the late 1980s, data on foreign investment in the United States—including foreigners' production of automobiles, electronic products, and other manufacturing, wholesaling, and retailing activities in the domestic economy, as well as purchases of prime real estate—raised fears of foreigners' buying up the United States and of this country's losing control of its production and real estate. More recently, data showing slackening of inflows of foreign capital led analysts to predict a sharp rise in U.S. interest rates at the beginning of the 1990s.
The history of the past decade shows that most of these analyses were off the mark. The United States has not deindustrialized. The number of U.S. jobs has risen since the early 1980s (manufacturing jobs have declined, but service jobs have multiplied). The value of the dollar did decline in the mid-1980s, but it did not have a “hard landing.” At the same time, the U.S.-Japanese semiconductor agreement has not achieved many of its proponents' objectives. The stock market has recovered. Foreign direct investment in the United States has not taken over management of the U.S. corporate sector nor have foreign purchases of U.S. real estate come to represent a major proportion of U.S. assets. Moreover, U.S. interest rates have remained lower than those of most industrialized countries in recent years.
This recent history shows that data on U.S. international transactions have at times been wrong, misunderstood, or misused. The data have thus misinformed policy debates.
For merchandise trade (discussed in Chapter 4), the monthly
data are subject to large errors, especially on the export side. U.S. exports have been underreported, which may have overstated the nation's trade deficit. The merchandise trade balance can also fluctuate widely from month to month without any change in the underlying trend. Yet users have often interpreted those fluctuations as shifts in the underlying trend, which, in turn, has sent inappropriate signals to financial and exchange markets. There are also errors in the detailed merchandise trade data. Moreover, the monthly merchandise trade statistics lack a long-term orientation, limiting their usefulness for policy analysis and research.
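The difference between month-to-month fluctuation and underlying trend can be made concrete with a simple smoothing calculation. The sketch below uses invented figures to show how a three-month moving average damps swings that a single month's release would exaggerate:

```python
# Hypothetical monthly trade balances in billions of dollars; the figures
# are invented purely to illustrate noise around a roughly flat trend.
monthly = [-10.2, -8.1, -12.0, -9.4, -11.3, -9.0]

# A three-month moving average smooths out month-to-month swings that
# carry no information about the underlying trend.
smoothed = [round(sum(monthly[i:i + 3]) / 3, 2) for i in range(len(monthly) - 2)]
print(smoothed)
```

The raw series swings by several billion dollars from month to month, while the smoothed series stays close to its average; a user watching only single-month releases would read shifts in trend that are not there.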
For services transactions (discussed in Chapter 5), until recently substantial volumes of services trade were not included in the published data, making the U.S. current account balances inaccurate. There are still major international services that are inadequately covered, such as financial services. The development of data to cover new services has been hampered by limited analytic capabilities in the Bureau of Economic Analysis (BEA). Such constraints have kept BEA from improving concepts and methodologies (especially for determining how different types of services should be measured) and from refining survey questionnaires. BEA does not extensively analyze the data it collects or present analyses and interpretations to users, and users' access to data on U.S. international services transactions is limited. In addition, data compiled in large sector aggregates mask major changes in particular components; the lack of detailed data on U.S. international services limits analytic uses.
For capital flows (discussed in Chapter 6), data are marred by gross inadequacy in coverage—especially on new modes of capital transactions and new types of financial instruments—as well as by outdated assumptions on asset valuations and estimation methods. These inadequacies have adversely affected the accuracy of the data. In addition, data on capital flows are not well synthesized and organized in usable formats, further limiting their usefulness. BEA and the Treasury Department have yet to exploit fully the analytic potential of the data they do collect and to disseminate them broadly to users. Existing data are not regularly reviewed by BEA and Treasury to determine if additions, deletions, or other modifications are necessary.
More important, as discussed in Chapter 1, the traditional balance-of-payments framework is inadequate for analyzing issues arising from the changing global environment. Closing data gaps, improving data adequacy, and anticipating new data needs call not only for increased research, analysis, and evaluation on the
part of data collection agencies, but also for enhanced communications between statistical agencies and data users.
ROLE OF STATISTICAL AGENCIES
STRENGTHENING RESEARCH AND ANALYTIC CAPABILITIES
In analyzing ways to improve the federal statistical systems over the past decade, various experts have all separately come to the same conclusion: to produce relevant economic statistics of high quality, statistical agencies must strengthen their analytic capabilities. The importance of analysis and research in the production of federal statistics is obvious. Schultze (1988) states that since federal statistics generated should primarily serve policy making, the specification of data needs for economic policy must originate from economic and social research agenda. Triplett (1990) expands on this theme, noting that today's research needs often drive tomorrow's analytic efforts. Similarly, the Juster (1988) report on the quality of economic statistics, prepared under the auspices of the American Economic Association, concludes that since research points to emerging needs for economic data, the role of analysis and research is the single most important contribution toward making statistical agencies more responsive to emerging needs. Lipsey (1990) observes that BEA's recent improvements in data on international services transactions represent a response to continued academic complaints about deficiencies of the data. In addition, Congress and the administrative agencies became aware of the data shortcomings when services trade was put on the agenda for the Uruguay Round of the General Agreement on Tariffs and Trade negotiations. Similarly, BEA's recent introduction of current-cost and market-value methods in revaluing the U.S. net international investment position is a response to researchers' discontent with book-value data of U.S. direct investment abroad and U.S. gold reserves, given inflation over time and significant changes that have occurred in the values of gold and other U.S. assets abroad.
Experts have pointed out that the main reason that statistical agencies respond slowly to users' data needs is that the agencies consider their main task to be producing the monthly or quarterly data on time. There are often few professional analysts in the agencies, and when agencies have such professionals, they are often not closely involved in the agencies' program planning and development. Enhancing the quality and relevance of federal statistics calls not only for skilled personnel to staff the research and development function, but also for close linkage between the research output and the ongoing programs of the statistical agencies (Cole, 1990). One area for which such efforts are needed is a thorough review of the scope, concepts, and methodology currently used to compile international trade and finance data, as well as an upgrading of domestic economic data and the national accounts to reflect the transformation of the U.S. economy.
DETERMINING DATA NEEDS
The panel solicited the views of public and private users on the adequacy of the existing data on international transactions and heard from persons representing more than 100 organizations, covering a wide spectrum of government, business, academic, and other activities (see Appendix B). Although we asked users to identify both currently unmet needs and anticipated future needs for foreign trade data, most of the responses concentrated on the present. A few of the unmet needs described by users referred to data that are currently available, indicating that some users lack complete knowledge of existing data sources. Publication of a broad catalogue of available foreign trade data from all sources, including other countries, international bodies, and private organizations, might lead to better use of existing data.
Most of the comments on unmet needs had to do with new data that could be provided by modifying or expanding the Census Bureau's processing and publication of data based on official trade documents. Areas touched on most often were the level of commodity detail, data on low-value shipments, and data regrouped by states. Manufacturers and their industry associations frequently wanted more commodity detail than is presently provided by the Harmonized System because they would like to be able to monitor their market shares in specific commodities they produce. Several types of users saw a need for greater compatibility between merchandise trade and domestic production commodity data. Interest in low-value shipments, which are presently excluded from the detailed merchandise trade statistics, came from representatives of the transportation industry, especially the airlines. Requests for more detailed data on exports compiled by U.S. state of origin and imports by state of final destination were voiced by the National Governors' Association and state officials charged with promoting exports. Requests for other kinds of data were numerous and varied. Examples include data on affiliated trade, data on
trade and domestic production by nationality of ownership, expansion of the Bureau of Labor Statistics foreign trade price indexes to cover more commodity detail and bilateral trade, and information on tariffs and nontariff trade barriers. Most of these suggestions came from the economic research community.
The panel members, many of whom use foreign trade statistics, can understand the desire of users to have numerous new kinds of trade data. Nonetheless, meeting all such requests would be prohibitively expensive. High-cost data initiatives, such as expansion of state data and processing low-value shipments, would require especially strong justification.
The panel also asked users to comment on the costs of obtaining foreign trade data and on how they process the data before using them. The nature of the responses varied greatly, depending on the type of organization and its needs, resources, and awareness of the sources and nature of foreign trade data. Those who use foreign trade information in a more than casual way seldom acquire it in a form suitable for their purposes. The raw data must be processed to convert them into forms appropriate for making decisions, monitoring macroeconomic trends, understanding the determinants of trade flows, and other purposes. Such processing occurs in several ways: by extracting some data cells from a large data set; by calculating derived statistics, such as unit costs; by developing time series and performing seasonal adjustment; by converting from one classification system to another; and by converting the data from hard copy (or microfiche) to electronic format, or the reverse.
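As a concrete illustration of such processing, the sketch below (with hypothetical commodity records and invented figures) extracts a few cells from a raw data set and derives a statistic, unit values, that the raw trade documents do not carry:

```python
# Hypothetical raw trade records; all commodities, partners, and figures
# are invented for illustration only.
records = [
    # (commodity, partner country, value in $ thousands, quantity in tonnes)
    ("semiconductors", "Japan", 5200, 13),
    ("semiconductors", "Korea", 2100, 6),
    ("textiles", "Japan", 800, 40),
]

# Step 1: extract the cells of interest from the larger data set.
semis = [r for r in records if r[0] == "semiconductors"]

# Step 2: derive a statistic the raw documents do not report directly,
# here the unit value (dollars of trade per tonne) by partner country.
unit_values = {country: value / qty for _, country, value, qty in semis}
print(unit_values)
```

Each of the other processing steps the text lists, such as seasonal adjustment or conversion between classification systems, follows the same pattern: the raw cells must be reshaped before they answer a user's question.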
Users have a range of options as to how much of the processing they do themselves. At one end of the spectrum, they can purchase raw cross-sectional data from the Census Bureau (and its counterparts in other countries, if needed) and do all of the processing necessary to convert the data to meet their needs. At the other end of the spectrum, a private-sector organization with minimal facilities for processing data might pay a consulting firm to prepare a market analysis of international or bilateral trade in specific commodities of interest, receiving only the finished product. Some type of cost-benefit analysis is implicit in the strategies that users adopt to decide what information they need and how best to acquire it. Since it is in the users' interest to minimize their costs, it is not surprising that many would like the primary and secondary producers in governments and international organizations to add more value to raw data, preferably with no increase in user charges.
To determine the data needs of the private sector and how much information the federal government should provide, there is a need for new mechanisms to guide federal statistical policy. Historically, the division of labor has had extraordinary effects on productivity, partly because a market system is an effective method for the coordination of the specialized activities of producers and consumers. But in the absence of a market or other means of coordination, the productivity gains from specialization can be more than offset by coordination errors. In our judgment, the division of labor between statistical agencies and data users in the private sector has gone far beyond the point at which the gains from specialization exceed the costs of coordination failures. Indeed, there is no organized system by which users communicate their needs to data collection agencies. Loud user complaints do seem to stimulate responses, but volume and value are not necessarily highly correlated. Substantial changes in institutions are needed to coordinate the demand for and supply of data.
Data cannot be supplied by a competitive market system because of the cost advantages of centralization and because use of data by one party does not exclude use by another party. In the language of economics, information is a public good. But when the benefits of a data set accrue primarily to clearly definable specialized groups, it is appropriate that these groups pay for the data. In some cases, these groups should look to private vendors. For example, shippers need detailed information on merchandise shipments between cities of the United States and elsewhere in the world. We do not think there is a compelling argument that the federal government should respond to that need.
When a data set serves the public good and when the federal government has ongoing activities that make it the low-cost provider, then it is the provider of choice. Groups that gain substantial private benefit from the data should contribute to the costs of collection. Moreover, willingness to pay is a clear signal of the value of data. Generally, the benefits will accrue not only to specialized groups, but also to federal decision makers and to the public. In that event, cost sharing should be the rule. For example, data on international transactions of enterprises within individual states serve the needs of state governments, and they also are important for studying the regional consequences of federal commercial and migration policy; the cost of collecting these data should therefore be borne by both state and federal governments.
The costs of collecting much economic data cannot be passed
on to users because of the public nature of the data. At the same time, willingness to pay (i.e., markets) cannot necessarily direct the federal government to collect data that are valuable. It is not easy to create institutions other than markets that guide data collection. If willingness to pay is not a sufficient signal, there must be substantial, organized, and ongoing communication between those who collect the data and those who use them. One possibility would be an annual conference that brings users together with those who are responsible for collecting the data. Another possibility would be to establish an advisory body to guide the development of data concepts and frameworks and help set priorities for the data collection efforts with the highest payoffs. Such a body could also review programs of the statistical agencies and monitor their progress toward accomplishing overall program goals. The function of the advisory body would complement rather than substitute for in-house research and analysis. Whatever the mechanism, it must foster an ongoing interaction between data collection agencies and data users.
COMMUNICATING DATA QUALITY TO USERS
An important symptom of the communications failure between data users and data collectors is the absence of clear measures of uncertainty accompanying the data on international economic activities provided by the federal government. In our judgment, those who collect the data and those who use the data need to be cognizant that the data are not perfect measures, but only estimates. An estimate without a standard error or some other measure of uncertainty is not fully satisfactory. Collectors of data currently publish statistics as if they were perfect measures, and users generally rely on them as though they were perfect. There is strong demand for more data, but little demand for more accurate data. Yet major misinterpretations can easily occur because of lack of knowledge about data quality.
Ideally, statistical data should be accompanied by standard errors or other measures of uncertainty in a profile delineating their limitations. The reporting of such limitations might well stimulate evaluations that in turn could substantially improve the allocation of scarce resources for data collection. When the reported standard errors or other measures of uncertainty of particular data are so large that users find the numbers virtually valueless, for example, users would be likely to demand more accurate data. And a perfectly legitimate response of federal data collectors when
an acceptable level of accuracy cannot be achieved at reasonable cost would be to cease collecting the data. More generally, data users and data collectors should assess how resources could best be allocated to reduce uncertainty in the data. Changes in operating procedures that are costly and likely to have little effect on data quality should not be undertaken.
Attaching standard errors or other measures of uncertainty to international economic data is not easy, in part because much of the uncertainty in the data is not due to sampling uncertainty, with which statisticians are used to dealing. The larger source of error is response bias of various forms, the size of which often has to be guessed rather than formally estimated. We believe that those who collect the data do have an understanding of those probable biases. They know that the quality of data on exports of travel services is much more uncertain than that on exports of textiles, for example: they should provide users with the benefit of this wisdom. The fact that these standard errors or other measures of uncertainty are only guesses will be disconcerting to those who think of the data currently reported as perfect measures. But those who understand how the data are compiled will not be unduly disturbed by the guesswork involved in selecting standard errors and other measures of uncertainty.
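One simple way to fold a judged bias into a published uncertainty figure (offered here only as an illustrative convention, with invented magnitudes) is to combine the formally estimated sampling error with the compilers' guessed bias in quadrature:

```python
import math

# Sampling error can be estimated formally from the survey design; response
# bias usually has to be judged by the compilers. Both magnitudes below are
# invented for illustration.
sampling_se = 1.2   # billions of dollars, formally estimated
judged_bias = 3.0   # billions of dollars, an informed guess by the compilers

# Combine the two components in quadrature to obtain a single overall
# measure of uncertainty for publication alongside the estimate.
total_uncertainty = math.sqrt(sampling_se ** 2 + judged_bias ** 2)
print(round(total_uncertainty, 2))
```

In this invented case the judged bias dominates the published uncertainty, which is exactly the situation the text describes: the larger source of error is the one that must be guessed rather than formally estimated.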
There are other ways to communicate limitations of data to users. If the data are transmitted in machine-readable form, it is possible to attach files that contain the associated standard errors or other measures of uncertainty. The print media offer some more creative ways to communicate the inaccuracy of the data. One possibility is illustrated in Table 3-1, which contains part of the 1989 balance-of-payments data of the United States. The numbers in the table in which the statistical agency has confidence are reported in boldface; others appear in italics. This typography dramatically makes the point that imports are more accurately measured than exports, and that merchandise trade data are more reliable than those for services trade.
Some convention has to be adopted to translate a standard error into a type of print. For example, a digit could be regarded as accurate if it would not be changed if the overall figure were changed by one standard deviation. If merchandise exports were measured to be $360.465 billion with a standard error of $23 billion, a change of one standard deviation would not affect the first digit, but it would affect all the others. Thus the printed number would show the leading 3 in boldface and the remaining digits, 60.465, in italics. We do not endorse any particular solution to the problem of communicating the inaccuracy of the data. There are many other schemes that might be adopted that could as well or better communicate the accuracy of the data. In documents printed in color, the colors red, yellow, and green could indicate increasing accuracy. Our point is that it can be done in informative ways, and that it should be done.

TABLE 3-1 Current Account Balance, U.S. Balance of Payments, 1989 (in billions of dollars)
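The digit-by-digit convention described above can be sketched in code. The function below is a hypothetical illustration, not an agency procedure: it marks as uncertain every digit that would change if the figure moved by one standard deviation in either direction.

```python
def classify_digits(value, se, decimals=3):
    """Mark each digit of `value` that survives a one-standard-deviation
    shift; uncertain digits are replaced by '?'. In print, the stable
    digits might be set in boldface and the uncertain ones in italics."""
    base = f"{value:.{decimals}f}"
    lo = f"{value - se:.{decimals}f}"
    hi = f"{value + se:.{decimals}f}"
    # Right-justify so that digit positions line up across the three strings.
    width = max(len(base), len(lo), len(hi))
    base, lo, hi = base.rjust(width), lo.rjust(width), hi.rjust(width)
    marks, stable = [], True
    for b, l, h in zip(base, lo, hi):
        if b != l or b != h:
            stable = False      # once one digit moves, all later digits are suspect
        if b in "-. ":
            marks.append(b)     # keep signs, decimal points, and padding as-is
        else:
            marks.append(b if stable else "?")
    return "".join(marks)

# Merchandise exports of $360.465 billion with a $23 billion standard error:
# only the leading 3 survives a one-standard-deviation shift.
print(classify_digits(360.465, 23.0))
```

Under this convention the example figure prints as "3??.???": only the first digit would be rendered in boldface, with everything after it flagged as uncertain.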
There are three ways in which data users can find out more about the behavior of the economy: gather more data; gather more accurate data; or determine the accuracy of the data. In part because the federal data producing agencies do not emphasize the reporting of data limitations, only the first of these ways generally has been considered. But if standard errors or other measures of uncertainty accompanied the data, there would also be a demand for more accurate data. Once users became familiar with the inaccuracy of the data, they would realize that there is a third route to a better understanding of the economy: more accurate knowledge of the inaccuracies in the data.
Recommendation 3-1 The Bureau of Economic Analysis, the Treasury Department, and, especially, the Census Bureau should strengthen their research and analytic capabilities so that
they can develop proper concepts and methods to measure U.S. international economic activities. The Census Bureau and the Bureau of Economic Analysis should work jointly to develop concepts, definitions, measures, and strategies for capturing in more detail the rapidly growing intracompany trade and trade in intermediate inputs. The Treasury Department and the Federal Reserve should jointly explore similar improvements to cover portfolio transactions and the flow-of-funds accounts. In addition, efforts should be made to measure more accurately the prices of international transactions and other dimensions of competitiveness, including taxes, costs, and rates of return, as well as to improve constant-price measures of exports and imports. These agencies should seek outside professional advice from analytic users and other experts in these efforts.
Recommendation 3-2 Sufficient information in the form of data quality profiles should be provided to users to help them to evaluate the quality of the data.
Recommendation 3-3 An advisory body should be established to guide long-term developments as the international trade environment continues to evolve and transactions become increasingly complex. This advisory body should be composed of experts from industry, academia, and government; it should include research and analytic data users, data filers, and respondents to government surveys, as well as user agency officials. Priority should be given to the development of timely, accurate, relevant, and cost-effective data for public policy making.
Recommendation 3-4 For data that primarily benefit specialized groups that can be clearly defined, market mechanisms should be developed to coordinate the supply of and demand for the data. When the benefits of the data accrue largely to specialized groups but also to public policy makers, cost sharing should be the rule.
Recommendation 3-5 Nongovernment users should be given greater access to trade and other international economic data compiled by the federal government.