
Macroeconomic Modeling
and Forecasting
LAWRENCE R. KLEIN
ORIGINS OF THE SUBJECT
Historical research may uncover some obscure roots or primitive example
of macroeconomic models, estimated from numerical records and meant to
be realistic, but I doubt that anything clearly in the spirit of the present
effort can be found prior to the investigations of J. Tinbergen, first for his
own country, the Netherlands, in the early 1930s, and later for the League
of Nations' analysis of the economy of the United States, intended for a
better understanding of the Great Depression (Tinbergen, 1939).
Tinbergen's contribution was to show that the essence of the macro
economy could be expressed compactly in an equation system, that these
systems could be fit to real-world data, and that revealing properties of the
economy could be analyzed from such systems. To a large extent, Tinbergen
was interested in the cyclical properties of the system. That was his main
reference point in studying the American economy in the context of the
stock market boom of the 1920s, followed by the Great Crash and recovery
during the Great Depression of the 1930s. He was plainly impressed and
inspired by the implications of the Keynesian Revolution, but his greatest
work on econometric models, that of the United States for the League of
Nations, was never put to practical use in the national scene. His model
estimation for the Netherlands formed the basis for Dutch postwar economic
policy implementation at the Central Planning Bureau, which he directed
after World War II.
There was a hiatus, naturally, caused by the war, but during the closing
months of 1944, J. Marschak assembled a research team at the Cowles
Commission of the University of Chicago. The Cowles Commission was
founded and supported by Alfred Cowles, originally for the study of security
prices and eventually for the investigation of mathematical and statistical
methods in economics. I was recruited for the Cowles team for the express
purpose of taking up the thread of work initiated by Tinbergen. Several
lines of thought converged at the Cowles Commission during the middle
and late 1940s. These were:
· The concept of a mathematical model of the macro economy
· An emerging theory of econometric method
· A growing body of statistical data on the national economy
The macro model concept built on the intense intellectual discussions
among economists about the interpretation of the theories of J.M. Keynes
in The General Theory of Employment, Interest and Money. The mathematical
formulations of that theory by J.R. Hicks (1937) and O. Lange
(1938) formed the basis for a whole new way of thinking about the aggre-
gative economy. F. Modigliani had written a provocative article extending
the Keynesian analysis (Modigliani, 1944), and I had just completed the
dissertation draft of The Keynesian Revolution in 1944. The mathematical
models in these writings lent themselves well to manipulation to study the
movement of principal magnitudes of the economy and were formulated in
terms of measurable concepts. It was essentially a case of "actors in search
of a play." The Keynesian theory was simply crying for econometric im-
plementation.
In a miniature, condensed, and aggregative sense, the Keynesian theory
was a simultaneous equation version of the Walrasian system for the econ-
omy as a whole. In the nineteenth century, L. Walras, professor at Lau-
sanne, formulated a view of the economy as a result of the solution of a
set of simultaneous equations. His set referred to the detailed micro economy
and, conceptually, to an enormous system of n equations in n unknowns,
where n is as large as the goods and services of an economy are numerous,
i.e., in the millions, or billions, or even larger. At a macroeconomic level
the Keynesian economic theory recognized the simultaneity clearly. For
example, it was noted that aggregate income affected aggregate spending
in the economy, while at the same time aggregate spending affected the
generation of aggregate income. But statistical practice did not take this
simultaneity properly into account. This idea was exploited by T. Haavelmo,
much inspired by A. Wald, who contributed a great deal of the statistical
thinking as far as probability and the laws of inference are concerned-
and also the dynamics. Two important papers were produced that shaped
the statistical approach of the Cowles Commission team (Haavelmo, 1943;
Mann and Wald, 1943). This approach has not flourished as much as the
approach of building macroeconometric models for practical and theoretical
application, but it was instrumental in providing a deep understanding of
econometric method. The statistical approach was a moving force in the
formative days, even though it is not preeminent at the present time.
The actors and the play came together in the actual statistical calculation
of models. They began to emerge as early as 1946 and were first used by
the Committee for Economic Development in assessing economic prospects
for the postwar world. The emphasis was different from Tinbergen's. The
principal goal was to build models in the image of the national income and
product accounts for the purpose of making forecasts and for guiding eco-
nomic policy. The kinds of policy formulations implicit in Keynes's General
Theory were plainly operative in these systems.
A PERIOD OF EXPANSION
The history of the development of models during this period and in the
subsequent two decades or more is traced by Martin Greenberger et al.
(1976) and Ronald Bodkin et al. (1980). It consists of tracing the models
from the Cowles Commission at Chicago, to the University of Michigan
(Ann Arbor), Canada, the United Kingdom, and elsewhere up to the present
day, when there are literally hundreds in daily operation all over the world.
The major actors in this history were, in addition to the author, Colin Clark,
Daniel Suits, Arthur Goldberger, R.J. Ball, Otto Eckstein, J. Duesenberry,
and Gary Fromm (Clark, 1949; Klein and Goldberger, 1955; Suits, 1962;
Klein et al., 1961; Duesenberry et al., 1960).
During the period of enthusiastic development at the Cowles Commission
it was thought that applications of the most sophisticated and powerful
methods of statistical analysis would provide a breakthrough in practical
accomplishments, but complexity of computation remained a bottleneck.
Only demonstration models could be given a full statistical treatment, and
that was very laborious.
The use of cross-section data (surveys of individual economic units-
households and establishments), and finer units of observation in time series
(quarterly and monthly data), were looked to as other routes for a break-
through. Cross-section data were "noisy" although revealing. Monthly and
quarterly data were highly serially correlated but helpful in cyclical analysis.
Trend data, in the form of decade averages over long periods of time, were
also examined for possible leads, but they were too smooth to make an
enormous difference.
In the early 1960s a breakthrough did occur, in the form of the electronic
computer, which was harnessed to the needs of econometrics. In the 1950s
there were some early attempts at massive computer use, which worked
well for selected aspects of econometric computation; but it was only through
the use of the computer in successive stages that very significant achieve-
ments were realized. It is noteworthy that at a 1972 conference in honor
of John von Neumann's contributions to the development of the electronic
computer, held at the Institute for Advanced Study in Princeton, most of
the reports claimed that the computer was of only moderate significance in
advancing scholarly subjects in the sciences. Some of the contributions
bordered on the cynical side. But economics was an exception. In my own
paper, I claimed that the computer absolutely transformed research in quan-
titative economics and in econometric model building in particular.
The use of cross-section and sample survey observations in general has
a long history in econometrics, both theoretical and applied. These obser-
vations were used more for the study of building blocks at the micro level
of analysis, but as the computer became more available for econometric
research, it became more plausible to build models that evolved into com-
plete systems from the micro units. These are micro simulation models,
introduced at an early stage by Guy Orcutt (Orcutt et al., 1961). Large
segments of the total economy, if not the entire macro economy, can be
modeled in this way. Such systems are not as widespread in use as are
aggregative models of the economy as a whole, but progress in provision
of data, techniques of data management, and system computation will enable
model building to go in the direction of micro simulation systems in the
future. At the very least, they will be revealing for the study of particular
aspects of human behavior patterns because they get right down to the basic
agents of behavior.
By 1964 it was possible to compute full-information maximum likelihood
estimates of equation systems with 20 or more equations. The usual ex-
amples had dealt with systems of 3, 4, or 5 equations using hand methods.
The computational problem, as far as statistical method in econometrics
was concerned, was fully resolved during the 1960s. Programs for estimation
in nonlinear systems, autoregressive correction, estimation of distributed
lags, ridge regression, generalized regression, and many other
estimation problems were made available for routine use. All the bottlenecks
that had appeared earlier were suddenly broken. Econometric scholars were
able to handle data much better and explore data much more extensively
in the search for good estimates, but there was no seeming increase in
accuracy, efficiency, or applicability of econometric models from this par-
ticular line of research. The next real breakthrough came in connection with
some research on the Brookings model for the macro economy of the United
States. This was a quarterly model put together by a team of specialists
working under the auspices of the Committee on Economic Stability of the
Social Science Research Council. The work started in 1960, but a model
was not available until 1963 and was then transferred to the Brookings
Institution.
The principal problem was the computation of a solution to a system of
some 300 equations, which seemed very large at the time. After much
detailed experimentation, a method of solution was found in 1965. It was
a form of the well-known Gauss-Seidel algorithm, which proceeds by iterative
substitution of partial solutions from one equation to the next in a long
succession through a whole system. This method was tedious but inexpen-
sive and fast on an electronic computer. Although it involved a very large
number of calculations, it was accurate, efficient, and workable. It is now
a routine method used worldwide for solving systems of simultaneous equa-
tions in economics. Once the method had been streamlined for either non-
linear or linear econometric models, the technique of simulation was
extensively developed for the analysis of numerical models.
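The Gauss-Seidel procedure described above can be sketched in a few lines of Python. This is a minimal illustration, not the Brookings code: the toy two-equation Keynesian system and all of its coefficients are invented for the example.

```python
# Gauss-Seidel solution of a small simultaneous system: each equation is
# solved for "its" variable in turn, substituting the latest values of the
# others, and the sweep repeats until the changes fall below a tolerance.

def gauss_seidel(equations, start, tol=1e-8, max_iter=1000):
    """equations: dict name -> f(values) giving that variable's new value."""
    values = dict(start)
    for _ in range(max_iter):
        max_change = 0.0
        for name, f in equations.items():
            new = f(values)
            max_change = max(max_change, abs(new - values[name]))
            values[name] = new          # use the updated value immediately
        if max_change < tol:
            return values
    raise RuntimeError("did not converge")

# Toy model (illustrative): C = 20 + 0.6*Y;  Y = C + I + G, with I = 10, G = 30.
model = {
    "C": lambda v: 20.0 + 0.6 * v["Y"],
    "Y": lambda v: v["C"] + 10.0 + 30.0,
}
solution = gauss_seidel(model, {"C": 0.0, "Y": 0.0})
```

The iterative substitution converges geometrically for this system, illustrating why the method was "tedious but inexpensive and fast" on a computer.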
Use of this instrument was a breakthrough in the following senses:
1. Systems of any manageable size could be handled. A system of 100
equations was considered modest (for the first time), and systems of thou-
sands of equations are frequently used. Manageability is governed by the
capability of human operators to collect data, have it ready for use, and
understand the working of the system.
2. Economic concepts such as multipliers became generalized into al-
ternative policy simulations, scenarios, dynamic or static solutions, or sto-
chastic simulations. All of these enabled workers in the field to do much
more with models, to understand their dynamics and sensitivities. For policy
application in both private and public sectors, extensive simulation analysis
is essential.
3. Frequency response characteristics of dynamic systems could be stud-
ied. Eigenvalues in linear or linearized systems could be studied; methods
of optimal control could be applied.
4. The presentation of econometric results for the noneconometrician or
even the noneconomist became possible. Solutions of abstract mathematical
systems could be presented in the form of instructive accounting tables,
graphical displays, time patterns, and condensed reductions of large, com-
plicated systems.
5. Error statistics could be computed. With stochastic simulation meth-
ods probability limits on forecast error bands or regions could be evaluated.
Extensive recordkeeping for model performance became possible.
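The stochastic simulation of item 5 can be illustrated with a toy one-equation model; the model, the shock distribution, and all numbers below are invented for the sketch and are not from any actual system.

```python
import random

# Stochastic simulation error bands: solve the model many times with random
# draws for the error term, then take percentiles of the simulated outcomes
# as an approximate forecast band.

def simulate_forecast(mpc, autonomous, sigma, n_draws, seed=1):
    """Toy model GNP = (autonomous + shock)/(1 - mpc); returns ~90% band."""
    rng = random.Random(seed)
    draws = sorted((autonomous + rng.gauss(0.0, sigma)) / (1.0 - mpc)
                   for _ in range(n_draws))
    return draws[int(0.05 * n_draws)], draws[int(0.95 * n_draws)]

lo, hi = simulate_forecast(mpc=0.5, autonomous=100.0, sigma=2.0, n_draws=2000)
# the deterministic solution, 200.0, lies inside the simulated band
```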
The forms of calculation and analysis just listed were always possible. In
linear systems they could be expressed in closed form relationships for most
cases, but they could never have been done on a significant scale and would
have attracted only very few patient individuals to the field of applied
econometrics. Those of us who toiled for 2 or 3 days to make one alternative
analysis of a 20-equation system still marvel at the hourly accomplishments
of the computer with a model. We could never have dreamed, in 1945,
that we would be using models so intensively, so extensively, with such
an audience, or with as much precision as is realized today. The computer
is the key.
At the present time econometric software takes one from beginning to
end ("cradle to grave"). Data come in machine-readable form from the
primary, usually official, source. The data are managed by software de-
signed to take out seasonal variation, arrange in order for computation,
correct for inflation, and form transformations (ratios, sums, logarithms,
etc.). Estimation routines are put to work directly from updated data files.
The estimated equations are screened and selected for use in models. Sim-
ulation routines arrange the equations for dynamic solution in various forms
(stochastic, deterministic, steady state, short run). All these things are done
internally within the computer. Finally, tables, charts, and graphs are
generated as reports. This is the patterned sequence for the present computer
age.
Every breakthrough has its drawbacks. Today's econometrician can make
horrendous mistakes because the computer stands between the original
material and the final results. Modern investigators do not look at each
sample value with the same care that older econometricians used when the
data were personally handled. A certain number of initial mistakes or non-
sensical results have to be tolerated as payment for the enormous amount
of good material that is generated. Only the "old hands" know where to
look for troubles and avoid pitfalls, because the typical modern investigator
does not want to search into or behind the computer processing unit.
CONTRIBUTION TO THOUGHT
The computer is a facilitator, enabling us to produce econometric findings
on a large scale, presentable in convenient form, but it does not guarantee
good analysis or usage. It is now time to consider the impact of this
effort, not particularly from the viewpoint of the immediate users, but from
the viewpoint of scholarly thought.
Econometric methods are often used to test economic theory. The models
are themselves often, and preferably, based on received economic theory.
Whether or not they fit the facts of economic life should tell us something
about the validity of the underlying theory.
Some direct tests of economic theory have been decisive, but we often
come up against the fact that the data of economics, which form the sampling
basis for econometric inference, are not sharp enough or abundant enough
to come to decisive conclusions in many cases. More than one hypothesis
is often consistent with a given body of data.
We have, however, been able to reject the crudest and most simplistic
of theories. The data, and tests based on these data, have been conclusive
in rejecting crude acceleration principles or the crude quantity theory of
money:
I_t = aĊ_t + e_t
M_t = k[GNP($)]_t + u_t

where

I_t = net real investment in period t
Ċ_t = rate of change of real consumption during period t
M_t = money supply at time t
[GNP($)]_t = nominal GNP during period t
e_t, u_t = random errors
a, k = parameters
If we hypothesize that I_t is proportional to Ċ_t apart from an additive
random error that is obtained from a fixed distribution with finite variance
and no serial correlation, we would find that the data do not support this
model.
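As a sketch of this kind of test (with made-up data, not the historical series), one can fit the proportional form I_t = aĊ_t + e_t by least squares and inspect the residuals for the serial correlation the hypothesis rules out:

```python
# OLS fit of the crude accelerator through the origin, with a Durbin-Watson
# statistic on the residuals as one check of the no-serial-correlation
# assumption.  The data below are hypothetical, for illustration only.

def fit_proportional(x, y):
    """OLS slope through the origin: a = sum(x*y) / sum(x*x)."""
    a = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
    resid = [yi - a * xi for xi, yi in zip(x, y)]
    return a, resid

def durbin_watson(resid):
    """DW near 2 suggests no first-order serial correlation in residuals."""
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    return num / sum(e * e for e in resid)

cdot = [1.0, 2.0, 1.5, 0.5, 2.5, 3.0, 2.0, 1.0]   # hypothetical Cdot_t series
inv  = [2.1, 4.2, 2.9, 1.2, 5.1, 5.8, 4.1, 2.2]   # hypothetical I_t series
a, resid = fit_proportional(cdot, inv)
dw = durbin_watson(resid)
```

With real investment and consumption data, a poor fit or a DW far from 2 is the kind of evidence that led to rejection of the crude form.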
It is, of course, a simple model, and if it is generalized by replacing
Ċ_t by total real output and introducing other variables such as capital cost
and capital stock we get the generalized accelerator, which does appear to
fit the facts. That does not mean that we have proved that the generalized
accelerator is the only investment function that fits the facts; indeed, there
are others that are consistent with observed data, but we have been able to
reject the crude, and original, version of the accelerator hypothesis.
The same is true of the crude quantity theory. Both Milton Friedman's
generalization, introducing distributed lags in the price and real output
factors (components) of [GNP($)], and some extended liquidity preference
version of the Keynesian theory fit the facts. We cannot discriminate
definitively between the monetarist hypothesis of Friedman or the portfolio
hypothesis of Keynes, yet we have been able to reject the crude form of
the theory.
There are many examples like these, but it can legitimately be argued
that such testing does not take us very far because it leaves an embarrassingly
large number of competitive hypotheses still standing as possible expla-
nations of reality. When we go beyond the simple single relationship,
however, to fully determined models of the economy as a whole, we find
that it is not easy to put together an entire set of relationships that explains,
to an acceptable degree, the evolution of the macro economy. It is, ad-
mittedly, difficult to settle on a single set of composite relationships for
the economy as a whole, yet it is possible to find more than one that passes
the conventional statistical tests.
The econometric approach should be used, therefore, painstakingly and
slowly, to sift through this multiplicity of systems in replicated applications
and to test them by the strength of their predictive powers. It is usually the
case that particular models will work well on one occasion or another, but
it is not at all easy to find a system that will stand up to predictive testing
period after period. If an estimated model does predict well in repeated
applications, then we get some evidence about the validity of the hypothesis
on which it is based.
Some macroeconometric models that were thought to rest firmly on
accepted hypotheses, such as the St. Louis model, the Fair model, or various
supply-side models, did so poorly in predictive testing that they were deemed
failures (McNees, 1973; Fair, 1976; Andersen and Carlson, 1976).¹ Monetarist
economists were enthusiastic about their hypotheses in the late 1960s,
but when the St. Louis model, which was based on those hypotheses, came
up against the oil embargo, OPEC pricing, food price inflation, and the
automobile strike of 1969, its operators declared that it was not meant for
short-run prediction of the economy. This statement contradicted their orig-
inal hypotheses. Eventually the monetarists contended that the model could
be used for long-term but not for short-term analysis. In this case, it appeared
to fit the data of the economy for awhile, but with repeated use, observations
emerged that were not consistent with its underlying theory. The same is
true of the original Fair model.
As for the supply-side models that were introduced in the late 1970s,
they were never tested against the facts, but when they were estimated and
confronted with the data of 1981-1982 they failed to stand up as maintained
hypotheses. More conventional models correctly predicted that this would
be the outcome.
Given only limited success for macroeconometric model analysis in test-
ing economic theory, how has it otherwise contributed to economic thought?
The main contributions have been to decisionmakers, the users of econo-
metric information in the public and private sectors. They are usually ad-
ministrators and executives in large organizations and enterprises. Legislators
also fall into this group. In a sense, the utility of these models is indicated
by the degree of their use. There are systematic records of their forecast
history but no systematic records of their performance in decisionmaking.
Since only one outcome is actually observed, it is impossible to judge their
accuracy with respect to unobserved alternatives that tend to be considered
by the decisionmaker in reaching a choice.

¹The use of supply-side models is quite new; fully documented citations will not be ready for
a few more years.
Decisionmakers say that they make better choices than they would oth-
erwise be making without the use of such models, and that econometric
models are the only tool available in a large variety of situations. This is
why econometric model-building activity is expanding so rapidly all over
the world.
To a large extent, the first models were national in scope and fitted
together with the emerging body of data on national income and product.
But many subnational and supranational macroeconometric models, for
industries, markets, or regions of a nation, are now either in use or being
built. Many of these are designed in integrated feedback mode with com-
prehensive macro models, while many are also built in satellite mode,
without feedback.
At the international level we now have many models connecting different
countries and regions in a macro world system. One of the first of such
systems was designed as an international trade model by J.J. Polak (1953).²
In 1969 Project LINK was initiated, with the objective of consistently
tying together models of the main trading nations to analyze the international
transmission mechanism (Ball, 1973; Klein et al., 1982). The project now
consists of the interrelated network of models for 25 industrial countries
(the Organisation for Economic Co-operation and Development, OECD),
8 centrally planned economies, and 4 regions of aggregative developing
country models. Work is under way to add more than 25 models of indi-
vidual developing countries. Project LINK is approaching, in the true sense
of the word, the status of a world model.
After the implementation of LINK in analyzing the world economy for
such things as oil price shocks and predictions of various global totals,
many other supranational systems have been designed: INTERLINK, by
the OECD; the TSUKUBA-FAIS Model by Tsukuba University, Japan; the
Multicountry Model of the Federal Reserve Board; the World Economic
Model of the Economic Planning Agency, Japan; the FUGI Model of Soka
University, Japan; and the World Model of Wharton Econometrics. These
systems vary according to size and focus. Some concentrate on exchange
rates and balance of payments flows; others are principally concerned with
trade. Some emphasize the long term; others the short term. Nevertheless,
they all have a common interest in the global outcome of economic activity
and will be used with increasing frequency in an economic world that is
becoming increasingly interdependent. Most of these systems were developed
during the past 10 years, and it is evident that many more will be
developed in the period ahead.

²Other early models were COMET and DESMOS (see Barten, 1981; Dramais, 1981).
Specifically, the systems are used to study:
· the exchange rate system
· trade liberalization or protection
· world business cycle analysis
· worldwide disturbances (oil, food, raw materials)
· international debt problems
· policy coordination among countries
· transfers of capital
· international migration
As new, equally pressing, issues arise, the models will be adapted to their
analysis.
SOME NEW LINES OF DEVELOPMENT
Econometric model building in the computer age has moved in the direction
of the large-scale (1,000-or-more-equation) system with many sectors,
rich dynamics, nonlinearities, and explicit stochastic structure. It has
never been viewed as an issue of "bigger is better"; it is mainly an issue
of detail. In large, detailed systems of this sort a main interest has been
the development of scenario analysis. This procedure generalizes the entire
concept of the multiplier, which is meant to show the relationship between
any particular endogenous variable (y_it) and any corresponding exogenous
variable (x_jt):

∂y_it / ∂x_jt   (other x's unchanged)
This general expression includes the original Keynesian multiplier
dGNP/dI = 1/(1 - mpc)

where

mpc = marginal propensity to consume
GNP = real gross national product
I = real investment (exogenous)
This simple multiplier expression is designed to show the GNP that would
be generated by an increment in fixed investment. For most countries the
GNP gain would outstrip incremental investment, making the multiplier
greater than 1.0. Conventional wisdom, derived from large-scale econo-
metric models of the United States, indicates that the multiplier, after taking
a much more elaborate system structure into account, is about 2.0 after a
period of about two years' sustained stimulus to investment.
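The arithmetic of the simple multiplier can be checked directly; the marginal propensity to consume of 0.5 below is illustrative only.

```python
# The simple Keynesian multiplier dGNP/dI = 1/(1 - mpc), and the same figure
# obtained as the limit of the round-by-round spending chain
# 1 + mpc + mpc**2 + ... .  The mpc value here is illustrative.

def simple_multiplier(mpc):
    return 1.0 / (1.0 - mpc)

def spending_rounds(mpc, rounds):
    """Partial sum of the geometric spending chain after a unit injection."""
    total, injection = 0.0, 1.0
    for _ in range(rounds):
        total += injection
        injection *= mpc
    return total

m = simple_multiplier(0.5)           # 2.0: GNP gain outstrips the injection
approx = spending_rounds(0.5, 40)    # converges to the same 2.0
```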
The scenario, as distinct from the multiplier, simply imposes changes
on a model. These changes can be at any place and any time; they may
alter any element or group of elements. Computer simulation enables the
investigator to compare a baseline or reference solution to a model system
with a scenario solution. Scenarios can be quite creative: the scenario of
disarmament, of harvest failure, of embargo, of stimulative policy, of
technical progress. Investigation is virtually unlimited. It is important to have
a credible and fully understood baseline solution; the scenario then produces
a discrepancy that reflects the investigator's creative inputs. This can be
quite revealing and is of the greatest importance for policymakers, deci-
sionmakers, or interested economists. There is unusual interest in a model's
forecasts, but there is as much interest in scenario analysis. Scenario analysis
permits rapid response to changing situations, such as abrupt shifts in policy,
embargoes, strikes, and natural disasters. It is also the natural tool for
planning.
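A baseline-versus-scenario comparison reduces, in miniature, to solving the same model twice and reporting the discrepancy; the two-equation model and the stimulus below are invented for illustration.

```python
# Scenario analysis in miniature: re-solve the same model with one exogenous
# input changed, and report the discrepancy from the baseline solution.

def solve(mpc, autonomous):
    """Closed-form solution of the toy system C = mpc*Y, Y = C + autonomous."""
    gnp = autonomous / (1.0 - mpc)
    return {"GNP": gnp, "C": mpc * gnp}

baseline = solve(mpc=0.5, autonomous=100.0)    # reference solution
scenario = solve(mpc=0.5, autonomous=110.0)    # e.g. a stimulus of 10 units
discrepancy = {k: scenario[k] - baseline[k] for k in baseline}
```

In a large model the re-solution is iterative rather than closed-form, but the logic of baseline, scenario, and discrepancy is the same.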
It is evident that the rapid, frequent, and flexible implementation of
scenarios would not have been possible without the use of the computer.
In addition, the report-presentation capabilities of the computer enable the
model operator to communicate final results with a high degree of expo-
sitional clarity. These applications on a large scale have been available for
only about 10 or 15 years. They are undergoing further development,
particularly for use with microprocessors. From the point of view of
development of thought, this aspect is likely to be mainly of pedagogical
significance.
But the use of scenario analysis in the formulation of economic policy
is leading in another direction that does have some methodological and
theoretical significance for econometrics. The formal theory of economic
policy was introduced some 30 to 40 years ago by J. Tinbergen and others.
Tinbergen drew a distinction between targets and instruments of policy.
The former are the subgroup of endogenous variables that policymakers
want to have at specified values some time in their planning horizon, while
the latter are the items of control among the exogenous variables that
policymakers can influence. For example, bank reserves are instruments
affected by the Federal Reserve system's open market operations that are
fixed at certain values in order to achieve policy targets such as various
money supply aggregates. The formal theory establishes the choice of in-
struments in relation to target goals in the framework of a loss or gain
function that the policymakers attempt to optimize, subject to the constraints
of the working of the economy, represented by a macroeconometric model.
We are not, by a long shot, near the point at which policy can be routinized
through this optimization process. We are not able to do what scientists
and engineers accomplish with applications of optimal control theory to the
operation of physical systems such as a boiler. We have borrowed many
useful ideas from the control theory literature, and the development of this
stream of econometric analysis is providing a deeper understanding of the
workings of models, especially inputs for exogenous variables over distant
horizons. These inputs are those that keep the system close to some a priori
targets.
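Tinbergen's target-instrument logic can be sketched as a small optimization; the toy reduced form linking instrument to target, the target value, and the search grid are all hypothetical.

```python
# Target-instrument framework in miniature: choose the instrument value that
# minimizes a quadratic loss around the target value of one endogenous
# variable, subject to a (toy) model of how the instrument affects it.

def model_outcome(instrument):
    """Hypothetical reduced form: target variable as a function of policy."""
    return 50.0 + 2.5 * instrument

def best_instrument(target, candidates):
    return min(candidates, key=lambda g: (model_outcome(g) - target) ** 2)

candidates = [x / 10.0 for x in range(0, 401)]   # grid from 0.0 to 40.0
g_star = best_instrument(target=100.0, candidates=candidates)
```

Real applications replace the grid search with formal optimal control over a full macroeconometric model, which is where the computational difficulties discussed below arise.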
Control theory calculations may become difficult if used in large-scale
systems. This has been the case, particularly, in large international systems
with multimodel components, as in Project LINK. For this reason simplified
and small systems are frequently used to facilitate the control theory ap-
plications to econometrics. This use is not meant to bring control theory
to the needs of actual policy application. Eventually, the cumulation of
knowledge and even further computational advances will make control
theory methods more suitable for use in large, state-of-the-art models.
Despite the advances in econometric model building and the growth in
its use, there are skeptics among professional economists. Some argue that
models are not sufficiently accurate although users do seem to appreciate
their accuracy to the extent that they find them important elements in their
own tool kits. They will undoubtedly continue to use macroeconometric
models unless and until something better, cheaper, and more convenient is
made available.
For some time, analysts who work with different systems have made
claims of either superiority, equality, or cheapness, but they have not
produced convincing evidence to support their claims. In some respects,
time-series models, which are based purely on sample data with no (or
minimal) underlying economic theory, claim to be an alternative. At present,
it may be said that time-series models, in single equations or systems of
equations, produce forecasts that are at least as good as macroeconometric
models for short time horizons, say up to six months or less. Time-series
models do not provide a flexible vehicle for scenario analysis, but they do
provide forecasts.
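A pure time-series forecaster of the simplest kind, an AR(1) fitted with no economic theory behind it, can be sketched as follows; the data series is invented for illustration.

```python
# A minimal time-series model: y_t = phi * y_{t-1} + error, fitted by OLS
# (no intercept, for brevity) and extrapolated from the last observation.

def fit_ar1(series):
    """OLS slope of y_t on y_{t-1}."""
    x, y = series[:-1], series[1:]
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def forecast(series, steps):
    phi = fit_ar1(series)
    out, last = [], series[-1]
    for _ in range(steps):
        last = phi * last
        out.append(last)
    return out

history = [100.0, 102.0, 103.0, 105.0, 106.0, 108.0]   # hypothetical data
path = forecast(history, steps=2)
```

Such a model produces forecasts but, as the text notes, offers no handle for scenario analysis: there is no exogenous variable to change.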
The contest between time series and macroeconometric models will con-
tinue on its present course, and it is plausible to believe that each has
something to learn from the other, but there is also another possible route
in which they may be developed together for construction of a better system.
It is first necessary to digress for the explanation of a controversial issue
in connection with application of large-scale models. After a model is
estimated from a typical sample of economic data, it is invariably found
that extrapolations outside the sample lose accuracy quickly. For one year or
two, a freshly estimated model may stay close to economic reality, but it has
never proved possible to estimate a model from a time-series sample and use
that model in extrapolation in an automatic way with zero (mean) values
assigned to the stochastic error terms. All such attempts have failed to pass
forecasting tests outside the sample. The reasons for this failure are:
1. data revisions between the estimation period, within the sample, and
the extrapolation period;
2. legal or institutional changes in the functioning of the economy;
3. the occurrence of an unusual disturbance (war, strike, embargo, nat-
ural disaster);
4. temporary behavior drifts.
One way of dealing with some of these problems in small systems is to
reestimate the model every period (as often as every quarter) before extrap-
olation. This is entirely possible in small systems but not in the large models
presently in use. Instead, common practice is to estimate nonzero values for
the stochastic component to bring the model solution close to reality for the
initial period before extrapolation. In general, this period will be one outside
of the sample. We line up the model so that it starts an extrapolation period
at the observed values. Its reaction properties are left unchanged unless there
has been a statutory or other known structural change.
The adjustments are made to equations on a nonmechanical basis, and
the criticism of model builders' practices is that they are changing their
systems prior to application by a method that is not purely objective. (It
may be better to say, "by a method that is not purely automatic," because
objective criteria are used in choosing the nonzero values for the stochastic
terms.)
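The constant-adjustment practice just described can be sketched in a few lines of code. The one-equation model, coefficient values, and function name below are purely illustrative and are not taken from any actual system:

```python
# Sketch of the "add factor" (constant adjustment) practice described above.
# Model, coefficients, and numbers are hypothetical, for illustration only.

def lined_up_forecast(a, b, y_last_observed, y_model_initial, horizon):
    """Forecast y_t = a + b*y_{t-1} + add, where the add factor is chosen
    so the solution starts at the observed value, then held fixed over
    the balance of the extrapolation horizon."""
    # Nonzero value assigned to the stochastic term in the initial period:
    add = y_last_observed - y_model_initial
    path = []
    y = y_last_observed
    for _ in range(horizon):
        y = a + b * y + add  # reaction properties (a, b) left unchanged
        path.append(y)
    return add, path

# Hypothetical estimates: the bare model predicts 98.0 for a period
# actually observed at 100.0, so the add factor is +2.0.
add, path = lined_up_forecast(a=10.0, b=0.9, y_last_observed=100.0,
                              y_model_initial=98.0, horizon=4)
```

The point of the sketch is that only the intercept of the solution path is shifted; the model's reaction properties are untouched.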
A suggestion by Clive Granger, who is an exponent of time-series meth-
ods, may indicate a fruitful alternative that is automatic. Granger suggests
that forecasts be made from linear or other combinations of models in order
to spread risk in an uncertain exercise. A different combination is the
following: Time-series equations can be estimated for each endogenous
variable of a model. These can be automatically recalculated on an up-to-
date basis as new observations are made available. Error values in an
extrapolation period can be assigned to each equation of a model so that
the time-series estimate of the normalized endogenous variable for each
equation is obtained.3 In other words, the model is automatically and ob-
jectively adjusted to "hit" the time-series estimate of the value of each
endogenous variable in the initial solution period. This adjustment is then
retained for the balance of the extrapolation horizon.

3A normalized variable is the variable that carries a unit coefficient in each stochastic equation.
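Granger's idea of forecasting from a combination of models can be illustrated with a small sketch. The inverse mean-squared-error weighting used here is one common way to form such a combination; it is chosen for illustration and is not the specific rule discussed in the text:

```python
# A minimal sketch of combining two models' forecasts to spread risk,
# in the spirit of Granger's suggestion. Weighting each model inversely
# to its historical mean squared error is an illustrative choice.

def combine_forecasts(past_errors_a, past_errors_b, forecast_a, forecast_b):
    """Combine two forecasts with inverse-MSE weights from past errors."""
    mse_a = sum(e * e for e in past_errors_a) / len(past_errors_a)
    mse_b = sum(e * e for e in past_errors_b) / len(past_errors_b)
    # The historically more accurate model receives the larger weight.
    w_a = (1.0 / mse_a) / (1.0 / mse_a + 1.0 / mse_b)
    return w_a * forecast_a + (1.0 - w_a) * forecast_b

# Hypothetical past one-step errors of a structural model (a) and a
# time-series model (b), together with their current forecasts:
combined = combine_forecasts(past_errors_a=[1.0, -1.0],
                             past_errors_b=[2.0, -2.0],
                             forecast_a=3.0, forecast_b=5.0)
```

Here model (a), with the smaller past errors, dominates the combination, pulling the combined forecast toward its own.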
Variations on the pure time-series model are provided by the so-called
"current quarter model," a system in which pure time-series estimates are
supplemented by estimates that make use of early reports on the macro
economy by using daily, weekly, or monthly advance estimates of key
magnitudes. A strong current-quarter model may provide better estimates
of initial values of endogenous variables than can a pure time-series model.
This area of research for improving the adjustment procedure is one that
is presently receiving attention and appears to be promising.
Other lines of development are in variable parameter models and in
generalized models that transcend the narrow scope of purely economic
relationships.
Parameters may vary over time, systematically or randomly. Methods
for dealing with these models have been proposed from time to time. There
is no immediate breakthrough in sight, but it is an area that merits much
attention.
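As a toy illustration of parameter variation over time, a coefficient can be re-estimated over a rolling window of observations so that its drift becomes visible. The function, data, and window length below are hypothetical and stand in for no particular proposed method:

```python
# Illustrative rolling re-estimation of a single slope coefficient in
# y = b*x (no intercept) by least squares over a trailing window.

def rolling_slope(x, y, window):
    """Re-estimate b over each trailing window of observations."""
    slopes = []
    for t in range(window, len(x) + 1):
        xs, ys = x[t - window:t], y[t - window:t]
        # Least-squares slope through the origin for this window.
        b = sum(a * c for a, c in zip(xs, ys)) / sum(a * a for a in xs)
        slopes.append(b)
    return slopes

# A drifting slope: b is about 2 early in the sample and about 3 later.
x = [1.0, 2.0, 1.0, 2.0, 1.0, 2.0]
y = [2.0, 4.0, 2.0, 6.0, 3.0, 6.0]
slopes = rolling_slope(x, y, window=2)
```

The estimated slope sequence moves from 2 toward 3 as the window passes over the later observations, which is the kind of systematic parameter movement such methods try to capture.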
As for enlarging the scope of models, we have attempted to bring in
more policy variables and introduce political reactions. This is particularly
the case for intervention in the foreign exchange markets in this era of
floating rates.
Economists tend to draw a line and delegate responsibility to demogra-
phers for birth, death, morbidity, and other population statistics; to cri-
minologists for crime statistics; to psychologists for attitudinal variables;
to political scientists for voting magnitudes, and so on. Usually, these
external magnitudes are classified as exogenous variables, but many interact
with the economic system in feedback relationships. Over the years the
scope of endogeneity has expanded. Many models generate age-sex-race
distributions of labor force, employment, and unemployment. The under-
ground economy, theft losses, and statistical discrepancies have not yet
been integrated with criminology theory, for example, but many possibilities
exist for going much further in the endogenous treatment of important
variables. Much more of the public sector is now endogenous than in the
earliest Keynesian models. The social science, legal, and engineering as-
pects of models need fuller integration and are likely to be handled that
way in the future.
REFERENCES
Andersen, L.C., and Carlson, K.M.
1976 St. Louis model revisited. Pp. 46-69 in L.R. Klein and E. Burmeister, eds., Econometric Model Performance. Philadelphia: University of Pennsylvania Press.
Ball, R.J., ed.
1973 International Linkage of National Economic Models. Amsterdam: North Holland.
Barten, A.P.
1981 COMET in a nutshell. Pp. 211-219 in R. Courbis, ed., Commerce International et Modèles Multinationaux. Paris: Economica.
Bodkin, R.G., Klein, L.R., and Marwah, K.
1980 Macroeconometric Modelling: A Schematic History and a View of Its Possible Future.
Paper presented to the World Congress of the Econometric Society, Aix-en-Provence,
France.
Clark, C.
1949 A system of equations explaining the United States trade cycle, 1921 to 1941. Econometrica 17 (April):93-124.
Dramais, A.
1981 Le modèle DESMOS. Pp. 221-234 in R. Courbis, ed., Commerce International et Modèles Multinationaux. Paris: Economica.
Duesenberry, J., Eckstein, O., and Fromm, G.
1960 A simulation of the United States economy in recession. Econometrica 28 (October):749-809.
Fair, R.
1976 An evaluation of a short-run forecasting model. Pp. 27-45 in L.R. Klein and E. Burmeister, eds., Econometric Model Performance. Philadelphia: University of Pennsylvania Press.
Greenberger, M., Crenson, M., and Crissey, B.L.
1976 Models in the Policy Process. New York: Russell Sage.
Haavelmo, T.
1943 The statistical implications of a system of simultaneous equations. Econometrica 11 (January):1-12.
Hicks, J.R.
1937 Mr. Keynes and the "classics": a suggested interpretation. Econometrica 5 (April):147-159.
Klein, L.R., and Goldberger, A.S.
1955 An Econometric Model of the United States, 1929-1952. Amsterdam: North Holland.
Klein, L.R., Ball, R.J., Hazlewood, A., and Vandome, P.
1961 An Econometric Model of the United Kingdom. Oxford: Basil Blackwell.
Klein, L.R., Pauly, P., and Voisin, P.
1982 The world economy: a global model. Perspectives in Computing 2 (May):4-17.
Lange, O.
1938 The rate of interest and the optimum propensity to consume. Economica 5 (February):12-32.
Mann, H.B., and Wald, A.
1943 On the statistical treatment of linear stochastic difference equations. Econometrica 11 (July-October):173-220.
McNees, S.K.
1973 A comparison of the GNP forecasting accuracy of the Fair and St. Louis econometric
models. New England Economic Review (September-October):29-34.
Modigliani, F.
1944 Liquidity preference and the theory of interest and money. Econometrica 12 (January):45-88.
Orcutt, G.H., Greenberger, M., Korbel, J., and Rivlin, A.M.
1961 Microanalysis of Socioeconomic Systems: A Simulation Study. New York: Harper.
Polak, J.J.
1953 An International Economic System. Chicago: University of Chicago Press.
Suits, D.B.
1962 Forecasting with an econometric model. American Economic Review 52 (March):104-132.
Tinbergen, J.
1939 Statistical Testing of Business-Cycle Theories, II: Business Cycles in the United States of America, 1919-1932. Geneva: League of Nations.