8
Theoretical Computer Science
Computer science is a young discipline, and its theoretical base
is immature. Potential applications of computers even to known
problems will be limited until that base of theory is more fully
developed. Theoretical computer scientists are concerned with the
intrinsic limits to solving problems.
Since computer science is an artificial science (Simon 1981), the-
oretical computer science plays a very different role within computer
science than, say, theoretical physics plays within physics. Theoreti-
cal physicists seek to understand the physical universe, which exists
independently. Theoretical computer scientists seek to understand
all possible architectures or algorithms, which computer scientists
create themselves.
The concepts and results of theoretical computer science have in-
fluenced many disciplines. In particular, scientific computing, which
has been referred to as the third leg of science (together with ex-
perimental and theoretical science), uses algorithms developed by
computer scientists. New algorithms have been as important as
advances in technology in our progress in solving many important
problems during the last four decades.
COMPUTATIONAL COMPLEXITY
Some of the most influential theoretical work in computer sci-
ence has involved the notion of computational complexity, i.e., the
intrinsic difficulty of solving a given problem, independent of the par-
ticular algorithm employed to solve it. For example, some problems
are known to require computational resources that grow linearly or
quadratically with input size, while the resources required by other
problems grow exponentially or in unknown ways. The theory devel-
oped to date enables computer scientists to demonstrate that various
problems are easily solvable or provably intractable and further en-
ables them to determine that numerous problems have essentially
the same complexity. Computational complexity thus serves much
the same function in computer science that the laws of thermodynamics
serve in the physical sciences (Packel and Traub 1987); it
strives to determine the limits of the possible. An important open
problem in theoretical computer science is to decide whether an im-
portant class of computational problems (the so-called NP-complete
problems) really are intractable or simply resistant to the approaches
used to date.
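The contrast between polynomial and exponential growth can be made concrete with a small illustration. The step counts below are hypothetical cost functions chosen for clarity, not measurements of any particular algorithm:

```python
# Hypothetical cost functions illustrating how resource requirements
# scale with input size n for linear, quadratic, and exponential problems.
def steps_linear(n):
    return n

def steps_quadratic(n):
    return n * n

def steps_exponential(n):
    return 2 ** n

# Doubling n doubles a linear cost and quadruples a quadratic one,
# but it squares the multiplier on an exponential cost: at n = 40 the
# exponential count already exceeds a trillion steps.
for n in (10, 20, 40):
    print(n, steps_linear(n), steps_quadratic(n), steps_exponential(n))
```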
An example of an NP-complete problem is the traveling sales
representative problem, which is formulated as follows: one is given
a set of n cities together with the distances between all pairs of cities.
A trip must be made that visits each city exactly once and returns
to the starting point. The problem is to find a tour, that is, an
ordering of the n cities, that minimizes the total distance traveled.
This problem turns out to be an abstraction of many scheduling and
layout problems in the real world.
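The formulation above can be sketched directly as an exhaustive search. The four-city distance matrix below is an invented example, and the factorial running time of the search is precisely why exact solutions become infeasible as n grows:

```python
from itertools import permutations

def shortest_tour(dist):
    """Exhaustive search for an optimal tour starting and ending at city 0.

    dist is a symmetric matrix of pairwise distances. The search examines
    (n-1)! orderings, so it is practical only for very small n.
    """
    n = len(dist)
    best_order, best_len = None, float("inf")
    for perm in permutations(range(1, n)):
        order = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(order, order[1:]))
        if length < best_len:
            best_order, best_len = order, length
    return best_order, best_len

# Invented example: four cities where adjacent pairs are 1 apart and
# opposite pairs are 2 apart; the optimal tour visits them in order.
dist = [[0, 1, 2, 1],
        [1, 0, 1, 2],
        [2, 1, 0, 1],
        [1, 2, 1, 0]]
print(shortest_tour(dist))  # ((0, 1, 2, 3, 0), 4)
```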
A characteristic of the traveling sales representative problem
and of many other well-known problems, such as linear programming,
is that the available information is complete and exact. In
principle, these problems can be solved exactly. Information-based
complexity studies the intrinsic difficulty of problems for which the
information is partial, contaminated by error or approximation, and
provided at some sort of cost. These are characteristics of many
problems in the physical sciences, the social sciences, and engineer-
ing. Such problems cannot be solved exactly, and one must live with
uncertainty. Questions considered by information-based complex-
ity include a problem's intrinsic uncertainty, the minimal computa-
tional resources required to achieve any chosen level of uncertainty,
and whether the information can be decomposed for parallel or dis-
tributed computation.

ALGORITHMS AND THEIR ANALYSIS
An area in which theory has had significant practical impact in
the past and is likely to have more in the future involves the search
for new procedures to solve difficult problems.
While some computer scientists point to the dramatic increase
in computational power as the development that has contributed the
most to extending the computer's problem-solving utility, others ar-
gue that theoretical advances in the design and analysis of algorithms
have had and will continue to have an even greater impact. Better
understanding of how best to design and measure the efficiency of
computational procedures, in terms of both speed and the amount of
memory required, has resulted in numerous algorithms of practical
utility.
The development of the fast Fourier transform, for example,
produced an algorithm that is ubiquitous in the areas of signal and
image processing and is used extensively in speech research, radar,
sonar, oil exploration, and seismography. Indeed, the algorithm
might be said to have made such processing possible; without it, the
problems would be too computationally intensive to solve, hardware
advances notwithstanding.
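The source of the speedup can be seen in a textbook radix-2 Cooley-Tukey sketch: the transform is split into half-size transforms and recombined, giving O(n log n) operations in place of the O(n^2) direct computation. This is a minimal illustration, not production signal-processing code:

```python
import cmath

def fft(x):
    """Recursive Cooley-Tukey FFT; len(x) must be a power of two.

    Splitting into even- and odd-indexed halves and recombining with
    "twiddle factors" reduces the work from O(n^2) to O(n log n).
    """
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddled[k] for k in range(n // 2)] +
            [even[k] - twiddled[k] for k in range(n // 2)])

# Transform of a constant signal: all energy lands in the
# zero-frequency bin, giving approximately [4, 0, 0, 0].
print(fft([1, 1, 1, 1]))
```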
The development of a highly efficient sorting algorithm has also
led to a broad range of applications, and a more recently devel-
oped linear programming algorithm has offered a solution to many
common scheduling, distribution, and resource allocation problems.
Current theoretical work in algorithms addresses a large set of simi-
lar practical problems and has moved increasingly into the realm of
parallel and distributed systems.
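The text does not name the sorting algorithm, but Quicksort is the classic example of an efficient sort born of theoretical analysis. A minimal functional sketch of its divide-and-conquer idea follows; the list-building style favors clarity over the in-place partitioning used in practice:

```python
def quicksort(items):
    """Sort by recursively partitioning around a pivot.

    Expected running time is O(n log n) comparisons; the analysis of
    pivot choice is itself a product of algorithm theory.
    """
    if len(items) <= 1:
        return list(items)
    pivot = items[len(items) // 2]
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```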
SEMANTICS AND LANGUAGES
Theoretical work on computer languages has focused on syntax
(how to generate and describe a language's grammar) and on semantics
(how to determine the meaning of a program through the meaning
of the operations it causes to be executed). Fundamental advances
in these areas have drawn on mathematical logic to produce a con-
ceptual framework that facilitates computer language translation on
conventional architectures. Twenty-five years ago, for example, the
generation of a Fortran compiler was considered a legitimate research
topic, involving 20 to 50 person-years of work; today, compilers for
advanced languages on conventional uniprocessors are routinely con-
structed by students over the course of a single academic semester.

Formal language theory developed in the 1960s and 1970s enabled the
development of utility programs that can now automatically generate
lexical analyzers and parsers from the grammar of a programming
language. Theoretical developments on formal semantics in the 1970s
and early 1980s are now enabling the automation of other aspects of
compiler construction. Results from graph theory have improved the
ability of compilers to optimize object code.
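What such a generated lexical analyzer does can be sketched with Python's `re` module. The token rules below are an invented miniature specification, not the grammar of any real language; a lexer generator produces the equivalent recognizer automatically from rules like these:

```python
import re

# Miniature token specification in the spirit of a lexer-generator
# input: each rule pairs a token name with a regular expression.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{rx})" for name, rx in TOKEN_SPEC))

def tokenize(text):
    """Turn source text into (token, lexeme) pairs, discarding whitespace."""
    return [(m.lastgroup, m.group()) for m in MASTER.finditer(text)
            if m.lastgroup != "SKIP"]

print(tokenize("x = 42 + y"))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('IDENT', 'y')]
```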
This dramatic improvement is attributable to a better understanding
of computer languages and other computer science issues
and suggests the potential practical results that can flow from fu-
ture theoretical advances. At the same time, new and evolving architectures,
especially multiprocessors, are presenting designers of
compilers with new challenges and long development times, thereby
attesting to the field's continued growth.
CRYPTOLOGY
Although code-making is an ancient craft, dating back to the
Roman Empire, computer science theory has played an essential role
in adapting encryption to meet the special demands of complex mod-
ern society. It is one thing to maintain privacy and security against a
small number of potential eavesdroppers when one is communicating
a limited number of messages to a limited number of recipients; it is
quite another to preserve secure communications in an environment
of vast, interconnected communications networks with thousands or
even millions of users. Computer scientists have developed a tech-
nique by which the potential recipient of messages can broadcast a
(disguised) password publicly (while retaining it privately) and be
assured of being the only one able to decipher the encrypted messages
received. The same approach can be used to distinguish the signatory
of an electronic memo from an impostor. Such advanced encryption
techniques have a narrow but important range of applications in both
the commercial and national security areas.
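The technique described is public-key cryptography, of which RSA is the best-known instance. A toy RSA-style sketch, using deliberately tiny primes (real keys use primes hundreds of digits long), shows the idea of a published encryption key paired with a privately held decryption key; the same key pair, used in reverse, underlies digital signatures:

```python
# Toy RSA-style key pair built from tiny primes -- illustrative only.
p, q = 61, 53
n = p * q                  # modulus; published as part of the public key
phi = (p - 1) * (q - 1)    # Euler's totient of n, kept secret
e = 17                     # public encryption exponent
d = pow(e, -1, phi)        # private exponent: e*d = 1 (mod phi); Python 3.8+

message = 65
ciphertext = pow(message, e, n)   # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n) # only the holder of d can decrypt
print(ciphertext, recovered)      # 2790 65
```

Recovering d from the public pair (e, n) requires factoring n; for small n that is trivial, but for the key sizes used in practice no efficient factoring algorithm is known, which is where the security rests.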