Individuality and the Brain
Gerald M. Edelman
We may soon see a headline, "Machine Beats World Champion Chess Player." Indeed, it is likely that we will be able to design a computer to carry off this feat. Does such a machine have a mind of its own? Is our brain just a machine in a machine-like world?
Some scientists have suggested, in fact, that the brain is like a computer. Others have concluded that computers can think.
If all this makes some humans uneasy, there is comforting news. Brain science and modern physics both point to the conclusion that neither the universe nor the human brain is a deterministic machine. Although the computer is the most significant invention of the 20th century, it is not a thinking object comparable to our own brains.
Instead, computers are powerful logic engines that operate under precise instructions. They do not have bodies, are not conscious, and cannot function without an explicit program.
In contrast, brains are not primarily logic engines. They are structures that have evolved to deal with novel events. Given how many things your brain must deal with while you drive home from work, it is not surprising that the brain is the most complicated object in the universe. If you counted at a rate of one per second all of the connections in
the part of your brain responsible for consciousness, you would finish counting 31 million years from now.
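The counting claim above can be checked with back-of-the-envelope arithmetic. The implied total of roughly 10^15 connections is our own inference from the essay's figures, not a number the author states:

```python
# Sanity check: counting one connection per second for 31 million years.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 seconds
years = 31_000_000
connections = years * SECONDS_PER_YEAR
print(f"{connections:.2e} connections")  # ~9.8e+14, i.e. about 10**15
```

That result is broadly consistent with common estimates of the number of synapses in the human cortex, which is presumably why the author chose 31 million years.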
The finest wiring of the brain is individual; no two brains are alike. Brains allow their animal owners to categorize and act on events that cannot be foreseen. Computers with this much individual variation could not even run their programs. A brain memory, unlike the memory in a computer, is creative, variable and dependent on context.
Computers, being non-conscious, cannot develop a language with changeable meanings that refer to things or events. Programmers assign the meanings before and after they run their programs. Meanings developed in and by brains, by contrast, have poetic capabilities. They depend on the bodies of which they are part and on the circumstances these bodies encounter. It is unlikely that a computer-waitress would say to another computer-waitress, "The ham sandwich just left without paying."
A rich brain works like a cross between a jungle and a map. The variations in a brain's structure and function help determine which neural circuits are most fit. Just as individual animals are favored in Darwin's theory of natural selection, so individual groups of neural cells in a brain are enhanced in their connectivity if their activity results in rewarding behavior. Maps between these selected cell groups lead to rich responses, transforming past signals in novel ways. Solid evidence supporting these ideas is now beginning to accumulate.
These observations have deep implications for human concerns, particularly if each brain is necessarily individual and if each of its responses is part of a historical reaction to novelty. If these notions turn out to be true, then every act of perception is in part an act of creation. Every act of memory is in part an act of imagination. These are not acts to which computers can aspire.
Although neuroscientists like myself depend heavily on the power of computers to help us model the brain, the computers themselves are poor models for what is going on in our heads and bodies. Used well, they can help us find out about human nature and its bases. That is a sufficiently valuable role without exaggerating their capabilities. Together with the avalanche of discoveries in neuroscience, the proper use of computers will provide an exciting view of what it truly means to be human.
The Enlightenment of the 17th and 18th centuries had a hard time reconciling human freedom with a machine-like model of the world. Now we are recognizing that, far from operating like machines, individuals have identities that are profoundly unique. By revealing this, brain science is laying the ground for a new Enlightenment. President Bush has declared the 1990s to be the Decade of the Brain. Properly supported, it may be expected to yield discoveries that justify a century of freedom.
May 3, 1992
Gerald M. Edelman, a Nobel laureate, is director of the Neurosciences Institute and chair of the department of neurobiology at Scripps Research Institute in La Jolla, Calif. This article is adapted from his book Bright Air, Brilliant Fire (Basic Books).
* * *
Mapping the Human Brain
Joseph B. Martin
The fastest supercomputer is no match for the human brain. Simply reading this article will involve billions of the reader's brain cells, communicating with each other across vast, intricate, yet precisely coordinated connections. Despite great strides in research, the human brain remains mysterious.
Our failure to comprehend many brain functions is taking a staggering toll. Millions of Americans suffer from debilitating diseases such as epilepsy, multiple sclerosis, strokes, and Alzheimer's and Parkinson's diseases. More than 60
million suffer from depression and other mental illnesses; another 20 million abuse alcohol or drugs. Still others battle the effects of head and spinal cord injuries.
These and other problems occur, at least in part, because something goes wrong inside the brain. Perhaps an imbalance of the brain's chemical transmitters leads to depression. Cells may misfire in the brain's cortex, causing epilepsy. Or a small group of cells in the brain may die, producing Parkinson's disease.
When compared with what we know about treating infections or heart disease, our understanding of the brain remains meager. As a committee that I chaired for the Institute of Medicine concluded in a recent report, this ignorance about brain function retards advances toward effective treatments for many medical problems.
President Bush and Congress have declared the 1990s to be the Decade of the Brain, making this a propitious time to take stock of what we know — and what we need to know in the future. To focus this effort, a new initiative is needed to map the brain.
Limited maps of the brain already exist. But new imaging devices, computer graphics, and other technological advances make it possible to develop three-dimensional, computerized maps of extraordinary sophistication. For researchers, this would be like trading in a one-sheet map of a city for an atlas showing the internal dimensions of individual houses. A researcher interested in an area of the brain involved in memory, for example, could obtain a computerized cross-section of the area and zoom in to view local connections and synapses.
The utility of such computerized displays of information can be illustrated by four areas of brain research. The first is vision. Blindness and disturbances of vision often are caused by stroke and other brain injuries. However, since vision involves more than 300 pathways within the brain, these problems now are difficult to understand. Intricate computer simulations, anatomical and chemical maps, and other tools might help unravel them.
Substance abuse is another critical problem. Researchers now know, for example, that cocaine exerts its activity at
specific brain receptors that normally mediate cell-to-cell communications. If synthetic substances could be designed to satisfy an addict's craving for drugs by safely binding to these receptors, millions of addicts might be helped.
A third area needing more research is pain. Normal pain serves as a useful warning of injury, but many people have pain that is pathological or abnormal. Since pain involves virtually every region of the brain, as well as the spinal cord and peripheral nerves, a better map could help researchers figure out why it causes so much misery.
The last example is schizophrenia, which now afflicts 2 million Americans. Scientists studying this tragic mental illness are focusing on biochemical and other abnormalities in the brain, as well as on potential genetic or environmental causes. Whether their interest is the distribution of the brain chemical dopamine or the location of specific genes, a detailed map would be helpful.
A brain mapping initiative that helped incorporate important computer technologies into brain research could begin with an annual budget of $10 million — less than 1 percent of the U.S. neuroscience research budget. An advisory panel of neuroscientists and computer experts could guide the effort and ensure that findings are shared widely via computer networks.
A century ago, Earth mappers never dreamed of the satellite photos available today. So it is now with our new ways to map the brain, and the need to proceed is compelling. More people are hospitalized in the United States with neurological and mental disorders than with any other major disease group, including cancer. We must chart the vast biological frontier that is the human brain.
July 7, 1991
Joseph B. Martin is dean of the School of Medicine at the University of California, San Francisco.
* * *
Gene Therapy: No Longer Just a Concept
Richard B. Johnston
This is the fourth time that my patient, two-year-old Christopher, has been hospitalized with a life-threatening infection. His parents, Jan and Steven, have watched helplessly as their son suffered from recurrent pneumonia, massively enlarged lymph glands, and now an abscess around his liver. I feel frustrated because, as his pediatrician, I can treat Christopher's acute infection and reduce the likelihood of further infections, but I know there is no cure for his disease.
Christopher has a rare, inherited immune deficiency called chronic granulomatous disease (CGD). In this disorder the white blood cells that normally eat and kill invading bacteria and fungi are abnormal. Affected children develop severe infections and sometimes die.
When I gave Christopher's diagnosis to his parents, their confusion and vulnerability were painful to see. What does it mean to have a gene defect like this? Would they gain or lose by knowing whether one or both of them carried the defective gene? How would this knowledge help their son? What about their future children? Unfortunately, as a specialist in immune disorders, I have faced these questions many times from couples like Jan and Steven.
Recently, however, the answers have changed. The defective genes that cause CGD and many other inherited birth defects have been identified. Knowing the specific gene defect allows us to predict possible future problems in the affected child and what the odds are of having another baby with the same disorder.
This new knowledge can be applied not only in diagnoses but also in treatment. In 1990, a four-year-old girl with another rare, fatal, inherited immune disease called adenosine deaminase deficiency made history as the first person to receive an experimental treatment known as gene therapy.
Since then, a second girl with the same disorder has been treated, and the condition of both girls is greatly improved.
Gene therapy involves inserting copies of a normal gene into specific cells from the patient's body. Various tricks are used to accomplish this — for example, using "tamed" viruses to penetrate cells and implant the healthy gene. The
patient can then receive a transfusion of his or her own cells, complete with healthy copies of the missing or faulty gene.
As the government's Human Genome Project proceeds to identify all human genes, it will become possible to develop gene therapy for a growing number of the approximately 3,000 known genetic disorders.
However, opponents of gene therapy have tried to block the National Institutes of Health from conducting experimental trials. These critics charge that manipulation of human genes is a "slippery slope" that will inevitably lead to highly questionable practices, such as the creation of humans with extraordinary strength or intelligence or a predominance of males.
Our new ability to manipulate genetic material is raising a number of other ethical and social dilemmas as well: Who will receive gene therapy and who will not? How will the uses of gene therapy be regulated? How will the information obtained from genetic tests be used?
This is not the first time the pace of technology has exceeded society's ability to manage the moral implications of a discovery. This time, however, it will primarily be individual people, not governments, who must make the critical decisions. These decisions should not be made in fear and ignorance.
A recent poll by the March of Dimes Birth Defects Foundation found that 68 percent of Americans know almost nothing about genetic testing, and 87 percent know nothing about gene therapy.
It is crucial that the American public become better educated about this powerful new science. Public debate is needed, and the teaching of genetics must be strengthened at every level of the American educational system, beginning in primary school.
The potential to help Christopher and children like him is at hand. If basic research and clinical trials progress as expected, it will not take more than a decade or two before we find remarkable new cures and treatments for people with genetic defects. These advances will present society with complex social and ethical issues. The more informed people become about gene therapy, the better they can help bring its potential to reality.
November 8, 1992
Richard B. Johnston is senior vice president for programs and medical director of the March of Dimes Birth Defects Foundation and adjunct professor of pediatrics at the Yale University School of Medicine.
* * *
Driving to a Safer Future
A. Ray Chamberlain
If driving makes you nervous, especially during winter months when conditions can be treacherous, just consider some trends that could make motor vehicle travel even more dangerous in the future:
Roads will be more congested, making driving increasingly difficult and motorists more irritable.
Trucks will be bigger.
Roads, bridges and the rest of the infrastructure will be older.
Aging of the population will create new safety problems.
Cars fitted with futuristic devices like "talking maps" may distract drivers unless these features are implemented safely.
These and other factors could push the number of deaths from motor vehicle crashes well above the current total of 46,000 annually. In the past, transportation experts have held down fatalities by developing innovations such as safety belts, breakaway highway signs and, more recently, airbags and anti-lock brakes. Aggressive efforts to reduce drinking and driving also helped. Although motor vehicle traffic has more than doubled since the mid-1960s, the number of deaths per mile traveled has plummeted by nearly 60 percent.
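The two trends just cited combine multiplicatively, which is worth spelling out. The factors below are illustrative, taken directly from the essay's "more than doubled" and "nearly 60 percent" figures:

```python
# If traffic doubles (x2) while deaths per mile fall ~60% (x0.4),
# total deaths change by the product of the two factors.
traffic_factor = 2.0   # "more than doubled"
rate_factor = 0.4      # deaths per mile down ~60%
print(traffic_factor * rate_factor)  # 0.8 -> total deaths down ~20% despite doubled traffic
```

So even with twice the traffic, absolute deaths declined, which is what makes the safety record of this period remarkable.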
But this remarkable progress now appears unsustainable. I led a committee of the National Research Council that recently examined highway safety research, and we found far too little being done to develop the safety innovations of the future. The federal government and the states slashed spending on highway safety research during the 1980s to just $70 million annually, or 30 cents per American. Automobile manufacturers and others also perform safety research, but the overall effort is insufficient.
American motorists deserve better. Fiscal austerity prevents public officials from spending the sums this major public health problem justifies. Yet a great deal could be accomplished by a modest increase in funding focused on a few topics that promise significant results. Our committee
identified six in particular that warrant immediate attention:
Crash avoidance. Why do people make mistakes and have crashes? That is a question of growing importance as the population ages and the skills of drivers change. Driver error, sometimes caused by substance abuse, is the major factor contributing to motor vehicle crashes. We need more "human factors" research to learn how to create friendlier vehicles and highways for people with serious — but often predictable — limitations.
Occupant protection. As cars get smaller and trucks bigger, it becomes more important to understand exactly what happens to vehicles and people in a crash. "Biomechanics" research can provide this information.
Safer highways. The curve of a road, the width of a lane and other factors all affect safety. But which highway designs and traffic engineering improvements provide the biggest benefit? We need to learn more. Over the next several decades, major portions of the highway system will be rehabilitated or reconstructed, providing a unique opportunity for making roads safer.
Post-crash acute care and rehabilitation. Emergency medical care is likely to become more difficult as congestion worsens and the number of elderly crash victims increases. New ideas are needed to provide rapid access to crash sites and to improve trauma centers.
Management of highway safety. More older drivers and increased highway congestion also challenge us to develop better licensing programs and law enforcement efforts. Improved methods for screening drivers might enable states to tailor licenses to people's individual capabilities — for example, allowing daytime driving only. Technologies such as automatic speed recording devices could automate some traditional enforcement methods.
Driver information and vehicle control technologies. Automobiles soon may be equipped with systems that warn drivers of unsafe conditions, proximity to other vehicles or even the driver's own impairment. Changeable message signs on the road, meanwhile, will provide real-time information
about highway conditions. These information systems could reduce the risk of crashes. But, if poorly designed, they could overload some drivers. It is essential, while these devices are still on the drawing board, that their safety impact be examined.
A research program on these specific topics would require additional federal funding of $30 million to $40 million annually — a tiny amount when compared to the $70 billion in medical expenses, lost wages and property damage caused each year by motor vehicle crashes. Anyone unpersuaded by that comparison might consider that 4 million people will be injured in vehicle crashes this year. We owe it to ourselves to reduce this dreadful toll.
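The budget comparison in the paragraph above reduces to a simple ratio; the dollar figures here are the essay's own:

```python
# Proposed safety research funding versus annual crash costs, per the essay.
research_budget = 40e6  # upper bound of the $30-40 million proposal
crash_costs = 70e9      # annual medical expenses, lost wages, property damage
print(f"{research_budget / crash_costs:.2%}")  # about 0.06% of annual crash costs
```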
January 20, 1991
A. Ray Chamberlain is executive director of the Colorado Department of Transportation.
* * *
New Priorities in the Heavens
John A. Dutton
Once the source of national pride and astounding achievement, the space program has been an invisible issue in this year's presidential campaign. The program was in the news recently because of the change in leadership at the National Aeronautics and Space Administration (NASA). But the candidates have said little about the fabulous opportunities and painful choices awaiting America in space.
The nation needs to re-examine the rationale for its civilian space program. Today's realities are markedly different from those of the past. With the Cold War ended, economic
and technological competition among nations is increasing and will determine world leadership. American students are performing poorly in science and mathematics; too few choose careers in these fields that shape our future. With the national budget in deficit, we lack the funds to do all that we could in space, including a host of valuable scientific missions, a space station, a crewed mission to Mars or an aerospace plane to replace the space shuttle. We must choose where in the heavens to set our sights.
We should set them on science, on information and new understanding. Two years ago, a presidential advisory group urged that scientific research be moved front and center in the civilian space program. More recently, a committee that I chaired for the National Research Council agreed that a focus on acquiring information and developing understanding about our Earth and the universe could reinvigorate the space program while supporting other national needs, including the development of powerful capabilities for handling information.
So far, our discoveries in space have revealed unexpected complexity and sharpened our understanding of the universe. We have found new moons and rings around giant planets, seen evidence of black holes, observed the earliest stages of star formation and studied cosmic radiation lingering from the Big Bang. We have searched for life on other worlds and learned much about our own planet, including weather patterns, the detailed mosaics of the surface and the atmospheric ozone hole.
Scientists are planning missions that promise even greater knowledge in the future. Some missions will examine global warming and other environmental problems on Earth. Some will reveal new aspects of the planets, the dynamics of our Sun and the evolution of the universe. We will learn how physical processes react to the microgravity environment of space. To undertake long journeys such as a trip to Mars, we still must determine the effect on humans of living in a weightless environment for several years. The heavens beckon with the promise of undreamed rewards.
To reap them, we must decide what to do. Our committee did not take a position on the controversy between manned
and unmanned missions, nor on any specific missions or other NASA programs. Rather, we concluded that focusing on the return of information and creation of enhanced understanding would strengthen all activities in space and enhance the benefits of our national investments. We must set clearer priorities and then follow through with an effective and well-managed program.
NASA cannot afford to complete all of the projects already begun or approved. The scientific community must develop mechanisms to help officials choose among valuable but competing scientific opportunities.
After 30 years of experience in space, certain principles for managing an effective program are clear. Costly missions must be balanced with ongoing support for the scientists and students across the country who quietly do much of the work of space science, converting data into understanding, finding the pathways to be explored tomorrow. Diverse means for reaching space are essential, and both launch vehicles and the role of humans should be matched to mission requirements. We need to find ways to reduce the costs of space research. Once we have started valuable projects, we should finish them.
Our success in space research has brought us many benefits: stimulating young citizens, enhancing national prestige and fostering public pride in national accomplishment. By carefully focusing our efforts, we can increase our gains from space and bequeath still more secrets of the universe to future generations. Although we have an urge to explore new domains, an even more fundamental human imperative is to expand knowledge and know the unknown. Ultimately, that is why we loft our spacecraft into the heavens.
March 29, 1992
John A. Dutton is dean of the college of earth and mineral science at Penn State University.
* * *
Reaching for the Answers in the Stars
John N. Bahcall
Imagine looking for a firefly 100 miles away that is glowing next to a brilliant searchlight. That is how hard it is to detect the reflected light from a large planet orbiting a star other than our own Sun.
Astronomers such as myself have other ways of spotting planets. We can study a star to see if it is wobbling slightly, which suggests that nearby planets are exerting a gravitational tug. New telescopes can detect such wobbles and reveal the possible existence of planets 500 light-years away — an astonishing distance.
Discoveries like these may change how we humans think about ourselves, our Earth and our place in the universe. Most Americans know more about the stars in Hollywood than about those in the sky. Practically the only thing they have heard recently about astronomy is the initial troubles with the Hubble Space Telescope. Yet a team of more than 300 scientists that I led for the National Research Council reported this past week that astronomers are on the verge of findings that could transform our collective self-image.
In the coming decade, astronomers will address fundamental questions of nature. How and when did galaxies form? What is the fate of the universe? Are we alone? We humans have pondered questions such as these for millennia.
Anyone who enjoyed the recent "Civil War" television series might consider how much we have learned about the heavens just since Lincoln's time. We now know that stars form out of gas clouds and die either in quiet solitude or in spectacular explosions. Billions of stars are grouped into galaxies that stretch as far as the largest telescope can see. The universe itself apparently was born in a violent explosion some 15 billion years ago.
During the 1980s, we learned that radiation from most of the matter of the universe has gone undetected. We still do not know where — or what — it is. Giant black holes
apparently exist in the centers of some galaxies and quasars. Huge neutron stars emit radio pulses that are more regular than the best clocks made by humans.
Still greater advances lie ahead. The country's largest telescope will open soon in Hawaii. New detectors at this and other telescopes will capture light from the heavens with exquisite sensitivity. Combined readings from widely scattered telescopes will yield images much sharper than those from any single site.
Even people uninterested in science have reason to look forward to what we may discover as a result. Astronomy provides benefits that extend far beyond the observatory. Imaging techniques developed for radio astronomy, for example, are now used in medical CAT scanners. X-ray detectors in telescopes led to baggage scanners in airports. Studies of the heavens taught us about environmental problems on Earth, such as ozone depletion and global warming.
Choosing which opportunities to pursue next is not easy. Budgets are so tight that our committee had to set stringent priorities, recommending only about one in ten promising initiatives for funding. The biggest need for "Earth-based" research, we felt, is to strengthen the day-to-day infrastructure. Many observatories have suffered a decade of neglect, and young university researchers are struggling for support. Both need help. Up in space, less costly but more frequent missions are needed to respond to new scientific ideas and technological advances, and to train more young scientists.
We also urged support of four exciting new major telescopes. One of these, a space-based telescope operating at infrared wavelengths, uses camera technology pioneered by the U.S. military for detecting incoming missiles, such as Scuds. It will be more than a thousand times more sensitive than comparable telescopes on Earth. A radio telescope in the Southwest and an infrared-sensitive telescope in Hawaii will bring distant galaxies into much clearer view. The United States might even begin placing telescopes on the moon if it builds a lunar base.
Other opportunities abound. Up in the night sky, the answers to questions that have mystified mankind for millennia
are finally within our grasp. Now that we have the chance, we need only reach a bit farther and grab them.
March 24, 1991
John N. Bahcall is president of the American Astronomical Society and professor of natural sciences at the Institute for Advanced Study in Princeton, N.J.
* * *
Abolishing Long-Range Nuclear Missiles
Sidney D. Drell
Although the United States and the former Soviet Union have made remarkable progress on arms control, the scariest weapons of the Cold War remain in thousands of missile silos and submarines. It's time to start thinking seriously about eliminating the long-range ballistic missiles that can hit their intercontinental targets in less than 30 minutes.
Extraordinary progress between the two nuclear superpowers makes this once-Utopian idea plausible. The growing threat of long-range ballistic missiles proliferating among other nations makes it compelling.
By early next century, several new countries may become capable of delivering weapons of mass destruction over long distances. A rogue state could fire — suddenly and with little warning — a long-range nuclear missile at one of our cities. We need a world order free from any nation's having this capability.
Politically, we will enhance our ability to dissuade others from deploying such weapons by agreeing to forgo them ourselves. The recent experience with Iraq illustrates the
difficulty of detecting covert efforts to attain nuclear capability. Yet long-range ballistic missiles are so large that it is impossible to hide their testing and deployment from current detection technology.
Banning strategic missiles would not require giving up our entire nuclear capability. Rather, it would put a greater burden on our strategic bomber force. This has many advantages. Slow-flying planes require hours rather than minutes to deliver their nuclear devastation. They are far less threatening or useful for first strikes and can be recalled in the event of a misunderstanding or accident.
To maintain its security, the United States would have to ensure that its bomber force could survive an attack. There is a strong basis for believing this could be done. Without long-range missiles, no enemy would have a chance of destroying our dispersed bomber fleet. Equally important, our bombers are equipped with Stealth technology and the ability to fire nuclear-tipped cruise missiles from 1,500 miles away. Experience in the Gulf War has added to our confidence in their effectiveness.
With the disappearance of long-range ballistic missiles, there no longer would be any rationale for a costly, futuristic, Star Wars type of missile defense system. However, cooperation in space exploration would be required to ensure that no country uses acceptable space activities as a cover for military programs that circumvent a missile ban.
Difficult questions remain, most notably: How do we get there from today's force structure? Doing so won't be easy; it probably will take a decade or more. Abolishing strategic missiles would cause tremendous disruptions in our force structure, far greater than those occurring in the current scaling down of the military. It would mean the end of our nuclear triad of submarines, bombers, and land-based missiles, which was developed to meet the former Cold War threat.
Once we have satisfied ourselves on technical grounds that a strategic bomber force could survive and that cruise missiles could penetrate to assigned targets, there would be no strategic reason for continuing to deploy long-range missiles. Politics would be the only remaining major obstacle
to their elimination. Not the least of the difficulties would be getting the other powers with long-range nuclear weapons to go along.
In his State of the Union address in January, President Bush proposed cuts that, although large, still would retain a nuclear advantage for the United States. In particular, he called for eliminating all land-based missiles with multiple warheads, the category in which Russia is strongest. Simultaneously, he proposed only a 30 percent cut in the area where the United States is supreme — submarine-launched missiles with accurate, multiple warheads. The problems of balancing these disparate missile forces would end if we got rid of all strategic missiles.
The once-visionary idea of abandoning these weapons — called "fast flyers" by President Reagan when he first proposed such a ban at the Reykjavik summit in October 1986 — may not be achievable in the near term. But as our attention turns from the former Soviet threat to nuclear proliferation elsewhere, it is an idea that demands consideration. It may be possible once again to have a world free of these frightening missiles.
April 19, 1992
Sidney D. Drell is professor and deputy director, Stanford Linear Accelerator Center, Stanford University. This article is adapted from a longer version in the Spring 1992 edition of Issues in Science and Technology.
* * *
Angling for a New Food Source
Robert B. Fridley
Visit the supermarket and you may wince at the price of shrimp, scallops and certain other seafood. The consumer price index for fish has outpaced that for other foods, largely because consumer demand has grown faster than the supply.
But suppose there were a way to increase the supply significantly. Shoppers might get lower prices. Fishing crews, many of which are hurting financially, would get more business. It's a great idea — except that the world's prime fishing grounds already are at or near their maximum sustainable yields. So where would more fish come from?
We should raise them ourselves. That is what the United States does for other foods, from beef to broccoli, better than any other country. Yet although the average American eats 24 percent more fish than a decade ago, we still produce most fish essentially as we have for centuries — by catching what nature provides.
A National Research Council committee, which I chaired, concluded recently that it is possible to supplement natural yields dramatically by raising finfish, shellfish, crustaceans and seaweeds. Some nations do so with great success. Unless we start doing the same, and soon, we risk losing this lucrative market, just as we have fallen behind in the production of textiles, consumer electronics and other goods. Learning to raise fish also would help us enhance natural fisheries.
According to the Food and Agriculture Organization of the United Nations, world fish production in 1988 was 98 million metric tons. Of the 75 million metric tons consumed directly by people, nearly one of every five fish was provided by aquaculture.
This impressive percentage has been growing abroad. But for all of its scientific know-how, the United States contributes just 2 percent of the world's aquaculture production. China is the leader with nearly half of the total. The vast majority of U.S. marine aquaculture production is devoted to a single shellfish — oysters.
Freshwater farming of catfish, crayfish, trout and other species has expanded rapidly in the United States. In some parts of the country these products are now common. Not so with fish farming in saltwater. Barriers to marine aquaculture have been the high value of ocean and coastal space; environmental concerns about animal and feedstock wastes and about the transfer of diseases to and from wild stocks; and objections by some boaters and fishermen to net or cage installations. Other people say the installations are unsightly.
All of these concerns are solvable, however, and there are large benefits in pushing ahead with marine aquaculture where appropriate. Expanding our efforts in this field could create new jobs, provide a reliable source of seafood, augment threatened fisheries and fish species, and reduce our dependence on imported seafood.
Private companies and entrepreneurs in the United States cannot accomplish this on their own. Our committee concluded that their efforts are being constrained by a regulatory and policy framework that is far too complex and restrictive, and by knowledge and technology that remain too limited.
Suppose you want to begin raising fish offshore. Not only do you need to meet the usual business needs of obtaining financing and the like, but you also must comply with federal, state and regional rules that are unclear. Getting a permit can be costly and time consuming. You also must assemble the necessary knowledge, skills and technology. It's no wonder many aspiring entrepreneurs simply give up or go bankrupt.
The permit process must be clarified and streamlined. People who want to undertake marine aquaculture should not face a regulatory maze. Making marine aquaculture a recognized use of the Coastal Zone Management Act, for instance, would stimulate states to include the raising of fish in their coastal management plans.
A modest national research program could provide better methods for raising fish in an environmentally sustainable fashion. Creation of an expanded biological and engineering knowledge base would spur businesses from Maine to Hawaii.
In other words, with a little more scientific and bureaucratic support and a little less red tape, we could ease the pressure on fragile ocean fishing areas while satisfying consumers nationwide. As the demand for fish grows, so must the supply. Mother Nature alone cannot provide all of the fish that consumers want. It's time we woke up and smelled the chowder.
August 16, 1992
Robert B. Fridley is executive associate dean, College of Agricultural and Environmental Sciences, University of California, Davis.