
4
Technological Risk and Cultures of Rationality

SHEILA JASANOFF

John F. Kennedy School of Government, Harvard University

The latter half of the twentieth century has brought increased demands for governments of modern societies to expand their regulatory powers beyond economic to social regulation. As new technological hazards multiplied, traditional concerns with rates, routes, and pricing of industrial products and services were supplemented by pressures to safeguard the quality and safety of life. Issues such as public health, worker safety, medical devices, consumer protection, food production, and the environment either arose or gained in importance on the policy agendas of most industrial nations. With this shift, scientific knowledge became an ever more essential prerequisite for credible policy making, and governments vastly expanded their capacities for producing and assessing relevant technical information. The policy system's greatly enlarged dependence on science can be charted through the emergence in recent decades of new areas of research (e.g., environmental health, climate change), new analytic techniques (e.g., cancer risk assessment, biosafety assessment), and new programs of data collection (e.g., indicators for desertification or biodiversity, postmarket surveillance of adverse drug reactions).

As governments came to rely more on science as a basis for regulation, policy analysts initially assumed that cross-national cultural differences would diminish in importance as a factor shaping public action. The universality of science has been an article of faith for modern societies since the Enlightenment. A common base of scientific understanding, it was widely
thought, would override the vagaries of national politics and culture in specific issue areas, whether nuclear power, pollution control, pharmaceutical regulation, the management of chemical pesticides, or the environmental release of genetically modified organisms (GMOs).

One of the more interesting findings of comparative policy research in recent years has been the failure of these expectations. Although policy agendas, broadly speaking, have converged on a host of issues worldwide, specific national policies for managing health, safety, and environmental risks continue to diverge, even when they are ostensibly based on the same bodies of scientific information. Intriguingly, evidence deemed persuasive in one national policy context does not necessarily carry the same weight in others. Even when policy outcomes converge, as for example in the informal moratorium on nuclear power across most of Europe and the United States, the underlying technical justifications are not invariably the same.

The literature on comparative policy provides some notable examples of cross-national divergences in the regulation of technological hazards. Thus, a four-country comparison of U.S. and European chemical regulation in the mid-1980s showed that European nations neither gave the same priority to carcinogens as did the United States nor developed comparable programs of testing and risk assessment (Brickman et al., 1985). Parallel differences have been observed even between the arguably more closely coupled policy systems of Canada and the United States (Harrison and Hoberg, 1994). National strategies for regulating air pollution have similarly diverged in priority setting, timing and severity of controls, and the choice of regulatory instruments. European countries, for example, were markedly slower to regulate airborne lead and chlorofluorocarbons than the United States. More recently, Europe has overtaken the United States in cutting sulfur emissions regarded as a precursor of acid precipitation. Biotechnological products created through genetic modification have encountered substantially different entry barriers on the two sides of the Atlantic, with significant cross-national disparities observable in the environmental release of GMOs (Jasanoff, 1995), the public acceptance of genetically modified foods, and patent protection for genetically engineered animals.

Numerous explanations have been offered for these persistent policy divergences, which reflect in turn underlying differences in societal perceptions and tolerance of risk. The simplest causal factor advanced by social scientists is economic interest—most plausibly invoked when the burdens and benefits of regulation fall disparately in different national contexts. For example, the relatively muted character of antinuclear protest in France (Nelkin and Pollak, 1981), as well as that nation's exceptionally low-key response to the Chernobyl disaster, have been attributed to the heavy French reliance on nuclear power as an energy source. Similarly, generators of acid precipitation such as the United States and the United Kingdom have been notably less aggressive in seeking control policies for sulfur oxides than the recipients of pollution, such as Canada and Norway.

Historical explanations seem to carry weight in other cases: Germany's unusual hostility to biotechnology in the 1980s no doubt reflected a distaste for
state-sponsored science and a fear of uncontrolled genetic experimentation, both inherited from the Nazi era (Proctor, 1988; Gottweis, 1998). Other factors that have been held responsible for deviations from allegedly rational policy choices include deficiencies in the public understanding of science (see, for example, Breyer, 1993), mass hysteria, the rise of politically influential social movements, the economic inefficiency of litigation (especially mass torts), and lack of political will or leadership. Cross-national variations in any of these factors could, in principle, lead to substantial divergences in policy outcome across countries.

The chief difficulty that these explanatory strategies encounter is their ad hoc and unsystematic character. Separate reasons are sought for each case of deviance from an idealized and supposedly rational baseline. Economics is invoked in one case, history in another, adversary politics in still a third. No general patterns emerge. Economic arguments, for all their appeal, only carry the day in a limited number of cases; often, national policies seem to favor outcomes that burden industrial production, as in the case of Europe's famous precautionary principle (see below) for the environment.

Moreover, sustained research in the fields of comparative policy and politics points to the durability of certain modes or styles of political action within nations, regardless of the issue in question. There appears to be a systemic quality in national responses to many different perceived threats and crises. The term political culture has been used as a catchall to explain such patterned divergences. It encompasses those features of politics that seem, in the aggregate, to give governmental actions a distinctively national flavor, even in countries sharing generally similar social, political, and economic philosophies. Political culture is difficult to measure in quantitative terms, although various attempts have been made to quantify some aspects of it. Thus, efforts were made in the 1960s to measure the engagement of citizens with their political system in several democratic societies (see, for example, Almond and Verba, 1963, 1989; Putnam, 1979, 1993), and many surveys have been made of public attitudes to particular technological developments (e.g., biotechnology, as measured by the Eurobarometer). For many, political culture is not a highly useful concept because it is simply the residue that remains after other efforts at rational explanation have failed. Yet it has become critically important to understand political culture better—especially with regard to its influence on public policy—as states, multinational corporations, and an increasingly well-informed civil society all confront the challenges of living together on the same bounded planet.

In this paper I attempt to synthesize our current knowledge of political culture as derived, empirically, from diverse studies of national regulatory systems and, theoretically, from recent developments in social theory and science and technology studies. In the following section of the paper I briefly outline the principal dimensions of variance among national approaches to regulating technological risks. In the subsequent section I outline the major ways in which comparative social scientists have tried to systematize the notion of political culture. In the concluding section I draw on this analysis to offer
some reflections concerning the possible harmonization of policies for biotechnology across national boundaries.

DIMENSIONS OF CROSS-NATIONAL VARIANCE

Before embarking on a discussion of cross-national policy divergences, it is useful to remind ourselves of the large commonalities that provide humankind in any era with a shared set of experiences and understandings. In late modernity, as our historical moment is sometimes called, governments of advanced industrial societies have been required to deal with many common policy problems at roughly similar points in time. Examples with a significant scientific or technical dimension include, most recently, the global environmental crisis, the instability of global capital, economic restructuring after the Cold War, arms control, new epidemics, and the uneven social vulnerability to human as well as natural disasters. As one looks across the policy spectrum within liberal democracies, one finds numerous striking parallelisms in both governmental action and societal demand. These include similarities in legislative priorities, investments in science and technology, development of policy-relevant expertise, new forms of social mobilization, and increased interaction between state and nonstate actors. Significant shifts in policy ideology, such as economic liberalization or deregulation, are seldom any longer confined to the boundaries of single nations. Social movements, too, seem relatively unconstrained by national politics as they work to raise the visibility of particular policy issues. At the same time, national autonomy in many areas of policy making has been curtailed through increasingly thick networks of international regimes and institutions (for an overview of international developments in the environmental arena, see Haas et al., 1993).

Policy divergences are nested within this broad framework of common human understanding and social development. Their presence and persistence are therefore all the more remarkable. They deny any absolute claims for historical or technological determinism—that is, for universal regularities in human behavior occasioned by the characteristics of a particular period in time or by the material inventions of human ingenuity. Divergent responses to risk, in particular, point to the ability of social norms and formations—in short, of culture—to influence deeply the ways in which people come to grips with the uncertainties and dangers of the natural world. How do these different coping strategies most commonly manifest themselves?

Framing

Risk is often defined as the probability of a harmful consequence. How often will a flood or earthquake or volcanic eruption occur in a given region within a given number of years? What is the likelihood of an exceptionally severe El Niño or that warming of the earth's atmosphere will melt the West Antarctic ice shelf? What are the chances that a prolonged dry spell will give
rise to consuming forest fires? Although risks of these types have traditionally been seen as natural, many risks of greatest concern to technological societies involve natural and social factors operating in tandem. How likely is it, for instance, that overfishing will destroy the capacity of fisheries to replenish themselves or that cutting down trees for fuel will lead to deforestation of uplands and consequent downstream flooding? With increasing knowledge of human-nature interactions, we have come to perceive numerous phenomena once seen as wholly natural as also having a human-made component. Anthropogenic climate change is perhaps the most noteworthy example of such a shift in awareness.

It is by now widely acknowledged in the policy analytic literature that our capacity to identify distinct problems from a universe of potentially interconnected causes and effects involves a kind of selective vision referred to as framing (see, for example, Cobb and Elder, 1972; Dryzek, 1990; Schon and Rein, 1994). Frames have been defined as "principles of selection, emphasis, and presentation composed of little tacit theories about what exists, what happens, and what matters" (Gitlin, 1980). By sorting experience into such well-demarcated patterns of significant causes and effects, human agents impose meaning on what might otherwise be no more than a jumble of disconnected events. Framing orders experience, eases confusion, and creates the possibility of control. Problems that have been framed are capable, in principle, of being managed or solved. At the same time, framing, in its nature, is also an instrument of exclusion. To bring some parts of experience within a frame—to render them comprehensible and interpretable—other parts must be left out as irrelevant, incomprehensible, or uncontrollable. This dual aspect of framing, as a device for ordering as well as exclusion, helps to capture some of the observed cross-cultural variation in the identification and management of risk.

Differences in framing are most starkly apparent when the same social problem is attributed to different causes by competing actors in a policy-making environment. Is climate change the result of worldwide emissions of greenhouse gases, as claimed by Western scientists, or is it the result of centuries of unsustainable and inequitable resource exploitation by industrial countries, as claimed by some developing country activists? Is the world as we know it teetering on the brink of environmental disaster because of overpopulation in poor countries or overconsumption in rich ones? Should the AIDS epidemic be seen as a consequence of deviant sexual behavior or is it simply a highly resistant viral disease that foreshadows the threat of new global epidemics? Is persistent poverty attributable to welfare policies that sap individual initiative or to the absence of effective job creation strategies in inner cities? Opinion on such questions may differ radically among actors within a single country as well as between countries. For purposes of this paper, differences of the latter type are of greater interest because they point to the possible influence of culture rather than of more temporary economic or social interests.

National responses to the risks of biotechnology in recent decades provide one striking example of divergent framings of technological risk (see Jasanoff, 1995 for further details). In the United States, initial concerns with the safety of
genetic engineering as a new scientific process gave place to a government-wide consensus that regulation should focus largely on the products of biotechnology. The process of genetic manipulation was deemed not to pose any special hazards in and of itself. In Britain, by contrast, policy leaders have continued to worry about genetic engineering as a novel process that is not well enough understood to be granted a clean bill of health. In Germany, the risks of biotechnology were seen from the start as both social and natural in character, because the technique appeared to give the state unregulated power to reshape the meanings of nature and human identity. The uncertain risks of genetic research seemed to undercut the German constitutional system's guarantee of adequate state protection against industrial hazards. Accordingly, Germans felt the need for programmatic legislation in the form of a law specifically addressing genetic engineering to control this technological enterprise in all its aspects.

Styles of Regulation

Although governments of industrial societies frequently converge in deciding which risks require positive state action, resulting regulatory programs are often founded on very different patterns of interaction between the state and other major actors. These systematic differences are sometimes referred to as styles of regulation (Vogel, 1986; Brickman et al., 1985). The components of a nation's regulatory style may include, in brief, the means by which the state solicits input from interested parties, the opportunities afforded for public participation, the relative transparency of regulatory processes, and the strategies employed for resolving or containing conflict.

Comparative research over the past two decades has highlighted the relatively sharp stylistic differences between the United States and other industrial countries. On the whole, U.S. regulatory processes are more formal in soliciting and processing information, more inclusive in securing participation, more comprehensively documented, and more adversarial in handling disputes than those of most other nations. Thus, U.S. administrative law permits private parties to sue regulators for both substantive and procedural deficiencies in their decision making; agencies may therefore be sued for failure to build an adequate technical record or to take account of relevant scientific information. In other countries, litigation against regulators is at best infrequent and even then is limited to instances in which a right has been clearly violated. Lawsuits founded on alleged inadequacies in the government's technical analysis, such as the recent appeals court decision striking down proposed federal standards on airborne particulates (DC Cir., 1999), are virtually unheard of outside the United States. Disputes elsewhere are resolved more often behind closed doors than in the open forum of a courtroom. Correspondingly, the basis for policy decisions is far less readily available to the public at large than in the United States, where
the Freedom of Information Act and other laws create a strong presumption in favor of disclosure.1

1 A recent manifestation of the bias toward openness was the inclusion of a directive in the 1999 omnibus spending bill requiring the Office of Management and Budget to amend its rules for extramural research grants so that "all data" collected using federal research funds would be accessible under the Freedom of Information Act.

These generic differences in communications between state and society also intersect with the manner in which social actors express dissent from, or resistance to, official policy. Discontent with state action manifests itself most readily in the United States, where the courts provide a ready avenue for challenging policy decisions. Highly polarized conflicts are more likely to give rise to direct political action in Europe and Japan, as when Greenpeace occupied the Brent Spar oil platform off the British coast or when various environmental groups have blockaded construction sites, torn up fields planted with GMOs, and the like. In the United States, comparably sharp disagreements would far more probably end up in court. Direct actions, such as terrorist attacks by animal rights activists or by the Unabomber, are considered here the exceptions, not the norm. Public referenda, widely used in a number of smaller European states, are atypical in U.S. politics, although the state of California has been a notable outlier in this respect.

Acceptable Evidence

Differences in the framing and style of regulation go hand in hand with substantial differences in the kinds of evidence that governments and the public consider suitable as a basis for public decisions. Standards of proof and persuasion also differ across countries, along with preferences for particular methods of technical analysis.

Contrasts between the United States and major European countries again provide some of the most striking examples. Comparative researchers have noted the consistent U.S. preference for formal and quantitative analytic methods, whether in measuring risk, economic costs and benefits, or even the relatively intangible impacts of regulatory policy on social justice. U.S. environmental policies, for example, gave highest priority throughout the 1970s and 1980s to the risk of chemically induced cancer. During this time, significant energy went into the development of sophisticated analytic techniques designed to produce reliable quantitative estimates of risk and, eventually, the uncertainty surrounding such estimates (NRC, 1994). European countries facing presumably comparable problems avoided the use of formal quantitative techniques in favor of more qualitative appraisals based on the weight of the evidence (Jasanoff, 1986).
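As a concrete illustration of what such formal quantitative techniques involve, the sketch below shows how an estimate of excess risk, together with the uncertainty surrounding it, might be generated by Monte Carlo simulation. It is a minimal illustration with hypothetical parameter values, not a reconstruction of any agency's actual model.

```python
import numpy as np

# Minimal illustration of a probabilistic risk estimate; all parameter values
# are hypothetical and chosen only to show the mechanics of the calculation.
rng = np.random.default_rng(seed=0)
n_draws = 100_000

# Uncertain chronic daily intake of a substance (mg per kg body weight per day).
intake = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n_draws)

# Uncertain carcinogenic potency (risk per unit of daily intake).
potency = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=n_draws)

# Excess lifetime risk for each simulated combination of intake and potency.
excess_risk = intake * potency

print(f"median excess risk:  {np.median(excess_risk):.1e}")
print(f"95th percentile:     {np.percentile(excess_risk, 95):.1e}")
```

The output of such an exercise is a distribution rather than a single number, which is precisely the kind of explicit, defensible quantification that the U.S. policy environment has tended to demand and that more qualitative, weight-of-the-evidence appraisals do not produce.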

These differences in forms and standards of acceptable evidence appear to correlate well with the two terms of greatest legal significance—risk and precaution—that have helped to define preventive environmental policy making during the past two decades. Risk, as already noted, is the term heavily favored in U.S. legislation and public policy, whereas European nations have tended to attach greater consequence to the precautionary principle. These terms not only reflect subtly different notions about the purpose and scope of environmental protection, but they also entail different approaches to the public justification of environmental policy.

The concept of risk appears at first glance to render environmental problems more tractable precisely because it is probabilistic and measurable. The term was borrowed into the environmental domain from the financial sector, where it refers to a quantifiable probability of one or another adverse outcome. Risk is actuarial in spirit. One can (indeed, one often must) insure oneself against various kinds of risks for which actuarial data are available, such as fires, floods, earthquakes, catastrophic illnesses, or automobile accidents. When used in environmental decision making, risk retains the connotation of something that can be clearly defined and quantified, hence managed. It is also a relative concept, in that risks can always be offset against benefits, and risk-based laws often explicitly prescribe that the benefits of policy action (which are, in their turn, quantified) should outweigh the risks. Importantly as well, risks can be compared against one another (Graham and Wiener, 1995) so that policymakers can be meaningfully instructed to focus on large risks over small ones and to ignore altogether risks that are too tiny to matter.
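Read in this actuarial spirit, the concept lends itself to a compact formal statement. The expression below is a generic textbook formulation, offered only for illustration; it is not drawn from any particular statute or agency guideline cited here:

$$ R \;=\; \sum_i p_i\,L_i, \qquad \text{proceed with the action only if } B > R, $$

where $p_i$ is the probability of adverse outcome $i$, $L_i$ the magnitude of the associated loss, and $B$ the quantified benefit of the proposed action. Framing decisions in this way also makes risks commensurable, so that small expected losses can be ranked against, or traded off with, larger ones.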

Critics of risk-based policy have noted that the language of risk implicitly conceptualizes most human-environment interactions as harmless or even positively beneficial (Winner, 1986). Risk is thought to be the exception, not the rule, in human engagements with nature. It is something that one can guard against without upsetting underlying philosophies of development, consumption, or resource use. By comparison, the precautionary principle seems to display greater sensitivity to human ignorance and uncertainty. Historically, the term is a translation of the German Vorsorgeprinzip, one of five fundamental principles recognized in German law as constituting the basis for environmental policy. Migrating into the English language and into European policy, the term has inevitably lost precision, but some of its features are quite generally accepted. The principle states in brief that damage to the environment should be avoided in advance, implementing a duty of care on the part of policymakers. As with risk, the principle emphasizes prevention rather than cure. But precaution, as used in a wide variety of European policy statements, seems to urge something more than mere prevention. It demands heightened caution in the face of uncertainty, to the point of favoring inaction when the consequences of action are too unclear. Unlike risk, which both invites and lends itself to calculation, precaution implies a greater need for (uncalculated) judgment and, where necessary, restraint.

Precaution, to be sure, is never an absolute mandate in any regulatory system. Just as risks are balanced against benefits, so the precautionary principle is offset in practice by other moderating principles, such as the requirement that actions be proportional to the anticipated harm. Nonetheless, the very indeterminacy of the idea of precaution may have kept it from being translated into formal assessment methodologies, such as quantitative risk assessment or risk-benefit analysis. Put differently, it may be easier to work with a concept such as precaution in a cultural
environment that does not insist on mathematical demonstrations of the rationality of policy decisions. The preference for relatively formal and quantitative or relatively informal and judgmental techniques of decision making thus resonates with other important values in environmental policy making.

Forms of Expertise

Whether formal or informal, risk analytic frameworks incorporate tacit assumptions about how the world works; the use of analytic techniques, moreover, entails choices about who participates, and how, in processes of environmental decision making. Both the forms of relevant expertise and the rules of participation may differ substantially in national systems for dealing with the same regulatory problems.

One axis of divergence that has proved to be especially significant is whether experts are selected primarily on the basis of their technical qualifications (what they know) or as much on the basis of their institutional affiliations and experience (who they know, and in what context). On the whole, the U.S. policy process stresses experts' technical competence more than their institutional or political background. American regulatory statutes not infrequently specify, for instance, which types of technical expertise must be represented on advisory panels for federal agencies. In many U.S. policy frameworks, expert advisers are actively required to display their political independence and neutrality as a prerequisite for government service. Although most such bodies also have to meet requirements of breadth and inclusiveness, overt application of political criteria is deemed inappropriate in most cases (for examples and further discussion, see Jasanoff, 1990). Objective scientific expertise is generally valued more highly than other grounds for decision making, and attacks on the scientific competence of regulatory agencies are a standard device for undermining their political legitimacy.

By contrast, expert advisory bodies in other industrial nations are often more explicitly representative of particular interest groups and professional organizations. Tripartite arrangements, including government, industry, and labor, are especially commonplace; in newer regulatory frameworks, participation has sometimes been broadened to include representatives of social movements, such as environmentalists and consumers. Outside the United States, an expert body is thus thought to reflect in microcosm the segment of society that will be affected by its policy advice. Expert judgment is expected to be binding because the group as a whole is capable of speaking for the wider community it represents. Technical expertise, experience or tacit knowledge, and social identities are regarded as equivalently important qualifications for offering advice under these presuppositions.

Another important dimension of difference concerns the role of nonexpert opinions in decision making. Again, the U.S. policy system is most open to the inclusion of such viewpoints in the decision-making record. Formally, inclusiveness is assured through a process that offers interested parties, at a minimum, the chance to comment on the government's rationale for proposed
decisions. In many areas of social regulation, lay participation is secured through more formal means, such as administrative hearings that give nonexperts a chance both to present their own positions and to question those of technical experts representing government and industry. Ordinarily, entry into the U.S. policy process occurs at the initiative of parties who see themselves as stakeholders. By contrast, in most European countries, the right to be recognized as a stakeholder is neither automatic nor achieved through self-selection, but must be officially acknowledged through legislation or administrative practice. Entry accordingly tends to be limited to groups or actors who have established longstanding or politically salient working relations with governmental agencies.

Nature of Regulatory Standards

Standards play a crucially important role in any policy system that seeks to protect the public against technological risks. Standards come in many forms. They may be applied to industrial processes, pollutants, facilities, products, equipment or vehicles, or natural media such as air and water. Standards may be used to regulate the quality of an environmental medium; control harmful discharges, emissions, and residues; establish limits for human exposure to toxic substances; specify safe usage conditions for regulated products; or influence environmentally detrimental behaviors. In effectuating these goals, standards may directly address the design of a product or process (e.g., air bags in automobiles) or specify the performance level desired of a particular technology, leaving the means of compliance more flexible (e.g., emissions standards for power plants). They may be required by law (regulatory standards), recommended by guidelines, or voluntarily adopted by industries or private standard-setting organizations (consensus standards). They may be enforced through rigorous governmental monitoring and legal sanctions or through economic incentives or through relatively lax systems of self-regulation.

From this wide range of possible variation, national policy systems often seek out some characteristic approaches to standard setting. U.S. regulation, for example, has shown a preference since the early 1970s for nationally uniform standards that are enforced through the legal process. Penalties can be harsh, sometimes in the form of criminal sanctions for corporate executives. At the other extreme, British environmental standards were at one time locally and flexibly negotiated to suit the economic and technical capabilities of particular industrial concerns. More recently, this national preference has yielded somewhat in the face of European demands for greater uniformity and accountability across member states. With respect to enforcement, the European approach overall is less adversarial and legalistic than the American approach. Compliance tends to be achieved through bargaining and behind-the-scenes negotiation between business and government (other social actors generally play little role in enforcement) rather than through the Draconian processes of the law.

VARIETIES OF CULTURAL EXPLANATION

That differences such as those described above persist across similarly situated societies has presented a puzzle to economics and political science. Cross-national variations in risk perception and risk policy appear to contradict widely held assumptions of technical as well as economic rationality, both of which would predict greater convergence when states act upon similar information and need to balance similar trade-offs between the benefits and burdens of regulation. To explain patterned divergences in societal responses to risk, one has to supplement theories of rational choice with approaches that focus more centrally on the public interpretation of experience—in short, to supplement studies of reason and utility with studies of culture and meaning. In particular, one has to examine the role of institutions in stabilizing particular ways of dealing with uncertainty, conflict, expertise, and participation.

Going beyond currently dominant theories that cast states as rational actors, comparative studies of risk have given rise to three main theoretical frameworks for understanding cultural variations. The first is structural. This approach places primary emphasis on the role of political organization. It is presumed that the ways in which power is formally divided in society profoundly influence public perceptions of security and insecurity and also channel governmental action in specified directions. The second framework is functional. This approach regards all societies as encountering recurrent problems in the form of threats to their welfare or existence. Functionalist explanations therefore tend to see cross-cultural policy variations as by-products of differences in the perception, or framing, of social problems among different societies. The third framework is interpretive. This approach places primary emphasis on the need of societies to make sense and meaning of their collective experience, taking into account changes in knowledge and human capacity produced through science and technology. Interpretive social theorists—including specialists in science and technology studies—are particularly interested in the instruments of meaning creation in society, including most importantly various forms of language or discourse. Each framework illuminates some of the causes of cultural variation in risk perception and risk policy, as briefly described below.

The Role of Political Structure

The ways in which governmental power is institutionalized influence a society's handling of risk in more or less obvious directions. At the simplest level, agencies that are responsible for both the promotion and the regulation of technology tend to be more accepting of risk than those whose mandate is limited to regulation. This is why, in 1972, U.S. environmentalists successfully pressed to have pesticide regulation transferred from the Department of Agriculture, where agribusiness interests were considered dominant, to the newly formed and politically less committed Environmental Protection Agency. Similarly, the regulation of nuclear power was taken away from the old Atomic
Energy Commission and delegated to the more independent Nuclear Regulatory Commission. Failure to separate promotional and regulatory functions in this way arguably leads to laxer regulatory practices. For instance, Britain's Ministry of Agriculture, Fisheries and Food is widely thought to have underestimated the transmissibility of bovine spongiform encephalopathy (BSE or mad cow disease) because its primary goals were to help the beef industry and prevent public panic.

More generally, the institutional organization of power affects the ways in which nongovernmental actors fight for particular policy objectives. In parliamentary democracies, for example, electoral politics provides the primary avenue through which citizens can expect to influence government. The rise of Green parties in Europe illustrates this dynamic. Environmentalists have needed to muster seats in parliament in order to press their agendas, and their success rates have differed from one country to another. In the United States, by contrast, local groups such as NIMBY ("not in my backyard") organizations have largely taken the place of party politics. The readiness of American citizens to form single-issue associations is historically documented. This strategy is facilitated by a "political opportunity structure" in which power over many issues is decentralized and local initiative can express itself through a variety of mechanisms, such as lawsuits, local referenda, and elections. In countries with more hierarchical and closed decision-making processes, activist groups may be slower to form and there may be an appearance of greater trust in government. However, underlying such superficial political acquiescence there may be significant public alienation and distrust which can erupt into mass protest if the opportunity arises (for sociological accounts of such public attitudes in Europe, see Beck, 1992; Irwin and Wynne, 1996).

Structural divisions of power have been plausibly correlated with another aspect of national regulatory styles, namely, the degree to which decisionmakers rely on formal, objective, or quantitative justifications for their actions. In the relatively transparent and competitive U.S. policy environment, decisionmakers are apt to be vulnerable to charges of subjectivity and arbitrariness. Indeed, the federal Administrative Procedure Act authorizes courts to review agency decisions to ensure that they are not arbitrary or capricious. Given these pressures, it is not surprising that United States policymakers have opted over time for more explicit and formal analytic techniques than their counterparts in other advanced industrial states. Examples include quantitative risk assessment of chemical carcinogens, cost-benefit analysis of proposed projects, detailed economic analysis of regulatory impacts, and environmental equity analysis—all of which are more extensively used, and also debated, in the United States than in other liberal democracies.

For all their apparent power, structural explanations have some notable deficiencies. Because they take structures for granted, they are unable in principle to account for modification and change in institutional configurations of power. Some phenomena that have proved important in international risk debates but that elude structural analysis include the rise and transnational spread of social movements and epistemic communities (groups of actors sharing similar beliefs and values about a given issue area), the shifts from one
problem framing to another in national and international programs of risk management, and the differences in value commitments with respect to technological risk among similarly situated states and social actors. Other types of cultural explanation have proved more helpful for these purposes.

The Functionalist Approach

Functionalist approaches, as noted above, conceive of societies as having a range of large problems that continually need to be addressed and solved for the society's general well-being. Risk could be seen as one such problem. Unmanaged risk creates situations of extreme uncertainty for citizens and undermines confidence in ruling institutions. Social theorists have argued that the rise of modern regulatory states was in part an answer to the risks of widespread economic and social dislocations surrounding the industrial revolution. In particular, institutions such as the insane asylum and the workhouse and social analytic techniques such as statistics and demography are thought to be instruments developed by states in order to enable and maintain policies for social order (see, for instance, Foucault, 1979; Porter, 1986; Nowotny, 1990).

One of the best known attempts to understand cultural variations in the management of risk arises from a blending of anthropology and political science in work initiated by Douglas and Wildavsky (1982). Cultural theorists have noted that beliefs about nature and society are encountered in some commonly recurring clusters that appear to correlate with forms of social organization. Three dominant belief systems about environmental problems have been described most often in the literature: catastrophist or preventivist (nature is fragile); cornucopian or adaptivist (nature is robust); sustainable developmentalist (nature is robust within limits) (Cotgrove, 1982; Jamison et al., 1990; Rayner, 1991). The image of nature as cornucopian has been further differentiated into the idea that natural bounty is lottery-controlled cornucopian (nature is capricious) or else that it is freely available (nature is resilient) (Thompson et al., 1990). Cultural theory posits that these persistent forms of belief are not accidental but are connected to underlying features of social order.

To explain why human views of nature, and associated views about human nature, fall into certain broad patterns, cultural theory suggests that such beliefs grow out of a need to preserve important ordering elements in social relations. Douglas (1970), in particular, sees two cultural variables as fundamental: hierarchy within a community ("grid") and the firmness of its demarcation from other communities ("group"). For example, bureaucratic organizations (high grid and high group, in Douglas' terms) are most inclined to believe that nature, though not infinitely malleable, can be managed by means of appropriate, technically grounded, and formally legitimated rules. Such beliefs promote this culture's interest in protecting its boundaries against outsiders, as well as in preserving its clear internal hierarchy. In contrast, market or entrepreneurial cultures (low grid and low group) seem more likely to
subscribe to a cornucopian view of nature—that is, the capacity of nature to rebound from assaults without active human intervention. This belief is consistent with the culture's willingness to rearrange its membership and operating rules so as to make best use of changes in its environmental resources.

By reducing the complexity of human-nature interactions to a few fixed types, the categories of cultural theory run up against some significant theoretical difficulties. It is unclear, to begin with, whether so parsimonious a notion of culture can be applied in meaningful ways to complex organizations (firms, social movements), let alone to nation states. Moreover, institutions and their members appear in this framework to be inflexibly bound together in hard and fast belief systems. This rigid packaging contradicts the ambivalence and heterogeneity of response reported in the literature on risk perception and public understanding of science and technology.

Cultural theory also resembles structural approaches in its relative insensitivity to historical processes. A functionalist notion of culture tends to take the needs of particular cultural types for granted. A bureaucracy, for instance, is always looking to maintain its hierarchical integrity, just as entrepreneurs are always seeking to maximize their profits through new modes of resource exploitation. Such assumptions are not well suited to account for large-scale social and ideological movements, such as the shift in the Western world from a pollution-centered to a sustainable developmentalist philosophy of environmental management in the 1980s. Shifts within organizations are also puzzling from the standpoint of cultural theory. For example, why was there a "greening" of industry in the late twentieth century, and why did German environmentalists eventually drop their "just say no" stance toward biotechnology? Changes in scientific understanding could provide part of the answer in such cases, but science, technology, and expertise play a relatively passive or subordinate role in the cultural theory framework. Science is seen more as a resource to be controlled by the dominant cultural types than as a source of distinctive knowledge and persuasive power. Nonetheless, cultural theory valuably calls attention to the socially constructed character of beliefs about nature and to possible connections between longstanding social relations and the perception and management of risk.

Interpretive Approaches

Interpretive social theory focuses from the start on the place of ideas in social life. It asks how people make sense of what happens to them, how they distinguish between meaningful and meaningless events, and how they accommodate themselves to new information or experience. It regards culture as the lens through which people understand their condition. This approach is centrally concerned with the origins of and changes in belief systems, including the modern belief system called science, and with the factors that make certain beliefs either unquestionable (ideology) or else massively resistant to modification. Accordingly, interpretive work in the social sciences has focused on the resources with which societies construct their ideas, beliefs, and
interpretations of experience. These include aspects of social behavior that have not been widely examined in quantitative social sciences, such as language and visual representation.

An important contribution of this theoretical approach has been to show how formal systems of language and practice incorporate particular, often culturally specific, ways of looking at the world—in other words, how they help to frame both problems and solutions (see, for example, Bjork, 1992; Litfin, 1994). Quantitative risk assessment (QRA) of chemicals provides an especially instructive example for our purposes. As noted above, this analytic technique has been more extensively used in the United States than in other industrial countries. Its use, in turn, implies a number of prior assumptions about the nature of risk and uncertainty.

QRA builds not only on seemingly objective measurements of toxicity and exposure but also, less visibly, on underlying models of causality, agency, and uncertainty. It frames the world, so that users of the technique are systematically alerted to certain features of risk but desensitized to others. Causation for purposes of QRA, for example, is generally taken to be simple, linear, and mechanistic. Asbestos causes cancer and dioxin causes birth defects in animals, but perhaps not in humans. The classical model of cancer risk assessment used by most U.S. federal regulatory agencies conceives of risk as the result of individual or population exposure to single harmful substances. Over the years, this causal picture has grown in complexity. An older single-hit model of carcinogenesis has been replaced by one that views cancer as a multistage process. It is recognized as well that risk is distributed over populations of varying composition and susceptibility, exposed for variable lengths of time and by multiple pathways. Quantitative models have been redesigned to reflect these discoveries.
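The modeling shift described above can be written out explicitly. The dose-response forms below are standard textbook expressions, given here only for illustration rather than as the official models of any particular agency:

$$ P_{\text{one-hit}}(d) \;=\; 1 - e^{-q_1 d} \;\approx\; q_1 d \quad (\text{for } q_1 d \ll 1), $$
$$ P_{\text{multistage}}(d) \;=\; 1 - \exp\!\bigl[-(q_0 + q_1 d + q_2 d^2 + \cdots + q_k d^k)\bigr], $$

where $d$ is the lifetime average dose and the $q_i$ are nonnegative parameters fitted to bioassay data. In both forms the extra risk above background is dominated at low doses by the linear term $q_1 d$, which is why a single potency or slope factor can stand in for the whole model in regulatory practice.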

But a closer look reveals that some of the most up-to-date models of risk assessment still remain quite partial and selective in their treatment of causes. In focusing on particular substances, for example, QRA necessarily ignores others. Despite scientific arguments to the contrary, industrial chemicals are taken to be of greater public health concern than similar substances to which people are exposed by nature. QRA in this way treats causes as if they fall primarily on the artificial, or non-natural, side of human exposure to chemicals in the environment (Ames et al., 1987; Gold et al., 1992).

In other respects, QRA tends to simplify the world so as to dampen the overall estimate of risk. The impact of multiple exposure routes and possible synergistic effects, for example, is rarely captured. Behavioral patterns that may aggravate risk for particular subpopulations (a well-known example is smoking among asbestos workers) are similarly downplayed or disregarded. Socioeconomic factors that tend to concentrate risk from many sources for poor and minority populations were not normally considered in QRA until pressure to do so arose from the environmental justice movement.

QRA also incorporates tacit conceptions of agency. Implicit in this mode of analysis is the notion that risk originates in the inanimate world, even though it is known at some level that social behavior is part of the process that produces risk. By focusing on material agents as the primary sources of risk, QRA tends to
diminish the role of human agency and responsibility. An indirect consequence is that governmentally sponsored research on risk has centered around issues of concern to mathematical modelers rather than to social scientists more broadly. Yet organizational sociologists have observed for years the complex ways in which the physical and human elements of technological systems interact to produce risky conditions and disasters (see, for example, Perrow, 1984). Similar insights have emanated as well from the sociology of technology, which calls attention to the continual interplay of functions between animate and inanimate actors (Bijker et al., 1987).

The third set of assumptions embedded in QRA has to do with the nature of uncertainty and our perceptions of it. This method takes for granted that it is possible to encapsulate in objective and understandable forms the zones of uncertainty that regulators should be aware of when attempting to control risk. Comparative and historical work on risk has shown, however, that even when societies use quantitative analysis to further public policy, they differ in how they classify and measure natural phenomena, which techniques they label as objective or reliable, how they characterize uncertainty, and what resources they apply to its reduction (Porter, 1995). Far from being a neutral statement about the unknown, uncertainty about risk thus appears as the product of culturally situated forms of activity. It is a collectively endorsed recognition that there are things about our condition that we do not know; but such an admission is only possible because there are agreed-upon mechanisms for finding out more.

QRA users, and quantitative modelers more generally, will tend to think about causation, agency, and uncertainty in different terms from those who rely on qualitative approaches to risk assessment. In European decision-making environments, for example, the interconnectedness of social and natural causes may be more readily understood, provided that policy advisory bodies include a sufficiently diverse range of expertise. Uncertainty is managed by building trust in particular institutions rather than by expressing it more precisely through formal analytic techniques. Thus, British policy has historically relied on a tested cadre of public servants whose integrity and judgment are considered beyond doubt (Jasanoff, 1997). German policy, by contrast, depends to a large extent on agreements forged in consensual, politically representative expert bodies whose decisions are trusted because they reflect the full spectrum of relevant societal beliefs. Uncertainty within these contexts is most likely to manifest itself as a loss of trust in the experts or expert bodies responsible for making policy.

QRA for its part loses credibility when it openly ignores spheres of human experience that bear crucially on people's perception of risk (for case studies of such loss of trust, see Krimsky and Plough, 1988). These include the strength of family and work relationships, the robustness of communities, the special status of children, and the trustworthiness of major institutions. Failure to take account of such historical and cultural factors in risk determinations can induce alienation, distrust, and heightened risk perception in those who are unable to participate meaningfully in the preserves of objective technical expertise. These observations account for recent high-level recommendations in U.S. policy circles to interweave processes of technical analysis and political deliberation more closely in risk decision making (NRC, 1996; Presidential/Congressional Commission, 1997).

CONCLUSIONS

Differences in institutionalized divisions of power, culturally grounded perceptions of need, and formalized systems of analysis profoundly shape the ways in which technological risks are framed for purposes of policy making. Contradicting the expectations of rational choice and policy convergence, these factors produce divergences in the conceptualization and management of risk even among societies that are closely similar in their economic, social, and political aspirations. As risk debates are globalized, engaging vastly more disparate societies, one can only expect such divergences to harden and grow more numerous. Cultural differences are particularly likely to arise when a risk domain touches upon issues that are basic to a society's conceptions of itself, such as constitutional relations between science and the state or religious and philosophical ideas about what is "natural." What then are the implications for the future of a promising new technology, especially one such as biotechnology that impinges upon such a wide range of fundamental conceptual questions?

One source of optimism is the proliferation in recent decades of policy-harmonizing institutions in the international arena. Their existence, and the increasing scope and diversity of their mandates, testify to the desire of modern societies to progress toward a shared future of increased safety, health, material comfort, and psychological well-being. Yet in trying to meet these multiple demands, international harmonizing bodies risk falling victim to the so-called contradictions of postmodernity. Different cultural constructions of the "same" policy problem may make agreement difficult in spite of apparent similarities in national goals and aspirations. Even where consensus is reached, ambiguities may subsequently resurface in the process of implementation. An initial convergence among experts may not always be sufficient to reassure skeptical publics and ensure robust political acceptance.

The 1996 BSE scare in Europe provided a dramatic, though by no means isolated, example. The European Union's efforts to construct a unified, science-based standard to calm citizens confronting (ostensibly) the "same" risk of disease from the "same" agent were undercut by the discrepant perceptions of farmers, parents, food producers, government scientists, independent scientists, public health officials, agriculture ministers, politicians facing reelection, anti-European Britons, and the Brussels bureaucracy. Quantitative analysis proved inadequate for bridging these far-flung interests, as ministers wrestled week after week to agree on a single magic number: the number of cows that would have to be culled to render the beef supply adequately safe for all uses. Cartoons, black humor, and bizarre role reversals took the place of orderly policy making. Butchers in the markets of Europe appropriated the expert's reassuring role, with official-looking signs to back up their guarantees of "no British beef sold here." Ministers, having vainly turned to science for credibility, were forced to regain trust through personalized expressions of consumer confidence, such as, "Beef will still be served. Myself and my family will continue to eat beef" (U.K. Minister John Gummer, as quoted in the Independent, March 22, 1996, p. 5; see also Jasanoff, 1997).2

2. A new public crisis surrounding genetically modified foods that broke out in Britain in February and March 1999 echoed many of the same themes. Prime Minister Tony Blair appeared to have learned little from the BSE episode as he tried to reassure Britons by saying that he personally would be happy to consume genetically modified foods.

National policy institutions—shored up by history, tradition, established policy discourses, and well-understood standards of fairness and rationality—are able to persuade most of their publics most of the time that they can deliver fair and objective solutions to complex problems. International harmonizing bodies have few if any of these legitimating props at their disposal. As risks such as BSE assume global proportions, harmonizing institutions are likely to find it difficult to pass off as impartial expert judgment the political act of mediating among competing cultural framings of risk. Yet international regulatory institutions remain for the most part less transparent and less accessible to public input than their counterparts within many national governments.

Letting politics back into international policy processes may therefore be more productive in many cases than leaning exclusively on the supports of allegedly rational policy analysis. Mutual education seems the most promising route to eventual cross-national harmonization. If culture permeates the ways in which people cope with risk, then learning to understand each other's framing processes becomes a necessary prelude to collective action in the international arena. Exploring how culture matters in the politics of risk constitutes a modest first step in this direction.

REFERENCES

Almond, G.A., and S. Verba. 1963. The Civic Culture: Political Attitudes and Democracy in Five Nations. Princeton, N.J.: Princeton University Press.

Almond, G.A., and S. Verba, eds. 1989. The Civic Culture Revisited. Newbury Park, Calif.: Sage Publications.

Ames, B.N., R. Magaw, and L.S. Gold. 1987. Ranking Possible Carcinogenic Hazards. Science 236:271–280.


Beck, U. 1992. Risk Society: Towards a New Modernity. London: Sage Publications.

Bijker, W.E., T.P. Hughes, and T. Pinch, eds. 1987. The Social Construction of Technological Systems. Cambridge, Mass.: Massachusetts Institute of Technology Press.

Bjork, R. 1992. The Strategic Defense Initiative: Symbolic Containment of the Nuclear Threat. Albany: State University of New York Press.

Breyer, S. 1993. Breaking the Vicious Circle: Toward Effective Risk Regulation. Cambridge, Mass.: Harvard University Press.

Brickman, R., S. Jasanoff, and T. Ilgen. 1985. Controlling Chemicals: The Politics of Regulation in Europe and the U.S. Ithaca, N.Y.: Cornell University Press.


Cobb, R.W., and C.D. Elder. 1972. Participation in American Politics: The Dynamics of Agenda-Building. Baltimore, Md.: Johns Hopkins University Press.



Cotgrove, S. 1982. Catastrophe or Cornucopia: The Environment, Politics and the Future. Chichester, U.K.: Wiley.

DC Cir. 1999. American Trucking Associations, Inc., et al. v. United States Environmental Protection Agency, No. 97–1440.

Douglas, M. 1970. Natural Symbols: Explorations in Cosmology. London: Barrie and Rockliff.

Douglas, M., and A. Wildavsky. 1982. Risk and Culture. Berkeley: University of California Press.

Dryzek, J.S. 1990. Discursive Democracy: Politics, Policy, and Political Science. Cambridge, U.K.: Cambridge University Press.


Foucault, M. 1979. Discipline and Punish. New York: Vintage.


Gitlin, T. 1980. The Whole World Is Watching: Mass Media in the Making and Unmaking of the New Left. Berkeley: University of California Press, p. 6.

Gold, L.S., N.B. Manley, and B.N. Ames. 1992. Extrapolation of Carcinogenicity Between Species: Qualitative and Quantitative Factors. Risk Analysis 12.

Gottweis, H. 1998. Governing Molecules: The Discursive Politics of Genetic Engineering in Europe and the United States. Cambridge, Mass.: Massachusetts Institute of Technology Press.

Graham, J.D., and J.B. Wiener, eds. 1995. Risk versus Risk. Cambridge, Mass.: Harvard University Press.


Haas, P.M., R.O. Keohane, and M.A. Levy. 1993. Institutions for the Earth. Cambridge, Mass.: Massachusetts Institute of Technology Press.

Harrison, K., and G. Hoberg. 1994. Risk, Science, and Politics: Regulating Toxic Substances in Canada and the United States. Montreal: McGill-Queen's University Press.


Irwin, A., and B. Wynne, eds. 1996. Misunderstanding Science? Cambridge, U.K.: Cambridge University Press.


Jamison, A., R. Eyerman, and J. Cramer. 1990. The Making of the New Environmental Consciousness: A Comparative Study of the Environmental Movements in Sweden, Denmark, and the Netherlands. Edinburgh: Edinburgh University Press.

Jasanoff, S. 1986. Risk Management and Political Culture. New York: Russell Sage Foundation.

Jasanoff, S. 1990. The Fifth Branch: Science Advisers as Policymakers. Cambridge, Mass.: Harvard University Press.

Jasanoff, S. 1995. Product, process, or programme: three cultures and the regulation of biotechnology. In M. Bauer, ed., Resistance to New Technology. Cambridge, U.K.: Cambridge University Press.

Jasanoff, S. 1997. Civilization and madness: the great BSE scare of 1996. Public Understanding of Science 6:221–232.


Krimsky, S., and A. Plough. 1988. Environmental Hazards: Communicating Risks as a Social Process. Dover, Mass.: Auburn House.


Litfin, K.T. 1994. Ozone Discourses: Science and Politics in Global Environmental Cooperation. New York: Columbia University Press.


NRC (National Research Council). 1994. Science and Judgment in Risk Assessment. Washington, D.C.: National Academy Press.

NRC (National Research Council). 1996. Understanding Risk: Informing Decisions in a Democratic Society. Washington, D.C.: National Academy Press.

Nelkin, D., and M. Pollak. 1981. The Atom Besieged. Cambridge, Mass.: Massachusetts Institute of Technology Press.


Nowotny, H. 1990. Knowledge for certainty: poverty, welfare institutions and the institutionalization of social science. In P. Wagner, B. Wittrock, and R. Whitley, eds., Discourses on Society 15:23–41.

Perrow, C. 1984. Normal Accidents. New York: Basic Books.

Porter, T.M. 1986. The Rise of Statistical Thinking, 1820–1900. Princeton, N.J.: Princeton University Press.

Porter, T.M. 1995. Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton, N.J.: Princeton University Press.

Presidential/Congressional Commission on Risk Assessment and Risk Management. 1997. Framework for Environmental Health Risk Management. Washington, D.C.: Presidential/Congressional Commission.

Proctor, R. 1988. Racial Hygiene: Medicine under the Nazis. Cambridge, Mass.: Harvard University Press.

Putnam, R.D. 1979. Studying elite political culture: the case of ideology. American Political Science Review 65:651.

Putnam, R.D. 1993. Making Democracy Work: Civic Traditions in Modern Italy. Princeton, N.J.: Princeton University Press.


Rayner, S. 1991. A cultural perspective on the structure and implementation of global environmental agreements. Evaluation Review 15(1):75–102.


Schon, D.A., and M. Rein. 1994. Frame/Reflection. New York: Basic Books.


Thompson, M., R. Ellis, and A. Wildavsky. 1990. Cultural Theory. Boulder, Colo.: Westview Press.


Vogel, D. 1986. National Styles of Regulation. Ithaca, N.Y.: Cornell University Press.


Winner, L. 1986. On not hitting the tar-baby. Pp. 138–154 in The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago, Ill.: University of Chicago Press.
