B
Abstracts of Background Papers

THE IMPORTANCE OF INTERPRETATION

Mark Bevir


This briefing paper describes a broad consensus in current philosophy of social science and then considers the implications of this consensus for the ways one might think about data, knowledge, and policy making.

Since the late 20th century, philosophy has been dominated by meaning holism. Holists believe that the meaning of a sentence or belief depends on the wider language game or web of beliefs of which it is a part. This holism has given rise to comparative epistemology, constructivist ontology, and contextualizing historical explanations. Current philosophy thus supports a view of the social sciences as an attempt to interpret other people’s interpretations of the world.

Interpretive social science encourages certain views of data and knowledge. First, all kinds of techniques generate valid data, and ethnographic and historical studies are important supplements to other data. Second, models, frameworks, and correlations are reifications, so one should consider whether they need to be disaggregated. Third, correlations, models, and frameworks are just more data, not explanations, and—to explain such data—one has to tell stories.

An interpretive social science suggests lessons for policy makers. First, practitioners should take an eclectic approach to data and remember that all data are partial and provisional. Second, practitioners should remain aware of the diversity of beliefs and actions as well as the historical and cultural contexts that influence them. Finally, practitioners should consider multiple stories that reveal new aspects of situations.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




WHY MODELS DON'T FORECAST

Laura A. McNamara


The title of this paper, "Why Models Don't Forecast," has a deceptively simple answer: models don't forecast because people forecast. Yet this statement has significant implications for computational social modeling and simulation in national security decision making. Specifically, it points to the need for robust approaches to the problem of how people and organizations develop, deploy, and use computational modeling and simulation technologies.

I argue that the challenge of evaluating computational social modeling and simulation technologies extends far beyond verification and validation and includes the relationship between a simulation technology and the people and organizations using it. This challenge of evaluation is not just one of usability and usefulness for technologies but extends to the assessment of how new modeling and simulation technologies shape human and organizational judgment. The robust and systematic evaluation of organizational decision-making processes, and of the role of computational modeling and simulation technologies therein, is a critical problem for the organizations that promote, fund, develop, and seek to use computational social science tools, methods, and techniques in high-consequence decision making.


A PERSPECTIVE ON MODELING, DATA, AND KNOWLEDGE

Robert G. Sargent


This paper presents and discusses the problem-solving methodology used in operations research. The advantages of using this methodology include (1) the development of a problem statement, (2) the construction and use of a causal mathematical model based on system knowledge, and (3) data requirements determined from the steps of the methodology. Also discussed is how this methodology differs from the method of first collecting significant amounts of data and then attempting to develop models from those data.

Two major types of models, causal and empirical, are compared and discussed, including the strengths and weaknesses of each type. The paper also discusses why causal models are preferred; the importance of understanding that causal models contain system relationships whereas empirical models contain data relationships; and the different kinds of graphical and mathematical models for each model type. Different kinds of data and measurement scales for data are also described. System knowledge, needed for developing causal models, is discussed and depicted in a table containing different levels and types of system knowledge.

The modeling process and obstacles that may arise during it are described. The importance of validating models, model solutions, and model theories is stressed. Finally, the use of domain experts in problem solving is discussed, including why it is an important approach for solving social system problems.


THE DANGERS OF RUSHING TO DATA: CONSTRAINTS ON DATA TYPES AND TARGETS IN COMPUTATIONAL SOCIAL MODELING AND SIMULATION

Jessica Glicken Turnley


By the time most modeling projects address data, the project team has made significant decisions in the course of the project that determine the type of data they need and constrain which part of a comprehensive picture those data will provide. I argue that it is not possible to create, a priori with data, a comprehensive picture of some area of interest. A model is not all things and all relations in the target domain but a selection from them. That selection is made by the modeling team that constructs the model. By exercising this selection process, the team acts as a sort of prism, controlling which part of the target domain one sees and how one sees it. The model as artifact, once it is constructed, embodies this prism.

This gives great power to the people involved in the modeling process. I have parsed that process into different social roles, each of which contributes differently: the questioner, who poses the question that initiates the process and establishes the model's purpose; the user, who exercises the model in a particular sociotechnical environment; the disciplinary or theoretical expert, who identifies the elements to include in the model and the relationships among them; the data provider; and the model builder, who captures relevant theory and data in the chosen medium.

A model is much more than an artifact or a bucket into which data can be dumped. It is a process of creating a particular way of looking at the world. It is like Karl Weick's sense making, a process that "structures the unknown," using theory to choose elements of the target domain that are relevant to a particular problem. Rushing too quickly to the data question is likely to lead the team to the dangerous and impossible request to collect everything, or to collect the wrong things. And finally, by definition, no model will provide a comprehensive picture of anything. In fact, the creative power of models may actually cause people to revise the picture through the very act of constructing the analytic tool.
