
## Survey Automation: Report and Workshop Proceedings (2003) Committee on National Statistics (CNSTAT)


"Dealing with Complexity: Broadening the Concept of Documentation." Survey Automation: Report and Workshop Proceedings. Washington, DC: The National Academies Press, 2003.

 Page 23


are elicited. In comparison, McCabe’s presentation is an invitation to view documentation more broadly, applying mathematical concepts of complexity in survey work, in ways that could ultimately become very important tools throughout the survey design process.

A first step in assessing complexity is to visualize computer code as a directed graph, or flowgraph, in which statements, decision points, and other pieces of code are designated as nodes and the links between them as edges. (This code-level graph is a much more detailed and code-dependent representation of a software system's functioning than the graphical model that underlies model-based testing, described earlier.) Once the code has been parsed into a graph, the flowgraph can be plotted, and the resulting picture can be extremely useful for seeing directly whether the logic underlying the code is clear or convoluted. More important, as McCabe has done, tools from mathematical graph theory can be applied to calculate natural metrics that objectively measure the complexity of the code.
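The flowgraph idea can be made concrete with a minimal sketch: a toy questionnaire fragment represented as an adjacency dictionary, with nodes for statements and decision points and edges for possible transfers of control. The fragment and node names below are illustrative inventions, not drawn from any actual survey instrument.

```python
# A minimal sketch of a flowgraph for a toy questionnaire fragment:
#   1: ask question A
#   2: if answer == "yes" ask question B, else skip to end
# Each statement or decision point is a node; each possible transfer
# of control between them is a directed edge.
flowgraph = {
    "ask_A":  ["decide"],
    "decide": ["ask_B", "end"],   # decision node: two outgoing edges
    "ask_B":  ["end"],
    "end":    [],
}

num_nodes = len(flowgraph)
num_edges = sum(len(targets) for targets in flowgraph.values())
print(num_nodes, num_edges)  # 4 4
```

Even this tiny example shows the structure the metrics operate on: the single decision node is what introduces branching, and hence complexity, into the graph.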

At the workshop, McCabe focused on two such metrics. The first, cyclomatic complexity (v), measures the logical complexity of a module (or a system). It can be computed by several methods, as illustrated in the proceedings, and is defined as the number of basis paths in the graph. The set of basis paths is the minimum set of paths through the graph that, taken in combination, can generate the complete set of individual paths. Accordingly, cyclomatic complexity is a benchmark of the inherent complexity of the graph; it indicates the minimum number of separate tests needed to cover all edges of the graph (that is, all transitions in the software).

Essential complexity (ev) is a second metric; it measures the degree of unstructuredness of the software. Roughly speaking, essential complexity simplifies the flowgraph by fusing nodes that represent primitive (and well-defined) logical structures; for instance, the collection of nodes that defines a loop executing while a certain condition holds may be replaced by a single node. Essential complexity is then calculated as the cyclomatic complexity of the resulting reduced graph. The quantity ev is bounded below by 1 (perfectly structured, in that the code can be cleanly resolved into well-defined logical modules) and above by v (perfectly unstructured, in that no simplification is possible).
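The idea behind the reduction can be illustrated with a hand-worked example. In the sketch below, a simple while-loop (test node plus body node with a back-edge) is fused into a single node, and the cyclomatic complexity of the reduced graph gives ev. The reduction is performed by hand for this one case; actual tools detect structured constructs automatically, so this is an illustration of the concept, not an implementation of McCabe's reduction algorithm.

```python
def cyclomatic_complexity(graph, components=1):
    """v = E - N + 2P for a flowgraph given as an adjacency dict."""
    n = len(graph)
    e = sum(len(targets) for targets in graph.values())
    return e - n + 2 * components

# Original graph: a while-loop (test <-> body) followed by an exit.
original = {
    "entry": ["test"],
    "test":  ["body", "exit"],  # loop condition
    "body":  ["test"],          # back-edge of the loop
    "exit":  [],
}

# Reduced graph: the whole structured loop fused into one node.
reduced = {
    "entry": ["loop"],
    "loop":  ["exit"],
    "exit":  [],
}

v  = cyclomatic_complexity(original)  # 2
ev = cyclomatic_complexity(reduced)   # 1
print(v, ev)
```

Because the loop is a primitive, well-structured construct, the whole graph collapses to a straight line and ev reaches its lower bound of 1; unstructured jumps into or out of the loop would block the reduction and leave ev closer to v.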

McCabe has developed many other metrics that could not be described at the workshop due to time constraints, among them a measure of the degree to which statements within modules make use of external data from other modules (thus indicating the success of the modularity of software design). Implemented for CAPI instruments, this metric would indicate how often edits within a given module make use of information collected in other modules.
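A rough illustration of what such a cross-module data-use count might look like for a CAPI instrument — this is a simplified invention for exposition, not McCabe's actual metric definition, and all module and variable names are hypothetical:

```python
# Hypothetical ownership of collected variables by instrument module.
module_owns = {
    "household":  {"hh_size", "hh_income"},
    "employment": {"job_status", "hours_worked"},
}

# Variables that each edit in the "employment" module reads.
employment_edits = [
    {"job_status", "hours_worked"},          # internal data only
    {"hours_worked", "hh_size"},             # reads household data
    {"job_status", "hh_income", "hh_size"},  # reads household data
]

# Count references to data owned by other modules.
external = module_owns["household"]
cross_module_refs = sum(len(edit & external) for edit in employment_edits)
print(cross_module_refs)  # 3
```

A low count would suggest the modular design has succeeded in keeping each module's edits self-contained; a high count would flag modules whose testing and documentation cannot be done in isolation.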
