Index

A

Army Materiel Systems Analysis Activity (AMSAA), 11-12, 24

Automatic Efficient Test Generator, 50

Availability, 26, 62

B

“Bad actors,” 4, 8, 72

Bayesian statistics and analysis, 15, 18, 29, 34, 35-36, 39-41, 46-47, 62-63, 69, 75, 76

Beckett, Allen, 68, 86

Birnbaum-Saunders model, 36, 56-57

Blischke, Wallace, 62-63, 68-69, 86

Booker, Jane, 13, 27-29, 84

C

Camm, Frank, 65-69, 72, 86

Composite materials, 57

Crouch, James, 24, 85

Crow, Larry, 11-12, 13, 30-33, 47, 84

D

Dalal, Siddhartha, 36, 48-53, 55, 86

Data archive, 19-21, 23, 27

Databases

archival data, 19-21, 23, 27

field performance data, 21, 23

Developmental testing, 4-5, 22, 23, 26, 30-33, 37, 43-44, 61

archival data, 20

operational information combined with, 2, 3-5 (passim), 7-8, 26, 33-34, 37-41, 45-47, 74-75

Differential equations, 15

Duane model, 11-12

E

Easterling, Robert, 68, 74, 86

Ellner, Paul, 23-24, 84

Estimation techniques, general, 3, 10, 13, 37-40, 46, 47, 71, 73, 78

Bayesian statistics and analysis, 15, 18, 29, 34, 35-36, 39-41, 46-47, 62-63, 69, 75, 76



nonparametric methods, 5, 6, 15, 36, 41-47, 71

stochastic processes, 43, 44, 56-58

Experimental design, 1, 9, 35-36, 38

F

Failure modeling, 8-9

see also Fatigue modeling

“bad actors,” 4, 8

components, 4, 8, 16, 17, 18, 20, 21, 22, 25

Integrated Reliability Growth Strategy, 30-31

PREDICT, 28-29, 65

historical perspectives, 11-12

physics of failure, 4, 8-9, 22, 58, 70, 71-73

reliability growth modeling, 11-34 (passim)

reliability modeling, other, 4, 8, 40, 42-46 (passim), 57-60

Failure Reporting, Analysis, and Corrective Action System, 31

Fatigue modeling, 2, 4, 5, 8

reliability growth models, 25

reliability modeling, other, 2, 5, 8, 35, 36, 56-60, 71, 72

Fault trees, 29, 31, 62

Feedback loops, 4, 25

Ferguson, Jack, 55, 75, 86

Field performance data, 2, 4, 5, 13, 20-23, 24, 25, 26, 33, 37

data archive, 20

databases, 21, 23

cost-effectiveness, 20, 21, 22, 26

training exercises, data from, 37, 38-39

Fries, Arthur, 32-33, 84

G

Gaussian distribution, 56-57

Gaver, Donald, 12-13, 16-18, 23, 24, 84

General Dynamics Advanced Technology, 30

Goodness of fit, 14, 42

H

Handbooks, 3, 6, 9, 10, 71

RAM Primer, 3, 5-6, 8, 9, 75-77

Hollis, Walter, 32, 84

Huller, Jerry, 50

I

Integrated Reliability Growth Strategy, 8, 30-32, 72

Internet, RAM Primer, 5

K

Kaplan-Meier estimates, 15

L

Learning curves, 14

Life-cycle factors, 6, 25, 32, 56

costs, 2, 4, 26, 63-69

warranty costs, 20, 21, 22, 23, 25, 61, 62, 63-64, 65, 67-68

Linear models, 11, 14, 55, 59, 74

Bayesian, 40, 41

Los Alamos National Laboratory, 28, 29

M

Maintainability, 4, 26

Maintenance, 12, 25

alternative models, 6-7

Management factors, 4, 9, 75, 78

Markov chain techniques, 48, 53-54

Material fatigue, see Failure modeling; Fatigue modeling

Meeker, William, 13, 19-23, 24, 72, 85

Modeling and simulation, general, 1, 2, 3, 4, 9

see also Reliability; Reliability growth modeling

goodness of fit, 14, 42

reliability growth, 2, 3, 5

Multivariate distributions, 73, 74

Myers, Fred, 45-46, 85

N

National Missile Defense System, 32

Nicholas, Theodore, 59, 85

Nonparametric methods, 5, 6, 15, 36, 41-47, 71

O

Olwell, David, 75

Operational testing and evaluation, 4-5, 6, 27, 77-78

archival data, 20

developmental test information combined with, 2, 3-5 (passim), 7-8, 26, 33-34, 37-41, 45-47, 74-75

P

Padgett, Joe, 36, 57-60, 85

Pareto charts, 24-25

Patriot Missile System, 32

Performance and Reliability Evaluation with Diverse Information Combination and Tracking (PREDICT), 28-29, 32-33, 65, 72

Physics-of-failure modeling, 4, 8-9, 22, 58, 70, 71-73, 75-76

Integrated Reliability Growth Strategy, 8, 30-32, 72

Poisson processes, 18

Pollock, Steve, 74-75, 86

Poore, Jesse, 36, 53-55, 86

Power law process model, 14-15

R

RAM Primer, 3, 5-6, 75-77

physics-of-failure modeling, 8, 75-76

Regression analysis, 11, 19, 55

Reliability, v, 2-4, 6-7, 8, 35-69

estimation and testing, 3

failure modeling, 4, 8, 40, 42-46 (passim), 57-60

field performance data, 2

operational testing and evaluation, 1, 2, 7-8, 35-36, 37-41, 45-47, 53, 69

software development and testing, 35, 36, 47-55, 73-74

Reliability growth modeling, 2, 3, 5, 7, 9, 10-34, 76-77

cost factors, 16-17, 20-22 (passim), 26, 33

expert judgment, 10, 14, 18, 26, 27, 34

historical perspectives, 10-12, 13, 25-26

operational testing and evaluation, 1, 2, 7-8, 14, 17, 22, 23, 24, 25-26, 27, 33, 37

software development and testing, 5, 18-19, 50, 55, 71, 73

Risk analysis, 45, 61

Bayesian, 15, 18, 29, 34, 35-36, 39-41, 46-47, 62-63, 69, 75, 76

S

Samaniego, Francisco, viii, 36, 38, 40, 41-47, 75, 83, 84-85, 86

Saunders, Sam, 36, 56-57, 59-60, 85

Scholz, Fritz, 13, 18-19, 24, 85

Seglie, Ernest, 46-47, 77-78, 85, 86

Sen, Ananda, 12, 13-16, 23, 24, 84

Sensor technology, field performance data via, 23

Software development and testing, 5, 71

cost factors, 50, 55, 73

Statistics, Testing, and Defense Acquisition: New Approaches and Methodological Improvements, v

Steffey, Duane, 36, 38-41, 45-47, 84

Step-intensity model, 15

Stochastic processes, 43, 44, 56-58

Streilein, Jim, 47

Surveillance testing, 68-69

Survival analysis, 15

Systems development, 12, 25-26, 27-28, 33-34, 37, 77

T

Test, analyze, and fix episodes, 13, 14-15

Test and Evaluation of System Reliability, Availability and Maintainability: A Primer, see RAM Primer

Theater High-altitude Air Defense System, 32

Tortorella, Michael, 61, 86

Training exercises, data from, 37, 38-39

U

Uncertainty analysis, 5, 6, 28, 41, 62, 64, 70, 78

W

Warranty costs, 20, 21, 22, 23, 25, 61, 62, 63-64, 65, 67-68

Weibull model, 12, 42

Willard, Dan, 25

Williams, Marion, 74-75, 86

World Wide Web, see Internet