remove redundancy. Our review indicates that this evolution process needs to continue.
The new design introduced with the web-based collection has increased the amount of data sought, adding questions such as the identification of new construction in the previous 2 years, with a project worksheet to be completed for each individual project. Some of the concepts are new and possibly vague. For example, the new questions on computer technology and cyberinfrastructure introduce collection challenges, given the wide variety of institutional practices for computer and software procurement and inventory. The panel recommends that NSF continue to conduct a response analysis survey to determine the baseline quality of these new and difficult items on computer technology and cyberinfrastructure, study nonresponse patterns, and commit to a sustained program of research and development on these conceptual matters (Recommendation 6.8).
Even with these burdensome data inquiries, the overall response rate in 2001 was about 90 percent for academic institutions and 88 percent for biomedical institutions. This is a testament to the institutional contact program and the determination of the data collectors. The difference in response rates between public and private institutions is of concern: the lower rates for private institutions may reflect institutional traditions and may produce larger errors in their estimates. Some of these issues may be resolved in the current collection of data for 2003, when NSF will publish data by institution, with only a few sensitive data items suppressed. The data should become more useful as benchmarks, and this procedure should also drive up institutional participation.
The survey employs web-based procedures that require all data items to be completed before the form can be submitted. This may force respondents to enter doubtful data or to impute answers that are not obtainable from organizational records.
Imputation procedures for paper-based responses vary depending on whether the institution previously reported. If it did, prior responses are used in the imputation procedure; if not, other methods are employed. The exact procedure NSF uses for imputation is not well documented, but it appears that imputation is used for unit nonresponse, a practice that is highly unusual in surveys. In most surveys, unit nonresponse is handled by weighting, as it was in this survey in 1999. At a minimum, NSF is urged to compare the results of imputation and weighting procedures (Recommendation 6.9).
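The contrast between the two approaches to unit nonresponse can be sketched as follows. This is a minimal, hypothetical illustration, not NSF's actual procedure: the institution names, values, and the choice of prior-year carry-forward as the imputation method are all assumptions made for the example.

```python
# Hypothetical illustration of two treatments of unit nonresponse.
# Values are invented; "prior-year carry-forward" is an assumed
# imputation rule, not the documented NSF method.

# Current-year reported values for responding institutions
respondents = {"A": 100.0, "B": 120.0, "C": 80.0}

# Prior-year values available for all institutions
prior = {"A": 90.0, "B": 115.0, "C": 85.0, "D": 70.0, "E": 60.0}

# Institutions that did not respond this year
nonrespondents = ["D", "E"]

# Imputation approach: fill in each nonrespondent with a value
# derived from its prior report (here, simple carry-forward).
imputed_total = sum(respondents.values()) + sum(
    prior[u] for u in nonrespondents
)

# Weighting approach: inflate each respondent's value by the
# inverse of the response rate instead of filling in nonrespondents.
n_total = len(respondents) + len(nonrespondents)
weight = n_total / len(respondents)
weighted_total = weight * sum(respondents.values())

print(f"Imputed total:  {imputed_total}")   # 300 + 130 = 430.0
print(f"Weighted total: {weighted_total}")  # (5/3) * 300 = 500.0
```

The two estimates diverge whenever nonrespondents differ systematically from respondents on the measured item, which is precisely why a side-by-side comparison of the kind urged in Recommendation 6.9 is informative.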