
2
Overview of OMB Guidelines and Agency Concerns

The first workshop began with a keynote address by Dr. John D. Graham, administrator of the Office of Information and Regulatory Affairs at OMB, the office responsible for developing the government-wide Data Quality Guidelines.

THE OMB PERSPECTIVE

Dr. Graham said that federal agencies have disseminated information for decades, but usually in the form of paper documents. The Internet has increased the volume of information disseminated, he said, raising the difficulty of ensuring high quality. As discussed in the Introduction, the increase in information was also accompanied by more challenges to rules based on the information and requests to examine the “raw” data on which rules were based. The Data Quality Act of 2001 was an attempt to meet these challenges. While the original bill called for government-wide rules, the OMB insisted on guidelines instead.1

Dr. Graham noted, “There is plenty of evidence that the quality of information advanced for use by government decision makers needs to be improved. In the scholarly literature in the field of what is sometimes called science policy there are entire books of case studies demonstrating technical problems with the information collected, used and published and released by the federal agencies.” Dr. Graham cited recent studies by the National Institutes of Health and the Environmental Protection Agency in which results had been fabricated, misread, or poorly analyzed.

1. Guidelines are non-binding norms. Rules are developed under the Administrative Procedure Act and require an agency to provide notice and invite public comment. Ordinarily, rules are binding on both the agency and the public.

Dr. Graham stated that the Bush Administration is “committed to vigorous implementation of the new information quality law” and that the Administration believes it “provides an excellent opportunity to enhance both the competence and accountability of government.” To fulfill this opportunity, the guidelines “imposed three co-responsibilities upon all federal agencies”:

  1. Agencies must commit to a basic standard of quality for the information they disseminate.

  2. Agencies must develop information management procedures to prevent dissemination of poor-quality data, with peer review playing an important role.

  3. Agencies must have an administrative mechanism that allows “affected parties” to request corrections of information. The burden of proof, Dr. Graham noted, is on the requester to demonstrate that the information fails to meet OMB or agency guidelines. If the request is denied, there must be an appeals process.2

Dr. Graham acknowledged that a number of concerns had been raised about the guidelines:

  1. The guidelines subject government information to a higher standard than information generated by industry, academics, and public interest groups. Dr. Graham noted that a closer reading of the guidelines would suggest a more “nuanced” conclusion. “If a government agency wishes to rely upon and cite information from industry to support a decision, that information, because it becomes a dissemination, must meet the same quality standard that information generated by the agency must meet.”

  2. The guidelines are unfunded mandates on agencies. It is true that agencies will need to spend time responding to requests, Dr. Graham said, but the guidelines allow them to reject complaints that are groundless. He also estimated that agencies would probably save money in the long run.

  3. Original data may not be available. Dr. Graham said the OMB was “reluctant” to require that all original data be reproducible; instead, the guidelines require that analytical results (i.e., those derived from original data) be reproduced. “Show me what numbers or assumptions you have used,” he said, “and how they add up to the number you say they add up to.”

2. Graham cited this responsibility as “perhaps the key provision.”

  4. Agencies may be reluctant to acknowledge that “affected parties” are truly affected. He said that unless there was an objective appeals process inside agencies, “I predict there will be efforts down the road to make a mechanism that works from the outside.”3 He also said, however, that “the burden of proof is squarely on the affected parties. They must demonstrate that a specific dissemination does not meet the quality standards in the OMB guidelines or the agency-specific guidelines.”

Dr. Graham said that OMB’s focus would be on the design and implementation of agency procedures rather than on mediating disputes. He said he hoped the courts would refrain from intervening, but that it would “probably take some court decisions to know how they will be interpreted…” Acknowledging the challenge the guidelines present to the agencies, Graham concluded his remarks by saying “Our shared objective is an improvement in the quality of the information that the federal government disseminates to the public.”

DISCUSSION

A brief discussion period followed Dr. Graham’s presentation, with Mike MacCracken of the Office of the U.S. Global Change Research Program asking how projections made for many years in the future might comply with data quality guidelines. Dr. Graham said that agencies making projections or risk assessments would be asked to demonstrate their models and make them transparent enough to allow others to repeat the calculations.

Kevin L. Bromberg of the Small Business Administration asked if an agency that relied on “third-party data”—from outside the agency—would have to provide the underlying data. Dr. Graham said that if an agency disseminates information in an official way, “then they do have a responsibility to assure that that information meets relevant quality standards in the agency guidelines and the OMB guidelines.” Whether it was possible to obtain the original data would be decided in the same way as for information disseminated by an agency, noted Dr. Graham.

3. Some participants noted that “affected parties” are not defined in the guidelines, creating a potential source of confusion. See, for example, question 4 posed by Frederick Anderson under Administrative Correction and Appeals, p. 20.


KEY GUIDELINE COMPONENTS AND CONCEPTS

Alan Morrison, a member of the Public Citizen Litigation Group and currently a visiting professor at Stanford University Law School, continued the introductory session by offering a synopsis of the basic components and concepts of the OMB guidelines.

In his opening remarks, Professor Morrison noted “…the message seems to be pretty clear that information is power in itself and it is important in and of itself quite apart from its use in the regulatory context … that government information is powerful because it has been disseminated by the government and people do and do not do things based upon what government information, as information, says…” Morrison also noted that “perhaps the most poignant example of the power of government information is the recent information about the value of mammography. Millions of women in the United States acted based on that [information]…”

What Kinds of Information and Data Are Covered Under the Guidelines?

Under the guidelines government information is “broadly defined” and under the statute “it is required to be of high quality.” The OMB guidelines cover many types of information, said Professor Morrison, and all kinds of formats and media. The guidelines pay special attention to factual information, specifically “influential” scientific, financial, and statistical information. One significant exclusion is for “opinion information.” However, noted Professor Morrison, “I warn everyone to be careful about this exclusion. It would be in my judgment improper for an agency to say, ‘Well, this is all our opinion, and therefore we don’t have to pay any attention to the statute and guidelines.’” An agency should not assume it can “disseminate opinion after opinion without paying attention to the statute.” He cited the analogy of libel law and its attendant difficulties. Libel law also attempts to draw a distinction between fact and opinion, but such distinctions are often questioned or hard to discern.

Professor Morrison noted a “non-inclusion” in the guidelines for press releases. However, he cautioned agencies to be careful because some press releases may be “chock-full of data” or include fact sheets, leaving the possibility for confusion. Morrison cautioned the agencies that “this is not a matter of semantics. It is not a matter of labels. The statute has a purpose and it requires that you be realistic in your assessment of what is in and what is outside of the guidelines.”


What Is Dissemination?

Professor Morrison said that the OMB guidelines apply not to the collection or maintenance of data but to its “dissemination by federal agencies.” He defined dissemination as public distribution or sharing of information by an agency—by printed, electronic, or other means. The OMB guidelines take special note of the Internet, which both “enables agencies to communicate information quickly and easily to a wide audience” and also “increases the potential harm that can result from the dissemination of information that does not meet basic information quality guidelines.”

Many uses of information are not considered dissemination if they are intended for a limited audience or a specific setting. Professor Morrison suggested the following examples of information use that are not acts of dissemination: a response to a FOIA request; a response to a letter; and information provided during adjudication. However, he noted, once an agency posts information on its Web site, it is disseminated—even if its initial intended audience was limited.

He also said that information produced by grantees or employees of an agency does not become “of the agency” unless it is adopted and disseminated by the agency. He suggested that publications or Web postings resulting from research supported by a federal agency should contain a disclaimer to the effect that the views are those of the individual, not the agency.

Professor Morrison clarified that agencies must not wait until the time of dissemination to ensure data quality. High quality should always be a priority because an agency cannot know when certain data may be used in regulation or rule making—even long after the work is done.

He noted that the OMB guidelines would take effect on October 1, 2002. Any information disseminated after that date is covered by the guidelines. Information disseminated prior to that time does not have to be reviewed under the guidelines. However, if older information is challenged, the agency may bear the burden of reconsidering that information in light of the guidelines.

What Is “Influential” Information?

The Data Quality Act suggests that there are two types of data: (1) data that are “important” (an OMB term) and/or “influential” (an Act term), and (2) all other data or information. The Act says that a higher standard of quality applies to influential information. As defined by OMB, “influential scientific, financial, or statistical information” is of such high importance “that the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions.” This definition contains several changes from earlier versions; “clear and substantial,” for example, was added “to reduce the need for speculation on the part of agencies.” “Financial” information also has been added as an example of potentially influential information.

Professor Morrison made five key points about influential information:

  1. It is not necessary to put a label on all disseminated information. “You don’t have to say that this is influential or this is not influential.” Agencies may want to make an internal determination about what is influential, but the difference between what is or is not influential is only of importance under the guidelines if someone complains about the information.

  2. It is important to focus the question of whether something is influential on the information itself. “That is, is the information influential, not is the ultimate decision on which the information is based in part going to be influential.”

  3. The “clear message of the guidelines is that most information that agencies disseminate is not influential information.”

  4. The key aspect of data being categorized as influential information “is that it must be reproducible.” This doesn’t mean that the agency must actually reproduce the information in order to disseminate it, but rather it means that the information is “capable of being reproduced.”

  5. The question of whether information is influential may change from the time the agency originally disseminates it until the time the agency actually uses it.

What Is the Complaint Mechanism?

Professor Morrison noted that while the public already has the right under the Administrative Procedure Act to complain about the quality of agency data, the OMB guidelines strengthen that right by requiring that agencies respond to the complaint. “The agency has to respond, and not only does it have to respond, but at the end of the year it has to send OMB a report explaining what kinds of complaints it received and what kind of responses it gave.”

When receiving a complaint, the agency has to decide whether to have it evaluated by someone with a completely unbiased view of the information, who had no responsibility for its preparation or dissemination, or by someone intimately familiar with the information. If the agency’s response to the complaint is not satisfactory, the agency must provide the filer with an opportunity to appeal. Professor Morrison noted that “the choice of the appeal process OMB makes quite clear is up to the agency and the agencies may well want to have a multi-person appeal process to be able to bring in both some objectivity and some knowledge and some more general kind of expertise in information dissemination.”

DISCUSSION

Following Professor Morrison’s overview, a number of questions were raised. Harold Halpern of the Department of Energy asked if agency advisory committee reports were covered by the guidelines. Professor Morrison responded, “My own view is that the advisory committee… would be like outside researchers … It is an outside submission. Indeed the whole purpose is to get outside advice and my view is that the agency is not responsible because after all it doesn’t control it.” Professor Morrison noted, however, that if the agency takes the results of the report and “then issues a regulation or approves a product … and disseminates [the report] and says that this is the basis on which we are acting, it has in my judgment adopted it as its own and subject to some obvious practical limitations has got to make reasonable assurances that it is accurate.”

Ray McAllister, CropLife America, asked about information released to an individual under FOIA that is then posted on an Internet site. Professor Morrison noted that while the information may have influence, “I do not believe that the statute puts the burden on the agency to worry about what somebody else may do with the information.”

DETERMINING THE THRESHOLD: INFLUENTIAL SCIENTIFIC, STATISTICAL, AND FINANCIAL INFORMATION

Richard Merrill, Professor of Law at the University of Virginia, moderated a panel discussion on agency approaches to determining influential information. The panel comprised Nancy Kirkendall, Energy Information Administration of DOE; Steven Galson, FDA; and Fred Siskind, DOL. Prior to the agency representatives’ discussion, Richard J. Pierce of The George Washington University Law School provided his perspective on how best to categorize influential data.

Mr. Pierce indicated that he found the definition of influential information4 to be unclear, stating that of 100 different types of information an agency disseminates, he could probably identify 2 or 3 types that are definitely influential and 2 or 3 that are definitely not, but would be hard pressed to know whether the other 94-96 types were influential; they could probably be argued both ways. Mr. Pierce stated that this should be viewed as a “very important opportunity.” “The malleability of the definition leaves you a tremendous amount of discretion to figure out what you want to call influential information and I would urge you to think very, very carefully about what you want to call influential…” The reason, he said, is that in designating an action as influential, agencies might “create a legal regime” in which each dissemination of that type of information becomes immediately reviewable as final agency action. Mr. Pierce closed by urging agencies to identify only a very few things as influential.

4. “Scientific, financial, or statistical information the dissemination of which will have or does have a clear and substantial impact on important public policies or important private sector decisions,” as stated in the OMB guidelines.

During the agency presentations, Nancy Kirkendall of the Energy Information Administration of the Department of Energy said that her agency focuses on “high-quality, policy-relevant information to support public and private decisions.” Most of this work is statistical, she said, and virtually all of it conforms to high standards of transparency and reproducibility. Thus it has, by its nature, already met the OMB’s criteria for influential information.

Dr. Kirkendall agreed that if data are influential, “you need to have a high degree of transparency.” She said that the statistical information her agency produces is “already transparent” because of procedures worked out over the years.

Dr. Kirkendall noted that for statistical agencies, good practice means that products are transparent and reproducible, “or at least if we follow our own guidelines they are. If you have a question about a number, we can find out exactly what information went into that number.”

Steven K. Galson of the Food and Drug Administration said the FDA proposed defining “influential” information as “economically significant, as defined in Executive Order 12866: any rule-making action that will have an annual effect on the economy of $100 million or more or will adversely affect in a material way the economy, a sector of the economy, productivity, competition, jobs, the environment, public health or safety or state, local and tribal governments.”

Fred Siskind of the Department of Labor said that the DOL’s diversity of products is “almost overwhelming,” as is its total number of documents. His department has posted over 200,000 documents on its Web site. By using the screens suggested by the guidelines, however, he said that only a very small percentage of disseminated information would fall under the “influential” category. Some examples of DOL influential information include the Consumer Price Index, the Producer Price Index, and other national economic indicators. He said that DOL was still working on its definition of influential.


DISCUSSION

During the discussion period, Robert Ashby of the Department of Transportation commented that DOT would define “influential” as being “outcome determinative of a key issue.” He also noted that the government is not only a generator of information, but also a “considerable recipient” of information, and that DOT would want to determine which of the incoming information should be classified as influential.

Dr. Carla Steinborn of the National Oceanic and Atmospheric Administration noted that most information released by her agency—such as weather forecasts—was difficult to regard as “influential” when released, but could become so later, triggering a call for reproducibility. In most cases, she said, this could not be done for NOAA research, and she said her agency had not yet resolved this problem.

THE STANDARDS OF TRANSPARENCY/REPRODUCIBILITY/PEER REVIEW FOR INFLUENTIAL INFORMATION

The OMB guidelines direct agencies to develop procedures for reviewing and substantiating “the quality (including the objectivity, utility, and integrity) of information before it is disseminated.” The guidelines characterize quality as the “encompassing term,” and the others are “constituents,” with the following meanings:

  • Utility refers to the usefulness of the information to the intended users.

  • Objectivity focuses on whether the disseminated information is accurate, reliable and objective, and is presented in an accurate, clear, complete, and objective manner.

  • Integrity refers to the protection of information from unauthorized access or revision.

Transparency

The OMB guidelines state that the concepts of utility and integrity are relatively straightforward and arouse little debate. Achieving objectivity, however, is less clear. The OMB guidelines suggest that to be objective, information should be produced by methods that are “transparent” and should be reproducible by others.

The goal of transparency for data and methods, according to the guidelines, is “to facilitate the reproducibility of such information.” The guidelines add, “Where appropriate, data should have full, accurate, transparent documentation, and error sources affecting data quality should be identified and disclosed to users.”
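
The guidelines do not prescribe a documentation format. Purely as a hypothetical illustration of what “full, accurate, transparent documentation” with disclosed error sources might look like for a single disseminated dataset, the sketch below gathers those elements into one record; every field name and value is an assumption made for this example, not language from the guidelines.

```python
# Hypothetical documentation record for a disseminated dataset.
# It illustrates the guidelines' call for transparent documentation
# of data and methods, with error sources identified and disclosed
# to users. All field names and values are illustrative assumptions.
dataset_documentation = {
    "title": "Example survey-based estimates",
    "collection_method": "Stratified random sample survey",
    "analytic_method": "Weighted means, weighting scheme documented",
    "assumptions": [
        "Non-respondents resemble respondents within each stratum",
    ],
    "error_sources": {
        "sampling_error": "Standard errors published with each estimate",
        "nonresponse": "Unit nonresponse rate disclosed; weights adjusted",
        "measurement": "Self-reported values, not independently audited",
    },
    "supporting_data": "Public-use file with confidential fields removed",
}
```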


Reproducibility

A generally accepted standard in the world of research is that experimental results should be capable of replication by others. The guidelines originally stated: “If an agency is responsible for disseminating influential scientific, financial, or statistical information, agency guidelines shall include a high degree of transparency about data and methods to facilitate the reproducibility of such information by qualified third parties.”

This statement stimulated much debate because some research may be difficult, expensive, or impossible to replicate. In practice, few experiments are replicated precisely, because of such obstacles as confidentiality, expense, irreproducibility of original data, and the death of persons who took part in the original research. For such reasons, the guidelines were modified to say the work must be “capable of being reproduced.”

Types of Scientific Information

The OMB guidelines list two types of “information” in the case of scientific studies. One is original and supporting data. The OMB urges caution in the treatment of such data because it may often be “impractical or even impermissible or unethical to apply the reproducibility standard to such data.” As examples, the guidelines state that “it may not be ethical to repeat a ‘negative’ (ineffective) clinical (therapeutic) experiment and it may not be feasible to replicate, for example, the radiation exposures studied after the Chernobyl accident.” Thus the guidelines urge agencies to consider “which categories of original and supporting data should be subject to the reproducibility standard and which should not,” and that they should make this determination with the help of “relevant scientific and technical communities.”

The second category is analytic results. OMB states that “reproducibility is a practical standard to apply to most types of analytic results.” The guidelines add: “With respect to analytic results, ‘capable of being substantially reproduced’ means that independent analysis of the original or supporting data using identical methods would generate similar analytic results, subject to an acceptable degree of imprecision or error.” The primary benefit, according to the guidelines, would be to allow the public to “assess how much an agency’s analytic result hinges on the specific analytic choices made by the agency.” The OMB guidelines also acknowledge that the “objectivity standard does not override other compelling interests such as privacy, trade secrets, intellectual property, and other confidentiality protections.”
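
As a concrete, if simplified, reading of the “capable of being substantially reproduced” standard for analytic results, the sketch below shows an independent check: re-run the documented method on the original or supporting data and compare the outcome with the published result within a stated tolerance. The method, the data, and the 1 percent tolerance are assumptions of this example, not requirements of the guidelines.

```python
# Minimal sketch of reproducing an analytic result: an independent
# analysis of the original data, using the documented (identical)
# method, should yield a similar result within an acceptable degree
# of imprecision. All names and the tolerance are illustrative.

def documented_method(data):
    """The agency's published analytic method (here, a simple mean)."""
    return sum(data) / len(data)

def substantially_reproduced(published_result, original_data,
                             method=documented_method, tolerance=0.01):
    """Return True if an independent re-analysis agrees with the
    published result to within the stated relative tolerance."""
    independent_result = method(original_data)
    return abs(independent_result - published_result) <= \
        tolerance * abs(published_result)

# Example: a published estimate of 4.02 from disclosed supporting data.
supporting_data = [3.9, 4.1, 4.0, 4.2, 3.9]
print(substantially_reproduced(4.02, supporting_data))  # True
```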


Peer Review

Journals, granting agencies, and others traditionally seek transparency in research through peer review—qualified experts who review the research concept, methodology, and results. The OMB guidelines praise the mechanism of peer review for its general reliability; they also say that peer review is not fail-safe, and that the “competence or credibility” of the reviewers themselves is occasionally challenged. The guidelines state that peer review “is rebuttable based on a persuasive showing by the petitioner in a particular instance.” The guidelines add that occasional cases of falsification of data have slipped through the process of peer review.

Michael R. Taylor, Resources for the Future, moderated a panel discussion regarding agency approaches to achieving objectivity. He opened the session by noting the “irony” of a law (the Data Quality Act) emphasizing transparency that was passed by Congress without a hearing process. The session began with comments by R. Brooks Hanson, Science, and Robert O’Keefe, HEI, who provided non-agency perspectives on these issues. Agency representatives, Heather G. Miller, NIH; Kevin Teichman, EPA; and John Rodgers, FAA, then followed with presentations on their respective agency approaches.

Dr. R. Brooks Hanson, Deputy Managing Editor for Physical Sciences at Science, offered a detailed description of how a leading scientific journal performs peer review. Reviewers are expected to consider whether the data and analytical methods substantiate the conclusions; whether interpretations are fairly presented; whether other hypotheses or conclusions should be mentioned; the level of statistical and other kinds of uncertainty; whether data are separated from conclusions; and whether the scholarship, referencing, and presentation are appropriate. He noted that in all the reviews he had seen, a referee had requested replication of data in only a very few cases. At the same time, he said, “peer review and publication fosters reproducibility. Any reasonable request for materials and methods must be made available... The goal of peer review is to evaluate or guarantee significance—both of the data and the interpretations and of each separately.”

Dr. Robert O’Keefe of the Health Effects Institute said that HEI takes special pains with its “rigorous peer reviews…[as]…a key step for us and really for all the studies that we undertake.” HEI maintains an independent standing committee of subject-matter experts, which exists solely to review the quality of HEI studies. Studies thought to have “significant regulatory impact” receive additional scrutiny, including quality oversight, detailed peer review, and extensive commentary on the study and its underlying data.

Dr. O’Keefe identified three groups of studies:

  1. Contributing studies—the “rank and file” of information;

  2. Studies likely to be relevant to regulation; and

  3. Those few studies with known and significant regulatory impact.

In the last category, he said, were results that would be at the core of a regulatory outcome. These, and some studies from category 2, would certainly be influential. For these, he said, it is “reasonable for higher levels of detail to be expected.” He added that it is not cost effective, however, to provide an extensive level of oversight for all studies.

Dr. O’Keefe said that transparency held high priority at HEI, as reflected in its research reports, which “are perhaps a bit unusual” in that they include all the data generated during the course of a study, the scientific methods used, and the range of approaches employed by investigators. He said that transparency was a founding precept of HEI, which is structured to promote objectivity in decision making.

For Dr. Heather Miller of the National Institutes of Health, the issue of transparency “is at the heart of how science is done and how the NIH does business, including the business of information dissemination.” Dr. Miller said that NIH could move relatively quickly toward compliance with OMB guidelines because “the agency has always controlled the quality of the information it presents, and inherent in the process of assuring quality is peer review. It is a very slight modification of the way we have always done business.”

Dr. Kevin Teichman of the Environmental Protection Agency said that the criteria for defining transparency are found in EPA’s risk characterization handbook, which requires description of the approach one is using, the assumptions made, models used, where data gaps exist, where one extrapolated from the data, what the uncertainties are, where one is using data or relying on defaults, and where one is making scientific conclusions as opposed to policy decisions.

The EPA, according to Dr. Teichman, has had a peer-review policy in place since June 1994. It requires that major scientific and technical work products be peer reviewed, with external peer review used for work products that support important decisions. The EPA peer-review handbook was revised in December 2000, providing the guidance for implementing the peer-review policy.

Dr. Teichman said that the EPA also was contemplating the use of the “economically significant” standard, as well as the case-by-case approach being considered by the Department of Labor. He said that EPA might give the OMB guidelines to those submitting third-party studies that may at some point become influential information.

Dr. Teichman said that an unresolved issue for EPA was how its guidelines should address information generated by third parties that may or may not be reproducible. In the pesticide program, for example, the agency receives hazard information from pesticide suppliers. He raised the possibility of proactive steps, such as requiring third parties to adhere to the guidelines.

Mr. John Rodgers of the Federal Aviation Administration said that the FAA deals with several categories of “influential” information, including information related to airport development, financial decisions regarding airports, and the regulation of other aspects of transportation.

Mr. Rodgers approved of the use of transparency as a criterion, and said it should allow the reader to know what data, assumptions, analytical methods, and statistical procedures were used. He noted that “in general the information that the FAA uses tends to be transparent and I think we are compliant in spirit.”

With respect to peer review, Mr. Rodgers described the compliance of FAA as “mixed.” “I think for all types of data there is something I could characterize as peer review, although it is not necessarily done with respect to a uniform set of standards or guidelines, and it is not necessarily always documented in the same way.” He also said that using the same external peer reviewers, who know the programs, could lead to conflicts of interest. He said he was curious to see, if the agency moved “peer review outside the scientific community and into the operating context where the FAA operates, how successful we will be in creating entities to do peer reviews.”

Earlier in the workshop, Dr. Steven Galson said that the FDA strives for a high degree of transparency in the high volume of health-related information it disseminates publicly, including risk notices, rule-making documents, product approvals, guidance and regulatory assistance, and reports. FDA is developing new templates for all drug reviews to ensure that they are written in a consistent way across different classes of drugs and by different reviewers.

The Bureau of Labor Statistics, said Dr. Fred Siskind, already has to meet certain OMB requirements in the area of generating statistical information, and these would likely meet the guideline requirements. He said that BLS puts out descriptions of its methodology, making the process transparent and—in theory—reproducible. He said that BLS does have privacy and confidentiality concerns, and does not give out data about individual people or establishments.

DISCUSSION

Mr. MacCracken of the Office of the U.S. Global Change Research Program asked Dr. Hanson how he would handle situations where the criteria for objective peer review were not met by the reviewers. Dr. Hanson said that Science asks all reviewers to describe potential conflicts of interest, both financial and intellectual, and that part of the editor’s job is to know the pool of potential reviewers. Dr. Miller added that NIH has a huge peer review operation to screen applications, and that financial conflict of interest “is one of the points that we explicitly have each reviewer address prior to any round of review... We also vet reviewers for personal conflicts, such as, is this person your best friend or worst enemy.”

Ray S. McAllister of CropLife America cited the concern that occasionally an agency would reach into the open literature for information in making a pesticide decision and use studies that, even though peer reviewed, are of lower quality than data produced by pesticide manufacturers themselves. Dr. Teichman of the EPA responded that “we would certainly hope that all of the data the agency would use would follow the best possible practices, good laboratory practices and others that would comply with the OMB data quality guidelines.”

Dr. William Perry of OSHA said that the “testimony of experts can be really critical” in setting standards, and that OSHA works closely with national consensus standard organizations like ANSI and ASTM. He said that consensus standards are often the starting point in areas where “you won’t find a lot of peer-reviewed science.”

Dr. Galson said that the FDA also relies on data from outside firms in making many decisions, and much of those data are confidential business information closely held by the sponsors. He considered “our review process of this data to be the peer review of the data that is submitted.” However, he said, “for certain drug approvals we do go to scientific advisory committees for recommendations.”

RISK INFORMATION REGARDING HUMAN HEALTH, SAFETY, AND THE ENVIRONMENT

A category of information in which objectivity is viewed as very important concerns risk, which is discussed in the guidelines in the context of health, safety, and environmental information. Agencies making health- and safety-related decisions about risk are directed not only to use the best available data, but also to provide risk information about their decisions. This information should include such features (as specified in the Safe Drinking Water Act) as which populations are most affected by risk, the central risk for specific populations, appropriate upper- and lower-bound estimates of risk, significant known uncertainties in predicting health or safety effects, and studies that would help resolve these uncertainties.

Joe S. Cecil of the Federal Judicial Center, the moderator of this session, observed that risk assessment is “certainly the most demanding form of disclosure that is considered by the regulations and it has earned its distinction because of the especially contentious disputes that have taken place over the years.” He noted that agencies must not only gather extensive information on risk, but do so in a way that ensures the timely flow of vital information to medical providers, patients, health agencies, and the public—“a pretty tall order.”

Starting off the presentations, Dr. Joseph Rodricks, Environ, reviewed an earlier report issued by the National Academies in 1983, called “Risk Assessment in the Federal Government: Managing the Process.” He noted that many of the points in the OMB guidelines are anticipated in that study, which should serve as a useful guide in addressing current issues. He expressed some skepticism about whether it is possible to demonstrate that a complex risk assessment is capable of being reproduced.

Dr. William Perry agreed with the 1983 Academies report that risk analysis should include hazard identification, dose-response assessment, exposure assessment and risk characterization. “Those four things,” he said, “require different kinds of information,” including peer-reviewed literature and many other sources. At OSHA, he said, a risk assessment typically has two parts: (1) hazard identification and (2) exposure response. This relies heavily on peer-reviewed scientific studies, striving for best estimates of the size of the population at risk, estimates of how effective preventive measures are likely to be, and ways to resolve conflicting information.

At the Department of Labor, said Dr. Perry, OSHA rule making about health issues generally requires the agency to develop a risk assessment. The goal is “to determine the levels of risk for workers exposed to various chemicals” and to “estimate the impact of reducing exposure to those particular chemicals.”

Dr. Perry also noted that the department has in the past used literature searches to establish standards, trying to identify the best peer-reviewed studies and information on the methodology used to collect the data. “I often say that I used to think economics was a soft science,” he said, “until I saw some of this stuff. It is not the hardest stuff in the world in terms of accuracy; there are lots of uncertainties, and often the underlying data just are not available.”

Dr. Kevin Teichman said that EPA has a risk characterization handbook. Its policy for risk characterization came out in March 1995 and mandates that each risk assessment used in EPA decision making should include a risk characterization that bridges risk assessment and risk management. Risk characterization, he said, “is that integrating, summarizing step at the end of a risk assessment that puts the information in a form that the decision maker can use. It is very important for scientists to realize when they are conveying information about risk and when they may be moving into the arena of providing information on risk management or on policy calls, and we try to very carefully draw the distinction.”

Dr. Steven Galson said that much of the influential information disseminated by FDA is based on analyses of risk to the public of certain actions or exposure. Quantitative risk assessments may include health, safety, ecological, engineering, and physical hazards encountered during the use of medical devices, such as artificial hips, stents for heart arteries, and valves; food chemical residues; and antimicrobial resistance genes and bacteria. “Risk analysis is broadly used in the agency as a tool to enhance the scientific basis for all our regulatory decisions including product approvals,” he said. But, he added, “many of our actions are essentially qualitative.” For example, in the law that governs drug approvals, the standard for new drugs is that they be “safe and effective.” These qualities do not have numerical thresholds. For these, FDA depends frequently on outside expert advice.

FDA proposes to adapt the general principles for risk assessments in the Safe Drinking Water Act (as stated in the OMB guidelines) to fit those situations. It proposes to define risk as the likelihood of injury and/or damage that can be caused by a substance, technology, or exposure. Dr. Galson added, “Although we analyze the economic costs of these regulations and consider alternatives, most of our regulations simply don’t lend themselves to the type of quantitative risk assessments that are contemplated by the Safe Drinking Water Act principles.”

In addition, many FDA actions are based on research and supporting data generated in biotech or drug companies. In these cases, approval actions are based on scientific studies conducted by sponsors seeking marketing approval in accordance with FDA regulations and guidance documents.

DISCUSSION

Mr. Neil King of Wilmer, Cutler and Pickering asked how agencies chose among studies to support risk assessments when the studies give conflicting results, and “whether these guidelines are going to require any changes in the way you make selections among studies to use for purposes of risk assessment.” Dr. Perry said that OSHA did indeed confront this dilemma. He noted that the guidelines apply only to information dissemination, so that OSHA focuses on how the act of disseminating that information is going to be changed by these guidelines. He noted that by the time OSHA makes a rule, it already has disseminated a great deal of information. He did not see the guidelines affecting how they look at information for setting priorities on major health hazards.


Administrative Correction and Appeals

The OMB guidelines oblige agencies to respond to complaints from people who are affected by agency decisions. They describe “affected persons” as those who may benefit from or be harmed by the disseminated information. The guidelines recognize that most agencies already have mechanisms to respond to complaints, but they now require agencies to respond directly to complainants and to itemize their complaint history for OMB at the end of each year. They also require each agency to add an appeals process for the benefit of complainants, with “appropriate time limits in which to resolve such requests for reconsideration.”

A panel discussion, moderated by Frederick R. Anderson of Cadwalader, Wickersham & Taft and comprising Daniel Cohen, DOC; Neil Eisner, DOT; Elaine Stanley, EPA; and James Scanlon, DHHS, highlighted key issues their respective agencies were wrestling with.

Mr. Anderson introduced a discussion of the corrections and appeals process with a series of procedural questions for the panelists and audience to ponder.

  1. Time limits: Agencies must specify time periods for correction requests and appeals, but what should they be? What should be their own deadlines for responding to requests and appeals?

  2. Can information be challenged at the moment of dissemination, before it reaches the policy or rule-making stage? Or must a challenge await the rule-making or policy-making step?

  3. The guidelines for agencies apply specifically to information released after October 1, 2002. How should agencies handle requests for corrections of data disseminated before October 1, 2002?

  4. The OMB guidelines stipulate that “affected persons” should be able to bring a challenge. Who is an affected person?

  5. How should the corrections process proceed: Should requests be directed to the chief information officer (as the guidelines suggest) or to people with expertise on the information in question? Will a written response be required?

  6. How should the appeals process proceed: How can it ensure the impartiality and fairness of sound law? Will it include independent third-party review? What kind of record of the initial discussion will provide a basis for appeal? Will the agency limit contact with the petitioner; the office that produced the data; the appellate agency; other petitioners?

  7. For interagency information gathering or rule making, is there a mechanism short of an appeal to consult with other agencies that may be affected?

  8. Will there be opportunities to be heard or to cross-examine during the informational review process at the appeals level?

  9. Will agencies seek outside expertise to develop appropriate, high-quality answers to appeals?

  10. Forms of relief: Some agency responses will be non-controversial, including correction, retraction, or defense of data. Disclosure of the existence of multiple views might not be a sufficient response. Should a higher standard of quality apply during the appeals process? If the complaint is that the agency has insufficient data, should there be relief that requires extra work by an agency? Who would pay?

  11. Judicial review: Will the courts impose judicial review in some cases? This seems likely, because of the courts’ tradition of overseeing the use of information that affects people’s lives. Reviews already occur in the case of statutes overseen by agencies, such as EPA, where the courts have examined guidelines from the point of view of the challenger.

With regard to what is reviewable, Mr. Neil Eisner of the Department of Transportation said that “in my opinion the substance of our decision is reviewable under the Administrative Procedures Act. If we have incorrect data and somebody has pointed it out, the reasonableness of our response to them is subject to APA review.”

Mr. Eisner also described the importance of deciding who receives the complaints. In DOT, he said, the initial response will probably be given by the experts responsible for the data. If there is an appeal, an appeals person or panel will be appointed that has an “appropriate balance between neutrality and enough knowledge to make the decision.” For frivolous and repetitive complaints, he said that agencies appear to have the authority under the guidelines to reject them. The agency will probably ask for specificity in the complaint, why correction is needed, and where the data are incorrect. It will ask the challenger to file a complaint within 180 days of dissemination. The agency will respond to a complaint within 90 days and to an appeal within 45 days. The agency will ask complainants to state how they were harmed and how correction would benefit them.
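
As a rough illustration of how those proposed time limits would stack up, the short sketch below computes the deadlines for a hypothetical complaint. Only the 180-, 90-, and 45-day periods come from Mr. Eisner’s description; the dates themselves are invented for the example.

```python
# Hypothetical timeline under the time limits Mr. Eisner described:
# complaints filed within 180 days of dissemination, agency response
# within 90 days, appeal response within 45 days. Dates are invented.
from datetime import date, timedelta

dissemination = date(2002, 10, 1)
filing_deadline = dissemination + timedelta(days=180)

complaint_filed = date(2003, 1, 15)          # within the filing window
response_due = complaint_filed + timedelta(days=90)

appeal_filed = date(2003, 4, 20)
appeal_response_due = appeal_filed + timedelta(days=45)

print(filing_deadline)      # 2003-03-30
print(response_due)         # 2003-04-15
print(appeal_response_due)  # 2003-06-04
```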

Mr. Dan Cohen of the Department of Commerce noted an ambiguity. He said that “the statute talks about a review mechanism looking at agency compliance with the OMB guidelines—not actually whether the information itself is correct or incorrect, but whether the agencies complied with a process for developing that information.”

Mr. Cohen noted that there is some confusion over legal standing, with some suggesting that “affected” and “standing” are equivalent. “I am not sure that is right. … You could be affected for purposes of this statute but not really have standing to challenge in court, and I think that agencies should be very careful in deciding who is affected to make sure they don’t blur the distinction.”

Mr. Cohen also discussed relevancy. He indicated that “agencies should have the ability in their administrative mechanism process to be able to decide that correcting a particular item of information doesn’t make any difference. So, why…bother…” Mr. Cohen illustrated this with a complaint that might come in stating that the Weather Service had predicted that yesterday’s weather would be sunny and warm and instead it was raining and cold. Mr. Cohen asked, “… should the Weather Service even bother dealing with that as a request for correction?”

Ms. Elaine Stanley of the Environmental Protection Agency discussed a web-based integrated error correction system that the agency was considering using as part of the data quality correction process. Under this system, an error is defined as a “deviation from accuracy or correctness and described as the difference between observed and/or approximately determined value and the true value of a quantity.” Ms. Stanley noted that a key principle in managing any correction mechanism is knowing who owns the data. “Knowing who has the responsibility and the authority over the original data or more broadly the information is the No. 1 principle in terms of trying to get it corrected and resolved…”

In terms of the appeals process, Ms. Stanley stated that EPA was considering two options: (1) affected persons would file the appeal with the assistant administrator or regional administrator, or (2) affected persons would file the appeal with the chief information officer.

Mr. James Scanlon, Department of Health and Human Services, indicated that while honoring the existing processes and legal mechanisms for different agencies within the department, such as FDA and NIH, the department would try to establish a common template to be used by affected parties when making requests for correction. Scanlon indicated that DHHS is trying “to make it fairly flexible to request the correction,” but emphasized that the affected person must be quite clear in describing what exactly needs to be corrected. With respect to appeals, Scanlon said that the appeal would go to one level above the originating office and could conceivably be raised to a higher level within the department if needed.
