As an example of how digital data can be inappropriately manipulated, consider the case of digital images in cell biology. When the journals published by the Rockefeller University Press, including the Journal of Cell Biology, adopted a completely electronic workflow in 2002, the editors gained the ability to check images for changes in ways that were not possible previously. The Journal of Cell Biology, in consultation with the research community it serves, therefore adopted a policy that specified its expectations and procedures:
No specific feature within an image may be enhanced, obscured, moved, removed, or introduced. The grouping of images from different parts of the same gel, or from dif-
seconds) that retains typically 1 in 30,000 of all events. A second rapid analysis step reduces the rate of permanently recorded data down to about 100 events per second.
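The scale of the data reduction described above can be worked out with a little arithmetic. The bunch-crossing rate of 40 MHz used below is an assumed, commonly quoted LHC figure, not stated in the text; the retention figures (1 in 30,000, and roughly 100 recorded events per second) come from the passage itself. A minimal sketch:

```python
# Back-of-the-envelope arithmetic for a two-stage data-reduction (trigger) chain.
# crossing_rate_hz is an ASSUMED figure (the commonly quoted LHC bunch-crossing
# rate); the two retention figures are taken from the passage.

crossing_rate_hz = 40_000_000        # assumed input rate: ~40 million crossings/s
first_stage_fraction = 1 / 30_000    # first stage retains ~1 in 30,000 events

after_first_stage = crossing_rate_hz * first_stage_fraction
print(round(after_first_stage))      # ~1333 events/s survive the first stage

recorded_rate_hz = 100               # second stage writes ~100 events/s to storage
overall_reduction = crossing_rate_hz / recorded_rate_hz
print(round(overall_reduction))      # ~400,000-fold overall reduction
```

Under these assumptions, the first rapid selection step alone discards all but roughly a thousand events per second, and the full chain keeps only about one crossing in 400,000.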
Research at the LHC is carried out by international collaborations that construct, operate, and analyze the data from each of the four main detectors. The scale of the research borders on the fantastic: Two of the collaborations each have about 2,000 members from 40 different countries; the volume of the ATLAS detector, for example, is about half that of Notre Dame cathedral, and the mass of iron in its gigantic solenoid magnet is approximately equal to that of the Eiffel Tower.
LHC detectors are complex systems that require meticulous calibration, alignment, and quality control procedures. The data from an LHC detector flow from the arrays of devices that track the particles emitted when the protons collide. The data processing system determines the momentum and energy of each particle radiated from a collision, and identifies how the particles are correlated in space and time. The thousands of detection devices, the magnetic field in which the collisions occur, and the properties of the complex digital data acquisition system must all be known accurately. The complexities of data analysis in LHC experiments are comparable to those of the apparatus itself.
Ensuring the integrity of data from a particle physics experiment presents special challenges because no form of traditional peer review would be sufficient. The experiments are so complicated that a knowledgeable outsider who attempted to evaluate the performance of the detection system would require years for the job. Consequently, the particle physics community has developed a method for reliable internal quality assurance that goes beyond straightforward peer review.
As part of each major collaboration, multiple data-analysis teams work independently to evaluate the performance of the apparatus and analyze the data, withholding their final results until the latest possible moment. In effect, the particle physics community has augmented much of the role traditionally played by straightforward peer review with this process of critical self-analysis.