New and Changing Scientific Publication Practices Due to Open-Access Publication Initiatives
Linköping University, Sweden
During this symposium there have been a number of examples and references to legal rules that apply to research activities and economic rules and conditions that influence our lives as researchers. There are a number of other rules that are equally important for us in our professional life, but which are internal to the scientific community. These are the rules and practices that govern publication activity, at least in the broad sense of publication.
THE PRINCIPLES OF SCIENTIFIC PUBLICATION
There are several principles that have guided the publication of traditional scientific journal articles, which include the following:
In science, technology, and medicine, quality control by peer review is key. It is considered highly inappropriate to publish an article describing a colleague's results; it is understood that one should not do so, even though such an act would not constitute a copyright violation.
A journal will not republish a previously published article, even if the copyright situation is correct. This was a principle introduced by the New England Journal of Medicine in the early 1950s, and it has been generally embraced.
Correct reference should be made to the first reported results. The priority of results is an important part of the incentive system for researchers.
Reviewers are anonymous in almost all journals.
Reviewers are supposed to be objective and not favor or disfavor a paper simply because it relates in some way to their own interests. Papers are considered confidential while reviewed.
Priority is counted from the day of publication of an article, that is, a paper must be reviewed before priority can be given.
Institutions have differing rules, such as whether advisors should be listed as an author of a paper published by their students.
Interestingly, these principles are not in general the results of legislation, and they have relatively little connection to the legal system that constrains researchers. For example, copyright law would certainly prevent
a researcher from publishing verbatim an article previously published by someone else, but it does not prevent someone from republishing the previous results in their own words. Economic mechanisms are likewise irrelevant here. What we have are social rules within the research community, which are in part policies that have been established by journals, universities, or other organizations within the community. This set of rules is specific to the research community; the legal system oversees the community, but not actively. There is also a system of incentives in the research community that seemingly contradicts economists' claims that incentives must be monetary because people are driven by monetary compensation. Those claims are simply not true.
Thus, rules observed in economics that apply to physical property do not automatically transfer to intellectual property. On the other hand, there are other economic rules and results that are more appropriate for the scientific community.
For example, in the 1930s this economic question arose: if the market is such a wonderful design, why do enterprises with more than one employee exist? Why has everyone not formed his or her own company? The answer, from a purely economic view, is that the transaction costs would be too large, so an enterprise is organized with internal incentives to keep transaction costs down. Certainly in science today, when we see enormous transaction costs, for example, for publication and risk communication, we should consider this economic model.
The rules that govern the scientific community serve two purposes: (1) efficient dissemination and preservation of scientific information and (2) provision of efficient and varied incentives. Finally, let us observe that these kinds of rules and practices are highly technology dependent. For example, the current peer-review system requires that, after submission, articles be sent to reviewers in different parts of the world for review.
Could such a review system have existed before current technology? It would have been very difficult. When there was no photocopying, and handwritten manuscripts were given directly to typesetters, it was very difficult to obtain several copies of a manuscript. Modern peer review became possible with the advent of typewriters and carbon paper.
Now, of course, we are experiencing another enormous wave of technological changes. In fact, the very possibility of open access is due to the technological revolution in the 1990s. We should ask the question whether it will again be necessary to revise current rules of practice because of technological change.
CHANGING PUBLICATION PRACTICES
The rule about previous publication was established by the New England Journal of Medicine before the advent of the preprint, the mimeograph, and widespread photocopying. When preprints emerged a few years later, there were two different reactions in research communities. Some people believed that, under this rule, papers that had been circulated as preprints could not be published; this meant that you could not use preprints at all. Other communities decided that a preprint was not really a publication, which is an astounding interpretation of the word "publication" from a patent lawyer's point of view, or indeed from ordinary common sense. This strange use of "publication" has persisted even to today.
The moral of this first example is that as technology advances, rules regarding publication may need revision. In another, more recent example, one major publisher removed a number of research articles from its servers in response to pressure from some groups who considered the content or the wording inappropriate. If you carried only the electronic subscription, you lost access to those articles. The scientific community has, of course, serious concerns as to whether this is reasonable; such removal was not even technically possible before. A policy must be established to govern electronic publications under these conditions.
The third example occurred in 1997 when I started an electronic journal and experimented with another system implementing a two-stage review process. The first stage consisted of a three-month open discussion period in which papers were posted on the Internet and peers were invited to submit comments. It was not anonymous; rather it was a discussion just like in a conference. After three months authors could revise their paper, which was then sent to confidential pass-or-fail referees. This scheme had many advantages: more safety, fairer treatment of the authors, better rewards for the reviewers, and better political control of reviews.
One concern that we immediately encountered was that during the period of open discussion, someone might steal the results, publish the paper in his or her own name, and get the priority. The only solution was to decide that priority begins with the date of the first appearance of the paper in discussion, that is, publication counts before the reviewing begins. First you publish, then you review, and then the journal, if the paper is accepted, technically republishes the paper. In order to use the more advantageous review process it was necessary to change the mode of thinking and the terminology.
Another example is the placement of new research results directly in a database without any article to document the results. We should have a way to characterize a database contribution as an entity on a par with a published research article, because data that are in the database will be used, and later work will rely on earlier work. This imposes technical requirements on those databases. Certainly, when this happens on a large scale, the scientific database system must keep track of those references, just as publishers keep lists of references for an ordinary article. The questions are: should such references to other work be presented, and if so, when and how? And if the contribution is not in a database, what other mechanism can the scientific community use to give incentives for such work?
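To make the idea concrete, a citable database contribution would need at least a contributor, a fixed date of first appearance (to settle priority), a stable identifier, and a list of the earlier contributions it relies on. The following is a minimal sketch; the field names and the identifier scheme are hypothetical, not an existing database standard:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DatabaseContribution:
    """A citable deposit of results, treated on a par with a journal article."""
    contributor: str
    deposited: str            # date of first appearance, fixing priority
    identifier: str           # stable identifier, analogous to a DOI (hypothetical scheme)
    references: tuple = ()    # identifiers of earlier contributions this one relies on

entry = DatabaseContribution(
    contributor="A. Researcher",
    deposited="2003-01-15",
    identifier="db:2003/0042",
    references=("db:2001/0007",),
)

# The database, like a publisher, can now answer "what does this work rely on?"
print(entry.references)
```

Because the record is frozen, neither the deposit date nor the reference list can be silently altered after the fact, which is exactly the property an incentive system based on priority requires.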
We ought to think of scientific publication as a function that is internal to the scientific community. With some exceptions, such as in medicine, papers are published by scientists for scientists. But the scientific community has chosen to outsource this function to mainly commercial actors. By doing so we have established contracts with commercial partners that seem to be enormously favorable to them in economic terms. We certainly should not complain, however, that the commercial publishers earn money, because it is their objective to make a profit.
Instead, we might think once again about the agreement we have negotiated. We should be a bit smarter about the deals we establish when we outsource functions that are important to us. This might also require us to look over the rules and conventions that we are using. Those rules and conventions primarily should serve our interests as researchers and make the research process work well, but there are also pragmatic aspects. For example, consider an editorial board of a journal that is independent and uses a commercial publisher. The board is dissatisfied with the pricing policy of the journal, which they consider too expensive. They decide to solve the problem by establishing a new journal with another publisher. They cannot bring the name of their old journal with them, because the name of the journal is owned by the publisher; so they create a new name. The entire editorial board resigns and goes over to the new publisher to start the new journal. The old publisher, then, recruits a new editorial committee.
An interesting question arises: should the impact factor and the prestige be ascribed to the name of the journal or to the editorial board? There are two reactions. One might naturally think that in this situation the prestige is derived from the name of the journal. The other reaction is to let the research community decide, since impact factors and prestige are used by tenure committees, granting agencies, and other organizations within the scientific community. We could certainly say that in this case the impact factor goes with the editorial board, but it will require action. Currently the impact factor stays with the name, making it difficult to start new journals. This is one of the reasons why it is difficult to do what this editorial board did, and it is one of the reasons why the people on that editorial board now consider what they did a bad idea.
One result of new technology that is evident on a large scale is the increasing do-it-yourself activity and self-publication, whether by individual researchers, universities, or more centralized archives. In the case of university publication, this sometimes falls under the term university electronic press. As a result, two issues that are important to the scientific community arise. The first is that the long-term preservation of the publications must be guaranteed. It is not acceptable for publications to vanish within two years.
Second, it is also very important that no one is able to manipulate the articles after publication. In particular the authors should certainly not be able to go back and improve their articles two years later while keeping the old time stamp. This fact must be absolutely clear to the rest of the world.
Therefore, there must be control schemes. One might imagine several different control schemes, for example, duplication of the paper in multiple copies in different parts of the world, or an encryption scheme with public keys, and so on. It is important to note that these schemes require public awareness and public acceptance. The scientific community must consider this issue when establishing rules and procedures.
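The essence of such a control scheme can be sketched very simply: bind the article's content to its publication date with a cryptographic fingerprint, so that any later copy can be checked against the original. The sketch below uses a plain SHA-256 hash from the Python standard library rather than full public-key signatures; the record format is an illustrative assumption, not an existing archival standard:

```python
import hashlib

def publication_record(text: str, published: str) -> dict:
    """Fingerprint an article's content and bind it to its publication date."""
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    return {"sha256": digest, "published": published}

def verify(text: str, record: dict) -> bool:
    """Recompute the fingerprint of a copy and compare it to the original record."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest() == record["sha256"]

article = "Results of experiment X, version of record."
record = publication_record(article, "2003-03-10")

print(verify(article, record))                        # an unmodified copy checks out
print(verify(article + " (improved 2005)", record))   # a silent later edit is detected
```

In practice the record itself would have to be held by independent parties (the duplication scheme mentioned above) or signed with a key the author does not control, since an author holding the only copy of the record could simply regenerate it after editing the article.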
The last example relates to Robin Cowan's illustration¹ of the book containing a table of integrals that is no longer commercially viable because it is on the Internet. Resources that existed before a new technology and were sold to researchers at a reasonable profit may no longer be viable, just as horse-drawn carriages have become obsolete. If this occurs, the incentives provided within the scientific community may have to be broadened to give better incentives for things we previously did not give incentives to because they were taken for granted.
Today in science we tend to give much more credit for doing original or new work. We should remember, however, that the nineteenth-century work of making digests, summaries, surveys, and compilations of earlier work is still very important for the research community. Perhaps more credit should be given to these kinds of ventures.
Publication policies tend to be long lived, as illustrated by the example of the New England Journal of Medicine, which established a policy in 1951 that had already become less relevant by 1960.
When we discuss policies today, especially at this time of rapid technological development, we should try to anticipate the development that is likely to occur in the future and how our policies will relate to it.
I suggest that the direction we are going is research knowledge management. In research there are documents of different kinds—articles, research reports, reviews, discussion items, raw data, and laboratory notes from experiments. This broad collection of information is not organized in a very coherent way. Some organizations have begun to address this issue.
There was another conference in Paris in January 2003 where one major French research institute described its plans for building exactly this kind of system for its own needs.² This is a very likely development. It will continue on toward integration on the international level within science, integration or interfacing with knowledge management systems in industry, and integration or interfacing with knowledge management systems in government or the public sector. This trend may be quite relevant to the scientific contribution at the World Summit on the Information Society in Geneva. Thus, when we review our own policies and rules for scientific publication and communication, we ought to keep this in mind.
A change in the direction of such knowledge management is only going to be acceptable and useful if the research community is able to revise the rules so that the incentives and values important to us will function well with the new technology.
Is it even possible to change the existing rules and practices? It is not easy. Nobody can dictate it. That change certainly must arise in the same way that those rules and practices arose in the first place, that is, by discussion within the scientific community that leads to new policy decisions by such organizations as journals.
What is needed is a broad discussion by a body such as the International Council for Science (ICSU). These issues must be brought to the community's attention and the discussion ignited, perhaps with provocative proposals.
The Committee on the Dissemination of Scientific Information within ICSU might be a group that could initiate such debate. The mandate of this group is to do studies that lead to quality proposals for ICSU. Publication policy is a natural item for the agenda for this committee.
1. See Chapter 8 of these Proceedings, "Economic Overview of Open Access and the Public Domain in Digital Scientific and Technical Information," by Robin Cowan.
2. For more information, see the International Council for Scientific and Technical Information (ICSTI) Open Access Meeting Web site at http://www.icsti.org/open_access2003/index.html. This meeting was convened by ICSTI, in partnership with INSERM and the French Ministry of Research, and in association with ICSU and the Committee on Data for Science and Technology (CODATA).