Electronic Scientific, Technical, and Medical Journal Publishing and its Implications: Proceedings of a Symposium
In other instances, we have seen rebellion at the grassroots. Editorial boards have protested commercial publishers' journal prices, resigned, and moved their journals to less expensive publishers in scientific societies. The complexity of the shift from the first-sale approach characteristic of paper to licensing has caused a good deal of experimentation.
There are many other variants. One approach is reminiscent of the way that dissertations used to be generated from the University Microfilms collection, as an edition of one. That is, it may become acceptable for only one physical copy of a document to exist, with the ability to reproduce it from an online copy at the user's expense.
Another interesting approach that is emerging involves the open-source or open-access strategy. The success of the open-source software movement through Linux, the Apache Web server, and similar technology has given rise to a number of related open-access initiatives, such as the Open Knowledge Initiative and the MIT OpenCourseWare project. These initiatives focus on developing new financial models for the open distribution of scholarly materials, perhaps by building the charges for dissemination into the research grants that generate the information in the first place. This is not only consistent with the traditions and values of academia but also reinforces the definition of the university as a public good, an issue that university leaders are increasingly worrying about these days, when the rest of society tends to look at us more as a market commodity.
In summary, advances in digital technology are producing radical shifts in our ability to reproduce, distribute, control, and publish information. Yet, as these advances become more a part of scientific activity, they tend to run headlong into the existing practices, policies, and laws that govern traditional publishing.
The issues are complex, in part, because the stakeholders are so many and so varied, and have such different agendas. The people who fund research want to see knowledge advanced and the resulting information made available to the public. The authors, editors, and reviewers do not charge for their labor. They are motivated to contribute to the public good, but of course they also have other rewards, not the least of which is tenure. Publishers, as intermediaries, do not pay for content, but they do add significant value and provide the work in published form. Libraries, similarly, are intermediaries: they provide access to the users of STM content, and they pay the subscription fees but usually do not charge for providing access. And, of course, the end users either pay for personal subscriptions or obtain the resources free through libraries.
There are several more general issues that need to be considered. First is the changing nature of science and technology research. As pointed out in the recent National Science Foundation (NSF) report, Revolutionizing Science and Engineering Through Cyberinfrastructure,2 the process of knowledge creation itself—experimentation, analysis, theory development, and forming conclusions—is increasingly occurring entirely in the digital world. That has caused a shift from the sequential process of research, publication, validation, and dissemination to more of a parallel, interactive flow model, in which the process of publication and distribution becomes almost part of the process of research itself. The key point of the report is that distributed network computing technology is providing a new kind of infrastructure for federating people, information, computational tools and services, and specialized facilities into virtual organizations—so-called collaboratories or grid communities or, as the Europeans call it, e-science: a cyberinfrastructure. The vision put forth by this NSF report is to use this infrastructure to build ubiquitous, comprehensive digital environments that are interactive and functionally complete for research communities in terms of people, data, information, tools, and instruments, and that operate at unprecedented levels of computational, storage, and data transfer capacity. Part of the report's aim is to trigger the public and private investments needed to create this cyberinfrastructure. Many elements of it are already in place, however, and it will significantly change the nature of scholarly activity, including scholarly publication.
The reality today is that electronic publishing is becoming the dominant mechanism for publishing and reading scholarly materials. It opens vast possibilities, of course, but it challenges existing practices and principles, including the way in which we handle intellectual property. A new paradigm for scholarly
National Science Foundation. 2003. Revolutionizing Science and Engineering Through Cyberinfrastructure: Report of the National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure, Arlington, VA, January.