8.
SYMPOSIUM WRAP-UP

MODERATOR’S OVERVIEW

Mary Waltham, Publishing Consultant

In the context of Dr. Duderstadt’s opening remarks to this symposium, which referred to the “chaos of concerns” that currently surrounds scientific, technical, and medical (STM) publishing, this final session reflects on some key issues raised by the symposium’s participants, including publishers, university faculty, administrators, and librarians.

From a publisher’s perspective, the changes to publishing systems driven by electronic journals were discussed extensively during the symposium, and a range of issues requiring close attention was identified and addressed. It is clear that the costs of online-only journals are lower than those of print plus online. However, print seems unlikely to go away completely across all of STM publishing in the near future, so the costs of a dual system persist for most publishers. Archiving was discussed, and questions were raised: Who will be responsible for archived journal content—publishers, national libraries, or other third parties? What will be archived—all of the journal content or just the research content? Who will curate the journal archive and ensure that it migrates to appropriate platforms as technology evolves? And who will pay?

Within STM journal publishing there are economies of scale that tend to favor publishers with large numbers of journals; these economies are not available to small and mid-sized publishers. As a result, some cooperation and grouping of content have arisen within some sectors in order to mimic or replicate them. Recent mergers and acquisitions within the STM journal market have further made smaller publishers nervous about their longer-term viability as independent entities. Most of the broad range of online business models remain quite experimental and seem likely to diversify and hybridize over time as the publishing system develops and new advances emerge.

Speakers also talked about the likely development of publishing as a more disaggregated process with separate pieces of the continuum done by different groups, from content creation through dissemination.

Filtering and quality control of information are central to the publishing process, but in a future with more open access to all types of information and data, who will provide reliable and consistent filtering and quality control, and who will pay for it?

Increased online access results in increased usage of information. The journal as a package of information is not granular enough and so further unbundling of information is clearly taking place. Customers and users want more granularity in the online environment than a journal issue represents.

There was discussion of who needs copyright, as opposed to who merely “wants it” or “uses it.” Copyright depends greatly on both the author and the mission of the publisher with whom the author works.

The continuous online publishing process means that documents may no longer be static but evolve through time because addition, annotation, and revision are simple.

Interoperability and common standards are essential to bring together and integrate information and provide a dynamic reference tool. Achieving this type of integration is necessary for making the optimal use of scientific information.

A key point for publishers is: Where do they add value to the publishing process, and is that value added where users and customers want it? The role of publishers must continue to change to meet the needs of the research community.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




COMMENTS BY PANEL PARTICIPANTS

Malcolm Beasley, Stanford University

Professor Beasley commented on the symposium from the perspective of an interested practicing scientist working in the lab, with a physical science background. He shared an example of the use of print versus online journals in the physics community at Stanford, where the faculty now overwhelmingly use electronic journals. Only the older faculty still use print copies, and it will be only a very short time before online journals are used universally.

Professor Beasley’s first impression was that some sort of broad open access to scientific information bases of various kinds, including journals, is inevitable; the information is too important not to have. This is his view as a practicing scientist and, although they are younger and perhaps naive, it is also the view of the students. Getting there will not be simple, and a wide variety of vested interests and issues will have to be accounted for, but he believes that ultimately scientists will insist on it. Nonetheless, he agrees with Monica Bradford of Science that this will be a tension-filled transition. Tension is not necessarily a bad thing if it is creative. It can also be destructive, however, so strong and well-informed leadership will be required to ensure that the tension is creative for science and for all who partake in that enterprise.

His second impression is that “publishing” will not go away, because it provides added value. The breakdown of the present system is forcing an examination of what the fundamental values added are in the existing publishing modes. Equally interesting are the alternatives, in terms of publishing broadly defined, for achieving things such as certification, formatting, and editing. Most interesting to Professor Beasley are the new modes of value added that will be created. There are some wonderful ideas out there. Maybe three-quarters of them will not work, but it is the quarter that survive and provide the essential value added that will be wonderful to have and very interesting to see evolve.

It is also clear that access to information will continue to improve profoundly in speed and efficiency almost universally, and will therefore have a similar effect on science. That is basically good, but it is important to approach all that information with some sense of its quality and of the various other types of value that might be added. Hyperspeed in itself is not the ultimate good; rather, the goal is speeding up the process of achieving understanding, as distinct from the acquisition of facts. We ultimately have to achieve understanding in science, and that may mean different things in different fields. During the symposium, Professor Beasley heard ideas that might implement this, but he did not hear a focus on what might be done to improve that aspect of the scientific enterprise, which he thought should have been brought into sharper focus. People are thinking about it, but it is important to stress that it needs to be done. In addition to understanding the information we will be getting, which presumably will be better and received faster, we must find methods to improve and accelerate our ability to form good judgments on the basis of that information.

To give a concrete example, he has observed over his 30 to 40 years as a university professor that graduate students cannot read the literature: it is too abstract, has too much jargon, or has concepts they do not understand. Maybe this is more specific to the physical sciences, but simple access is not going to be enough. As a university professor, he found some related issues as well. How will the faculty help students deal with the information? Students may be more adept at using the tools, but they are not going to be adept in this essential value added. For example, there are personalized searches and virtual journals. Science and Nature publish perspectives on some of their articles, and he is an avid reader of those because they help him gain perspective, not only in his own field but a bit more broadly. The condensed-matter physics community is considering creating electronic journal clubs. There is a famous journal club in this community in which individuals report on a paper selected for its quality and importance, which is then presented and critiqued. This was a tradition at Bell Labs, initiated by Conyers Herring, a truly great physicist, and some of the alumni of Bell Labs are now trying to institutionalize it by putting it online somehow. Professor Beasley does not know whether it will work, but believes it is an excellent thing to try.

The knowledge environments and connection maps discussed during this symposium are ideas that illustrate what is possible in this area. As a university professor wearing his teaching hat, Professor Beasley found the implications of electronic publishing and the digital revolution interesting, and not topics that have received sufficient attention in either the scientific community or the universities.

James O’Donnell, Georgetown University

Dr. O’Donnell is one of the longest-serving editors and publishers of an open-access online scholarly journal anywhere. He and his cofounder, Professor Richard Hamilton, began publishing Bryn Mawr Classical Review (BMCR), an online book review journal for classical studies, in the fall of 1990. It has always been free, and it has always been on the Internet. They tried selling a paper version for a few years, but that did not go anywhere and ended in 1997. An archival CD was published in the mid-1990s and was a complete flop. BMCR has become the leading book review journal in the classics in the United States and one of the top two or three in the world. The editors have all the worries and anxieties of an open-access publisher, including where the next dollar is coming from, but they are confident they are in business to stay.

We expect of a new system, he believes, that it will drive costs out of the old system. There is a general sense that we pay a lot for information and that, if we could possibly pay less, we would like to do so. He shared the dreams and imaginings of the early 1990s, when we thought that electronic publication would be somehow magically frictionless and cost free. We now know it is not, but at the same time we know there are opportunities to reduce some of the costs. The first two sessions of the symposium were particularly instructive in that regard.

O’Donnell treats separately the question of recovering those costs and sharing them in an equitable and sustainable manner. The open-access movement, to the extent that it is a movement, tends to shift costs away from the user and toward the producer of information. O’Donnell recognizes the attractiveness of this shift, but has qualms about it. The adventures in the non-open-access domain of journal publishing over the past 15 to 20 years have demonstrated that one feature of high-priced paper or electronic journals is that they facilitate a form of quality control. When university libraries cut their budgets for serials acquisition and reduce the number of titles they acquire, they are exercising a form of quality control over what they will accept. If the “big deal” for licensing e-journals is fraying around the edges, library resistance will transmit itself back to the publishers as news about which journals are to be sustained and which are not. The actual closing of real journals and cutting back of titles by publishers over the past decade has been achieved mainly through the Pacman-like gobbling up of one publisher by another and the recognition that not every title was sustainable in the economic model of the for-profit publisher.

Here O’Donnell identifies the focus of an important disagreement. Some of the symposium’s speakers spoke up for the open-access model as delivering superior quality. One can equally argue, he thinks, that the commercial model provides superior quality. As one argues about business models, costs, and recovery of costs, it remains an open question whether a change in the system will improve or degrade the quality of the aggregate system of scholarly communication. As long as that question remains open, he thinks there is no forcing argument to use in favor of one model or the other; a mere economic argument would not be definitive. A good recommendation from this symposium, according to Dr. O’Donnell, would be to continue to assess the effects of the different models on the quality of information, the timeliness of access, and the quality of peer review.

O’Donnell observes wryly that if, as an open-access journal publisher, he is part of the solution, then as a provost he is undoubtedly part of the problem for everyone. Commercial publishers will complain that he does not give enough money to the library, and explain that this is why librarians think the prices of the journals are too high. Open-access enthusiasts will complain that he does not underwrite their experiments in new forms of publication because he is still paying for the old system. He would like to stop paying for the old system before paying for the new one, and not be caught paying for two different systems at once.

From the viewpoint of a provost, he is struck by the emerging differentiation of the product for which the scholars, scientists, and publishers want him to pay. He was particularly impressed by many of the cost figures raised during the first session, especially Mike Keller’s findings on the precipitous drop in the commercial value of information over the first 12 months of its life. That seems to suggest at least a tri-partition of the kinds of information objects under discussion at this symposium. At one end, there is the timeliest of information services, providing the linked news from the front as rapidly as possible, with no consideration for archiving or price, but simply getting the news as fast as possible so that science can proceed in its best way. At the other end of the life expectancy of that information, the service has become an artifact, something to be preserved, maintained, and sustained long after its commercial life, or perhaps even its scientific life, has been exhausted. Preserving the science that is done today for the historians of science 100 years from now is an important exercise, but those historians do not have their budgets yet and cannot come to the table with dollars to pay for that task. The first, information service, O’Donnell believes, tends to be market based. The last, artifact service is not market based at all, but is something done out of noblesse oblige for the greater good of the community. In between is a borderland, where the information service itself needs to be mediated to those who have limited access to the market. A researcher with large research grants or at a major research university is probably doing well in acquiring information; for someone in a developing nation, at a small institution, or otherwise disadvantaged, the situation may be very different. He believes that understanding this differentiation of product, which is increasing in an electronic environment, increasing among products, and increasingly differentiated among disciplines, will be an important part of understanding what a new system of information dissemination can be like.

In the end, O’Donnell thinks himself more tranquil than many other observers.
Much of the symposium’s discussion, in his view, has focused not on imminent fear of negative events, nor on reasonable expectation of longer-range problems, but on still-shapeless anxiety. As provost, he cannot pay to fend off every piece of unsubstantiated anxiety; every other department and office of the university comes to him with the same kind of anxiety. Much progress has been made in ten years, he believes, and the discourse has changed dramatically. In that regard, he is optimistic. His question for the librarians, scientists, and colleagues he works with will be: Can you make sure you define what the problem is before you need a solution? He is still not convinced that he can get a good answer to that question. It is time, he believes, to begin disaggregating the scholarly and scientific publishing crisis, if there is one, into pieces that can be addressed in rational and coherent ways, as part of the many movements that have emerged in the past ten years, as e-journals have become a reality, as open access has become an astonishing reality, and as the impact on academic culture has been so great. If someone can identify the problem persuasively and say what the characteristics of a successful outcome might be, then they will find him to be a much better provost.

Ann Wolpert, Massachusetts Institute of Technology

The views expressed during this symposium have been largely those of the authors and publishers in the scholarly communications system. Ms. Wolpert commented from the standpoint of the university library, which is responsible for bringing in, paying for, managing, and maintaining access to the kinds of information resources that are the subject of this symposium. To paraphrase Hal Abelson’s earlier comment, the progress of science requires access to raw material, evaluated judgments, and conclusions, to work that is current and previous.
This is true in the university not just in one’s own discipline but in multiple disciplines, because universities deal with students who have not yet settled into a discipline or who are working in the interstices between disciplines. It is therefore the responsibility of university libraries to think across those kinds of boundaries.

It appears that the raw material and the evaluated conclusions coming out of scholarly research are bifurcating into two fairly distinct realms. One is a tightly controlled, often highly priced peer-reviewed literature; the other is a minimally controlled, scholar-managed regime of open-access publications and databases. Both environments present challenges to universities, which are by mission dedicated to providing homes for researchers, educating the students who come to them, and managing the information created today and yesterday for the benefit of scholars and students, some of whom have not yet been born.

One could generalize from the symposium’s discussion that the publishers of the controlled literature are making every effort to increase their control over the content they publish and to expand their reach in both time and format. We heard that there is considerable interest on the part of publishers in controlling the archive of those publications, so that they, rather than university libraries, would be responsible for the permanent record of a discipline. The disciplines or the publishers themselves would thus be responsible over time for the archiving of publications. At least some publishers have also expressed an interest in adding data to that record, which would bring all of these resources under the publishers’ intellectual property control. This may be a perfectly rational business model from the standpoint of someone who manages a university press and struggles on a day-to-day basis with the finances of publishing. It is not a rational model, however, from the standpoint of the university library, which must buy these materials from publishers under intellectual property controls created largely for the benefit of the entertainment industry rather than for education and nonprofit research.

The university perspective on the value chain by which new information is created is very different from that of authors or publishers. In the university’s value chain, the university builds the infrastructure and provides an opportunity for faculty to develop curricula and conduct research. Universities also attract and admit students, especially the graduate students who conduct some of the research. Universities provide access to information for their communities at a cost that must seem reasonable to them as a percentage of the university’s overall annual operating budget, and they delegate that responsibility to their university libraries, which is where research libraries come in.
Not everyone in research libraries believes that the subscription model as a way of acquiring information is fundamentally broken, although the price structure certainly is. Most do believe that the terms and conditions under which information is allowed onto our campuses are substantially flawed, because licensing agreements in many cases control how subsets of a community can actually use that information. So it is the combination of price and constraints on use that represents the problem in the system for universities.

Looking down this value chain to the point where faculty are evaluated for publishing their work, one can see that the publication function has been outsourced to publishers for a number of valid reasons. As long as publishers handled the peer review and publication of the work, and that work came back into the university for the next generation of education and research at a price that was reasonable as a percentage of the university’s annual operating budget, the system worked to everyone’s advantage. Over the past 10 or 15 years, however, the outsourcing enterprise has developed its own independent business model, one that thinks of itself less as having a symbiotic relationship with the university than as a stand-alone, independent enterprise, able to create and craft its own business future, separate and distinct from what the university wants. It is in that environment that costs have gone up and the terms and conditions of use have changed, because there is now a very clear boundary between universities and the groups that capture the work coming out of universities, publish it, and evaluate its quality.

One consequence of this separate and distinct business model, now at more than arm’s length from universities, is that some of the businesses doing the publishing and peer reviewing have come to think of universities as patrons, that is, organizations that “owe” the publisher a certain amount of money to support the work. Others think of universities as pigeons that can be plucked endlessly to support a profit margin they might not wish to support if they were asked. A third view one hears is that universities are pirates, stealing information, trying to get for free information of high value that publishers have added to. None of these scenarios makes for a particularly useful set of expectations around which to have conversations, although we keep trying.

The conundrum for universities has several dimensions. For many publications, the costs are simply too high for the value received, and the licensing conditions are problematic in terms of what universities can do, particularly with digital information delivered to their campuses. As described in session three, the intellectual property environment is not only incomprehensible to the average faculty member and student, but may place universities at risk. Any legal regime that varies so considerably by circumstance and format is not a particularly useful one for people to operate under. On the other hand, people are afraid to seek definition through litigation, because they do not know what the outcome will be. And now there are also new forms of data and information that clamor for curation and funding on our campuses.

Ms. Wolpert shared some concluding observations from the perspective of the university librarian.

It is apparent that scholarly communication and publication are diverging. At one time, communication with colleagues took place through publications; now it is clear that one can communicate outside the formal publication record. That record is in many cases moving off to one side, which again raises the question of how much value we put in the formal record of advances in a discipline if most of the communication is happening some other way. That, in a sense, is what MIT’s DSpace is about: capturing the communication, not the formal publication record.

We hardly have a clue about what reasonable standards and norms might be for the cost of peer-reviewed publication. Those costs vary tremendously, and we do not know what drives them. Nor do we know what drives the cost of print as opposed to electronic publication. So it is very hard to think logically without the kinds of norms and standards available in most other industries. It is clear that intellectual property law that meets the needs of the entertainment industry and the international publishing conglomerates is not particularly conducive to what goes on in the academy. We also do not know what new models of peer review and recognition might be developed for open-source publication, which is an area of real attention.

Finally, Ms. Wolpert took away from this symposium a new, tongue-in-cheek business model for university libraries: the offshore library. She will advise her colleagues in the state of Maine, who are dying under the current cost structure for peer-reviewed literature and have had to cancel subscriptions, that all they need to do is go to one of the 130 developing countries where this information is delivered for free and open a branch library there.

DISCUSSION OF ISSUES

Dr. O’Donnell added that he was struck by the vividness of the presentations. Ann Wolpert’s comments are quite relevant to the divergence of views and the building of collaborative enterprises. That said, he thought the speakers addressed fewer of the problems that might emerge as, for example, those huge databases age. In another ten years, 5 or 10 petabytes will be easy to carry around, but the data will still require some housekeeping and maintenance, which will begin to look a lot like what publishers and libraries do. His question remains: What is it going to take to do the good science and to make sure the good science gets done? That should probably be the bottom-line question and set of priorities.

Universities’ Role in Data Curation and Management

Ann Wolpert asked Malcolm Beasley, as a scientist in a research university, what role he sees for the university, if any, in the curation and management of the data that emerge from science. She asked this in light of Dan Atkins’s comments on the National Science Foundation’s cyberinfrastructure report and the question of where responsibility resides for the long-term storage and curation of data. Professor Beasley thought that question was more appropriate for Mike Keller. He did, however, add that faculty in general do not understand the costs of providing these library services in this modern sense and therefore are not necessarily informed enough to make judgments about how these things ought to be paid for and what the real trade-offs are. Ann Wolpert asked the question in part because she was interested in knowing how and where the discussion can go forward on university campuses. Professor Beasley, reflecting on his own experience, said that the faculty do need to understand these things better, because if they do not, they will go and beat up on the provost. To some degree that simply is not fair, and to another degree it must be exasperating to be ill informed.
There are a number of areas where the provost will get beaten up, but the point is that with the revolution in the way science is done, this is too important to leave to the provost alone.

Jim O'Donnell thought that the real question was not what the provost wants to do in the abstract, but an empirical one: Where is information best cared for? He did not think we know the answer to that. As a provost, he could understand an argument that only the scientists know what good-quality curating looks like and what maintains information in good enough form to be usable, and that it therefore must be done in-house. He could also accept an argument that the scientists do not have a clue how to

do it technically, managerially, with appropriate metadata, or with appropriate access or preservation, and that the curating therefore needs to be outsourced. The question then would be: Does one outsource it inside the not-for-profit community (the university, or the larger not-for-profit community) or externally to the market?

Donald King commented that the cost of the scientific literature should be made available to the faculty, not just per title, but per article. For some journals the cost per article has actually decreased, because the size of the journal has increased much more rapidly than the price. Faculty should also be presented with the cost per use. The number of individual subscriptions held by scientists has dropped from 5.8 to 2.2 over the past 25 years; as a consequence, all of that reading has now gone into the library. Scientists have roughly doubled the reading they do in academic libraries, and in special libraries it has increased four or five times. Faculty need the right indicators, presented in a way that is realistic for them to understand.

Mike Keller tried to respond to Ann Wolpert's initial question about the role of the university in data curation and management. He noted that the underwriters who work for Stanford, and who want the university to spend a lot of money on insurance, say that the collections amassed in its libraries, now amounting to about 8 million books and probably 30 or so miles of archives, are valued at $1.2 billion. It is one of the largest assets the university has, other than its land.
In the decade since 1993, when the World Wide Web became massively and widely available, Stanford has spent $73 million on capital projects, the most recent of which was to buy land to construct the first of what might be six or eight modules to store books and other physical artifacts, as well as perhaps to house a very large digital store off campus. In the same period, Stanford spent about $5 million, earned almost entirely from the publisher clients at HighWire, putting out about a million articles for them. The cost of information technology and e-publishing today is about 5 percent of the whole budget for publishing.

As we work on this problem of nonphysical, virtual artifacts, the solution lies, as it has in the past, in great libraries, which are archives not built by archivists for their own sake; they are built with the connivance and cooperation of the scholars who most directly benefit from them. As scholars in science, technology, and medicine come to see clearly the research possibilities that big data sets and collections of electronic resources offer, there will be more impetus and more money to provide them with these research materials, which then become artifacts for preservation over a long period of time. Libraries are preparing to do that, but everything we have is in an embryonic phase, without exception.

Martin Blume noted that, on behalf of publishers from learned societies, the message to the professors, librarians, provosts, and scientists is, somewhat ungrammatically, "We are you, and you are us." The American Physical Society (APS) is run by professors and laboratory scientists, including its president, vice president, past president, and president-elect. It is operated by scientists, many of whom have been professors, so the values APS holds are the same values expressed at this symposium.
The APS officers understand some of the problems of publication perhaps a little better, because they have been thrown into it and are therefore forced to think about it. At the same time, they need the scientific community and the universities to understand that this is where they are coming from. APS's publications committee is chaired by an industrial scientist; its members are university professors and laboratory scientists as well. Paul Ginsparg is a member of that committee, so there is representation from the electronic archives. All of this comes together at this point, so APS cannot be accused of having a very narrow point of view; it is better educated than that.

Interaction between Scientists and Archivists

Steve Berry asked Jim O'Donnell whether it is possible to overcome the arrogance of the scientists and the humility of the archivists, so that a natural solution will come from these two groups learning to talk to each other, dealing with the problems of these enormous databases, and realizing that each brings something, but not a complete solution.

Jim O'Donnell thought the critical intellectual capital will be formed in the dialogue between the archivist and the information scientist on the one hand, and the working researcher on the other. He would like the institutions and the not-for-profit sector to retain control of that process in order to hold on to the intellectual capital. Then he would be willing to talk about who should do the dirty work and make

the things actually operate.

He was brought up short once, early in the days of BMCR, when an associate provost who used to be a librarian asked Dr. O'Donnell if he could do a good job of marketing to nonclassicists, and O'Donnell replied, "not really," since he knew who all their readers were. The associate provost told him, "The one thing you know in a library is that there are lots of other people who have an interest in the stuff you are doing besides you, your six friends, and the folks you think you are doing this for." That really adds an order of magnitude of value that O'Donnell thinks we always need. Getting the dirty work done adds another value, but a very different kind.

Tom Arrison from the National Academies was interested in the idea of the dialogue between the scholars and the librarians. An instructive example is the OpenCourseWare project at MIT, where a process of dialogue led to a fundable vision, implemented with money from private foundations. Sometimes such dialogue and reflection on a campus can lead to a vision that brings in the resources to implement it. He asked whether in the panelists' individual institutions that kind of dialogue is going on and is adequate. More broadly, he was interested in what needs to go on between institutions and among the scholarly communities to help promote it.

Malcolm Beasley noted that, in terms of dialogue, it depends on the definition. At Stanford, there is a faculty committee that interacts with the administration to deal with these questions. The faculty hears reports, and he has no doubt that the discussion is substantive. But a wider discussion than that, through the normal faculty senate committees, is not widespread. The question is: Is this an issue sufficiently important that one ought to try to have a wider discussion about it?
He would argue that it is a candidate, because of its importance to the scholarly side of what faculty do. But he does not think it would be fair to say that this discussion is highly developed in any institution.

Jim O'Donnell said that the learned societies play a stronger role than the individual campuses do. They can aggregate more resources, and they can bring more intellectual firepower to bear on discipline-specific questions than happens in a university faculty with a library advisory committee of different disciplinary representatives.

Ann Wolpert believed that perhaps one of the points to take away from the symposium was that there need to be vehicles to encourage that discussion. There are conversations within institutions and groups of institutions, and there certainly are conversations within disciplines and, to a lesser degree, among and between disciplines. But there is no easy way for the conversations to happen across the boundaries just described.

Malcolm Beasley followed up on Tom Arrison's comment about universities obtaining resources from foundations and other private sources. He felt such funds would be more appropriate for trying certain experiments; it is not the intended role of any foundation to pay continuing costs.

The Role of Publication and Communication in the Sciences

Steve Berry asked Ann Wolpert whether she would accept that the role of publication still remains central to the process of communication in the sciences. Although colleagues in many different places talk more now, when it comes to the substance on which scientists base their inferences and new ideas, they have to go back to the literature, to the publication from last month or the last decade. The scientific publication still remains the rock on which they build; what has changed most is the communication and the way scientists collaborate with each other. Ann Wolpert agreed.
However, she added that the challenge confronting us now is the migration of the official version of the publication of record from print to electronic formats. For 15 years, people have been asking when the library will become entirely digital. The answer is, it depends, like the law. But as a practical matter, we are in a serious transition phase right now. She has no idea how long it will last, but we will know that we have successfully tipped into the electronic environment, and learned how to archive digitally formatted material in reliable and sustainable ways, when someone wins a Nobel Prize based entirely on an electronic publishing record. That is her standard.

Steve Berry noted that it has sometimes been said that paper is much more permanent than any current electronic form, and that there is the dilemma of transferring last year's electronic archive into next year's electronic archive. However, there was a period when Physical Review was printed on very high-acid paper; if one opens a mid-1970s Physical Review now, the pages crack and come apart. APS was able to transfer those fragile paper versions to electronic form and save the records in PROLA.

University Libraries and Publishing Deals

Robert Bovenschulte asked Ann Wolpert if he could conclude from her remarks that she might roughly divide all library deals into three categories: good deals, acceptable deals, and bad deals. He was curious to understand better the interplay of forces within the university when a library accepts a bad deal.

Ann Wolpert noted that there are different flavors of bad deals. Some have to do with the cost of the deal, and some with the terms and conditions of use that come with it. A bad deal for a university is one that over time consistently favors the needs and interests of one group within the university over others. Part of the political risk that the scientific community runs right now, and where there is push-back in the university environment, is the constant percentage increase in demand to support the scientific and technical literature out of a finite amount of money. At the end of the day, someone on the campus gets shortchanged as a consequence of the need to constantly feed a set of growing expectations about the payout from university library budgets in support of scientific and technical literature.

From the standpoint of the university, ultimately the groups that are shortchanged will have their time in the spotlight. Disciplines cycle through favor: around the turn of the last century it was mechanical engineering, then civil engineering, and then physics; right now it happens to be the biological sciences. But as a practical matter, sometime they will cycle out too, and something else will replace them. Libraries struggle to maintain a balance on their campuses, from the expenditure point of view, between and among the disciplines. The potential for long-term damage is there, because one cannot buy a book that is no longer available.
The other worry that universities have concerns the license terms governing who can use material and under what conditions, which sometimes disadvantage parts of the campus. A library may be able to afford to license only for a limited number of users or for one subset of its community, which disadvantages those who cannot easily get access. Some licenses cannot be networked, or can be networked only within a particular subset of buildings on campus. These are the kinds of complications that libraries did not have 10 or 15 years ago but now confront on a regular basis.

Retrospective Information in an Electronic Environment

Gordon Neavill, of Wayne State University, noted that one of the problems in the digital environment is that the economic link between current and retrospective information is broken. In the print environment, almost all the information that libraries bought was acquired because it was current; it was then simply retained and became valuable retrospective information. In the electronic environment, we pay once for current information, then probably have to pay all over again, at a fairly high cost, to capture the same information for very important, but low-level, retrospective uses. In the case of electronic databases, snapshots are made, but fewer people will use the retrospective information than need the current information. To some extent, these retrospective costs can be shared. He asked whether electronic systems can be designed to minimize the additional cost required to retain information for retrospective purposes.

Ann Wolpert said that there are two ways to think about that. When only one format existed, it served a variety of needs: current information dissemination, the near-term research requirement, and the long-term archiving requirement.
In the digital environment, people want to use digital materials for the sake of convenience and productivity, but there are difficulties with that, in that the electronic material does not perfectly mirror what came out in print. We heard previously that a third of the use of the Reed Elsevier titles was of nonarticle material, so there is a lot of information in print publications that is not in electronic collections of journals.

The other way to think about the electronic environment is that there are presumably costs associated with buying the retrospective collection on an annual basis. In print, if a library had a choice of buying one year's worth at a time, it could do that and stop. But in the electronic environment, a library has to buy the current year plus the archive, in many instances over and over again. Certainly that is the model for reference books. If a library had limited funds, it bought a scientific and technical

reference book one year and then bought it on an every-other-year or every-third-year basis. If the reference book is purchased electronically, the same price is paid year after year. So this affects the economics of how a library thinks about its collections, not just in terms of the current material, but in terms of how it manages the archive.

Jim O'Donnell added that the retrospective collection has never been free. It costs a lot of money to keep it dry and cool and to reshelve it, and universities spend immense amounts of money on redundant collections. Where off-site shelving is beginning to be done cooperatively among institutions, it provides an opportunity to think about just how much redundancy is worth paying for, given the use libraries get out of it. Universities have to be rational about how much they spend on the print archives, and it is a lot.

His comment reminded Ann Wolpert that there are different kinds of money in universities. Capital funds are needed to build a building, which is a big one-time effort. Although the cost of the materials stored there over the life of the building is amortized, for the university the economics work differently. Costs in the electronic environment are annual operating costs, as opposed to costs that can be moved into a capital budget and managed differently.

CLOSING REMARKS

Ted Shortliffe

Ted Shortliffe closed the symposium by thanking the speakers and participants. One of the charges to the steering committee is to take the lessons of the symposium and try to crystallize them, and in particular to ask what key potential areas of study the National Academies might focus on for additional work to help move these issues forward.
There are many potential topics for study, and he welcomed advice from symposium participants on the areas in which the National Academies could make useful contributions: areas that others have not yet addressed effectively and where further work is needed. On that note, he thanked the National Academies staff for assisting with the symposium and adjourned the meeting.