Here are some comments on the CalTech Proposal:
Scholar's Forum: A New Model For Scholarly Communication
Anne M. Buck, Richard C. Flagan, and Betsy Coles, California Institute
of Technology, Pasadena, CA, March 23, 1999
http://library.caltech.edu/publications/ScholarsForum
All the objectives of this proposal are right. Most of the pieces are
there. But unfortunately they have been put together into an incoherent
picture. Only a few pieces need moving to make the picture coherent,
but moving them substantially changes the path we need to take in order
to reach the objective we all agree on and share.
First, a quick reminder of what that objective is:
"It is easy to say what would be the ideal online resource for
scholars and scientists: all papers in all fields, systematically
interconnected, effortlessly accessible and rationally navigable
from any researcher's desk worldwide, for free."
That is the optimal outcome, and what proposals like this one are meant
to do is to help get us there.
I believe this one would fail as it stands, but with just a little
rearrangement it can succeed.
As it stands, this proposal is trying to create an ALTERNATIVE to the
current peer-reviewed journal literature, because that literature is
currently held hostage by access-tolls, despite having been freely
contributed by the authors, i.e., by us.
The alternative is based on the correct step of decoupling (1) the
quality-control component (peer review) from (2) the rest of scholarly
journal publication, and then attempting to provide (1) in the form of
an alternative service (in place of the existing toll-based journals)
while providing (2), access and archiving, for free for all.
This is all very commendable, but it has almost no chance of
succeeding, for the simple reason that it is attempting to compete with
the existing journal corpus for authors, and there is no reason
whatsoever for authors to prefer submitting their papers to a new,
untested quality-control "board" when the existing journal labels are
the ones that carry the confidence and prestige. The proposal asks
authors to switch, but there is no good reason for authors to switch:
The refereed journals are doing the job of quality control well. It is
not their quality control function that is amiss. It is the fact that
they must fund themselves by raising toll-based barriers to block those
who wish to access those papers.
The way to change this is not to try to lure authors away from their
trusted journals. That is like starting not one, but countless new
journals, all unknown commodities, with the usual handicaps of new
startup journals that must find their own niches -- except that in this
case they are taking on the entire existing corpus (at least 14,000
refereed journals)!
It is unrealistic in the extreme to imagine that authors can be enticed
away from their known, trusted and effective brand-names in favour of a
generic "board" of some sort. With the endorsement of a Consortium of
university associations and learned societies (if those can indeed be
persuaded to give it), the chances would be a little better, but still
tiny. Authors risk too much in moving en masse to a brand-new, untested
quality-control authority, even if they are assured that, as a reward,
they will gain many more readers. And a mere trickle of authors would
quickly make the whole approach fail, casting a residual shadow over
the false start and putting us even further away from the optimal
outcome we are all seeking.
Yet, with just a few parametric changes, it will work.
First, although journals depend for their PAGES on authors, they depend
for their WAGES on readers, through the
Subscription/Site-License/Pay-Per-View (S/L/P) access fees that they,
or their institutions on their behalf, pay. There is little hope of
competing for the authors if this means asking them to stop submitting
their work to the prestigious, high-impact journals they know and
prefer, and instead to submit it to an unknown new entity, be it ever
so heartily endorsed.
What we CAN compete for, however, is the journals' READERS, and we can
count on the authors' support in this, as long as we do not ask them to
give up submitting their papers to the traditional journals of their
choice.
Here is the LOGICAL (and pragmatic) role that can now be played by the
very feature that makes this peculiar literature -- the refereed
learned serial literature -- so anomalous among literatures: ITS
AUTHORS GIVE IT AWAY FOR FREE (and always have done so), to both their
publishers (in the form of their submitted manuscripts) and to their
readers (in the form of preprints and reprints).
[Pause to appreciate this point, perhaps for the first time: Compare
this highly anomalous behavior of refereed journal authors with the
behavior of all other authors: of books, magazines, newspapers. Do any
of them give their texts to their publishers, desiring no royalty or
fee, and, on top of that, do all they can to distribute copies for free
to all who may desire to read them?]
The implication of this for the online era is quite obvious: Let
authors continue to give their papers away to their publishers to sell,
but let them also self-archive them online, for free. That is all it
will take! Readers will vote with their eyes. They will of course
prefer to access the literature for free online -- Los Alamos has
already proven that.
<http://xxx.lanl.gov/cgi-bin/show_monthly_submissions>
Once this happens across enough fields and at a sufficient scale, the
library serials budget crunch will be the ally in the next logical
step: With library serials budgets hard-pressed, and readers all
accessing the literature online for free, S/L/P terminations are
absolutely inevitable. The journal publishers, feeling the pressure,
will have to find an alternative, and the only alternative will be to
scale down to online-only operation, providing the one remaining
service that is still needed of them: quality control (peer review).
The result will be precisely the outcome the Scholar's Forum Proposal
seeks, namely, a decoupling of peer review from archiving and access,
with the publishers continuing to provide the peer review, in the
traditional, prestigious journals, with their known and reliable
editorial boards and referees -- but without the need for Scholar's
Forum ever to try to compete with them using new, unknown, generic
boards.
See: <http://www.princeton.edu/~harnad/nature.html>
There is only one issue, however, that the Scholar's Forum Proposal did
not consider directly, and that is the cost of quality control. It is
true that referees referee for free; and that many editors also devote
their time for free, or for only a modest honorarium. But implementing
peer review is nevertheless not entirely cost-free (nor is the minimal
copy editing that still needs to be done by way of quality control for
the FORM of papers, just as peer review quality-controls their
CONTENT).
These residual costs of quality control (per published "page," say) are
minimal compared to the costs of S/L/P, but they are non-zero: Andrew
Odlyzko has estimated them as being as low as $10 per published page.
Let us be conservative and say they might be, at most, 30% of the
per-page cost currently recovered via S/L/P.
<http://www.research.att.com/~amo/doc/economics.journals.txt>
The obvious way to pay that small residual cost is up-front, so that
everyone can then access the paper for free. The natural source for
this up-front page-cost is of course not authors' own pockets, but a
30% portion of the annual 100% that their institutions save from the
termination of S/L/P.
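To make the arithmetic concrete, here is a minimal sketch in Python,
using a purely hypothetical figure for the institution's S/L/P outlay
(only the 30% upper bound comes from the estimate above):

    # Hypothetical illustration of the savings arithmetic: suppose an
    # institution currently spends $1,000,000 per year on S/L/P, and
    # suppose quality control costs at most 30% of that (the conservative
    # upper bound above).

    annual_slp_spending = 1_000_000   # hypothetical current S/L/P outlay ($/year)
    quality_control_share = 0.30      # conservative upper bound on residual costs

    up_front_charges = annual_slp_spending * quality_control_share
    net_annual_savings = annual_slp_spending - up_front_charges

    print(f"Up-front quality-control charges: ${up_front_charges:,.0f}")
    print(f"Net annual institutional savings: ${net_annual_savings:,.0f}")
    # -> Up-front quality-control charges: $300,000
    # -> Net annual institutional savings: $700,000

In other words, even on the conservative 30% estimate, the institution
pays for all of its own researchers' quality control up-front and still
keeps 70% of its former serials expenditure.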
So now we know both what the optimal solution is, and the natural way
to pay for it. The only thing that remains is to find a way to get
there from here. The Scholar's Forum Proposal as it stands will not get
us there, because it tries to go off in an untested direction which
depends on authors' making risky decisions that they do not really have
the incentive to make, abandoning their known-impact journals for brand
new generic ones of uncertain provenance and destiny. (Besides, it has
not explained how the "Boards" will be financed: if by S/L/P then
that's self-defeating!)
Make the following parametric changes, however, and it will fly: Don't
put an AAU Consortium's weight behind rival generic editorial boards;
put it behind AUTHOR ONLINE SELF-ARCHIVING (in both local institutional
archives and global disciplinary or multidisciplinary ones, like Los
Alamos -- indeed perhaps in direct collaboration with Los Alamos
itself, which is already well established and could, with support,
readily scale up for the full load, with distributed and mirror sites
worldwide). If this step were taken at a sufficient scale, the optimal
outcome would also become the inevitable one, and very soon.
The only other concern is to make sure there is a stable transition
strategy to prevent abrupt chaotic events from occurring as publishers
experience the S/L/P cancellation crunch. So the second thing a
Consortium could do, besides endorsing and encouraging author
self-archiving, is to provide transitional support for publishers who
explicitly commit themselves to scaling down and moving from S/L/P-toll
based cost recovery to up-front page charges. If this is not done,
quality control could break down, as known, experienced publishers pull
out and nothing is in place to take over their function.
Well, that's it; it should be familiar to some of you as my "subversive
proposal" of a few years ago, updated to take into account some of the
further evidence and experience that has accumulated since then.
<http://www.library.yale.edu/~okerson/subversive.html>
I now proceed to quote/comment mode for some of the specifics:
> In the meantime, pressure to enact regressive copyright legislation has
> added another important element. The ease with which electronic files
> may be copied and forwarded has encouraged publishers and other owners
> of copyrighted material to seek means for denying access to anything
> they own in digital form to all but active subscribers or licensees.
<http://www.cogsci.soton.ac.uk/~harnad/science.html>
Precisely (see the URL above). And this is why the main function (as
Steve Koonin correctly perceived) of "endorsing and encouraging
self-archiving" on the part of the Consortium will be to make sure that
authors are not intimidated into signing copyright agreements that
deprive them of the right to self-archive online. That's all they need
to retain. Publishers can continue to have full and exclusive rights to
SELL the papers, in either medium, paper or online. The author need
only retain the right to give them away for free online. THAT is what
needs the weight of an AAU and Learned Society Consortium, NOT an
alternative quality-control board!
<http://www.chronicle.com/free/v45/i04/04a02901.htm>
The American Physical Society has provided a fine model for the
self-archiving policy of publishers who promote rather than oppose what
is in the best interests of both learned researchers and learned
research. This is the model to be adopted by all learned journal
publishers:
<http://www.cogsci.soton.ac.uk/~harnad/Hypermail/Author.Eprint.Archives/>
> II. A NEW MODEL
>
> 1. Support peer review and authentication
> 2. Support new models of presentation incorporating network technology
> 3. Permit "threaded" online discourse
> 4. Adapt to varying criteria among disciplines
> 5. Assure the security of data
> 6. Reduce production time and expense
> 7. Include automated indexing
> 8. Provide multiple search options
This is all unrevolutionary and uncontroversial! I would add only the
importance of CITATION LINKING of the entire refereed journal corpus
(which can readily be done in a global Archive like Los Alamos, as well
as in an interoperable integration of the local Archives). Citations
are the seamless pathway that links the entire literature. Publishers
are planning to provide them as "add-ons" to the online version, in
order to hold it hostage to S/L/P (mainly L/P), with a kind of
"click-through monopoly" uniting their respective proprietary databases
through an "interoperable" network of toll-booths.
The self-archived literature can provide this for free, without the
firewalls, and this may prove to be a critical incentive to authors
to self-archive.
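To make the idea concrete, here is a toy sketch (my own illustration,
not the actual Los Alamos software, whose internals are not at issue
here): scan a paper's reference list for eprint identifiers and rewrite
them as direct links into the free archive, with no toll-booth in
between. The base URL and identifier pattern are illustrative
assumptions.

    import re

    # Illustrative assumptions: the archive's abstract-page URL and the
    # "archive/number" identifier pattern (e.g. "hep-th/9901001").
    ARCHIVE_URL = "http://xxx.lanl.gov/abs/"
    EPRINT_ID = re.compile(r"\b([a-z-]+/\d{7})\b")

    def link_citations(reference_text):
        """Replace every recognizable eprint identifier with a free link."""
        return EPRINT_ID.sub(lambda m: ARCHIVE_URL + m.group(1), reference_text)

    refs = "See hep-th/9901001 and cond-mat/9812345 for details."
    print(link_citations(refs))
    # -> See http://xxx.lanl.gov/abs/hep-th/9901001 and http://xxx.lanl.gov/abs/cond-mat/9812345 for details.

Interoperable local archives could do the same with a shared identifier
scheme, so that the links criss-cross institutions rather than
proprietary databases.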
> III. TRILATERAL PARTNERSHIP
>
> * CONSORTIUM OF UNIVERSITIES
> * PROFESSIONAL SOCIETIES
>
> Within the various disciplines, professional societies,
> committees, and working groups continue to establish journals
> with editorial boards that are commissioned to review and
> validate work submitted by authors for final publication.
> Societies retain the power to publish and sell their journals
> in print or non-networked electronic formats such as CD-ROM
> or DVD-ROM; for the foreseeable future, many readers are
> likely to prefer receiving subscriptions as they do now.
As long as online (networked) access is free of S/L/P, this is fine!
> * AUTHORS
>
> Supported by easy-to-use inputting protocols and standards,
> authors perform their own technical writing, copy editing,
> document formatting, etc., or else contract for these
> services from technical writing consultants (see Section V,
> Document Preparation Services). They may submit preliminary
> findings or preprints to the preprint database, or finished
> work directly to an editorial board for formal review.
There is a fallacy here: Copy-editing occurs AFTER a paper has been
refereed, revised and accepted. Whatever stylistic help an author gets
before that is very important and welcome, but not the real thing.
Quality control for FORM begins after quality control for CONTENT, and
it will continue to be the responsibility of the publisher
(quality-controller); that is part of what the journal "label" attests
to; the author cannot be his own quality-controller.
> IV. DOCUMENT DATABASE
>
> The centerpiece of this proposal is a document database that
> incorporates and builds on important features derived from Paul
> Ginsparg's highly successful physics preprint server. Begun in 1991 and
> today comprising nearly 100,000 records in physics and related
> disciplines, xxx.lanl.gov demonstrates the viability of a large
> electronic archive that supports alerting services, automated hyperlink
> referencing, indexing, searching, and archiving. The proposed model
> also incorporates Ginsparg's recently developed plan to create an
> "intermediate buffer layer" overlaid on the raw preprint database and
> containing papers that have been subjected to a formal peer review.
> Such refereed papers may be aggregated into one or more journals that
> may exist at the buffer level.
The possibility of authenticated journal overlays for a Global Archive
is NOT captured by this unfortunately rather naive and unrealistic last
sentence. Archives can be sectored, and sectors can have
"certification" tags that are officially controlled by journals. But
there is a confusion here between self-archiving and refereed
publication again. An author can self-archive both unrefereed preprints
and refereed reprints, but he cannot CERTIFY that the latter have been
published by Journal X; only Journal X can do that. THAT is what the
journal overlay on the archive can provide.
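Purely as an illustration of that distinction (the field names and the
keyed-hash "stamp" are my own assumptions, not anything specified in
the proposal), here is a minimal sketch of a journal certification tag
as an overlay on a self-archived record:

    import hashlib
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ArchivedPaper:
        archive_id: str                        # the self-archived record
        full_text: str                         # preprint or refereed reprint
        certification: Optional[dict] = None   # set only by the certifying journal

    def certify(paper, journal, secret_key):
        """Only Journal X can attach Journal X's tag: here 'authentication'
        is a keyed hash over the accepted text; a real overlay would use a
        proper digital signature."""
        stamp = hashlib.sha256((secret_key + paper.full_text).encode()).hexdigest()
        paper.certification = {"journal": journal, "stamp": stamp}

    paper = ArchivedPaper("xxx/9903123", "final accepted text ...")
    certify(paper, "Journal X", secret_key="known only to Journal X")
    print(paper.certification["journal"])   # -> Journal X

The author deposits the text; the journal, and only the journal,
supplies the tag that certifies it as the refereed, accepted version.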
This notion of "aggregating" archived papers into one or more
"journals" does not make sense in the online medium: We don't need
aggregations. Even online journals will stop aggregating issues and
will instead publish single articles at a time. The rest will be done
by intelligent search and alerting engines (especially guided by
certification tags authenticated by the Journals), as well as by
citation linking and searching. Gather readings together for a course,
if you like, but there's no need for the notion of recombining them
into different "journals." That's just an obsolete and useless
holdover from the Gutenberg Era! This is the PostGutenberg Galaxy!
> This heterodoxical approach opens the
> possibility for authors to establish their reputations simultaneously
> in a variety of related fields.
Not sensible in this new medium again, I'm afraid. The way to establish
reputations in a variety of fields in the online medium is not by doing
"virtual multiple publication" with spurious collation "journals," but
via links, index words, and interdisciplinary contents and mailing
lists. This sort of thinking is still papyrocentric.
The only residual function of journals is the service of quality
control. Referees are a scarce, over-used resource (who work for
free). Multiple submission is already an abusive drain on the system
(rightly outlawed by most journals -- except Law journals, where
student-review rather than peer-review prevails, and student labour
comes cheap). Once a paper has been refereed and accepted once, it need
not appear in further journals. It is already there on the Net! It can
be linked to; it can be reviewed by review journals; it can be
commented on, formally and informally, linked to by citation; but there
is no point whatsoever in having it re-appear in still further
"journals."
<http://AMSCI-FORUM.AMSCI.ORG/scripts/wa.exe?A1=ind99&L=september-forum&F=lf#4>
> Further value is added by shortening
> the reader's path to the certified version of a paper and by using
> links to point the reader back to the database of preprints.
One (suitably backed up, mirrored, distributed and protected) certified
journal version is enough. The rest is just about tagging and linking.
> V. UNIQUE FEATURES
> EDITORIAL BOARDS
>
> Editorial boards obtain permission from the Consortium to create
> and support a journal on Consortium servers. Following the
> tradition of confidentiality, a board determines whether a paper
> merits inclusion; it recommends revisions to authors; it considers
> authors' responses and rebuttals to referees' critiques; and
> ultimately accepts or rejects the work. An editorial board may
> also establish standards for document preparation. Revised
> versions that are placed in the preprint server receive a "version
> stamp". Eventually a "watermark", indicating final acceptance, is
> applied to the certified version that will be retained in all
> permanent archives maintained by the Consortium.
What has been described here is precisely what will be left of the
established refereed journals once they become online-only. It is not a
"new alternative" in any respect -- except inasmuch as it pertains to
journals that are NOT established. That is hardly an advantage in
itself...
> Consortium editorial boards are not granted exclusivity, i.e., any
> paper may be accepted for inclusion in multiple "journals".
Unrealistic again, alas, and extremely naive about what a scarce
resource peers' finite refereeing time is. One (successful) peer-review
per article is enough; the rest is just a matter of citation, linking
and commentary.
> In addition, the editorial boards may not exclude a paper based on
> "prior publication" in the preprint server or elsewhere.
This, in contrast, is an extremely important and substantive point, for
the Consortium must encourage and support authors in every possible way
(and there are many) in self-archiving preprints in defiance of
arbitrary and counterproductive strictures, such as the
"prior-publication" exclusion clause that some journals still try to
invoke to prevent the self-archiving of preprints.
<http://www.cogsci.soton.ac.uk/~harnad/science.html>
These arbitrary and extremely counterproductive strictures are probably
also unenforceable: How many changes do I have to make in a
self-archived preprint before it is no longer the same draft I submit
to a journal that endeavours to exclude papers that have already been
self-archived as preprints? And how are journals to enforce this? By
constantly trawling the Net for lookalikes for every paper submitted?
How look-alike?
The very same slippery-slope logic of course applies to any attempt to
forbid self-archiving of refereed reprints: How many changes turn my
preprint into a reprint, and vice versa? The absurdity and
counterproductiveness of the exercise is also made apparent by this
slippery slope. Authors GIVE this literature away, and there is no
ethical or enforceable way or reason to stop them from doing so in this
new medium. The rules were made for another medium (paper), and another
kind of literature (the vast, non-give-away, trade literature of books,
magazines, etc.).
Copyright laws were not invented in order to prevent authors from
giving away their own intellectual property, i.e., to protect them from
themselves! They always had two purposes:
(1) The first was to protect authors from THEFT OF TEXT-AUTHORSHIP,
i.e., to outlaw plagiarism. This still applies to the online
literature and is not at issue.
(2) The second was to protect authors from THEFT OF TEXT. Publishers
shared an interest in this, for they had royalty agreements with the
authors, and both authors and publishers would have lost revenue if the
text could be stolen.
In a nutshell, for the refereed journal literature (only), the
online-only era means that (2) is no longer justified or necessary.
Authors can self-archive their own texts, free for all, and publishers
need only provide quality control and its certification, a service they
can be paid for up-front, once there is no longer an S/L/P market for
the freely available texts. Until then, a license to sell the texts by
S/L/P is all that publishers need; there is no justification whatsoever
for attempting to prevent self-archiving. Attempts to do so should be
countered by Scholar's Forum (and all of us) head-on, by all legal and
practical means.
<http://trauma-pages.com/harnad96.htm>
> DOCUMENT PREPARATION SERVICES
>
> Authors may require considerable assistance in preparing
> manuscripts that meet editorial boards' submission standards. In
> this model, the Consortium supports a directory of independent
> technical writers and editors with expertise in a variety of
> fields. These consultants may apply for inclusion or be
> recommended by an editorial board. The Consortium may also devise
> a procedure for certifying those who offer to provide document
> preparation services on a contract basis to authors.
Fine, but don't confuse presubmission stylistic help with
post-acceptance editing and copy editing. The former can come from
colleagues and institutional writing assistants under the author's
solicitation and control, but the latter comes from the quality
controller/certifier.
> COPYRIGHT
>
> Authors or universities retain copyright according to institution
> policies. A mechanism at the input level requires authors to
> grant a limited, non-exclusive license to the Consortium. This
> agreement grants the right to provide unlimited access to all work
> in either preprint or archival servers for non-commercial purposes
> for the term of the copyright. Authors may grant limited-use
> licenses for their work to other not-for-profits or commercial
> entities, for which they may receive compensation, as long as such
> agreements do not infringe upon any rights previously assigned to
> the Consortium.
This is critical: Authors must be protected, and feel protected, from
any need to give up self-archiving rights. THAT'S ALL!
<http://www.princeton.edu/~harnad/science.html>
> THREADED DISCOURSE
>
> The model supports threaded discourse based on the work of
> researchers from Rand and Caltech to create a HyperForum.
> Colleagues may participate in dialogue on findings, however,
> anonymous comments will not be accepted.
Open Peer Commentary is my specialty, and the above component is
well-intentioned but again naive. Nothing critical hinges on it,
however, so I will pass over it.
See: http://citd.scar.utoronto.ca/EPub/talks/Harnad_Snider.html
> The preprint server with its threaded discourse permits editorial
> boards not only to follow comments from the field, but also to
> identify important work and invite submission for review leading
> to inclusion in a journal.
One thing to consider is sorting commentary along two dimensions: (1)
comments on unrefereed preprints vs. comments on refereed reprints, and
(2) refereed vs. unrefereed comments.
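A small sketch of that two-by-two sorting (the tag names are
illustrative assumptions only):

    from dataclasses import dataclass

    @dataclass
    class Comment:
        on_refereed_target: bool   # (1) target: refereed reprint vs unrefereed preprint
        is_refereed: bool          # (2) the comment itself: refereed vs unrefereed

    def bin_label(c):
        target = "a refereed reprint" if c.on_refereed_target else "an unrefereed preprint"
        kind = "refereed" if c.is_refereed else "unrefereed"
        return f"{kind} comment on {target}"

    print(bin_label(Comment(on_refereed_target=False, is_refereed=True)))
    # -> refereed comment on an unrefereed preprint

Readers (and alerting engines) could then filter the commentary by
either dimension.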
> Of particular value is the opportunity
> for an editorial board to incorporate into their journal work
> usually associated with another field but of special interest to
> theirs.
Again obsolete, if thought of as further collation-journals. All that
is needed is citations and links!
> Concomitantly, this feature overcomes the need to require
> authors to prepare a new version of existing work.
Updates can be archived and linked too, both refereed and unrefereed
ones.
> RESOURCE DISCOVERY
>
> Subjects and names as well as other metadata and full text will be
> searchable using the best available technology, including keyword
> and phrase searching, Boolean operators, proximity, truncation,
> and relevance ranking. It will also be possible to browse the
> archive by subject term, author name, or chronologically.
And one of the best ways of all: via citation links.
> VII. NEXT STEPS
>
> The success of this model depends critically on winning the support of
> "champions" from the research community and attracting participants in
> initial experiments who are likely to come from emerging areas of
> research that have not yet had their journals published either
> commercially or by professional societies. Partnering benefits such
> groups by allowing them to leverage Consortium resources to announce
> their findings economically and to a broad audience.
The only thing that needs championing is self-archiving. Once that is
practised, everything else will follow suit. To champion forfeiting the
established journals in favour of untested new generic ones is, in my
opinion, Quixotic; nor is it even well motivated, if the new journals
are still to be supported by S/L/P.
> Before this is accomplished, research universities must assemble a
> Consortium to support the development and implementation of this model.
There is no model here yet! Why should universities back the
abandonment of the established journals for generic newcomers? And how
are the newcomers to be funded? Through S/L/P again? But that just
defeats the purpose.
But once the irrelevant and doubtful components are dropped, a model
does indeed emerge: Scholar's Forum will be a robust, interoperable
Archive in which authors from all disciplines can self-archive their
unrefereed preprints and their refereed reprints. The Consortium will
not only provide the Archive, but it will also use its collective
influence and resources to facilitate its use, by helping to protect
authors from attempts to prevent self-archiving, and by vigorously
promoting it in their institutions.
> The Consortium must assign lead participants from university IT
> departments, libraries, and faculty; identify and define elements of
> cost and develop a budget; establish a production schedule; develop
> underlying systems, standards, and protocols to enable champions,
> editors to create new journals; and attract funding from within the
> Consortium and from external sources.
This sounds like getting busy planning new online journals. But we don't
need new journals, online or otherwise! We need to free the existing
journal literature. That requires a realistic plan, and a careful
transitional strategy. So far, this "Model" can be misinterpreted as
just a lot of hoopla about establishing new online-only journals. But
that's not the point! Most of the established journals are or will soon
be available online too. What is needed is a way to free them from all
access barriers.
> CONCLUSION
>
> A growing number of researchers and information professionals recognize
> that scholarly communication is at a crossroads; many are seeking
> innovative solutions on their own to the wide variety of technical
> challenges that networked alternatives present. While much visionary
> work has emerged, the absence of any significantly new prototype for
> exchanging and preserving research results beyond xxx.lanl.gov suggests
> the advantages that may accrue from a more broadly-based, collaborative
> approach.
But local and global (xxx.lanl) self-archiving IS the new prototype!
You need only put your own pieces together slightly differently to see that.
> A Consortium of universities, committed to developing and maintaining
> an integrated platform supporting all aspects of the scholarly
> communications process, also provides a basis for conducting meaningful
> experiments. Universities have the necessary critical mass of
> participants from varied disciplines. University faculty are already
> well represented on present editorial boards and include many editors;
> strong representation of university faculty on the new editorial boards
> established under the auspices of the Scholar's Forum continues this
> tradition. Universities have close ties to professional societies, have
> expertise in information technology, and have a large pool of creative
> student programmers who can contribute to the infrastructure
> developments that will be needed. Since universities are responsible
> for most of the work that appears in the scholarly literature,
> well-defined, committed administrative support can take advantage of
> major economies of scale to curtail costs as access to the scholarly
> literature is enhanced.
A Consortium will certainly provide the clout, but it won't do any good
until the game-plan is tightened into a coherent one (along the lines
described here, in my opinion).
--------------------------------------------------------------------
Stevan Harnad harnad@cogsci.soton.ac.uk
Professor of Cognitive Science harnad@princeton.edu
Department of Electronics and phone: +44 1703 592-582
Computer Science fax: +44 1703 592-865
University of Southampton http://www.cogsci.soton.ac.uk/~harnad/
Highfield, Southampton http://www.princeton.edu/~harnad/
SO17 1BJ UNITED KINGDOM ftp://ftp.princeton.edu/pub/harnad/