On Mon, 26 Sep 2005, Donald Waters wrote:
> I can see why one might argue that citation of a research
> publication represents a kind of "return on investment" in research.
> However, the logic of Mr. Harnad's argument escapes me when he insists in
> Proposition #2 below that publication itself is not also a kind of return
> on investment.
Yes; the number of publications *is* a measure of return on the investment.
I should have said it is not the best measure. I did say that a piece of
research that is not used may as well not have been done. Hence usage is a
better measure of impact than mere publication.
> Mr. Harnad's use of "return on investment" is not at all
> rigorously defined, but if you accept his usage as the common-sense
> assertion that it is, then not only does the research publication surely
> represent another kind of "return"; it is a primary and immediate return
> compared to a subsequent citation to the publication, which is secondary
> and would likely emerge only over time.
To repeat: a piece of research unused is a piece of research that may as
well not have been done, or funded.
Of course usage may come with time (and the citation counts were
estimated over a 10-year baseline). But the point was not that research
that is accessible only to researchers who are at institutions that can
afford the journal in which it happened to be published has no citation
impact at all. The point was that it does not have its full potential
citation impact. It loses 50%-250% of it. And that citation impact is
the more accurate measure of the return on the public's investment in
the research; not the crude count of articles published.
(If you don't like "return on investment," use "getting your money's
worth": research that is not self-archived gives the public 50%-250% less
than what it paid for.)
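For anyone who wants the arithmetic spelled out mechanically, here is a
minimal sketch in Python; the figures are the ones from my original posting,
quoted in full at the end of this message, and the working assumption -- a
deliberately crude one -- is that lost citation impact translates
proportionally into lost return on the research expenditure:

    # Back-of-the-envelope sketch of the estimate; figures as cited below.
    # The proportionality assumption (lost citations ~ lost return) is a
    # deliberate simplification, as noted above.

    uk_research_spend = 3.5e9           # GBP per year (RCUK investment)
    self_archived_share = 0.15          # fraction of articles self-archived
    citation_advantage = (0.50, 2.50)   # 50%-250% more citations if self-archived

    not_archived_spend = uk_research_spend * (1 - self_archived_share)
    low, high = (not_archived_spend * a for a in citation_advantage)

    print(f"Output not self-archived: GBP {not_archived_spend / 1e9:.2f} bn")
    print(f"Lost potential return:    GBP {low / 1e9:.2f} - {high / 1e9:.2f} bn")
    # -> about GBP 2.98 bn not self-archived; a loss of roughly 1.49 - 7.44 bn,
    #    of which only the conservative lower bound (~1.5 bn) was quoted.

Whether one accepts the proportionality assumption is of course the
substantive question; the sketch only makes the bookkeeping explicit.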
> If there are "returns on
> investment" in research, then we are faced with a proposition requiring
> careful analysis of multiple factors and their relative weighting over a
> base of investment, not simply an "either/or" proposition as Mr. Harnad so
> dismissively suggests.
Publication counts are a crude first approximation; citation counts are a
better approximation. There are of course even better measures of research
impact (downloads, post-publication peer evaluations, and many of the
scientometric measures of the future that an online full-text corpus will
spawn).
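To make the hierarchy of approximations concrete, here is a toy sketch in
Python; the three records are invented, purely to show what is being counted
in each case:

    # Toy illustration of successively better impact approximations.
    # The article records are invented, purely for illustration.

    articles = [
        {"id": "A1", "citations": 12, "downloads": 340},
        {"id": "A2", "citations": 0,  "downloads": 15},
        {"id": "A3", "citations": 45, "downloads": 1200},
    ]

    publication_count = len(articles)                       # crudest measure
    citation_count = sum(a["citations"] for a in articles)  # better measure
    download_count = sum(a["downloads"] for a in articles)  # earlier, noisier signal

    print(publication_count, citation_count, download_count)   # -> 3 57 1555

The scientometric measures of the future would simply be richer functions
computed over the same kind of online full-text corpus.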
> Second, and more troubling, even if you allow for his black and white view
> of the world, Mr. Harnad fails to provide any sort of rationale for why it
> is reasonable and valid to impute a specific monetary value to a citation,
> much less as a "return on investment" in an accounting system that
> measures country-wide investment in research. He simply assumes that it
> has such value, takes a percentage difference in the quantity of citations
> to published articles that are distributed in different ways, and applies
> the difference to the assumed base value, which is the UK's annual
> investment in research. To identify a value for citations, why not apply
> the percentage difference to the UK's gross national product, the annual
> gross sales of research publications in the UK, the total salaries of
> university researchers in the UK, or their total expenditures on
> groceries? An arbitrary application of percentages to an arbitrary base
> value is hardly a disciplined way to calculate value.
Those alternative baselines would be rather harder to calculate in comparing
self-archived and non-self-archived articles, but by all means go ahead and
do it!
But, no, UK revenue from articles sold is *not* part of a UK
researcher's research impact.
> In his original article, which appears at
> http://openaccess.eprints.org/index.php?/archives/28-guid.html, Mr. Harnad
> does try to be more specific and refers to a 1986 article by A.M. Diamond
> in the Journal of Human Resources entitled "What is a citation worth."
> Mr. Harnad points to a reprint of the Diamond article introduced by Eugene
> Garfield at http://www.garfield.library.upenn.edu/essays/v11p354y1988.pdf,
> but his adoption of the Diamond article is slavish and uncritical despite
> the warnings contained both in Diamond's highly nuanced and qualified
> article and especially in Garfield's introduction.
Slavish adoption aside, please note that the Diamond measure was used for a
different calculation: what the individual researcher is losing, per
citation, in salary and grant income. That is not the same measure as the
other one, namely whether the UK public is getting its full money's worth
for its £3.5 billion investment.
> Diamond had examined the salary structures of university professors and
> roughly estimated that the marginal value of a citation fell at the time
> within the range of US$50-1,300 of additional salary. For his own
> purposes, Mr. Harnad simply takes this estimate out of context, converts it
> to pounds sterling, inflates it to current value, and multiplies it by the
> number of citations that authors presumably lose by failing to
> self-archive. However, in the introduction to the Diamond article,
> Garfield cautions emphatically against just such a usage.
Well, for what it's worth, I checked my article with both Diamond and
Garfield. Diamond replied with no objection to the application. Garfield
didn't reply.
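For concreteness, here is a minimal sketch of the shape of that per-citation
calculation; the exchange rate, inflation factor and citations-lost figure
below are placeholders, not the figures used in my article, so only the
structure of the calculation should be read off it:

    # Sketch of the per-citation salary/grant calculation discussed above.
    # NOTE: exchange rate, inflation factor and citations lost per article
    # are illustrative placeholders, not the figures used in the article.

    diamond_value_usd = (50.0, 1300.0)  # Diamond (1986): marginal salary value of a citation
    usd_to_gbp = 0.55                   # placeholder exchange rate
    inflation_factor = 1.8              # placeholder: 1986 dollars -> current value
    citations_lost_per_article = 2.0    # placeholder: citations forgone by not self-archiving

    low, high = (v * usd_to_gbp * inflation_factor * citations_lost_per_article
                 for v in diamond_value_usd)
    print(f"Salary/grant loss per non-self-archived article: GBP {low:.0f} - {high:.0f}")

The point is only that this is a per-researcher measure (salary and grant
income), distinct from the country-level return-on-investment figure above.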
> Garfield points out that Diamond "is not saying that every additional
> citation is worth 'X' amount of dollars" and then continues: "Economists
> are interested in the structure of wages and in its components, and they
> present their data to show that structure. Diamond does not claim
> that there is any simple, automatic connection between citations and
> salaries. There is no real evidence of such a causal connection.
> Rather, as Harriet Zuckerman, Department of Sociology, Columbia University
> points out, from Diamond's findings we can conclude that citations can be
> regarded 'as a kind of "proxy" for certain services for which scientists
> and scholars get paid.'"
A correlation is a correlation. Establishing causality would require a much
more complicated study. But since promotion committees have, if anything,
been counting citations more explicitly since 1986, causality is likely.
See also the studies showing the correlation between citation counts and
the rankings (and funding) produced by the UK Research Assessment Exercise
(RAE). (Citations are not counted directly by the RAE, but in designing
their RAE submissions the ranked departments and institutions do emphasise
the citation impact factor -- the average citation ratio -- of the journals
their researchers publish in.)
http://www.ariadne.ac.uk/issue35/harnad/
> Mr. Harnad may well be on to an important subject and line of argument in
> suggesting that citations are a kind of return on investment. However,
> close inspection of the concepts and logic of his argument suggests that
> he is quite a bit further from proving his case than he seems to have
> convinced himself that he is.
No proof here: Just conservative estimates.
Stevan Harnad
> Don Waters
>
> -----Original Message-----
> Sent: Thursday, September 22, 2005 7:05 PM
> To: liblicense-l_at_lists.yale.edu; AmSci Forum
> Subject: Re: Open access to research worth £1.5bn a year
>
> On Wed, 21 Sep 2005, Sally Morris (ALPSP) wrote:
>
> Re: http://www.theregister.co.uk/2005/09/16/free_access_research/
>
> > Am I alone in failing completely to understand the basis for Stevan's
> > calculation of the 1.5 bn? It seems to be (hypothetical (and as far as
> > I can follow, unexplained) figure) x (hypothetical figure) x
> > (hypothetical figure). Am I missing something?
> >
> > Perhaps someone could explain it to me nice and slow...
>
> Dear Sally, happy to oblige:
>
> (1) The UK spends £3.5 billion annually on funding UK research:
> http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/4620.html
>
> (2) The return on that investment is not the number of UK articles
> published (130,000 per year)
> http://auth.athensams.net/?ath_dspid=ISI.PHL&ath_returl=http://isiknowledge.com/
>
> (3) The return on that investment is the number of UK articles used,
> built-upon, cited: 761,600 citations per year:
> http://auth.athensams.net/?ath_dspid=ISI.PHL&ath_returl=http://isiknowledge.com/
>
> (4) 15% of articles are self-archived worldwide, 85% are not:
> http://www.crsc.uqam.ca/lab/chawki/graphes/EtudeImpact.htm
>
> (5) Self-archived articles have 50%-250% more citations:
> http://www.crsc.uqam.ca/lab/chawki/graphes/EtudeImpact.htm
>
> (6) Hence, for 85% of its research output (£2.98 billion worth)
>
> (7) the UK is losing 50%-250% of the potential return on its investment:
> £1.49 - £7.44 billion worth
>
> (8) To be conservative, I used only the lower end of this estimate of the
> UK's annual loss in potential return on its research investment: £1.5
> billion worth
> http://openaccess.eprints.org/index.php?/archives/29-guid.html
>
> In other words, the fiction is not in the figures I have cited on the RCUK
> investment in research and the empirical evidence for the loss of
> potential return on that investment
> http://openaccess.eprints.org/index.php?/archives/28-guid.html
>
> The fiction is all in Sally's own non-figures and non-evidence on
> publishers' loss of potential revenues as a result of self-archiving:
> http://openaccess.eprints.org/index.php?/archives/20-guid.html
>
> Stevan Harnad
>
>
Received on Wed Sep 28 2005 - 01:42:32 BST