The following is a comment on an article that appeared in today's
Independent about the RAE and Metrics (followed by a response to another
piece in the Independent about Web Metrics).
Re: Hodges, L. (2006) The RAE is dead - long live metrics
The Independent April 13 2006
http://education.independent.co.uk/higher/article357343.ece
Absolutely no one can justify (on the basis of anything but superstition)
holding onto an expensive, time-wasting assessment system such as the RAE,
which produces rankings that are almost perfectly correlated with, hence
almost exactly predictable from, inexpensive objective metrics such as
prior funding, citations and research student counts.
http://www.hm-treasury.gov.uk/media/1E1/5E/bud06_science_332.pdf
Hence the only two points worth discussing are (1) which metrics to use
and (2) how to adapt them to each discipline.
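To make "almost exactly predictable" concrete: the standard check is a
rank correlation between the RAE rankings and each candidate metric. A
minimal sketch in Python, with purely invented department names and
funding figures (not real data), might look like this:

    # How well does a single prior metric (here, prior funding) predict
    # RAE rank? All numbers below are invented for illustration only.
    from scipy.stats import spearmanr

    # Hypothetical departments: (RAE rank, prior funding in GBP thousands)
    departments = {
        "Dept A": (1, 4200),
        "Dept B": (2, 3900),
        "Dept C": (3, 2100),
        "Dept D": (4, 2300),
        "Dept E": (5, 800),
    }

    rae_ranks = [rank for rank, _ in departments.values()]
    funding = [money for _, money in departments.values()]

    # Spearman's rho compares the two rank orderings; a value near -1
    # here means more prior funding goes with a better (lower) RAE rank.
    rho, p_value = spearmanr(rae_ranks, funding)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

A rho close to +/-1 across a whole panel's departments is what "almost
perfectly correlated" means in practice.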
The web has opened up a vast and rich universe of potential
metrics that can be tested for their validity and predictive power:
citations, downloads, co-citations, immediacy, growth-rate, longevity,
interdisciplinarity, user tags/commentaries and much, much more. These
are all measures of research uptake, usage, impact, progress and
influence. They have to be tested and weighted according to the
unique profile of each discipline (or even subdiscipline). Prior
funding is highly predictive, but it also generates a Matthew Effect:
a self-fulfilling, self-perpetuating prophecy.
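One operational reading of "tested and weighted according to the unique
profile of each discipline": fit the candidate metrics against an external
criterion (say, the last panel-based RAE scores) separately within each
discipline, and let the fitted weights differ. A minimal sketch, with
invented numbers and ordinary least squares in Python:

    # Discipline-specific weighting of candidate metrics (toy data).
    # Criterion: a hypothetical panel-based score per department.
    # Predictors: citations, downloads, research-student count.
    import numpy as np

    data_by_discipline = {
        "Physics": (
            np.array([[120, 900, 12], [80, 600, 9], [40, 300, 4],
                      [200, 1500, 20], [60, 450, 6]]),
            np.array([5.0, 4.0, 2.0, 6.5, 3.0]),
        ),
        "History": (
            np.array([[15, 400, 6], [5, 150, 2], [25, 700, 10],
                      [10, 250, 3], [20, 500, 7]]),
            np.array([4.0, 2.0, 6.0, 3.0, 5.0]),
        ),
    }

    for discipline, (metrics, scores) in data_by_discipline.items():
        # Add an intercept column and solve ordinary least squares;
        # the coefficients are the discipline-specific weights.
        design = np.column_stack([metrics, np.ones(len(scores))])
        weights, *_ = np.linalg.lstsq(design, scores, rcond=None)
        print(discipline, "weights:", np.round(weights, 3))

The point of the exercise is precisely that the fitted weights (and which
metrics earn any weight at all) will differ from discipline to discipline.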
I would not for a moment believe, however, that any (research) discipline
lacks predictive metrics of research performance altogether. Even less
credible is the superstitious notion that the only (or the best) way
to evaluate research is for RAE panels to redo, needlessly and locally,
the peer review that has already been done, once, by the journals in which
the research has already been published.
The urgent feeling that this human re-review is necessary has
nothing to do with the RAE or metrics in particular; it is
just a generic human superstition (and irrationality) about
population statistics versus one's own unique, singular case:
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Explaining.Mind96/0221.html
http://en.wikipedia.org/wiki/Base_rate_fallacy
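For readers unfamiliar with the base-rate fallacy linked above, a minimal
numeric sketch (all figures invented purely for illustration) of how
case-by-case judgment misleads when the population statistics are ignored:

    # A "case-by-case" check that is 95% accurate still misleads when
    # the base rate is ignored. All numbers are invented.
    base_rate = 0.01       # 1% of submissions are actually weak
    sensitivity = 0.95     # P(flagged | weak)
    false_positive = 0.05  # P(flagged | not weak)

    p_flagged = sensitivity * base_rate + false_positive * (1 - base_rate)
    p_weak_given_flagged = sensitivity * base_rate / p_flagged

    print(f"P(actually weak | flagged) = {p_weak_given_flagged:.2f}")
    # ~0.16: most flagged cases are false alarms, which the population
    # statistics make obvious but case-by-case intuition does not.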
-----
Re: Diary, The Independent, 13 April 2006 (another article in the same issue)
http://education.independent.co.uk/higher/
> 'A new international university ranking has been launched and
> the UK has 25 universities in the world's top 300. The results
> are based on the popularity of the content of their websites on
> other university campuses. The G Factor is the measure of how
> many links exist to each university's website from the sites
> of 299 other research-based universities, as measured by 90,000
> google searches. No British university makes it into the Top 10;
> Cambridge sits glumly just outside at no 11. Oxford languishes at
> n.20. In a shock Southampton University is at no.25 and third in
> Britain. Can anyone explain this? Answers on a postcard. The rest
> of the UK Top 10, is UCL, Kings, Imperial, Sheffield, Edinburgh,
> Bristol and Birmingham.'
There are four reasons for the University of Southampton's extremely
high overall webmetric rating:
(1) U. Southampton's university-wide research performance
(2) U. Southampton's Electronics and Computer Science (ECS)
Department's involvement in many high-profile web projects and
activities (among them the semantic web work of the web's inventor,
ECS Prof. Tim Berners-Lee, the Advanced Knowledge Technologies
(AKT) work of Prof. Nigel Shadbolt, and the pioneering web linking
contributions of Prof. Wendy Hall)
(3) The fact that since 2001 U. Southampton's ECS has had a
mandate requiring that all of its research output be made Open
Access on the web, and that Southampton has a university-wide
self-archiving policy (soon to become a mandate) too
(4) The fact that maximising access to research (by self-archiving
it free for all on the web) maximises research usage and impact
(and hence web impact)
This all makes for an extremely strong Southampton web presence,
as reflected in such metrics as the "G-Factor"
http://www.universitymetrics.com/tiki-index.php?page=G-Factor
which places Southampton 3rd in the UK and 25th among the world's
top 300 universities, or Webometrics
http://www.webometrics.info/
which places Southampton 6th in the UK, 9th in Europe, and 80th
among the top 3000 universities it indexes.
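Based on the Diary item's description of the G-Factor (the number of links
to each university's website from the sites of the other research-based
universities, obtained via Google searches), here is a hedged sketch of how
such a score could be tallied, assuming one already has the pairwise link
counts (the universities and counts below are invented placeholders):

    # G-Factor-style score: total inbound links to each university's
    # site from the sites of the other universities. Counts invented.
    links = {
        # (linking site, linked-to site): number of links found
        ("cam.ac.uk", "soton.ac.uk"): 340,
        ("ox.ac.uk", "soton.ac.uk"): 290,
        ("soton.ac.uk", "cam.ac.uk"): 410,
        ("ox.ac.uk", "cam.ac.uk"): 520,
        ("cam.ac.uk", "ox.ac.uk"): 480,
        ("soton.ac.uk", "ox.ac.uk"): 300,
    }

    scores = {}
    for (source, target), count in links.items():
        if source != target:  # self-links do not count
            scores[target] = scores.get(target, 0) + count

    # Rank universities by total inbound links from the other sites.
    for uni, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(uni, score)

The real exercise reportedly aggregated some 90,000 such Google-derived
counts across 300 universities; the principle is the same.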
Of course, these are extremely crude metrics, but Southampton itself
is developing more powerful and diverse metrics for all universities
in preparation for the newly announced metrics-only Research Assessment
Exercise.
http://openaccess.eprints.org/index.php?/archives/75-guid.html
Stevan Harnad
American Scientist Open Access Forum
http://amsci-forum.amsci.org/archives/American-Scientist-Open-Access-Forum.html
-------------------------
Some references:
Harnad, S. (2001) Why I think that research access, impact and
assessment are linked. Times Higher Education Supplement 1487:
p. 16.
http://cogprints.org/1683/
Hitchcock, S., Brody, T., Gutteridge, C., Carr, L., Hall, W.,
Harnad, S., Bergmark, D. and Lagoze, C. (2002) Open Citation
Linking: The Way Forward. D-Lib Magazine 8(10).
http://eprints.ecs.soton.ac.uk/7717/
Harnad, S. (2003) Why I believe that all UK research output should
be online. Times Higher Education Supplement. Friday, June 6 2003.
http://eprints.ecs.soton.ac.uk/7728/
Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated
online RAE CVs Linked to University Eprint Archives: Improving
the UK Research Assessment Exercise whilst making it cheaper
and easier. Ariadne 35.
http://www.ariadne.ac.uk/issue35/harnad/
Berners-Lee, T., De Roure, D., Harnad, S. and Shadbolt, N. (2005)
Journal publishing and author self-archiving: Peaceful Co-Existence
and Fruitful Collaboration.
http://eprints.ecs.soton.ac.uk/11160/
Brody, T., Harnad, S. and Carr, L. (2006) Earlier Web Usage
Statistics as Predictors of Later Citation Impact. Journal of
the American Society for Information Science and Technology
(JASIST).
http://eprints.ecs.soton.ac.uk/10713/
Shadbolt, N., Brody, T., Carr, L. & Harnad, S. (2006) The Open
Research Web: A Preview of the Optimal and the Inevitable. In:
Jacobs, N. (Ed.) Open Access: Key Strategic, Technical and Economic
Aspects. Chandos.
http://www.ecs.soton.ac.uk/~harnad/Temp/shad-bch.doc
Citebase impact ranking engine
http://citebase.eprints.org/
Beans and Bean Counters
http://www.thes.co.uk/search/story.aspx?story_id=2023828
Bibliography of Findings on the Open Access Impact Advantage
http://opcit.eprints.org/oacitation-biblio.html
UNIVERSITIES: If you have adopted or plan to adopt an institutional
policy of providing Open Access to your own research article output,
please describe your policy at:
http://www.eprints.org/signup/sign.php
UNIFIED DUAL OPEN-ACCESS-PROVISION POLICY:
BOAI-1 ("green"): Publish your article in a suitable toll-access journal
http://romeo.eprints.org/
OR
BOAI-2 ("gold"): Publish your article in a open-access journal if/when
a suitable one exists.
http://www.doaj.org/
AND
in BOTH cases self-archive a supplementary version of your article
in your institutional repository.
http://www.eprints.org/self-faq/
http://archives.eprints.org/
http://openaccess.eprints.org/