The UK Research Assessment Exercise's (RAE's) sensible and overdue
transition from time-consuming, cost-ineffective panel review to low-cost
metrics is moving forward:
http://www.hefce.ac.uk/news/hefce/2006/rae.htm
However, there is still a top-heavy emphasis, in the RAE's provisional
metric equation, on the Prior-Funding metric: "How much research funding
has the candidate department received in the past?"
"The outcome announced today is a new process that uses for all
subjects a set of indicators based on research income, postgraduate
numbers, and a quality indicator."
Although prior funding should be *part* of the equation, it should
definitely not be the most heavily weighted component, a priori, in any
field. Otherwise it will merely generate a Matthew Effect/Self-Fulfilling
Prophecy (the rich get richer, etc.) and it will also collapse the
UK Dual Funding System ((1) competitive proposal-based funding *plus*
(2) RAE performance-based, top-sliced funding) into just a scaled-up
version of (1) alone.
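For concreteness, here is one purely illustrative form such a metric
equation might take (the symbols and weights are hypothetical, not the
RAE's own): the predicted quality score for a department d would be

    \hat{Q}(d) = w_1 F(d) + w_2 P(d) + w_3 C(d) + \dots = \sum_i w_i m_i(d)

where F is prior research funding, P is postgraduate numbers, C is
citations, and the m_i are whatever further metrics prove predictive.
The point is that the weights w_i should be estimated empirically,
discipline by discipline, rather than fixed a priori -- least of all
with w_1, the prior-funding weight, dominating from the outset.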
Having made the right decision -- to rely far more on low-cost metrics
than on costly panels -- the RAE should now commission rigorous,
systematic studies of metrics, testing metric equations discipline by
discipline. There are not just three but many potentially powerful
and predictive metrics that could be used in these equations (e.g.,
citations, recursively weighted citations, co-citations, hub/authority
indices, latency scores, longevity scores, downloads, download/citation
correlations, endogamy/exogamy scores, and many more rich and promising
indicators). Unlike panel review, metrics are automatic and cheap to
generate, and in the 2008 parallel panel/metric exercise they can be
tested and cross-validated against the panel rankings, field by field.
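As a minimal sketch of what such field-by-field cross-validation might
look like (everything here is hypothetical: the data are randomly
generated stand-ins for departmental metric profiles and panel scores,
and Spearman rank correlation is just one plausible measure of
agreement with the panel rankings):

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)

    # Hypothetical field: 40 departments x 3 candidate metrics
    # (e.g. prior funding, postgraduate numbers, citations), standardised
    X = rng.standard_normal((40, 3))
    # Hypothetical panel scores: correlated with the metrics, plus noise
    panel = X @ np.array([0.2, 0.3, 0.5]) + 0.3 * rng.standard_normal(40)

    # Estimate the metric-equation weights on half the departments...
    fit, val = slice(0, 20), slice(20, 40)
    w, *_ = np.linalg.lstsq(X[fit], panel[fit], rcond=None)

    # ...then test the fitted equation on departments it never saw
    rho, _ = spearmanr(X[val] @ w, panel[val])
    print(f"fitted weights: {np.round(w, 2)}; held-out rho = {rho:.2f}")

A high held-out correlation would mean the metric equation reproduces
the panel's rankings at a fraction of the cost; a low one would flag a
field where the equation, or the metrics themselves, need rethinking.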
In all metric fields -- biometrics, psychometrics, sociometrics -- the
choice and weighting of metric predictors is based on careful, systematic
prior testing and validation, not on a hasty a priori choice. Biased
predictors are also avoided: the idea is to maximise the depth, breadth,
flexibility and validity of the predictive power by choosing and
weighting the right metrics. More metrics are better than fewer, because
they serve as cross-checks on one another; this triangulation also
highlights anomalies, if any.
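That triangulation can be sketched the same way, again on hypothetical
data (the anomaly measure -- the spread of a department's standardised
scores across metrics -- is an illustrative choice, not an established
RAE procedure):

    import numpy as np

    rng = np.random.default_rng(1)
    scores = rng.standard_normal((40, 5))   # 40 departments x 5 metrics
    z = (scores - scores.mean(axis=0)) / scores.std(axis=0)

    # Cross-check: pairwise agreement among the metrics themselves
    print(np.round(np.corrcoef(z.T), 2))

    # Anomaly flag: departments whose metrics disagree sharply with one
    # another deserve individual scrutiny rather than a formula score
    spread = z.max(axis=1) - z.min(axis=1)
    print("departments worth a second look:", np.flatnonzero(spread > 3.0))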
Let us hope that good sense will not stop with the decision to convert
to metrics, but will continue to prevail in making an informed choice
among the rich spectrum of metrics available in the online age.
Excerpts from
"Response to consultation on successor to research assessment exercise"
http://www.hefce.ac.uk/news/hefce/2006/rae.htm
"In the Science and Innovation Investment Framework 2004-2014
(published in 2004), the Government expressed an interest in using
metrics collected as part of the 2008 RAE to provide a benchmark on
the value of metrics as compared to peer review, with a view to making
more use of metrics in assessment and reducing the administrative
burden of peer review. The 10-Year Science and Innovation
Investment Framework: Next Steps published with the 2006 Budget
http://www.hm-treasury.gov.uk/media/1E1/5E/bud06_science_332.pdf
moved these plans forward by proposing a consultation on moving to
a metrics-based research assessment system after the 2008 RAE. A
working Group chaired by Sir Alan Wilson (then DfES Director General
of Higher Education) and Professor David Eastwood produced proposals
which were issued for consultation on 13 June 2006. The Government
announcement today is the outcome of that consultation."
"The RAE panels already make some use of research metrics in reaching
their judgements about research quality. Research metrics are
statistics that provide indicators of the success of a researcher
or department. Examples of metrics include the amount of income a
department attracts from funders for its research, the number of
postgraduate students, or the number of times a published piece
of research is cited by other researchers. Metrics that relate to
publications are usually known as bibliometrics.
"The outcome announced today is a new process that uses for all
subjects a set of indicators based on research income, postgraduate
numbers, and a quality indicator. For subjects in science,
engineering, technology and medicine (SET) the quality indicator will
be a bibliometric statistic relating to research publications or
citations. For other subjects, the quality indicator will continue
to involve a lighter touch expert review of research outputs, with
a substantial reduction in the administrative burden. Experts will
also be involved in advising on the weighting of the indicators for
all subjects."
------------
Some Prior References:
Harnad, S. (2001) Why I think that research access, impact and
assessment are linked. Times Higher Education Supplement 1487:
p. 16.
http://cogprints.org/1683/
Hitchcock, S., Brody, T., Gutteridge, C., Carr, L., Hall, W.,
Harnad, S., Bergmark, D. and Lagoze, C. (2002) Open Citation
Linking: The Way Forward. D-Lib Magazine 8(10).
http://eprints.ecs.soton.ac.uk/7717/
Harnad, S. (2003) Why I believe that all UK research output should
be online. Times Higher Education Supplement. Friday, June 6 2003.
http://eprints.ecs.soton.ac.uk/7728/
Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated
online RAE CVs Linked to University Eprint Archives: Improving
the UK Research Assessment Exercise whilst making it cheaper
and easier. Ariadne 35.
"Metrics" are Plural, Not Singular: Valid Objections From UUK About RAE"
http://openaccess.eprints.org/index.php?/archives/137-guid.html
Pertinent Prior American Scientist Open Access Forum Topic Threads:
UK "RAE" Evaluations (began Nov 2000)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1018
Digitometrics (May 2001)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/1300.html
Scientometric OAI Search Engines (began Aug 2002)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2238
UK Research Assessment Exercise (RAE) review (Oct 2002)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2326
Australia stirs on metrics (June 2006)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5417.html
Big Brother and Digitometrics (began May 2001)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1298
Need for systematic scientometric analyses of open-access
data (began Dec 2002)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2522
Potential Metric Abuses (and their Potential Metric Antidotes)
(began Jan 2003)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2643
Future UK RAEs to be Metrics-Based (began Mar 2006)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#5251
Let 1000 RAE Metric Flowers Bloom: Avoid Matthew Effect as
Self-Fulfilling Prophecy (Jun 2006)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5418.html
Australia's RQF (Nov 2006)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/5806.html