Sharing the results of publicly funded research

Research Impact Measurement – Timeline

June 10, 2015 in PASTEUR4OA

I’ve been working on a series of timelines for the PASTEUR4OA Project – these will form part of a collection of advocacy papers. So far we’ve had one on Article Processing Charges (APCs) and one on Open Access to Research Data. I now have a final timeline to share with you, on Research Impact Measurement and Peer Review – covering bibliometrics, altmetrics, research evaluation and other related areas.

Image from Pixabay, CC0

This timeline used What is Open Peer Review as its foundation. Once again, any suggestions would be much appreciated.

1665

20th century – Peer review becomes common in decisions on science funding allocations.

1948

  • Launch of Project RAND, an organization formed immediately after World War II to connect military planning with research and development decisions. The project evolved into the RAND Corporation, a nonprofit institution that helps improve policy and decision making through research and analysis. [http://www.rand.org/]

1955

1961

1969

1976

1986

  • The first exercise assessing research in Higher Education in the UK was conducted by the University Grants Committee, a predecessor of the present Higher Education Funding Councils. Further exercises were carried out in 1992, 1996, 2001 and 2008.

1989

1996

  • Michael Power publishes The Audit Explosion – a paper critical of the spread of auditing and measurement. [http://www.demos.co.uk/files/theauditexplosion.pdf]
  • PageRank was developed at Stanford University by Larry Page and Sergey Brin
  • CiteSeer goes public – the first system for automated citation extraction and indexing

1998

  • PageRank is introduced in the Google search engine

1999

2000

2001

  • Atmospheric Chemistry and Physics introduces a system in which manuscripts are placed online as “discussion papers” and archived with all comments and reviews, before the approved, peer-reviewed article appears in the journal.

2002

2004

  • The official U.S. launch of Scopus was held at the New York Academy of Sciences. [http://www.scopus.com/]
  • BMJ published the number of views for its articles, which was found to be somewhat correlated with citations
  • Google Scholar index launched

2005

2006

2007

2008

  • The European Commission’s DG Research sets up the Expert Group on Assessment of University-Based Research to identify a framework for a new and more coherent methodology to assess the research produced by European universities.
  • The last Research Assessment Exercise (RAE) is run in the UK
  • The MRC launches a new online approach to gather feedback from researchers about the outputs of their work. First called the “Outputs Data Gathering Tool”, it is revised and renamed “MRC eVal” in 2009 and then re-developed as “Researchfish” in 2012. [http://www.mrc.ac.uk/research/achievements/evaluation-programme/?nav=sidebar]
  • The MRC, Wellcome Trust and Academy of Medical Sciences publish the first “Medical Research: What’s it worth?” analysis – the result of two years of discussion under the auspices of the UK Evaluation Forum, and ground-breaking analysis by the Health Economics Research Group at Brunel, RAND Europe and the Office of Health Economics. The findings provide a new UK estimate of the return on investment from medical research. [http://www.mrc.ac.uk/news-events/publications/medical-research-whats-it-worth/]

2009

  • Public Library of Science introduced article-level metrics for all articles.
  • UK research councils introduce “pathways to impact” as a major new section in all RCUK applications for funding. Applicants are asked to set out measures taken to maximise impact. [http://www.rcuk.ac.uk/innovation/impacts/]

2010

2011

2012

  • Google Scholar adds the possibility for individual scholars to create personal “Scholar Citations profiles”
  • Several journals launch with an open peer review model:
    • GigaScience – publishes pre-publication history with articles and names reviewers (opt-out system)
    • PeerJ – Peer review reports published with author approval, reviewer names published with reviewer permission. (Info)
    • eLife Decision letter published with author approval. Reviewers anonymous.
    • F1000Research – All peer review reports and reviewer names are public, and appear after article is published online.
  • The MRC launches a new funding initiative for studies aimed at better understanding the link between research and impact. Over the next two years, 7 awards are made totalling £1M. [http://www.mrc.ac.uk/funding/how-we-fund-research/highlight-notices/economic-impact-highlight-notice/]
  • A subset of higher education institutions in Australia runs a small-scale pilot exercise to assess impact and understand the potential challenges of the process: the Excellence in Innovation for Australia (EIA) impact assessment trial. [https://go8.edu.au/programs-and-fellowships/excellence-innovation-australia-eia-trial]
  • ORCID launches its registry and begins minting identifiers

2013

2014

2015

  • 100 research funding organisations are using Researchfish in the UK, tracking the progress and productivity of more than £4.5 billion of funding for new grants each year.

2 responses to “Research Impact Measurement – Timeline”

  1. Ian Viney says:

    2006 Peter Warry’s report challenges the UK research councils to go further in demonstrating their economic impact http://www.rcuk.ac.uk/Publications/archive/TheWarryReport/

    2008 The MRC launch a new online approach to gather feedback from researchers about the output from their work, first called the “Outputs Data Gathering Tool”, it is revised and renamed “MRC eVal” in 2009 and then re-developed as “Researchfish” in 2012. http://www.mrc.ac.uk/research/achievements/evaluation-programme/?nav=sidebar

    2008 The MRC, Wellcome Trust and Academy of Medical Sciences publish the first “Medical Research: What’s it worth?” analysis – the result of two years of discussion under the auspices of the UK Evaluation Forum, and ground-breaking analysis by the Health Economics Research Group at Brunel, RAND Europe and the Office of Health Economics. The findings provide a new UK estimate of the return on investment from medical research http://www.mrc.ac.uk/news-events/publications/medical-research-whats-it-worth/

    2009 UK research councils introduce “pathways to impact” as a major new section in all RCUK applications for funding. Applicants are asked to set out measures taken to maximise impact. http://www.rcuk.ac.uk/innovation/impacts/

    2012 MRC launches a new funding initiative for studies aimed at better understanding the link between research and impact. Over the next two years 7 awards are made totalling £1M. http://www.mrc.ac.uk/funding/how-we-fund-research/highlight-notices/economic-impact-highlight-notice/

    2012 ORCID launches its registry and begins minting identifiers

    2014 RCUK extends the Researchfish approach to all disciplines and implements the process across all research council funding. 18,000 principal investigators complete the process, providing 800,000 reports of outputs linked to over £16 billion of RCUK funded awards. http://www.rcuk.ac.uk/research/researchoutcomes/

    2015 100 research funding organisations are using Researchfish in the UK, tracking the progress and productivity of more than £4.5 billion of funding for new grants each year.

  2. Iara Vidal says:

    I have two suggestions:
    2012: A group of editors and publishers of scholarly journals met in December 2012 during the ASCB Annual Meeting in San Francisco and launched the San Francisco Declaration on Research Assessment (DORA), with recommendations to improve the ways in which the outputs of scientific research are evaluated (http://www.ascb.org/dora/).
    2015: After discussions during the 2014 International conference on Science and Technology Indicators, Diana Hicks, Paul Wouters, Ludo Waltman, Sarah de Rijcke and Ismael Rafols publish The Leiden Manifesto for Research Metrics, with ten principles to guide research evaluation (http://www.leidenmanifesto.org/).