Research Impact Measurement – Timeline
I’ve been working on a series of timelines for the PASTEUR4OA Project – these will form part of a collection of advocacy papers. So far we’ve had one on Article Processing Charges (APCs) and one on Open Access to Research Data. I now have a final timeline to share with you on Research Impact Measurement and Peer Review, covering bibliometrics, altmetrics, research evaluation and other related areas.
This timeline used “What is Open Peer Review” as its foundation. Once again, any suggestions would be much appreciated.
1665
- First recorded editorial pre-publication peer-review process, instituted at the Royal Society of London by the founding editor of Philosophical Transactions of the Royal Society. [http://en.wikipedia.org/wiki/Philosophical_Transactions_of_the_Royal_Society]
20th century – Peer review became common for science funding allocations.
1948
- Launch of Project RAND, an organization formed immediately after World War II to connect military planning with research and development decisions. The project evolved into the RAND Corporation, a nonprofit institution that helps improve policy and decision making through research and analysis. [http://www.rand.org/]
1955
- Dr. Garfield’s article on citation indexing appeared in Science – led to the Genetics Citation Index. [http://garfield.library.upenn.edu/essays/v7p515y1984.pdf]
1961
- Irving H. Sher and Eugene Garfield created the journal impact factor to help select journals for the Science Citation Index (SCI) [http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.80.6316&rep=rep1&type=pdf]
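For reference, the two-year journal impact factor that grew out of this work is conventionally computed as
$$\mathrm{IF}_y(J) = \frac{C_y(J)}{N_{y-1}(J) + N_{y-2}(J)},$$
where $C_y(J)$ is the number of citations received in year $y$ by items that journal $J$ published in years $y-1$ and $y-2$, and $N_{y-1}(J)$ and $N_{y-2}(J)$ are the counts of citable items $J$ published in those two years.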
1969
- Term bibliometrics was coined by Alan Pritchard in a paper published in 1969, titled Statistical Bibliography or Bibliometrics? [https://www.academia.edu/598618/Statistical_bibliography_or_bibliometrics]
1976
- A recursive impact factor that gives citations from journals with high impact greater weight than citations from low-impact journals was proposed. [http://www.sciencedirect.com/science/article/pii/0306457376900480]
1986
- The first exercise assessing research in Higher Education in the UK took place, conducted by the University Grants Committee, a predecessor of the present Higher Education Funding Councils. Further exercises were carried out in 1992, 1996, 2001 and 2008.
1989
- First peer review congress meeting. [http://www.peerreviewcongress.org/]
1996
- Michael Power writes The Audit Explosion – a paper critical of the spread of auditing and measurement. [http://www.demos.co.uk/files/theauditexplosion.pdf]
- PageRank was developed at Stanford University by Larry Page and Sergey Brin
- CiteSeer goes public – the first system for automated citation extraction and indexing
1998
- PageRank is introduced to the Google search engine
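For reference, PageRank scores each page by a damped sum of the scores of the pages that link to it – the same recursive weighting idea as the 1976 recursive impact factor above, and the basis of the 2006 proposal below to rank journals this way:
$$PR(p_i) = \frac{1-d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)},$$
where $N$ is the total number of pages, $M(p_i)$ is the set of pages linking to $p_i$, $L(p_j)$ is the number of outbound links from $p_j$, and $d$ is a damping factor, commonly set to 0.85.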
1999
- BMJ starts revealing reviewer names to authors. [http://jama.jamanetwork.com/article.aspx?articleid=194992]
2000
- BioMed Central launches, and soon afterwards starts including reviewer names and pre-publication history for published articles in all the medical journals in its BMC series. [http://jama.jamanetwork.com/article.aspx?articleid=194992]
2001
- Atmospheric Chemistry and Physics introduces a system where manuscripts are placed online as a “discussion paper” and archived with all comments and reviews, before the approved, peer-reviewed article appears in the journal.
2002
- Thomson Reuters Web of Knowledge launches. [http://wokinfo.com/]
2004
- The official U.S. launch of Scopus was held at the New York Academy of Sciences. [http://www.scopus.com/]
- BMJ published the number of views for its articles, which was found to be somewhat correlated to citations
- Google Scholar index launched
2005
- European Commission report Enhancing Europe’s Research Base (DG Research, Brussels), by the Forum on University‐based Research. [http://ec.europa.eu/research/conferences/2004/univ/pdf/enhancing_europeresearchbase_en.pdf]
2006
- Johan Bollen, Marko A. Rodriguez, and Herbert Van de Sompel propose replacing impact factors with the PageRank algorithm; their paper is based on 2003 data.
- Twitter launched
- The Commission Communication on the modernisation of universities asks “How to create a new and more coherent methodology to assess the research produced by European universities?” [http://bookshop.europa.eu/en/assessing-europe-s-university-based-research-pbKINA24187/]
- Launch of Biology Direct, which includes reviewer comments and names with published articles.
- Peter Warry’s report challenges the UK research councils to go further in demonstrating their economic impact. [http://www.rcuk.ac.uk/Publications/archive/TheWarryReport/]
- EC Communication report ‘Delivering on the modernisation agenda for universities: Education, research and innovation’. [http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2006:0208:FIN:en:PDF]
2007
- Council resolution on ‘Modernising Universities for Europe‘s Competitiveness in a Global Knowledge Economy’ [http://www.consilium.europa.eu/uedocs/cms_Data/docs/pressdata/en/intm/97237.pdf]
- EC expert group launched by the Scientific and Technical Research Committee (CREST) – Mutual Learning on Approaches to Improve the Excellence of Research in Universities
- Frontiers launches, and includes reviewer names with articles.
- Higher Education Funding Council for England (HEFCE) issued a circular letter announcing that a new framework for assessing research quality in UK universities would replace the Research Assessment Exercise (RAE), following the 2008 RAE
2008
- European Commission, DG Research set up the Expert Group on Assessment of University-Based Research to identify the framework for a new and more coherent methodology to assess the research produced by European universities.
- Last Research Assessment Exercise (RAE) run in the UK
- The MRC launches a new online approach to gather feedback from researchers about the output from their work. First called the “Outputs Data Gathering Tool”, it is revised and renamed “MRC eVal” in 2009 and then re-developed as “Researchfish” in 2012. [http://www.mrc.ac.uk/research/achievements/evaluation-programme/?nav=sidebar]
- The MRC, Wellcome Trust and Academy of Medical Sciences publish the first “Medical Research: What’s it worth?” analysis, the result of two years of discussion under the auspices of the UK Evaluation Forum and ground-breaking analysis by the Health Economics Research Group at Brunel, RAND Europe and the Office of Health Economics. The findings provide a new UK estimate of the return on investment from medical research. [http://www.mrc.ac.uk/news-events/publications/medical-research-whats-it-worth/]
2009
- Public Library of Science introduced article-level metrics for all articles.
- UK research councils introduce “pathways to impact” as a major new section in all RCUK applications for funding. Applicants are asked to set out measures taken to maximise impact. [http://www.rcuk.ac.uk/innovation/impacts/]
2010
- Multirank launched – a new multi-dimensional, user-driven approach to international ranking of higher education institutions. [http://www.umultirank.org/]
- EMBO journal starts publishing review process file with articles. Editors are named, but referees remain anonymous. [http://emboj.embopress.org/about#Review_Process]
- Altmetrics manifesto is released. [http://altmetrics.org/manifesto/]
- EU report Assessing Europe’s University-Based Research, by the Expert Group on Assessment of University-Based Research, released. [http://ec.europa.eu/research/science-society/document_library/pdf_06/assessing-europe-university-based-research_en.pdf]
2011
- BMJ Open launches, and includes all reviewer names and review reports with published articles. [http://bmjopen.bmj.com/site/about/]
2012
- Google Scholar adds the possibility for individual scholars to create personal “Scholar Citations” profiles
- Several journals launch with an open peer review model:
- GigaScience – publishes pre-publication history with articles and names reviewers (opt-out system)
- PeerJ – Peer review reports published with author approval, reviewer names published with reviewer permission.
- eLife – Decision letter published with author approval. Reviewers anonymous.
- F1000Research – All peer review reports and reviewer names are public, and appear after the article is published online.
- MRC launches a new funding initiative for studies aimed at better understanding the link between research and impact; over the next two years, seven awards are made totalling £1M. [http://www.mrc.ac.uk/funding/how-we-fund-research/highlight-notices/economic-impact-highlight-notice/]
- A subset of higher education institutions in Australia ran a small-scale pilot exercise to assess impact and understand the potential challenges of the process: the Excellence in Innovation for Australia (EIA) impact assessment trial. [https://go8.edu.au/programs-and-fellowships/excellence-innovation-australia-eia-trial]
- ORCID launches its registry and begins minting identifiers
2013
- EU Innovation Output Indicator launched. [http://ec.europa.eu/research/innovation-union/index_en.cfm?pg=output]
- RAND ImpactFinder tool released. [http://www.rand.org/randeurope/research/projects/impactfinder.html]
- Article “Deep impact: unintended consequences of journal rank” by Björn Brembs, Katherine Button and Marcus Munafò. It reviews data suggesting that we have journal rank upside down: high-IF journals publish the worst science. [http://journal.frontiersin.org/article/10.3389/fnhum.2013.00291/full]
2014
- First Research Excellence Framework held in the UK
- RCUK extends the Researchfish approach to all disciplines and implements the process across all research council funding. 18,000 principal investigators complete the process, providing 800,000 reports of outputs linked to over £16 billion of RCUK-funded awards. [http://www.rcuk.ac.uk/research/researchoutcomes/]
- European standard for social impact measurement announced. [http://ec.europa.eu/internal_market/social_business/docs/expert-group/social_impact/140605-sub-group-report_en.pdf]
2015
- 100 research funding organisations are using Researchfish in the UK, tracking the progress and productivity of more than £4.5 billion of funding for new grants each year.
I have two suggestions:
2012: A group of editors and publishers of scholarly journals met in December 2012 during the ASCB Annual Meeting in San Francisco and launched the San Francisco Declaration on Research Assessment (DORA), with recommendations to improve the ways in which the outputs of scientific research are evaluated (http://www.ascb.org/dora/).
2015: After discussions during the 2014 International conference on Science and Technology Indicators, Diana Hicks, Paul Wouters, Ludo Waltman, Sarah de Rijcke and Ismael Rafols publish The Leiden Manifesto for Research Metrics, with ten principles to guide research evaluation (http://www.leidenmanifesto.org/).