Leiden Manifesto

The Leiden Manifesto for research metrics (LM) is a list of "ten principles to guide research evaluation",[1] published as a comment in Volume 520, Issue 7548 of Nature on 22 April 2015. It was formulated by public policy professor Diana Hicks, scientometrics professor Paul Wouters, and their colleagues at the 19th International Conference on Science and Technology Indicators, held on 3–5 September 2014 in Leiden, the Netherlands.[2]

The LM was proposed as a guide to combat the misuse of bibliometrics in evaluating scientific research. Commonly used bibliometrics for science, or scientometrics, include the h-index, the journal impact factor, and indicators displayed on websites such as altmetrics. According to Hicks et al., these metrics pervasively misguide evaluations of scientific material.[3]
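To make one of these metrics concrete: a researcher's h-index is the largest number h such that h of their papers have each received at least h citations. The following is a minimal sketch, with invented citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # paper at this rank still clears the bar
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have >= 4 citations each
```

A single number like this, of course, is exactly the kind of indicator the manifesto warns against using in isolation.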

The Leiden Manifesto for research metrics
Illustration: David Parkins
Authors: Diana Hicks, Paul Wouters, Ludo Waltman, Sarah de Rijcke, Ismael Rafols
Publisher: Nature
Publication date: 22 April 2015
Website: leidenmanifesto.org

Motivation

Motivation for codifying the Leiden Manifesto arose from a growing worry that "impact-factor obsession"[1] was leading to inadequate judgement of scientific work deserving fair evaluation. The lead author of the LM, Diana Hicks, hoped that publishing in Nature would carry ideas already commonplace in the scientometrics sphere to the broader scientific community.[4] Although the principles of the LM are not new to scientometricians, Hicks et al. sought to unify them into a single guide for future editors and reviewers of scientific literature.

DORA and other predecessors

How research assessment should be conducted is an ongoing debate in the scientific community. A well-known declaration intended to curb misuse of the impact factor, the San Francisco Declaration on Research Assessment (DORA), was published about two years before the manifesto. The LM broadened the ideas presented in DORA, which has since been signed by over 2,000 organizations and 15,000 individuals.[5]

One of the main concerns about the overuse of citation-based performance indicators is that smaller research organizations and institutions may be unfairly disadvantaged by such indices. In one public debate at the Centre for Science and Technology Studies at Leiden University, it was acknowledged that indicators which measure citations may give "more weight to publications from fields with a high expected number of citations than to publications from fields with a low expected number of citations".[6]

Although the LM focuses mainly on the use of scientometrics for research evaluation, its background also explains why overuse of metrics can adversely affect the wider scholarly community, for example through the position of universities in global rankings.[7][8] According to Hicks et al., scientific metrics such as citation rate are relied on far too heavily for ranking the quality of universities (and thus the quality of their research output).

Journal impact factor

The background of the LM describes why the misuse of metrics is becoming a larger problem in the scientific community. The journal impact factor, originally created by Eugene Garfield as a tool to help librarians decide which journals to purchase, is now mainly used to judge journal quality.[9] Hicks et al. see this as an abuse of the data, encouraging hasty judgements of research. For example, while the impact factor may reflect the size and standing of a journal, it is insufficient to accurately describe the quality of its papers, and even less so of a single paper.
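For reference, the widely used two-year journal impact factor for year y is computed as follows (the notation here is illustrative, not Clarivate's own):

\[ \mathrm{JIF}_y = \frac{C_y}{N_{y-1} + N_{y-2}} \]

where \(C_y\) is the number of citations received in year y by items the journal published in years y−1 and y−2, and \(N_{y-1}\), \(N_{y-2}\) are the counts of citable items published in those years. A journal with 300 such citations to 120 citable items would be reported as having an impact factor of 2.500; nothing in those two counts justifies three decimal places of precision, a point the manifesto's eighth principle returns to.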

Content

Consisting of ten concise principles, each with a short elaboration, the Leiden Manifesto aims to reform the way research evaluations are done by academic publishers and scientific institutions. Its emphasis lies on detailed, close evaluation of research rather than excessive reliance on quantitative data. It aims to promote academic excellence and fairness through thorough scrutiny, and to remove the perverse incentives that scientometrics can create, such as crude judgements of academic capability and university quality.[10]

Ten principles

The ten principles of the Leiden Manifesto are as follows:[1]

  1. Quantitative evaluation should support qualitative, expert assessment.
  2. Measure performance against the research missions of the institution, group, or researcher.
  3. Protect excellence in locally relevant research.
    • That is, allow research that addresses a particular region or field to be published in the corresponding local research publications, instead of prioritizing high-impact journals. Most high-impact journals are in English, which can strip away specificity needed in papers that study local characteristics. For example, in Spanish-language sociology, "topics such as local labor laws" and other locally relevant subjects may be lost if only high-impact journals count.[11]
  4. Keep data collection and analytical processes open, transparent, and simple.
  5. Allow those evaluated to verify data and analysis.
  6. Account for variation by field in publication and citation practices.
    • Peer-review and citation practices can vary widely across disciplines; for example, "top-ranked journals in mathematics have impact factors of around 3; top-ranked journals in cell biology have impact factors of about 30".[12] Field-normalized indicators address this, as in the sketch after this list.
  7. Base assessment of individual researchers on a qualitative judgement of their portfolio.
  8. Avoid misplaced concreteness and false precision.
    • According to Hicks et al., scientific indicators invite strong assumptions that are not necessarily correct. For example, a low citation rate for a particular scientist may lead an evaluator to assume low research quality, inferring causation from correlation. Providing clarification, as well as multiple robust indicators, can reduce such misplaced concreteness. False precision arises when indicator producers, such as Clarivate (which publishes the annual Journal Citation Reports), report journal impact factors to an exact figure (i.e. three decimal places). Hicks et al. argue that the conceptual ambiguity and random variability of citation counts make it unnecessary to distinguish indices such as journal impact factors to such a precise extent, and that doing so fosters excessive comparison and competition between publishers.[1]
  9. Recognize the systemic effects of assessment and indicators.
  10. Scrutinize indicators regularly and update them.
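To make principle 6 concrete, scientometricians often divide a paper's citation count by the average for its field and year, as in the mean normalized citation score (MNCS) used in the Leiden Ranking. The sketch below illustrates the idea; the field baselines and paper data are invented for illustration:

```python
from statistics import mean

# Assumed world-average citation rates per (field, year); invented numbers.
field_baselines = {
    ("mathematics", 2014): 2.1,
    ("cell biology", 2014): 18.5,
}

# Hypothetical papers: (field, year, citations received).
papers = [
    ("mathematics", 2014, 4),
    ("cell biology", 2014, 20),
]

def normalized_scores(papers, baselines):
    """Divide each paper's citations by its field/year baseline,
    so fields with low expected citation counts are not penalized."""
    return [c / baselines[(field, year)] for field, year, c in papers]

scores = normalized_scores(papers, field_baselines)
print([round(s, 2) for s in scores])  # [1.9, 1.08] -> both above world average
print(round(mean(scores), 2))         # 1.49: the set's mean normalized score
```

On this view the mathematics paper, despite far fewer raw citations, outperforms the cell-biology paper relative to its own field.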

Reception

2016 John Ziman Award

In 2016, the European Association for the Study of Science and Technology (EASST) presented the Leiden Manifesto with the John Ziman Award for its effort to bring scientometric knowledge to the scientific community as a whole. EASST president Fred Steward stated that the LM "emphasizes situatedness, in terms of different cognitive domains and research missions as well as the wider socioeconomic, national and regional context".[13] The award helped cement the LM's place in the scholarly community's ongoing debate over research assessment, and was welcomed by lead author Diana Hicks.[4]

Leiden Manifesto, The Metric Tide, and DORA

The Leiden Manifesto gained popularity following its publication, mainly within the scholarly publishing community, which sought to reform its practices. The LM is often cited alongside similar publications, namely DORA and the UK-based review The Metric Tide.[4][14] In a public statement, the University of Leeds committed to using the principles of the LM, DORA, and The Metric Tide in order to further research excellence.[15]

Public endorsement

LIBER, an association of European research libraries, published a substantial review of the LM in 2017. It concluded that the LM was a "solid foundation" on which academic libraries could base their assessment of metrics.[16]

Elsevier, a global leader in research publishing and information analytics, announced on 14 July 2020 that it would endorse the LM to guide its development of improved research evaluation. Elsevier stated that the principles of the manifesto were already close in spirit to its 2019 CiteScore methodology, summarized as an "improved calculation methodology" giving "a more robust, fair and faster indicator of research impact".[17] The endorsement further popularized the LM and illustrated a shift in evaluation practices among prominent research publishers.

The Loughborough University LIS-Bibliometrics committee chose to base its principles on those of the LM rather than DORA because, according to its policy manager Elizabeth Gadd, the LM takes a "broader approach to the responsible use of all bibliometrics across a range of disciplines and settings".[18] Stephen Curry, the chair of the DORA steering committee, responded by emphasizing that DORA was aiming to extend its "disciplinary and geographical reach".[19] He made clear, however, that a university should be free to follow DORA, the LM, or neither, as long as it provides reasonable justification.

Further applications

In a perspective for Issues in Science and Technology, David Moher et al. referenced the LM while arguing that academic institutions were not asking the "right questions" (i.e., about research planning, timeframe, reproducibility, and results) when assessing scientists. Moher et al. criticize the obsession with journal impact factors and the "gaming" of scientometrics by investigators,[20] and instead advocate using DORA and the LM when assessing individual scientists and their research.

In a letter to Science and Engineering Ethics, T. Kanchan and K. Krishan describe why the LM is "one of the best criteria" for assessing scientific research, especially given the "rat race" for publications in the scholarly community. Kanchan and Krishan emphasize that use of the LM will lead to "progress of science and society at large".[21]

References

  1. 1.0 1.1 1.2 1.3 Hicks, Diana; Wouters, Paul; Waltman, Ludo; de Rijcke, Sarah; Rafols, Ismael (2015-04-22). "Bibliometrics: The Leiden Manifesto for research metrics". Nature 520 (7548): 429–431. doi:10.1038/520429a. ISSN 0028-0836. PMID 25903611. Bibcode: 2015Natur.520..429H.
  2. "STI 2014 Leiden - 19th International Conference on Science and Technology Indicators" (in en). http://sti2014.cwts.nl/Home/. 
  3. "Leiden manifesto for research Metrics" (in en). http://www.leidenmanifesto.org/. 
  4. 4.0 4.1 4.2 "Best Practices in Research Metrics: A Conversation with the lead author of the Leiden Manifesto - YouTube". https://www.youtube.com/watch?v=UY7vYmAd9d0&feature=youtu.be. 
  5. "DORA – San Francisco Declaration on Research Assessment (DORA)". https://sfdora.org/. 
  6. van Raan, Anthony F.J.; van Leeuwen, Thed N.; Visser, Martijn S.; van Eck, Nees Jan; Waltman, Ludo (July 2010). "Rivals for the crown: Reply to Opthof and Leydesdorff" (in en). Journal of Informetrics 4 (3): 431–435. doi:10.1016/j.joi.2010.03.008. https://linkinghub.elsevier.com/retrieve/pii/S1751157710000301. 
  7. "World University Rankings" (in en). 2018-09-19. https://www.timeshighereducation.com/world-university-rankings. 
  8. "ARWU World University Rankings 2020 | Academic Ranking of World Universities". http://www.shanghairanking.com/. 
  9. Larivière, Vincent; Sugimoto, Cassidy R. (2019), Glänzel, Wolfgang; Moed, Henk F.; Schmoch, Ulrich et al., eds., "The Journal Impact Factor: A Brief History, Critique, and Discussion of Adverse Effects" (in en), Springer Handbook of Science and Technology Indicators, Springer Handbooks (Cham: Springer International Publishing): pp. 3–24, doi:10.1007/978-3-030-02511-3_1, ISBN 978-3-030-02511-3, https://doi.org/10.1007/978-3-030-02511-3_1, retrieved 2020-12-04 
  10. "The Leiden Manifesto for Research Metrics" (video, in en), https://vimeo.com/133683418, retrieved 2020-12-05
  11. López Piñeiro, Carla; Hicks, Diana (2015-01-01). "Reception of Spanish sociology by domestic and foreign audiences differs and has consequences for evaluation" (in en). Research Evaluation 24 (1): 78–89. doi:10.1093/reseval/rvu030. ISSN 0958-2029. https://academic.oup.com/rev/article/24/1/78/1544815. 
  12. Waltman, Ludo; Calero‐Medina, Clara; Kosten, Joost; Noyons, Ed C. M.; Tijssen, Robert J. W.; Eck, Nees Jan van; Leeuwen, Thed N. van; Raan, Anthony F. J. van et al. (2012). "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation" (in en). Journal of the American Society for Information Science and Technology 63 (12): 2419–2432. doi:10.1002/asi.22708. ISSN 1532-2890. https://onlinelibrary.wiley.com/doi/abs/10.1002/asi.22708. 
  13. "2016 John Ziman Award for Leiden Manifesto" (in en). https://www.cwts.nl:443/news?article=n-q2x294. 
  14. "Metric Tide - Research England". https://re.ukri.org/sector-guidance/publications/metric-tide/. 
  15. "Responsible Metrics in the Assessment of Research" (in en-US). http://ris.leeds.ac.uk/research-excellence/responsible-metrics-in-the-assessment-of-research/. 
  16. Coombs, Sarah K.; Peters, Isabella (2017-11-13). "The Leiden Manifesto under review: what libraries can learn from it". Digital Library Perspectives 33 (4): 324–338. doi:10.1108/dlp-01-2017-0004. ISSN 2059-5816. http://dx.doi.org/10.1108/dlp-01-2017-0004. 
  17. "Elsevier endorses Leiden Manifesto to guide its development of improved research evaluation". https://www.elsevier.com/about/press-releases/corporate/elsevier-endorses-leiden-manifesto-to-guide-its-development-of-improved-research-evaluation. 
  18. Gadd, Lizzie (2018-07-09). "DORA, the Leiden Manifesto & a university's right to choose" (in en). https://thebibliomagician.wordpress.com/2018/07/09/dora-the-leiden-manifesto-a-universitys-right-to-choose/. 
  19. Curry, Stephen. "DORA, the Leiden Manifesto & a university's right to choose: a comment" (in en-US). http://occamstypewriter.org/scurry/2018/07/18/dora-leiden-manifesto-right-to-choose/.
  20. Moher, David; Naudet, Florian; Cristea, Ioana A.; Miedema, Frank; Ioannidis, John P. A.; Goodman, Steven N. (2018-03-29). "Assessing scientists for hiring, promotion, and tenure". PLOS Biology 16 (3): e2004089. doi:10.1371/journal.pbio.2004089. ISSN 1545-7885. PMID 29596415. 
  21. Kanchan, Tanuj; Krishan, Kewal (April 2019). "The Leiden Manifesto and Research Assessment" (in en). Science and Engineering Ethics 25 (2): 643–644. doi:10.1007/s11948-017-0012-2. ISSN 1353-3452. PMID 29264830. http://link.springer.com/10.1007/s11948-017-0012-2.