One of the most basic citation metrics is how often an article is cited in other articles, books, or other sources (such as theses). Citation rates depend heavily on the discipline and the number of people working in that area. For instance, many more scientists work in neuroscience than in mathematics, and neuroscientists publish more papers than mathematicians; hence neuroscience papers are cited much more often than papers in mathematics. Similarly, review papers are cited more often than regular research papers because they summarize results from many papers. This may also explain why papers with shorter titles receive more citations, as such papers usually cover a broader area.
The simplest journal-level metric is the journal impact factor (JIF), the average number of citations that articles published by a journal in the previous two years have received in the current year, as calculated by Clarivate. Other companies report similar metrics, such as the CiteScore (CS), based on Scopus.
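The JIF definition above amounts to a simple ratio; the following sketch illustrates the arithmetic with hypothetical counts (the function name and the numbers are illustrative, not real data for any journal):

```python
# Illustrative sketch of the journal impact factor (JIF) calculation.
# The counts below are hypothetical, not data for any real journal.

def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """JIF for year Y = citations received in Y to items published in
    Y-1 and Y-2, divided by the number of citable items from Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# e.g. 600 citations in 2023 to articles from 2021-2022, 150 citable items
print(impact_factor(600, 150))  # 4.0
```

CiteScore is computed analogously, but over a different window and document set, so the same sketch applies with different inputs.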
However, a very high JIF or CS is often driven by a small number of very highly cited papers. For instance, most papers in Nature (impact factor 38.1, 2016) were cited only 10 or 20 times during the reference year (see figure). Journals with a lower impact factor (e.g. PLOS ONE, impact factor 3.1) publish many papers that are cited 0 to 5 times but few highly cited articles.
Journal-level metrics are often misinterpreted as measures of journal quality or article quality. They are not article-level metrics, so using them to determine the impact of a single article is statistically invalid. The citation distribution within a journal is skewed: a very small number of articles drives the vast majority of citations. For this reason, some journals have stopped publicizing their impact factor, e.g. the journals of the American Society for Microbiology.
Total citations, or average citation count per article, can be reported for an individual author or researcher. Many other measures have been proposed, beyond simple citation counts, to better quantify an individual scholar's citation impact. The best-known measures include the h-index and the g-index. Each measure has advantages and disadvantages, spanning from bias to discipline-dependence and limitations of the citation data source. Counting the number of citations per paper is also employed to identify the authors of citation classics.
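The h-index and g-index mentioned above can both be computed directly from a researcher's list of per-paper citation counts. A minimal sketch with hypothetical data, following the standard definitions of the two indices:

```python
# h-index and g-index from a list of per-paper citation counts.
# The citation counts below are hypothetical.

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    # The condition c >= rank is monotone once counts are sorted, so
    # counting the ranks where it holds yields the largest such h.
    return sum(1 for rank, c in enumerate(cites, start=1) if c >= rank)

def g_index(citations):
    """Largest g such that the top g papers have at least g**2 citations in total."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

papers = [10, 8, 5, 4, 3]
print(h_index(papers))  # 4 (four papers with at least 4 citations each)
print(g_index(papers))  # 5 (top 5 papers have 30 >= 25 citations in total)
```

The example shows a typical property: the g-index is never smaller than the h-index, because it gives extra weight to a few very highly cited papers.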
Citations are distributed highly unequally among researchers. In a study based on the Web of Science database across 118 scientific disciplines, the top 1% most-cited authors accounted for 21% of all citations. Between 2000 and 2015, the proportion of citations that went to this elite group grew from 14% to 21%. The highest concentrations of ‘citation elite’ researchers were in the Netherlands, the United Kingdom, Switzerland and Belgium. Note that 70% of the authors in the Web of Science database have fewer than 5 publications, so that the most-cited authors among the 4 million included in this study constitute a tiny fraction.
As early as 2004, the BMJ published the number of views for its articles, which was found to be somewhat correlated with citations. In 2008 the Journal of Medical Internet Research began publishing views and tweets. These "tweetations" proved to be a good indicator of highly cited articles, leading the author to propose a "Twimpact factor", the number of tweets an article receives in the first seven days after publication, as well as a "Twindex", the rank percentile of an article's Twimpact factor.
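The two proposed measures are straightforward to compute from tweet timestamps. A sketch under the definitions given above, with hypothetical timestamps and function names of our own choosing:

```python
# Sketch of the proposed "Twimpact factor" and "Twindex".
# Timestamps and data below are hypothetical.
from datetime import datetime, timedelta

def twimpact_factor(published, tweet_times):
    """Number of tweets an article receives within 7 days of publication."""
    window_end = published + timedelta(days=7)
    return sum(1 for t in tweet_times if published <= t <= window_end)

def twindex(article_twimpact, all_twimpacts):
    """Rank percentile of an article's Twimpact factor among a set of articles."""
    below = sum(1 for x in all_twimpacts if x < article_twimpact)
    return 100 * below / len(all_twimpacts)

pub = datetime(2024, 1, 1)
tweets = [pub + timedelta(days=d) for d in (0, 1, 3, 10)]
print(twimpact_factor(pub, tweets))  # 3 (the day-10 tweet falls outside the window)
print(twindex(3, [0, 1, 3, 5]))      # 50.0
```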
Open access (OA) publications are accessible without cost to readers, hence they would be expected to be cited more frequently. Some experimental and observational studies have found that articles published in OA journals do not receive more citations, on average, than those published in subscription journals; other studies have found that they do.
The evidence that author-self-archived ("green") OA articles are cited more than non-OA articles is somewhat stronger than the evidence that ("gold") OA journals are cited more than non-OA journals. Two reasons for this are that many of today's top-cited journals are only hybrid OA (authors have the option to pay for gold), and that many pure author-pays OA journals are either of low quality or outright fraudulent "predatory journals" that prey on authors' pressure to publish or perish, thereby lowering the average citation counts of OA journals.
An important recent development in research on citation impact is the discovery of universality, or citation impact patterns that hold across different disciplines in the sciences, social sciences, and humanities. For example, it has been shown that the number of citations received by a publication, once properly rescaled by its average across articles published in the same discipline and in the same year, follows a universal log-normal distribution that is the same in every discipline. This finding suggests a universal citation impact measure that extends the h-index by properly rescaling citation counts and re-ranking publications; however, computing such a universal measure requires collecting extensive citation data and statistics for every discipline and year. Social crowdsourcing tools such as Scholarometer have been proposed to address this need. Kaur et al. proposed a statistical method to evaluate the universality of citation impact metrics, i.e., their capability to compare impact fairly across fields. Their analysis identifies universal impact metrics, such as the field-normalized h-index.
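The rescaling step described above divides each paper's citation count by the average count for papers in the same field and year. A minimal sketch with hypothetical data (the function name and field labels are our own):

```python
# Field- and year-normalized citation counts: each paper's citations are
# divided by the mean for its (field, year) group. Data are hypothetical.
from statistics import mean
from collections import defaultdict

def rescale(papers):
    """papers: list of (field, year, citations) -> list of rescaled values."""
    groups = defaultdict(list)
    for field, year, c in papers:
        groups[(field, year)].append(c)
    baselines = {k: mean(v) for k, v in groups.items()}
    return [c / baselines[(field, year)] for field, year, c in papers]

data = [("math", 2010, 4), ("math", 2010, 8),
        ("neuro", 2010, 40), ("neuro", 2010, 80)]
# Both fields map to the same rescaled values (2/3 and 4/3), even though
# the raw counts differ by a factor of ten: this is what makes rescaled
# counts comparable across disciplines.
print(rescale(data))
```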
Research suggests that the impact of an article can be partly explained by superficial factors, not only by its scientific merits. Field-dependent factors must be addressed not only when comparisons are made across disciplines, but also when different fields of research within one discipline are compared. In medicine, for instance, the number of authors, the number of references, the article length, and the presence of a colon in the title all influence impact; in sociology, the number of references, the article length, and the title length are among the relevant factors. Scholars have also been found to engage in ethically questionable behavior in order to inflate the number of citations their articles receive.
Automated citation indexing has changed the nature of citation analysis research, allowing millions of citations to be analyzed for large-scale patterns and knowledge discovery. The first example of automated citation indexing was CiteSeer, later followed by Google Scholar. More recently, advanced models for a dynamic analysis of citation aging have been proposed. The latter model can even be used as a predictive tool for determining the citations that might be obtained at any point in the lifetime of a corpus of publications.
According to Mario Biagioli: "All metrics of scientific evaluation are bound to be abused. Goodhart's law [...] states that when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it."
Garfield, E. (1955). "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas". Science. 122 (3159): 108–111. Bibcode:1955Sci...122..108G. doi:10.1126/science.122.3159.108. PMID 14385826.
Garfield, E. (1973). "Citation Frequency as a Measure of Research Activity and Performance" (PDF). Essays of an Information Scientist. 1: 406–408.
Garfield, E. (1988). "Can Researchers Bank on Citation Analysis?" (PDF). Essays of an Information Scientist. 11: 354.
Garfield, E. (1998). "The use of journal impact factors and citation analysis in the evaluation of science". 41st Annual Meeting of the Council of Biology Editors.
Moed, Henk F. (2005). Citation Analysis in Research Evaluation. Springer. ISBN 978-1-4020-3713-9.
Leydesdorff, L.; Milojević, S. (2012). "Scientometrics". arXiv preprint arXiv:1208.4566.
Harnad, S. (2009). "Open access scientometrics and the UK Research Assessment Exercise". Scientometrics. 79 (1): 147–156.
Larsen, P. O.; von Ins, M. (2010). "The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index". Scientometrics. 84 (3): 575–603. doi:10.1007/s11192-010-0202-z. PMC 2909426. PMID 20700371.
Deng, B. (26 August 2015). "Papers with shorter titles get more citations". Nature News. doi:10.1038/nature.2015.18246. S2CID 186805536.
Casadevall, A.; Bertuzzi, S.; Buchmeier, M. J.; Davis, R. J.; Drake, H.; Fang, F. C.; Gilbert, J.; Goldman, B. M.; Imperiale, M. J. (2016). "ASM Journals Eliminate Impact Factor Information from Journal Websites". mSphere. 1 (4): e00184–16. doi:10.1128/mSphere.00184-16. PMC 4941020. PMID 27408939.
Belikov, A. V.; Belikov, V. V. (2015). "A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts". F1000Research. 4: 884. doi:10.12688/f1000research.7070.1. PMC 4654436.
Gálvez, R. H. (March 2017). "Assessing author self-citation as a mechanism of relevant knowledge diffusion". Scientometrics. 111 (3): 1801–1812. doi:10.1007/s11192-017-2330-1. S2CID 6863843.
Couto, F. M.; Pesquita, C.; Grego, T.; Veríssimo, P. (2009). "Handling self-citations using Google Scholar". Cybermetrics. 13 (1): 2. Archived from the original on 2010-06-24. Retrieved 2009-05-27.
Serenko, A.; Dumay, J. (2015). "Citation classics published in knowledge management journals. Part I: Articles and their characteristics" (PDF). Journal of Knowledge Management. 19 (2): 401–431. doi:10.1108/JKM-06-2014-0220.
Perneger, T. V. (2004). "Relation between online "hit counts" and subsequent citations: Prospective study of research papers in the BMJ". BMJ. 329 (7465): 546–7. doi:10.1136/bmj.329.7465.546. PMC 516105. PMID 15345629.
Hajjem, C.; Harnad, S.; Gingras, Y. (2005). "Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How It Increases Research Citation Impact" (PDF). IEEE Data Engineering Bulletin. 28 (4): 39–47. arXiv:cs/0606079. Bibcode:2006cs........6079H.
Lawrence, S. (2001). "Free online availability substantially increases a paper's impact". Nature. 411 (6837): 521. Bibcode:2001Natur.411..521L. doi:10.1038/35079151. PMID 11385534. S2CID 4422192.
MacCallum, C. J.; Parthasarathy, H. (2006). "Open Access Increases Citation Rate". PLOS Biology. 4 (5): e176. doi:10.1371/journal.pbio.0040176. PMC 1459260. PMID 16683866.
Gargouri, Y.; Hajjem, C.; Lariviere, V.; Gingras, Y.; Brody, T.; Carr, L.; Harnad, S. (2010). "Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research". PLOS ONE. 5 (10): e13636. arXiv:1001.0361. Bibcode:2010PLoSO...513636G. doi:10.1371/journal.pone.0013636. PMC 2956678. PMID 20976155.
Davis, P. M.; Lewenstein, B. V.; Simon, D. H.; Booth, J. G.; Connolly, M. J. L. (2008). "Open access publishing, article downloads, and citations: randomised controlled trial". BMJ. 337: a568. doi:10.1136/bmj.a568. PMC 2492576. PMID 18669565.
Davis, P. M. (2011). "Open access, readership, citations: a randomized controlled trial of scientific journal publishing". The FASEB Journal. 25 (7): 2129–2134. doi:10.1096/fj.11-183988. PMID 21450907. S2CID 205367842.
Chua, S. K.; Qureshi, Ahmad M.; Krishnan, Vijay; Pai, Dinker R.; Kamal, Laila B.; Gunasegaran, Sharmilla; Afzal, M. Z.; Ambawatta, Lahiru; Gan, J. Y. (2017-03-02). "The impact factor of an open access journal does not contribute to an article's citations". F1000Research. 6: 208. doi:10.12688/f1000research.10892.1. PMC 5464220. PMID 28649365.
Tang, M.; Bever, J. D.; Yu, F. H. (2017). "Open access increases citations of papers in ecology". Ecosphere. 8 (7): e01887.
Niyazov, Y.; Vogel, C.; Price, R.; Lund, B.; Judd, D.; Akil, A.; et al. (2016). "Open access meets discoverability: Citations to articles posted to Academia.edu". PLOS ONE. 11 (2): e0148257.
Young, J. S.; Brandes, P. M. (2020). "Green and gold open access citation and interdisciplinary advantage: A bibliometric study of two science journals". The Journal of Academic Librarianship. 46 (2): 102105.
Torres-Salinas, D.; Robinson-Garcia, N.; Moed, H. F. (2019). "Disentangling Gold Open Access". In Springer Handbook of Science and Technology Indicators (pp. 129–144). Springer, Cham.
Björk, B. C.; Kanto-Karvonen, S.; Harviainen, J. T. (2020). "How frequently are articles in predatory open access journals cited". Publications. 8 (2): 17.
Radicchi, F.; Fortunato, S.; Castellano, C. (2008). "Universality of citation distributions: Toward an objective measure of scientific impact". PNAS. 105 (45): 17268–17272. arXiv:0806.0974. Bibcode:2008PNAS..10517268R. doi:10.1073/pnas.0806977105. PMC2582263. PMID18978030.
Hoang, D.; Kaur, J.; Menczer, F. (2010). "Crowdsourcing Scholarly Data" (PDF). Proceedings of the WebSci10: Extending the Frontiers of Society On-Line. Archived from the original (PDF) on 2016-03-16. Retrieved 2017-02-20.
Kaur, J.; Hoang, D.; Sun, X.; Possamai, L.; JafariAsbagh, M.; Patil, S.; Menczer, F. (2012). "Scholarometer: A Social Framework for Analyzing Impact across Disciplines". PLOS ONE. 7 (9): e43235. Bibcode:2012PLoSO...743235K. doi:10.1371/journal.pone.0043235. PMC3440403. PMID22984414.
van Wesel, M. (2016). "Evaluation by Citation: Trends in Publication Behavior, Evaluation Criteria, and the Strive for High Impact Publications". Science and Engineering Ethics. 22 (1): 199–225. doi:10.1007/s11948-015-9638-0. PMC4750571. PMID25742806.
Giles, C. L.; Bollacker, K.; Lawrence, S. (1998). "CiteSeer: An Automatic Citation Indexing System". DL'98 Digital Libraries, 3rd ACM Conference on Digital Libraries. pp. 89–98. doi:10.1145/276675.276685.
Yu, G.; Li, Y.-J. (2010). "Identification of referencing and citation processes of scientific journals based on the citation distribution model". Scientometrics. 82 (2): 249–261. doi:10.1007/s11192-009-0085-z. S2CID 38693917.
Bouabid, H. (2011). "Revisiting citation aging: A model for citation distribution and life-cycle prediction". Scientometrics. 88 (1): 199–211. doi:10.1007/s11192-011-0370-5. S2CID 30345334.
Chanson, Hubert (2007). "Research Quality, Publications and Impact in Civil Engineering into the 21st Century. Publish or Perish, Commercial versus Open Access, Internet versus Libraries ?". Canadian Journal of Civil Engineering. 34 (8): 946–951. doi:10.1139/l07-027.
Panaretos, J.; Malesios, C. (2009). "Assessing Scientific Research Performance and Impact with Single Indices". Scientometrics. 81 (3): 635–670. arXiv:0812.4542. doi:10.1007/s11192-008-2174-9. S2CID 1957865.