University of Illinois Chicago


Measuring Your Impact: Impact Factor, Citation Analysis, and other Metrics: Citation Analysis


About Citation Analysis

What is Citation Analysis?

The process whereby the impact or "quality" of an article is assessed by counting the number of times other authors mention it in their work.

Citation analysis involves counting the number of times an article is cited by other works to measure the impact of a publication or author. The caveat, however, is that no single citation analysis tool collects all publications and their cited references. For a thorough analysis of the impact of an author or a publication, one needs to look in multiple databases to find all possible cited references. A number of resources are available at UIC that identify cited works, including Web of Science, Scopus, Google Scholar, and other databases with limited citation data.

Citation Analysis - Why use it?

To find out how much impact a particular article or author has had, by showing which other authors cited the work within their own papers. The h-index is one specific method that uses citation analysis to determine an individual's impact.

Web of Science

Web of Science provides citation counts for articles indexed within it. It indexes over 10,000 journals in the arts, humanities, sciences, and social sciences.

  • Enter the name of the author in the top search box (e.g. Smith JT).  
  • Select Author from the drop-down menu on the right.
  • To ensure accuracy for popular names, enter Univ Illinois in the middle search box, then select “Address” from the field drop down menu on the right.  (You might have to add the second search box by clicking "add another field" before you enter the address)
  • Click Search.
  • A list of publications by that author name will appear. To the right of each citation, the number of times the article has been cited will appear. Click the number next to "Times Cited" to view the articles that have cited your article.

Scopus

Scopus provides citation counts for articles indexed within it (limited to articles published in 1996 and later). It indexes over 15,000 journals from over 4,000 international publishers across the disciplines.

  • Once in Scopus, click on the Author search tab.
  • Enter the name of the author in the search box.  If you are using initials for the first and/or middle name, be sure to enter periods after the initials (e.g. Smith J.T.). 
  • To ensure accuracy if it is a popular name, you may enter University of Illinois in the affiliation field.  
  • If more than one profile appears, click on your profile (or the profile of the person you are examining). 
  • Once you click on the author's profile, a list of the publications will appear, and to the right of each citation, the number of times the article has been cited will appear.
  • Click the number to view the articles that have cited your article.

Google Scholar

Google Scholar provides citation counts for articles found within Google Scholar. Depending on the discipline and cited article, it may find more cited references than Web of Science or Scopus because, overall, Google Scholar indexes more journals and more publication types than other databases. Google Scholar is not specific about what is included in its tool, but information is available on how Google obtains its content. Limiting searches to only publications by a specific author name is complicated in Google Scholar. Using Google Scholar Citations and creating your own profile will make it easy for you to create a list of publications included in Google Scholar. Using your Google Scholar Citations account, you can see the citation counts for your publications and have Google Scholar calculate your h-index. (You can also search Google Scholar by author name and the title of an article to retrieve citation information for a specific article.)

  • Using your Google (Gmail) account, create a profile of all your articles captured in Google Scholar. Follow the prompts on the screen to set up your profile. Once complete, the profile will show all the times the articles have been cited by other documents in Google Scholar, and your h-index will be provided. It's your choice whether you make your profile public or private, but if you make it public, you can link to it from your own webpages.

Try Harzing's Publish or Perish tool to examine published works by a specific author more selectively.

Databases containing limited citation counts:

  • PubMed Central
  • Science Direct
  • SciFinder Scholar

About the H-index

The h-index is an index to quantify an individual's scientific research output (J.E. Hirsch). It attempts to measure both the scientific productivity and the apparent scientific impact of a scientist, and is based on the set of the researcher's most cited papers and the number of citations that they have received in other people's publications (Wikipedia). A scientist has index h if h of his or her Np papers have at least h citations each, and the other (Np − h) papers have at most h citations each.
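The definition above translates into a short calculation: sort a researcher's citation counts from highest to lowest and find the largest rank h at which the h-th paper still has at least h citations. A minimal Python sketch (the citation counts are made-up examples, not real data):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    # Sort citation counts from highest to lowest.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank   # this paper still clears the threshold
        else:
            break      # every later paper has fewer citations than its rank
    return h

# A researcher whose papers were cited 25, 8, 5, 3, and 3 times has h-index 3:
# three papers have at least 3 citations each, but there are not four papers
# with at least 4 citations each.
print(h_index([25, 8, 5, 3, 3]))  # -> 3
```

Note that the result depends entirely on which database supplied the citation counts, which is why the same author can have different h-indices in Web of Science, Scopus, and Google Scholar.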

Find your h-index at:

Below are instructions for obtaining your h-index from Web of Science, Scopus, and Google Scholar.

Web of Science provides citation counts for articles indexed within it. It indexes over 12,000 journals in the arts, humanities, sciences, and social sciences. To find an author's h-index in WOS:

  • Enter the name of the author in the search box (e.g. Smith JT) and select Author from the field drop-down menu.
  • To ensure accuracy for popular names, add an additional search box, enter "Univ Illinois", and select "Address" from the field drop-down menu on the right.
  • Click Citation Report in the right-hand corner of the results page. The h-index appears on the right of the screen.
  • If more than one profile appears, click on your profile (or the profile of the person you are examining). Under the Research section, you will see the h-index listed.
  • If you have worked at more than one place, your name may appear twice with two separate h-index ratings. Select the check box next to each relevant profile, and click Show Documents.

Google Scholar

  • Using your Google (Gmail) account, create a profile of all your articles captured in Google Scholar. Follow the prompts on the screen to set up your profile. Once complete, the profile will show all the times the articles have been cited by other documents in Google Scholar, and your h-index will be provided. It's your choice whether you make your profile public or private, but if you make it public, you can link to it from your own webpages.
  • See Albert Einstein's Google Scholar profile for an example.
  • Harzing’s Publish or Perish (POP) 
  • Publish or Perish searches Google Scholar. After searching by your name, deselect from the list of retrieved articles those that you did not author. Your h-index will appear at the top of the tool. Note: this tool must be downloaded to use.
  • Last Updated: Dec 12, 2023 3:51 PM
  • URL: https://researchguides.uic.edu/if

Clin Transl Sci. 2021 Sep; 14(5)

Practical publication metrics for academics

Bethany A. Myers

1 Louise M. Darling Biomedical Library, University of California, Los Angeles, California, USA

Katherine L. Kahn

2 Division of General Internal Medicine and Health Services Research, David Geffen School of Medicine, University of California, Los Angeles, California, USA

Research organizations are becoming more reliant on quantitative approaches to determine how to recruit and promote researchers, allocate funding, and evaluate the impact of prior allocations. Many of these quantitative metrics are based on research publications. Publication metrics are not only important for individual careers, but also affect the progress of science as a whole via their role in the funding award process. Understanding the origin and intended use of popular publication metrics can inform an evaluative strategy that balances the usefulness of publication metrics with the limitations of what they can convey about the productivity and quality of an author, a publication, or a journal. This paper serves as a brief introduction to citation networks like Google Scholar, Web of Science Core Collection, Scopus, Microsoft Academic, and Dimensions. It also explains two of the most popular publication metrics: the h‐index and the journal impact factor. The purpose of this paper is to provide practical information on using citation networks to generate publication metrics, and to discuss ideas for contextualizing and juxtaposing metrics, in order to help researchers in translational science and other disciplines document their impact in as favorable a light as may be justified.

INTRODUCTION

As the scale of global research continues to increase, research organizations are becoming more reliant on quantitative approaches to determine how to recruit and promote researchers, allocate funding, and evaluate the impact of prior allocations. It has been common practice for funders; appointment, tenure, and promotion committees; academic administrations; publishers; and others to apply a variety of quantitative metrics to rank researchers, papers, journals, and even institutions and countries. 1 , 2 , 3 Many of these quantitative metrics are based on research publications. The total number of publications can be used to infer scientific output or productivity, whereas the number of citations to those publications may be used to infer the impact of the research. In aggregate, these “publication metrics” have potential to serve both researchers and those evaluating the work of researchers. 4 For the researcher, metrics can highlight the scope and strengths of one’s work, forming a useful starting point to answer the question, “what have I done?” Researchers can use metrics to structure their review of their past work, design efficient summaries of their prior research trajectory, and inform future decision making. For an evaluator (and many researchers eventually find themselves in the position of evaluating the scientific achievements of others), understanding a researcher’s publication and citation record provides context for judging their achievements and future potential.

Publication metrics are not only important for individual careers, but also affect the progress of science as a whole via their role in the funding award process. Funders that receive many grant applications may see quantitative publication metrics as a shortcut to assess research quality and impact. Researchers competing for grants may in turn strive to achieve a perceived threshold for certain metrics. An understanding of the origin and intended use of popular publication metrics can inform an evaluative strategy that balances the usefulness of metrics with the limitations of what they can convey about the productivity and quality of an author, a paper, or a journal.

This is especially important in translational science, a discipline created to improve patient and population outcomes. Translational science researchers are iteratively called upon by peers, funders, and their institutions to use their publication records to document their progress toward these outcomes, and translational science evaluators use publication metrics in their assessments. 5 , 6 The purpose of this paper is to briefly describe the most frequently used quantitative publication metrics, provide practical information on generating metrics, and discuss ideas for contextualizing and juxtaposing metrics, in order to help researchers in translational science and other disciplines document their impact in as favorable a light as may be justified.

CITATION NETWORKS

Citation counts represent the number of times a publication has been cited by other publications. Because citation counts represent a key constituent of the most frequently used publication metrics, understanding their source is necessary to effectively utilize publication metrics. Citation counts are usually provided by citation networks or indices, which are systems that connect each publication to every publication it cites, as well as every publication that has cited it. No single citation network functions as the dominant source of citation data. Instead, several citation networks exist and vary according to which publications they include. As a consequence, citation networks also vary in the citation numbers they generate for any given publication. Citation networks include traditional indexed databases, which contain article metadata ingested from publisher sources and accessed by users via a searchable interface; academic search engines, which scrape the web for relevant content and allow users to search the content via a web interface; and metadata datasets that can only be accessed computationally (e.g., via API [Application Programming Interfaces]). However citation data are compiled, networks are created by connecting each citation reference in the publication’s bibliography to that reference’s record in the database. For example, if paper A cites another paper B, the citation network “reads” that citation and adds one cited reference to the existing total citation count of paper B. Users have a choice among multiple science citation networks. Six of the largest are described in Table 1.
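The linkage described above can be modeled as a simple adjacency structure: each paper's bibliography contributes one citation to every paper it references, and a paper's citation count is the number of incoming links. A hypothetical sketch in Python (the paper IDs and the `cites` map are invented for illustration):

```python
from collections import defaultdict

# Each key is a paper; its value lists the papers its bibliography cites.
cites = {
    "A": ["B", "C"],   # paper A cites papers B and C
    "B": ["C"],
    "C": [],
    "D": ["C", "B"],
}

def citation_counts(cites):
    """Count incoming citations for every cited paper in the network."""
    counts = defaultdict(int)
    for paper, references in cites.items():
        for ref in references:
            counts[ref] += 1   # a citation from `paper` increments `ref`
    return dict(counts)

print(citation_counts(cites))  # C is cited 3 times, B is cited twice
```

Real citation networks differ mainly in which papers make it into the `cites` map at all, which is why the same publication can show different counts in different networks.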

Table 1. Descriptions of common citation networks

Abbreviation: API, Application Programming Interfaces.

Practical applications

To select one or more citation networks, users may consider (1) the network’s coverage of publication types and areas of research, (2) its number of citation linkages between publications, (3) the user‐friendliness of its interface, and (4) its functionality when it comes to automatically generating metrics. Researchers should be aware that most citation networks are primarily comprised of journal articles. Therefore, it may be more difficult to assess the citation impact of gray literature such as white papers, reports, clinical trials, or other nontraditional publications. 7 Recent studies comparing various citation networks for accuracy and completeness may help inform the decision to choose a particular citation network. 8 , 9 , 10 , 11 , 12 Because publication metrics are derived from citation counts within citation networks, and citation counts vary depending on the network’s publication coverage, metrics derived for a given author from one network will not necessarily be concordant with metrics for the same author but derived from another network. A researcher may find that one citation network contains records for most of their publications, whereas another network may only have records for some of their publications. Although it is generally advantageous for researchers to find a network containing records for all of their publications, 8 researchers selecting among networks must also consider that networks’ bibliometric data vary according to the quality of the included data, in addition to the quantity of publications reported. For example, a researcher is likely to find more of their publications, and therefore a higher citation count and h‐index, by using Google Scholar. However, as described in Table  1 , Google Scholar may contain erroneous or duplicate records due to the way it collects publication data from the web. 
Although at first glance the researcher may think that Google Scholar offers a higher number and therefore a “better” metric, the accuracy of that metric may be questionable if it is based upon faulty bibliographic data. Precise documentation by researchers of the citation network(s) they select to inform their publication metrics provides the opportunity for their evaluators to assess the accuracy of their analyses.

If a researcher determines that metrics from multiple citation networks are useful to show context for their work, two or more different citation networks can be documented clearly to avoid confusion in interpreting their metrics. For example, a researcher working on a tenure and promotion dossier may decide to primarily use Web of Science Core Collection to search for their journal articles, and use Web of Science Core Collection’s citation counts and calculated h‐index to document their career’s published articles. This researcher may also use Google Scholar to find their gray literature publications, and decide to include the citation counts of those publications to promote their research that resulted in a white paper, report, or other nonarticle publication type. In this example, the dossier should clearly indicate that the metrics presented for the journal articles came from Web of Science Core Collection, while the metrics presented for the gray literature publications came from Google Scholar.

PUBLICATION‐LEVEL METRICS

Publication‐level (including both articles and nonarticle publications) metrics represent any quantitative number relating to an individual publication. Most commonly, this takes the form of citation counts: the number of citations to any given publication. In addition to citation counts, the number of article views and downloads are frequently listed on journal articles hosted on publisher websites. Other emerging metrics known as “alternative metrics” often seek to indicate social impact rather than solely scientific impact. 13 , 14 Although they may theoretically be applied to authors, institutions, journals, or other entities, in practice, the most prevalent implementation of alternative metrics is publication‐level. Examples include the number of times a publication has been shared on social media or blogs, the number of comments or “likes” it has received, or the number of times it has been mentioned in mass media. Due to their loosely defined and rapidly changing nature, alternative metrics are difficult to locate, although one company, Altmetrics, 15 has monetized centralizing various indicators into an “attention score.” Alternative metrics can add societal context and diversity to a research evaluation, 16 but researchers and evaluators should keep in mind that metrics reflecting public engagement may not correlate with scientific impact. 13 , 17

Citation counts for an individual publication can be generated by searching a title or Digital Object Identifier (DOI) in any of the citation networks described in Table  1 . All six citation networks display the number of citations to a particular publication on the search results page for that publication. Individual publication citation counts may be used to highlight particularly impactful citations, but a more creative approach for a researcher’s dossier might be to group publications together and write about the citation impact of the group. For example, a researcher may aggregate citations by time period (e.g., before or after getting a prior promotion or being awarded a grant), by their different research fields or subfields (e.g., clinical and basic science), or by authorship type (e.g., first vs. senior [last] author). This facilitates discussion of publication impact in context, and may be useful to assert the value of a previous grant investment, explain impact variation within different fields, or provide evidence that research leadership affected impact. Another approach for a researcher or evaluator might be to selectively use comparative metrics by comparing a single or group of publications to any of the following: other articles published in the same field, other articles published within the same journal, or other articles published by peer researchers.

One strategy for utilizing publication‐level metrics for a grouped set of publications is to use the mean number of citations per publication. This number may be higher or lower than the same author's h‐index (see below) depending on the distribution of citations within the body of work. Supplementing the mean number of citations for a large list of articles with the median and/or the standard deviation would help evaluators to understand the spread of the citation counts. Such measures of central tendency and variability could be used alongside, or instead of, direct citation counts for individual publications when presenting any of the previously discussed methods of grouping publications.
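The summary measures suggested above can be computed directly with Python's standard `statistics` module (the citation counts below are made-up examples):

```python
import statistics

# Citation counts for a hypothetical group of publications.
citations = [2, 3, 3, 5, 8, 40]

mean = statistics.mean(citations)
median = statistics.median(citations)
stdev = statistics.stdev(citations)   # sample standard deviation

# One highly cited paper pulls the mean well above the median;
# reporting both (plus the spread) shows evaluators that distribution.
print(f"mean={mean:.1f}, median={median}, stdev={stdev:.1f}")
```

Here the mean (about 10.2) is more than double the median (4.0), illustrating why a mean alone can misrepresent a body of work dominated by a single highly cited publication.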

AUTHOR‐LEVEL METRICS

The h‐index 18 is the number ( N ) for an author such that at least N of the author’s publications have a minimum of N citations each. For example, imagine an author with any number of total publications, and at least 10 of their total papers have 10 or more citations. This author’s h‐index would be 10 (Table  2 ). The h‐index is a widely used and easily understood metric that demonstrates the citation impact across an author’s career. However, users of the h‐index should recognize several major limitations of this metric. The h‐index is consistently skewed toward researchers’ older papers, which have had more time to accumulate citations. A high h‐index is challenging to achieve for early career researchers. The h‐index also weights all authors equally regardless of authorship position, meaning it does not provide information about the relative contribution of authors. Additionally, h‐indices may be lower for researchers who have published extensively, but have only a limited number of highly cited publications compared with researchers whose papers’ citations are more evenly distributed. The h‐index is also vulnerable to extreme instances of self‐citation, or in‐group citation, which artificially inflate it. 19 , 20 Finally, and importantly, the h‐index should not be used to compare researchers across fields, as citation rates vary widely between disciplines. 21 As long as the drawbacks are understood, the h‐index can be a useful tool in an analysis comparing the total publication output of an author with the distribution of citations to their work. Numerous alternatives to the h‐index have been proposed that attempt to correct for such drawbacks, including variations on the h‐index itself, 22 , 23 , 24 , 25 the e‐index, 26 the g‐index, 27 and the m‐quotient, 18 , 28 but none have reached the popularity of the original h‐index.

Two different patterns of the distribution of authors’ total number of publications

An h‐index can be calculated manually from a list of an author's publications’ total citation counts. Ideally, citation counts should be generated from a single citation network; citation counts collected from multiple networks should be presented separately and not joined into a single h‐index. If the list of citation counts for each paper is sorted from highest to lowest, it is simple to spot the crossover point at which the number of citations meets or exceeds the number of publications (Table  2 ). If an author does not have a list of their publications at hand, an h‐index can also be generated by searching Web of Science Core Collection, Scopus, or Google Scholar. In Web of Science and Scopus, an author’s publications can be searched by author name, affiliation, or unique identifier (such as ORCID); and an h‐index may be generated from the result set. In Google Scholar, authors will need to create a profile page and add their publications to their account to have their h‐index displayed. Researchers should be aware that Google Scholar may display duplicate records for their publications. This can cause inflated citation counts, if duplicate records are counted separately as citing papers. It can also cause the total number of citations to one publication to be split across the duplicate records for that same publication, decreasing the author’s h‐index. Researchers are encouraged to verify their publication records in Google Scholar. As with the publication‐level metrics discussed previously, it may be useful to consider multiple h‐indices for groups of publications that represent temporal, thematic, or authorship responsibility to either argue for or evaluate specific impact.

JOURNAL‐LEVEL METRICS

The most popular journal metric is the journal impact factor (JIF), 29 created by the scientometrician Eugene Garfield. The JIF is the total number of citations a journal received in a given year to the items it published in the preceding 2‐year period, divided by the number of “citable items” it published over that same period. The denominator is currently defined as articles, review articles, and proceedings papers, 30 whereas the numerator includes citations to all publications in a journal. The JIF is a proprietary metric owned by Clarivate Analytics, which publishes Journal Citation Reports (JCR; subscription required), a database of annually updated JIFs, journal rankings, and other journal‐level metrics. The JIF was originally designed to indicate a relationship between a journal’s publications and citations, but there have been many critiques of its evolution into a single‐number proxy for broad scientific value. 31 , 32 , 33
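The JIF calculation is a single division: citations received in the measurement year to items from the two preceding years, divided by the citable items published in those years. A worked sketch with invented numbers (not data for any real journal):

```python
# Hypothetical journal: computing its 2021 impact factor.
citations_in_2021_to_2019_2020_items = 1400  # numerator: citations to all publications
citable_items_2019_2020 = 400                # denominator: articles, reviews, proceedings

jif_2021 = citations_in_2021_to_2019_2020_items / citable_items_2019_2020
print(jif_2021)  # -> 3.5
```

The numerator/denominator asymmetry the text describes is visible here: because the numerator counts citations to all publications while the denominator counts only "citable items," reclassifying items out of the denominator raises the quotient without any change in citations.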

Responsible application of JIFs requires an understanding of how the impact factor is calculated. For example, because citable items are defined to include research papers but to exclude nonresearch publication types (e.g., letters and editorials), editors may restructure their publication types in order to publish research articles in sections that were classified by Clarivate as “editorial.” Reducing the number of items in the JIF denominator in this way increases the total JIF. To keep JIFs proprietary, Clarivate does not disclose information on journals’ citable item sections, making it impossible for users to know if the metric is fair or accurate. 34 Editors may also pursue a higher impact factor via their journal’s submissions by asking submitters to include more citations to the journal in their manuscripts, or by soliciting more highly cited article types. Review articles consistently receive more citations than original research articles, 35 so editors may be incentivized to focus on secondary rather than primary publications. The pursuit of citations contributes to publication bias, wherein prestigious and aspirational journals reject incremental or replicative research in favor of novel results whose findings may not be reliable. 36 As with h‐indices, JIFs are also susceptible to fraudulent citations. 37 Most importantly, the impact factor of a journal is not capable of conveying the quality, scientific accuracy, or impact of any particular article published within that journal. The impact factor reflects citation patterns to the journal title as a whole, not the impact of any individual publication. Other journal metrics include Eigenfactor, 38 Scopus CiteScore, 39 the SCImago Journal Rank indicator, 40 and various modifications of the JIF itself, which may be useful for researchers desiring to explore or verify journal metrics for a particular context. 41 , 42 However, the original JIF remains, by far, the most familiar journal‐level metric.

The journal impact factor for a particular journal title can be searched via JCR. Although some individual journals may list their impact factors on their websites, it is recommended that dates and JIFs be verified via JCR. Research evaluators may not be familiar with the relative prestige of journals outside their own discipline, so researchers may use this opportunity to make a compelling presentation of the JIFs of the journals where they have published. JCR contains journal ranking data, simplifying the process of comparative analysis. Researchers can compare the JIFs of the journals in which they have published to other journals in the same field. A journal without a sky‐high impact factor may still be in the top quartile of journals within one’s field. JCR also contains historical impact factor data, which may be useful for discussion of a researcher’s decision to publish in up‐and‐coming journals.

LIMITATIONS

Some of the key limitations of citation networks and their citation counts, the h‐index, and the JIF have been discussed in the present paper. However, other metric considerations, as well as the broader concept of quantitative publication metrics as a whole, should be further studied as evaluation policies and procedures are improved. This paper is intended as an introduction to the most frequently used publication metrics in the context of research careers or grant evaluations, and not as a thorough analysis of all available metrics. Additionally, this paper seeks to present practical information on how to access and apply popular metrics and tools in the context of research evaluation. Many of the products mentioned in this paper require expensive subscriptions that may be beyond the budget of some institutions. Understanding how the “free” alternatives, which collect user data in lieu of subscriptions, compare to the major subscription databases may be helpful for researchers trying to understand their options for accessing and presenting their publication metrics. Those who wish to gain a deeper understanding of their local subscriptions, or who seek further information about scientometrics, are encouraged to contact their institution’s librarian.

The use of quantitative strategies as a proxy for the scientific productivity, impact, and quality of research publications has both strengths and limitations.43,44 No metric can serve as a fully representative proxy for research quality. The research itself, which may include nonpublication outputs, must be evaluated based on scientific integrity, societal need, advancement of the field, and other qualities that matter to the evaluators (such as emphasis on support for new or under-represented researchers, or previously unfunded research topics). There is increasing recognition of the importance of utilizing publication metrics responsibly in research evaluation.45 The San Francisco Declaration on Research Assessment and the Leiden Manifesto provide recommendations and principles for improving research assessment and the appropriate use of metrics.46,47 Quantitative publication metrics may serve as one component of a holistic assessment. However, even when integrated into a peer-reviewed evaluative process that also includes qualitative assessment, metrics can either overly inflate or miss the perceived “impact” of research. Nevertheless, the ubiquity of publication metrics demands that funders, authors, and the publishing industry have a solid grasp of the strengths and weaknesses of using numbers as a proxy for scientific impact. Prudent use of publication metrics requires a thoughtful approach that includes a realistic understanding of what individual and aggregate metrics can convey. When used as part of a larger narrative, publication metrics can provide insight into an article’s reach, a journal’s evolution, or a researcher’s career. Strategic application of metrics can empower researchers to tell a clearer and more holistic story of their work, and responsible interpretation of metrics can empower evaluators to determine the future of scientific funding and advancement more efficiently, fairly, and consistently.

Future improvements in research evaluation strategies can incentivize Open Science and the greater dissemination of research outputs.48,49 Ultimately, the considered and transparent application and interpretation of publication metrics may help address some of the social inequities in science, provide more opportunity for under-represented researchers and research areas, improve the wellbeing of researchers caught in the “publish or perish” burnout cycle, and speed the most promising basic research toward clinical and policy implementation and improved outcomes.

CONFLICT OF INTEREST

The authors declared no competing interests for this work.

This research was supported in part by NIH National Center for Advancing Translational Science (NCATS) UCLA CTSI Grant Number UL1TR001881.

Cited Reference Search

Search for records that have cited a published work, and discover how a known idea or innovation has been confirmed, applied, improved, extended, or corrected. Find out who’s citing your research and the impact your work is having on other researchers in the world.

In the Arts & Humanities Citation Index, you can use cited reference search to find articles that refer to or include an illustration of a work of art or a music score; these references are called implicit citations.

  • You may also search on Cited Year(s), Cited Volume, Cited Issue, Cited Pages, Cited Title, or Cited DOI
  • Click Search; results from the cited reference index that include the work you’re searching for appear in a table. Every reference in the cited reference index has been cited by at least one article indexed in the Web of Science. The first author of a cited work always displays in the Cited Author column. If the cited author you specified in step 1 is not the primary author, the name you specified follows the name of the first author (click Show all authors). If you retrieve too many hits, return to the cited reference search page and add criteria for Cited Year, Cited Volume, Cited Issue, or Cited Pages.
A cited reference may not have a View Record link for several reasons:

  • The cited reference is not a source article in the Web of Science.
  • The reference may contain incomplete or inaccurate information and cannot be linked to a source article.
  • The reference may refer to a document from a publication outside the timespan of your subscription; for example, the article was published in 1992, but your subscription only gives you access to 20 years of data.
  • The cited item may refer to a document from a publication not covered by a database in your subscription.

Cited Reference Search Interface

Click View abbreviation list to see the abbreviations of journal and conference proceedings titles used as cited works; this list will open in a new browser tab.

When you complete a cited reference search, the number of citing items you retrieve may be smaller than the number listed in the Citing Articles column if your institution's subscription does not include all years of the database. In other words, the count in the Citing Articles column is not limited by your institution's subscription. However, your access to records in the product is limited by your institution's subscription.

  • Enter the name of the first author of a multi-authored article or book
  • Enter an abbreviated journal title followed by an asterisk or the first one or two significant words of a book title followed by an asterisk.
  • Try searching for the cited reference without entering a cited year in order to retrieve variations of the same cited reference. You can always return to the Cited Reference Search page and enter a cited year if you get too many references.
  • When searching for biblical references, enter Bible in the Cited Author field and the name of the book (Corinthians*, Matthew*, Leviticus*, etc.) in the Cited Work field. Ensure that you use the asterisk (*) wildcard in your search.

Follow these steps to find articles that have cited Brown, M.E. and Calvin, W.M. Evidence for crystalline water and ammonia ices on Pluto's satellite Charon. Science. 287(5450): 107-109. January 7, 2000:

  • On the Cited Reference Search page, enter Brown M* in the Cited Author field.
  • Enter Science* in the Cited Work field.
  • Click Search to go to the Cited Reference Search table. This page shows all the results from the Web of Science cited reference index that matched the query.
  • Page through the results to find this reference:

Cited Reference Search Example

  • Select the check box to the left of the reference.
  • Click the See Results button to go to the Cited Reference Search Results page to see the list of articles that cite the article by Brown and Calvin.

Every cited reference in the Cited Reference Index contains enough information to uniquely identify the document. Because only essential bibliographic information is captured, and because author names and source publication titles are unified as much as possible, the same reference cited in two different records should appear the same way in the database. This unification is what makes possible the Times Cited number on the Full Record page.

However, not all references to the same publication can be unified. As a consequence, a cited reference may have variations in the product.
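The grouping behind the Times Cited count can be sketched as keying each raw reference string to a normalized form; the reference strings and the normalization rule below are invented for illustration, and real citation indexes use far more elaborate matching:

```python
from collections import Counter

# Invented raw cited-reference tuples, including one variant volume number.
raw_refs = [
    ("BARD AJ", "NATURE", 1995, 374),
    ("Bard, A.J.", "Nature", 1995, 374),
    ("BARD AJ", "NATURE", 1995, 375),   # incorrect volume: cannot be unified
]

def key(author, journal, year, volume):
    # Normalize author and journal so equivalent forms collide on one key.
    return (author.replace(",", "").replace(".", "").upper(),
            journal.upper(), year, volume)

# Times Cited for each unified reference = size of its group.
times_cited = Counter(key(*ref) for ref in raw_refs)
print(times_cited[("BARD AJ", "NATURE", 1995, 374)])  # → 2
```

The unmatched variant keeps its own count of 1, which is why variant references appear separately in the product.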

For example, consider these variations of a reference to an article by A.J. Bard published in volume 374 of Nature:

The first reference contains the correct volume number and other bibliographic information. The View Record link takes you to the Full Record, which has a Times Cited count of 31.

The second reference contains a different volume number and it does not have a View Record link. Because a journal cannot have two different volume numbers in the same publication year, it is obvious that this is an incorrect reference to the same article.

Click Export at the top of the Cited Reference Search table to export the cited reference search results to Excel.

Articles indexed in the Science Citation Index Expanded cite books, patents, and other types of publications in addition to other articles. You can do a cited reference search for a patent to find journal articles that have cited it.

If you know the patent number, enter it in the Cited Work field. If you do not know the patent number, try entering the name of the first listed inventor or patent assignee in the Cited Author field. For example, to find references to U.S. patent 4096196-A, enter 4096196 in the Cited Work field. If you also subscribe to Derwent Innovations Index and the patent is included in the Derwent database, the patents you find in the citation index will be linked to the corresponding full patent records in Derwent Innovations Index.

Self-citations refer to cited references that contain an author name that matches the name of the author of a citing article.

You may want to eliminate self-citations from the results of a Cited Reference Search by combining a Cited Reference Search with a search by the source author.

  • Perform a Cited Reference Search to find items that cite the works of a particular author. Ensure that you complete both steps of a Cited Reference Search.
  • Go to the search page. Enter the name of the same author in the Author field. Click the Search button.
  • Go to the advanced search page.
  • Combine the two searches you just completed in a Boolean NOT expression (for example, #1 NOT #2). The results of the author search (the items written by the author) should be the set on the right-hand side of the operator.
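Outside the database interface, the same #1 NOT #2 logic is just a set difference. A minimal sketch with invented record IDs:

```python
# Hypothetical record IDs returned by each search.
citing_author_works = {"R1", "R2", "R3", "R4", "R5"}  # search #1: items citing the author
works_by_author = {"R2", "R5", "R9"}                  # search #2: items written by the author

# "#1 NOT #2": citing items minus the author's own papers (self-citations removed).
external_citations = citing_author_works - works_by_author
print(sorted(external_citations))  # → ['R1', 'R3', 'R4']
```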

Articles indexed in the product cite books, patents, and other types of publications in addition to other articles. You can do a cited reference search on a book to find journal articles that have cited it.

You should identify a book by entering the name of the first listed author in the Cited Author field and the first word or words of the title in the Cited Work field. Many cited works are abbreviated. If you are not sure how a word has been spelled or abbreviated, enter the first few letters of the word followed by an asterisk. For example, to search for records of articles that cite Edith Hamilton's book Mythology, you would enter Hamilton E* in the Cited Author field and Myth* in the Cited Work field.

Do not enter a year in the Cited Year field. Authors often cite a particular edition of a book, and the cited year is the year of the edition they are citing. Generally, you want to find all articles that cite a book, regardless of the particular edition cited.
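The asterisk behaves like an ordinary shell-style wildcard; Python's fnmatch module can illustrate how Myth* matches variant abbreviations (the cited-work strings below are invented):

```python
from fnmatch import fnmatch

# Invented cited-work strings as they might appear in a citation index.
cited_works = ["MYTHOLOGY", "MYTHOLOGY TIMELESS TALES", "HIST GREECE", "MYTH RITUAL RELIG"]

# "Myth*" matches any cited work beginning with "Myth", however it was abbreviated.
matches = [w for w in cited_works if fnmatch(w.lower(), "myth*")]
print(matches)  # → ['MYTHOLOGY', 'MYTHOLOGY TIMELESS TALES', 'MYTH RITUAL RELIG']
```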

For example, enter the following data on the Cited Reference Search page, and then click Search.

CITED AUTHOR Tuchman BW

CITED WORK Guns*

CITED YEAR 1962

Note the number of references that are retrieved. Now repeat the search using the following data:

CITED AUTHOR Tuchman B*

See how many more references you retrieved? Notice that the author has been cited as Tuchman B as well as Tuchman BW. Also, notice how many different cited years and cited page numbers there are for the same work.

Calculate Your Academic Footprint: Your H-Index

  • Get Started
  • Author Profiles
  • Find Publications (Steps 1-2)
  • Track Citations (Steps 3-5)
  • Count Citations (Steps 6-10)
  • Your H-Index

What is an H-Index?

The h-index captures research output based on the total number of publications and the total number of citations to those works, providing a focused snapshot of an individual’s research performance. Example: if a researcher has published at least 15 papers that have each been cited at least 15 times (and no larger number of papers meets this threshold), their h-index is 15.

Useful For

  • Comparing researchers of similar career length.
  • Comparing researchers in a similar field, subject, or department who publish in the same journal categories.
  • Obtaining a focused snapshot of a researcher’s performance.

Not Useful For

  • Comparing researchers from different fields, disciplines, or subjects.  
  • Assessing fields, departments, and subjects where research output is typically books or conference proceedings as they are not well represented by databases providing h-indices.

1 Working Group on Bibliometrics. (2016). Measuring Research Output Through Bibliometrics. University of Waterloo. Retrieved from https://uwspace.uwaterloo.ca/bitstream/handle/10012/10323/Bibliometrics%20White%20Paper%202016%20Final_March2016.pdf?sequence=4&isAllowed=y

2  Alakangas, S. & Warburton, J. Research impact: h-index. The University of Melbourne. Retrieved from http://unimelb.libguides.com/c.php?g=402744&p=2740739  

Calculate Manually

To manually calculate your h-index, organize articles in descending order, based on the number of times they have been cited.

In the below example, an author has 8 papers that have been cited 33, 30, 20, 15, 7, 6, 5 and 4 times. This tells us that the author's h-index is 6.

The table below illustrates the example: column 1 lists articles 1-8 and column 2 their citation counts; article 6 has 6 citations.

Article:   1   2   3   4   5   6   7   8
Citations: 33  30  20  15  7   6   5   4

  • An h-index of 6 means that this author has published at least 6 papers that have each received at least 6 citations.

More context:

  • The first paper has been cited 33 times, and gives us a 1 (there is one paper that has been cited at least once).
  • The second paper has been cited 30 times, and gives us a 2 (there are two papers that have been cited at least twice).
  • The third paper gives us a 3 and all the way up to 6 with the sixth highest paper.
  • The final two papers have no effect in this case, as they have been cited fewer than six times (Ireland, MacDonald & Stirling, 2012).
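The manual procedure above is easy to express in code. A minimal sketch (an illustration of the definition, not any database's actual implementation):

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for position, cites in enumerate(ranked, start=1):
        if cites >= position:
            h = position  # this paper still "counts" toward the index
        else:
            break
    return h

# The eight papers from the example above:
print(h_index([33, 30, 20, 15, 7, 6, 5, 4]))  # → 6
```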

1 Ireland, T., MacDonald, K., & Stirling, P. (2012). The h-index: What is it, how do we determine it, and how can we keep up with it? In A. Tokar, M. Beurskens, S. Keuneke, M. Mahrt, I. Peters, C. Puschmann, T. van Treeck, & K. Weller (Eds.), Science and the internet (pp. 237-247). Düsseldorf University Press.

Calculate Using Databases

  • Because Scopus and Web of Science track citations, each can also calculate an individual’s h-index based on the content of its own database.
  • Likewise, Google Scholar collects citations and calculates an author's h-index via the Google Scholar Citations Profile feature.

Each database may report a different h-index for the same individual because each indexes a different set of publications.

  • Last Updated: Oct 5, 2023 7:37 AM
  • URL: https://subjectguides.uwaterloo.ca/calculate-academic-footprint




How to Cite Sources | Citation Generator & Quick Guide

Citing your sources is essential in academic writing. Whenever you quote or paraphrase a source (such as a book, article, or webpage), you have to include a citation crediting the original author.

Failing to properly cite your sources counts as plagiarism , since you’re presenting someone else’s ideas as if they were your own.

The most commonly used citation styles are APA and MLA. The free Scribbr Citation Generator is the quickest way to cite sources in these styles. Simply enter the URL, DOI, or title, and we’ll generate an accurate, correctly formatted citation.


Table of contents

  • When do you need to cite sources?
  • Which citation style should you use?
  • In-text citations
  • Reference lists and bibliographies
  • Scribbr Citation Generator
  • Other useful citation tools
  • Citation examples and full guides
  • Frequently asked questions about citing sources

Citations are required in all types of academic texts. They are needed for several reasons:

  • To avoid plagiarism by indicating when you’re taking information from another source
  • To give proper credit to the author of that source
  • To allow the reader to consult your sources for themselves

A citation is needed whenever you integrate a source into your writing. This usually means quoting or paraphrasing:

  • To quote a source , copy a short piece of text word for word and put it inside quotation marks .
  • To paraphrase a source , put the text into your own words. It’s important that the paraphrase is not too close to the original wording. You can use the paraphrasing tool if you don’t want to do this manually.

Citations are needed whether you quote or paraphrase, and whatever type of source you use. As well as citing scholarly sources like books and journal articles, don’t forget to include citations for any other sources you use for ideas, examples, or evidence. That includes websites, YouTube videos , and lectures .


Usually, your institution (or the journal you’re submitting to) will require you to follow a specific citation style, so check your guidelines or ask your instructor.

In some cases, you may have to choose a citation style for yourself. Make sure to pick one style and use it consistently:

  • APA Style, widely used in the social sciences and beyond
  • MLA style, common in the humanities
  • Chicago notes and bibliography, also common in the humanities
  • Chicago author-date, used in the (social) sciences
  • There are many other citation styles for different disciplines.

If in doubt, check with your instructor or read other papers from your field of study to see what style they follow.

In most styles, your citations consist of:

  • Brief in-text citations at the relevant points in the text
  • A reference list or bibliography containing full information on all the sources you’ve cited

In-text citations most commonly take the form of parenthetical citations featuring the last name of the source’s author and its year of publication (aka author-date citations).

An alternative to this type of in-text citation is the system used in numerical citation styles , where a number is inserted into the text, corresponding to an entry in a numbered reference list.

There are also note citation styles , where you place your citations in either footnotes or endnotes . Since they’re not embedded in the text itself, these citations can provide more detail and sometimes aren’t accompanied by a full reference list or bibliography.

A reference list (aka “Bibliography” or “Works Cited,” depending on the style) is where you provide full information on each of the sources you’ve cited in the text. It appears at the end of your paper, usually with a hanging indent applied to each entry.

The information included in reference entries is broadly similar, whatever citation style you’re using. For each source, you’ll typically include the:

  • Author name
  • Publication date
  • Container (e.g., the book an essay was published in, the journal an article appeared in)
  • Location (e.g., a URL or DOI , or sometimes a physical location)

The exact information included varies depending on the source type and the citation style. The order in which the information appears, and how you format it (e.g., capitalization, use of italics) also varies.

Most commonly, the entries in your reference list are alphabetized by author name. This allows the reader to easily find the relevant entry based on the author name in your in-text citation.
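That alphabetization is a plain sort on the author name; a minimal sketch with invented entries:

```python
# Invented (author, year) reference entries.
entries = [
    ("Smith, J.", "2019"),
    ("Brown, M.", "2000"),
    ("Calvin, W.", "2021"),
]

# Alphabetize by author name, as most reference lists require.
reference_list = sorted(entries, key=lambda entry: entry[0])
print([author for author, _ in reference_list])  # → ['Brown, M.', 'Calvin, W.', 'Smith, J.']
```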

[Example: APA Style reference list]

In numerical citation styles, the entries in your reference list are numbered, usually based on the order in which you cite them. The reader finds the right entry based on the number that appears in the text.

[Example: Vancouver-style numbered reference list]

Because each style has many small differences regarding things like italicization, capitalization, and punctuation, it can be difficult to get every detail right. Using a citation generator can save you a lot of time and effort.

Scribbr offers citation generators for both APA and MLA style. Both are quick, easy to use, and 100% free, with no ads and no registration required.

Just input a URL or DOI or add the source details manually, and the generator will automatically produce an in-text citation and reference entry in the correct format. You can save your reference list as you go and download it when you’re done, and even add annotations for an annotated bibliography .

Once you’ve prepared your citations, you might still be unsure if they’re correct and if you’ve used them appropriately in your text. This is where Scribbr’s other citation tools and services may come in handy:

  • Plagiarism Checker
  • Citation Checker
  • Citation Editing

Plagiarism means passing off someone else’s words or ideas as your own. It’s a serious offense in academia. Universities use plagiarism checking software to scan your paper and identify any similarities to other texts.

When you’re dealing with a lot of sources, it’s easy to make mistakes that could constitute accidental plagiarism. For example, you might forget to add a citation after a quote, or paraphrase a source in a way that’s too close to the original text.

Using a plagiarism checker yourself before you submit your work can help you spot these mistakes before they get you in trouble. Based on the results, you can add any missing citations and rephrase your text where necessary.

Try out the Scribbr Plagiarism Checker for free, or check out our detailed comparison of the best plagiarism checkers available online.


Scribbr’s Citation Checker is a unique AI-powered tool that automatically detects stylistic errors and inconsistencies in your in-text citations. It also suggests a correction for every mistake.

Currently available for APA Style, this is the fastest and easiest way to make sure you’ve formatted your citations correctly. You can try out the tool for free below.

If you need extra help with your reference list, we also offer a more in-depth Citation Editing Service.

Our experts cross-check your in-text citations and reference entries, make sure you’ve included the correct information for each source, and improve the formatting of your reference page.

If you want to handle your citations yourself, Scribbr’s free Knowledge Base provides clear, accurate guidance on every aspect of citation. You can see citation examples for a variety of common source types below:

And you can check out our comprehensive guides to the most popular citation styles:

At college level, you must properly cite your sources in all essays , research papers , and other academic texts (except exams and in-class exercises).

Add a citation whenever you quote , paraphrase , or summarize information or ideas from a source. You should also give full source details in a bibliography or reference list at the end of your text.

The exact format of your citations depends on which citation style you are instructed to use. The most common styles are APA , MLA , and Chicago .

The abbreviation “ et al. ” (Latin for “and others”) is used to shorten citations of sources with multiple authors.

“Et al.” is used in APA in-text citations of sources with 3+ authors, e.g. (Smith et al., 2019). It is not used in APA reference entries .

Use “et al.” for 3+ authors in MLA in-text citations and Works Cited entries.

Use “et al.” for 4+ authors in a Chicago in-text citation , and for 10+ authors in a Chicago bibliography entry.

The Scribbr Citation Generator is developed using the open-source Citation Style Language (CSL) project and Frank Bennett’s citeproc-js . It’s the same technology used by dozens of other popular citation tools, including Mendeley and Zotero.

You can find all the citation styles and locales used in the Scribbr Citation Generator in our publicly accessible repository on GitHub.

APA format is widely used by professionals, researchers, and students in the social and behavioral sciences, including fields like education, psychology, and business.

Be sure to check the guidelines of your university or the journal you want to be published in to double-check which style you should be using.

MLA Style  is the second most used citation style (after APA ). It is mainly used by students and researchers in humanities fields such as literature, languages, and philosophy.


  • All subject areas
  • Agricultural and Biological Sciences
  • Arts and Humanities
  • Biochemistry, Genetics and Molecular Biology
  • Business, Management and Accounting
  • Chemical Engineering
  • Computer Science
  • Decision Sciences
  • Earth and Planetary Sciences
  • Economics, Econometrics and Finance
  • Engineering
  • Environmental Science
  • Health Professions
  • Immunology and Microbiology
  • Materials Science
  • Mathematics
  • Multidisciplinary
  • Neuroscience
  • Pharmacology, Toxicology and Pharmaceutics
  • Physics and Astronomy
  • Social Sciences
  • All subject categories
  • Acoustics and Ultrasonics
  • Advanced and Specialized Nursing
  • Aerospace Engineering
  • Agricultural and Biological Sciences (miscellaneous)
  • Agronomy and Crop Science
  • Algebra and Number Theory
  • Analytical Chemistry
  • Anesthesiology and Pain Medicine
  • Animal Science and Zoology
  • Anthropology
  • Applied Mathematics
  • Applied Microbiology and Biotechnology
  • Applied Psychology
  • Aquatic Science
  • Archeology (arts and humanities)
  • Architecture
  • Artificial Intelligence
  • Arts and Humanities (miscellaneous)
  • Assessment and Diagnosis
  • Astronomy and Astrophysics
  • Atmospheric Science
  • Atomic and Molecular Physics, and Optics
  • Automotive Engineering
  • Behavioral Neuroscience
  • Biochemistry
  • Biochemistry, Genetics and Molecular Biology (miscellaneous)
  • Biochemistry (medical)
  • Bioengineering
  • Biological Psychiatry
  • Biomaterials
  • Biomedical Engineering
  • Biotechnology
  • Building and Construction
  • Business and International Management
  • Business, Management and Accounting (miscellaneous)
  • Cancer Research
  • Cardiology and Cardiovascular Medicine
  • Care Planning
  • Cell Biology
  • Cellular and Molecular Neuroscience
  • Ceramics and Composites
  • Chemical Engineering (miscellaneous)
  • Chemical Health and Safety
  • Chemistry (miscellaneous)
  • Chiropractics
  • Civil and Structural Engineering
  • Clinical Biochemistry
  • Clinical Psychology
  • Cognitive Neuroscience
  • Colloid and Surface Chemistry
  • Communication
  • Community and Home Care
  • Complementary and Alternative Medicine
  • Complementary and Manual Therapy
  • Computational Mathematics
  • Computational Mechanics
  • Computational Theory and Mathematics
  • Computer Graphics and Computer-Aided Design
  • Computer Networks and Communications
  • Computer Science Applications
  • Computer Science (miscellaneous)
  • Computers in Earth Sciences
  • Computer Vision and Pattern Recognition
  • Condensed Matter Physics
  • Conservation
  • Control and Optimization
  • Control and Systems Engineering
  • Critical Care and Intensive Care Medicine
  • Critical Care Nursing
  • Cultural Studies
  • Decision Sciences (miscellaneous)
  • Dental Assisting
  • Dental Hygiene
  • Dentistry (miscellaneous)
  • Dermatology
  • Development
  • Developmental and Educational Psychology
  • Developmental Biology
  • Developmental Neuroscience
  • Discrete Mathematics and Combinatorics
  • Drug Discovery
  • Drug Guides
  • Earth and Planetary Sciences (miscellaneous)
  • Earth-Surface Processes
  • Ecological Modeling
  • Ecology, Evolution, Behavior and Systematics
  • Economic Geology
  • Economics and Econometrics
  • Economics, Econometrics and Finance (miscellaneous)
  • Electrical and Electronic Engineering
  • Electrochemistry
  • Electronic, Optical and Magnetic Materials
  • Emergency Medical Services
  • Emergency Medicine
  • Emergency Nursing
  • Endocrine and Autonomic Systems
  • Endocrinology
  • Endocrinology, Diabetes and Metabolism
  • Energy Engineering and Power Technology
  • Energy (miscellaneous)
  • Engineering (miscellaneous)
  • Environmental Chemistry
  • Environmental Engineering
  • Environmental Science (miscellaneous)
  • Epidemiology
  • Experimental and Cognitive Psychology
  • Family Practice
  • Filtration and Separation
  • Fluid Flow and Transfer Processes
  • Food Animals
  • Food Science
  • Fuel Technology
  • Fundamentals and Skills
  • Gastroenterology
  • Gender Studies
  • Genetics (clinical)
  • Geochemistry and Petrology
  • Geography, Planning and Development
  • Geometry and Topology
  • Geotechnical Engineering and Engineering Geology
  • Geriatrics and Gerontology
  • Gerontology
  • Global and Planetary Change
  • Hardware and Architecture
  • Health Informatics
  • Health Information Management
  • Health Policy
  • Health Professions (miscellaneous)
  • Health (social science)
  • Health, Toxicology and Mutagenesis
  • History and Philosophy of Science
  • Horticulture
  • Human-Computer Interaction
  • Human Factors and Ergonomics
  • Immunology and Allergy
  • Immunology and Microbiology (miscellaneous)
  • Industrial and Manufacturing Engineering
  • Industrial Relations
  • Infectious Diseases
  • Information Systems
  • Information Systems and Management
  • Inorganic Chemistry
  • Insect Science
  • Instrumentation
  • Internal Medicine
  • Issues, Ethics and Legal Aspects
  • Leadership and Management
  • Library and Information Sciences
  • Life-span and Life-course Studies
  • Linguistics and Language
  • Literature and Literary Theory
  • LPN and LVN
  • Management Information Systems
  • Management, Monitoring, Policy and Law
  • Management of Technology and Innovation
  • Management Science and Operations Research
  • Materials Chemistry
  • Materials Science (miscellaneous)
  • Maternity and Midwifery
  • Mathematical Physics
  • Mathematics (miscellaneous)
  • Mechanical Engineering
  • Mechanics of Materials
  • Media Technology
  • Medical and Surgical Nursing
  • Medical Assisting and Transcription
  • Medical Laboratory Technology
  • Medical Terminology
  • Medicine (miscellaneous)
  • Metals and Alloys
  • Microbiology
  • Microbiology (medical)



  • Open access
  • Published: 29 September 2020

Science Citation Index (SCI) and scientific evaluation system in China

  • Junxi Qian (ORCID: orcid.org/0000-0001-9638-3808)¹
  • Zhenjie Yuan²
  • Jie Li²
  • Hong Zhu²

Humanities and Social Sciences Communications, volume 7, Article number: 108 (2020)

7029 Accesses

5 Citations

5 Altmetric


  • Science, technology and society
  • Social policy

In February 2020, China’s Ministry of Education and Ministry of Science and Technology issued an official Opinion discouraging the use of the Science Citation Index (SCI) as a framework for the assessment of research performance. There is a need to assess the origin of the new policy, and how it will reshape cultures and practices of scientific knowledge production in China. We suggest that while concerns over the quality of research and conduct of scientists are at play, a deeper reason underlying the government’s adoption of a more cautious stance towards SCI is wider social controversy around what system of research assessment is best suited to social development and wellbeing in China. However, failing to continue to engage in international publication and collaboration would be self-defeating for China. We propose three recommendations for reforming scientific evaluation in China: diversity of criteria, autonomy of scientific evaluation, and quality of peer-review.

Introduction

On 18 February 2020, an Opinion document jointly released by the Ministry of Education and Ministry of Science and Technology of China (henceforth the Opinion) triggered widespread and extensive debates among Chinese scientists (Ministry of Education and Ministry of Science and Technology, 2020; Mallapaty, 2020). In the Opinion, the two ministries that steer the direction of scientific research and education in China officially discouraged the long-accepted practice of using the Science Citation Index (SCI) as the criterion for research assessment. It mandates that SCI-related indicators (such as the number of publications in SCI-indexed journals, the impact factors of those journals, and the number of citations of published works) should not be treated as direct evidence of scientific merit, that benchmarking activities such as ranking universities and departments need to be reduced, and that peer-review needs to be expanded and strengthened to give more voice to expert opinion instead of metrics (Ministry of Education and Ministry of Science and Technology, 2020; Mallapaty, 2020). The Opinion is the culmination of a series of official notices and instructions issued in recent years aimed at reforming and optimizing research evaluation in China (Chinese Communist Party Central Committee and State Council, 2018). It endeavors, foremost, to put a brake on the obsession with SCI papers and impact factors in current practice (Fig. 1).

Fig. 1: Annual numbers of SCI-indexed papers from China (blue bars) and the percentage of contributions from China among SCI-indexed research outputs (yellow line).

The dilemmas of SCI-based research assessment in China

To the best of our knowledge, the Chinese government never officially imposed the use of SCI in research assessment. Nonetheless, since its introduction to China in the 1980s, SCI has been used as an authoritative framework for evaluating research output and has played a vital role in deciding scientists' chances of promotion and access to funding, accolades, and even cash rewards from their employers. The reasons for universities, funding agencies and scientific communities in China to rely on SCI are mainly threefold: (a) to encourage Chinese scientists to produce work that meets international standards of excellence; (b) to prompt Chinese scientists to keep up to date with the latest scientific frontiers and cutting-edge research areas; and (c) to instate an objective and quantifiable set of criteria in practices of evaluation. These objectives have been achieved to a certain extent, but severe side effects do exist. The first set of problems relates to the conduct of research per se, while the second and deeper set probes into wider social controversy around the politics of knowledge production, i.e., concerns over what system of research assessment is best suited to social development and wellbeing in China.

SCI is not without shortcomings: a journal's impact factor does not necessarily reflect its scientific value, for high-quality papers do not always appear in high-impact journals. Also, high-quality research may not immediately attract many citations, and impact factors and citation patterns vary greatly across disciplines. This is true of a variety of other citation metrics as well (Van Noorden, 2010; Larsen and von Ins, 2010; Harzing, 2011). In China, an exclusive focus on impact factors and citation figures has in many ways deflected funding agencies' attention from real breakthroughs and innovations. The emphasis on the number of publications also leads some researchers to pursue quantity at the expense of quality, resulting in many published papers of mediocre scientific value (Sahel, 2011). Assessment based on SCI also marginalizes high-quality research published in non-SCI journals, including home-grown Chinese-language journals. The Opinion thus requires that the number of publications in SCI journals, and the journals in which papers are published, cannot be used as direct indicators of the quality of research.

Metrics may also incentivize (mis)conduct that diverges from standards of academic integrity, honesty and rigor (Tang, 2019). Forms of misconduct are not restricted to plagiarism and fraud in papers. For example, in 2017, Tumor Biology, published by Springer, retracted 107 papers, all by Chinese authors, because the authors had provided made-up contact information for potential reviewers, and the review processes ended up being manipulated by third-party agencies that profit from "faking" peer review (Chen, 2017). Also, certain journals have in recent years witnessed a concentration of works by authors from China. We suspect that this is because of closely knit networks of editors, reviewers and authors, which result in superficial peer-review, easy acceptance, and deliberate self-citing within the same journals to boost impact factors (see Guglielmi, 2019 for similar patterns of behavior among Italian scientists).

These points refer to concerns over the quality of research and the conduct of researchers, which are corroborated by theoretical discussions of citation metrics. However, the bibliometrics literature offers little insight into a deeper reason that has driven the Chinese government to adopt a more cautious stance towards SCI. We surmise that it is related to concerns over the social relevance of science, a notion that scientific communities around the world are grappling with (Frodeman and Holbrook, 2011). The promotion of SCI, which started in the 1980s, ushered in a new system of research areas, questions and paradigms, a system esteemed as international and global. Also, SCI values knowledge as crystallized in written form but does not directly measure the social wellbeing and benefits that knowledge generates. In parallel, an older and local paradigm of research in China emphasized "knowledge in practice", namely that the priority of research was to address the needs of social wellbeing and development. This is not to say that the two paradigms had no overlap, or that SCI does not allow space for research that tackles urgent needs in China. However, discrepancies and conflicts are manifested in a number of ways.

For example, the emphasis on SCI publications may encourage scientists to spend most of their time writing while discouraging them from applying research findings to real-world problems. Also, research that stimulates new technologies, new products or good policies in the Chinese context may not belong to a cutting-edge area in the SCI context, and thus struggles to make it into the pages of SCI journals. As a result, scientists who focus on the social value of research may not be properly rewarded by the system. Moreover, some fields of knowledge, such as traditional Chinese medicine, face disproportionate difficulties fitting into international systems of knowledge. Finally, given that SCI papers are predominantly published in English and access is conditioned on costly subscriptions, members of Chinese society face particular difficulties in accessing research findings in SCI journals, even when the research is funded by Chinese taxpayers (setting aside legally gray channels such as Sci-Hub and Library Genesis). This may severely constrain the ability of SCI papers to stimulate entrepreneurship, innovation and public policy in China.

As a consequence, the Opinion prescribes that when applied research and technological innovation are assessed, research publications should not serve as the sole basis of evaluation but should be combined with the contribution of the research to technical solutions, new products and new technologies. The timing of the Opinion appears to echo another official instruction issued by the Ministry of Science and Technology, which orders that research related to the recent outbreak of the novel coronavirus disease (COVID-19) should be primarily directed towards containing the virus and defeating the disease, rather than publishing papers (Ministry of Science and Technology, 2020a). The latter instruction was an immediate response to a flood of criticism on the Internet accusing some scientists of being more interested in publishing papers than in releasing knowledge to the public to better combat the virus and control the disease.

Ways ahead and beyond SCI

Chinese scientists have welcomed the Opinion and commended it as timely and necessary, as expressed in social media, but felt that the importance of SCI should not be neglected or underplayed by universities and funding agencies. Although SCI is internally uneven, it still provides a basically reliable reflection of the best-quality knowledge and research across the world. In fact, SCI is likely to be fairer, more transparent and more accurate than a home-made assessment method (Harzing, 2011). This is especially so for scientists who are early-career, less resourceful or lack a well-established pedigree. It is thus vital for Chinese scientists to continue to participate in international publication and collaboration and keep informed about cutting-edge research areas globally, although a shift towards quality instead of quantity is, in our view, a wise move.

Meanwhile, as a recent editorial in Nature suggests, the Chinese government hopes that non-SCI-centered research assessment will help expand the domestic publishing industry, within which many journals would be published in English to foster dissemination (Ministry of Science and Technology, 2020b). It appears that the government would like to oversee the growth of an alternative system of scientific knowledge, one that is more accessible and relevant to China, and in which Chinese scientists have more power to define hot zones of research. The number of papers published in these journals is likely to increase following the state's mandate, but there is still a long way to go before the government's ambition is realized. Domestic journals may give rise to a new system of metrics and will not necessarily be relevant to the needs of society if specific criteria are not in place to recognize social relevance as a key component of scientific merit. Challenges also lie ahead as to what standards and norms of editorial processes, peer-review and quality control are to be implemented, so that re-animated interest in the domestic publishing market does not lead to compromised levels of research excellence in China.

Given that the government's new stance towards SCI has created more questions than it has solved, we conclude this commentary by proposing several recommendations on how to create a less dogmatic and more flexible environment for research assessment in China, one that avoids the pitfalls mentioned earlier but by no means isolates Chinese scientists from international standards of excellence and quality. First, we recommend the establishment of a diversified and elastic framework of research assessment in China that combines metrics with qualitative, expert-opinion-based assessment of the quality and innovativeness of research. Even the use of SCI metrics can be made more flexible. For example, SCI is more helpful when comparing scientists within the same discipline than between different ones (note the need for metrics, SCI included, to normalize across different fields, see ref. 4), and funding agencies can allow a time window for citations to emerge instead of assessing freshly published papers. Also, the system must give more weight to contributions to social wellbeing. These can be measured in various forms, such as the relevance of research to the economic, social and technological needs of society and reports on the direct application of scientific knowledge. In fact, China's dilemmatic relationship with SCI is by no means unique, and there is plenty of international experience the Chinese government can consult. For example, the UK's Research Excellence Framework (REF, successor to the Research Assessment Exercise) has already moved towards greater emphasis on expert review and social impact. The Chinese government also needs to make policies to ensure that research funded by Chinese taxpayers is made accessible to the public through an open-access mandate, as is widely practiced in European and North American countries.
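Field normalization of the kind mentioned above is commonly computed by dividing a paper's citation count by the mean citation count of papers from the same field and publication year, so that a score of 1.0 means "cited exactly at the field average". A minimal sketch, in which the field names and baseline averages are invented purely for illustration:

```python
# Field-normalized citation score: a paper's citations divided by the
# mean citations of papers in the same field and publication year.
# The baseline table below is hypothetical, for illustration only.
FIELD_YEAR_BASELINE = {
    ("oncology", 2020): 12.0,    # invented mean citations per paper
    ("mathematics", 2020): 3.0,  # invented mean citations per paper
}

def normalized_score(citations, field, year):
    """Return citations relative to the field/year average (1.0 = average)."""
    return citations / FIELD_YEAR_BASELINE[(field, year)]

# The same raw count of 12 citations is exactly average in one field
# but four times the average in another:
print(normalized_score(12, "oncology", 2020))     # -> 1.0
print(normalized_score(12, "mathematics", 2020))  # -> 4.0
```

Comparing the normalized scores rather than the raw counts is what makes any between-discipline comparison meaningful, since fields differ greatly in size and citation speed.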

Second, the government needs to give more space to scientists to evaluate their peers instead of imposing standardized and rigid guidelines. As of February 2020, the Ministry of Science and Technology had already issued another notice specifying three types of "high-quality papers": papers published in domestic journals with an international reputation; papers published in top or important international journals; and papers included in top international conferences. It also prescribes that research output with applied value can be given an extra weight of 10–50% in assessment (14). These guidelines create more questions than solutions: how to define the international reputation of domestic journals; how to define top or important international journals; which conferences count as top conferences; whether papers published in such journals or venues are all of high quality; and how the 10–50% extra weight for applied research is justified. Just like the dogmatic use of SCI-related metrics, state guidelines are most likely to result in rigid definitions and criteria and, worse still, their dogmatic use in universities and state-led funding bodies. The government should leave scientists to decide what constitutes quality, innovation and social impact in their respective disciplinary contexts, and which kinds of indicators and criteria (including but not restricted to SCI-related ones) they would like to draw on.

Finally, given the great emphasis that the new policies place on peer-review and evaluation, an immense challenge lies before the Chinese government and scientists: how to maintain the quality and integrity of peer-review. In line with international practice, funding agencies and journals in China do require reviewers to report conflicts of interest and personal relationships. However, whether this has been strictly enforced remains an open question. Also, the social and cultural order in China is built on intersecting guanxi networks. Peer-review may be co-opted by these networks and degenerate into rent-seeking, compromising the objectivity and impartiality expected of it. We suggest that universities and funding agencies in China treat misconduct in peer-review as a specific form of scientific misconduct. Wrongdoings such as violation of anonymity, acceptance of advantage offered by a reviewee, and failure to report conflicts of interest need to be properly reported, investigated and penalized. Also, peer-review in China needs to involve international reviewers in addition to domestic ones, both to reduce the influence of personal favor on review processes and to help ensure that scientists within and outside China engage in dialog over shared scientific concerns and research questions.

The Chinese government's decision to move away from SCI will invite speculation that the Chinese state hopes to exert tighter control over the production of scientific knowledge and give more room to research that is congruent with patriotism, the state's ideological lines, and China's national interests, especially with regard to sensitive issues such as the origin of the recent COVID-19 pandemic. While we think that such suspicions are reasonable and well grounded, so far we cannot locate materials from reliable sources that establish a logical relationship vindicating them. Therefore, we do not include this point in the main body of the commentary, which focuses on center-periphery relations in academic knowledge production and their implications for the quality of research, but leave it to further exploration, should more evidence emerge in the future.

Chen S (2017) Science Journal Retracts 107 Research Papers by Chinese Authors, South China Morning Post, https://www.scmp.com/news/china/society/article/2089973/science-journal-retracts-107-research-papers-chinese-authors

Chinese Communist Party Central Committee, State Council (2018) Opinions on the deepening of reform on project review, talent evaluation and institution assessment, http://www.xinhuanet.com/politics/2018-07/03/c_1123074267.htm

Frodeman R, Holbrook JB (2011) NSF’s struggle to articulate relevance. Science 333(6039):157–158


Guglielmi G (2019) Clubby and ‘disturbing’ citation behavior by researchers in Italy has surged, Science News, https://www.sciencemag.org/news/2019/09/clubby-and-disturbing-citation-behavior-researchers-italy-has-surged

Harzing A (2011) The publish or perish book: a guide to effective and responsible citation analysis. Tarma Software Research, St Albans, UK


Larsen PO, von Ins M (2010) The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index. Scientometrics 84(3):575–603


Mallapaty S (2020) China bans cash rewards for publishing papers. Nature 579(7797):18

Ministry of Education, Ministry of Science and Technology (2020) Some opinions on the appropriate use of SCI-related indicators and re-orientation of research assessment, http://www.moe.gov.cn/srcsite/A16/moe_784/202002/t20200223_423334.html

Ministry of Science and Technology (2020a) Notice on strengthening the management of research projects on novel coronavirus pneumonia, http://finance.sina.com.cn/china/gncj/2020-01-30/doc-iimxxste7658896.shtml

Ministry of Science and Technology (2020b) Some measures to counter the Ill orientation of paper-first in research assessment, http://www.xinhuanet.com/tech/2020-02/24/c_1125617367.htm

Sahel J (2011) Quality versus quantity: assessing individual research performance. Science Translational Medicine 3(84):84CM13


Tang L (2019) Five ways China must cultivate research integrity. Nature 575(7784):589–591

Van Noorden R (2010) Metrics: a profusion of measures. Nature 465(7300):864–866



Author information

Authors and affiliations

The University of Hong Kong, Hong Kong, China

Junxi Qian

Guangzhou University, Guangzhou, China

Zhenjie Yuan, Jie Li & Hong Zhu


Corresponding authors

Correspondence to Junxi Qian or Hong Zhu .

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Qian, J., Yuan, Z., Li, J. et al. Science Citation Index (SCI) and scientific evaluation system in China. Humanit Soc Sci Commun 7 , 108 (2020). https://doi.org/10.1057/s41599-020-00604-w


Received : 27 April 2020

Accepted : 15 September 2020

Published : 29 September 2020

DOI : https://doi.org/10.1057/s41599-020-00604-w


This article is cited by

Mastering the scientific peer review process: tips for young authors from a young senior editor

  • Evgenios Agathokleous

Journal of Forestry Research (2022)



Academic Citation Index (ACI)


Airiti's Academic Citation Index (ACI) database is a rigorous, high-quality and comprehensive collection of Taiwan-published academic journals in the humanities and social sciences. It includes the complete set of Taiwan Social Science Citation Index (TSSCI) journals, as well as Taiwan Humanities Citation Index (THCI) journals, spanning 19 academic disciplines: education, library and information science, sports, history, social science, economics, general studies, anthropology, Chinese language, foreign languages, psychology, law, philosophy, political science, regional studies and geography, management, linguistics, the fine arts, and media. Its citation data is deeply integrated with the Airiti Library database, so that ACI citation links in Airiti Library entries help readers understand the relationships and derivations between references and facilitate linking to further references. The system also provides in-depth bibliometric analysis reports. Targeting the needs of institutional users, journal publishers and individual researchers, it provides in-depth citation analysis and statistical functions to help journal authors and publishers understand their own academic impact.

ACI indexes all TSSCI (Taiwan Social Science Citation Index) core journals and all THCI Core (Taiwan Humanities Citation Index Core) journals, along with other important publications; it currently covers more than 590 Taiwanese academic journals, with coverage dates ranging from as early as 1956 through 2016. Journals are grouped by subject into 19 disciplines: education, library and information science, sports, history, social science, economics, general studies, anthropology, Chinese language, foreign languages, psychology, law, philosophy, political science, regional studies and geography, management, linguistics, the fine arts, and media.

ACI's rich citation data has been deeply integrated with the Airiti Library, where citation links from ACI are now visible, helping readers understand the relationships between documents and linking them to more related literature. Documents found on the ACI platform can also be linked to their full-text pages with one click.

From 2016, deeper citation analysis and statistical functions are provided for journal publishers, universities and individual researchers, to help journal authors and publishers understand their own impact; universities can also use ACI citation analysis to see their scholars' research strength. For consultation on citation statistics, please contact us: [email protected]

Website: www.airitiaci.com

Interface preview

Deeply integrated with the Airiti Library: while browsing a record, you can simultaneously browse its references and, through citing and cited links, trace how academic works build on one another and find more related material.

Yearly publication and citation counts for journals in each discipline, together with the most-cited articles and scholars, give researchers and journal editors reference points for soliciting manuscripts and setting research directions.

Two-year and five-year impact factors are provided for journals; overall impact can also be calculated for universities and academic institutions, including numbers of papers produced and citation counts.

Historical impact factors, self-citation rates, cited half-life and other indicators inform journal development strategy.

Publication and citation counts by discipline for each institution's faculty and students reveal distinctive research strengths and influence.

Researchers' publication counts, citation counts and average citations per paper help research offices identify promising researchers.


February 16, 2024 by Gary Price


Journal Citation Reports (JCR): Clarivate Announces Changes to Journal Impact Factor Category Rankings

From a Clarivate Blog Post:

Over the past few years, we have implemented a series of policy changes for the Journal Citation Reports (JCR)™ aimed at aligning coverage between the Web of Science Core Collection™ and the JCR, providing more transparency of the data underlying JCR metrics, and encouraging a more inclusive, more holistic way of comparing journals. [Clip]

Recent changes to the JCR have included the addition of profile pages for journals indexed in the Arts and Humanities Citation Index (AHCI)™ and Emerging Sources Citation Index (ESCI)™ and the introduction of the Journal Citation Indicator (JCI) in 2021. The JCI is field-normalized to facilitate the comparison of journals across different disciplines, including the arts and humanities. We also extended the Journal Impact Factor (JIF)™ to AHCI and ESCI journals in 2023 so that it now encompasses all quality journals in the Web of Science Core Collection. [Clip]

In making these changes, we have evolved the JIF from an indicator of scholarly impact (the numerical value of the JIF) in the sciences and social sciences to an indicator of both scholarly impact and trustworthiness (having a JIF, regardless of the number) across all disciplines at the journal level. In 2023, we also changed the way the JIF is displayed, transitioning from three decimal places to one. This is important as it created more ties in JIF rankings, encouraging consideration of additional indicators and descriptive factors alongside the JIF when comparing journals.

Our commitment to enhancing transparency and trust continues in the forthcoming JCR release in June 2024. Two notable changes, which we announced last year, will be implemented in the JIF category rankings. We will move from edition-specific category JIF rankings to unified rankings for each of our 229 science and social science categories. We will no longer provide separate JIF rankings for the nine subject categories that are indexed in multiple editions.

For example, the Psychiatry category is included in both Science Citation Index Expanded (SCIE)™ and Social Sciences Citation Index (SSCI)™, and we currently publish a separate Psychiatry ranking for each edition. We will replace these separate rankings with a single, unified ranking. Additionally, the new unified rankings will include journals indexed in ESCI. Using Psychiatry once again as our example, we will display a single Psychiatry ranking that includes journals indexed in SCIE, SSCI and ESCI. [Clip]

This is the first in a series of updates on the 2024 JCR.
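The two-year JIF discussed in the post is defined as citations received in year Y to items a journal published in years Y−1 and Y−2, divided by the number of citable items it published in those two years; since 2023 it is displayed to one decimal place, as the post notes. A minimal sketch with invented counts:

```python
# Two-year Journal Impact Factor (JIF), sketched with invented counts:
# citations in year Y to items published in Y-1 and Y-2, divided by
# the number of citable items published in Y-1 and Y-2.
def two_year_jif(cites_to_prev_two_years, citable_items_prev_two_years):
    # Displayed to one decimal place since the 2023 JCR release.
    return round(cites_to_prev_two_years / citable_items_prev_two_years, 1)

# e.g. 1234 citations in 2023 to a journal's 2021-2022 papers,
# of which 456 were citable items (articles and reviews):
print(two_year_jif(1234, 456))  # -> 2.7
```

Rounding to one decimal place produces many more tied values than three decimals did, which is exactly the effect the post describes: ties push readers to consult additional indicators rather than rank journals on tiny JIF differences.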

Learn More, Read the Complete Blog Post



About Gary Price

Gary Price ( [email protected] ) is a librarian, writer, consultant, and frequent conference speaker based in the Washington D.C. metro area. He earned his MLIS degree from Wayne State University in Detroit. Price has won several awards, including the SLA Innovations in Technology Award and Alumnus of the Year from the Wayne State University Library and Information Science Program. From 2006 to 2009 he was Director of Online Information Services at Ask.com.



COMMENTS

  1. Google Scholar Citations

    Google Scholar Citations lets you track citations to your publications over time.

  2. Citation index

    In 1961, Eugene Garfield 's Institute for Scientific Information (ISI) introduced the first citation index for papers published in academic journals, first the Science Citation Index (SCI), and later the Social Sciences Citation Index (SSCI) and the Arts and Humanities Citation Index (AHCI).

  3. Citation Analysis

Scopus provides citation counts for articles indexed within it (limited to articles published in 1996 and after). It indexes over 15,000 journals from over 4,000 international publishers across the disciplines. To find the citation counts to your own articles:

  4. Science Citation Index-Expanded

    182 subject categories; 61M+ records; backfile to 1900. Web of Science Core Collection is a trusted citation index for locating research across a curated, multidisciplinary set of journals, books, and conferences.

  5. Practical publication metrics for academics

    This paper serves as a brief introduction to citation networks like Google Scholar, Web of Science Core Collection, Scopus, Microsoft Academic, and Dimensions. It also explains two of the most popular publication metrics: the h‐index and the journal impact factor.

  6. Scopus

    Scopus: Comprehensive, multidisciplinary, trusted abstract and citation database. Quickly find relevant and authoritative research, identify experts and gain access to reliable data, metrics and analytical tools. Be confident in advancing research, educational goals, and research direction and priorities — all from one database.

  7. Cited Reference Search

    Cited Reference Search Search for records that have cited a published work, and discover how a known idea or innovation has been confirmed, applied, improved, extended, or corrected. Find out who's citing your research and the impact your work is having on other researchers in the world.

  8. Web of Science Master Journal List

    Browse, search, and explore journals indexed in the Web of Science. The Master Journal List is an invaluable tool to help you find the right journal for your needs across multiple indices hosted on the Web of Science platform. Spanning all disciplines and regions, Web of Science Core Collection is at the heart of the Web of Science platform.

  9. Calculate Your Academic Footprint: Your H-Index

    The h-index captures research output based on the total number of publications and the total number of citations to those works, providing a focused snapshot of an individual's research performance. Example: If a researcher has 15 papers, each of which has at least 15 citations, their h-index is 15. Comparing researchers of similar career ...

  10. h-index

    The h-index is an author-level metric that measures both the productivity and citation impact of the publications, initially used for an individual scientist or scholar. The h-index correlates with success indicators such as winning the Nobel Prize, being accepted for research fellowships and holding positions at top universities. The index is based on the set of the scientist's most cited ...

  11. Web of Science platform

    Delve into locally focused research with four regional citation indexes that increase coverage of journals published throughout Latin America, South Africa, Mainland China, South Korea, and the Arab world.

  12. Citations, Citation Indicators, and Research Quality: An Overview of

    Dag W. Aksnes, Liv Langfeldt, and Paul Wouters. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories." First published online February 7, 2019. https://doi.org/10.1177/2158244019829575

  13. How to Cite Sources

    How to Cite Sources | Citation Generator & Quick Guide. Citing your sources is essential in academic writing. Whenever you quote or paraphrase a source (such as a book, article, or webpage), you have to include a citation crediting the original author. Failing to properly cite your sources counts as plagiarism, since you're presenting someone else's ideas as if they were your own.

  14. SJR : Scientific Journal Rankings

    International Scientific Journal & Country Ranking from SCImago.

  15. Science Citation Index (SCI) and scientific evaluation system in China

    In February 2020, China's Ministry of Education and Ministry of Science and Technology issued an official Opinion discouraging the use of the Science Citation Index...

  16. Academic Citation Index (ACI)

    It includes the complete set of Taiwan Social Science Citation Index (TSSCI) journals, as well as Taiwan Humanities Citation Index (THCI) journals spanning 19 academic disciplines: education, library information science, sports, history, social science, economics, general studies, anthropology, Chinese-language, foreign-language, psychology, law...

  17. Data Citation Index

    Data Citation Index: connecting data to the research it informs. It increases transparency, attribution, and accountability, and helps uncover hidden opportunities to advance your research.

  18. Journal Citation Reports (JCR): Clarivate Announces Changes to Journal

    Recent changes to the JCR have included the addition of profile pages for journals indexed in the Arts and Humanities Citation Index (AHCI)™ and Emerging Sources Citation Index (ESCI)™ and the introduction of the Journal Citation Indicator (JCI) in 2021. The JCI is field-normalized to facilitate the comparison of journals across different ...

  19. Highly Cited Researchers

    A trusted citation index for locating research across a curated, multidisciplinary set of journals, books, & conferences.

  20. Vendor offering citations for purchase is latest bad actor in ...

    The publications were written by ChatGPT. And the citation numbers were bogus: some came from the author excessively citing their own "work," while 50 others had been purchased for $300 from a vendor offering a "citations booster service." "The capacity to purchase citations in bulk is a new and worrying development," says Jennifer ...

  21. Application and development of zero-valent iron (ZVI)-based materials

    China is the greatest contributor, with the most published articles and collaborations, while the USA has the greatest academic influence, with the highest average citations per article. The Chinese Academy of Sciences and Tongji University are the primary institutions that produced the greatest number of publications and had the highest h-index.

  22. Emerging Sources Citation Index

    Web of Science Core Collection is a trusted citation index for locating research across a curated, multidisciplinary set of journals, books, & conferences.

  23. Jurnal Ners on Instagram: Call for Papers

    A call-for-papers post from jurnal.ners on Instagram, February 19, 2024: " CALL FOR PAPERS JURNAL NERS Respected Colleague! We have read your prior resear..."

  24. SciELO Citation Index

    The SciELO Citation Index™ helps researchers make connections to the broader research landscape, for a more complete global picture, by discovering new insights from research in regional journals in Latin America, Spain, Portugal, the Caribbean, and South Africa.
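The h-index definition that recurs in the snippets above ("the largest h such that the author has at least h papers with at least h citations each") is easy to compute from a list of per-paper citation counts. A minimal sketch in Python; the function name and the sample citation counts are illustrative, not taken from any of the tools mentioned:

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    # Sort counts in descending order, then find the last 1-based rank
    # at which the citation count still meets or exceeds the rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# The worked example from the guide: 15 papers, each cited at least 15 times.
print(h_index([15] * 15))         # -> 15
print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four papers with >= 4 citations)
```

Note that, as the guide warns, the counts themselves differ by database (Web of Science, Scopus, Google Scholar), so the same author can have several different h-index values depending on the source of the citation data.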