You can find citation counts for individual journal articles using three multidisciplinary databases: Web of Science, Scopus and Google Scholar.
Using Web of Science:
- Open Web of Science
- Search for the article by title or DOI
- See the citation count towards the bottom of the entry
Using Scopus:
- Open Scopus
- Search for the article by title or DOI
- See the citation count to the far right of the entry
Using Google Scholar:
- Open Google Scholar
- Search for the article by title or DOI
- See the citation count beneath the entry. If the citation count is missing, it means there are no citations.
These three databases don’t always give the same citation count for a given article, because their underlying data differ. Google Scholar usually, but not always, gives the highest count. If you are comparing citation counts for different articles, you should use the same database throughout.
The h index is named after Jorge Hirsch. It’s a measure of a researcher’s citation impact that combines both quality and quantity: the largest number h such that h of the researcher’s papers have each received at least h citations. For example, a researcher with an h index of 10 has published 10 papers that each have at least 10 citations. Wikipedia provides further information about the h index, together with a diagram.
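The definition above can be sketched in a few lines of Python. The citation counts here are invented for illustration only:

```python
def h_index(citations):
    """Return the h index: the largest h such that h papers
    have each received at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper's rank still satisfies the condition
        else:
            break
    return h

# Five papers with these (hypothetical) citation counts give an h index
# of 3: three papers have at least 3 citations each, but not four with 4.
print(h_index([10, 8, 5, 2, 1]))  # → 3
```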
You can use either Scopus or Web of Science to find a researcher’s h index.
Why use the h index?
A researcher might want to include their h index in an application for a research grant, a job or a promotion. A doctoral researcher thinking about where to go for a post-doc position might want to check potential supervisors’ h indexes. However, citation practices vary from one subject area to another, so comparing the h indexes of researchers in different subject areas is not advised. If you are comparing researchers’ h indexes, you should also be aware that h indexes tend to be higher for older researchers, whose papers have accumulated citations over time.
H index via Scopus
To find a researcher’s h index using Scopus:
- Open Scopus
- Enter an author’s surname, initials and affiliation on the Author Search tab and Search
- Check boxes for author results as required
- Go to View citation overview
- Choose View h graph
H index via Web of Science
To find a researcher’s h index using Web of Science:
- Access WoS via Web of Knowledge
- Enter an author’s surname, initial(s) and affiliation, using the drop-down menus, e.g. Sumpter J* (author) and Brunel (address)
- Go to Create Citation Report
It’s quite likely that you will find different figures for a researcher’s h index using Scopus and Web of Science. This is because the underlying publication and citation data for Scopus and Web of Science are not the same. If you are comparing researchers’ h indexes, they should be from the same source, either all from Scopus or all from Web of Science.
Journal impact factors
Journal impact factors provide a system for ranking journals according to citations. They may inform your reading of journals and help you identify where to publish a potential journal article. Impact factors are to be found in Journal Citation Reports.
Journal Citation Reports
To find impact factors, you need to consult Journal Citation Reports (JCR). There are two editions of JCR: science and social sciences. The science edition, which covers science, medicine and engineering, includes around 8,000 journals, whilst the social sciences edition includes around 2,600 journals. There isn't an arts & humanities edition of JCR.
If you select the science edition of JCR, you can choose groups of journals from around 170 subject categories. For the social sciences there are fewer subject categories, around 55 of them. Some broad subject categories are broken down, e.g. under computer science there are separate categories for artificial intelligence, cybernetics, hardware & architecture and information systems.
Finding impact factors
A journal's impact factor is the frequency with which an "average article" published in that journal during the two previous years was cited during the JCR year. It is calculated by dividing the number of citations received during the JCR year by the total number of articles published in the journal during the two previous years. Impact factors are available for each year from 2000 onwards.
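As an illustration of the calculation (using invented figures, not a real journal's data):

```python
# Hypothetical figures for illustration only.
citations_in_jcr_year = 450    # citations in the JCR year to items from the two previous years
articles_prev_two_years = 120  # articles the journal published in the two previous years

impact_factor = citations_in_jcr_year / articles_prev_two_years
print(round(impact_factor, 2))  # → 3.75
```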
To display the impact factors for a group of journals by subject:
- At the JCR home page, select the appropriate JCR edition and year, and accept the default subject category option and Submit
- At the subject category section, select your subject. You can select more than one subject category by holding the control key down.
- Select Impact Factor as the sort option using the View Journal Data drop-down menu, then Submit.
You should see a table with abbreviated journal titles ranked by their impact factors. For example, for economics in 2010, the three top journals by impact factor are J Econ Lit, Q J Econ and Technol Econ Dev Eco.
Clicking on any of the abbreviated journal titles will bring up the journal's full title and other information. From this page, select Impact Factor Trend to display a graph of the journal's impact factors over the last five years.
You can return to the opening JCR page at any time by clicking on the Welcome button in the top left corner of the screen.
JCR's other journal metrics
The most familiar journal metric provided by JCR is the impact factor. However, JCR also offers several other journal metrics, including: the immediacy index, five year impact factor and Eigenfactor score. The immediacy index shows how fast articles are cited following their publication. It measures the average number of times an article in a particular journal is cited in the year it is published.
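The immediacy index calculation can be illustrated in the same way as the impact factor (again with invented figures):

```python
# Hypothetical figures for illustration only.
citations_in_pub_year = 60   # citations received in the publication year to that year's articles
articles_in_pub_year = 150   # articles the journal published in that year

immediacy_index = citations_in_pub_year / articles_in_pub_year
print(immediacy_index)  # → 0.4
```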
For definitions of JCR's other journal metrics, see Clarivate Analytics JCR help web pages.
Journal quality lists
If your field of research is business, economics, finance, management or marketing related, there are two journal quality lists that you might want to check. They offer alternative rankings to those provided by impact factors.
Academic Journal Quality Guide
The Chartered Association of Business Schools (CABS) Academic Journal Quality Guide is based on peer review, citation statistics and editorial judgement. The guide is designed to serve the needs of UK business and management researchers, and it assigns a grade of one, two, three or four to each journal. Users need to register and create a login username and password to view the guide, which is searchable.
The subjects covered include: accountancy, business history, economics, business ethics & governance, finance, general management, human resource management, international business & area studies, information management, innovation, marketing, psychology and social sciences.
Academic Journal Guide 2015
Journal Quality List
The Journal Quality List is an international venture, which is compiled and edited by Prof. Anne-Wil Harzing. It is available in three arrangements: by journal title, by subject area and by ISSN.
Its rankings come from a variety of sources, including the Association of Business Schools, Aston University, the British Journal of Management, Cranfield University School of Management and the Financial Times. Some sources are more current than others.
The subject coverage includes: business history, communication, economics, entrepreneurship, finance & accounting, general & strategy, international business, innovation, marketing, psychology, public sector management, sociology and tourism.
Journal Quality List
SJR and SNIP journal metrics
The Scopus database offers two journal metrics:
- SCImago Journal Rank (SJR)
- Source Normalized Impact per Paper (SNIP)
SJR starts from the premise that citations aren’t equal, and accordingly gives greater weight to citations from journals with high SJRs. SNIP combines a journal’s average citation count per paper with the "citation potential" of its subject area, enabling journals in different subject areas to be compared. For further information about these two journal metrics, see the SJR and SNIP pages.
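The idea behind SNIP can be sketched numerically. The figures below are invented, and this is a simplification of the published method, which defines citation potential more precisely:

```python
# Hypothetical figures for illustration only.
raw_impact_per_paper = 2.0  # the journal's average citations per paper (assumed)
citation_potential = 0.8    # the subject area's relative citation potential (assumed)

# Dividing by the field's citation potential normalizes the raw figure,
# so journals in low-citing fields are not penalised.
snip = raw_impact_per_paper / citation_potential
print(snip)  # → 2.5
```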
SCImago Journal Rank (SJR)
SJRs can be analysed by subject area, subject category and country using the SCImago Journal and Country Rank portal. This portal can also be used to compare three or four individual journals, with the results presented in a table and graphically. The SJR portal is powered by Scopus and based on Scopus data, which is available from 1999 onwards.
HEFCE has announced that the Scopus database will be used for bibliometric data to support the 2014 REF. This suggests that SJR could help identify appropriate journals in which to publish, although there hasn’t been any official REF advice on this.
Finding SJRs and SNIPs in Scopus
To find a journal’s SJR and SNIP:
- Open Scopus
- Select Analytics on the blue bar
- Click in the search box beneath Journal Analyser
- Enter a journal title in the box
- Click the Search button
- Double click on the journal’s title in the results column
- See the chart appear
- Move between the tabs to view the SJR and SNIP charts
- Repeat for other journals
Comparing journal metrics
How do the journal metrics provided by Scopus (SJR and SNIP) compare with those provided by Journal Citation Reports, e.g. the journal impact factor?
Scopus covers more journals than Journal Citation Reports (18,000 for Scopus versus 10,600 for JCR). This might mean that you can find your key journals in Scopus, if they aren’t available in JCR, although some journals are covered by JCR and not by Scopus. New journals tend to be picked up more quickly by Scopus than JCR, because JCR requires three years of data for calculating a journal impact factor.
JCR and the SCImago Journal and Country Rank portal both offer the facility to display data by subject areas. However, the SJR portal gives greater flexibility in presenting this data, e.g. the results can be presented in a graph.
If you are interested in cross subject journal comparisons, then Scopus’ SNIP is clearly attractive. On the other hand, the traditional journal impact factor, which is found in the JCR, has the advantage of familiarity. Its calculation is also relatively easy to understand.