
Measuring Research Impact

An introduction to some commonly used metrics for determining the influence of published research.


When looking at records for individual articles, see:

  • Citations (aka times cited) in Scopus, along with a percentile (Citation Benchmarking)
  • PlumX Metrics - for altmetrics

Citation Benchmarking

Citation Benchmarking shows how the citations received by an article compare with the average for similar articles. The 99th percentile is the highest, and indicates an article in the top 1% globally. It takes into account:

  • Date of publication
  • Document type
  • Disciplines associated with the source
  • Comparison window: articles are compared within an 18-month window, and the percentile is computed separately for each of the source's disciplines
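The percentile idea above can be illustrated with a short sketch. This is an assumption-laden toy example, not Scopus's actual (proprietary) method: it simply asks what share of comparable articles (same publication window, document type, and discipline) a given article's citation count exceeds.

```python
def citation_percentile(article_citations, peer_citations):
    """Percentage of comparable articles whose citation count this
    article exceeds. Returns None if there are no peers to compare."""
    if not peer_citations:
        return None
    below = sum(1 for c in peer_citations if c < article_citations)
    return 100 * below / len(peer_citations)

# Hypothetical citation counts for a set of comparable articles:
peers = [2, 5, 8, 12, 30, 41, 3, 7, 19, 25]
print(citation_percentile(28, peers))  # 80.0 -> 80th percentile
```

An article at the 99th percentile under this scheme would out-cite 99% of its comparison set, matching the "top 1% globally" reading above.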

Field-Weighted Citation Impact

Field-Weighted Citation Impact (FWCI) is the ratio of the total citations actually received by a body of output to the total citations that would be expected based on the average for the subject field.

  • The FWCI is the ratio of a document's citations to the average number of citations received by all similar documents over a three-year window. Each discipline contributes equally to the metric, which normalizes for differences in citation behavior between fields.
  • FWCI attempts to account for disciplinary differences in publication patterns:
    • For example, medicine has more output and more co-authors per paper than education, which reflects research culture, not performance. A denominator spanning multiple disciplines could allow one field to dominate the others (e.g., medicine would dominate education).
    • Non-weighted metrics would make an institution focused on medicine appear to perform better than one focused on the social sciences.
  • An FWCI of:
    • Exactly 1 means the output performs exactly as expected against the global average
    • More than 1 means the output is cited more than expected according to the global average
    • Less than 1 means the output is cited less than expected according to the global average
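The ratio described above is simple arithmetic, and a small sketch makes the three cases concrete. The citation counts below are hypothetical; "expected" stands in for the field-weighted average citations of similar documents (same field, document type, and publication year) over the three-year window.

```python
def fwci(actual_citations, expected_citations):
    """Field-Weighted Citation Impact: actual citations divided by the
    expected (field-average) citations for similar documents."""
    return actual_citations / expected_citations

# An article cited 12 times where similar articles average 8 citations:
print(fwci(12, 8))  # 1.5 -> cited 50% more than the global average
print(fwci(8, 8))   # 1.0 -> performs exactly as expected
print(fwci(4, 8))   # 0.5 -> cited half as often as expected
```

Because the denominator is computed within the document's own field, a medicine paper and an education paper with the same FWCI have performed equally well relative to their respective disciplines.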