Thing 15 — Research Impact (‘Bibliometrics’)

[Photo credit: Bill Lile, CC BY-NC-ND 2.0]
In the previous Thing, you looked at sharing your research outputs. Bibliometrics is a well-established approach for studying one type of research output: the academic publication and, especially, the journal article. Most bibliometric work is quantitative in nature.

Measures of output and indicators of impact…

Bibliometrics provide measures of academic output and indicators of academic impact. Output describes the volume of publications and can be linked to productivity. Impact considers how a publication influences and affects the research community. When used knowledgeably and appropriately in combination with peer review and human judgement, bibliometrics can contribute to the overall assessment of research quality.

Use multiple metrics…

If you want to use bibliometrics, always draw on more than one metric. For a start, you’ll need different metrics for measuring output and impact. But, even if you are just looking at impact, be aware that there are at least four types of impact and that each is measured differently:

1) Impact at point of publication

Point of publication impact is about how a research output is accepted and embraced initially by the research community; for example, acceptance of a publication into a high-impact journal suggests that the community believes the research to be of high value. Point of publication impact metrics tend to focus on the journal in which an article is published rather than on the article itself.
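The best-known point of publication metric is a journal-level one, the two-year Journal Impact Factor (we return to it below). As a very rough sketch of the idea, using invented publication records rather than the curated Web of Science data and 'citable item' rules behind the real calculation:

```python
# Rough sketch of a two-year journal-level indicator in the spirit of the
# Journal Impact Factor. The records below are hypothetical; the real
# calculation uses curated Web of Science data and specific rules about
# which items count as 'citable'.

def two_year_impact(records, year):
    """Citations received in `year` to items published in the two
    preceding years, divided by the number of items published then."""
    window = [r for r in records if r["pub_year"] in (year - 1, year - 2)]
    cites_in_year = sum(r["citations_by_year"].get(year, 0) for r in window)
    return cites_in_year / len(window) if window else 0.0

journal_2023 = [
    {"pub_year": 2021, "citations_by_year": {2022: 3, 2023: 5}},
    {"pub_year": 2022, "citations_by_year": {2023: 2}},
    {"pub_year": 2022, "citations_by_year": {2023: 0}},
]
print(two_year_impact(journal_2023, 2023))  # (5 + 2 + 0) / 3 ≈ 2.33
```

Note that the result describes the journal as a whole, not any individual article in it, which is exactly the limitation of point of publication metrics.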

2) Impact post publication

Impact post publication considers the influence that a research output has after it has been shared. In bibliometric terms, post publication impact equates to citation impact. (With altmetrics, post publication impact is broader; see Thing 16.) Be aware that the citation cycle normally takes at least two years; therefore, if you are using a shorter window than this to evaluate a publication, you may have to make do with looking at point of publication impact instead.

3) Impact from enabling knowledge transfer

Every academic publication you produce is actually a link in a chain of publications: almost certainly, you will have cited other publications, providing the backward links in the chain, and with luck, other publications will cite yours, creating the forward links in the chain. By being an important or key link in the publication chain—for example, by publishing an article that connects two disparate research areas—you enable the transfer of knowledge. Acting as a ‘knowledge bridge’ in this way is a vital but often overlooked form of academic impact.
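One way to make the 'knowledge bridge' idea concrete, purely as an illustration rather than an established bibliometric indicator, is to treat publications as nodes in a directed citation graph and ask which nodes sit on many of the shortest paths between the others (betweenness centrality). The graph below is invented:

```python
import networkx as nx

# Hypothetical citation graph: an edge A -> B means publication A cites B.
# The paper "bridge" connects a cluster of application papers to a cluster
# of method papers, so it sits on many shortest paths between them.
G = nx.DiGraph([
    ("app1", "bridge"), ("app2", "bridge"), ("app3", "bridge"),
    ("bridge", "method1"), ("bridge", "method2"),
    ("app1", "app2"),          # some citations within each cluster
    ("method1", "method2"),
])

# Betweenness centrality is one possible way (of several) to surface
# publications that link otherwise separate research areas.
scores = nx.betweenness_centrality(G)
for paper, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{paper:8s} {score:.2f}")
```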

4) Impact through collaboration

Impact through collaboration can be seen in the networks you inadvertently create when you publish research outputs with others. These networks may describe links between authors or institutions. Your collaboration networks show how you connect with others in the research community and allow you to draw inferences about your collaboration impact.
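As a small, made-up illustration, a co-authorship network can be built by linking every pair of authors who appear on the same publication; simple measures such as node degree then hint at who connects otherwise separate groups. In practice the author lists would come from your publication records in Scopus or Web of Science:

```python
from itertools import combinations
import networkx as nx

# Hypothetical author lists for four publications.
publications = [
    ["Khan", "Osei", "Li"],
    ["Khan", "Martins"],
    ["Li", "Osei"],
    ["Martins", "Nowak", "Silva"],
]

# Link every pair of co-authors; edge weights count shared publications.
net = nx.Graph()
for authors in publications:
    for a, b in combinations(sorted(authors), 2):
        weight = net.get_edge_data(a, b, default={"weight": 0})["weight"]
        net.add_edge(a, b, weight=weight + 1)

print(net.number_of_nodes(), "authors,", net.number_of_edges(), "links")
print(sorted(net.degree, key=lambda kv: -kv[1]))  # most-connected authors first
```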

If you want to compare, normalise…

Aside from using different metrics to study output and the four different types of impact, there is another important caveat to consider when using bibliometrics: always compare like with like. In practice, we often seek to compare: Which of these publications has the higher citation impact? Is this journal better than that one? Is my research group having greater impact than so-and-so's group?

To make comparisons, you need normalised bibliometric indicators. This means that you can’t make direct comparisons using some of the more familiar—but unnormalised—bibliometrics such as number of citations, h-indices, or Journal Impact Factors.

Citation practices vary from field to field, older papers have had more time to attract citations than newer papers, and some document types, for example reviews, are cited a lot more than other types of academic work. For comparison purposes, it doesn’t matter that you have 50 citations and a colleague has only 30—you don’t necessarily have the higher citation impact. The only way to compare fairly is to take into account differences in subject area, publication year, and document type. Normalised bibliometric indicators do this; absolute counts such as the h-index and averages such as the Journal Impact Factor do not.
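To see the difference in miniature, here is a minimal sketch of normalisation: each citation count is divided by the average count for publications of the same subject area, publication year, and document type. The baseline averages below are invented; real ones (as used in indicators such as field-weighted citation impact) come from the whole underlying database:

```python
# Illustrative only: the expected (baseline) citation counts are invented.
# In real normalised indicators they are computed from the database for
# all publications of the same subject area, year, and document type.

expected = {
    ("oncology", 2019, "article"): 25.0,
    ("history",  2019, "article"): 4.0,
}

def normalised_impact(citations, field, year, doc_type):
    """Citations relative to the average for comparable publications.
    A value of 1.0 means 'cited exactly as often as expected'."""
    return citations / expected[(field, year, doc_type)]

# 50 citations in a heavily cited field vs 30 in a sparsely cited one:
print(normalised_impact(50, "oncology", 2019, "article"))  # 2.0
print(normalised_impact(30, "history",  2019, "article"))  # 7.5 -> higher impact
```

On this (made-up) baseline, the colleague with 30 citations has the higher normalised impact, which is precisely the point about comparing like with like.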

Bibliometric tools…

Three different publication and citation ‘data universes’ can be used to produce bibliometrics: Web of Science, Scopus, and Google Scholar. Each of these data worlds is made up of different sets of publications. This means that the bibliometric results derived from one world will not be the same as results derived from the others.

Formal bibliometric studies always use either Web of Science or Scopus because these databases incorporate the professionally managed metadata that underlie the normalised bibliometric indicators needed for making fair comparisons. And, as we saw earlier, the need to compare arises often in bibliometric work.

For more informal assessments, academics themselves often turn to Google Scholar, which usually gives the highest absolute citation counts of the three databases. Because formal bibliometric studies don’t use Google Scholar, though, be aware that your citation counts will tend to look lower in those studies than the figures you are used to seeing. Don’t panic! This apparent ‘drop’ in citations affects every researcher, not just you, and in any event it is irrelevant once properly normalised indicators are used.

At Surrey, both Web of Science and Scopus can be accessed through the library website. Additionally, we subscribe to a specialist bibliometric tool, SciVal, which derives its results from Scopus data. Hands-on training in this tool is offered year-round, and an introductory SciVal guide for researchers is available on the library website as a PDF.

Tasks

Choose two or three publications (yours or somebody else’s) that have received citations. Look up these publications in Web of Science, Scopus, and Google Scholar.

  • Do the publications appear in all three data worlds?
  • How many citations does each publication receive according to Web of Science, Scopus, and Google Scholar? Are the counts different in each database?
  • Most likely, Google Scholar will give the highest citation count—did you find this to be the case with your example publications?

Download the SciVal for Researchers guide from the library website. To access SciVal, follow the registration and login instructions provided in the guide. Again, following the guide, use the default ‘University of Surrey’ example to explore some of the bibliometrics offered in SciVal.
