Understanding publication ranking
Many scientists are evaluated by university management and funding agencies using bibliometric methods - whether they want to be or not. It is therefore worthwhile to engage with bibliometric issues.
What is bibliometrics?
Bibliometrics is concerned with the evaluation of science through quantitative / statistical analysis of publication activity. It focuses in particular on the number of publications (as a measure of scientific output) and the number of citations referring to these publications (as a measure of the international perception of the research). The following can be analysed:
- Individual scientists and scholars
- Research groups / institutes / faculties / universities
- Countries / nations / continents
- Individual journals (Journal Impact Factor)
Further questions addressed by bibliometrics include:
- Analysis of current research trends and topics, e.g. interdisciplinary research activities.
- Analysis of (international) cooperation networks (based on co-authorship).
- Analysis of discipline-specific publication practices.
There are also arguments against bibliometrics:
- Those who want to pursue an academic career are forced to orient their behaviour towards evaluation criteria (publish or perish). This contradicts the freedom of research and teaching and impairs both the growth of scientific knowledge and the quality of teaching.
- A citation is not necessarily a positive acknowledgement; it can also reflect a highly critical engagement with the cited work (negative citation). Purely quantitative bibliometric analyses therefore do not allow conclusions to be drawn about the quality of research and the related publications.
- The evaluation criteria used in bibliometrics favour established researchers.
- There are strong discipline-specific peculiarities in citation behaviour, which make interdisciplinary comparability of bibliometric values virtually impossible.
- The databases available as a data basis for bibliometric analyses favour Anglo-American publications and mainstream literature. There is no complete database of publications and citations that would yield "objective" values.
- Bibliometrics is susceptible to manipulation by researchers. Salami slicing (five smaller papers count for more than one book), citation cartels, courtesy citations, self-citations, honorary authorships, etc. make it possible to present one's own scientific achievements more favourably.
The three most important data sources that systematically record citations for bibliometric analyses are Web of Science, Scopus and Google Scholar.
Ranking of scientific journals
The Journal Impact Factor (JIF) is internationally the best-known indicator for ranking scientific journals. It is published in the Journal Citation Reports, which are part of the database Web of Science.
The Impact Factor relates the number of articles published in a journal to their citation frequency: the number of citations in the reference year to articles from the previous two years is divided by the number of articles published in those two years. In addition, the database offers further indicators, such as the cited half-life, i.e. the median age of the articles from a journal that are cited in current articles in other journals.
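Based on this definition, the two-year Impact Factor can be sketched as a simple ratio; the citation and article counts below are hypothetical:

```python
def impact_factor(citations_in_year: int, articles_prev_two_years: int) -> float:
    """Two-year Journal Impact Factor: citations received in the reference
    year to items published in the two preceding years, divided by the
    number of citable items published in those two years."""
    return citations_in_year / articles_prev_two_years

# Hypothetical journal: 600 citations in the reference year to articles
# from the previous two years, in which 200 citable items appeared.
print(impact_factor(600, 200))  # → 3.0
```

Note that the result depends entirely on which items the database counts as "citable", which is one reason Impact Factors from different sources can differ.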
SCImago Journal & Country Rank is a freely accessible database of bibliometric information for ranking scientific journals. SCImago uses a proprietary, Google PageRank-based algorithm to rank scientific journals in the natural and engineering sciences, medicine (STM) and the social sciences. Based on the journal- and country-specific indicators collected in Scopus, the citation frequency of publications is analysed and a ranking of the journals is created on this basis.
Both databases refer to journals as a whole and do not permit statements about the citation frequency of individual articles within them. For many journals, just 20% of the articles account for 80% of the citations, while a high percentage is never cited at all. Conversely, even journals with a low Impact Factor contain individual articles that achieve more than 400 citations.
Finding the h-index
The h-index (also called Hirsch factor or H-number) was developed in 2005 by the physicist Jorge E. Hirsch at the University of California, San Diego. It is an indicator that attempts to assess the lifetime scientific output of a person by considering publications and citations in combination. Determining the h-index is quite simple:
- Search the relevant citation databases (e.g. Web of Science, Google Scholar) for your own publications and count the citations per publication. If you search several databases, duplicates (entries contained in more than one database) must of course be removed.
- Sort the publications by number of citations in descending order, starting with the most frequently cited publication.
- Determine the h-index: an author has an h-index of n if at least n of their publications have been cited at least n times each.
In this fictitious example, author A has five publications that have each been cited at least five times; the sixth publication, however, was cited "only" five times. The h-index is therefore 5, and if the sixth publication received just one further citation, the h-index would rise to 6. Author B has only three publications in total, but the first two were cited frequently. Nevertheless, his h-index remains at 2, because he lacks a third publication that has been cited at least three times. For author C, the citations are distributed relatively homogeneously across the various publications; even the seventh publication still has seven citations, so the h-index is 7.
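The procedure above can be sketched in a few lines of code; the citation counts below are invented to match the three fictitious authors:

```python
def h_index(citations: list[int]) -> int:
    """h-index: the largest n such that at least n publications
    have been cited at least n times each."""
    sorted_counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(sorted_counts, start=1):
        if count >= rank:
            h = rank  # still at least `rank` papers with >= `rank` citations
        else:
            break
    return h

# Invented citation counts consistent with the fictitious example:
author_a = [12, 10, 9, 8, 6, 5]    # sixth paper has only 5 citations
author_b = [50, 40, 2]             # only three papers in total
author_c = [10, 9, 9, 8, 8, 7, 7]  # seventh paper still has 7 citations

print(h_index(author_a))  # → 5
print(h_index(author_b))  # → 2
print(h_index(author_c))  # → 7
```

One more citation for author A's sixth paper ([12, 10, 9, 8, 6, 6]) would indeed raise the h-index to 6, as described above.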
Alternative metrics / Altmetrics
In bibliometrics, research performance is measured by publications and citations, i.e. the perception and representation of research in other publications. In the age of eScience, however, there are significantly more channels of scientific communication. Altmetrics extend the classical indicators of bibliometrics:
- to other forms of scientific communication, such as blogs, etc.
- as well as to other indications of the use, perception and reception of scientific work, such as the distribution of a work in libraries, download statistics for online publications, links, mentions in blogs and bookmark lists, tweets, shares and likes, etc.