University Library

Bibliometrics

Advisory Service

The University Library is happy to answer your questions concerning bibliometrics and provides advice on the following issues:

  • Author-level metrics (e.g. h-index)
  • Journal-level metrics (e.g. JIF)
  • Creating bibliometric profiles (e.g. output and citation analyses)

We are also happy to answer any specific questions you have.

Typical questions:

  • Which of my publications have been cited and how often?
  • Which journals am I most frequently published in and what impact factor do these journals have?
  • How much of my work has been published as open access publications?
  • How has my publishing output developed in recent years?
  • How has the publishing output of the institute or faculty developed in recent years?
  • Which forms of publication have mainly been used?
  • Which co-authors have I mostly been published with?
  • What is my h-index?
  • How has my research been received?

Dr. Alexandra Schütrumpf

Bibliometrics

bibliometrie@ub.tu-berlin.de

+49 30 314-76113

About Bibliometrics

Bibliometrics is the quantitative analysis of scientific publications and their citations. Bibliometric analyses help compare the publishing output of individual academics, researchers, institutes, faculties or entire institutions.

They further help academics to evaluate their publishing output and improve the visibility and impact of their research. Bibliometric analyses require great care when selecting methods, handling data and interpreting results.

Data sources

Web of Science (WoS)
Web of Science (WoS) is a multidisciplinary database of literature and citations with a focus on the natural sciences, operated by Clarivate Analytics. The TU Berlin University Library has licensed Web of Science; access is free of charge for members of the University (http://www.isiknowledge.com/WOS).
Scopus
Scopus is a multidisciplinary database of literature and citations operated by Elsevier. The TU Berlin University Library has not licensed Scopus.
Google Scholar
Google Scholar is a publicly accessible search engine for academic literature with a very large database; it displays citation details for its results. The data sources, however, are not transparent and results are not reliably reproducible.
Microsoft Academic
Microsoft Academic is a publicly accessible search engine for academic literature. As with Google Scholar, the search service records publications and citations. The data sources are, however, also not transparent.

Indicators

Author indicators

The fundamental indicators of a bibliometric analysis are based on counting publications and citations:

The citation rate is the quotient of the number of citations C and the number of publications P: citation rate CPP = C/P
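For example, a researcher with P = 30 publications that have attracted C = 120 citations in total has a citation rate of CPP = 120/30 = 4 citations per publication (hypothetical figures).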

The h-index was developed to measure the academic output of an individual researcher. It is also based on the number of publications and the number of citations: the h-index is the largest number h such that h publications of an individual (or institute) have each been cited at least h times.
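For illustration, a minimal sketch in Python of how the h-index can be computed from a list of per-publication citation counts; the function name and the citation counts are invented for the example.

```python
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for seven publications
example = [25, 17, 12, 8, 6, 2, 1]
print(h_index(example))  # -> 5: five publications with at least 5 citations each
```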

Tutorial on determining bibliometric data in WoS (PDF, 2MB)

Journal indicators

Journal indicators are designed to measure the influence of individual journals. They are not indicators for evaluating the output of an academic. They also provide no indication of the quality of an individual article in a journal. An article published in a journal with a high journal indicator is not necessarily qualitatively better than an article published in a journal with a lower indicator.

The best known and longest-established indicator is the Journal Impact Factor (JIF). It indicates how often, on average, articles in a journal are cited in other publications. It is calculated as the number of citations received in a given year by items the journal published in the two preceding years, divided by the number of citable items published in those two years.

The Web of Science Core Collection database (Clarivate) serves as the basis for this data.
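A worked example may make the calculation clearer; the figures are invented:

\[
\mathrm{JIF}_{2023}
  = \frac{\text{citations in 2023 to items published in 2021 and 2022}}
         {\text{citable items published in 2021 and 2022}}
  = \frac{500}{200}
  = 2.5
\]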

The Scimago Journal Rank (SJR) expresses the reputation of a journal by measuring the average number of citations per article, weighting citations with a PageRank-like algorithm. The Scopus database (Elsevier) serves as the basis for this data.

The Eigenfactor score is a rating of the total scientific influence of journals according to the number of citations of articles from the respective journal (network analysis). Using existing citation data (Web of Science), two scores are calculated with the PageRank algorithm: the Eigenfactor Score (ES) and the Article Influence Score (AIS). Both are freely accessible.

Article indicators (alternative metrics)

To reflect new forms of science communication and the growing number of electronic publications, article-level alternative metrics are being developed on the basis of usage data (e.g. views, downloads, bookmarks) and discussions of academic publications in social networks (e.g. Twitter, Facebook, blogs).

Bibliometric analyses

Typical bibliometric analyses include:

Output analyses:

In an output analysis, academic output is investigated by counting publications. All publications are counted and analyzed over time, by discipline, or by publication type, as sketched below.
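As an illustration, a minimal Python sketch of such a count over hypothetical records; the field names and data are invented, and a real analysis would draw on exports from a database such as Web of Science or an institutional repository.

```python
from collections import Counter

# Hypothetical publication records (illustrative only)
publications = [
    {"year": 2021, "type": "journal article"},
    {"year": 2021, "type": "conference paper"},
    {"year": 2022, "type": "journal article"},
    {"year": 2022, "type": "journal article"},
    {"year": 2023, "type": "book chapter"},
]

# Output over time: number of publications per year
per_year = Counter(p["year"] for p in publications)

# Output by publication type
per_type = Counter(p["type"] for p in publications)

print(dict(per_year))  # {2021: 2, 2022: 2, 2023: 1}
print(dict(per_type))  # {'journal article': 3, 'conference paper': 1, 'book chapter': 1}
```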

Citation analyses:

The citation analysis measures the perception of academic output in the community. Citation data are collected and analyzed for this purpose.

Trend analyses:

Bibliometric analyses make it possible to predict academic trends.

Network analyses:

Network analyses examine interdisciplinary and transregional cooperation.

 

Increasing visibility

Author profiles

Author profiles assist with the clear identification of authors. They ensure all publications are correctly attributed to a researcher. This promotes a comprehensive evaluation and visibility of the researcher’s publication output.

The University Library recommends all researchers register with ORCID and use their individual ORCID in their publications.

More about ORCID

Open access

Open access publications are freely and publicly available online, in other words without financial, legal, or technical barriers. This means open access contributions can be found and cited more often than content that is not publicly accessible online.

More about open access

Affiliation guidelines

Science is a competitive field and including an author’s affiliation to an institution in publications is of key importance. Universities and their researchers are frequently ranked in national and international comparisons according to their publication output, with third-party funding often awarded on the basis of these rankings. Clear and, above all, complete details are also important for other studies evaluating data on affiliation, e.g. analyses of inter-institutional cooperation.

In light of this, TU Berlin approved a set of guidelines for standardizing statements of affiliation in German and English-language publications in October 2019. These guidelines apply to all members of TU Berlin. They determine the use of a standardized name for the University (Technische Universität Berlin) as well as a standardized abbreviation (TU Berlin) and establish the procedures to be followed in the event of affiliation to more than one institution.

The guidelines ensure that all publications produced by researchers at TU Berlin are clearly, correctly, and fully assigned to Technische Universität Berlin. As such, they serve to increase the visibility of TU Berlin's research strengths and help ensure that the University’s research attracts funding. They also have a positive impact on national and international rankings.

You can read more about TU Berlin’s affiliation guidelines in our blog article “TU Berlin beschließt Affiliationsrichtlinie” (German only).

FAQ

Why use bibliometric analyses?

Bibliometric analyses are useful for quantitatively measuring, evaluating, and comparing academic publications. They are increasingly used in science management, for example as an instrument to identify research trends or to award merit-based funds.

Can research content be qualitatively evaluated using bibliometric analyses?

Bibliometric analyses are solely an instrument of research management. They are never a replacement for a qualitative evaluation of research content. That is possible, for example, through peer review, i.e. the assessment of research by independent experts.

What data sources are suitable for bibliometric analyses?

The most important bibliometric data sources are the multidisciplinary literature and citation databases Web of Science and Scopus. It is important to note that the validity and meaningfulness of a bibliometric evaluation strongly depend on the quality and focus of the database. A researcher’s h-index can vary depending on which data source was used as a basis.

Which score is the most informative?

Individual scores should never be interpreted in isolation. Rather, all indicators should be considered together and evaluated sensibly. For example, the number of publications (output) says nothing about their quality. Someone who publishes frequently is not necessarily publishing at a higher quality, just as someone who publishes less often is not necessarily publishing at a lower quality.

Can different disciplines be bibliometrically compared?

As the publication activities of academics strongly differ in individual disciplines, bibliometric indicators are not suitable for cross-disciplinary comparison.
