Bibliometrics is the quantitative analysis of scientific publications and their citations. Bibliometric analyses help compare the publishing output of individual academics, researchers, institutes, faculties or entire institutions.
They further help academics to evaluate their publishing output and improve the visibility and impact of their research. Bibliometric analyses require great care when selecting methods, handling data and interpreting results.
Web of Science (WoS): a multidisciplinary database of literature and citations with a focus on the natural sciences, operated by Clarivate Analytics. The TU Berlin University Library has licensed Web of Science; access is free of charge for members of the University (http://www.isiknowledge.com/WOS).
Scopus: a multidisciplinary database of literature and citations operated by Elsevier. The TU Berlin University Library has not licensed Scopus.
Google Scholar: a publicly accessible search engine for academic literature with a very large database that shows citation counts for its results. Its data sources, however, are not transparent, and results are not reliably reproducible.
Microsoft Academic: a publicly accessible search engine for academic literature. Like Google Scholar, it records publications and citations, but its data sources are likewise not transparent.
The fundamental indicators of a bibliometric analysis are based on counting publications and citations:
The citation rate is the quotient of the number of citations C and the number of publications P: the citation rate CPP (citations per publication) = C/P.
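The citation rate above is a simple average. A minimal sketch (the numbers are invented for illustration):

```python
def citation_rate(total_citations: int, publications: int) -> float:
    """CPP = C / P: average number of citations per publication."""
    if publications == 0:
        raise ValueError("at least one publication is required")
    return total_citations / publications

# Example: 120 citations across 40 publications -> CPP of 3.0
print(citation_rate(120, 40))  # 3.0
```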
The h-index was developed to measure the academic output of an individual academic. This index is also based on the number of publications and the number of citations: the h-index is the largest number h such that h articles of an individual or institute have each received at least h citations.
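The definition above translates directly into a short computation: sort the citation counts in descending order and find the last rank at which the count still reaches the rank. The citation counts below are invented for illustration:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank  # this many papers still clear the threshold
        else:
            break
    return h

# Five papers with these citation counts yield an h-index of 3:
# three papers (10, 8, 5) each have at least 3 citations.
print(h_index([10, 8, 5, 2, 1]))  # 3
```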
Journal indicators are designed to measure the influence of individual journals. They are not indicators for evaluating the output of an academic. They also provide no indication of the quality of an individual article in a journal. An article published in a journal with a high journal indicator is not necessarily qualitatively better than an article published in a journal with a lower indicator.
The best-known and longest-established indicator is the Journal Impact Factor (JIF). It shows how often articles in a journal are cited in other publications. It is calculated as the number of citations received in a given year by articles the journal published in the two preceding years, divided by the number of citable items the journal published in those two years.
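The calculation is a single ratio. A sketch with invented numbers for a hypothetical journal:

```python
def impact_factor(citations_to_prev_two_years: int,
                  items_in_prev_two_years: int) -> float:
    """JIF for year Y = citations received in Y to items published in
    Y-1 and Y-2, divided by the number of citable items from Y-1 and Y-2."""
    return citations_to_prev_two_years / items_in_prev_two_years

# Hypothetical journal: 600 citations in 2023 to its 2021-2022 articles,
# 200 citable items published in 2021-2022 -> JIF of 3.0
print(impact_factor(600, 200))  # 3.0
```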
The Web of Science Core Collection database (Clarivate) serves as the basis for this data.
The Scimago Journal Rank (SJR) shows the reputation of a journal by measuring the average number of citations while taking account of the PageRank algorithm. The Scopus database (Elsevier) serves as the basis for this data.
The Eigenfactor score rates the total scientific influence of journals according to the number of citations of articles from the respective journal (network analysis). Using existing citation data (Web of Science), two scores are calculated with the PageRank algorithm: the Eigenfactor Score (ES) and the Article Influence Score (AIS). Both are freely accessible.
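The network analysis behind these scores can be illustrated with a basic PageRank power iteration on a toy journal citation network. The journal names and links below are invented; real Eigenfactor scores use the full Web of Science citation matrix and apply additional normalizations not shown here:

```python
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Basic PageRank by power iteration over a directed link graph."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}  # uniform start
    for _ in range(iterations):
        new = {node: (1.0 - damping) / n for node in nodes}
        for src, targets in links.items():
            if targets:
                share = damping * rank[src] / len(targets)
                for dst in targets:
                    new[dst] += share
            else:  # dangling node: distribute its rank evenly
                for dst in nodes:
                    new[dst] += damping * rank[src] / n
        rank = new
    return rank

# "A cites B" means articles in journal A cited articles in journal B.
citation_graph = {
    "Journal A": ["Journal B", "Journal C"],
    "Journal B": ["Journal C"],
    "Journal C": ["Journal A"],
}
scores = pagerank(citation_graph)
print(max(scores, key=scores.get))  # the most "influential" journal
```

The key idea, shared by SJR and the Eigenfactor, is that a citation from a highly cited journal counts for more than a citation from a rarely cited one.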
To reflect new forms of science communication and the growing number of electronic publications, article-related, alternative metrics are being developed on the basis of use data (e.g. views, downloads, bookmarks) and discussions of academic publications in social networks (e.g. Twitter, Facebook, blogs).
Typical bibliometric analyses include:
In an output analysis, academic output is investigated by counting publications. All publications are counted and analyzed over time, by discipline, or by type of publication.
The citation analysis measures the perception of academic output in the community. Citation data are collected and analyzed for this purpose.
Bibliometric analyses make it possible to predict academic trends.
Network analyses examine interdisciplinary and transregional cooperation.
Author profiles assist with the clear identification of authors. They ensure all publications are correctly attributed to a researcher. This promotes a comprehensive evaluation and visibility of the researcher’s publication output.
The University Library recommends all researchers register with ORCID and use their individual ORCID in their publications.
Open access publications are available free of charge and publicly online, in other words without any financial, legal, or technical barriers. This allows open access contributions to be found and cited more often than content that is not publicly accessible online.
Science is a competitive field and including an author’s affiliation to an institution in publications is of key importance. Universities and their researchers are frequently ranked in national and international comparisons according to their publication output, with third-party funding often awarded on the basis of these rankings. Clear and, above all, complete details are also important for other studies evaluating data on affiliation, e.g. analyses of inter-institutional cooperation.
In light of this, TU Berlin approved a set of guidelines for standardizing statements of affiliation in German and English-language publications in October 2019. These guidelines apply to all members of TU Berlin. They determine the use of a standardized name for the University (Technische Universität Berlin) as well as a standardized abbreviation (TU Berlin) and establish the procedures to be followed in the event of affiliation to more than one institution.
The guidelines ensure that all publications produced by researchers at TU Berlin are clearly, correctly, and fully assigned to Technische Universität Berlin. As such, they serve to increase the visibility of TU Berlin's research strengths and help ensure that the University’s research attracts funding. They also have a positive impact on national and international rankings.
You can read more about TU Berlin's affiliation guidelines in our blog article "TU Berlin beschließt Affiliationsrichtlinie" (in German).
The most important bibliometric data sources are the multidisciplinary literature and citation databases Web of Science and Scopus. It is important to note that the validity and meaningfulness of a bibliometric evaluation strongly depend on the quality and focus of the database. A researcher’s h-index can vary depending on which data source was used as a basis.
Individual scores should never be interpreted in isolation. Rather, all indicators should be considered together and weighed with judgment. For example, the number of publications (output) says nothing about their quality: someone who publishes frequently is not necessarily publishing work of higher quality, just as someone who publishes less often is not necessarily publishing work of lower quality.