Evolving Bibliometric Indicators: Enhancing Research Impact Assessment Beyond Traditional Metrics

409 total views and downloads

About this Research Topic

Submission deadlines

  • Manuscript Summary Submission Deadline: 31 March 2026
  • Manuscript Submission Deadline: 19 July 2026

This Research Topic is currently accepting articles.

Background

Bibliometric analysis is central to evaluating scientific publications, traditionally relying on metrics such as citation counts, SCI/SSCI indexing, and the journal impact factor (IF). However, these metrics are subject to well-documented biases and limitations. Despite the IF's dominance for over 50 years, there is still no consensus on alternatives. Recent studies emphasize the need for more comprehensive metrics, such as the disruption score, which measures the extent to which a publication challenges, rather than consolidates, existing knowledge. The disruption index, however, has logical problems of its own, underscoring the need for more effective and advanced evaluation metrics. The importance of bibliometric analysis in fields such as health and business management further highlights the need for standardized methodologies. This Research Topic aims to explore evolving bibliometric indicators that enhance research impact assessment beyond traditional metrics.
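As background for readers unfamiliar with the metric, the disruption (CD) index of a focal paper is commonly computed from the papers citing it and the papers citing its references. A minimal sketch of that standard formulation follows; the function name and set-based inputs are illustrative, not any particular database's API:

```python
def cd_index(citers_of_focal, citers_of_refs):
    """CD (disruption) index of a focal paper.

    citers_of_focal: set of papers citing the focal paper
    citers_of_refs:  set of papers citing at least one of the
                     focal paper's references
    """
    n_f = len(citers_of_focal - citers_of_refs)  # cite focal only (disruptive)
    n_b = len(citers_of_focal & citers_of_refs)  # cite both (consolidating)
    n_r = len(citers_of_refs - citers_of_focal)  # cite references only
    total = n_f + n_b + n_r
    return (n_f - n_b) / total if total else 0.0  # ranges from -1 to 1
```

A score near 1 indicates a disruptive paper (later work cites it while bypassing its references); a score near -1 indicates a consolidating one.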

With the widespread application of bibliometric indicators, many questionable results have emerged. For example, the top ten institutions in the Nature Index are mostly Chinese institutions, yet very few Nobel laureates are from China. University rankings from QS, THE, U.S. News, and ShanghaiRanking differ markedly in their results. Similarly, there are dramatic differences among Scopus's highly cited scholars, Clarivate's Highly Cited Researchers, and the world's top 2% of scientists released jointly by Stanford University and Elsevier. Although significant flaws in the original disruption index were identified in 2019, the metric has nevertheless continued to be widely cited. What are the fundamental reasons for these differences and conflicts? Are these evaluation systems too arbitrary in their selection of indicators and adoption of methods? This Research Topic aims to explore the problems behind these evaluation methods from a quantitative or qualitative perspective and to identify possible solutions, such as better indicators, more appropriate methods, and more effective comprehensive analysis models.

We invite scholars to contribute original research articles, brief research reports, reviews, mini reviews, and methods articles. The specific themes to be explored include, but are not limited to, the following:

• Comparative analysis of traditional indicators and newer evaluation indicators (H-index, Eigenfactor, Altmetric Attention Score, etc.)
• In-depth analysis of the disruption index and its derivative indicators
• Developing new indicators to evaluate the scientific contribution of a publication
• Constructing a comprehensive analysis model for evaluating the scientific value of a publication
• Research on the application of solo citations in discovering disruptive achievements
• Exploration of limitations in the indicators and analysis methods of common ranking systems
• Research on the application of standardized academic evaluation methods in fields such as health and business management
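Among the indicators named above, the H-index has a simple operational definition: the largest h such that an author has h papers with at least h citations each. A minimal sketch of that computation (the function name is illustrative):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:   # the rank-th paper still has >= rank citations
            h = rank
        else:
            break
    return h
```

For example, citation counts [10, 8, 5, 4, 3] yield an H-index of 4: four papers have at least 4 citations, but not five papers with at least 5.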


Article types and fees

This Research Topic accepts the following article types, unless otherwise specified in the Research Topic description:

  • Brief Research Report
  • Conceptual Analysis
  • Data Report
  • Editorial
  • FAIR² Data
  • FAIR² DATA Direct Submission
  • General Commentary
  • Hypothesis and Theory
  • Methods

Articles accepted for publication by our external editors following rigorous peer review incur a publishing fee charged to authors, institutions, or funders.

Keywords: Bibliometric Indicators, Impact Factor, Scientometric Metrics, Evaluation Indicators, Analysis, Disruption Score

Important note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.

Topic editors

Manuscripts can be submitted to this Research Topic via the main journal or any other participating journal.
