
In this study, our goal was to perform a large-scale performance evaluation of review-based grant allocation. In total, 42,905 scored review reports prepared for 13,303 proposals were analyzed. In brief, the key messages of our manuscript are: 1) Basic research grants significantly increase scientific performance. 2) Grant review scores show only a low correlation with publication output over the course of the grant period. 3) The past scientometric performance of the principal investigator, including the H-index, independent citations, and the number of Q1 publications, is the best predictor of future performance. 4) International reviewers are significantly less efficient than national reviewers.
This paper is the most important background of scientometrics.org. It is widely referenced, with an FWCI of 3.6 and a 95th citation percentile in Scopus.
Download full text
Munkacsy Gy, Herman P, Győrffy B.: Comparison of scientometric achievements at PhD and scientific output ten years later for 4,790 academic researchers. PLoS One 2022;17(7):e0271218. 
Here, we examined whether publication output before the PhD degree correlates with subsequent research activity. We analyzed publication and citation data for Hungarian researchers who obtained their PhD between the ages of 24 and 45. Pre-PhD publications (and the citations they received) were excluded when assessing post-PhD track records. We ran a multiple regression analysis for each of three metrics (number of Q1 publications, H-index, citation count) as the dependent variable, with the number of articles, the H-index, and the number of citations in the year of the PhD, the calendar year of the PhD, and the gender of the researcher as independent variables. The number of articles and the H-index in the year of the PhD showed the strongest positive correlations, while gender had a negative correlation.
Read the full text paper here
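As an illustration, a regression of the kind described above can be sketched in a few lines. This is a minimal example with synthetic data; the variable names, coefficients, and sample size are invented for the sketch and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the independent variables measured in the PhD year
articles = rng.poisson(5, n).astype(float)
h_index = rng.poisson(3, n).astype(float)
citations = rng.poisson(20, n).astype(float)
phd_year = rng.integers(1990, 2012, n).astype(float)
gender = rng.integers(0, 2, n).astype(float)  # 0/1 encoding

# Synthetic dependent variable: post-PhD Q1 publication count, constructed
# so that early productivity dominates (mirroring the reported finding)
q1_post = 2.0 * articles + 1.5 * h_index - 0.5 * gender + rng.normal(0, 5, n)

# Ordinary least squares fit with an intercept column
X = np.column_stack([np.ones(n), articles, h_index, citations, phd_year, gender])
coef, *_ = np.linalg.lstsq(X, q1_post, rcond=None)
print(dict(zip(["const", "articles", "h_index", "citations", "phd_year", "gender"],
               coef.round(2))))
```

With this construction the fitted coefficients for articles and H-index come out clearly positive, which is the pattern the study reports for the real data.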
Győrffy B, Weltz B, Munkacsy Gy, Herman P, Szabó I.: Evaluating Individual Scientific Output Normalized to Publication Age and Academic Field Through the Scientometrics.org Project. Methodology 2022;18(4):278-297. 
This paper describes the functionalities of the MTMT version of scientometrics.org, which is designed for Hungarian researchers.
Read the full text paper here
Győrffy B, Weltz B, Szabó I.: Supporting grant reviewers through the scientometric ranking of applicants. PLoS One 2023;18(1): e0280480. 
In this study, our goal was to assess the impact of the scientometrics.org decision support tool on grant review procedures. Reviewers completed a questionnaire on their use of the scientometric ranking system, and the outcome of grant selection was analyzed by comparing the scientometric parameters of applying and funded applicants. We compared three grant allocation rounds before the introduction of the portal to two rounds after it. The majority of reviewers found the ranking-based scientometric analysis useful when assessing the publication performance of an applicant. Overall, the scientometric decision support tool can save time and increase the transparency of grant review processes.
Read the full text paper here
Szluka P, Csajbók E, Győrffy B.: Relationship between bibliometric indicators and university ranking positions. Sci Rep 2023;13:14193. 
In this study we explored how international university rankings demonstrate prestige and status in higher education. Four major ranking systems (THE, QS, ARWU, and USNews) were analyzed for the top 300 universities, revealing varying correlations between bibliometric indicators and ranking positions. Parameters such as citations, international reputation, and researcher counts were key factors, with their significance differing across rankings. The study also highlights the influence of field-specific citation impact and university size on rankings, helping decision-makers choose effective strategies for attracting prospective students.
Read the full text paper here
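As an illustration of the kind of comparison involved, the monotonic association between an indicator and a ranking position can be measured with a rank correlation. The sketch below uses made-up numbers for hypothetical universities, not data from the study, and implements Spearman's rho directly (Pearson correlation of the ranks, valid when there are no ties).

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks (no ties)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical universities: ranking position (1 = best) and citation counts
position = np.array([1, 2, 3, 4, 5, 6, 7, 8])
citations = np.array([95000, 88000, 91000, 70000, 62000, 50000, 48000, 30000])

# A strongly negative rho means more citations go with a better (lower) position
print(round(spearman_rho(position, citations), 2))  # prints -0.98
```

Repeating such a computation per indicator and per ranking system is one way the relative weight of citations, reputation, and size measures can be compared across rankings.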
Other relevant publications
- Ioannidis JPA, Baas J, Klavans R, Boyack KW.: A standardized citation metrics author database annotated for scientific field. PLoS Biol. 2019 Aug 12;17(8):e3000384.
This paper draws the reader's attention to the use and misuse of citation metrics. It describes multiple issues, such as self-citation and citation farms, and presents metrics useful for identifying unethical citation behavior. In addition, a standardized citation database is established using the data of more than 100,000 top scientists.
Read the full text paper here
- Sandström U, Besselaar P: Quantity and/or Quality? The Importance of Publishing Many Papers. PLoS ONE 2016;11(11): e0166149.
The project uses a Swedish dataset of 48,000 researchers and their WoS publications to investigate the relation between productivity and the production of highly cited papers. The results show a strong correlation between productivity (number of papers) and impact (number of citations), and this also holds for the production of high-impact papers: the more papers, the more high-impact papers. More specifically, producing high-impact papers seems to require certain output levels, which of course depend on the field under study.
Read the full text paper here
- Besselaar P, Sandström U: Early career grants, performance, and careers: A study on predictive validity of grant decisions. Journal of Informetrics 2015;9:826–838.
The authors investigate the predictive validity of grant decision-making using a sample of 260 early career grant applications in three social science fields. They measure the output and impact of the applicants about ten years after the application. Comparing grantees with the best-performing unsuccessful applicants, predictive validity was absent. This implies that the common belief that peers in selection panels are good at recognizing outstanding talent is incorrect.
Read the full text paper here
- Larivière V, Costas R: How Many Is Too Many? On the Relationship between Research Productivity and Impact. PLoS ONE 2016;11(9): e0162709.
Using a large dataset of disambiguated researchers (N = 28,078,476) over the 1980–2013 period, this paper shows that, on average, the higher the number of papers a researcher publishes, the higher the proportion of these papers that are among the most cited.
Read the full text paper here
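The quantity measured above, the share of a researcher's papers that fall in the globally most cited fraction, can be computed from any citation dataset. The sketch below shows the computation on synthetic data; the citation distribution and author counts are made up and carry no claim about the paper's actual result.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic corpus: each author has some number of papers with citation counts
n_authors = 300
papers_per_author = rng.integers(1, 60, n_authors)

citation_lists, author_lists = [], []
for author, k in enumerate(papers_per_author):
    citation_lists.append(rng.lognormal(mean=1.0, sigma=1.2, size=k))
    author_lists.append(np.full(k, author))
citations = np.concatenate(citation_lists)
authors = np.concatenate(author_lists)

# Threshold defining the top 10% most cited papers across the whole corpus
threshold = np.quantile(citations, 0.9)

# Per-author share of papers that reach the global top 10%
top_share = np.array([
    np.mean(citations[authors == a] >= threshold) for a in range(n_authors)
])

# Association between productivity and top-cited share in this toy corpus
print(round(np.corrcoef(papers_per_author, top_share)[0, 1], 3))
```

On real data, the paper's finding corresponds to this correlation being positive; in the toy corpus above no productivity advantage is built in, so the value only illustrates the mechanics of the computation.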